[Binary archive dump: tar archive rooted at var/home/core/zuul-output/ containing var/home/core/zuul-output/logs/kubelet.log.gz (gzip-compressed kubelet log). The compressed payload is binary data and is not recoverable as text.]
V+_mkT":ۚO6@`Z|.~0|?i5d0?zq٢+ε&Z+S@e+S|Mҫ\~ h2rMKBHKpllAć5=xoZO\yENra+3y$;ZŬ:z'ijx\!HՁDH25%_o=v]o䴦RӤ8K]mDL!Yצ77jJGf-_ΐ"$uHTji '|m<#sBй9P Ӥ5G(uΦԚ(Hfua -8< Qf=?Tyɦ ~]#OpVَK +"M/Shz}KQ-<}stHgڔ?ahU~+djԘ?J`g^Ai==SQ[ OT;ڮ;Wpu B;כ|dͫ݃֠smxRO 0~p;Ù]+?~@1u.W` sX|`gjj?B[tY3ĖAOjqMh|c~>,~ ؒLa><UM$j EAel%p߽g +y/hSNn3Xa1sY܏sY0TWe.K`,9]0 PKB^xins[ cg6y5 @p :hѳ!Ge1\a둠07z=`z+z+M61l+# #C juN D#4:0p(|.|P,`LX)ծn3@jh9.|YKBk;;F}cPjlہƷs#_۫Y-hnmmh6E L%b^Ա{f{ `+" ;(?OĴi`4q?noa{ɅiIXGKݵUd/*nTMɏBw Pr̡}bY70BvmO帱f&'{Q8 >/~¿xzUJir[ݍz٠iΆ> 6 dۭ(n>zwE4I7q-B+7_n-+m?͝(ʷs2j&}p_R%vw[5ݺ֮~~즶'LA|tphr|4 05VSiT̓KxT-_B).E=asxLHLJIu9]# voyЃ缝oAy*C|w[{o$3^@[%Iq|uS54 J]g /Y HmxOUۻ)(Bu]ȥMKw ^Z##ؑ 0X]29޴Al!{R!_!5i'[[a>YV`p)nQ:EǓF1?0 :cRe3v]¥$iuMt"3߲lҴ= ĥLp|DJ";+S޶Xp )`+}gzwV,<~twx '[c2"5YM@@;6 |0 a3z!Y`R2x2 ]hhQ "4;t t^ux!Cho,E -'Hum+e2B9 Gm14BrPX6j N |MJ0'o"+CRVv:SΥ8]rMV` Xw_W:kpiZiD ` Ea.L] <ˆ` gX m Z+Ҝi\p ׾iۣrP#Mo)gY̰ba/ʪ =gZɆ8Գ贅A0œ(n4^Ir9Y]hO*ON0*e8mX׆$/cl։Cap+ 2f%5'N^9lL8dF[(>me|lo*28CB^ 5ar0^~ *\kAp+2 <"-xp6y/y~a@*zsǨ154FIVմ MR'aQ#ӊBJsd?iH}dZó1g>#KyvQ_oRԕ|Rx{bdkٖ$ Bɨk,MI5t_6| q^-H[̦e-5θg%l}2nZ6ΙAiM;o&;;nML G w#;4*H^*Wu=E 3U׆ʎcXCW7!wUj( wqlkS<ٟGTBhS )nݷL;0]I\WhxE>K%M1WHV'q;"`_֗X{2#^be7ږ`}C'Ol*a! s]sBMVpVu˱Nqw=Z!|D`2|rRgroܜyVv՞Wz V֐XݟgrŖ*L9Fk_;5iN,< (1LJRyc:$i *iltfڞ~|rNPRfsKBC }дM_Pw%[üY`O ر 8VV uH tk{L ]aX#t: ?8y̵` gDc%nypܰ+ެ)Qz>@.%M֍!?>O yDԑ;m Fb{x ?m櫠HO} ,{1R}ɸ;h&]k2aYyl5HVRIsn1F00A8 -_Ԅ9 0KP2'l45{&BRw}w?U0]cիQ˘F"üK_=yk7Y6i]q#IODl&<6"|f !j (T)|fm e2\aD+ac=}ߊ>w* ԗ>bζ פԔs5u΄!lC>+deՊ=z.{IqN]j f%6 -BC`>|_T &iz6͘/*wqwOubl -cÉr]|3pD`7\y_P\`R}̺-o֛"1GDþL8z0_ԩ ݃Zwm b3M}L=~Sa$=f2 \`iM'bߘ>QYه S8_e*dXpS=rzYeYd&kD-D p.QCJtᩞb 72p=l] K 2`dՆM' ]&R[`5Ħdɓnu&?X8sVԔup_D_1w@5):@"'T0*5b.[ت+f6]VV0R}} f'D,;jy8n]w7EfRVnrcNe&j9LO &p>pN;Ǚ2F"\n("&z!<H{ъ. 
]x+d>tn<-*-dnY+' =o #T9ɛǻL Mo$konzr14+6I:{z;v}'ygӡ;WOwm3Jq߀WNi5W5,/ p9G~~ÙF)WJ)u52Y%1΂ƭJU>pgVѠg|PUA3t 8A5&4{u=n'D^xO³e77AO$3e yUT}Y$uٓy?Me1.k8^&7@NMgq7&Wb1jY读/=ë:V|c/@Q*O <_2,#sN(yɝ$P=ŠcMӒ8$Lj3YpI' <S^^Hi57Qg=]X#躌ɣC 1oъ;WJ#)fK4cf9 f o1質(|v p:\L4LMߌ~3o>uO(V'!cX&VnbX&Vy>X&{Kt+J7Mt+J7Mt+J7x3ey'qs+7Ǹ@m8pawEAB]d'+ o'듏ڠ0ƒp|˹TyxpnI lQtAhs`P I~/]^{<\伻2\ ޘL2 /sOrVZDvjr81=Ha'TCҵAO򪴳į_Hh4m4,Q$FG zpa95\x3Bv6¨}B7)ZS2˱r 6gyfuBwRZ%X'$cJ,も%Hy.>ɍqǑ NZ ,h.dlK2JZ(i}JAKXBɢHy ޸޼񎈤4b6xr=b6D̆1"fCl !b6Ę1fe8b6)b6D̆X/b6D̆=jĔD̆1"f!8BeVp20N' a$slab*Lf_-VͦɶQٛK mԬds}qf;f,3yEA}vhc{=Usvk'Vj T 6Vc+6.GCǽͲ]uCbjS=vS/$}T-u+7wq0g}~ݚ[yOԶkuz2}(޹q Os7IyQߍ @i)Yߨ-7?rS7u\k`qǪ#-wY~biS1]mxwNZN27b{-}3чʬbK23DiE 񒹷* TwYٕFe(;Эl`dBS4 =kpזFan]ƟɱxdJ*nX+ ڃ õ֟4dx :&ZN/nV+W7Mdu.F̈́ƸBǽIr[1*|Ixl',pv_QB=힏"W]LGo|}3uq(Gϊw>MDONZ44tDeCvFuE}$֢핖0UPk=S~X=I'+[ۙDs{xg=:IHB4eYxUnٜPF1wQ 68EOTGy*P.’q,Pqح{z=nr}鱽slm)J\[ BÕV6Z`P"s\Rɱ zJmdfX/̻gKLCg9 aJT-40ss ||;%ڂTjF1bYg`t.z'䵥rF2)V|]Zۻ4bw4yweYZԗ`|Mv00 g47`2f*%ɕ^hI=g:_u r tC.@=ł;,#R!Vx!KEUR $@"Ap ʍ(w[y wMi1ʇ6Ovs`M3- Xʽv9c8KXjj a9Ft_ e(!\ 0q\!"P]J ؠ(H 6̉{Y5` XE݁f"c %ODP&fɳI&zTy0E?kU?n3٢Es9uz "<P4F ]@{'Cx)(B[AfMYZ\ƞIFnH/h|QBl|22D2|rO',ݹ]S)'nٳߒgNSToAf3e&l&k}juhfvҍ 6uDܬ-,i` ]ꍊh[`'LGe`o͸ipS4ߔC$ak<: м(,p@;XL;Y$?3O %5sQg3) Bm͑|dr?0o})M4 ;e2fUY Gh%D&At;Iїdk{-s!C$XRXYB}J \fzkIv-bOjntkH2?m9{)w .W660hp;T >\{PVI.Iv t~(0EB(pz=^cOӏ.xF hTf\S2,_{N|sm[L>2q39b$ @7qJʅ %RF0g a.{M8 _oKw% 8ꐧCfRBtF*z,}ܐȍ:#<5%#[mU; s=}D L!I@d&/U?շ+ώ^'_K6l} Qoq߀e~5v4eyYrͯ~^]-h~\:?uY 0  ? \"~g| xUKр Q&<_btY ܢ³:d$ꏪo?vR7>=y]d֘IFTkNmo[~1ËiAG"1{A>LSxe LjuOmf Io&軘!$ c´%J< 'JyS,,2HI)1ormb]m+vR7-mbeUmJݤ1V?DurhCL5t ޟOu2rKYJf(2?wy;*FCԧf*5̨c'GыtJjǩ`J:K5CXR#܁L{yP"C >^^V?p۴*?nN7Pb)ъh s>שO.'I=xWWg߼)q U\UO2I?;wzYeaO0mՀ0L6ۏŰS#| ?ş_ɳyg_УKP (޻Rbr>GU'k۶nnٻ6n%UX#4Z]Ŗ$0E2$e[Vo3$/$9lb@G/a6㷣wGgGӣ?_COΊ&S]iA \GgVĻϏj͔~S^=o^]#;q{,),5kun_GL~»Yݶ{%w'EAk VwӨs/| [:q\;QڽySԻa1LmZo[sg뗺qe BTb.C4o1vmYC/z)܄G P9Xݾ](byu/Jj9*^ӧ.)(le1}@, UEokȿEx\KYhUqjuq8OUuXKX3֦ިë;O;wY8vGoͫ/ʟ<c5E(O0mtcB7! 
YM58>w'm7]Э7YRˣdEسFSXh)dj暟qZVU]`-˕eKdצ`jAI\1 ah ;h5n{Qo-עwF=Q sgΖ>E]7BP$UtrMM^J7jc]q[,EƖѿ4T jP-\/ m:.%sk P\FV0! Eg e-c-eX(%%,cؙn0QX7h:WieZ:ŭZyBmJ4HlA[[,$m})'E/~A U)lQ={vk>u- OJs 0i`Nh9Z8~'v|+JNZr2c߾:;;~@)>F?BnP%ʷg #~[{㣇~K4A_?+/> i.vrGAc-'NG> gU hYHy֍6c j+{n@ksqϋߌlzSJؚpTܸ; yץ|fiʨ0{<ŀ42zLBZHFa"Gj)I!=k1&3B*!ԫRk}ـ :<goNkG5Ld峷WЫ~1|SbWAo ?/WG_^Mk;wP&4J-ГyuA=ge `yږ< G tzpzfiNޟa8P߆r?MJQ(M.:uc:-:=G_kl,4E4q{TDy܄!④5DDDŽFǣRPJ H|Ja&QLJ aRX MT%J4w-PyۅF'7W<솕V'SѰɥw 9w XPǸFRt+naAޅ,;֟'ϽύezDa> m膽|tR4cDKzi<*Ι`BYzNfVoT cKto 6 ]8' M5PtȄWReL(2n7.zrߦD.=cQY6:#M4o:jtѥ\?},>=wgOctATaal@h2bLt@=W~Mb', M, JlNqI$cQ{Xf7f9'}kE| 1]Ԣdc:\_emMbz҂MIRF B\)>U)EPI0\0OcmI RcA%\Ä2Ǩ2RR\ LR }PƦ&gA' 0kk! F^pn Tf8P^f18cM0֭)2Ңwdߞ>kSCidR2P,\j2#LQ"iBK)CRY̜5^^u(Hl\6Ծ4DLH"^H4R1n XNBp)a+Qwsty(]uZZ"ϴv$M595.)""e',#\e3`&PIDT$t1fW-5"0h[4WF @ɠ:wJsk8ؽ+?B-Ja궵uT+4ckSEnEd33R)"Ub`qD06;3bjt.Mz/*))rBV-BhGpan2R9 7mZX+"|3d>mX#A[?ՙ,SD[VM Ѯ ;V:+u0R (LpP|[B"n F'ܐ|!`a{ ɏc 3M OiLޓhh Mc| +*iy_a[)t;Ϸu̵i=3- e)!))B_:1[ (4Ք9&\A:(k3r +LwoԦ>HR =uaC%gR8Kd03QD>&t.c^t]zW%wuWyde<1"X;N7\Hb޽.\>nɸ6WR. c! 
aD92fKOUx%x(lDL݉D')FD'8s둹Gf~"7Or=y X!,N__h&*8݆N_ s2K kKK:VnjtjUEZ~i$boW.ߥ{HqX=fm!=_ YTO^}*s=iEs]GĐ' V~"?ÍB_k9f B !NL(V~N켃W{z}{ۘpg9%_;Z!xr"!=~|9jK:Bm_4ZJ> h Y"f [U<8B@>N5.wv A orp28x &E])J<=OTx6DEJ Uˋ]Oc ZYLXT{]hr7] 6YByU+׭΀ ٩wSBnS1M7 8]'a&%W&fV !5F&1&I-2Lm\OF5'OWɵ<%JTi&P3zSƸ%bzbH~+.+3VR]LR:n3G >[ O.pՈuF;7FGM[t26Lk > c2"=_15Nf>apO81#¼*mzeVӥ KJ:K4i*Qь1Fe,sU)RF8T))϶p51enswNo4C߬rN%s/z'#KbN-ĔdڵK}M8?"1;Di)O:1)͜x(Y['0}[HpfNH.!C TVorNSdmf Z \X}aMe~pqti~YeKH58*|b5$=xtpVQ2f"'4?mzj5Aھfn 0i6ۤf%5o6bɘO1L=_5~WX~s "Op"b2ɬDFem2!s<&lX s)D"i$VebYmŒ9L<ʲ koZq Cҕ%f·fsCYM Ԭ'ԝpw}= .V?SZ<_ݞLsS6-9M'Mڽئ)LJx03I셋L_MWBjI^{v0FuD٣ L n]Y{a7hTs6??0"cNX5'f)6#P4cJ/z숥tPZRVCo1,i-C =dj)܄ tlQ]64 >`4@o˖KNR7M_3=3Tj%A9z+I:MiWȒ^rv&@8a2IaDvwt=w0{N~ ,#,@q!Ȅ# P&w؀l6M% ]ʒ&E6NyD]*0ND}$k$P C gq{-*c˪\/iDv|МU9V|ձ 2&XȺk"WK1;r:P?P bAYisS*SkJ쯝KXDuJ՟;oDsigfq#qJ}};ݯ"GKEeBqV[ WE ygP}_w;QI m7{+!eTϵ ǿӗ寰VzP}b0oryšdgq)HD-dL8<Ʊ `iVػ{;,g5ykG.eQmh( ft:1ۊ YvWizr8^F5T58O30+8od?_]WWWg;N?*2hO5L4V4}pY %; g9jAUFFHb^l z]c}pQa IW@-gn?]_C*ͳ'eK[r Lq>= ފu8ah;O錓~W6,2I y[񣿒"LѢtnm ҥxydZa<>?Hf)R ~eB~R<:Zɧds)7K@@F@[7Iweay*ޗWY;eA1\S&OfD皧ȴU©z:!l/s~kazx-Q.MvfyGSD@p6 9yĢ/Ed:Џqpa]ӟ(WpkQ=ZΓx4 f>7'Y>V%Z= "neJ:& ,z)Im@m@mZV%FSY\0lfF9|t̄^=$2쩁RQD "f#Ҫu:f=lT)ԉ^3*\'r󑏝_[9ubUNU5OԙlӾ-EL*"BmjXUuB`ձDQerm9`b^-"$lĮJe3=NUn:/⽶Υ6!/Z#~F2T- A;gkWI _#~%cdTz A{?٤9^WC.P"rޢ_QB\A{+b;QR\< 밡E('eK:Tn9vӚtlYQNe긬;e;ƥo< Ny4LlFnt V_V{Gөuᣨ`lQ 4̭$m$-Xy5JXz95Lzٛ]u aak oVX 4fB )Q02Y78ArHCr$cLg>)s____Iu_225[!r q_qoqGiS7(b 4azja`0@t u(<1tnX}MZip-tOfPg0HAOGA}k!k^RN+mQto56⬀\m4}42Ѐ[/8˶5j [5&T2M b;\7tQܷMsΞTc)&s4#l:&wĞ*rP;:5/N56Sha)5ܯ:TK՛thhoJv!Jzk[k:^Ģ{*[4ҋ 2P)^0S`_'Ah8EJrb:U_?)wbx1_O:񖉑gd&jYʽd'wXy.aҹ}hKmt% x&Ɂ>Ϡ?DM޴@)Tj8ZGIKXk.mD[zA>w\D%ujNs+A P}JPþO 57虛7w?P@ub:%j6pXʱ, M07U3*]X<иI LcS̹v@di7mBL[i̇رu<@kZ:邨'ފ"fP=tޭr,#4(p7[߆&~b1f!]޶fzeZ6urnhRBn8Ms+P1k٤$3Q-iϙ82bL t?By\R;B#}v7u nYQe6z0Ј[kʜjoͻXrW͹\ ,k XeI{^!'0cU\|li 9p&aPA̸!Cnҝ+sB&YTKD y@/帑Q#$…l.5ߍUJsF?+mǡm;"a9NEG݈E,̐眇kygѴO+D&^ݿcvC" Pyl" E?1w7{=,Q{+on46GeRw7.:uL룭g/ɳt7Ԟ~==ݣ y'.:.@O(nJ'I?E8QN}D#a탚68:_bk?`wZeEMj$=4:Rją(^>M7 .3>¦*bl2lĝL-(sm`is 
.n!V$}KR]'U)WJP|8+0-J6 $$9d/2!食OM)+> ΪTfJR߽BȾ>lɀYs|/Nvc~6ꀠ_ i7ѣ,ZWYPNDk6g2>%Fls7C7OΰSHcuם@6^{] 4:؃.I=+{)z<{~UՖp&gzQ_+1?O~d'j*]>}.anİxQƈ|uX#@Kzs%FKӓ"4C1[R6󲟗T)QX2-=!:3UqgENA Kʵ+=P%Іпg$4B70ĤRƫrAp}3@2C{}@@TO~4 @;}h.PL)ldg1yB~04p exQl͙p p.0h5 J$| m<[}NB\t"Ccn|`Am905vJ5 0׳o+R |@]3 ^neuໜS2I|^g*N7#Q]u WD{LE Lv[t| E,!v,7'"EUPFehFhavs3qe?Rjh8AdH-ԵvgqLjB]TUI^.xp6R̡݁= RQ;2R^DHR·XDW*w8j}pU+rސCh䷿?S Fg5\Rջwc>_>{~xqV%FAeON>$U(i tv?:QeI('_\=k?_wPk5+-k*f[n69ibd#0Ӵ7Es.c ##nϊÈP1d[:Ytf-jF~%WJe7|ΖԱL_Wyɵ%Ƒuw&Hܲz|o_7Țs+X1䓆-柱*: : kIqtYY:@\Cn>7§&:.W޸.tߗ*Ưͯvmg]F2XY{z|uEov̒Uf*dY,Ye2KV%̒Uf*TQYrDԲ\F:а=dzv9+-6jcGKiCj_?}TP= }S?R? _ϴAMOAm]v+UwҔKa9ۣnr,VDW8堅Š_AY;'z75Gbh>IYcؖwwZAXEB3Ϯ9[:V DF"cNӄ\(5mntɓ3NմLeт WHL sPq?p*SuΧ=EC t:H%)ү|Qtp2hՃ㤲,S4R 3s9"GEmGVGd:tSꚶ #M H: ɍN6:B&uq ӧcxUrYK*[/M/Xq\LwSrc;eoթrMFMg7Y݀oZk͚Yjwn/`ݦMc@sd.׉FI;!xϴ>>'z,hZtдW&UlX(7"7{i-j&X1.T!E7Z*u׮MEN, :+Εo/ \w^+t%Do%yfJWJK%{RFU,3-1Ǡ'nd+ فKҖ! #J '%g grv[ڞ6[q !&F0801pP6JÇϢT'dkV+3>]VskxizAz{H?m0Zg%K~n.)lAPC5])Xs8s.QZ!ɈǶL>ͩJ semہNN"ŘW̙HHW,\@Э h6Uh 4Gm&!ȁ ħeföjpS6KM˃ߔ -wr&bހ p1ijQA!MN`w/к ~!59m~zS;oXipbqC ?q*2}cc2 {ړcުqNָ<q[Z~SmI؅gzNNGJϖֈ(Xɝ *L7ӹka:vS0#-;cQ8`)A EYd |#+xW0o(qrPP.vB)uJ[2&|L*(ləoRױ}{v$+T7ȭpw1j [֟:Zt#GF6m9$|Ħ 13s}Ҳofo7/T&|w[I7nBdIs] )ə'X8F^'@ ]E6$?%{KWsCBqcO#ǼТh2 0.G%ۘ{ȁmJomG8uJJ"g,fͬ?BY+X@ϵ= %2fdSX@p8V߸67gL aQ*S< v,w)OOkG)U& ;3^vӴ zYB au7 f9r{9hMaXτ&Lh|D2QD%a.f@[oژ#X pW̖ Mߑ&zb?H޶۝Qmc)a ))tB6 ?iJZNKS+.#řFZRc C!N-;%mRd,8h#SF%>lsn2 K |C.F]|ʭ#K}ĜSq~si#RZe-7әHƜ1tq4՛jlLuTNSi{!L͊Agdh%|=YO)ۏqԧfτt.? 
:~!0_ RHތzZ,IB#s 80(xeRw2/}_u"+1Qy?S1HTc411 s[ <p٥R 69k5Ic&]jqLFȍ&vSy'ˡWْk~WJvWޕx`>3mD큮IܽlQn/is X"@./`ϧ @k֨n{XIyU#5%[=7>ysTCk-C~tj{_=qSp+78zxE/vSdQ\x)p Jr?%8k*ng[A6*|dنZojoC5(>"EV{U1Ȓ<͇7{]-^Ϋ84AvFʅ%^jʛo|.;^.Pn==r*вijc`CؿfZMǵhRW󁲇z)Ʊ24 "iɟBV 5'҉AǭbɷXv[*kk|_U !9sVOuHPBd 6*I$npa/:r׆|qI[}7;ʻ/e ǣr1u|fC!m9GK$EMDnd1at{B88, wZ0)e= )D%T{ 6غDmTaL, <c8(8#&IEXPH.8yQŢ,YEϩi vLN;k;SYt&3s*ǻx70b/vDzY-`t< $Gc& A탿\}D ϼ|Ͳ~˼YV=w֣|󗀛#_|)4E,w>w篧:L{NĈv!x0Ne"4g4Z"K [|#>w]?rn4jm>ʤ]MϦĤ2}yKdlo^$kp$Pl bT+1*%(G3,'@ AE~˷%0[讥]f-hqS!{&5m-O//+=ps`c-zž50ʼnboW-o *;YԣDj߰ypy {}qFMa{Iܖ |-vecC*B{_E@7 b,L TChǺ@=d9L0n-3GRiYNe::66=戕&&!Oc{1}ͲGNPiMXANY MEwp21eEi"NGctESƭ_|eV8pOsMˢ-ڸp(67缙w+2L?OZJ*jMhX2 n)O6Vw,l((9m`J(~Qx`T0taUgؾwO^% rd`9) Sop=!X1g4'ҵccѻG[כ\_pteg"NG&YJ; B/C!oE;i;e8ȶԧ?'tzNz Pdo#T]oF*~._"hhXb#VG#$ZnҞ-QR%m^RQdMj-j):K1(@$VGd.'#_+TNo>.{ٸ JJ4*p@F P*?VFG$Tu0;0xzq*#Gˌ"5T51a+Jijp!xlsK0$W&as .|0g'!uAkA"L0Ѻn $!h@hcn_p=`g"FfrZ#PƄps;&q"Os):URaTķMhZm6Yb^|ָ#_{ほÀZ0'$o()Wy oa7 @W?Xӆ I5ȣ]c `jd3Y~ HT%X*ٶ1}4z~o~h`\\ȈD1 {n~mn^/jY~XJw`{DlFzWT_EU/jyKw7IY_~7&W@ۭ*^@fjSö9/o_\ѣ^Lq鼬NחH y\teZHka '=6 gP"nNE=hU\C:~tJ+atL`RcX3<[\[j:>bj*"6FB) kL9sx1& .(CR08V阸fE;PH9dcv 3p}a_m#ae͒Ùf>"ϖ]/ҷͰ.hx b'2 6F5 &c>awh. 
?)akC])$oS%#UJ0JSU2Ԅ[ƒH!J&2(.A|{,?V{';l̪Y?6R)wRX% ZI ku4wdWE9y7h:hH10^PaT"x%^̈%#Ƞzf6I ;ڱ $ _lV|֦Ѥ6CQqBrmQ@أ4&LM~|Эl1CoЅ's,5[`?T`V=NBq*zAp~u> S9b5IlZ$0NX3ͪ#PUf4y*4lV˸(+OmLqa.*7P9>]$cù(~2)`e5V@wS b/.A3-xwdE[GˁQ7Xqt6b#P+wWkW w0בؼ)dQ"?mjЬ3վ\^YLXZ5Ը`~Kn!kshP#(l}F-cvZ3@?n@O 3 p-T`1(:)乒 mo4hT1tc_ʀ'PYژ?٣ QGLxemoaRY#W1DFGY&bUr:C:iRVJXU%/f~!t ` Kfq: T>Lv1Rk@z6~{!| ʭ#Ć>;j#;v$V}S÷ZO\smEY!C{j ԧgτhtL:!nli2r7c^V=JKǜڂ)L5 ^c3AWr=<KmR$vㄹsp M8g)5rTVڰ^EvMUn'dhb95pR N%ʹd2۸ߡQw"^<؃όx~Fwz:[T}mK<ȽC+*VU489$z )ýD&Aj5[sRt@_ͪCh%j{Aw$aZk"I|?.4{K7}&:n m:<ȵYHs9eӮ~ 6ۜ4ϕA理eU5=pPw* qjЎ> ; }ԨĊrND:xRYE{,bftGj tzzQ~:0a9G&f8!qBumT#bV+EV=4uw/ǀF9=2u/16:_I/0;4j?r-pD~B0,#񌹌bdNkʈ0G|3⌢ 1ҥEꯥj&_l_9wmSY0m< wui䗽7|3Vrg)7X j02 "TJJ8%R``\j) Ɩr7+- /ُ]Mrˋ䘽j:"[F/}+|>rdN̕` 歑JBwP|c,o),=*xS: ԃLjns3 p}~~%Y2t""AqJ='-wk |;v;!HE]9P B!LC(0n.'nE!bmf!Qy) 9FAt>-J+Myl/s3,S4f=2UPnhCE73&P~(/Zqx:-i⤄ ?A'܈[0%g/h:c,B'%9)=|HU|bp1秃2NpXkTq┉8SI*|BLQsfEb^gt2Z ѻ'P0%H; h$#&ZLTskRGL3nqr~8讯…O4#ZAxkGMgÃ(W^oכy?;S']<|:G8L apO9ٹ-zf,+ݜp/,bMv$ ܾbMķ$4WwK=P!*D=ta-~uY^@?:ϲ7ﻳNˣ ğAӡ?R|K<|bǡz @ZXUw~u\J$ Ɓ)LxYov ͉<"Є³k4QU`J0V0hx\7Vh`7gAzuo {؈/j׹ً ׾.D2+YuእQW"]]yZ TWTx+XQWˢ<1噪+@I_"8++XQWˢ-~0QRW/A]Uov=˛ꪠ%pGN].Vihxtd?6-@{~)!ҜcB)7jRqI1,2 Ƹ%kHHOfd}sMJ+nu'aV_@sʝf~ |⨡>z}] k݃j~|ݯTvkwu*ees3L#P[& :gv*W7~(m읟G9]tOG@v;w߁GV.zx'+% ?g } 񱄯~e:oBqjy\?kW{^`ӯ?at8<3ݼ]ςa` E:u8~4ieg5+ dz5ylAv9bhPäGIxRdu4 .mA)w`]_# c@|{*xDdA$wVF쿲d:,ŔA{dĬ4o&53gVQ,BM{7~s]yxjvZM 𨚃/rPVն隤 >5/؜@^}~Y y{ vi;y>W(+hmFUTִ 9H:6=|N;0-:c^~1O~4r>?OB *TIݑ\VbFgsگ+XtDžؑ|߯<Xqv75,rf_,}_1X?L_90EL13(e;%LRV"܁0!r Qc,@oapXBQKzGOn zv3ܦX9w7Bd}d+o~mv8dZ n7:=!~ kݩ̙>(5; #ʕ(5/@hVtgPzD>h28OuBkR9v{Z;N;Dr :u$\Kg n8MO( Ny?Wʼrݱ?3no-ZY3.ݤ`Rg(B@4y&K-aLS|k?:) $e#<2O N Ҵ5>NTGu%\ݕc;̲Y':'e$Nųji)0Yu|~c8{;jsסcO+v}~n/v:$93XKb1i 1Y[1 Ƙemec7O Jw穄2c0B7t甽`9Dۈf0r'ec';a-uŽhz6|_ G5ylʢ6^9cLµrƑw' K9c(Hp1OMalUA T1 bk6lop_kރ~Ԇj\vC[i%pc>kz~Uf Dtz%,X/Ew7 e Fa"JjMD1dYab7e͘4ڈy* 6G v@? 
Ο-#l(L=̄,_7=|FƅD}F3Cy$~e<9~%K>ŬL2n-]N}q; ''G&e"J2) &.)yi:ќ6:4T%   p&ɣe⹇?yŸ^`&/U]`\E岨+@&z+5pynQߴӤs9Y7p]H iv"b9F k4vjׂqZ;c9f\PY/QF'q5A^&c^~D/?\ӌ:yit2;jO7$=;ע= N =~ycuyc/!i76}}xNdZ-w/E4ݦspÇ3@xANe*s4Yg.?ͬE.6XuU]8vq7+.F{xj YW.`F<tlUilܻpԛKսQ]~[;(.aXm{&y{zrX'[ &YLq$`:9Z͒_X P qV:vU$Na38'H믷KK*p$s*Ih*5eDdhA #(JH!}/Q (s3J0\IIjMh̘$Cieib%FSYȄyPny9Wk{{z Ё~QpƄF62TYXLih@,oePY(i$ȻB8Xǀe2tBRReH ,3bP_vNAYE3faRhJ|V0}RÌa1-22doq(~<\0 X:1+'V1':e-0IֽpzRWTKq!ML$3|J* 0(+- $i}krҘyV{X.EL0 P4 YfDf2P,piދ ŒeT4e81ũw *)᠆)аR& a.dacLO[6N n7Fx |kƛo\4w[{gf *` @m8,GUbJbգ [D1$5Hi80,|ɂO`ve 5I(Qb/MD~[~Q>:٪T$L,3ж0g A"pE57Du)8+D[ҹPon=9׷eP#17U˂vXo5jw{vs2tb*ysL[dQ Y` 9RᡖF̀_x%A^?ypA=e p|*! SPs9*rSP<{W-ω4N0ɵ{\ neٻ8rWEw,Y$nH<$/)."wϩ,knI#W f3},:jDݕ`:d  'R ?^3mcB66Xۺ[7;/~+t dlODm"wN&~elX~}W3/xhGu&e %o`=ё}wc^7NI 8  {7<_>,/hy渘y /fyЗ)JWHWej?hiAtAh1t/o~h:])VzteH +N6, K+Eӕ vWHWZ܂J'S9tCVztc +fr̠M1@e\5ҕlZ])`YTˡ+:**xԕ^TMv)tRDCyO`tz*z1VDW ŗh1S튖 *Wzt%ܢJY ])ܸRYU V8U b *Z> r]+ĮoL4/K/d.T~h M90{Е]]OC^])CW WR hqNWdWHW6F `obJO&RBNW@y<+]r1(qp2\Y ]mxe<]t5!.r̠¥Еuo%JWt{E8^F4wb'>6|w1W;.hSƢOW1k.g:Ydl-{ӯw}JqZ.>~O}Ȍs _ƱUo#[Qwj 9HJM:bderVwJU7Nl0R|og]F$R\Ҵ|^p奟ݲ`e +]B@-pXR.C+EJW$<@g+.hɓu)h}8tRa]*~IfP/hJŬZg PU~Y^ z1xrzW &r˼o|a~,06_n?f|3?}F'~!Np9?OoN.? |7m}pr檷yo~e.6'\L՘D#]Fє- \BwL>Y|zlɅI3-HhB9qXKi6gÁxDB=[Tc|yP[؇i6Qj1"X"6S([5gFIsV{C/*:#s'hčFy\NXlD}^}81X &rw@& Ĺ3R˜Dh9{n <3qX 9| mJcEĕF-9[ 7@x$ kvo~='M\2vXc4#ތcM!:͖J*0%?!R;:ʻƐ8j1g|5~k C)(oHЈO&3I,_o/Osd#Qr%[Õ(SAd*)S g >r=tޜ@:*E`[=LG*z Č@r#QPc-%k"?к9;rw'_dqu$~b仌) ۘ`M(DmMs/*uS$J-Ɠ`1(5/VW6-f6dk Ĝ@#[d0rsv`K "rk ],)";D{ >b.@N`dMȗ*)9$S cAn1x:{4 :uh ~Cka 4 o ho9g ̥7XM HQ 2BckB-S`1wuV6PŅeQ}"О*C/. C@pbIG$yc6 6J1Z^1ԠmETc-9r 8lk)&sh'7g\GZD*1+wOl$`6p  MA#Ҝ`AV#4\"84 cymF2KN ͫ]O J+9O;PcfH57.4 elk&P ! AHXP%DeDS<T*"uZ SB0W:Cw 4tlpBrr|3(rTvND0B|>ۀyzzۯ6:ɗ'#^~w`*F}Qw L0͐HB{T^'ydDHC6u 2j i<t8 G/,,tA\ ) >@(&912eѱ˽#`4ӥiGt`sD!Ysu6<o 1vX8Pt*'*Ɲ7m l;dzeA4D4X ݏ7)^WƝ OnR*UXOXk3tdXwPsI~EȋY}L!5.x WLE:38Ž5z1 jki, mM1g8Z.fcT;6@'^oBR}@Nemg$9^Zbѝ $2БwX]lc:LL(- A58"6x,=ʬ2L0>uB6 DMqPkV'h?tt֖Esv4DFv@ pf5am@z뭙+ ."1 VoV! 
q$ໃ> g4Kp07|kjhCvv!fwqnPgVG7PM< @`3=Eo8Cӿ{BwgSGE}Gj^kDv s )kn b Pޞfv7|}Ӭ2#lS#9x7 JxKd€pT9P.ϰ #P{fr5w*2T2,0RF K{XR<͛ {eP=XA|⟿bDqSj>[H&c|GʟuWbE7tEPXc$t\:{(5#Ogus[q uM)6F?&x hN}.*Fn1<n[Yi@W!IBL@Z RB]#Mlp\2'ѬӢ\te(u4qCOZw\ Näiы0kKu2x7]*I!b!XL:I}RAp$4ubՌH}Wa}WCq% Ƹ􏍷Kt!絸b{p6 bg+qІwK~9=_>Bi ѦmJ,F ?^蝮WFR,;4ۣE5{yq񷣛Kz5bsdӛ>@^_;~~{|L61GoD~?‹?oj'G>\ Er}I:Sv1eЊ6~PZ* u &:kZꬅ:kZꬅ:kZꬅ:kZꬅ:kZꬅ:kZꬅ:kZꬅ:kZꬅ:kZꬅ:kZꬅ:kZꬅ:k D?O/~ЕV{\eֆw |u7//_%$HA"AjtM|{}޵6nc"(_"`pbzල>@EQGv-;3J۲XIt3DSywxȠAt! 8έg\ٙ[<=:"bN:[Ϩ\5z ?`wo.kL I?_)yruyK[\ӻT|kހga]"S`/If">3"bYDQ !4eOAqKo΂﷞]Aցsj۳GLQi lBT4D!* QiJCT4D!* QiJCT4D!* QiJCT4D!* QiJCT4D!* QiJCT4D!*hQiK%bvPKh"B)?}/}X-KJڙ6.q:SN#NkA^報{>ȺP&hܺ1x c3^:3k:$ 3hYZ Z(Xi Ci?J IV4~СOw#6ak3'tүq;f~} aYjD :䥈p{])\iyg3$t4(>^.dXe*^3,XO_&fuـ[b߸?_`nI`OK|` @1A&<M?NΥ&|4z挸gߪ4xT0+6q=A`i˚Ar][_:)8!(k[,XPϚgv: V9,h4+&y@ݹ_p4m-S?,3TXX:v3 !cJ OABK BdMFh2BdMFh2BdMFh2BdMFh2BdMFh2BdG Mۈʍ/)M ?"KC`$>AeO2(~ෑf.ޯq!;р+h@G?p"-vF ,;TUݩjq5ZkBUrSCeD,#bˈXF2"eD,#bˈXF2"eD,#bˈXF2"eD,#bˈXF2"eD,#bˈXF2"eD,#bˈX>z;wI?g>t^绯\ZTiA ;X[+;PL58%&!\?pxi(I :4c?%o'F`?&`_(W9%݄>h.3j:Ј[0v `t8xFHL)XP!*5CA\DEeL66_X_7 +@h >k!ַW0 W7Aa-x'|hC=?hXˏud DI,xV@P)gi1FSE{ԭx=TKƽr6 b$ꥅu8F&.0B43.8R2V5'&%8?bda>h4#HLKdJAbʬ&2$6#̈8gI&`C#)S +HM厶aĘf *Ł&Aߥ(brV$#0s1`G̕RM%*ĦJĈ\-5av\E>8!M _ = _ǽ ¢wͳ k3y-֭z~8T5˝H迈X7ݗC_U*ϞW[ۇ>4~Nb%"1 XpP'A?>b 6+{J08Lzޢ~#n2ۇr`'̍hoY{Cc7/z[d},"_f"f NDDE|_c%) wϋ1ؔsan#g88+c%T!m$LRTMa=^O^(]>.Wߪξx֔³ДzlQN-V X{[?C4V*DW׽_k_ ~ږNa5 ?|ec6=[݀GքK70LvLYVfMVdVC?2pa{:D:}5ȴ$ց+m_J4ʩ4s8CKzmy=(u%֞/8U:-{wO5LB%IQ P}` ʝN^]]UsyY/08ֽS60av\q|u7/^n̶@mvE'rF}k阙jgsoJ{ zUU עCjc|7&t\E=ef[lcuRKRnb) YpbCCx'aYMDBzu Qc*V!UDIt(?F 5v&j\Fm Qcհ;[e4tsm솅?\ܝIݍ,y>n:f}N+tg^^}7= (uGau%CDyfUz_q_!4tg!0'i"B7U]XU!u+4.Rh,6/)K978蔧d:ˈ?PĂi$R ۹Pu^]]:\Ӹ\~/1o}Sum鵊? X} 9چB\F&}&Q*<a ]lUb؉sU|(xHqFG~SBw6>4,sןy':-qxCmB;݊Rٞn;XI޺ۗEYf=c$K \u1&#:,C)K!wdU)zB. 
Q vk N(fӸ6Wa,  1S"_:Sn'HGi5[n-I,Tŷ> ~-dBAhs D\UQDե6W_ίTVħe'rt8H%h҄$&;B(M}E=dh6|ɅiF/r_Dxt_j(4¤ǂ wg!~領LtlN$7WBzbw*omwt][ eB&r+i޸>K84#HިȀ̻{[:}j{μ͎߽dlBZ5]yxwu=/`Kӷ۳cއAWn0PtGǩqUEp-zKMkN?mon}h3o#7b`0_0x|1@|s N%qV}ݥmDrN]*g]3ICl.#eb|΀ۺ<&tWRAѢ,pr /vP6Vu/|6Se~q ;=0C8Mܫ>ϷD/-4j]OG7x #ObL['2:C-'Ғ2$NDH)71܁{vשJJfi%IRi1vVY%d鄦ƥ*˸.I ! ~-vkOZjO|xCi垻}a͓_5M?cUjbWrjR$zbicJNn b]RG (_dA#4K̢eҜ}xP:P[ )ݚb׺rܶ2IM˛wy=h>Q뷥/7z Lj67w(/*x >4TW7Nڮܾp y&bށ/g=zi:Wr<'߄I)GH!%^w ~v'[ BuX1Uxj/l&BD=gˑLG k=aj3jX$"﵌FMFSA}cm>~B7?^l3M+~ۻ {)^*$]160?U D!ۦv}m{[^yr>dw{ws!z@V5?p}Idrb^jλ?0˒y;`Srp dYT_rWāoy D>-PyCHKDxq=;@Ѥ@O#RU kRRgL>B"I PEI !@ar"!'Fݍe'J$LVފ&,j독קsVzyC&w"MgɋINNT>vW80$5AZX,@_vGEgO> D9UaaR"XNTT㓎'L9Ki|U(" 8#88>jK ^Ȍ ^`A ƜߊYu([#n\dfC{To6LcNN]Mټ} 3>YF&jVv?2I͉{:W6kfE=;NzEcofu`eZIMQǁڐgģu]X'!@'}=}#7_CcrUiq88i!7uOXˤ 柽u-M5ks rT' ,Ig.I\gHQT-^)G~e5vX!*`BG Q*Ҙ|إ.`mO.{| 1V%zav:y ;r'3t4D{Ql=N;[B7zf!% 3Bs4SWec&ÚݍaJ_toO^U7LU";Ʀcfho~ggOw v;XdBc:Xx4|2'Ԯz:8!^ؠLaL )&BnM~t(x2OA Q{ZIcR;ĬW逰  W?aoPǡ@*]Soy#h&>ӝ!kgJbk(v'{%;+20:i2| bK|ȺdY\u431h۽Q.yv#"QfDdbcB)7j<۴zȰ, Ƹ#aREb?D hl`1=qZFdb?ֱL&Ú-cK8_h Xm]t,n^_06hw Kԑ +$aB&99wCѷՐ6Is K\ND`aEFw0i8Ϗn4󅹟,wMH nz~zK~y0ǯ?/x~LZaYjPX9z߿?އUzԲf eUElO7KF'ǝ ;f7ۋLx2 -f;73盙剐Qqwfl6V 0I Da0=aV|q8Zڂ- m|,d^GU!5>]r`ұ8muv Dd@hm.Z||h[B!rF Rb Sg,ԙSs#흌>uѓ.@g'$̐f}͞+x@-RIڲg7箐o<g'b|O !+! 
u]}ߴO:|`z- Y (9~?R%,, lWPJs)5oP:=W `)"ctq k>]4׋ ֔0E_%P0 5:mqk83˥CR*EB\wy2 g `gw55I}Ʀm۪~\]U`S6[0skv\iue@iSו jE[ނ#mL{XӋX tJ,F~l'&rE'mwiF+@ ֹrZ}kLTP>Ť<7ПEi>1}>=~b>PT}v Q fFb"Q MNB9D 1 14Gs>|4Gνbtb 㣆lG{Fݸh:/{R5SlXmTv b3gFӎí伳ugtbf6h a'm99pg k,ZzgT&80+zG)ypw .ER97tՌ_IM헿58bBYncp|ܙ|$c)$vrl+ٞƙh@`NPg"R\ٕ&IK>IR~d 3!w;㮒3*IK$%]Aw8ҨK* ,U+Jq9ww]Ew)cAlUƽ_U Z߁I]5GNgyR_;+CB`L+ *:Tq㜶Di"58D| W=KJHesq$x 6)A2)DN#JIPnjShe\/se2_&T\!se/se2_\/se 3_\/se\ݥC 03EpI\!zEpIJBr[,K_d׿˭&>6 "gKPPB`D$4cn}_9s &ݽϽtw,!f7ndJS`$ HX$Y<`ѧ4ьv8-ﻧ8]XfYǯ&SDc`K1N`,Dd%0le_X}N`O,Tm\& i925Vxi¸R3$: &c>M:֮ ,r*baYҥ ӽvni!7%'FZZѷao)XQ-mV87۽g\%λ*(1ːE4i"_Yj?⼝V2y{ \; j&J~o7og㖅 iFc~Ձ%7iQ4h M[PV8dVcoM:L`5R1G 1֑aYqGkZ јK&ewۺn;W(VU+%PU78}:T>dL׎4cL ='a!i!3{ c=vn%NTh`k.PHx3jg>OLjSR:†gTBT^nV㚖 {0b@Kû?ϒn0bKFTڂWbjRe[u"1~+%4]ۼPpen /gF>zRc6(S*)/bXom${lRlJa^iGjd~s6MA `c.*5X@U6Cn 3$$/ WvlhЕlTou +2k}Xhu^trqL|9 .f&}v]g=$DrC߉e=\ѓ"0QD=Ltد%xG[-b? 5w**9k#?nowoxRu!Ni>5[cKJŮnop%Ѓ+0RpHY QZ͸JurR4{t>DaI AHRVa S띱Vc&y2豉hj4BZ"+˜QGfӇ*_זDSaac3M+nUtRifYt8/_PM7VgBJuzr;+K;x;VNuf{b/SתȎm2.tqp鐥)I xt{xi +w+ؿqջF}nI޺/GnufCj=[6-:60mzL+eM&M柤G<9.L;Be:k\{|Y&Eyxۑ;#)Ȅh4g%߀)qr!B^n_\)\yῃL}Zp0hX&O&> WjqpdkĎ?'׿*w O*rR_6JR*̀ Dad-9JTlm8OWE~?|ldB k!eA$A(kGoo['"/\5½e)AIN aRJ/;TGi/<鸬e`fT&3@ۍī+g@t1tz!x !3ŕuQ+ASVLXy6@Y4&v e}Y;ꁓgH25_X[1tXL qZ#JCQ8qN[4!eq3n#G!crQeqc*h%{gqoR0#4'(xAVsH`^%[fhaN%z Uwwܴ[ϺnnO:6D٣WMBZmTl~SxAg[Lu,Y61mRP~be]&iP77e}sGLaXQG Fi\!14!e#?@1VFg ks6?wsuۧͯx8VZcΞ)iwKz*zT ZNڂuZh>DfӎxĒ פ j:Rj jE5;˜0ǀ9xyeA^2&Dh3Ll3jBX`UaDbԗ4ZN==߉I\1fWiDE\r|*R,ޡB{$`ވ4\"h:uqF)yWP\10&=WiF\rqF+NL\݈+. FG* ]dnHާ92(e reQ?ppW O0'GY( reQ DʶzW@hN=W/>A:XT0{5QvsCS81@lRkzQjae[[ge0#c$$CBS&9zKiLpC.NS"ap) ,'2LoBeZF 0 EL QPFQ1-R`,3R'-"J5EL~lj1iy5V)' N)4`J"Vs 8b) |6g  3C?UZ&op\&Y8((r|Q@8L>Dzԭzv67JKٰWMK oA8[?CȆ?E ~\z);O|D65I~T4IA'ISͪ#PEn8{H,L^0LVa!i=ظqשQ@q/Sɴq<˦ϛ_d/_T`[qD0[tv :bLY;z8+gSnHͽJ`0jtb fK Upu4Kb0U_σ>xYS[|nM%բ}k>..vlT&:UCM[$ &b߿vW V5+2i7|7#`LßwӋ+ RhJ`c F?ړPXU;K?qvwO 5&";<-V@s }kLTP>Ť6Пbhڞ@I4|*"be$*"51߄!ūcA9SU~ n~*&0>:߶oV! kEWK'-6snoL:?J;;8'?Zښ^-]G(B! 
B`D$49%Q+d3!Ptr.f҇`E,A =%}OL8aIpоY,-,xGI4 Gr 4S]ߪEv7SPRY܏+бPkrD浫;?EY iE6Yd=VfQް53d-0@cHjNE,}ܲ0#`!0h̯:9- -A)\EiCl5jܤZ]̤V#JqTQQjgZ{_i~4Y]L02`WYj˒_eY%,-"Ӯ=+JsqٵW]լXTb(U0xa2Gi$$+d&ֻL^SK*KcIf&RRXv̋ńf#O5mFGUlX})O4Ȳo +ٺ겮/84}SnGZv >(D2 ,.Lu43eusڮ0b#Q]#u,cDhD##J0rXZUpFKD>L5V wƪ>m}lE62yeu~g2&åoGQ1V]sYe}cm9Xc~Hsk>ypC{pa0}fG\i%;C3z>QMwz؂$aQBJI _+C-yϟu#iM@ô{ܳ]W'_Fuy-ҺϫsY{߹֝>}g=)q:skxSkIS#hyGn z}.pq+@},b- T4c8KKB9UaaR"[<wfK*@^H;d}i-B)_RLҗoWeɌ ^`A Ɯ2|+f^ơl]{Zף-mnrrC +0pP2w*Z2#*U JN0+D{YFa. ӁiQP"'U{fȣ; iU`YT͎ D9w.!c}=rQDiLd6'XwdcZadQ(֥zH du&̠r0IA6llhliҒ700%1ZĿq_"qwƎUP ] +6re%/8G`ؒR {+ye%+[]yU;2$I3iܰ+kZt6'Cx(x:(^%緐*t;#z!UxBI>2iM%&ZH0<)cJ[^IkedΗvo\zekacdoӻ]4evI9u Ʊ=Ⱥ\n?|WE{_Av[B2iڎN,8e<];[̆ tYg bW'6lG6lOiKwjc؁9P m} [m8EYrkS^b=ۅ}uX؟\^&-~y.;d:[1s!-v-lb챟hYL3EawE~89,;J` P'Y}T}-jn9m'(JEUy팷F.G.p£r9hwU8w:|Mž{[ȱBHWhK̜)k2GNYU0BJQoK(GaȻ=NYwߛ~7",/z%QH+C:ˆʡU:K9mDjq2gڶמ+ +b[E-m=3{9A ZnUa'N7PxVҮio?dR+5mzX΃;*GBфsF[r϶hʲrr$BO$׏#bRru=E\oa *Sc F).L 9w *3Q|m4CjwՒ5:OO?= v.k->Ʀfx|󅹝Y@l_!ӘVYEGd qgѥ~^Qe0XJBҡVm$lQY\SnW.ٞ~7l/i JNjT27VGTnDuj@h oV )A+0f"p M"ZX*dTd:CҊ!8d 2 ]ZNWRfz?t%lzqZL(91]ḜCi0Bvt%2]\*X%CWb%w`ΐ& K ]\R+@I*Be:C +,y2tpU2*t(1tutŴ&DWd dTysă3x; )Wo pΌbGގENGD_{Qxrgs 6bajq#9 ]1vt%3]1z UJ.=3xJN2]!]9 f,BWV( k]Ed :ahE PJ $;0:gRVauLWCWPRRW:y ]ZNWk2]R4w++˒j+kt3]$D2ӓ: n:sWw^]Jϑ% t6od8;HWq4%t6=ֈ0]`M" ]Z,NWLWgHWu4]A4]:L*th:]JA2]!]Q- ,JBW-ƨt(C2]]q0Oi*&X&CW& Zyg:C LN"`vm=uЊOJ3]!]I5O \MS+:UPbJsYhz~w<z?~0A}Lƙݨh" yt<,&C3 dK!sEoՇ_^Do|ezE4jLMC9 n6OCQNn&~]\ozc%~ZB܄t…UCo[kecw**ˋH&4> /9-x(q=>и;nޘQ,PMJDw; C՗Lÿ>qfW}NC9?ynMekr՟~SA^}/c_ƿB+X[~(fhoKRg7Lqk{]zj^lfj^<F߻v{mHOB27 ͘׫ԁrPzתZghp,b0|i$bSEu.nRܱ T~ǀ1;ĬW逰%AF`g)OcTkn<5O?^@`q|@M>< y*Ler0Tќ4nÍ~hDpۡ? 
Ǯ_aԛt[|a ѵ=tgz;EEl0>4{#Ļ]c=$.Usi6w-q hOl+O2xx{o/f_erygqj_jS0`[i[_*6z_Fy̦zqjp| 6vE /Wj+6zKփm̱i `dsmwbvf8!5x|ݫ\,WDb%V(0C.,>~bFlR}D^%1=6Q" {37=NZGdEgoRmy&5Om]jMOЫއ&0 b"dՓUϳge&s*9[Ҭx(zV5c1UӶhXA·Hl=BS8!,<%ِF4ň+X+I@M Br[j/㡋sA;1@%Ƿ=65" J]tͭkoy>u=cxF%{ Ώh8הKc c[gQZI8: v5ZiZıX:@ P &3jc"2 ʐ ǾpV?%A e4qv3pyan_WHh/hmvx;\ ?kϷwqooxuQ,K=[.?Tn9 * (<8XBBd:Pn,t6g<H_u|vMy+7#y *RT*URp˓6\x Q#DD%ʁG1cAP@Wq4Y.S︘U4BHE饰J2A: 32plE?׊C])HN uJ)N-qB&8)̨! C$ah4!-#5HuVMkK0l@* qQH!xUkp l oq.^Lk؈\ -,ЎkdYD" ]_'L -Lb/IjmQ@&L ?֔ݎFa I>t\ ]>'R 7Q@ޒ ̊AĻ"olbF$'s?3?'R ,6t%I~ \qX0:$b8kVϕ rE y*4lV8 //cw2 :orMF.;z8"Ef +ҁO]>A%0LG d>#̟$]|2?={Zh90j1F̜P+Wk w]Z)㠨@wٲ@^L>#[EbҾHIfڛdڡUEM ,cU>$ֿ4wgGrU2P^l\v)۷#@Ϸ 5 z9;y2!:sq|be)0en 9=Rc{}H=O|N^b1Jibb 悷y$tX &g F8V%꜀gq@7X#J]/lC$k2>.H-DK=; iE`Zs"N)}t\+Gnj@R/N,@DӢ[H^'3Z?z/n)YV)@!1B^ q4tJܪ$"MkC-w]wfkbŪYXM&~ڛO~^o{;{@YsNs U(fQF;Ƀ&k(7|L+{@+e_%!N 0%PgQ3j/A IDGemXcnj&KF1I*‚4eFryͣ9A.jRU\ rZi䴓=0}"4bکT[W_02|µ+uwkh_st1K}{I !E83QL|"Lmo=OgjGlPopz/¦ ZCU+ ;hN,X[i,2`eNKȧuL˃'f6gҏnjEEbr&;mz2gS/ǧw#,e-ߴ|s|sluA 7eC_&nL nd]T=0y#-_/ rle̗cͺxajL0扂S I*;YnԣDjӰ MqFh27qrǴ˭۝n⦣W[鷺2"T?7E]t 3:9@ \EUdޡ`6{/U=ʛLPj;yo'wmn[V&%2f2Ǟ8Xthi*-<&IΔf.6 4ՠIq6d8i6u8=a)Z-{qe3QǒS0f !_BnW/|xd:oѵjp婓bӈ+ID YpIG* mxwƷ'hOO7"Ij)R'u$qT*%zLC[o=1$KM :wڑH\ Ot42Gָ-yVAq Oil8Dy4Gr>.8 7k!@.s6I|:*ў\@g5r%AL 'I1P3zv0,%]udF) 8Fw .y=L'.hh8Ϊ+&>ӟ6%oa ^̅qo?'Wytв;|~ϭwncr<ύԳU̓ahfDgy_^@BoP6q, SVD2Vk9Em\p-bjɜ'J2{hor :A$Z5!s4l.KsQq ygLX,UB{Fo:nc9"&6p֯kWu?h\Iw-[=}UKIE?eЬ>Mn V'ڭ`Nl>un4zoDY -7x8jF@;=ĞU[:6=([x"k i^ts3.ͽ#*{:ͮ[EFט4_so5jH\^m1_ou|D=E*L)"'FCXVQ ZFZ=G0-mv؋d(mZ5pw-2uuT.>C_*>'Ǘ igi%62nu4[mMr=)Z(.$mv3k9jT7дSwd)+yR1Z%p,j(j(9owp8ᗟ.0mhKu*2͂V}dz˳lR#-CQ$Towg/I86w~"FpR:3e"iMxH\K9 2I(oJGO>T8:b3mL)۬~ϙR8{˚vd>Cspk0ILˆ1 ,wJIj5ONB\"N?jS-T˝S-דT#vl98^osZuےK h  BG o2}u* Xԋ ?⸛~fLgiCZ_ϷgNIb}hۓՎG`dCکx tP,QpKjΖcmQkRaʦ5,b٪oG;sRob8>-jvV~k/`) o>"!$xkL¡D#EဌyPnEk4xF ӵ4"Y5 l5 YkXA~F&G}4Y.. uHDuα!gE8w'=vܴdHb&ޤ2т fbN;Ыv6nu(| YH+?RsF>n=a̯^YJ%TVCZ8Γ p#y"E%( 'A 쥙W:ThJ=>XX8DL!OD#˿Зr݇cvn1xdcuʴ(aS }^u7OؤHX|w#RFXp4V*l>}FՉWYJ3=k<]fzϽ?Zm-8|+-ĴF dxfͫ\λ \A:! 
+r&t>+ₙ01`fwMn5]K ja;Z<,H/]>T[qD0:O1u(ޝ#)7硫o%P0 5:mqk3˥CRUt.sLHzeŇ꿳5%Cg`J\(7YX 2nkV54W0g~ixe \U@逓וB ta$`#] oMrgR(Kf9f6b+֞,k_+T( J|b,0|qii :@c ǠO-w^ϷcTmS+QiO5T-U鞝"HlUD>jbISPBEChkOmĞZ`G]yP1і>eE͠؞;O{7cǺEua&9+n_%?}g=R!v"[ZT4G9;`kM;dlTئpރ&)RR'DpY,-,xGI4 Gbf)hL |Dm+Je*bbB̗ǺбƛMywM"o#[ Wz5}rmCvIsKVX~`9[}zYI~IC6]ߨg>8+:goɏ6ƎK&n OmPvv@fK<۱n7ZեjJlFlp@ ;MgT7.*E{$ncd)$a藺Ƴme6y-/7n:щ&Fu]doGׯJ^Vu~QyWZmVEb7ah M}mv[yU7yeZ<hTds8: :QWae9j ?&!FUp!gQ(gaXpѸ{[#nc@Aj 4 \ {'p s6əl>MӋ4b\,evA P(Uf< O'ma] D͌ΑtOy{H.I& ר]QQ7_> HKB9UaaR"嚀l=Iz4+V;U<}v['! b _~;1,0f*ٍ%)͕JF(eJH14N9ťoW12#XP1g&Aص%vMf6MVeiY.B|v_bҀ*â%TT'ZK(It yISt$LEt!"'U4`Al@e# \H☁D 3  D96d#Bt9-8V!!3uS/ N B  9g H\JbZiĴư= uy_MIJ s'܌tL,'WɭG+~ʡƯW+P^_]}nf ,sdr-->';׈f~V2y߻ qݼۿJ5. _J0}TWzz dEP+xmyF,B0#BXYLqZSFD$d$EV 1<e1t;v?|Ufen}ވu'(y9W{U찴Tb_Ob.h-OX-R)֦8Q=vݚӬ4ì$NZl*Qg?5|7RT8y24`}x}$80-& >f<}S~gu!_B!qR@ϻ97\+ASVLy6@Yo{Qsi*e :ˆq']YB15B/*x{쾱v:*i2SeLbZqr8 Pe`> f!so șMGy  yu9$!6:ʙ{3WVZtHASwMG;Jk$>ߘ6$o cj?LGgyp<;xko8gt(l@={RL?ڨ^o5嚙` ʖ5AmjDY* \rmP&qxV-{A1tcu1AUhb%lJa^iGjd^ wE% 6@UR0X)Iq,^83O4ǐMˆwiTħ]bO_L;ys"fsWgwtcm2OQlk=SRl-'q͘+}&`5}6 RHΈts0!A6̢UԿE})"-i}~*lK$&nˎa^w>7um˜} tS_xӘG"h@QAsglrD`pY2MRbbԖQiBsWַZhJ FZiMEgy:96y6T+.O53Y/| T?YSmLwY si0X㿂}:N6)_Y}V5yÏ6Ǝ vL۾&pvv.wo^6ۙ@v36Y0RUL ZyLK`n: VrvQig/j  *~U-YE]0>lPoF^Q  훬P\2h`+1g]v8*͟ U9cy lz3$|~Ȧ85ٗ級`w,ᏋXz%"Neׁ.Jb5dVcyGBY0't%>fFJcgf0˹M-3~S'D҄=rũ+-̒p\=rň +DN\򓑮l8vrf9%Wχ\qe*F0Ow=~볬(Wgf)mū136osF/_;LU3BlXf ۯoVTׂ/(5a҆WƊ=h"+;V{5SBw ڏ97_ et3XydV:X=^}( gXuSbK:jwEՆmYރ]t9 s@T@P;)xp 酶$-t0Y01%e5Oc>ah,J@9ɛdTH,y4UEAMI.|^ 8"Z`(;9$R~, -J NY sk# S*TH6!(Jjj{'h?eGG!n։MM)HN uJ)N-qB&8)̨! C$aht-BH0&=Č ^P$cʹ%wM*v`F,a4P1x: Tu5lDS0c3Y8_B,l0M i&GG6Y )Qagq: ߌ败Lƫ1Rk@yS[N~'nX)o&0>nw_7_܍ y߫\^kUo Պ3..s6L:l(7|@y8ji_+e >k| Wnٓm8z M\jMd73m5RJ96WM q;my+{]@l;d_o ; Z:bz$(@LYT쏋Sd.q`(n[j@ qr9 čNz鴚ci5/Y:oc&g΄ 1a=36UR$NdIr.<ŞO9? 
fvq{ƙEw龍uuPԕEɽ|\*>tHpb@k|ƑM:='qΔv(FGτV()5έ!9x)쀗1K7zNq ܎YekKe7nf~ޟJFjzǁzs۷Js ,(E(GQhSjžW7Xf >K=ץSޗWzXgL)rhmĔCd9PfFX0g6g^gա :^998^]jTgRS%uSaP9w$+wQ)+" 5 䢶I.Q1B;lWI˫3ZqԸ3EK|);CDƪK=%v ɛ{ґwŠW W2B?b0`#%\2|(9늑%c-58 W9Dt]b\ q )|~;];O'O]}D/֞=9-7*F~owqZt֒(ӢB[k_wNjs~QjaMkIhz(7j-IB#s6"8p)(xeR:W}rUESyBֳ(Ek18N :kMDB!׭d)5rT?VXt0a䱨۰\h'9]qwsA YCy;}fH]]WyףR.i m:bQ m]ۻν~K;>hu1w9tf3Yt-D6v>+~k-d4tOsnBp4d.?m%`fr4ӟ7mZ!sCDZuKM?fھnY;MHPk6 "ƃaĊrND:xRY|D `Fo& N W䞃vh>/ ϧsUbBN/To¼GY}肓O'Oʻꏷα2zMG4 L09RX6xL|2gzbd[=RڋZs8Z6^{-TH`c)qbD2K6Wjߋ긫!w-g9gC"eCq8_6@v06OgP:i80#U(fQF;Ƀ&k(7|L+{v@'e%!N 0%PgQ3j/AB{QY=t<;1 RqQpFL8M\p^($E=Yp봴cuq!99LѢYXWY`UrCp$%y8z? ٍL/@/(G0| Q Ք<s{%,3UܴhW=|(-tlۜvlx25z>mw_ZjVO}yhbq< $Gc& Aϝp2tH߶L ɹ4dd`gs `dvzC>zJw%^د|3`v۽_VOg]Fi"LNJt S;:$ʏe1*g6v]~ G7 %3kAѕGת/8i_nw3;:uǡȌ13œag,6tSՆɢtנ^$Rۆ}{lْm<]5zRtN=ଶ8\)Tn Vt1[f Tx{!T~#7JYvSAxsOU;mXF)x-:6V=fOiI&U",aڬy/vRMXAN:ZU\XF]iwp8mp҂m΄%o.Gb-gX9rqm4kG8?eμEWF3O#F\L"i͂K:2^&ewOg_5jD~X^pxW>YHXS>_fw9gYޮr|4)o3 5tʧe/'Jm橨tx8xUaƸh?ɹ *b*.u·@Ay8U`H!<4J;EYrgki5ۋMVPA_w_MOls`)?a'{ 0v[)O?q*q~4΀/I`d[@y8 >+p/TRgR'AH\t qhtYܶE^,wszI^q6TӼ{Gŧ/@DzSop󍚿d8-k~G}cAە*=~*`8'k|oB^&]e͑LhsvOvy9 J iֻR=*?^ac)5YBQJ*%̈́f $6l .\ \S7p`bA0C8#q,و<\E\*~*r1wWoG\ tզ{<{߷Ĺc60"MPRrշ?V3rǰ5 Rq xĝHAQ꒗6tx*#ӀjXVlSB^aG$s9Nƃԁi;`k2Q#5,MnE55 6UR }ϲc"`0iMmXS;c:OZ#Q!R ogV=N g՟ů9gZS.=a~7jan$Wws\-] FYޗ+_Sh!x$ȉ "v0WGNp8nFqD6J}bl "HI #l܌_F8;}UBZ8Γap-/;0{E\ KP$ bl$ E&j!]ꔳ9fwg`Uzk{Ir9[klB4XXYEiqa\JqFF@})d\[Ө,@dMRH(Łq ǙsgՎ1eTEtAwp̞3 ޙ׽w=y$}@7 ץhПlW44>?z wke9Zm"倨,vR@+c mIZ@aT`bJj0 9vO@Z+hf~2#y >QT4Uܫ4'm`#F˧NbpcMPDySG,ഘUY!nmRZ{RX% ZI [M`[`M ~( /T6A%D:y[OY!dafLF04ez!SaNiu\N0}.DJD hWAG3b 98^.vva#ocavBc&I} H߄:ɲPzUٝVIUqPտfg׽(ÿ=4n'w9%zm.t?_z?kY Q@ޒ ogw3͸ ~>lDr0~2s󻻼pj5Vׇ)az1aVUqj :" jGˁQ7Xqt60sByt||ZS=nv>fæ(z7yds ,[[} rnQ/ڏ-If:2amѪ,OMQXro+u>l(op*ľ_L=c|B3nA+ re*}`ֹcqz?lڳPXTs{oXf\GoxZY\q)&X1C+ >yi?ߨfohھ@,=ӟj8A>D5QC2F03Ӗ2Ɯ'KYvOvL]{+&0>?(!ky{+Umlp`:rÜ1+!KB@"DðzWoHFf_;P)JJI d1nSTFJ7XF\`"DY挖 '}:>XRxc{E6sҗ/76X&!)H`~ǫ`>nS9RTjP>oMx`Uhc"Q'o3Hh!|YìN*GegUe.0u۔]}ZkG?m|tͬI|7ﯺ>yW!uҟw=O>…%mm=ȍCF:a4tm:z ) 
Z5skԴ=,zy'njBK-QKwiy|[r3=r=|}?mhdv W踭E$pW/fDNϚn9&ԣ6/Z]]\s,q,},[64* a9tQ>+9#IQdw*&Y8I?m½/' ' @=+@ڽ,s~Xpy5Xu/+ogخӎY7Ȍ78ʚg9BLzp>PNjq W zMG4?h'>r$(LkN)ת;;³JܩH3Zs8'NZ6^{-TH`c)qbD2K6鄭_*ܵ%w-9VyM A~/~5-2 .,| 3(Y.0N *0$MF&Jw!Vl"}ET;-D ʔp@͞E"F]Rl]H"` :*kޜc^M0@3bTiʌGs8KbVsv.xlpZi䴳0EARg]=`MG%H1h;!}.{//?w:@:[ĉ1ć ԖЁ:O@V.Q8mgVOo~,k8x2zWK̓ϒo2 !^ /y{nz@9u,Hə*.B aDhN]iEkk6Y nsM$u?{-Z_&vmz6US3lOadY7'7D$\ń 55JJa)p#c&ˉd 0*;Eɻ毶+ݭUY Z<}!l} 6Ljy|t*|y:|Y™pfa˜c˺pzwd*LL扚V{uv*}jP;Ԯas==YّBen=Y);%)/n- :^ T@WB>ןHܺ5Ɠיm Yl4 ڄ혷1GT zػ-3ؙ̺vJ : tR׾]MuחʫBQU9q8v87ud(8-WlCv+07?Y.1GQg+yz-@^Lޢ+w&NJN#|B&f%oH?wFtHƲ_:2hKyʩ"qQ'OI2i7jCdVSiHQzM;Msz"DG#q` rZZn Yvgg&'S|vN;#yoRtf3Iy7^OgpTk.к `0@LG«+bw*L4!*~)9_,P/0JQq0$G܁FTe85dZٽ=U,$ɈJ٨-W'IQE0dKL ,[):mt|mK{0wVM1=#h E(^#.K L*F`JŃ rNS&>ءÑC[T wʺoD5{l[z,ەk1/fVQD?sl{[B~A >O[r`)hrg{HWi`r݇cx6cȮb N#4)`{^A/InwޫWjޥL7Exp){F;Saj'mos"u"G6WX//\Yd_NڨLၱkTf9mP&qx΢Ybl51-'ӆdC7"?%P0vY4a#A2?9HysQ b"巇70 d(/SSY>{tjjm-.?ʽKqmC&qSA6vI},,&fɪj0<]4?֠w}jS`TjgӥTqexCZ{=?_0J g!JXJE;Õg.0@Sh ]M,`1X>)F6y$"kY"<5aw9+hVRtg !ynD t8I}ߘ;6%@j6`5:֪RRi: פ &uD9hkcfM0&;mਥ4}=z}W:o9ukt@׬aHj35(/^oVQ۹BB,[JL.0"~ ljOt燝f_3{ISdsp<*B1B;S+F}4Tw [9^2H)ۍ~5gƦ!7&W_{N xh " 0XyF10D !b 2:myk-Ľ^qx`3oξ9}C4zA$Q87}P?Zj9-r͙A6ƒԍ%V8d\ % cB)7j<u#"( -J)*Fy ^$~O;s弭ݺWAp큂;Ç442@c~hesu)RB[Ƣc\J\#y LqY̙C:<Ncʠ:M"<$BhNv ZGK!zƌƁZ1b"iZ+I`g;)=X}fNC T'acym!ޡ%'/ w&R}i(9}i׌âq$ w+ ~vxt ʙme^aMlMݥCLs'o`D"uN2ӝ|`f[+\4O S1єzda(Qh Og'.$xXrUȅvr aWKM)0qZFe)A^b"Gg-G6-Mhw4^\f:P"Fs;bEiƛnꗯXhP.2 μ/ Ǖ:*CpaZ*bA6-%.)9#\*"T 3âҌ;$8"` (O 7w{,?j܅AnS8\̪ͬې?3&! 
[binary data: gzip-compressed contents of var/home/core/zuul-output/logs/kubelet.log.gz — not renderable as text]
"|qgE3#md-ŀb$Sƪ`|?.>qP'K = 2Xkp`DP(c ޣP.O96H/&9 (T—%|̡cLBP=n;x  5a,C)9iDκ$!;V@r0]Zx _3@[vRh ň޲N+}Ƃ0MGص7Fu&M35)JL `TDAoqVUpVX0&a!dE e#@ [҉6kAnw+itҞEw64IxFDj3+jVEC,>j,e4HX6 ئ;AT%LalmRi6A =ZiaKP>Ǯ&;lE]Q("Ndviknw(N_*IV("ZT1"&7cbw ;^CTXd t أt%h#*`hAgp `NmSƚ+,8RH 5i*M3| R5Z3i^&&ci` Z W-EB9y9޳NKiB]hD oZaV*awU|dm ' ZSti ja3^\ iDF9r%q{*: GX6\b$BU4 LCqc I}6ՔF,KC$`PrAu#jI&|$#4@EWKB$Ti_@m#ꊆޙGF'# (+_o?b/nO/][0_a%x=ju Ŗ?@N?{u@O 2ڐ}7)wt[ͦy}Yo@gH-a 8ӻB1fw@07ʝQiYZ@Y =*Bre%+X J V@b%+X J V@b%+X J V@b%+X J V@b%+X J V@b%+X J VUY8N)J3J X_V@ߧms+X J V@b%+X J V@b%+X J V@b%+X J V@b%+X J V@b%+X J V@߮:]Ry-%l@EV}J Zb%+X J V@b%+X J V@b%+X J V@b%+X J V@b%+X J V@b%+X J V;V%KJڲCw\wF i+`e@4M-+X J V@b%+X J V@b%+X J V@b%+X J V@b%+X J V@b%+X J V@ߏz[mͯ3jJ^no7ׯ7|.j^Nkւ/gGdK"jYz|fIr6olr/=7ѿs rX篕V>IimZ0QZ.雞]e>[̏yT/ C٘^PJF6 E\+wrX1+{گʮ#gL$追[Yrjj0Z!zkBIiԋG%_*J<ԏ=y=XmܔfR|bɃI }] X4!N2!Q&94i`oDn",[>vddie0Ϝ>l| ||}φ??Ђh6)~m!w0AbqpH}ո;py ^ǣ kkGT0ê;7>Nr*47叡·~+ 4]ijúx)$ZRk}U^"QQO'6"EbCrr6&iץ2yd, ^ʾgx2៭>E]^ޖz.|g zv~"Ty+{&S֥=7[:[D1AH*&|wi\Y*6I c.!qYWً&Hp*͕;[_Z2 JqET~vq˯f/ofV>?:~}~J\+]i:>܃r)nyd0n>4Cå:@~lJJJyS[ux|wUȣF:|y)ҙTrSv5)),7CY::]eJE1#89' #nco vp?|}!\.w8pw:FhZö=.} iq P\dջzX~I`[q'nãW6Ϻ翊u u(_t]{}M3|)M\.Jj;\KkTEy#-چmAU[-31KN,i32KΘ,im4t)@YIT%\0BM$D*ڻXT~gFA^ *k#n6{H99-͎n "ߗ^E{Edm]^9`i}tbtjDJVZ ')a0'3g}q|2v6 B{W0|fDN,eɤrL*o!r9=:gs^=s`lkwx?:[&?ƙ+h88ĕ[$2-m{Cۯ/f -F^|f{Ϩ-']6:ylWcly62[Nu nK7~s:<Ho1CGG'B @Km5n#pBZŖ !-%'GZK\+*ŇSb0-!Hw ]3%iZfiwg)UH[1Xݻ tVVLɔ4.4o}:Kelɧ,zMYm̤SuSB:Tdl9پɦn} (.glo6 vS7*ͣVX9{T6d)#P9[&H t.֪' )Nי0@&3s`A^K;fDPPYHw0%yS΁p[os K' tмP>cɞnydEi-)Sf{jT &4Y5pײ}w2ymww5@rwM M |%s"ש/ B󛣙=~Tz0IvIFDƟOV&Og1 B19 kۆݖyLPQ*zemudskS\5d?v= }&a<>{ʛ& orjr>}EyI X("&v<yAOFCۜh&t%LӤ]K}Њtx_4[u5&EtԆ]Z5#:u[ÒWPsp}`,Ofm`n^ws{Wes_>̮z~^g[\n[7_^YEhISw䗻S_쬗kNyb.В7[2iwz~FQN&7i/4`|I_no>"0/*d+PRNǻzR)qz/!6鐂P`4m1{$ni_6!FRUhC6W wZI|Ei) 93y)^qjGTaD8~UdH'_5O>L:ؔ9 j,18i 8֮yWZKAD&=#\TvR8c R̗t1jXUo޾ lZb}k b$x 7Bs90WWڏ0~5R/1+4BjT.KH˓>|MU9?^T7:Bg 9M8H3X**Q#ht^׼gS)w%ţj+U*/̑I4vj9g/d)OnjK}~U9yS_ϰRȍ,Is3ڌZ8)]!Zwd7nuzpC+ tiJ*&R8yH iX!$u$ZB6 
5]DKӼˉoYnrUMNN:׺kBO4=-7 {oe "sw}vZOkޡ>7ʇ-_2hõVA; W6*r }!:#sqʕ%=u9ƌJ 䂠'mE sVަhs|1Fj9J5,&br'͈ܖ 3a\}ؚ{-na9^ƗWU :e& (=XD申TO8p(&)^uSс̪<`QHQ`SB.LrI9C,Xp#6N'qK jWSQ[UFm5`I|"b2 Fg I1k31Aj\*Rg)zRlR6i "-:d kbbE T"ǘuddՆs[~}Y/XM>NED]"x͆TP3F0q'C _8Z `=. / 8c*ҍV$Q\FD.8؊J B MZ(+*pfyQ#WlդT\4q 8>GtD7Gz3V*6ȹ1r$ դT<\"Oab(Y,uGƪ =t\Q^G8 ޏ/[%88i׮3W+8="mnfQEF0buGG7}Ǿ8z&{4OlA}'OWxpń*zoT,:E ֲ+&D%F>mB %!(A@V)KH#DםTd.JN[%8qSx5TΑNsͧ޴R:Oʮ߬ >O3,9i'_E@Ѧ&s.ɇev)A-wE#}txHm.Y;F(ZiYE]A@ 3K\PѾaΗ@>vBπ<_@H-% LVQǠ}5Mr0Bh fĔTAB}WrqqWQ3 %K3dLk't`H L  |IV}hѷ.رf5C!CEfwH AVK5@,.f1 WiøUI$elY/^LKBdB̕1J>ƿOqMV ?5M逸ӉFTPRoG?}/G]іIW#pƩ|Wg>\ы ݽB~iDm-]i7KNcL/ߌFxn~7JГժhMig⟳wJ_:#+HjviU꡵~8kޓt:z};{#C1@oePڵŃYqZئRؗn>*R?b!a^cArv4W8MW~nF}-R[RlUWF[lډN{VykZu;@ɹ孫!wJsjx=JpIFvֻfU q |sOHQQ.Zc Ͽ@d59i(]CH#Z!03٧!ȺBd=*zAdHn>Wև-QXh@0Q{Q{Y2Րˈѧ*v˒Vݬ^uxxSf|iKA{o$"4Qf8Ȍ4Vlƪ i E ZA]+5Xrtm՝|_ot]뮫lkƿo>]}+hF;6(q&+EeҐ4= A3}-볪ũ[_J{JƫLii*|z"/Dz?I:w.^{Zl 92ۜ|bB5f'PIÓ h>cZ`"!֐Ǒ䕋s%镩#} >B;0/W*lh@c \En0o(@'+4G4c&ߪyfr4ΥN0ĜHԒ Fn-\)6r2TCuuG%)K{%& \,e *d0i6mp',k@QZ~t7uGkr<'o bϩך&.uvҡbMvn;z!v$4d'`K&ɟ5s)A06* o| Nlư (зWLtE܏D?܏|ki:U2 eHn5V̤su KR OeJ.+nF),pB k,BfJa{ɴ<7>a0iHo=bŔ@<AWk[^I@*FIh +hbfERq(L$Jk,.gs)A:ȴ4" eΎ{+#IYMɧdtܥQq1xDǕJ\dN(LezYm8Gw'c_gd9M/w.G4a:9vr d~j;JM A_GUuWWͫ:ȊΨkڴ:G@j]91 mDC3ńlc3e]:cdn"7潘;vm{EiۚJnxH ]2?BM;*n~ϻU祇E~[t{Nm.Ip4Olϟ!Ғ2syNn.:rsq )|o:̌IgF ȟ?hx6ks5v ,ηSJ1t2Oz6OpV9^]J-c1+N\ѱ;uТ_wS =rܥw2wU։fM9Јb/ASrJrK~ߪeXԮ͓>E>-!oYorNMEڪ-  箩=؁i(u9^r-# L|K-9+R 3I̔'9$?B9f7 Œ9K˭I:F I6J m/Tp=Id .՝R+R[2Eelw1>(].v{=Htc./ B 44?$<z5(rtj~9xtձ骒DN1}kh/(֯e' YЩP߽WkK =fV h?'uGA}( ԋ1m $F@S:zcZޜ"K|5bD`~ҫEG|``O۫1}$V Bz=b )Y}{v[ Noֆwr\ |t[^@ FЫ1B~Q 'tN ^OuhMXaT^ v>U 1f77xdx]%M;)~iK궪mkʖKUl7]u\/uǕd6{Jg%I1NľEާ+{[ttPWp.|;I/n;'E'őVJX^ ͬ(BFSÈҌF!Ԅ?8!/EA㨯_1o8,>Wxj.bǟ:OKDs;_/oږ ˪7 hF wԹ3pBcVl?Pf2L9[ozDxS8'ۓ|⒓fwc[dΗ^/-~ͺŵ~S4O&)ph}ua86ޚoUjPrHzlo M+*fggo߿h_TMCUxlKbno2qBn-)ڈN5Zկ4Lf^7k"4X2D۷7~)C)+[7VcxTEoz1c>@mh b^+"[)C|lD}~}u)^tn9Pw]ގ:Bk^?74r~|LE8\:o~o˸|v6onBCdw$X-9gʁ{ dzQoSMjH6 P/r Ue|L2bdIvh>u/VXک94#.Aη/<:VnSaD3zei8,BYCƤ;L)Hّ/g'U>GPGip^poJE oJ#G380AC01D?F/KU9 ߂CϞٜM6V6QK`nL }6'{dJԼm 
[#J)}-0/>w9$]ؗu4jn)ڛ\k+ASVLXy6mהsM㾦LZtjKk+Z#JCQ *%J!Q!cθ.:=V)n[EĖP!% 3BsȕfkN.ăI Ֆx;ɿhZ޶63;9ZWLZW@LW?^LS\usr24# *L{)c|ogl&2K.YX%!6:YcG%?GAtR E.-,(cˍV$Um6qLgUX]N:2čѿ?)3F&8PkK9͞F0 *$%QB2o-O!-\r]Q6Po}eDMQU󖞻~&*92;w픹LS>7Sl9Câc{E}X5qbO~\GᏚ]g[n,-MS2]VjJ>DŽ\jNScW3;#/!^+k)L@GM,ãjnDh:vFr%9qq>r :<@jRSmkĞ| F)L;)w7z((+9goҤ[$F1K)`)$Ťg40X>t$ӹcL9mVqʵ VlAmJᩰZ%CN \F ,*F{5*Cz x>EW1[,=4*yo:[s Ӧ4y  _7}9xR:^-2E] mCnm\{tБ#?`Xhc?``K,.Mĺtd C/GKI. (NH-@"(0SIXDT`1+:cB)7j<۴*Ȱ, "-J)~јs![svC劅Ba[L9ZzM~p{丹)oƁX/sᢺKǣ[WԳ|Gō¯yzhon|שM{`%4F3DVZrYfn=7Lܜ ā5e4j oG`9}$ c$RJ%)rV+(>"ʮ{$p£uS FXd ̂HB9UaaR"r;jnE Wӊ, }cMNde0߶'u)U:.))G`u&̤gja4J +1Y-P2zE Z#b'Q-0Ffd j0D0R-c춌J6[M2k if[H{[xdwۄ7D^v܀} f8^?gؤI ;auz,EF&e)5R+LQ Q+0$g!r#(xXGHudٚV XlX3[D["n((y[lZ"WNWA4%D+mH_e ,H}z[$]#ȁFWpH"U͐):E)c4gt:1C6|J ڬ4Q QhE$Cy2!%̈́NSm ~xu`=jK֨dSZ\[+ v*F :ࣖR Dc"6֨cSy[z3ybz4ǻǧ~4]88ܾv&}o]'9.RedB >r{dI^b ,gq+e}j[Ԛ͊i $IL0Q ֟=.,!0C2Fv]E3idc-3-lZjדN¢Q >1+׌ Z*iLRI<sv Vi<5$-;83Κ[:XZkk'2|6~-f|j[ X 66i**=Kur1L3{  Ϩ`,D4 с1܇lPhM "$;қ-8lz9꽛Dro-5 -Wt <%c&sQ}pZis\zǍOp6k))봏\1$;ϔrpTfo*YJą3' 8"G 9Zg>Ɗ1yܳ寮XɆfvEgY>e)Rw7u'Z䅠cI T)U]| b{pxD+%LpolPˤ"n9Zgkm84q$=ކK>uU\;*baI$9lp\xJER`mvr-ޛ}7k|08?W[@BrH1㒭n2_Éٕ;P2bs6'P,= R;UiR@*4.Gwrs_ gUi]%#ֺwm\psQ+ i~UϫCYʺeMv虁Ee!'BZFnaa?Pִ3)AH[su9 ,^&*r)ޖIBļNC]/ u&gڙsZ.$IJW hFtṬP%F;ֺMq/f^ 뤥{גvz{k˥烈m=@wPi'T `Ւe濶m &g%5m)KM .Zjh0AG1X©e)pĚTL&ӥV7}= p?0+ʊ-EDw 8AΩR?UtNk ׷k~xOVM.<~^{(n$5@ފI뷾* 8_\61W]!Q%A8??< ཏq{j H"S$^kNJ{Rbx"P׏[ivRLDi+2RNһ"OmޅsPuBeE;\4X6+̉8"F)p }\QB9at><JN8H}[N7$'Qs@H.az&1̇?\{t"S^Nߛ?{4jbE?Dt\?E8:d WMi}iKCj`?koi@M 5=Z>Lh\KLep%R Y=F{;ɽIj&q[uz"C ^sk$s2By'.kouD=7栞2zI̖/:`"DvY|;wj+ԝ*M{mH7n}Fj &SJ \ȌU&Qy <%,K)Lu.2Boc]9%24 LUK j< -Iʠh!gA\PcSֆOp7IF6 hz c2">EٹlY.HVܘ-D-" %Q8R$j +>;hb>(A:dMQvEܶ[n0,bȔ&`D96Q[2(`(iq۪Sj 쮔I, ,Qkֹ)S 6H1p. 
~`2q4ܥ s"|Gz G IEhab%$;lc[άR!N¼Sx̑lj_-]*ָv7[of*L>4S5Y;4GJKuɸoy3,*7~~ zjRś|&uI- t1郞{v/0k2Tetb8(ίG[3 }o-0,jФUoFk^y~¯fkS£ڱ̮[MsV>:9+"\LSgg'&NF#xAMⓦRzn  }NFg)F =<.JJR*5A2 B%> qutK&0w'X ߿fhBz{ Vǿ- RO>'JK˴9(r6( p;dj,75ۏcഘ`~do{2`N?7C?J&ie3~^il O\hLT=&P ̭֦Qci콚&p|v~@0[Ǧ.['m(glC{BύU֌*{J#皱vhԙH޹qdyz¨y؉gbI MD"d'w!)--yU?^}櫣wSH%b~;hbuU޻I(C]n\ CFtpnA-t'w& ݖ;3vw5|yjg)#ۅ.RuGW9h5?ppV9ȭۖz=lZk]j&q(+Ծ넱>!n RQE\ l`nWG @~8yZDz?yh>Jd>-w+ro7G.1cKʄJҐ7邮ʪt9E{8Y.$LdHzk߮&)g4îteokn3j\ZI!FBv/rLVv&leVIYm vI<~GoZԘT >ezժk|Jӽ.-aC-8I+xZ] ZZj 9Z,\ &#JcT8JԽs:E_L{UtwCϯgtuzԬ )7Uu`:CIMMԄ1"&Zj;2:`4%И16:{PtFbh!SW'GK8F}>[8eY=ZǬJswHM L7 *)sT!D! ޅKD#N{'ӯDh#!^D`.F]6LvuqJ}"Tjb :fxyF9Yr67)Ss.7MUEzr7NTRZ<'JIIbE1\2'%d >Foa|XkQ%t mRhіbe_'R|0!4dmDkivLcǀ-xNQbѥ`DQ ׄOQdO]ܔ4=gE¥:ht:RȮPB&тw_KͲnGdfɣ^f EQq SiE@ +1ٽ*zM {'C:mXqLڏѳ LnBWˮP`I1YMpmV:E .%ـ 5vB`7Wg\{ƷP`0V֣(հ֞G"*T*JvTZ2ɷ FD[\ː]$vl t[ЍnO+5+.̨Mhhl|`&8c<}RȠU %e5͐jPoB+~XdܠѦ!@4rLcCX,50LbDb0c ƹH̚` ]62HJ@-!Yi65lDC@(;`Wfԑ5#yGx)Z6NҿjC+܌ebEL]Vkx5{TQR1L#.QQjkIk ˠD$A͌46Aj"| C$C}(*мGw]+czh2& yp.֭݌b_*"uQ44u )ؼ;xh"/Z}Y;n?tፍP dUt4CҕjI#J]e ,äc*ZPj6z4_fTcBgsA9pD.h2QӪAUQk2aachэ2AŕUd 2inuG-A@83]UBN5~T}%TUĝ(<m&k)}Y+N";>lO Ajy1j&! %:]%|̡c OSk ) d*2{{(%G6CܖSBGk .IҎn:9.!z-|B bjwXC-D;f Q{Nw8=/ h(Т td..h]acQm\g$1]E¨2,,Hc;&gو.PBЌ-ۉ6BkC M{B:b8i$` ae+Z '4zȕIu!~)Q32%qT5]zN’0r[1x  "qXoCnTM5墑r7D|9\PHjҪ\ >$#4b鱋hI5uhx;S9^Yojrx'"{R\c1-g9=jr djsvGI~[&7B[q9Ϝt1;<';%!`tφ'`L}$PeI &bI &bI &bI &bI &bI &bI &bI &bI &bI &bI &bI/P9@QF se|6$l'`qL}$PI &bI &bI &bI &bI &bI &bI &bI &bI &bI &bI &bI &X@dφ"s|.$EI R&DH: &bI &bI &bI &bI &bI &bI &bI &bI &bI &bI &bI &rI z D|H kճ!`< +`K$ dI &bI &bI &bI &bI &bI &bI &bI &bI &bI &bI &bI/zk*֫_h)r{}\P8Z]ꮹ@|-sBؘ{6M)t|.(xs!6m<'؉dWgwt(olX+Pc.BmjN8SNVKmʧCޜ놎bZ脵&/O[z,p݋o9W+hhp|[lUA AuB;ӫo5O3X\ι8}۬u> Š\S_7Cs}\Z8a+tۛ{1u"]%SzK ~I߶wQ(DQ`:7=x;kRv'Z/})qY>?ߵ:H7V~2&$ʆZk)E3_-~^\X?O7g?jֻ՛o/ ßRo{GRO7?? 
= q(pH%Tx׳uey}NOf3ICyFiz=kM-חEc\]^vBzZ7*4VoeyUSEDoͤ #rb#~/uIU0!J~V=½]c8?O(<:`*ܯgKgG6j7={̽a'rTo~c%)Sl-22~wEt>᥺}FJk llO3guvm޽i+|zJBկns߯g3~57ffO؎\e8Z:m?=Nx*Rѯ-һ1WsY0t2⍛O ~;?>oMsz%Z͏,;ɢގ\LKKܜqeOa' UߝVsuy}3.ӷoQvSԒP˗;:,?_Fq8CJVT`¿uAI\OIEGTjk蚦]LybRtjz-:Q{i+?/VYTY;3]ŖGRZ!FFWZT}GnaT!%-8񪥉)NM~&d~!5(3^^s΍^[ũ XGط_b<?˖a;֜ԀJ;WAzr:#4eӃAZל- &`i ]% ~z, j @OTccP{G6#(;;5igH˭;lΝP_/2tZ&)PMكU rUg 7o?T~z yY*O"D6rSBK5kjfKU4{Mb#wȝ@\t"؊Q"$YMŗ * .S+^Gt5U!L ͜l$ Zq6;bf΁q?0pvX:)-ồ^J%jPs%hoRJ!0ltqo?_'GSF?TlHfMJ{ȱ\ 6GwW;0"1}@F?R̡v->DI|- [g3SU}TW*K"DG–ă:ZÔ*bZՠV?iV;g?LقiD'. .ptg= !βz#48ԸhJ46RE\.*E"$VTJfck(iS4ģ!0loQ-A]G,,Kd/)5&Ъ(*xL+b 3F"#sfݝUcVV>Pg!M1;`< ݄ S:\ :1XzSk\PY;k$A:ZX]?]±Ov=vyPtnH冄aɧlw°7{E{z@(1Z"Z=mR*i؊Rm~'݊"lE9j._V,DD6#@]uޫ rpj^2pt+Nd("ħ>D/L;8ATsG5<15䥡Z=^k\Ju>+*H|oAYIY4pWU.A[Vbu 18%'iсk,Pito T4++Nq˩u.,5 Bq3sQHə&Q[%"j(3E% @΅& 8'>p虁D8d'˷R.7~ھk@墕< =' o go4ܿ$~:Iw ɥ|1.J%M5owU1Mv1M4hvw56c}iA8ESeGiZz+㌢\6îRu B6j9db$$pj>.)Q6:jP*ʻ@o_k߲u=` *U[}TbOHdͲ팤k=:wNM&YMN *B|-.7:~|^R٫YDw;r,:FI*Q:PT5`TsE`Tm:S+tFi!o;=XDwM8A4^A?|"裹vq9&$u-?joGwWtPD]{ޚ hH@{fJ"smEP*%0/s+ֹϡGglw&YTʨӥp-+K)\⾀:K6}`:x l4xo]T9s[^$EI{{eZzW#dDCV#Q:Šݘ@b_򬙼'yKf7eQgo: Pwcoc7{3.bѳxeohRCͅз" θd]\/M_t3\Sn.ȵc˟`X DTKg4fA,n{,/y<77L|B0I|n}o3u"3u$XEl=5+bzg^;)p5W zĦ~&-#fߢ|Pdb5}k=5~%) 0ۂ}S(/;.:.`/=%ۜ LCcF"엾! ʚ2QXyg%̃R$@ᛱ<@a83z w,: zVRRȀi?O1w\| ŀ[>Cw !UJ"Uxoڵ"v@f1jؓϧGzKdAHeVYs kK ̤W!-@UYNsœO[' 8+)gy(Eq26 \=*g`T_ѥ댐ٺiq[Rk5۹?4E7 {Ӵ;\o3- MWIXg~<]~7MGΤϿvEe&Mnmq~S9C6o^=շWE3yyzY-fNFq[㩦]pqm/\{qoWzH?v+^eŤ43ʒ M^ond`O BZЖr:"I + ZҜ㗤A¬#9v8'NkהbxRU-CP"$Xb>#88.Xx4."0>+Da"Q&-'rE'n}pBchive 3\mL!є,HjB!,B4;#m,tt̫ۡ{xnZ6kݧ IcnFxr[! N0PWhQA"0%2%9s6mBVe=c pDHZGEbZAqd){+IYolt wBqR!8T械(2%x'g̋j9Um9cMM[-/^׃I|k[ >:~8 *u6ݣrd~LoXf Xzr;.:ɝS&m!v{ F} N0xvgOgwstuΞi-.aw=g#?p<;`3wt21nɸth17ݧ|v;Օh.oy">>磖|oz}F/"\ľOsяb^ -]*\vfR"r̭*Vb\xeJ]vu~.;#ݝ˞t ]BT-&yex|Vȑt:8ku^s7]-&.voGv|Id.)"qeKx [$]4"' 1q!0C) +"+fܠ. 
w鷔 $JALhzهU3Vyikϭ(pu6>lR ݙ \ ,TJ&Hj ؒR QMOAT\ɣp#=ZyX2V~XT$X[(+B9“Gќ!-%$.F??F #T :&A2%PSZt=uex{(*ΞOd6Ѩ`T%[зcP.+gJ̅E ՚~&bծ&ZmUjjvKYi0,DR,&ЍKn],'-Ua@s2"CЋ]p |H p̔\|HE'ѩtީ_v{&x.|kue8X]/JŠ *8 Xh j9GK CU-Θ }㬕d(dgBB2!2CI e\EX"V~MW.v˝KuVcl`KWY: ;~~3V]ld1$p( Dx x.tkme{xd LXݟEkܰ~TjQ}xm/%+K۫w6=\<*՚V ]j1%"t4lz 7=s>ROsD-?gP#=Q"X"`2I%zZ=beޕq$eO"#Q} 0sѧ$"!()Q$M4gUwuj4{Rvs[ƼlXF=Srbb&Jar%@@}aH"4HIT $X.3F؉Fȼܲq:Ajac'%Ar<'\Fb; ,Q" Eh(%T240p1CMP3ə9,YֳŠ97+ n4# 9\9O1"yYr|5[ܜM`=Evk:?,a "N2@IKq:jRΑ6IHP WxIZiyz%{ q e,I3YyɘZӑup^彦NR9G,aML0+9 D0Lx 4 sPṫI~7zZMG MD NmNd!TBgn]}h=u&N)@u2"<҅@E`&t.2- :N^o ΐoOx/ѺQ6[:e,% 6bVэCn|vT/O{mQ-řEP?^WZ?mYvz6+44 'zbQU͌iMZsgV@BKbG]ija/\ۥHzą|S !Z0I;3+åϋ~»V@Av=%LSto0氫5O̟l/ Ɋ䀄^ıg?~3Y%:_z0?هߦC3]ꗃ8%ne5ӬFKy_[iFKvx-f'jseN"T5k -^o!IT?J᠆ZŽU~5/[YUSU@@x |*s>t}t-tJIV:RX"]iMKeCmfy?&jBkݔE e ,~'UݵB da~NGsEׯ]vЫφ/_!+7{;~9kYt۫>k{k?jiѲjiGK| h]?;-~k=?I:uåK3ڢG/_4 d!7E]'Uύ//Z %|]1oe㏟2&e6Ao~ i?ںݖyߚI 0|i J2K}+m2;1:4/V"c Pt5إ^Y& Wj޽?77տ鴀3|`r|)ezr>l,c{Ҧ:Wl.Z3( ľ2EOKe߿bw秇BuB}l|YqώOvLBD|䆔[@`8:,uWօۆ;&=<;If6ѽðMLx'f5I[$/ uqk) (0lЎDOzó(Q҅y\<)c:1!Ak2Imb*%Tu;E6|{٪^dk!j_>HȐ>r 6H \(/\̣rRZt-o+2t\7Nd )=fY\(W3}QiU떠X~q+[l h`ztx"z`zmzس}tT%%O4.d)&SL"*\@Ț+K/ xHS啲Q[P'M1E d$ ktm+Adjs2^~vi.G?,?o-%\͵@xRRq$C7T*NsrtRXuM0g'0E"EEvQ*-P1:͗?mYKOU ؚk0j㥩8I^>}vUo<|(UGTяv.^{Ekδ",Z^3#?m˭|˙/twى?:/xF_z%8iJF 4;!=<)&ep[JjMy9hN܃usaȅ1Y܅M< xJ)OXk"jZD)<ͦX.U3ԲBRۼpL9]JN .-Np453I&tUBV1Djn0EZS 1PKwQOr #-!2#,{̶eP*OJ[JTBZ͕ WQDpofʱi=KJNIUIS/εNWnsm'ACZ `A * "F@`}'N^%)[%!*<^ַ"RdV#Z;끜.FPZYv>L ܳUMBVST/ 5wk{;#w#B\Ƃ2Ksc!%zkH>*סȧ~9qk3Oɍ3op]n*rPgxw6V5¤\8˵"p)!N1hc \2b YǤ E%* KmJjk%3E*mG'#f ^IZh޲ۻg0FP{޴(|ϴEzի\Vpj8~qZ  C= {NSǃ^^~H{y|-} 4O0x3,x߁ʴثl׼*?m꾶`\5\~YFyZM!; #t~s5Q<\GW?UQ梩3ryKJC0'ATj5,(\Ig&(Aћ΅Lj^kfE4 $k"I$_r'ADjι1mmlHuΡcS @mffnaC,R-,#T'vgTԗgYlhqE!|{ ajCvCgct|Gf +qZr?k][PwﳋK-_Y4Y#;>y9 _XÏ %u^-=6r/70̷}i+r+zRI&Ⱥ."Ⱥ.":"Ⱥ.ӥEuu]d]YEu("Ⱥ."Ⱥγڳ4,S wRAIT%K-쪸.0WU(vy_^Βb/_SV`˓I\Cd w Ǔد&.̏hBlq-lMtn>׬g&gj] ?{QY(tsZoy!풡 - @8!/vBQQT0b๨$Ў7]ꢢзgٛB[uDEi( i@ `ƿ[:Qd< 4ErK>Hc$'u<3P;T9T㊑sb\֡eJDA}9MqmzL詯Ѧ m>y9i>nɯ}W\,mv͏t5rQ?het(jý AF!"(|P U)%#S^8h'$\ %X"١ 
Y+oP,cHRܠW Xgљ17b*Kq& A4#ՠ02K#5tJK"'rV 嬑/O@fgaT}!X,J*JM P@c(+A1!4J Z3ʵD|x** ?ΓC "n#(E#QIYs,S)P'J:U jI:?i.}վF Zj)Ik]4bLwث#/xyC&f2%STKAa>h R9]kor+ w;RځI>܋pXTdHA{AJ25)ذEq==SuEg+LE(##2 #Bs;q҃E]|rՇ8c]ZUY,ʂ%a}7T4f(qTc 4;LBvX;Hw&蔶?]jJ{ '1%<٤υꖱUsJ oS1% <=\(ED+B ,u ZM !8-a/2zE Љ:Ť{\aDgtLD@Z2]?x|5tz3&hl:_ ne,,dhݰ!@[1cuN4HLBoil[)e^Ҍܛ˫QHiU+~*?R0RrT ĐWeTF-"$;U0&Gk"ԨF7ʨ\%l~/`oOoD̹k{<_5M=d_u/dL0˭Fr6VߨT4ya[R}#H~2H{|zֵHn50"ҝ֠-}I.hv >{].]_` s5*H& $ %U[g!LbM0䐶V*-&dQJv&%2@*FOw^ &M+ En;oH2OlA<\ϭu}}+5yBKhFF *HZC^ :U" H y&<8)*ٻ$g( DՄ#26aeӭa __xd'EOͿKPȖ[\jғx;ec|,4tEeM:Cqk>@׸}ks[ŅHz:y/IiOGƈtNGǓuW)3̾fYye^z7%1}uoMys|IܾO+ka|"Lu oPRH-885_KI\tgIhV?%&n_Y> et/^ӯ++eO@/?3$ ?CSrF[vU>޾Sd& ZJ2߮ߚWdM=@~|8Z(?`fSSb^ vmOI>821/Ӛ~a>&lYJ7̶şxRapuiܛ5'qOP#K¦w"HL~VK_A*(k]sמŢ\dܦPkZz2^F"נˀPiK/piTRt t^L&^I]HQt9~;:F̿Q\Nڧ!倇[(2A=GCkT-ZZ]Pc }xg[F5nxǦS&adIxdUPzV22Eh419_8 ғOfJNZU4-l|vWܖBiz.|s2."6΋o|m:w;T}lTv)j3 LU hI 9cOI5Ktܑ,6)71xP쵖|#-ap|B?Z `6M]:rk/}QŵPRiMEH.KV]1ѱyLKV>gZk{IG݊@M5&12B"ELQ@6F0|,JP"#]- )R\5kCUۅkY9 YmkM dif!9U2eHEjEhNolHjHVЎ7W)S1*m5EKVsys`ix}r_  Ѯw{|!b#t6ŷȴx|w@2yvHC.3ֳn%[_ʆZ/8 )'-Jo )Z't3y_QZӕhBYavk5FM@Z*2=%As Qe$-ՠ/IQJkI-`ӹ2UIƮPl h gvvx 8ӎgvt]:tۿ m_19^ŖHP-L.T!E-"Ƌ>C0{r=m uR`Vx wjI 9{1*lj3љUȾo' TeB1TH;jolb]jj.gNͤlŢ$M1"9ȍ*!(ɱ5eP{X"XiC&d( yщkf+MA 9 \mM~{;@$}ǮlhGxīZ}u*ɥW6  :"T ^@r@ebP1"B8o7.hKQL̄VL!=^H]{8tjpb'0a<y;+'}g> e^뼧I4>Ipc7~ F.8W(W3.HzxDSmkȨڨCn)g2t4&=;%=^ƾz|}}(E}'yBUѣHt&>zkB{2ghZ'1lMuvhHFyBT-! +rU*ބҨ܈K{ MI=>_>uʮcr﮿HOV1w;<^k+~o<sy~hb*go7a9)Ȣ@AT!K.C`\ jUc+a/C c73| F#p(rQaf$6m&g=U:H;6>`x E8X\b"2lTZ@;$ya8F1v$# YwSF;ۮ3tw{oiN#`Hm(0Xi[ AR.G6EHPD(AXٙ? 
y4ɣ\RB1jSR#˙1eM5#JG'F)/l* fTӋIpݸQ^A&8Xr|^9F1"o)sMCVOSf&6 L 0/UbhW:Xf}6Ionh7 "{K}񴀛F7yY0G[)g<>{$4{+*XMB 0r ?]ty MǘpۂD h>_:WVp8Lot1W |o4y2+U^|*ܬ~̮^|JmL]/g߾i Qj}YOs1]wk.59y_ZK6u%6ϦeWUz~Υ&s@V%fȦ].uu?i<_ooyo|a?x'O_Ŭph~ ~:^^hy`]檄Uѩ㿞2r4n;= V}x<=IlTJ#EZ@Wuq pZHUI%&:+oS9«ǰ:~3ú\_1M Y~)|U*WMLp6gY*dZTTEk D-t]Mt$ִwA|,ߚNnpeƇh.G'R0,TY HVVVt&ߢ,..,Lky^ĘTf͓io"f[(&U̷8oodeHԔg/bPs{%hV@|/}d}1]/AOc͋)1 ܍(^CQc9wZo }^osw#m`2늴ԜZkTxAMǼ3h[MI75d՚U;z:~DDiWGS7ySzY??OxֈRg,"+c !¬!NzcE)2˃ègN piH@w` .P 4rTh1ӎۈ0`eo׋Ki^y@xo[Ox/JM`fHQ{?] {r6:ƙ^z8-B;eUΤJQo ɐg?M~{w{'lNYx ba$%Hʡ肊 *%J!Q!cθ~hXFY; ZDl haR0#4'(x\}gFꋜmTrygus݉ѽަ) B}#+Ǡug+@uSe삘^} J(JDl瓑´B0|({)88:IrfA#JBltd2@V]EFG^YI}K' 0i\-ڎF(G@[nД$nPRs49i וgF_?o# F`(ݛJ(F"A$ f[EaI6\:]V:5oP^mU-lWʹv&sM(cZF9F!L+k'ty&!Xk2g]Ϧ'琽})eKFg[/{m]j9QƉUS}:Oմ]Nym]#N?k^+k)L@jGM,GX5u[}3g=砻70jr y T&L;)pejfhBΕeUa8s~:}IN\rmP&qxEsE-ĝB;Y&[ZӾ. &6F8Z[P)" ښ(r:h0eE%CU5*Cz >1[,=g79V St)>5g^]"V¡lF+=:F;:㲎R%Snb1gDsAFBF#r|d~ύ ɸH5aF@Òc* ֺsL(QmGs Ȣ` *rbrX+B"\荜$t~MB!05Ȱ#^?&" 8);〭b>Tlb />^% qS4g@8arQZ0^~R>qg")ƹ)3VJ$ 1E:dSAbYFR$֔[V`B<UX-zg՘IDk1hFs+%gx9[7;c_@dx6UxhW+&WN@vVD^E2YM }&ܨ ]7Zxɬɬx+ݘuqȵ u$ 9Uʒ d?[D͘Zu3sn,rۼrvyѸmϝaZkknQ8nfk˳|CM"vF}=xvl6]]vfԤw5r'`CfNLr;cHFha_ΆdS"+|&g6˱d/C7}VNۏ#t 6=,:e?0*I~V^]b h4+χ |k|m fn*z2WӶ4zY_ gU<_V7w~]mq0wIu;.˚/gdew,Ԏfdf*.sR',i8?,vRTʿIz^1fU-#hc*kS4̲0m6 1LHql4djP91.<9~Y1m7"I~29 ltE#LLt> LY ;3)0ys_  85!*U!W0֡{۴(xV'/ gkWl>3^ڳ[?ײlT&ڢUMmN}jh+?l;i=J #v)02Z^N@'0S2ĥ0LOVPXY`u\O'nsZj( LE)v4Z/Jy1QA"}V}2T-UEJA=UVERmotUhѠ=S{qUxb]vDŻ(eWmrVm, :oA=pͲ^72*5n*EoG'F.K,<,z%9X>ޜVNB9§W EeVDgyx?z@a5a KrVL"q~:1QV̸?/NEDAK|GJ1":͵:Yé91!\9SCNN^?`M$a TzᒒvboJnp.)0%tadى}VNU:f+Wi1F3A`BB$A-A Q=& iiB6L3c'4F$ޞ=>a`[}k ٰZ`奼|hzO2mH^6(o4gЧT좥=|{e|_x` V #!J.eRe+$Z3QV+ 2f/]+J\.w44_mΦi*k>TN`M55Y(d;6!lh3IO֭tb{s19 Ijc o^311"͊9I;krR] 9$&m^ǎ6n/ӊ.mܑ˘&`| ,AfF奥*8#e$/a@eHYr1y"I:sQ♃Ht`\jv>44׼Rz}yQ\_klfeZ湬^+lE8s^1Ess0"2bmFLh%rD=mg+RVdfF`^6јo7͸38ȉO:%Pi_,Mtvy.8TBY>XSGWbhRu߽[:l~6-;CvƝ D٩س1U5qtQZ(TNy4 ϘtD5H̻ËDBA?y$Iԇ_?(m2&ޛ*ykL7AUH.:E,űsbh,Yfl-Q̨%e,HL5rƈLZ1^VkΎ}0ݶ7p\4x1mTu8 +wpp}$]5Fn,$x6>Rl޽|΍ڴ0\6vs ^uҎ_P3_ȅm^n 
nᅯnWOs0~Ej۟f[!ld3뭻Wޫ\Iwy07?ݿwA-2G*Ebp1zρr Dn\dIe`|nEx(QJ\=PTG-M\]vO, >ezf8d(Yd\ynAIs0t6[:BqZfcHcF-!WCFmM1#V[䂠'21;(IQs-= |2S5qdUaa/X(;,<(Ln 3yezr8̌~oGܗIE:e& ( k3s!\aIA«1`QHQ`Sj4h&ce.9̂k5q .ISAjc_Ԇʨ jw v[/kj\$PV'A$Hf,h&H%˞I؂R)dbHE aMLhHa&HE<Ƭ" $2Vkx؊y$Ǿ*#Cĭn9I F(d w#RJպ1XP𢐡w6@8)%2 Fb+hr# #JʈX9;c=Opq<\:jd_\ԕqQwtDuoԎtiȨ ,`4*r59@omd.O}TC*w?mq\gD֏>Vf˳&ohPC*:6dԌ{]OgY=F@Y z f蘃'aW&L)m]֡Y4vR:%=We۳L3mYz\FS#rFKĬQ WeK/2@pa/UC B;`xrŠ%wj,I.n YfEAf41x.32.YfxJO蚗+}=&&w6.A%&EY{-3۱ eL:hew9E@cbieL ,q-vDg8ǎ|ly>6!!4X.0q DXX֡D Z孍3ۄJJ_^SA'Fg?ꝏX,.e΄גE23P>-'01.MřHun[XJ-6]G Pcz/RT[] 8۽R /xV~tbuc'oƈ*dSONiT\2jVq選۬;˅{Z1M;/ܼ]_F|/{?=/z?a⻳߼ڋ}Wr9ՂތBkt~\T B( nZu?\?/ُh[nOEUlsZk8,?+f{a4m*Å DFbܹ^1x_?c-IzrsDl{Ť$'#B@%~Jُ|{\ػLtz.KDkg >ܿ73iei?\fPxJ{%pdM(nq%ʃa^ڒY|@9}l9fԾ F& 3d.o4Wr>08\ {f'a3&Vz~&̔E#WKWD~;weH<0+x&ͼNwXTFenwVcn2 C_VvǠ*޻]ZZ(-44{r3_lCZ{1'8U` iQcFC2}:90s֞Tx*Wz`gMr*NsV݇p>|jс|jK}g-vH6A$/muBkLĜt^m-1r\699 wSHvdZrO>8SY]X>]b5&wܶ}lb.bҺgH*iI Uǁ 4Uvpr!Wz[NOɧA `6s1<ޞ{q\xV~"#b}gbD.W߯zr)r9$g%rE vYUwM"<_Ax{Ҳ9*Lmn(4>P$Jr Za( .lw]\Fc٦ e{ܥwwFڽpIkyTkp'FbFnK2:2r< db sUpw h,vEަwimB]Icg VGގGaq!FnDM6ԘI W*p! q;:E߃qXA ^p"T:C"4P~(-c.H1hia dE>DwL䔔bfԷd]M3tyoCt _A|YNG9TIbvS`қhlF)asQjS,{v- U; 9׈$K TY' "Ǎ!ʆ9;Fv ^Oiްs{d޷N䞰Y~R]G{ګ㊙dֆ0S A)gjY9:>+◌KD'-_^A%ł",,[tČAD&] V#[4bHQSt :{m }C!:a6xpH"Fv8+Ur4{sS/Lom㸽=ʓn,XDgyE]^*2ӥ]ilxP,xJ7k IQ袤E|:)DR'E9 ʋϚۛ]w vX/|ֳhA1B );Et༣hN"YʘS(e'B##4`І M : fZTʞ7r:YW?5B|bA:&XsùY u KS]+ h P4F:kVUOxXf;/,'OPPE%0'i qҊ EK #J`4 zZa: ydtRd6cw5Qyky@M@A28FBXt(-cZ{9R9L){VUAG,Ǥ,rђP!5i3H 8D\2IT6xmze~ 7<腱=oDv N%AHə2dD}JBs5Leu79_z[,ϜDܜ)q~XMloꙗSeW0L(H.͉j4 ` s:WYٵ\⿋eqy_Wͨq83qRū0/A/Ťu[gZ@b }ĽsB x&>SQ3e M/Cy7][.:X6]ϯmVfSUEiԴ-~=4pQUwInݻ7ͫћh(k"gŕ4g+ʽ-p-ac0m!PKeƍ3BJ uCi"Q)RJ_C].m\A}U8pZAZuqQemEYtprbkٞVHKwDZB)dpAf:|%ڍX09Q1,NBYF}5Nc\d4P}5m09F^|20nkNEw,qFy~sQl咵'^]cHH^،yS){7u﫶TSeR]ݨ A3GW;jfOGwOTN`uk kDk:.Ռ)~rv>Ϝ]kVl:"zx`[fHb?*߼03. Z&ABYJ5&ASNQ b֟'b0M~ DMTӶ*ʋLǓW&&T6z+AŻPkeB%̦:Xg)ë7%EfU݆qzD!%ʵ 4/fe٬.՚ ިY&h'}C/;_Mu;Cp?lcfhVm,h]ZMЅ bP)w4)гvquwppςy!cqn3졔CSzk\[wQ'{-_(L+?j//Tm򓧛 ˔~iAC[%z^$"s! 
*@BFFhwVeݱOf낄zM麺*kӢJYK!Y ?2zw] (MX )d?V-\sQhpssrj*w8+jg\]MuY}Bbd1S5]Z-:'BAM/8#\Vmnԥ /+CҜjtyxCI[W~Xb?shAgiF0#r_O&CFkM8b2|Oij{uؙ[,8@ `Ot> *^M"ܯKшWsdWB/Ș:ע?bwݷgg;k':p-#Ҋ.OWnRWK\dfyiVm6&X-:e%﷞y_N W43AB{e} As#$o # 5 Z` rW wsfȠSУEJYB@Ӹe~ݳkzN&lE`wX k{:;qY3O&;#,-/;&ea&5)ĩ`cء`|q:f8s`^ gvnL6ž|"hf' 3gO0ف,[Wn&?Ω'C-Vwtؼq}o/>oߎ)ejwʓp^ ?V<ݘl($.6 4G$BʳȞJ`';l07AGYeGlFg7f \ TgZ2x~mkAտ0l*+fHkIwOvGO79BXa#<8/o~7tQ aaT.i:dcHc"נ %,j~˶ogP@SZN>i ]zk[ENbZW s;_BQU@U 'T.RZUz?|ď*IP{sX.YMIRGtmn{{/5 ntLVm'/&)v?u=xRF1} Nz1YGd7D7))R^qǔP0QUZ,k $b4i.lM+1# fS.a-hP!6e aOMޤЅtwAIi2I=YINi=`Iҿ '0.iFxѪ K>Qc\Ip<YhY֚\F`qe~1<`ČC!ei"Hsd1F1FbAgZ{8_i 8AR`g'"@ "L\{IeJݎ>X"շ㜪:QG3^J@udP8(x+G) ,eĻ@46ZyyX7kP s1@"KN\Qs(J[2z%P1c.wvLc+ y :ΒwTB\1`!MM(fcT܏iVctNeHK;&PՀ\I1~hd\AS@zR !X !`gEhH}UTUR: xDw.U[c Y3ja;pĠT$UMɷ&:ʩllK@dǍhA[**dY){h*3jƀ(Q.(ʨqG+X5*f= 0|[.HPSAHz`W&[ԥWߣ.%eQ$"/=kw(@H*+9JOG!BI1UV  ѩ-C^b-jI`UWuD8M8d3鲭_mݍb_|[1ʣ8I`RTRXC8q"^d wgn0BDC-c5ܡ*MNzDղw!T8MuY ǀ0^ RP8<(}rmU28Ut)\]Hh)7VSc2(3(v57V[/5~a=(9@i@2QӲB Q`rQR0*972Ab)ЭLmQx nEe06ی"Yԏ/X?ON_EiVp6SRe%)8 dxqc0 vnt_|#d*,hb⮆,kK0 w@܆>pi/F9 (T&o%bT@5ʪ+x  BL)ڽX|*#m60Ze 5Glns mA*BwPzTaH>Jk$=|P@z[`}VFu&id YuEr\O.DN!Fu/ϘV0 .TmBF8RŢ4xT##>j Cz i,27hâHYa=+NFp `Niu)"72uH 5j*\#|r\gR,JLT@$4.t-cQ]r5WhKH*R3QbTqZi#b(5rfF8ae%3f@D2/ĆLčrQ |T)0j2zJ&`َ\03ҭ ?H`7jgC1IXհ\|3_v)&T,Ҭl:H"r S((`r *-w$X~vW$<Neg,2B-#W;gq|Ά n2E+edJ.99[G= #Zmt7>T?飝^{棉Rlu֗ g׏nMa:m'_x>6]:Ӵ?I.duuLHk|л5sY|BKbyvzʅ=agX|zٲ.\NU=κW×bҾd }ʟʪ-L'٪0dL3TmS>$%A2H(}J Xy F$%r OJ R@"%)H DJ R@"%)H DJ R@"%)H DJ R@"%)H DJ R@"%)H DJ R@"%Зrm!)NR\u^ Lt^ҋ [ΧO/QPy92"XkV%;jZ5vm6 l)~>0Qer1/_̿CFJO?O3A;4笔^yz REXVam_xmFmտ8DoQo}775U$8x!;09eAṷ<;E#a|S/{Tisa7 vNR|5uL6u165u@~|~] f;K9#mlܪm#3L,vƈtrW'jI~ΰl{yHJS(iu:'|NүyeYtCnbۤ6Y!/W C8;eVes{\е5+̛qsar:>Pvb! 
[v",K e5jzZ`=ΩOo^9 apޒaiuOG'L/dSb\|d:!t: 1"J_ W\)k*?_~?Gf]IZNe9uYV 6}W?O=>br07XL˶'7YTD<,yd<ߝ5# rYӶ)F&ycG=?C0qvoJiOIlC*gv}}ؕ 3Gۏ{}>{e]iJ.W\6}1ʲR(,ޠnZ>T$dP1[s14#a`\K kOE8덐"IM=i]a_kN^tq2;aye`((t}K)zmkOl({EzWeN?{WƑ 6a`?lvElC^SE -ߪ!RL 'ug eLΚ^(F` K(bЯGpid Pccv%2Ulc9%lܚ[f:#6eB#T f?Ux;)Cw&ya6b#%u 0%3A]7/Yz7|h0v3) $\'X,vq3upw&m4'Td_׳'lXޚ],uFR9̯W)$y|?~#[|K1aW.A&fZ"\gww1mTmo2հ#Z֍/l:RdJ ̭$*L$R$L_Mз&2l_$WFj`A $bR ePs(gA\Pc&҆op=Rt=(4eL XNB(g%.0gM%U\1t@]*z{_oϭa[}5m1A*aX{}×yR>+CDrJQ4|JVugCm[N{)~Cr}+Ctf!&#FD4%ȉ!ȣ2)W䜁2(`(㶨QDQFbMGS OgMNTx#zw&,spʬ~WΕx a7YV*uUOrﭼFX]MzxzZ5:Tb)BGG`j jC6 NA-gV)e'@c%[}Nۗ Lfi~/6١ߴ`P npֲFo[>s8~33 8{ʁb$ˋZM҉`UŌpWr xa[9h5ᡏ,^Z 02*{fYBA)duB33sF ":Y sQbT^AbTl8}:!4͐6j\ m3U xD4 <4[~l#Q{(\$ꡋ29_f& %Kx4&omCfJL՝Et:qb8r&9)ݒ#qois].df-IK,-:G!{EL 9S *ɭ5縑~n0Rz]<{EezYa|·jNu:߾'߽-OoC-}Y߅9PkL/qMO81kyP̘w]-}8>\3d童+Szb2Uvc[ ·L(bo$7Hl<k@:c*!|2JH~IsZ,s7#)%Tř `MLd %Dl&N;Ǎ4ᣱ&B'-| 3v+Vz# _^Xl8+VIż gG-ݬŸW?0~Od<ܹ8ؑe|R)vuSaޞ()b@C+E1] VtaȶtfSLk]n pǵܷ+ %>$%E2BJKafl SqL,KFPyΖ:#rNKKI 1PRrR!eĈLxe,wVzW<^Ξl>%bIn bwLlꠞ QKp= 3Itt޿pv?鹽uh \&п:$Bۻ^/M}À!#lnn[Wۻٯwt:ocoMвtǻ]ݝOy0 7;C~/5ŝs;:žu<ޒ.c>5g?koZ l 65r)Kf~>ET5V7"ݜikԑus'*5 ? ctK8̈́`IU&TYLpY2M<Ъ_.NOe'ۓP٣ p&BfPֆ.p-dțHo:ImV7@)iJeGR-H9Ch:7mP,E !RL$P%F-uіauS@ If᳤)& Ix͸^kbBiǾR-|(-@KJ'r[txx)7D̆fo|VEQyw(0bJG0\RT( &&TS Jh!EFet1{O0T3 '1[Ϣ)Y8;Jp42nd,Uaaq(X[,<*.振oH69|ß4rHn=:jРQq\u|ɉ[~%kꋟw0_r>!K^I8^{kX'GHƱJ/00Ab4Lf8h܄`=3)ŝa .1xTIVeY@1 d$E\3#> .jጪ-ϲcv>3hRq 'Q?H|m !K$OW\=V"6x5>(*Fc]Ȣ3; T9IZ mH*aEԌTKdCd%OړHN"#9 "ẙSv8]R{gXS&p^:ktSӬUc˞X=4Wx̔('X҄IX#ϚPkm+GE.'HVa11؝yٗHu1}v["ӱl4;9:!>~YbToDъKeKֵ qV 83 [*7Q>vEeAu6y۸s,裏çwJ; UxU)@PCR|>R!3̣Y~w]irEd<ܺ_N;(6jVwq&мEѩأ9ϾU=82[QZbQb5W2l\Ң,Zr+&RÎGGnnA/v5}{9(aHhp1[-{Y<^]͸fSOYSjThXJjT3!l3եL/gyؗڳ-ۥgVljk ťE 4Y! 
PrLCXkHG?nKĮ"״ ht8>#zᦃ?WJeNi1q1hobf2-jKtqi_U[9GuMuHu.ʫ}/~G[Bm:zƽ^X(Htq4a+szcS:6A6 TԂvm2fGzjn#8;hS $cr(EKRTH-6* * #|w8ݢoLhd$rxw߸p@:]QIr+ ֒blQ1:U9oհDUF i퉢ŶZRH抩(_R;ȹ&;{p'dznYv׳I-M]1*Rr^v*t4L$b|V5:{uh߁!ӯ!۳-cjp",6Ȝ'\H\ª(R2Y٘zZECI^% Rdg !`5ȿ7E֜)RQW2Wiz,mɄoy'_BZ]~rTǷK?zӸ/P7EA2F%fWDʼntq&)RUfjc<ëNu^79S ą(#&v6R2l rKh Jjm"JT5` `` (XJ@ﹴ ukΎ~O|o0V+NSm2\C:hmt52%㶤mB \&clD4ڢ] ||*p_ƉE%,@X,>!Z4-%MY3.V7OF4ꤍv 䝷Nfl(BHG-SrP" ] hƔMPY;G8@n617v4-}Yk,F'2:9dm!b *FEe\""m0+CP#=?f":_|CD\<8_U*r5FBqd,E$}K.+T=dž izP;9.~5œW|2,nh|'Dş` 6Uն BQR Mqâr Cj )(ظ~mᄢՖgRIB(!脨;4ia t,8^WC U*E Avl!Hǁe1"[dJHL*Co~ګ9;7ؠ;0~lnm̵/jV&+,@ڛȨl'm rQ' 1bɞmJB9 EUj]Ĝ1#&B;;H|,Fx9M8T3ʼ!{!mT۫ + B %ޱvYYf%_։l'4WV9J:RZU DbJL 芢d/5$ь kC1+E%pu_zoXhaӮzPq<:.>Nߒ˰wjsmW 糳arp:13Uy`HF D7ff!rd+%wg%_R9gcduonyy yԋH h1EE(K#JLlsZhL"Pߝz4k7˦WGoodS'y\&m M~hm9}m^$˂ L.[:ysݫ_OߝN?n]˺m~ɟN F6u-j fƩtTCH+E(Q9>g2=;-{T`"bC+kNnʒ=ģ5ͷ/^#Y֭,fIҕyr2{dd;2LȴKȄoq?4~eЫߦy<<OT1LAʲ5i2=-w]wzmrwy/z/Inpvt@5(Gb\y1H}{1i7eo吝,Р4{~V}Ե_-.څcpoˏbV,_0 :Yv|tD?d,66c:=n2z9o/b>ͦf_w5󛝴]FV$#{#u`=Ȼ,mQc(Z]CPHvL2páQ;>>{}v¼7F;T&{mT:AKMB<a40EQ:2rq#/7JS篮 _}~- ZPXѺsKyK;y"FufA^&׿hyإ/&B޼P.-m@kli C lk%*hV*;u}#1gW/٦Ǧn)~qX_,_׻O/O/~Q>s.=&I^~j:HwxDziu!߻FwAtric^..ǓAo5·4{I ?`5!bbP.,W D)Zl\-sOr[÷"Q(8E)Rykll(%]5QS mY~4@+rfm:Vd#9-3Zn1F>u8 RsJBYjf+P]aJY:@_׵P$^NrEm?#]l|5\\'ImUb:_GʷdzɅ^:&#kV4೨/u5==seVm1#j) "-QVdZ;U@72vk؏K2B 툅w;ИdˋBTU;yONNOuP'M:Ue3% 8`,@& A+MAY#«+b[؋(llK{P+uAV!UT|M3bwk|Ab jw}Q:6=2!mDE ʡw5#$YfL ʈU[YW*9#!&FA0C -A&48 FsuY%%35v<\DbvKCAn/"ΈFD⁵ jtFX).,GŬZR0D]242 R>hj>hp(:؂)6l R=ꌈݚCʎEʹi싋3.G\9B$t@yoű]ˬ!9k(Ոݬc_< pTa-S)?ؔQav;n:ty;-8x:\.e1FA,Cus* l)F锳˴:a#ZkT~SAxpm p* eC \SBP) W,,T6A}{aP*t,_!o"-'1VEUC9㘆ŤtHa{ʮ䘷 sdz}7g$>{wJϫFj>}9_{cYGϳ?p}UR+jxGH\ rQNTmT5EuЈ͖Wf<  14}KTڥjW$/霢6UpsW` v2ҔCƉsZ]4wk%'kj ݚs;0rCB@ 'W6*O;䎛Ӽ}{^98OېY)TO8ZWs*xў1LgoH>BψΞ;x  R:tt!iyZ b5ϮZydgoja(ֹ,M/(߼<<MNZ~_\j_-z''Gl~VhBo]lb_ӓϿ2~p1]m|)_*w|9C'{?~kr=#!dΊ؋j%^C8}MC\ Y/ӎWٜ~6A/_|Yc?R?Bs7py)ݟ͟z{~yo?=d4`~>~Ņ_ k.,?ںZJ2x8?n @`M@a]4ׯYxױGr>&d/wfR_E>#nuyKd͸akG(#Ǿ]B-t]y0"ovcIN3[()(ӟ3h]|sqs5aչ8ܘ)#[$ZD΍ѻga9J `<h[#Z[6TlAQd8&짼oΗIL^mea^;eBDD"K4hZܹLMHȐ>r 6H 
\(/\̣qRZ-o+b 鬪3 mi/6:ZY\z[;oJv>k~nA@ݵG~֟&W:wJRH#V$Kg4F2L'GǴaRr',(eL3ǺՎFx:E--Xi#1 *͌TJ}jܢОg%Tk\z>b, §R ,?4t:ZL Z@LW J T?b0AI >M \μ UV@В zDŘK^v@8@Hˁi݉E'2VYC*ڂ> h )Jm'+ NX1eJ*SʅbFh5 La[J{zߍ* Jtn8\וࣃ1Y3v ̟/>_/q1Ӽut&2)W /u%cgk'5W'XT.<&;043YSI }m"@,jxJZQp^]Q4NScp@ :g(@,J@mrpk1Y@gϺ߱6KRliH omO^a~4z{uYC1hGO!4^N'Ӊ_zL89s?hf5yb~r|6!FET CN}V1ԄabaM/ kDn4oz۲(r^b;?mrȀAWB{]7>?r>>s;/Ln3/||Γ9([)%7ʊOr% E13|Q_Ǘ+`S_l!/&WQS`%*/DsҊn_{ wpy\k9z6RV`ϩy!Vǣd骱͡n6ˏb_rTwGWx7oi[3_.1rzF[[;(~ךG{ѯb/ʕ;yL޸rɻCOkxE6cߋ2(k抺l#e*Fy)f%mɒ (%=SQI^06 H#/@9v|GL%=s iesVT5!|Xaj0bU>Q :7pL9gIX|)),p\Ѵ=^Y6 0ظ+h;7P)IEiJEfb EʍF^&'TN(`轋QeIz|R_[W7Uw#:ڄ둂.,"<(A|T gZ_KMQ;AY4kJ4R"&mh#B\ƂyW" DyƄ[BJ/VD r|_pJ&zZFwݡ*{9;_B=ho'0bZj;a7@0)r-B'ܩtrJS XX+Hr1iCQ 䴳FFC4R9XV[\.i%T)/>7!<585N 8 ~uu[.|r4vзgC}XU +ǝe84ޕ!U:֦l5v3k[[HOR^yZp]Ohئ$z4WweWa͇u~Am(xujv#X'OzOM ~pC<-<#w/O &O\jspkm iI&1BqCNvBUQT0bV๨$Ў7ORuUQk9i9uTEi( i@ryJpD㿎[:Qd< 4EqD9V FdrRGΣqQ;Ca JJ)__:WLɸlCTeJTA駸R# 37>y.'~S/-S/kjyD)&'1XVmw!(aQ$**yd 4V7I_) `UHadr WjC,s4@&qR,]eI\1Τ!cr",3LG f D'R8{YW utmK{Q!$s ^s6⊊!4j Z3ʵT |x.aӏ999,6R4r嚱:C~T & jZ~6.NZ)o[^ i\7t5Z.B;Eb"S$8(b8ZRlM<ލqx/S1EMT 3'+ȈsY>[bb)A.'G⃞E-COVe>uh%('ĺF@Àp74& NSrc+6hGrvڡ$h9e?Z8AOi $rɲ +6ybř)_lRBM{ߓMo*E^iˢdhuWy9hR@C_pޞRd*JQ"c!QAԅfIPc-x=cJ 2IYeI&!:KQ_(Em0e$pPx;m<8v(~Q\P`H8{vv鳣i)kBϜrR2 @'RD"ei`̥˥7&Y4!&]2Zd鸷# Dit K4w{1q WC)KEhjR7quD:I9B.V t9AR8*CY&я0;gѠeF?ki^YB'7@o0q;Œ"%qjKQ#_;* 9{HqlK/br&.CtR+e< ,M(b93"hG%F;%3 p^6I+/Xx\~($-\;{&GA[xmvH]ڒҁQ$0~])4 S4FC-3R ?~CbhA N<f,)rMQ5>j) z-;GڴsM>&ӣ òRE^ T Px%Ei(*HFȄhS00#1drnXK_apQ! wcٞ3vsVi-99mAgmp"+dsRLL-?w>6&4 E;uu%0>ӊ]R2b20 C8:NES3vj&aL##LZkqPʊ Nׄ3Ka jN(7v݉?y¹L 6Y}-54IO,QpJ8BgR)OWnr,({pEYE8H'>>)~+$EC$yi߆ooUkvr5fEIlхC oQP ⭘$~ׅ,]AˊEE 'Uzz ]/\tm|7ռvJ{Rv1iQ5BY/:Lb /iZ> *2AC46~j4nCj`?'vpqᴙ@8Ȧ/I]4w&& X# 2tlڼiT KVhrfx"8 N1YOdHb-Yćp/RKI&dV'On_X~.qr3FFSo22E@g ^AsOCNmN=gC!qӸ~{1sfӎ5Q\*<}K{6Qι-ktR]&=c~8iYqKҩĤ 1'h&)EKPn]%w6UH55 .*- KC$ǔ!x$F:h-m֜VeXTVYn"btqfhOFI6+tܖת6s_EF>z_,qE(C{$ -Ⴕ/(>uxo_)p:SGE5Hi\PZV(a#hC6Ue48 1_ z.~o5f/o/ֿyi~F2֒֜lu$6eBk[fPTL%g?k.azDm& d_+"%ruQ֢2_46f>U-uQwakvTXKEs`Ƿ? >r[ ZH`6_? 
{QJDZf{޻K S8Eհ=2gCm3T0RBfLŨ2 ה-a,[J&g0ϔ$(U-/L"$)YqA1e²6|PZuȚ("ͱ(ҏ8skbnvMaLvɼظ+rrwƑ,S4DL]Kϙ+%Lçde삘n>qH9#S"E\52)W2(`(iq*)@\vW$fZs΅LI\&jDun5n`:q2]G=(̗j~ha.&ê\o?rh94#x:}|x+ ¢qɔL57;ey i7ȏD.Al1ʊ%ۍ^],JhI9pV\r L}sd+1B@,>Ҟ3Ob&\_+'߿>M Aͣ?O;6mc Wa QY{oN44~ [pc`A~9~4_oH6qV-Us˛zͫWׯ~'(|2|Q`&&3]Y:OJTJ6L0SP(زw7st. U]: gpp#.JJR*5A^18ÎW7Z1Gw~+K/ϣR@|Z,R\Q[.ʔaMCL 8w`M>ׯ~ɡEG8,\V'^J^O>ІGkB9sr0X)=[&S 6x&y|ע;]|F+:@Q(o{O& FfE)cPJy:m#5N&!2ƔBYd\镐Ȥ9eE"4NḨ̌01'[q/9(5Y l` ƹF'Cɶ{K?,PVBmɪk'NA~?XxءgBG\ [GGqXGj"c  sJej`Igje,A\EOqPtF:'IkUPZH($42+J<@\3 W>9kmĻ$ޠ C[aU~j5XCy߈>Ay|t $F}x,gPxi ^ֳ@>|TRRR23C7Ԧ༷TDY&1bf#u;˝^[! /[kNd0FHii2LQÌV <L,KFPyΖ:#˜IwWlś^%)cB_ð0_?^.ly4â貪}E/Q5  }E'8qq*#o/Ƨ;gu"RoP2Bjirj~lhW+?I1{-3kKQQLUA[̮PL۔GMݘfK^ɵ1gh_!]'_Cɼy+Fe:~!k/aJ\'ϸW'S#y"#N(w1]y}*kp73}tf"8!<\i}67KcWǓ_m7FsTu+wp!7뫮i:Grw"abvzio|k;7rkӪL:M9vhlk=30݆}2Zl7t c)zxJeHf]!,d9ݫzo|]g.;ԼTrݠ^ ct9& 7,ҤVG֬t&xxN&]?('= .ʠ5LfF,*: 1;]Z19J:NRp'ඛT!m=9)F9B׹N"bb)\`"J.(L5hImfަ f@!l8|?X! D3.1*/ךжn[͹{Im,M!mMNftoiF#?ڰ=:تV;!bA1EZ`uV.)*t[ *d*3hkwQc梌S'1[ϢYp:EEwJ0cۚܭ*$㡺 y Wvs7l.H@WWU!YeoЯƦZ04(S&ruRq+qG89Mk!PPvaDޥuM ',j' ZSBVLNHs` al(Ru%Za1h$pPGT;"AR!Y&!b  16~.iǮxᎭ8w?mgxWܘHُlꉝU~냃7 #;tƈЁڧN I)XMz%=?.Ƚ˾O(z<}9%T]7Z :ᢳP`H ̕?3)h`UY;!u*xww|93Kghx ~Ч05o>,15=SL+EZ/".\Iʏ PuzF`x`m-k\!;*X[' gqBIt&ѧ1!!g7dSfL8?9n=v l^|i{t F>ZliDF'})dhf \!( TAOx&aF!RؔD)&t^D$KYL@L6Y@Gi<:z4Bmއ$R\"\PjE!=봆༊YH]0DJ_O^oXh*`ġ R:; ,pR tƢs'oFΩś$gLIXLR\s5T|]`.X;ah5O;z`i_vyՅ,_~֧d ?9y{ڻ+Tmu||ra_~Y_c=|_~&Y^gvw>g'٫rO-?RYu89:~oMϏ/S/^kM guī6/>?c\ڥt|>~XŎw/c݅_c4>__?o}|t2xݯ]}˓p/Ud8Ճׯ&SʗB[JBI¼PFMd:gv~MqCU?aznn]5k}-i{R`jڞM1q{ ׽_+]k /~~$?Y ͏ c~ K.tzݯk Lk2|\^he챞\y1z秕v΍zOTzLD~o33cݤ=%j/2fTn ݧlLmSg[~1%kfN&g_ߗ20_p5/rsO9G(qǃ1+=|`zF=n\RΦGo!b8Xwڮ KT!Z,Č M.4-sMuw/x[t )D xM!Ġȥ@sI&)/._LOx]<5)M%:IY !d=zA%x2HE QvP~5^ЫHW@rhn\"בQ #u:#B.P1.ʒrR."iBhd{9vj,ͱ(1*Uϸ꬞ttxR0sЊ}omi 椋R(NbA0wqå u-X30O(A~!y_e])v{l:dǶ9 tz2vOAc+At )KW nspAR|xAݎ=`P'Tve C+n:aAc8S9c9@r1- U1m*(U^L QReDg>H Ȋb 5ؼ~{NM2zm@Z8IŬ<@tG~jnQjou(#mqL'vJ;Z G=6d6,q!&=]t+>Zuz drڽ{G7l8@jUIcH*Rvd'vp@yo07DLD%ArQXiBzPvqd}o[w`$< MON(u|~:[lZpP+ 7$&&?%%:cJ`D!*ږ'/MKi2d] 
$ #gkè2S"qZݲoH6c1gAnTyC J!;;ǁ ,`%=i{o`I:oɛ>wˉZ,n>ݖpݧeU( JG`Ql&6mk翢ffmM=ir=MVTIJےH$c{<9;#8D4 f><#bAsΆ&@Іt +RAw&ν OevRTmoBŌGmO*wsbԅG{7 hft[ZS2+ϨC&WV[pe;gY }FSxI$ċpq624{lBT ,7Bd{yڜ= Cy h:)Pa)\B`ʀQ\ޞ(OSiK)CRlFyQEy|./F N8 %)2&'hyC?qh\A߃D:03Z)%#0bjHe$d`d^ ,*D6& o~9N&z&:Az)FP*Vo"I9D"pyʥ^LiSqĺV0VbY-Kw?@>/ f(_s{${ M `e>DQ<[~ND;سQ\&$AMI#9@#' B 5;(ִ:3kMF}}Q`3P .ĒȁȞ] ^%g`C)cB?[Ż1lۥ A-Oۥud(tYZ(\V ~Q(z,݂we(Rvktj,1PNz3u/j4" u-!2/bt|A8-}q@%:.F tfޮR[Qތ_Vhi.+B#-Uu 8< ^f@7,HB淝HńMU2A)m1&6aZ >S)7Dq?L U&ͯVL߀Y~-BΈh8BAӄzvt7Dj13 w7GgG"t#3,l@5&0qI@7¯¼L?- s~Ks|]c|bN,r`pK4x4u_&3χ?_]|~bAnLՏOmn.W_y/꧕_G~BM͙O;-~ 6 RH7v@+n،g7d>dpg|ݜ}.nuJD O4L*"iVJ3nKÀ (dY5&3 dիz$+WӻhvuT~0+nП&І޷ DKͮERSPWiyͫ`GL8ߙUݷ~>[嫯 w .'PYMt^O` 9:/aD|, qKllpJ}-!\6{7$!#)AbJ2Ictᄄ,) J=NUlӄHhJ, 3 A4vufοD1(3G,}γmZYh?Gھ\q(^GPJaI(s($10#g-NMeoqƽeojh&h~Kvs:~pFq {WϾV ߾U9eq#g:Aԇt>| vg;]"eftxQ}iPh`V d yN 0H!rRq9T1ez[)d!~鉺08`|M0",|LWkF Vb?vo?⨘*Q%0#3ܕf> {2\&0GBpmU]oW(K/E\?P}J!`Q z,7^K[lywoLqmF,݃73 rfakŕ!G_]^ih|d/ _JߺꘪeL?g,ىr>Oӹ[c2uE,4ad}ؐGA03E"j2xl.1ht.3uiFjܩܚ@s0_ӄ+,Y&7j(77) _AC?ɹv\QD.gTC)g(v2N}x ј>Ii ϗz|@-dn}O^YV|K)&P$ Q<љo< vk%lŵ:|?`P='D*CvyE&4xa(éД =riGNhA )(d<~uT.KM^u?QϟXU ^F' )IĂ4ԙpF#c ,`N x?c &ibw/pRP] [|M(d%`0/eŎ|G.fpV)*":Iz!HZ_zp/G XPeȁ$^fi BYΛqz5)>k::ɽHHz"Y*N5^#N5DSZ980O.@j"jb<Rj(C( KR4ׂ z:v]%]*Skvq{~?Y1‘+;hX Nlui!#u-zU[hL){žvcAbPEtbۨNYw-zU[hL {vӌBEbPIt"ۨN;ܙFvk@Bp) i7F1-jN6w`:twhu[|"#SLk.$t7m~}b9؛ =t>{;ĸdj\'UyG݃UaXKƃ0@їdEB^SbuL+h̙X<$s_ōNn4ɮ`^Xf +Vì,ez|L@ @Q\8 P\\C(EPAR?xGxoz͖=cgۡ;`ۄʋ+FU20`/_ɕxd|i`80Ŵq_8"Zzf&Xl_̱$KKQKQKQKQ^u"ڞkÝI ΓNrSi7n-F܍[#ť8X >vNvOa`!V ;5Q\057wp֎G+K g+; 2'z6ow%X؁A4rǖv~]JYm`BςrX(U&T7r$X*hY׋9!(q0C%'Dߒb%%՗|4"Ru8X!jV[bkfZ *D!2-de-0m2qA7G V0 H Y ,sndV,aƳ 3n=lQ0耊40hx7pkΕKLpߠ NhfP%hC43r SBɋ/}JHNƽJT%TXP;$q7b--Qh͋y"l/1?MBLej bާ)V`6!eB_pƉ(E {Lg*7|˖w0N1a^)feQO b^|>u6 GnJK~ H qwD^S IwVB)Qkm&mi&<$JW!2aa*˟6O7UlaiSV,Q59pu)chM̬> 3 Zmuk(r"e<(4kxLlM[k춦mI{U 5ZM`#vnnlCuhP~e-T9(6N-\ݕ?F![_,ZTAyϰFźFSOCyL/ZM`F 7d5!Ma:򋆡}yAv (Jj{(Z)h⡨k/nkʫ$p:~Z[5x;Zb:-G9ԫsLV&8?: 9 Ӛ<:Do79͜b6M+y_c7GijevLeСΙJ3]az{}`zC3->P_<+{=ټP+%89J;gp7sJÌ*i(: igNUίw:5 
nKNKz$z׽@Db y:fZPV{/}do]!-|{EMx**ZY~Uh^*^ h U!dP*zM --ǹ A-.}HoĎzgn[H9"D>&Qgqn-܊*K4rkiʨh&T\+ f+ FgLl+}R%OE9 LXF=b¦2J0Щ3.u) e8e"FY8U 4Zz]>($Eݚ$+pRl]p.^fS2a!T`hD$ʦs/E\*N<<$aϜ ЇH~t8=~MU3dvS/d"I RMA9'RzMCTa _$DH1|pY 24%n?{W㸑 S@=^`̾  UբIRRJ"%^ɪ=h`ڒ"222"29&OUvPaiD4q$B̤< I+*"8|P"E gQdL XCX𑂍?EHZ:I ǁ0;$'"iJRc&SPحӘ( loI@S pOɯ}XB0^8 A0 T(@ qU^!uojL߾%jΌ6Jmy_j=+-{A˃&s7]qa#öX"O 'aH(޵g=6jds`\3SSJCZ:S 4cүy }=kY+EW,wUEMk2?͛%Q|'>̏"O4akQJAsˈzZ NZz"[z?ԂjD~6ؚ;޶0/4$H(Y8ejN2ӆLU ÿeˤ1Lq8xCPo*%|.!6aAB>(ySfڑW1e QXgH Lo'$.&{{;?7𩱂A6C olCA.dz1"y>nw%q=|/G3c8"Ҫ|[Yx?ke/fQ0*ܻ#1h߿ЅWE T>XI d,ϚLv&~ M˳;f mcbsd$zGd%Rx$:d4Bi FgH~xdALDI0޿˵¯4z35m_^GװA9RT׈.hw1q3Vٚ~EZv۽ߟ^g) ߲+gwZ!T= fغ$ ěmYfta !5CMє 5CSC6c{o&bz !MW~ wޯ &XgMb`=5ba)0"\r4ѼWR?\/}xQĠ6F&]pq!wW}-jmu;yj${oTXyYPVM{mZ9%7$OZ1({\~2ps| tKV҆h#^M~8Up( $aEJ9.:_$o;_%9iECvPn|(ϝ )u`ٜ n=kCyYI_Kkt MiSBҐ*fX$!F)'phRނ=Psju8_]]Vv/wq LD)AL_pRluҷ8kPRywiza6DKםD:!S,D(Dq@4x%2XNU D8n>eWr(pR ȡYDa~aN_䭋[;r8*ĭaSUcM d%Ih-2R6jYXاda} eoַ;i1oߞ͍.tO̙պG1v!W_ioh"24ʰ 3 [ {orK~WMTO9xM\wWV'68׻0LaX?oh:д9A1nhXx3)>`nm7/ywU9J/_dL8QB{'(>zUT 7xsEjyߓ;d$夛t] U·-w 8o<%Wu0x|] (씩 }f;B 7mʐ饮<1 huMҙ;GMby7~rAωGQj)/GP z6a ?@nNǣFֶTL ϛ㧔)+7$kQ2{7zÕ@]^I8gY!14w`).6& &b1x<'i;^0%vsLm_@yQ#&2 N?*E;|8jD5ݬyk0LbWv`lJlKFY_f}yá$s`Y%cz!-zDIE&lloOCw(RZhG!ٮA8YbEtS.}=fa̍^=6Ab BGVPjd̙U AM|>wGAJ6sM)>=M܃ וv𹣠AElZ!j⯰*M6xax$YVB\5C*?OYjh"׎#`+")ɏց>U]Mv=-˽jka9GM<ܬb\GKb֚:zWdx(z JQnX1:jzՕli4-#_.-PK&5)h>k]39djk^+"B~PA@# ksIo0|e_#;QL (|1!^G~XTD%#wU{i(A5niygVY7%WU5&=,ͫ%~ anKnүJm@1X8-;P' 8,cbŨfn'ʴ8H@Fux9b@4#( AL*q?~L4prF9vv6/#&?,Xy!\8(SǤ-i[&O8kATh]i_ԅS>u0zvA*E9-J̅i7vp?v'ɘ\8n^IkACXbAB!ʅHPv٠|$KPa[&JݐR,]m+1+8BӀ9!ŭÑ I1wOy;|B(0$ljrrQ3R{ cd:/2{J8urvtM;;- 4KV[ ).G) vua|֕~* ?h>잀2;vqp !l|9K+3`O5īH3ɔ@vS`:p kg5[ޫۃUF\:TvY Ts~Mu6GMM1z?iLSQ9tu|txƃDV7}<]d.;-*"UQFb1W_ P(!` Q.x.x[4Q3V܎2XF$چ=(TR) 2r@KaʝEښx);1g룔|dIF8tLm)b 6V57K8T|u,܂N_M6o*+iǧtq@\; ?(dFMp $(9Sw[vfT)\b;qyn*LSUQB;bhn.LoJ~̈cRƬsLBW!Yv'ŎfʯJqGWfhyiޝ ME+ZR֔S8b!#Y5ȹ֪פ Qy.qܝBݴGۏ Vi1sD֐q`Hof&cӐ T`rN!ɋ): i%<b߃=ܒÔtNs43[7}=gtєx>&\=|-<xQ hY^3:c;\@@9so)3F ! 
RUKkquQU b&)WWy LX34SQyoЊ)6j\WW<ˆtKJ4PDNJ8F.KUzCTj̆E5r5y Q6: bzP"eyY e,p+S!bG']T?w)u*To;6d-лt"EupB\O-A";ÍG͍=;;nF kH!IT[8-`j*{Vp#\䭟ߕ|rl ?kث̅\o N572,q-M*[AƒʦMnu3[&Iu'8Wa)ń3Q*p#hӝƄFy#bVvW6"e AɎO~)پY/QK[N}GK]}W8q 8zU\]?ݔy7}(1BtGGׯW2^>l7(E,{M~ Wy.3~S,poN"?,g yX {wL_dC$w=/Zo'#yyǷ/mEϏyF~C]7y5[βfUjnV`#:VHoq7jcSHM'SSGK^tBD3FQq'T3_~U*w7M|\š!oKTYIc]oFW}y0 23oG]1b&)%Q)8,[U#"T޿濫TNn<6g3|60c_ s |DLOE1JINw1,qRM,82Mɽ7o?vi /h  )3/@9nΉsX# =[r^VS^J12$$X'q͔̯Ŏ[zfcv2HJ.Yb'.)@Ï1oPP I~ҟjm;W$V0#n7k]23\yBpDR{+źNlR)Ξ9ֈ0R,ؽ ; 8$oBˆQ" K,2"04\StSh%V?3s[ʌYzL ,:DI\-Y9Z^SJ}$c8G0M-mSd797 ) i:r!9$'q24^/j2RzUf>O)DVyҋ$*q5@S:x[XkÂ!g`|VKg8yJm]W B3?IU!YNM/5:QTKDPڒ'{?}2u(UMp38[.Vs+Br,wwƍ0K\|>Qƍ\[ܞHBe+ )`:N%adx Zfh1( d6-*P`/J*_,? `}Imt2ocμ juT~Ӭ "f~_|Sl+;٣m;gRfct{b¥e3K7o ~K*^^)A&_>ߗznMVRJ ( Xz!/ =kR(ߤjý̐g Q]CR8Z+u?' ͎g1_)[^}o}7h 4r&gE:-Y{{_'Jl1lqNju9:(9j?=y,ub+:ݶʁah?1c`fx&*1^)zufw-[CR8yW 3RvDlyu:nӺI%ҋnPN?;ӋFwWDOuw 5$ǘP>YWW95Ӵ~ց$XI4Tu7G;மfӨ%쮾PXWBqԕ"4ϻˠ!tɼFQG2vC75>m4e&5uT[se^#OgYɌLcZ+ǁY|04&B :E*@Th An롺A(>p JM7%4a"M&I]o<[Ilzk#6mR^{6mzfJ8ŭzf[u[[7.(MBQMBp;} :'~䔛j0x;jD|Nűvq 2d{]7@|ѫf{kӊc S*i)qtFa-5\ER=<6P<.ܳ$/JnD6@fE_s+mfAL`*iKE zrӗ t%0uaX ɢ#̹ųNi 9]eX h%vzBXޞ+%c\ lie 齷_bg@!koK7?1.>4$$~lwSfl\%{er-77ݱݡozo˧d}~zO_sK{V+A䊟b2^Bt-f7$LM jk}祳SS6\+k/$2w{ʨe@7YN!hϰ;x: [ߖӁ?O `7]6 {՟"ӻcC=,Qoe֊! h\%씶9VWeTUVYw)q^j3ZvukBC*ZGO|^qu+2Sw*9OQVѲ['WuJ t E&{A1H"Jg! zoZ_DžeQI`B{BPZRY&xx{~o>YB[4I;=spMs1S kB 3Nj(nQ/+m;jٹA_`9/U 3}a6 3_3WJR+HUb h 1'` c?-7R,ndEUz趗[V }ނO껻Fd0ZkZJWw~9;rtZw?ϒT{=L`6zOA2'Gf6ľ}sxtER|ΐȇC؉E irѠ@`vL^8NlGzp8cX˖TLPBsTsx %O[}G8|吗 TK?ͼJiy\.+.eL 8aEÙ+|_:P!d̢/dqn}%yCW+ϺH=yj'Ǖ'gdK`~/]7y?+&\HNJuF-SXN63߮ՁlOA ӽhYVyz~i;RBBx %o@o"Gӻ>ADxz-WP7}0ϖ w}+M~gae3r<-{p7J%a MƑgg*3c;Չ8iǶ9h]}+-e&6M,O;X. ?jLk6KY.N,fI8/ln''[sX"2RB:p&|Agr1~S֬v0Rv|+1`ݭ1Tγ o8Mfҟ?سanúmϻ? `=RA> 4Z'ԉk m JkI:ZDz{tj5L+ZnܒWKd,$fDּj2H/c_)4ƗqbfF^ĩ(? 
%Z_4 ?AXcl2E:š LIHD!H $115*C`Rn#1!"D"&dƁQD}﫤$LU (ɬP!JL*K$ϽrN1t2Fߌ?\܂NWHrH#WmSras5VU9~I'wgjv<wd2]7ԧ}kOm9]m9Ȕlrfzb\܁ٸ}XLoL Z )`CᴷDjeMX3/VR&LBưm/Fvk}Mj턻Jͪ E4(R{cDГb4!{i+oΉ̗E BB6%Ĵ8 E(s$xb"@4uS^m>RM)0Z[f`IlAU-b--v5ϕNj82tȢޛށy4D,_ ,X@qĜ,C!d=kA6Ar: 8 @R;۲ 9':w@dA9+@cF ks" j^$cPZ^I+Z;KOE\겹Ց>poGF{b}j2FԱ 1$~~>~b(^lFKP yy {LQbc̉z/3\fP%ٓZM4N!2%>gXߏK\jvHQL`X|G^i c|v.AHZ X*ʍ]a$xֲNFJBύ`D74}BCFH#aB51 }}s !%Hn/{OFrz r~}I2x<BjH{z$1f  #c\`Ǥ?4Ċ >F:TMN!X\pH BБaESs)k:>dq SRc;k]=sw@,w`3*Ě(ip*{: HkFW1 kFI+4ruJi;Xv;5VLʏu7= s~K6=w_ JH >أU$";Ԓ*iG ?Xs_,* [L'1Y vP?gzAJ1lzGu5D,QrJU"Rə !D U|$ 8R΂Z,LH;*Dl@srKG[p뤋5/A uR׮g5J:]^@ sNh}_\O ^bOQjBtqNCs&[\G%3N [ c"Z@ ; Q`0h\ـ)`֝48'٧ zUd{{ryvƗeJ5t?FN6DP |s!Z6>rp(KS{)ա - +8†L)p` WܵS7ŭogqttsG&@sHp׹ ^$xҏ4Fhb=RUxYf"Ql4qqFvntF0U<@ըZ#q:v{U :;'~UOyz45=_pSM $ԥec$txLj1*m (rkDdG$+ )PR[N&`Tʧ@(Ra-.kkP% ܊[ 9 ϋRbՂGxQ* :Am( S "6pIkd IZ^g[,悱n6tZe݋+Е\pBO=,bj~+!πPJU[D93 JGL8h &V[#vƔP$cX pYX4t%Jtcb+E]Fk,VDW D4",bIiHk7Kk pRK#(fDA+Ai5ILm38Ml3n+Kis|O_)?M3V;{]|t7V+F>鿷oUnq=ݵE҆'ar=# y"Z$SIܩv+v_C- !z KW_).aГbku,;׋I2f=\8baf*t-,2J7zx[tL|jfs+/VTͭY:sKj-hN]3Xh>1Wh~#Rpmrn0F(r89~zpͻ18q}2<&w1GL֢mA-G/ANsM*ǫ/C/ͬBB^/n 2VJ)]vTx݊n]H "l-Gn45hKI6oVhvBB^TAT364ѣ\>`đ"X -DjFZ[ āGω-d1Rpt=mZkJ0 k""),2"81@:KӨA)+E+P$  7\ms3z5kܖ#ۜxI]5M)DgyGII+[W\[8:I2s!:tndEF솼˯Cـ7σ) = }a.̈́e^ɳ ཀD+Qz%>#q8KtIxmZve.8D'-mz%Kˠckz%c(f#ю-3`N:t&QZ8j}g\'gf\.E27 =淸南b[-I&:Փ.AF/]Y4baɶvctڭYcl h4iiP,?_f˛k:]]x5-솑ڂɉj∋ݠSGOEktIIZ;z8]~MvjQ'R/@!xdbBdbrwK>Bs2b`}57#T~G(x]̽1{CHԫ=޾X!ꕶ+szJu 9gUF# ;T9pX[WhKR\+"RN({R #y#"QZnݪވRgG*r nGD`,eƂ5<d (R3Wח6hXY,6~ۃrCcQ(|Y7Wg/xym%~U` N^<=7kӯß}[R~~D~~`O͎|3~XNsf?^.>rC\\`8Qpb=;GoU@Wf>[2x J8LqcjZR_֡3ú߽BrW7œeat8Irqj@72/V۝TJ8@*W@wL?{a93SIF&t<RKw>Z_S_Fa%.Y7(\YGb[(-yNatNNAT-)%IIX,xe)1"?S:)ϔNw^#K(,mФ`lՅňSuuay1'cyu’$o4 Kϒζ/9oyS缡 9cO!! 
7Bk!pU'ֺG`ӡߣꏪS4[+*go97נY`=ێ:ێ8=6/pקsNW#Mw(QFV0N@ QHvIϤQ*쨚UD 39+]f C̏$fSx</zkt'M2u zB"XAJK8W}g9v}B`F=YAf"R~:I#}TmQKx?/e kK"= @qZ1Y]ԬqX cW,ׯ^LMle1hGf AZ+5ɩ1Hd2H"Ɉ%^Y3$L6zJ(fhj T26btG*?&:'zB73ȘNd&%b丆ȌBD~зry\q%Ǖ\rFX ΥqZq DDYFSLך.]( ]xkaV(Wzԭ$g=q3Ni%UZ&G<+N9#aY9lD/\]eR9`2ﺚt҇%$B~a/tqiЯJH(kmFE^Z&,'d/xFJrfؒ,lw|%2#WVUFW(P4qXEɛ艖U>ZIlUb.1;"zd-W ÌG):m6Rq.9'puw}oL63 :,@ItH_y"0eie_B\\Xmr^W$y4α[ó+y}gכ$ q!l;%;E$3k'Z#"S2hB:M:I9U/$雓f r-H )_EornM=Qvn\L Ÿx&R%I A ܋(#Q5Nu i){aZ'f>1-A!vSJɦI.'J@磬 /.÷h ɟ7ߤ !@V45k18@4}N?竛;5'5'5'5'Ml' z N&&NS-%TЖH8O8@dnR%wf+w B9ʄ6I ZΙ0@|r4JbDjrw[Ata;yPcO%0w8@t{k 'Hc#Cbh V+ځLE (/]0T_oyDŽ0R>= E)ykQ7{@E~RBPY0%:euFSk͔!o"m!рf#5ܤGhD@6hHt9ǭS@LQV E$N?Z6UZ ,+D!pi"2%[1$qPMVHF^enȤ$\_Mr6:m>]cY뽫lJU??yY.bg?|  ac#Njtrz:EwomNMA~=|.`LruJLbkzuf`T{ciE"5?]fݯ3֠-Nd Q:7x' ' o#Eۨ=3Jq4Tpˌ1Elg=Z+ɇc¢[j.P6XSi@/e~_%9L`"J,4RK%2hPC@[|w2%ln 226,÷a ktR9H/)8 Qk$Ei"(y& b %Q[ō[9-`dz.4&G]]'-PZ9^иsAP~;4vrh!tl83*L: ~jAzWEm0(-zRUBH"h 56r #XɹP]04F5:ƟncLӡZA{1Vƴ-1c,ԑ}m[O_hQf!긇DO#BMhSS l;5Fkd|Zc+Xa7M`BA5`0Ƽ6FFSTkTM b $p(QK#,sji4ޑ]57. 
C{S#}ir$1** y 9O0ywXfJF*ğQL Xkr*itx߀vr wzh?XOc#d;syף uZ>z*3ù~sO/?s*_a #zz}>*2lϱ0CsہWiFx3J칉2 /…/bʍ:du2dɤNSgR}Z L2XZQF_#(t,D|ŔuurwpCJ6֓z5be2Sgvsw2`#ׯav?5,BxfiF?^dysRBqzyvM|zwzJwZ}#cJ;o-)Q$ h]HQ+i~+w"9Ix۪Jahl'h7J& nrBD*CV=w'BklXᩀ1<b^#+䤠HpYNq?dN(EDDPr!(R=:, Vߞly9N:+j~WGi˨@U|u:\6cCG(cCz/p]d;WU$mN Nw e[Mdq4P@DK~:͡ WQjvvj\ԗ\ ;8G]:a.v^z!Zb|c$ٺ+ a@~̷C}םAYzwwK|sg\ z'tQ`*4a9'"5CĀ!8.MSFGVQK%N|Kά@U82-B s*(tA\`rH;RsuQ9lFޑ~nL9-{jv1Ks{j_+wW`DFMe~2"04\[5'6k|>;M\1jzOcmO}nτk%$4wț/,:&>jm{w ZJ; =;1ro_S//F ,u-vY2]B @_[ŝ[˜2q(RPy>|_g㨤AT؊+4i/\X&1H)ng"Цsy+f*yܢc#mJUO4񁪶}AoPdTz2SN7b!Wȶolxb ݼW?;MlǨlr=l[+^?8m+)MYϩ2Cr>k  v7)$$Q)qVh(=ɟ!0ufgg8'z"#_Σ㨵BeRD$ eٝL律 ի- 6oB`hFq߂oQ844C.i^`RpBTA>HBCABpeqa.+N3D^"A~C&hl99_f=`D2W*u675…6 ߲7y6{C]&hݞ՟qz˟.@9DrOd|d!fFiǖ|>8wO!rޢ/EFA)t-)Zp)EW8bJQ9X:YMoSzpٴK_]m(/_9%>*ƬE9E^E9[(U^t-*9ѴytG0%awB*~ yԑpa]hQ\"bLAXvj?(Jk^%JZ{Uen:R+K5D DxQ# Efc, l]-Rп >N2ӯ vn0io@c^9߄M0Ñ| +ajN;^Oibњrs3K7t(T5~fAMXC]t`P(Fc`S~sYTb )ghZ۱h1 9ZRs]_WxJ6-SVO"z)ê8LG飿n0 (- `,LJ|wwf6~};%~zhK ӷ "ĵ/.mظM 1)WtChCMSX&Fu4%̺nEj1ʕ{n- 7[p sC4Ft?S#gs R"GYi3$| ?KlF5C0B RE Nb@U0Jba"T{@^l%ڭZ,bo28 pD"O#wr#T$XgLJ$0V^;@x57I ;Ri)+ْRi~XokzMRz{6[L-G4XX2 2 $:6I;eÁ 0" `'Ղ4O?xńj?B XکRqjI*xr]uRXnvh b\},Sn |?Oߗ>v_U'O)F"6}u}P'&Iv?}p1&eUP׼V\wNx^ݿ< CǛ|0b$ `7.zR♮8;3&l }x"Mf*ǂ.^6 K fE̹}8{cV;=`wѩ 8FH-Y;e+sf΁OdGHgm1gze9AJl8 ?CT*FO|!ZtX8&rv2" 4iagۨOzby$%* lhL$c27^UtFTz⏛-0;(lz,*p(9r3&цApe[˜fE@"Eg@82:ixs7w'6U~qyQGE \fR/CvONey'F1tyŪ¹gߖshAEl.dT]䕾{3Z%#^j[J$>G0(EVdE a!]"z-7KHU9<>lџA ~I3~#&G3TwZUwNj2M$;^}ue|+ib4!A c߼p,jX kʥ&}ViAz]F02GIA  \h 'TQNՇYU5赑AIxM4 ! \`2j nuT5}ȭI_֝dܺ{>hkx$*}(Q]Tb{<v"aRgNaҞtEY<~i-*ˈ׶-683q7$Rϭ6 bh+#bsNOfe^%y\UrWUYlYCy* Rk$՜ (V0ViٜQfsn{+5Wʯ+XGk⣛ { .APuX \Jk`)&FuPC%[`uBr1#iXGpPheS:Ă 22&:M5e?>c`7AXH3g/uC^! 
xiS&mya,RӉq}IbNg QvekBތDP ELP\p0cKE`@)SũSȁJ/q'[N%xB aa#a0-"VH9k#-`3\Kgv#׃, cI%@OuxX$ Y vsjӌ"6cC0f`/e9xMX2FqG=)yMFRi ݵYm냬>N<}ȟ5֧;0Oqpݍ.b %F6lv.r%*%NR Tpګ[4ퟞ.@,Guy\|构Xq^lV;e.ړ;6>P5 LLgC a' 8_mɰ/yI]`';%dGZi@%FL-kYmޣЈ#ٌNg@(Y)T(~qr,'I: FDjyMo$!pb n4Ѝg6y*͏b 1TNs.^ V 3-Tq9`^ rMrli/XQi!BsJ̓&B-M61% p`N' {| N S[%~;i~doխLD_YSx(RlkEn_SƾfʽYLqL63ŝ}1SG!w['DW6 9:9S};vsMz;CA_vlHv%HV_ I`u[ۢk7u6:\Ol{kq}sI^z]|Ûtjcx4|L?6ƴ뢊{liƇ՛k+=""ƽKB&x8ͤӳɧBe17U]|EkeyXRQWT{Ixٵ=zvzikO96Y5è#\blG]ϭw\6ea1/q>&n>׸فVEP]uVp۳p9]w6yq朶! |m~:=WI]Q;Z0ո[2k@zN\u{N SyzSJH?F!Wn̾檨a͝WzwnutP`'s4<ё{N,{(k ,AE3vq}wwMuш:Ww`Ƣ.wx.Cɨi+dFco6m܆7-#A\I>8uo9椇wco֫y6vu9;J:f)>sTu1Ox>voGbTf|4~Nn'5 {|O5aN.*yxP;h]mh{#Pj-#?tT~y}w<9)6gVR{}4޹7ia>˶ގ"9AFxMƍf@+ 6n 16NJ!J_>rn2Aq/Ѣ}{뛇{Ȋ@9J<KN$oOSW}#AyЕ7A6UجL&wd6 }?q#Ih{^s 榧N7;fS%Jl%79PsՇzIЮŠl#;WXfi5G#,ܲNI%lw h壉yr<Bnck3??U5/b!]ȺXux}Ƈhz^}jEF/q#q'[ !OVk$M s kW80K)VNіS:?$ƌ_o_}6/QS(tv2?0[La٥yCxKR|n)>w)^ɐ*%))i(9,aaY dᆊ,ι6?hMt Tq1_$`4*men!eek 3_?O#GwK7#;.'KV~4ͬ[}Ng+!J/87<,>%Y~9f:꺲)=eة ױ hyT9ɢ]C@9uG@SaEͯ?KG5K[٫ee^`0e$/0*,еsPU BLYpa у|_ AQJyтqiL4*1aYJrY)`JKx*02P\gQȄaR+j$yIK1βRi"O1a)pp ڂs3%xes* ed,\Ao\XfJPo[Z3ZXwhgHJf00 4)u`hj Q`NL)L"?0Xb:VX B/~@_,;< 8VX ~~1 ū"*v`+1=$+ 'LCHk(eYSA0R>1.pTo΋ud%ۛ G5CݪV}%2H|1;_59\ 2J6iQA20 .+x`Z8wwg\{\2*]Y\xy{ ?%w^ 6V=zi_^Lۯ`7wTJruxf? Ff؛+ 3w/z~lzmX7_ oX6- 7fU*Zr"%Su~uH.X7_ oX6,UZY7T-zɔ}lAmcTƬiEBB\DdJ.[7Ey/UDǷ}[{()Y7t-zԡˆuºybPIt<˺V)#ʺyjHȑLQ\֍!ډ'l=+Z~ GªFHȑL =ntvbPEt|Ǻj٘u?YѺ#Q[MJg{nUzݼuWHȑLqG DF/UDǷ}[y嚈V*Zr"%S 'ukyA5i޷b'J|Lb^k!bl̗ ZѾF bnḙ S(P\<T0v.I5k4Q:-ʉVz8=1'&(zWHX[\ Rwuż7TȷcV(c1R(%w1fE%b1cA5Aڻs1Ę cb S/lo1c s1f1c Ř9mU1ESIcB1V$aY c9ƘCj/cc1栚@hRac1Tء6cBx?W1Մ/Bm3,Qgb(bqֿ31ca5 Կs1dbҽ&>Ƙc9&oKj1Ƭ0W1c!5Aacs1APbJic1T'uZ1k,i1sHM;ی1kx1sPM/Ƭ1c \NH @Dfr\bKpJBPy"Uz!i)3ˆHDJs'Q>3+ r e4\'ߝ d1 tz4j3{s<0*Ѹb>ڝ C9Ljᘳ̩콘.Wx I2KE6 Nm<'Yq}5+YJS&c)r $v.kC3y@&'׋t|`|VLn{p:xY:,&G3ˋ!ɫWǥSMzܗ]}N'0OOܫut 1ޝ&DJ" V"?j2^%i|4lf:*X 3\¯|7L.+Qެzty湙͆9to~0o+rxs3;MNA,"8EQK>d(,elTC4-˴.Oblٛl74g_ܸFS$AV[i%2!0 Uk ڵw ?9K)R]HT,- X0g+ \ ʅP8*T*X ;MBL) (X_l#_Aeؘ̺8މ4 F"V!Y߬ 6.d@wWUYyTfeoRƓv.FC?H'Rrt %@x! 
+\?Y&iQ*eD'Z)T{Y$jt#BPNFt<"e IXv~$ sB[Q&}tȔr5pC1ѽƅ|b$У/z$ވ31HTZ'{|ĠdCD2Fm1!J=&A(r#Aj_GÃRF$TLգ2i<tGWnp2xEyg9>pvK7.??%j" IG3"xA}VR{M, %*)'@XFB= l9Mo98` 9aD7$ ΃/O r}ңˇн\ek~0xP*IC8X+fex\sH)݃[_?ĶY gQd@8c ӈkшBqrFTU鉗0f~Y8k%˪{Nˀ\-!Q{99Ւ[S?W3CHʇv)+ VYC×i^LÃ`>uLN2[ִhkA+'kyN`,%5Fzi Aod]w+Q!1EM3D{TWIͫ?Q*p)2% SCb NAUf߷; XesV(<32!K\J4Q "x,9Ì1_wd#&jF;r<<֕2᎜d`U 'wy#=)ԷO7 椑D e$ 3&}JA2ɣ"@P$# E#HB֧pK$DVcU%'ڀ،Fٶ#p#S˭ 7xhNr%vيϩ TP[a6#D)o}=\q[\ەFKtA7Mvîn7޿{ Rw8ͤÕܞN_ ELώ\XggChg6"O?iʹ} 5.tv^sAƻcDn"?H%_}m.&#~fniQz\#iڠ-?ɵ_X!( ITGâ˖!H6bj4sVNnj$gDdoZ; *)euJ- 0.%foX2^\j:XcT}u[ӫΠ15t Si׬o㛞 Q 2c{ DFCdxθG3$h>Bbh#1@F(QMv#& 0 #*ȵ!&Y)r-ѰŎ)CܰJ@pY97ElTѡp Ƀ=%hLonH>:/_zwl3Jauzqˬzb]2v"0/{qb2 Xn4I)TNv䳦dG|>S{"|ۅ'{l{rD+wS{Z%_*L,ſhͽShHאki|YG$/-1E!AJ JN)NJa%n8]={v۝^;ֱj ڷw'M܉pPG8Mly/inpSuCBl>ZM@*mn+*Vh<XFtGFp5LQ5%D9:yXMxdy9/0zn3'XYit*y3=)S0Uq5=x`ˤC0[2RٷNA $0TRVhtD :N2jL*ܡYJX-ad~j ?Vz)}K`)}O!kѺRqs!eYX;\ 7g}믺XLZ lAmZۏL*Ֆ;ʲksuByqe0'Kļo^: >7S"tm57;Cяd4euq w7 21:y ToJKmy58o F֒[(|D3OQ/LPDO(:ݔVJDڑgGxݑ3}Ѽjufi5sNzO Fx)إmRGZE5:(Vz!Fq`\BFhrp`BaFYRuVW;Qq3$۟S v)%(C`M)SI} bx%qZ~$3oZ?r|je\;^1,drz ~ه>e!.>l?^4D "ux!9󐬖>Z#RʆGځ@t8S%^hMCӼLE/ZoGr+Yɪ㫸zj74~e^mKF-糆є -`[&aTKtfΡ 4}]r|UjvbU>n+Vk- cx$#T3t\|vHG'O" aۈ [rԘrT3t-Q!:7e$>L}TaU$P xOEKv*[VHדqN&)ֳuFqJz7im0X>0:75TiܚK7&x9$N&k\ qڻ&KNއ6b/}N7 `CXˠIJ["¶ F|UjS9P؋ &J(ݛ}Im+8Mt;F ;T0-vz]-Oч>0IQBxǵCxӻ`l)G%yd.{X^;ʷ1)Elۊ,,ºq;6`v>Sߡ3ԍaOg3JK͟V>E¡17WC榓Cd6HRS)KQ5"FgyV]jjQ3)HP} u:66VAfcd*g/K[((nFmP^-/Vfo84NoA?מijER }/sZM7| ;u;ͤZ ,Ƹn 7K{R7[-*ͮTWDɅg /[,*o#*ѨqG/wfďUBA_!ɂ wMͨ/YC5d%7^Cţ!1ܔV -ïU3%y8Tə3g;y!1,noY&#qnv^_o.fPzƫ0kF=aw,B"QyYdD43w̳E*ǺA!i-X v'>=OlLJ'Ÿ\c5PAvwa5afMAm/]vSiMDZMEyVZq%٤DZ8>x=[ nBtJ)#R:klZJ%#Bǹx 0F\G "$~MER f9n^e63ev>;5e/VlVkB\|__χ;v&p!/ū6/qP̈ik^yG|c<9v+u)oGr3v &ӭO@YI x)2pFў`#e֍E @9c({wHzU,e5UB:5]tm#"S9tmM6EN>}Yoy`׭0 JF--~*u+(5.ӥ9e[M e|2'c%t45Ad+wy%^M͚ϭ/^/YNgW6ofϾ/tA|G%O2qSŗJF#w^< ]J̝X4Z6T;S8ꉫA# }闁<=i;C /+dT?úL/0NDTK-F^\-񷒭3rlՖ(9;n[O!]i~71f"7H g-ǍZb^ڗK\ v jhs9~OusfX"7`X٬N)(1Q;Xh][8GǎZK^ERt>9|'}t 2o嬶-4"Ύ>.x N§|uF8d: I6.Xh[!+,Z!+.; @q{ȽK?5 $97\.C.R]X0;\Q w0׍I}>D8]Tv4{MԨƌdʋWD 
{{(GRi٭;NdC&*P`AD/Xm"%fיX |Dw5ύ$s;"/>!j_8a7mk。P=KDő~5"h]6lGTQ3Z=cޑ3yAוbX,EJSn}ۙ}w$)l2]=D:L?Qq M">}THկbҎ$r~Y"-jyӡ?w4οw7 6NrukETHy?QJוUr:}Zpv$]WS_ Z ~- J+iGK'vw#5y=(("/\!7}o=:p)C 9 oCXBD⡭dE¹[ARF+AR[R0o}yM-\Ɏk+ -e/Gy4Tx@Ƚ}=L!Z?x!K$iAfɥe!(MB )-BfC:BDp'hp-E/5,GC;x8_Cl YF.#uSNjc_jYmi Iqж.")z ej _>%[)yN!g B. Tv<xHo{?qǫ:gлHveA2潳ڙ=:p|)Bt7!fj"^j4T"m,:2_T&ueY7I.S& aq' +ٟ?=S@ />X%{9`lj! &]%JiOw>tYuQnDOGf.{rH™$J\ r?3׆)<ꭡXߔI' uI{nРih>Of_ O89PC9:lr?>( i-4hc$R%h Qi6ߠns5vGzfӎn卩]EX>'q dTnz_ͻX0~ΥJ:%uX`["zc9hʌdV8/N0%քo d), ~Hi2c {yi2 %4}d 6R(^`N`uA.gzS_i(M<;gCQ(g J/,q)o\hӠڐ7\j#BHc?(nM0; )8JBӎ M;?X@D _q)gbq5Ʃ5pkkeH ̴x[5릝Iɔk2B+L.0I KZ,GKG@RE V!8,Sι? (񮣪m»+"S-ɦZFu$VkX4fCMv@1A~g\xtaН4^Lp-wn`֦,\\>=xqr|O*GE*BWE_B?&ulVy9rΒw)(WW- )fjo&w/O66FTKE  -*M&QlS"X l><ϔؐV4Q"10&(]Ǩ挅hH1P "fcARP/| 3J$ZMfB vV1=6Q{[̭v+| |ՠh J WWNLƘ);dBZF8fFXpN@ .QqQpUMا!yY$qZ?\j+,EŠHJA'EɅAJ25xf RJ|M;@Hj-4R۽r X+Z6mfڬg&08l6F4P*c ߚq&qxW0] }3%tWd?""G޼ԑ8%gbG#L+au$N2 mB jK[QBJye%>?(;b".rd0̈́*EDZZJ$UK D*REI pȠ&Zq%yͺ26:c$7C))|)8rJf_0KW[,\.O;Wj!' 0"_เG6|@B$S)ve ] 3LvoA=`[J" IGe+!I(7uм+Y<[ %xBF`Mem;ZRBOb#WhrC0ŇOM3k "][a ƃ|aqrTat* Pw+`O<ۡz?]&#%3MQJ{CY$(i9iﭙSµɓF(U$K#4wj$o`㑽ԙLwNT̔ʕ;q2[_sc ?}ģbqv v?/nx\|?xdؠZ̤R2mGk̏X1@m}X|ato)$;4Đ4hƸhYI~k+OWMyI,wSA=UG/U˫l'vXL0[\#rv>.#UIy֬#{{4 nko o~=Ct}Qto%Ooʋ%rg*9t/c#)9#0Fʳtm?IsT(M'ŋ5f~HWN2Ϊa1P2’7C*/i:8yZ篞LU>OMdxs}$tRp0{ 12FsmW3{@bŝ,k)?x~q\u-pJskpX/Hb h^1iId0TGoh`j3Y|}+d{ m ,73 ^O1 /\ ^"cxX+`h4E$#Ky N%(2 ŀePNߗ槄͓/a='m`^]]4k: 9ޡ29I^ph~jtUqk:n)d=[?nY%Ɲ5s~\3? ȴrƵm=r^;8kș CJJgἭYHJ〢ii69%g6Nq4ŝgjq%BvuYX qޚ A^r}ϱMx+iZ"gn=h׻l?+֔_e3#<ٌ<όMIGN1fya3viiXX|vWyU>aBj5am;i>1JˌꑖmhV>_2qY6X%YևOFdweePtnpwgz{zg!)n Iv;iS8nm{S\i clvDq9dƠh&,QXTⅼ|7?ZI]Ud5ΞzN"M ]NHG[-ھT8=.=,fk8L3 yǑdtU4D9BFqKo^%|{JX6 ϣn{4$gScK74G4h殥qe`bqIU*gjʺI|570svSuYEn|dKQ 8t>irЦӿ_s킂9deweFdc[QJO|o}XT,$!sm S1q3uF@;+ӕ-߾p\(yWڸ }i~= &LJCddVHB5ee\-,^(9yU/@:?Ű͇ܰ4LYT=<+w0;iYdIKF .*Ip[϶~@|3zu}?Terɞ~Stģg;`T`͉v}b%8JebtQ ]dбm1F95JOIl۱4f#BG$A0L2 a-Ӳœ9?] 
'ݝEю>.(&1H\zVYuX{,{Ч#{3RT6=HIu\5ƙl.8֔PI5~mRru>?I8I8I8I*{-6%hQY@k3idPčeR)ȋ,DH7=M;_)Kj epPS!˼ s}*/m=# ,cR ꌃNR:,] <{{փZKW1_̛f FY}cu1_%hUNP4hM gwQ3#uW?gR"s dfûNff(xuH"VvK:?~>>^>}t #l%UˣUz~;V^LͧߡQs$s$I~?i!]ͻyCꡛ<.]G5E҃K1/`wP %$ fR+=BQ f*lzPgI6+,}NP:͌]da'j4bRHʎ HJ 6CRD]ZV[}OϷY@XJ3*j'np;>]uֿ̫6߿.\)c$E&Cf@ee@Y,x@T?Y= `u>V0%?w;q@s.%NLA9KB_Tis$1q 3 ,B#dL Bf6f?GÍ[=_ofO<^6ߤ@w^'6z=ӽ _)7u֊!L__vDb`q{ HM'F3/@Sc1gw|N1īo4 CJNlfcr 21{r!V*Jq<-4J`D\N"%!Z^y1o\v˚h$Y] "J}Y\8SA߫ o MeE|T3ȵ@>܏Όf4LX7YU/ Ox?qkE(t@sѬ q8p,~|`U# -\\r8(2gy`#4OT띛(M_3wjDEbrYXsu5ϹNԔp^{jֆ ` {9oހ្c9[HPi~,<1 ^]8"%vGpJ8~)@ \`C&7P6 O}$UFYuI-Th&8˭j}k$HHF٘85i(~h%K?ɱ „$F 30vsyU:ΦJKMGp'JI4QV#EyVmח^ JP+VYwN΅Y"2堻f?7^Wx]}uMUWWٛ.21(Wʤ\.O2S | MiN(}̈́Bb-5Ց^erTY^e!jȱڐLjcj rrؕR/a3%QhNXR ,'91e+k3\ZFaUʜwhk6ً$`˨2R)B)uvYR$9$s Wj)ÈJDm!%È/dhĽQebWHRepܼQFa^j,oTk;Jh 7+/a2*LLHja:DTJ6ebzs9%AHf2SvsMh?J*3'ƏAlSC R=Q &U-!a$j$3Ɍ "8M8D0Nň n[SrlgFn< bZ:ރ-ظ!ۀ0c nlDA!KBQU@+Hoļ2N' wrOIo+qB~ ;Mp[+K}@OMއM$ pR_Խ {P+@K*:Rfr(ko_ VxyE6b iu.Ͳ*aMN!51l]0;ѮY͜ xֱ] J8@I4=o;ً6L+NЙ!`K YVˎ^mq㪇@\k^sgvb&^Em-wRi*cUi}|W2nõp0zؤ惹cP.֚(\nSJes R9'p#X0-nyrم{"~ժ= ! МKG\Lyk`Mu9 as!Ď, ѓ"rW@MLv-ujJfk9u *vBfȼ˸53fv6"t֢A*9yU|]\PXXp:u^rRḍ.!FNɃ-kQ6a37eX_ ?kcAb߱q{ݴؒ@SgT9fe5wh}O,"VgGqۏ%&e}p )츖8gR; ։}Gv< h+[(@|,SS87[,BX'M1o\UI]k荆j68gO@"C.0*n3H.d)hA*0SNLy.P \XjGW]5ML4^$WnJʂgw \< lb1Dk̨,,0OƜQQ&c .A:JnNH6S@4! QvۚpWp0 d= ",tS /ɯG~|Ta냙}xL^nWW.c6ieaL1H  gw :PC-XֶfHdC2_>< "%;\J/)_rKI{d08O z84ipb$5i?_0'9l "N#͐y>5pp;_n2I5$zw1T0Li/ (nq"oi/'2`I${`J 'Lk&I\0) L ;.08Щ}\xEMBZ~&.i]$4GP>]v nsp>$44b֥z daRF߼kV)5jn]9Kv6xR?&E裓^WED [Hh$i2*soP鹳zQQB*TCAAYɮ'_Tmʊ(Ŧ+ ckW+G^͎pJ?Ye3Yb΁ ָҧl|āM;9g7M:ES۔mc Cs( Lw5`F(mOmٗ AՆ濏 Kr%O&H$1ZrR2w$;ޱ"m yYH{K#g-o^WҧFs(Xœ&1ǔavr:B垧Vs9;a[;Gὶ _òAi\\lp$02ޗ )yAG8ɗx?ߴELdnҲL=P@yM{ݙbsv{C3 AT6+g>C dC%!` X~hԡJ.]S?1s#,AXu6+ǪNh ,96x )fݱEXaE^n8,:)zP? 
yXbP'L:CEm.4"[Ɠ Rլw%ģ]wQ1Ži;diB4ݲ7~6"/{wlcZ讴J{-Q#M$ܠ,Ӵ 7+2(7F!fE;/nO^[nŀ+n˱X7nSP_hf9.(٣ ~KvQ^WZ_ E-nrw;[0(f`r$!cz?_韆E朹yjVkӖ(]W4Jڐ`8o;pƺ#5<xXLކGJȿև60>dP5p*FǵWoB0-#QK*vPۻ *P )W""De}h]C:3~оt̠C]6y_k~ڗMJ[$g(KU"8Sgu|"*!oMNǀ0Buۑ.:^3F'!/A@'xb}b8Raԛ㘓, 27Q*u2pTv84`ZwBh܃egmjq -q-9䘱(\̲tzkYU/4[Z2HҲ1D1N0**Kp"]{r\]!i6v?8Aۆl=VM;`uS't&_@A.#);҅|)R] |Ы$G K tlAוRskhU ưwZfXGW|E|FXIw$(Gb1,)L]0iEQo.262Y61>7Vǭ1}f4'p\-ΑSt\r\}_)VwSk933l}s iJMU&\qΘ!Z½C 1(c0vAxk6C4Oe(h3[2%puRr5NL\J<@wV)^4`8RRfs5a~d[{X?s(9H2V ʸ 3%$V1#)譔J jAIKӚM&օgAxhm4?.\cˑIN 2s\湶b^P8ͰNZS gy)o<_otOs53ߤ]Z?=XܱB?Yr2g5~wH_1pW ~ `V~Dgg-n$[lǓĖ⯊bx𜸩Fd zpBfɧ"c)UP4|n;WK~ h#k&JPk txec1PZ3\jA@ܡ!Vz;yV Y1gp g}z*&?x&cԁ"ȤON4i(\$zz}yC:TKqV-]G#C*@c\zeN,  16S$TrcZ5vL2 1ӤQW̃KA84ɤ5:C$<+ɟQd9rG{b 12a|XIӁRwr^Le^T7E9=`ckEkD&8鍆(N`Z*0 #%p^z!ɣX/tL*)Tyb<F1eU FtqH_Kc1`M]+¸lw@,oM evZz$VR a`AuGf0E(V G ΙU7m.*༭I)F7cNNܪNfl)"T9M-!&exƬ kR@DJ+I9Oʅ~N*BtQ)tAYWIXo蚊.cI)ѷyP %^}9l$uLl.LQ̢avO?OZvBlR[II,WB>sY(6Jεõ>_l!֌ّF*;rމQdLP<цwՙ?ejLK>XGwGR2ΛxxJˠpL[m ٫v5҅d}CHYѳ"gEF2ZQ!3o1 &"(Rњc,˨~)"R2kWtߗ7ôh3.ezkWlj6M/Da$%r[.bwVŰb+ar9Ot3/G\F-KI~ >.i-\;Z&A;?eB`]sGttu;@nǟ''\AkN_?ʼn'K2z_WޔNGWy fkUQrVjzB;dO㗒>ю b..DBRIE}yIOvJXrxieF 0jhcݫl!$"|nΗٹEQ yMfB#7D3Β@ƒ"e(⡱}\ 5J"i㍰YvڂaEo|FJi7 ئ=K瓛rsfUqbjb{Y|-a_P l8 ^RJl\he,,ؖp0F&Tb+yVfm 거`,Zw(F 1%+:4^ ష'YIj4wtɯ1([L?.tIV}u*T/S.x)*p);Ŷ/ʶ,?fV_2 sYHO/+wx'Un䋛L'd´VI-7yiy%uT(|dAF򳘛7Rp}{V+mVݻ(dy"eY>yCMK埋ȧtPܸkOLL//(|AS0!V!f\p)դ_lTK!:Pvs\aEݠW -f;P}GMQt;RIɏ"Bdmx`~yO`hEb*kp2gK4O2XTkW#]~u^*dv̜@ˇ<[HkT7neFL~7)my`f]Om쮏v+v~mm05{Am?އ-:ѯ t4P*zr%cj{AoY3KڨƆeS Xh\tdxJ)PgisW76uZA*Bh Ck^%IE ϼef-8%gFGIS֢0&(8II(}`܀`Ji嘋rˡTQd3>2Ȋ> Fз.k'hH-fg`If<+zz@ -# 10TV1eJYIrT֘Dk`~m + IlogN Yn&JƘQ,X*,| z g-'ϴ8҄bo[4g+Nox;Z{/%;^ы!ZgA.@GGJp~?+5x(J4ww;(J)TbJwx(VIך-iK |)#Ѳ(A17\|k#7^f3&  Z#C$\bO0zJ^k$N ߼ ǛK}n7Y7j]6?cx:ϳN !?G.ok%]RZ wyqHE2v:ᵭn k7O=^~-x(i Q5#I_e`9_¦v&uU<01]\_3w7'˽LJ 2I!H Ae3HZ?ϯfLs*JfQ <7iS*^ Je so/8ZYrsZ쏋|Vh+)L}iXHw`e6ڐ9Y~.7/2^/߰4f 4%;Eޙ^V)XTTCUBV:#8HIG! 
WHL%ت"){Z+EiKBXDtGv-& sӊ" `b- Fm)z"j>Hj|(  6(LH=$6FԭO`fY&CHt%oA{Pyuvv&mXV\_Lvʐe\Xكv";#p!x3[> <Bl:|5ib1TY\Tߪ}|=k3m+)!3o(sG!bQi4F;Z?Rͻ6&dvb(t>+cT1yԟL/_ &~4fCP!'R9XЈq^,2L F1S9t2wӳlimt#EIŏԗoXG3a0z$.V -Ɯ w MFJeh< FsxԊ˔c6$ }/Op^V-!=2WҾJh̋ `UɤU9PSI6LaNfc>h*Z"o6wb͞Zǫa-G骚r-%0hbOWJ#3Ll=NAJ!5p{ZRmr60Vk L=_6wLWXLm{dIKFp7(:@'WJAB8FB^_-fpUN,' 'W]/NwοTS|8}kr6rc!CJ~Z|#KOFȣ@WHLĩh!90HP$]o'+5}lz`Q.<}b50b?E5$ ok+ R4K"VWJB0&%+kyP8qAOVPHV 'Qzm bL m}@KvxT.%ڣrջш`50,KA[3fyBP-Fra TGVt=|~v+!s֘z>z-ua;UȮSempDLЏ dV85{K(1`RH.iD06ȟY*c '#k?A&Ws湳cׁ`,F{SjM5,yF2g5&׮z&قnMfH-Of53C-tԲ RGPJ{v1X*9(Z%h>Y#ӱ0 %ިTץ_u%PjCP9 w'VӟJ?'+ag.TMϞȳb]S5IFBuT% |6#gO<ŝjaPb~habqbv~sԖ c, C2[VL;Gci6b^.(% ]Gt$ ]7t,h'H&1}'dwI[jnDn00g9p\`Õ$sqS0瀥(񋈔+{#EU^B V0&4G,XTJVفRUABr+wc%3;Q;/=^zdK+_í pyWT5kHQYn6apd\K+UKɍj$oJhxw=T f5zHSڂL(D7JvnOuz o}1)D=eQ4z]:B;\0`Jw!F@Y}{}w!5.D}9ʊ^n(mBް?biӫJ8zwVB4{'f֜4jiru~Odrœq)DSf9r=zE-nN _xG1`j}֔fb1Cd…spoMvS&y3eTr>Nxu3AJgFOfK&Mlp+gջe:*|& ǠZ 쯶}8pȽ,O^:;?b{Ha<ĂbNomhFׅ kr =.U?(RJ4coX=r2Af]@Q,0x[XI|X=_w8J:8|D-]CnRˊg稲:n3**sNqCf+qÿP1Gow}M-㕬@s1/tCɲҪ X y\[QC1Q-D!4~oaWV&) G0ם7Øn =퀔<]c?| bsߺo) 3mkW5(ool:o>ۋIZBӁ|z|ҝ##p;al:tYΓU"p< #*Mա!N {.{bl3` ֎IL6BCAA];?2U]I&ZpvѫZ0"1{b doFP&*~2;8؄(r7"{)2M̖V.8#klA`V OkX'rg˝65>}8C yу4BK<)pgko-4?t:zUU^pm-U?k}wu'Wvz-W#l<͹00 uS$3<{hE[]jzs:]V: Bac؍ &T })NF-*=EM+UؾnV/2_浿ء ,X 7p=JY!ZSHBGI_47>yO'8Bk^[Jqag+J2P^+\GBWex'0Z?WW!$Qr^xhtR\yO X@cx(7k|NJ1c-i=WIMi{7HlzGgX"s+3r'-%zQ#> 5O:e}=~A&t@hYn4\#>ޕyy X X( a:Odo[Wݭ3I^ynrF5}eSWJ-2V(Jq o8y2Vp=cm=uV:(yk%~5*Żfa[sVpiuz#  (Vd[LѱGB;Nje;^,?ӇУU+R B1L #k|qvmkCe!}xZgA2217ʙeUͰ$x8f-"$mxO1*): MɾnۜjV[L[9ŹYc 7Y$3>L( bwܰuw@=eز)ϵǸqJ`gֈ)ԗMvWs-(E .2_Xl-\u|(Z'njEn2[笆#o= %eI\FBT|V͌ZO GA>3e p#8r%d z&J"BzYc Lf .;0 0m${YR|C`Psx G:fK0ah ݨ࣍;״ǣ__5/WmŸE^.cW/pf)WEOtiљ],7_oV6) \ђJqbdq󜕂mEs۔'*j~1Cryv?_؅~\Ց^'͙f5?7krk볣4znM7Bd"޳`tGU!gxKuhJUP6{uխO]j"Dg֠gl>|[5;k$,TZC[KC_GɃJ(Z):etθ\h\m Z}tVIs)YEl"!֒4u|Fa[ VO5aI1ȕJH[© se<)/~" q|NO~7ž;wM~,pֈK6e$kB՚ ǘ<C5; F|qV)78{QhWeE8zps 7=3M?&##U.'H>#!֍0z-d6<:sGCgj޽ I",t1{/go"NNQP{( D.tFAА7`&4tAU%8d PPUp`<.tWхSؐ3r Ih H$ц IZI/ik#ǯp%Rmb 
*"w9ebM5+քX;|ՅS|FquwcwO<%2HlQEdL]lPJvyFቪLVt@.ݎ !#4qqN!buPㄇu{G}ĚG}ڽ:Y#:`k8TpTDd Ҥ"2g 3|Icp>fu:ٔPO![H"Q F;$vauMJ@63XX#!]L-E}M=@KH!$dyäT9ީM 3!eRQ{dp:( z8B"Z)R$|F!ɀpݘ`@wFNdpM'Ah7wy1?B04X==ckә- /<O?1clc6&ɿN?>y (P8.~)W9ncsؿ GfÕ]|\0[._[+vO.m~ޅyWrm}C?|*Nȏ WIe(ut |_ =̷U @3_\?7ƶ;|p^==@i߫__ /zo_A6Q_jlw/?^e7葷G{^o$I?K_w| ȕw:7h z#Ǜ~"? d=|f||z'r(<Z2R^ $uGu_^HB Cp|IW2^h6yM|P?8K1pb,Y1Wdq9:i}v`gq(, O7 =2 D4R 6/.hDf'ɍX\pG-}37;fO2X}eBlC(q_noaU ]yMi4Vi N@[u>/aF5uetkm5h{J-gA[ jжme!!$D\CnK q #O!$Ku̘3Si8, _n["D ^th=[9Ƃ!3ܞi#NP0SH!Z'XNhl$`1&w(8M# A!4_UAՖfU|xC.=<ĽMq8;Rz$4md=b\S5Q#lI@ND9ICDPH>% ߽@55\\-ǿGNpEpEpEpEV#B!X}˨8gD,;^a МQgMd[YV,mVV͵\,jK! D1Upqt!zUzKCͫ>|u8FzCPrb%R*:6u"G0&UϹS)L2 PRcZ ː >a<4CSg5T$Þu姾8j!Z'G'ఀ]0j?ݡM I{`B+oy@6fhxeZY5||wCg dWd/c45E U<0 Om.ۋ'}{T>Ij,C.`9zDh} ,iVo%tQp U*̠(~Y'q*B` "jǙEL$ BUe;f]$B*2jc}]KOq>ПKu.}ԥ$$2 l!LB&c@( LEɴ%RTL]cy782vV V=t`}Y>+{ B|`)yc舦8مpgE;C4Rg^AUzJX4 ;' h@0.Ӕsdb(ÌFgd6D#a&Z]`דB_xF_5Wj>B^|!Y g)Ί!/||kr *k4S\m<OiBE+Qh<4j+w P ,&j3/nҶxT|A4o8T9 Db!7_PiqӲW&Aef5w]Jh#j:QI%:C<24Jᮖ `"Fo"w"K A3񾒋-|K~_BU *9)M-5m4Yb$%yٝʽr%<#f`W 󞉍} FZX5'6E)AU— ` <@@>@3TX#Uܱԭ= 273th\f_2I'A4/sDf@F0`%xB :n0*V&M⭺<ʠN)ͰvaM.yzԪ;5{IIHiLvh#J \[L `=ƈA3=ݍ;`>2&bv@fPy3(jjͪ`NVyg)>5E(X Uݔ~\Q+#\闃ʹ_(M ,-Iw.0yT~ |,+0ccu˕303ufeR?R V թG:^JmǑ%stpB8.nEZc R| KjR-]y;=9c=|6c^U+ߖR/8z閫5>YH=\qtW\chKۯ!ĺS+=KysUdNzrKSςC淿u,@B+k%#7&/5鋒^$Wj Wj+ $SB1-{EU5HY=JYq:?IYң X~ \خ' .1d0EbL={ѧ,`X)Jq->"Zsꣅ)\ c܃gzsVxj -\aȚ:}RdscI>5gIT0BRg6E1FOj~|<-k52x"XH xxn.$0ppzM)D3;|A,mpܬ/E 8 卑2C!+l4JhV LN*V1mL:5S>uUT1|L5SǴ>>ea- /iy{22Bg̃f!L[3"Ek#:"RNKŏavip..XIK'{V %g8X1p窘V/XIx`._Xj.:%}HJl iqdD8f-ZqFiV ɦ5}JĆ5ĭ!n qk[CAܙd1K]ֽ럅7mȉArgLxYIQ4H.;RZXh{V HGT{IWoq嵉S(ċDu*21(ƠZlGW ?Mj[V?5YOjBFbT ̫D"!ʴp13#$?ĎT _?%]AimyoC;{ mP`ݝ^SǹYK,@!xJ"To! 
&4g4mL"B!陫"BߊƤSg.S\Ď='u\ԙ:sQg.֗Po!J;" J q!2̔e2Cѣ]p$0~b%2^B /G}QΑ['E *AQ˙i VϦԣ$V=v'O' P^c>L8fh0Z^xFk4lF7ohxɦ=$!l|OOyeo?!3I=͐$!cBLEJ2I c$hRLF2mr)i+on#77\GdJs/'[BrFdL?Od6uJBEfG{"B%0:$`LX̅Pv>1OEԆiԡ faWhnfUsWeJg w^IF' :*%AR6tR+~"]RY|D#QH&d3T}m~6+4inF0+,.Y/# 6dK%e{gʋ2#qQ &R@0re{⨃y= Kˋz&) kw/-yk3/{QBjAFNPAs*ahGc$ɂzHtAh !,:`"SŴSO{BeyQ$8* Y C}<5.J\6@!d冀p\'/'F0EZw[4Sl9?RRvvB-Zw Ǧw4*C/=EPk23`.7?{[Ǒ 2#y!0Jt$R&iw:i>\Όs.J8{ontqyqqūd:,K//Y"XyZѝao7?z 4 *ܺԛWLgȼfŽ!]ξ1 y9#mKUI#%BECm"3̪EԀ**j!lėm,3hh"{̠`B'0ˠ9g:b:4{i+,/37&!m>(l}Pp)k׌'Qv>C|2’@ l?/ZAclʒ8|y]mSUdX{r h\F_CUn&kzD6&lF7`rLmau&q|GSW`1 fdcl]Z^݇w,qI~Pvx]U*S3L8qf$3*4.V DD96FAU02K5` *"S`9 %5m ]ϖI.>x# MmP{w7."d* 6YǨ)f栺503@L:9dvSiD T-XA9lP5쑶8r9l5ߞ9!qFЀdJnJy.0j`ǡ_CQFیr!]t9Dlk|%km*BйjGe=.:7f3Y/շ{Rf2agK>Axp#R~.&ՊJZ;9Q`h!-wzwTw!ꞁe9;rBK#kRQ!jm*>(8r(:+T=;Mp2m؛|0?hP/#nyyUA9H:EÒ,K=X?kK7 ta8rDDh3 tXkPW|X:eL\9@w0܏zh$qLmp#ACF^Kr5602m797SR0( oNr{E]Ke6r%}TrMߞ|ܹ ?=|,ioy)qd؝0|zuU~sn[# nkiC੶Gnk}OV:RuLFEHsK@AYPawv4(c/@X?(S@p|{ADDNM # %)7r+G]h|C-iaɶLqAǹhc:E#OP#K-grt&ɩjվjh4RtA=Sz^4^8𖿱pKƋ:A<'h˳ &eNo$ #.e6BHCP[IAx|VmbI^/\쵅[%`\?YZɌ; &5~[0!Jfc!af: Vїo*aJb[&ilP״I$!:-/bIO#pK/op3%RʠŦRUC,4'U %16#hTZTJ!T \:63ffѰE^ rjnrўv5b/0 [þT+BcG4'[ fW̦ʹIWe\:+fBfsdޜ x2~RU/+Jӯ3ChR}[CL5هSՊ F74S7`T!@,b&JSc aMU}`ޯe3G bWJs۟{&jy\x_R E^_mOZ 2ޯRM̻q/̜ l gkwIT}y djQQZJ&r߄Ԛn(R̸BLmXan|p3S߳QXw 9FS:L{LkKjFV[RΎ8ΰnBg>cNJ|RHb  ":,\VR,'݆Np|YLH{5cF玡LCKm}H5^OCTFwUvE0z4v&^J]$X:[a!}TJ#dСY{4!QilltPd)t]| p^dtNI>0[CRTg;}!*߼C%Loo {R(K|/|L>< __\ Mِ*o(J9er)Ye"ft-dCR(}/|[/}y]&lw45bqiLvg(?|>≬Obg=R:-p)0^$><:Ի޴RLK1/ǻTζ9KV9hI$BHѻx-K?|6έ N2pqK2UG5\gNQ5*m`]UOGuSnɋYej8O3g (:SUfKktd_s-PE^)l\y/ (%f p6iQlFWt[f zZz[Iif=//Y׵Z5%Uvtfq邉ӂ;C~qZE`b(1xl(QF:eBVg\d22!ԪLXt<>+2Jg-Խ@,&{rjc$}.atUL5WD#T9)'W!\p[>'"oUMh\Ow~ܯ:Y{K?5QC%ˋiWMcPMQɕ5GTJ'QSJLJkض钗"ڠ*v  o4d9RHQ;8 [L_Њ^O_0@ qDV@ѹ;غ]ü \n8 XYm;J@23R8իR9&tK_j,kc{,Ayrz,Q)I[?,ǙnF+XJaFg22Éq. 
[binary data omitted: gzip-compressed payload of kubelet.log.gz from a Zuul CI output archive (var/home/core/zuul-output/logs/); not human-readable text and no content is recoverable from this extraction]
W+b:D`M!Q`BW3 'qucv׀ZQGIj@٨ہ<}Jmo&$h姉(ދvsYXёK%ʑaKl1P")9u"!g-s::i\Qu>|+u،RE- ]^]˱_nهn"oQ6HvƎ.3a9(uԝRJNz>2;1rSߟ<}Y~t߄vnfVbk9 i4JZ)N+d?MVրN,&) H49"hK |n?Yi% .Kr0 fκT{c@!LH9&9B_zjͿi/wk|_oh7/l|Owocg?vcg? ScAtzȚLC[Ť\p'mlrNx/[-|fd/㣒7c?MW-WvCF௱1~7R[8 ژ Ơ%>(')'9ax6Cj4I 2F!6:B^aRЏ*?Lz^hР3;UEHDEQV+\Q};GBϼαZjsי8w_le'^ʫɴBS@6?.^> 骽d2邾.mBgfWjO 9\ztL_bUHE0oڐk9?Hx->%Z1|!CtjEVMs2!IѫUQ<=I3p^)t!uds )p?:]$7\s]yأZ` 7*?  p/P}jm z*|lsӭR%P}e?<&^:Su7Eh/?1{7}s{ҿ{ÿ+yûobɼ,oqMwhjM㤦ku&!̗a er"%&۸6w'x϶y gi yAML!\]'qH ό"q 8# |C4Ç&Yg%mÀ QHр9ֲ7`(gtcbT'+謹.rA jMY>~Sp)"Վ)4j#rrEKI4 7RXipL0Qd4wcShWU8]t N4](&I^"xTCR JN(@ʖ5|2ΊT)ق:g S0hb]6SL:ƑRT-6S{7gtv`YvO" 8Mֱ UO9 ʑZ[e,%C*ZQudmo;oRm6vf!pmҊU ]RcSyDj(v<|F,ylR#6Qm|I9a[@Ѷ OpAӀN9$? ) :~F#16"xPX{3&))MIKಏ؎u7xژaTGCr6b6L YʠSˁlYwNEk 9jN ȦCc -JR*eI;I8Bl>2dI-nv >˵3rV n[x^GB/9 Vqv}ϓ^E,%v#tG&A?dڏHjH5YvFRc=5ybBId^ +6 ,$pĤ ,ܫ rgVQ}Gjǧ yl99ŜQm@0V8$X;ߝFf7 tT\DT /3_^3|߄wTݽ}?*>!{jxʋvBn Vi۰hn:h.CfAV7T( uAY6O;6,&q\#qR־FhH l TgJЂ[;b5j8:N>7 B='ZRnR#;_vj9+ٞ$ B7A#RG~ VZѰe:᜛Q|P rZ +Y@BQiH;?=At-a=s9a^Ql$ rHj IŦko/yC o̼"r6Pg5N+zma1_ܐ ->iLFT I D5 |@^i22Lݗv &1JI]ba )$Z R9 e*5N_n5s|OmZ޲UM!*Epu[^M`X?}WX/"+֋.XߑѨb ײF]U4-T؇| _m,{!emַO/wp|c]o>;oayh։PX+V\-|CB|e}o/#)Iۑoö~+nF-<gYGJ vu,)ޏKM_y)`~o2n}By#9uU c9uzRXg\ t$4G/-&" ^_uw:!`r79%Gf'JFSx{ۿobWG ףrs+3ϻu*,=juDuqx@S Zنk~FAOS9.͓`}~i2gЖ#mۍp?f@LvG-"/sFzV?:'s$vyؔ4Bܥ!7:v6"k4De[4+6|%"d-Cpgs_06T|?r zgi%OA 7f̢S^ ˹tJ~9V@Xk<$w9΢JṼ/G\ Ƀ,Y-g*Cx .]2O_ ߧ~nVv7a60m?лyw{dB2-k87D9?,(z%IQiЪG!%jvGQ!z2Bc4Q$&j#j+rP&)Tcm r -Ddw'mA)L+ v9JФ@P[Cؤ̐G\^=)0[$F=UkYy-zȍ{q}+EQ:'0 xJ[}Y5(~le{NhsQFZU4dz0]2ĆLXتsC&Ph}E,޼f{J5vd&M5cHee}ҡCRg*6aGagF' 17=Mxd(ƻNF=,n c!pq#q,R_$v` zQ W.5FُKύM?5{B:'a9醡{k+z#XFB%uJB dҚ4SirB( m{nX]1Sp*JZ8LsJűlܨ*c̚Qb'+dvzH6UR~G4z#ƍ 0+};t\튜.fx4yt!qJ{߸洝9)ö͖3͇¿gɼ/w_m77B8| |]W\^ݾ ֓ye ={{ g66_H*_'>W_fY#-=]wN J#L=Rz;dLdJ8X| Gyђ.\غb(|B|T.E-]ieVqyZC@]v1ĜO`w \VڐwUT$fG]lf Ǎ mc̋:3r} &zu&Kt X(5)٠PDD쁿d{A'd!RS2u}B .'P %+H^2IB/N?B/a:0JY!x'J0$Tcj[_( dn?X{jX(%ESHVѳ (`#/C& D?Xb l=Qz{M#:Z`PIW$k8^,SM 0!)u89]fy%2aD9HYA8+NjG To"4/$(>_oYp1*vFөJf ™MNH4>eڐ?hBFs0hM| uQXPA!a 
ۅ-)$njJ!Wl߉!`:֗s=F^O%o8JYƼ̭]g=>}owك%buiG"QbӑFdѡ?Y: =AK ǖPY=ϛ/1Wl*M 5b肿 z#߷!5r B,@a"g[ O9q`!\9ẽ`G+V!ᚾ}ԐvP3]OiQVt=z{EY^KC8vۋ*R"$u`%PN'>E LRjv *X36iXujkY\}U7akWB{.^V>Qil ?[~9T/tl&4F׷jZNծbM=rk}JO]wf϶UwV8}Y%@u-J)6II5&" j$l2wr֔UdTl}@c?GL(: WlVԼU)12觵c[?JQvlgȲ8tV[%4C9['ZRhWK(hwN6?4B^hBl.9Nte$Lsls) |o!m޶汲UxnZ@m޶Zȥ<>%HGڼ+_&#AkUBcEc+݃tH޳XX;n)ډ{^TuS#1/Ҋ=i}`1:)r9ŕ bY .øI}0Tkm@vɦ^&4!m>C-d9]@ߧߧ<\ dّRQm.5~14~1 d.rhwt?eg k4'Y=` y]g[U V¯RaqR m«!j2)XMp_ D-Vn>_lݷ_ݿMؖؕdv: :T;dڸ\ZVo X "A4U:1ˁ)utSgՐjw5?穷G::tB~dD촋Mrs: AE5(iYɃVR֚zI"=e_fA>\-O7~jpj}YF+لB̦KFE,̅fjPKP}Zmf\Z3ʒg 믑6 xS$夓* NX8uGqzUjQJ)Ν.*z$v7xɉPq d]`@mu!C> M }R:e K'G:GU5IP>!„<T}`T/Vy, ()LC4i"Q9AC2>F:o2+dU"zQ>`ņ!JRQ6l9#*ca3JRySZs}Z:Ih~l{襊O~a3TXdE,eQѶܛ4զEՉ@+ҭ|Ƈ^v _!- \ "R#:5J\1]m!RD!ؕVaN7 ŘЅ2Pp)hGF#LQF?KalPv|JH5i&rU]еG-!DCF->aUW$j&%&GJ{_E:W)QWYLn sjts )m _çOw[M"mIr4dĨ2CjWNC; xbFݡIڍN؄sF,sBcVJ n^7!ՐMڹBg>7R!4]0H,pٹ$uTg7`(ZB"%-ՊfEzZäL5SLFU3RZu1̏^%Cc+(Ҳ /Zd%kL^ze5|PӀ|Y",j / _p  *KK߭ Y,-Arepes[=x!ܙ#G 7+Ȫ5nݶ s b/p:{\\Ao`K!5-j= hvU tk  $a6d)MC8ޝV_^dɹERjdʶ,\ *Os825udh{0 :̱ D۽2Q~< ],{i%VXMw{3 tћs7?ޔ%z-Zu|V^ZX\A#›u$4}[ۢE-E7+Uâ$ZDWF m2rx6j*.ȜŏZ.fnwc>|c_zQNoof}KGV,ۛF*u}꼴ݽqw*eDBvdv+ȗJ1-#}mCsR[]A쓵t Y4;G ZF%evhrʑ$dE'+QKƤ6xGp<pWJ?եwOoxT㦎R;0&)QڈϷR_{ҧ8qBepLn!VQ91SUZ23gK۰gaۛx3Ժ/yz>eVƲ;%zE|w\nWEn6.ydghodxcgwAJhiR߾:խ.gg5+m޽+^yԀJIm 1:dӋzҎJ/>[M!Ja%EtF C(8C^6pgDh&j!rj@LpQtkV ,YGN^RV^2f_g8f\^Do\f؀U}92s2ӗ5+칼|i21*dM :4҃kԙ0| S _fMN(TtF2-yz\XںYYa 59P%gk ikuFgK-y9YaCWh$^:Mښf|yldʔD׬ZZ:1iST ~ՂRViHSPفZ3ә "_X0߼c9SJ{@mKfREc(%q3W+uI  *]R$ AK2)MNۤnL\J%SV1'q!ޖV7%RF/Aڲy|Ɂ%Tȩd^3D2ſrMKRxZYa*0\}z]::T'R*cXUlਓB!;g}Wa416M7(tfFՀyZi8hc97VWz>OM6di)ړ]KJ6bȵT7u?#W'fGi,FĖnl d_\+6y"`B??+]zXŨ=eQ13w|jEOҾ;n|ڿ }U=b~/zOԒ NIAKPI t1mQjjq+cd'u "Wމ()1#h 1CI2 m (H &j҃ewkU{Xj9 ͿH>1#R%G?&rVWǔh18'+n(\$+!mX2!$njY։rr蹶AZ`#iP\p,6J ~Jmr1Vv  -X(tاѥX&FL6(g Zbх"AFNg3 `CGR!➑k_cyȵ!9rmshv} ֍\k@Z|XUƙ)Fq$~\J-(ȵђE Sζ rQpx,-ɮXa _}%q_^-ڟ%}E:ohis|jG_՞gߜ~x?హ84SM_ `R4iutRX f2aʚ{ɿPdU}/IАk#CRj' rU>y-}a;A)QPc; ꈴUrksv89$Hdt\v CMwːbUd{ 03@Ȩ}"=Hcm/":SLY%b 8Ϯ_|c-s&&MEx䂆XϜ̐kۋx7C ~, `mYi,M';o'cq s~h*oW%e6N|r2 
;92=.;K%HB":F$7h75XԈwns O5;v_4Pݦ)<}'v3sQA}Gv<Ǜxj:v_Yݦ)-Yqd*0vR֍x,L]pP#}rH { deV4K>OQ~y//wN3#Jᗶ!+$W&M8lWњ;gfGIfUsKBM0>Qy 7޻"3npJ6l~_6.;gk~CfRl{`Dģ$VʐםD1L(LCt-͖Oc=)P@ ӵl>2?Oe8RU)l=Oe u}[]믰}_rO7WSBycw`oVwp˰ZVv&K_6@^tvsÿee6(Cm$o̬GgeOQw]GͭV'azh,53| ސC+HbywR{#@۟Nӛj(l z(h,o.f2qo>ϷE>|[mW>7+U]fNEN*͝1`sFˤ9&0G$s.~,r5v` op}Bpڹn(&~vq *z;FgUIL~#_UHORTECFtCFH2[j$z=4$gLQ8r$+(75jYY ']E,f#Hn4Nʜ7uqSQpmeThfNW~'q jgr˥%T$ MX7Inv̓؍Bcޞq*QQޕ$"qg+0 Lwq!iɦ%Do$IIEXYEڦb1+"+ N ˵[^/sWK+ G.X:#D4`x0F730)WRs;fI Tjb8zOj/fu:FS "qz<\J*J>x'sp=6} A[#)NA G/C`,q/E)dʄ(1$IZ#b|!um|~ny?; EI789:]?kg9͋ߟz[ZHV|0|;Ԅ}9)0\֟X~opfӦ ^656yugN]BP1wi/ `vA!Pe=kS1FQOWg[^ {Η饈дt,sYȔ.v:9|mB,V"F"8<#`aƞ>QBP*Y}F45Z!ɷiP_LZeBಚ0@$2k48cyUyy.[iOthlSRZi^ph<酙CIH9]%!/0X !0UHmGbL@THv+,neǢ k͔s\:"7I0/qǁU\KTr@Em #0J{YkrZ٤Uʊϰ3BC" ` MAjth728E)~daB\&6Sj81/6eaJZƁ'Jn}3_{BjIEl(9pܣPBjZQh%4?O2F:(1܌SG(!05!$&,Bc%70vᔃs-x3yeVKI@ZO텑6 'lL|b$a= 6VjTuxz߳c(K^#diS !'v}m,VI'ź>ή+bslzg^L4O-?͍ꕝ;(bJ^O/vzpj`Dz{HZ\^J63 q޽!RJ6Du=թ_rjJK;/ XK􋜸nvHEo{in\s}\@JSSכ~Z ț߬(%_Ydm=])-._⛿>>Sc9HУ,_YtZa?ׇ^l F9ifnj['>Za0*)zТRös~wi뷯}W aǛ8%謅 \M0+#V-yX?3"aB= !uYgAT8uV4BX1 e0,JZXgQy< ΋lI,5& fC"/i}UB' :4#Fxh4uj 7*lJZW1R)6!p{ոSyJ†[j(8Z&ZqZP.h/iʛ虉58X0qs"@i|%-P^*)<8&9ʪx#6€h%ȗPCyEMBOHagIk5z#PE6D%fPVH5:-31(z"7 Z|.c|$cHC4w`5"nV`2Fh .iaPy>!m9N`HCX]*PTB:jEdFcRAX܎0mk˜s̜s"?݂1vE}o_x>O~K$0͗6u9]3dCx>"kt}w?_A"]w9d q}oR+&pc^=6ÆW+(??]Yl++=$9+e{zbloF̻Q$WЗXʹs}Qg|Qݷmj-_#z~0F `ؒ?R iuV*uOWZ(ZorR+Rи"*zj!FRNJM"I,]HPiYhccĩɈZAIzkG9iTQJR:"7|9U%#Urf!쿼ÍY3qxU}(:@*ހspMU߿7PIkeesZ˧MI} l(fS{sk7MN]نb&)KvT풹`#ve}oWAlw*! _hi9M}:F(# Jojcn.Deig)f>flm^eBFOBMti1-_W7Rʝ6 |6|-ـelmi/p 4oBYA޼Y S<T =-L*CŸ*@k5:-9ƇhqN!*D@tiH*0\&I= IK[M%UGvȧJz_ouDw?zeɗٓ/۞ܾd (uq 42ݏ0w}Q|o.F"&hh"WhH4Bߨ+ia[I.T^O([+qVO!joTrFF<P LxD`(L@4 \:tGᩤo7xiIҚˉ&BZzy{ D Kʓ}|~RX"F1sނCۦduFL*'4VpN)JJ'G)C>10 "y(E A EID dQi/5|:I-H:D$9RE#Úw*£*ctzHmT{ƸCCvotf|x հ98O937ԦPj3\cƸ{v k1!ؙǣb 25Yk*3 =U`U.*aMl0F[ =Sr.:= ,Z|U˲-p)-$VC!j/k+ 5ŅXq U{b"u+$6(W@/8-tFjr:E]o/ZH9?B[/X!5:# A1Ũ0XeWwzYӌW} #49#z8&mc:f Y7yGBMnŏGBJB* <^pM)=^:2 G 5 Xd2#bJ FhsddEg$1㞕{V+G"&>8݁y-`5L~OE Bͭ؍NnF";Ca(ʌ? 
i&oNo";-w%yd0c'v:֫i2>J;j@´:si1-Ɲf`';ʏQu16aH+T]p2;ev˶#VQx,*$Jqh@9# 8,^T]}_^?(axzH>ۂ8WB!^%\B }dTf mQJK GF Ж$&Ʉ?1$ ˤ9R ](;m#jPM "%h)UhQ1IFTAy 8 F5IsTj⹇ hxt݂?Fҷ*RB`[MΨuXq7gM%QO&+*KbT"ꊨVr$.PDS3Xhj:c 0F@flی65W\݃ej]bw|}#R{}F+.?4Ffk|4nu7/\Ƿw~ L,5/>ųۇa~qq6g߽LMZ*Yӳ?8G8|XhnE#붑̻$Mm^Zל/?' ho2.h9I!Pe=#S1FQOW$[^ {"+dǀfo,ZnHW:,Voz)ŏwF*n-vǾw4}3v+@[ΗlV~[Gj%@:n]}0}ze͌4bl"/  InIY3}TK8b3>{quJIIZ( 9F~1c#SZ7%8ބf;{u\"ʙ IV̢K TnQO4x]OTJSPGBtq Wk޵6#"%"# nA/;hڭHr,濇<#Yj\%}QSXU$_h)\p$dLm b4;W$E .*B(3؜(X|U^C)]82=CJ5WW^܌jAw?*_\NrfB?_b .!ޭ(: ZYУL.R~^t}n^yj-Ve L1w%0ՋOgB2 ϗx? ?Z.ynqn}m11jDxƦ6!O޷.voꠄ,Rb u&ȨF!HƝO I\B6DG%9ʕ5+tCc˃&<2)DI X$r4@5 .5x:{?ZR\K QRBj ېnH.0>4A!zER('YuJ ,#d>OJ"06 7z>T \Tbm)h 2yԝ0fH:!fh!G9C=יyB֣V))"*BT *]pWIby3oח64v-K"nW->P#.Eؾ5M4`2 qYgaNj49IT 8~}#:w*ϒW}iCl05X鎰-[6_V9TT^1yWb12׸.)p)Z^!H?Ǩ A#Ŋ 5D3FJl;؝Gmp)oQm?}'L㓥gC翼FI%]?JWD*?\vwl7Nv1XD`0.zIi᪔BT,7n3 QοW8tRʣǸ끾']a)Ԋ{h2q]Mdǐtf Aw̿ =n>[wyb>|G5SNA 7ƓR1&GA,Excq5ֶ9J޳<hJ\spsϷi)k,WKfi}}1R'O ]ˡI!i4}h b#u~TZTZ6])aBA`"| r8T="B~ sJjgS;+BXkY + D'5GCZgӿNpaNC(1idۧE{6CIz:|R&Ƞ(h-*49ёC FPB9J5a3-IױȬ^)ES@ڃ@gvOeHˉcGO{dkQpQ_4Q> 6c OZԤ.x5 B)EuHېb\RTy!YlyHg, I"3qy0K mDu1J{*ɨOeŒAwL +gIZts 0l j*ރijh0ZcAm"Rqf\o\v(ΪSe5gx zt YAXFI18id@n69< O P5\1thdpZ1j,7R҈:2:Z<&)8jY|WPX|-R%ٗuA_L ePa e[gHu"-V-ieb9^ܷnU"fcq(F^+Z@fCoH-$KɎlZ D\x(Z~̈́)ŵED'|Tci"Nk՛§|Z4Xa[μڈJg! ]zKZHz.R3.G)A.*iFxyzF$J$-֖VއAWU_O"֪)% *B΂:s΂ ]v;=꛷ g+B?V MݗU.47kz1924k$f~rxE8_ߪ.O8e@\hp n< H@I.A=dD)zl|*L> S8]>  q>te M&htG,v*県#6+IY'T~JMOܵ_)8p&>V?S4_%N}u'hwr-?~bï\2+}㫿ǩ09%}"R̒('f2|5],D XE>Va_B9Tk*jO~1084վ8~O+ZSB!>'FGc~ʼn%;Kos- *Em=F!k֣ra3~ʠ r{j dg- ?4\j`A֡N7Cʨ\)GҵVӍ4<.#_o-bXtl݋V 6RR sP͌P^CYS5ݭ:q{|6Օ yjIݙMH\O d8 4ғwNjo>VI\|lʸ' :&ѤN &axwuENZҊ?oVN.}n:>6묃]\Vn9W? *?߄e-W'U$X\֍im}5+ H^U^%sM)ҏaUr]ݓqdSN>R?n)Sn]uPb:]Ļ1Jxn [MM#N5@kr ޭJL;x##C{= [MMq8޸2jެiw=G^KEw~z(a֏y{ N6ȎJA쁥|,Xv/w?5"($'$ Biu 8QV . f0CG+ռ1o1kXw͛: LkUoa, P?M~ȵH$]ZW%W:?Jp..@^M×E´h#UQaas!Xy4ʲ9{16_B.Zg,RՠrO͗^v1=פnJ; b朿v.~gEl酐W!0\Y0b P,1> )Llj4(njOivb94ï qpZe&ѮLފ׿uEPxǛY=[s#Mf!IgkH7X6 $z9i28l>sѰ{;s/H@G? BHMPyH4Jb I];g? 
Vi.~so7[t;@_ ^I7wۻջ)W\zV.%̋͞+$j9_}ekkrߤZT|X{e?\ n'7O*7dGfE^n^}>-.7no%Ni;_S*O?<\y67>T~"mNԆ*V3M; NCvNxLU+G^^IQKΑIN R@\ `NOcj>׉VU۝EjfOէ[-:_cm.A͘$Dzee}x_Va0J^~6.0]WL i<j%6q/d%}ýV)Zn./iO+%\7~n'+w;ս}"%KnO?7qjow]iw=1oʾͣyOr`A&Mʊt hx>P8$4PL͘P4 Me4WqhZ8H?Cu[eJYOHV|źꦻV1-q1#8RƌP )`y_F24C6)lFUr>uqcE<yQưYgF}db6fdl _BVT4=wwrɊ@RzN +p| IHb[=(OkE^׍S=*r Z6`r;Žɩ\Кĵ_lEMK U{чoZV=xӒ]ܴd/FOMK J9rrJSE[`뷭E#fpxX\ J'ƠUy3-1o:n׏iIGm|>4˧u}UlPΣ28Z; zzzͲHtow EX!I%;0+I]9sغ׊eSvfxCfV2Ld9aH? fb Psw4Q"wG#%;ΨF_yrW|tKz~2 ן|>%alrp$ӔN/ӅG m,vP"F,t/!t'Kba)³ my"(OfPp|3@B1/9uEq2\ <e}٫:b%em={LJQQZK;\0ē\|$EL 19)61䮀SJe>8A9#Oq )@IH jTRN?{ȍ=420/;q`O6A}80žQc;<9A!eՒljZͺXU*p4 2T2(r+m] fZJࢠNʑ*f2X-O -qSUc9#y4ܜL>,'3|#Jܿ7ogJT*fԹy. PHk.e"P%R L!-' +99$-kr@luoAhJP#Q*;.wwA䨛ߊͨ=f(}ܭ VVn )4h^ ۟.wg;WºRwKDN~gĞJ[7dt 5>=O|xcȦ7>Wgm b-2X&O!|c,5[=f:HmGO_mM.?iͧkzأ+@ZEmoYdet'/Qzq$mF΁FAL:oD9j*f o"wKo JSb+SłXP;\#N'I䝤#d[G/? 0PQ"/tdCS9 GL)'i*E*QDj"ѢlRd Ժ2Kn5Z=4-vk FUrՖ:,Gmx,) DaR ҌgB'XYsL$ 7("VͬtQoJ˩ևut_XjA>Hh"k&>w!E*&V3\4-,h iZc{U3[v(Bw&׺u!\E[(1_1L;7ϔHTW8W)ྺ &mqߔpw]QEx"HXٹd)?*/6GMQI=v42%2˰, S 6H' xRFTa8u.M)I ](r״oՖƞbWWRYê =JfxU#{ˣc\/mY.r*8BN @a}~3Yg~{kttWZn};yH{Ǣp;) |"{s X쁳o+ӈG}Gא8Jڈ]RۍC%a,ӫ޺[c"kyK%9GL BY^0)BidxM)U. X,h$l[u4W}jIds⥥SJc W⶞G0,%rn[n9F9~,d8AfG7Zq8#I>"劳x3BRDŀQrSáv7hWӼ MgZ_x=3d`bB %\zI\DXwyL['D/Jatcn $%VhI*$p*BB$꼙uИOQã>9@NF@'tf?Ek/&P2Fző.8M.ϟ0A d 1OokVW\=S NEH+K^T8X@qQGR}/natFj{= {Dq4"̠ U(kbѠc7|uhb}: F?[GDbh%ُ'vj ȡVG=nJ@WNp \+%dE&jܞ i*BKȨVv]|s/NfʍJ^u-ᵿ,oN 6^m/iz=B'~ dqۈtª_yW0-}Z\Iu|pS񘐬B~Oy\8R!Ȟ6xͧ3I1I@z=9=JPK1_ 1Ó16?hYw{- .iQfԀ0Ej(0H8B IdE[=g k3I0ESt0LqeIህ9d4cQU _~Erz(i:Bࢥ l  MN@ S~4J)r>=#C\ OpѵhVZXz}@@ ./%{Fk_xU,x|U3*N<H7?iH-Iyń!d 72nue|E&sXjG<$xϨZɽL?i@p39BñL{&GۿfrhJl]@ZsXAeȝJQ dm/Fʉ|ϋʞ?qu<Z]Y|)v&y )TR66Tbߍ~򾨰JȇI ʶ.wrKBR? 
9 z=FfLJ >o[*L-'>rݙe:}3Q'?c=__L!YpY(ƎxN;PE:8!'F2qȉA 戺j3UK~g~DS1ȞSП4wE))]R '4wVk<@ʮv/3*Wi74Uw膉#'FƠ^ ݕTngF aLYOCI zJ ]a a^Fɩ$r& KZݬ|MLߙeD3{֊5a60VQ^UMs%N?I!>4WbrK*.2|#d-'J\ZO&_I Grr=r ܟpX-o<FT>rzRL6Lbأ5JV adMutWYXv-HO{hzjy]q*Xɤ.ey#k+-tWfj|ϗN/>_:} b~3ek)ŜcQ$<ϭag((UJٷ mRCr )db/G/TɟUe5̷':{ջDsI p뿬oNVMekB%4łagN1IMhC4b搈tHALhbVY\W%$1|yH7HGa0`$d(e2˶r.`߇ġm6_'g j߉ N`*}0B ȰV^J׬>{3 DV}\5ND97r%-鴚J0dGSL?fY_ Y=8k;(Ÿ$e1]pܚw isD{<> )f:ÌDrzA!.n4!tKfփdỺpz̧i>_i/B@= u,{*O7Ì(5BEL$~𓀍8a] ,Tw#e<%J5"H)a@OKߕ[6TQWvw + wawN//^^:eS4)(H1UahSOKҜ"g J( ɐ]쇢.b? VTt2^4֔$U@S!-?HS)0\&0<<^5xU}źf?D3Zp7r'ps/W3c^<ƪ%5ϳ RkQ^^i !K""&\1TD d־25Єf;s3iwb ws7n vEEm(3G *7F&@Q[i*Zb*upQp#NiF3Ԡ "螊(EWB@|%?'f 3MrC2-k33pF5? -w7>$T YkG޽/1Ԣq lp kWS`[[٤۳֝wo?jk 4^p4k st(IG/ LVebo/tt)*ٮp^˟fa5]ɥ1/_3L>ŸU|Z|VMf͛C̷tOWQ2Ƭ!32sUSF?8@KiCB[BW4ppota,PpuXBMga@ ܋|%AE)mvDgگCjjM5yjbSp`G(/bl 1+qJFf;M[<8xzc9ƛO?6HSͥiFZ\ZD2$# uRl.;(rEsLea`u\u-^4Yg'.9:ǵή-nHϥ1r3y;&79+WIvfjo'ɸ 윻7fY|/Wu4zP8P66BՖ4qݗ9mW`KifnYDY]t!\Et:}Ǻq͎hby:nK80nI֭ y*SG۱n@Gubbݎ7i0iݺАWu P8ni0t@(͜A2AxS ,c8V8r!]rƈ1b?_㯃/b{ =%f*WKk2>Fov?9#\L]Y uq+[4ۿoay{sxs5O߬?1˄YAX z`Vo{]Q{]WT==/,5:(~gp>˗tL}r|zh9R㰔j4G-ĠlT W.upWZbY co5Slh]J_鲑/.8]k}I<:2\|iɮ4x\Y :2aF6"Pgf-Yۻ -4_,Gn%&nbO?hd?$Qfe>+aD׷(ޒ"oibA7As^)Jp)˭ꃃߐdX>;':&OmL5pԑ thIB%~PtGʨv8}'7>($p%Iv}umJ|eFed{%~`{v8!8Ϙتf0/bĎTG!! :q]9N穫ߤݨ.JVgTk aShoaUgx8_zi>IMoYHIg4&je{*rMfD3AEҦ.j |fٻNqY؈u;3?~@gvPϽ<0GR*0]ѽ0rBRGx'la2zsb9&e)Am1I3 >Ԡ`ЏӮ a\vPNSl$f*v:t2ZoǹV3[aH?v e,?=3u-_6H⪁VpzJT?(tF3 /FawM. V$%έLU RyOKx4T%+~`مkF*r|6t6m[;M=܆M{z%U5x f8J/x.r˄`,r*W˥Z.r鮖^ha&FW&Wj,5X- RFi FdKs$5xЉ-^Rv'N}4RQpU0UX 2͈\Ȩ9XEsʊ`pO. 
vǴow|]-3za~gWw=~~9GN._oʭDnOW4{4Nu*\]Ӝ౟fG wޱ'o߯/--z8\UeUL3Nok2#[63xPTfgޫ0$ʅiB|T~x* 'unw(Qߜ1 n.d\PAd4eS/[ºoDN!(ІکrWU 6(R&450ܞT Q rqr5DԊ{`ݹ4բV6F4e#eUS*ϱ)U!B( +ټ =DɄ[J4%Зj!ex;}ojCdv}/.(Q2?HEϪ/뇵[+|C3\$/??n<\i^~o]sdU+kl˗kmJ>ȥٛj8??m#Lqwq*󳻍֎?Inl<IDҜ Ÿ݈wڭ rg$kT4?{Vt$e]~/WKL֖cxmiN9Mp'6MٚXGQ茜AAOKJ[_t $t- -_'q?[ uEI,@dڕwg>PIR=(p^hבRc& ù>߮tV[>Cj edO eVc2e&_o t`wV_/'.]ƻk`RJMTνh4, jja׹]naZUP(^W%TRшw|x8wOc)D9$=~i|zPN?]׹rv7wvm>>f  F{xN~2uA+Ji9G;ɡ=FLj ݖu7kO-kiE;=w仕hԗ4BT)T)hx(RBO)ጡU-rotǽEM65y G/GV^}^U>9~K?'xao+]HV5",VcKBMwZxK5w^. 56ހ^#w򚮧)M{h@ߌ(E:9׳1ՙ.S־2!!I{~3MMsƨ46A(&xyVK{9tVHǨإ@ҫn@Θ$*2gEE y*AɕHbeҞ(a?~;HTTQYt`H_"et i Ȟڨ}ZL+t>qQ6p5SEIbZ$!LYsu2?k8@_zOS;)H'pG1:G+Di|r!U2+m҇>^0Z&ʰ:9zT iEʐU:k }~^ɐ㤁3ng'Ѻw>\цhTT AMhT)DƎ|{:@j1g^n&FY b ϟ)_PnK]1/vIW@A4t|_'c5%LI3w/wpu'R쀝?G\p #F0n!wW+>r0TNdՅFŽwhױ9]v&bWu-K]!,2[O8cف9Q <`Hj+ª7 }; R2 tjzM _z{;}vr^&OG=i0|QIUubR5]s -,Ӿ R`4:(j ?iQR=Z'g- `q=#{f I*`;>oGqc4LQUL7j*\Xc4Q\*n Ƌ<Uɋ aPktV7$2aH\ڿr%)yQP)ĪQyWT[iViL5"T#L+(Fnk1KoIjvKuv=c;gӉ- ifR &X1;f |[51<'E$[Ke%}c a47lf+QW8F.ՠ+ S-bm_uyl,n"05Zw03Exi8|0uݘ"rbdF){1xG4%b}\ KzdUB}S~dJ턨h%z-^ ]R1);sk  '#9q kQ8F 9]SJ:@z, >*<}~w] 1=vv8u(15{0w굼62ܯ/ԋ]w1/v'&ZZ쎏t. r+3=n^AP6}wq*z5_>fdc$gv'%r6ndk3681 :dF]w_oCJ"#w!Q#'J%r~wX̭`$*RICtosR~" ܎#Yd!bSXH)J/S7:!jbubiIMa*+Zu]|S$l}{5 )+X Y+ʔpQHxvqT%b;pBxOx;P@^z3MR|ҌL^q hw9ƅ ],vBtlQ/~ {aw50qZS逻5tp]:dձ 1ŭ8cmO>SªcmAL 7T*Pu\O] / &?6|$'`(hOX &wh@ 7JOfrӋ}a>'8@))Fatر5>Ξ`сc[k8q[( >IP') e|SBrfv9ֽ8fd2`*Cc{uڧqdS YT`A9+YJJछV E-KIȍn8B)$`4DfQ}Cb|Ujf٫ռo6lT`n%eT9"lrngn-lw,WLELs8c#KXtnU%K[vR5LTؓOvs~ gøfE#>1dXANc7 f%}j̚(.z}T}^uJU꾛\@u?׿/w!gRzӐ}n>}֘Ѩ-#[2GiO:))I]ᄥ42RTY& ZdIXm `V×'kY8Ne]"$"[þzD {sv9^|~?>uiҁ9^7PBOfyYH⟮k;HTf |QJRhwmrvv mvpYP;Ȏl)щڢhczƱQ8dίRypBɠR-ϓZQW? 
QPÿboUdb(me[9#c|w[ V fR'nb INl/; 5)\YmEw <bD0nbĖ+.yÖ"vG6oA ?i >/|Ow*_Rl h?%v߾dqzu2[}鋿V <)NcmbRlXRt$6q *~T`{ΉlAewO@˽E-٢ })Qtg(ʟ l4ny^=Lr1]^}n'5 lB lB'Ϡ(ݟaBͥ<+7n|s=00>ƍj4 gtعlݻ6cX: $(R_Jm?wRPJQ-3vU趞o@q؜w7V%,Q ZUQ'󷏬N}E1@}* -άa=$rP0I%,]WM"aƈ&]F31b08\+قfĈVɸrtG>vzNtAPʄĈ=z6#<'403̿Oĥڏ}~$.r ,$1d"|sʋ3>k~`y91ܽtJ$j%Uf9 zcV]Ƣ^}ᵄ7{:g8O L,p|s/9` (99io~3{`}L"Wf?둍nd7.Y-f1zHq]e/9zݍ+^.蜹Wwk}Zܰ=6k' ao<(x:;(8'_ya5>|=s~soA35 ϸxE p' $i#`(QN/2$oErWT9Au]#]^T7*юj#4'ӻgeF#ڱ>rү|M6OURZs\#JQH)LoKObDHFݬrÛ* 7Z}jXL =uN}wc1ԎiA'Y~uf1% >[c K!ZKi~>-z_`wfWRݛ2_`ڟ63{S7..ŭϑ_rNxYob{iG:[L6wɃndI-dg`+7Ws;+ǥ>,ډ(aPx́dPFIjJFS6Q5{h!/5>3/$W Ï}BخPSVIXWU⺲+;ތ'B6IWFAm1!J !uQ6î?j]EAN( %@\-֓URZ]::a-!xr?H;gY Oɑ\%\obj#uGmHNJQL자;&Gr@dӆ[Ĭ8N炠1SE˸﵃!% /N{תs/6sMn2Ȝ}{!vͲz&4^ÖWQT"I!ˑ/D?P$BYC}aT !bqr3?b7w<-&%&(zjV(P!RI"J2C% \\=s̳e„g}p"*- <*eX A3;DdI$(PPr '̰@Bںu+_Q7>`22\LPל/:OGo;?|L'"eS=|qSfu,RL}kkк,͒-\U` oԳ/n^Oxb|; \ ֺHDzL1!%E}$j4?/C;-jQ}j!vuAKRKD&8S5hbY^2ыޕjNxR\XMI)N1֋@>T1QqPQH3uͿtR/~VKݛKxoF |9ՂbHX jMu#c˻'>1ItO|Igz 6Dj @89WRCj0҄rk>WRĠg :0dEcVS~EL}2Ƅ~sE=PAϰO͹1*,# @SdaQ&J»DxB[gX }49T ,nBwQN-Ԑz#L62_ގ|;s|LN Xޕ/vEHҋZ=Ih'F&$/[ ̀Ȧ-^&7Vs‚]tooQ@µT牞Qpg9(crw\q8-!=x4\Cfyݮ)K^փo$>rp7ps7s{mh)0Lɢء\6@褍Jf)WT2I eIFEp}U0)Q9)) s@`(J%('H&Vhl_.eRII Ȅe2 9HEq %2 RM g&q1kZAAZ1fKۋˇt~9dW_ziݻ v-%ՃHL%XLypa%/J%($5"#yuωWRC&X% *6zy|x^ .#˛eAB ⧇8H}lQs峋r k&(t"c^pGD!FO% D+T#cDљeH G\j𝮵{:{fu`xf>9ssS{LtW/*OK]vXڂzsdAua#\hFl^MM1y6q 0~I%S$mg܆-I.B2)%i;cgIZ? %H W Y($8C-P7!-)\p(A1y ɻf|_ Œ&"sŒ{תs5_hm=r=U/{=s&U*hՁ? 
=8:ԃ0R P1`fC6oCc9VIG i77uJ;vjyOkD\}Y+ݾu4qAp|-&;FqlwQ[G_ngHȮKx﵄aKn'xԗzخ((9&7x; 2=*Y6ZA=u(=ǯ^ "Ţñ>1 =4P7=:r>K;l$j̏9GBjY\^I]?̙Š=x$ÐdP?W>u{VmN;6V+ZR^%s5YHB#=@:t*m8 1NSO@G9xmGrx0H&z8MCϩ'TdaBWG6/a bA{,2?LV$QȈg pt1 hisGraWWJ), SgIl- Qz([,@$!\J1fNlMi2HY SBr @qˌ\s!/3-A,V[bMqbB<+yA "/TƑDDT#++T)U0r[`K&RB08uQ)}$̶p5;^]N8%uл)L+#\'/ ړ B+#: @ܒsDST65J1ӃRӟւL~H4CsLN,'NBڑZZTfQh!vC}RX 1Usi-v?-=,Վz.̵;-$DkR(อ{ 8C 9o#{šcDr/X.غwwNh'y<] kkWŘ fZ` IyJDiw=v nMBɶRU}dnRqj[dGb% ]DL+$'<l$eϧ3m$*pZeµx(45P.5=l(:LaLNaA}CgeI$ h@B_gEpC~69!Du:F[ <4*n= Fl1zq[cB\ rRyzRì$IbHQ:R**'@a9쨰(t4{ 2xlؔhYq>C}DNucJ@v]6hRCH9,*n8(ZZR&M&4i_Gz?p\b/CN7r [n4zDŽ+T"v\`-%:z|beQ=er蛥=?3#X5'S I6kqfQ,3)vEɨ#eGlU/j{N9S}/btiU%jq0[vfnoBJw& Z‚Omr("IvB*=^դ j4>RqPз߾DD*w)'":uX>WlUdCDG9޾as DtӭuQ:pdAH`=/vR8^*ܩ Ф7aLgXk@eT]cy αdy!֭pœq} j#uP'o+i"&e{,ηq9Q~j衊f=Xج1\Ҟe*{{uC8pf:YpZI):3p_PBcFrB 1cY=h2_wQRps Q~#|(υDVQU4䲰\$ainil VVyhryVإ9 r ݋㞺J]d_ OsϘ҉`T> uH3Q^0)-+Ǻns5b <$*wc#JʢRRctaȀR &Jf(t67n=D 5Y(B2n '7u.MfYT /U!V8 8 iE dA.:)Pe 1=W@C"eKD͆]KS8{ȂiV-CK.q'%.Pjp*BhB s!9UP"ig;ǔ,WI-޳/RKAȑ]7]7yGpz@~};o>}ZnQ< עS heX~8D;IHO⯻_R$=r|0 q0 ߟ0߿Aq_r\yX<ݺiVnX۲>>Gl}ft|d8ku\_7t'~*ugCr5hk+ѽor8' LJ&z&L5NRr#e'(f ct8mŁ8OO߄T7A_(}TovS+r>~Whtw wqwcݹgm#2c{cQ_nMGl'8#eZwnOwӗv{v!O똤˵j~FݿMO+/Nk OS2Fn]et~#Fh8Lnˠ[+/Nk -cvX>v*1mD/—dڭ{uvc%~Mw&:]L?\7kzk/ux` o Г}`M54gε jI韫gSm6~{u澻v|Q V׿C\M‰ӻ%DEÑ4Ȟ*SuF6O&<]'8qΊ& %yzfozZD RVE- vAOI**ϭ̪L #31+9-eV?_Ud^-K#('P*eW h9 +"Xot_Ot@O Bp)eۘ! dJI5QōQI驢*(qĸ5Gde8Y8)C/rh^Bf~95203}u#,᭲H;`c&w %oG`@I&S6h,_UDs$Z:~6ht`Ȩ/_:ܗw;ldye3t!<M{i6^!koꬂu|Q+8pJ4c$Lhu!5j ܓ0_@&C*eP"E PdEs@j6ڊVp-@9҅Kȑ8/ZT[TeKpEbxdI(RLɼ4PFJJL X83G. 
n% K0Q(eh߹ y0ZPNS(P '?e0dD MT0﹗]X끒~.6aPv+FdI՘X ).4ᬸML$+z3-"g ҕ]bp}Ejܗ]=Zv#(D)Eb1w|ESv%8qzw@(ZvљNI&S9LUv$|$BN0ʥJUvq`HK Be^2S%s ,β, eo8[v!.s A-[ !4TU :ȍaemm d-ZlF5S6b+ZXu-o\}cj:խ;'7+ŚWYcoc|kuyuw]o?9pNsQi \my8cϧ7ޡUVvWe񩋐5o5|.sC˶oaÃF^q>=ÈCC hX|'d b^-xg-.D/5btiU%jqp24%60Jjs( 4'֖?|ɍ bYͬΐeR`-V~nn ']}lk'v\o^h{)K)nF`ڧdw4VYiv&˝^)X2]KBKz| 7&-r5k{ܑj3B*N85.})s睂wzO; yߛн܍erꯦF+.OO'G<$yՔIigN>3ɡ}Uspm&}˧'#8iтy|p4呂I*$\10;K u#a|>@^njD^9ۅ#(jdx^u&cBfPsk+&͞j\`f{2Tp7*!ǃI4~P3TQWuSZ؞. ڰS膴':sucn~Y7eլ5w u1ZrϦ BbF-"c7Jt!xlS)85S3b ;$H/ӽCK{97J`B9Z4A~lOOxl ;s}F[B=}-d~`X{g7 Y_NfV@]AOa;C 헪D;!+$<3V=w1:$MiN_IMҠ$"âoPxLTJ4b@APHtԠl̜[Z;mωE9kMMMQAh=0RAӂ:Gys'14 Cm9:?l6yf"H"35yҦy'sJl27g k0lH~!mBm!0)4_Gd'K #I 93H>i9<8Rn# ᒾy<1\ql`FQ_ڇټ~)uZCv8xONYGӊIiF#*42B Y[af0ew9}Z|W\΋K hrsyz_h _ұ4H {߯X1)B ܯ&7?t3gS Iy+(,Mh]NaVjeZ$ FS\}Ӛѷ`L|N0Fvw+ֱp98>]-~=^ ۴)$;*Ȩj?ݷ,qN})VL/I hl=oƔ۰xBR&F8 Gnp8֊@RbVAjo=}ja@(II*vc4 ,xNm՟(p7sYjoqLhc62NHn >EA|Yf]wO>)Lp(OxBg&װNBCVsXgb{W(L\q(V (`WjNVn}xY is8QHkS@TR g SնƎJiUJsxcNh iuʈ,3!sat8Fn4%G@fxpd\hsrcI2+.eExEHe* t#@͟Sm| D'la6qE]&-f J?Ek~,.TFӷ;%z7Z&'tpڱ[ AU*5m7 ?f㫌l$D(O%@MN|>*rR#sNLEjw]&#i?dӈlЁ1̏@w}ROSM<)B*)!]~N "*ܭbw;. 
⑦qw10ƣӻ#\m~OO;'/$-2Re|%p 9<qϥ-=c o;Sc.(({q`}Po~~Sh3ϲ98ZTNNs 8)EoFtY+**P)S\o6Fݥd#&<$MMlFFHv|zyu.z7\o^&rfRDRO 4ŕ+>@*\&%{G,SڧdA0AZBG9kA W ZnOt|wMVbXK6w}zloGprHSۮnrU<*m^oAJ$E7f:.Wo߿ h/Y度7ؐ X_>_{5ACOpz1ps[͵̄ Dq&B&.2+9Th~ߟaOBW:|q*"}}e(A3Wyn]\&aZ*9gEj:C2Э\"#hGHE(A'F.Kq؃֍sKGdJi!=~PL߇ȜTjQ٩vV{ijIi!riʕ hr{%H-!L6@鴋jHMLn , s1hYC#Փ0 *sQ&֤ W7XE0 ά J*2gij\jT*`01Պ <`y !j/=2aI *j%"vw0y98AK.4/V{⧍Z%Q5FL.njPIyFoM`5Ӑe$>K̪ qx>H4n)Q2YFNdrU=f;Q_d,[GqֱEDQWǂ2-s28(1HCXK\?@@m#P^;{)4{E~őT?ST9ufyzd絜5"{gQ9o"a8i!c'* |IG+TvW'fp8/nR@5-G`\rЎ~ڑq= ,=(+xхcb/Fa{u^2Аi4T;9D|hU 0PN~F~;= 埵'A,Og[hYPR*E-aLS]*R20 1愕Z" .Zj ;~'!eME31LT So\bt0+BUgD-=rzfQZH8wfĆ&npD=|fmf|ߦ8 c,ܚD3\LJcQJa,\hjEF 9&̰RƩNjRA ta 'qN}@qvsJiAvә14 0lpUm5+U| ʦLa\ef%Jru,UB4u;nAXĨNoTny_LKm[e֭͑ UN>xкRf֭.1SU[3V&Һ5!߹^SRKelqǼۿPWc_lɾ2dk߭XT@Jj~:_SVP+ z-pq^qSL ]1 }-0G9u0 )T3jQZS2V3ZYH[WOc)ˠ(p}w2XbB(Q 1ødTTb٨Fї* FrLq!LTM ûizss"RL(f:#izRNhډSU^d:)Ƃ= Gv8{U'jOT0PU,>N+LCQ2/+@^ZZARefIEge)=^rJ[ݮfk'ⶩMvÝNT&Rj'.WډӝK0wR k<˺SV-˺8y =@#亯>ۊ8?ۦ/nנ*dGj(E*k9مu%X~Ҩk\󆣉A|XsL`S*DKDx/Ÿr2MMs,h#8<:@4#]Й); he:tRq*4&g&qa) ~~0~_L]dno Dz. R?U1IMa_<=~kkGŔn&׿,÷ÖO|)Kn$Aac6Υ+&OD!I ]f`>p{ _WHJXJRi+U*'#g/#c#\F TRUd0BTj(mt{KRPu3*JFTVPE=2]LC( -UԠ>^gB 0RzD6Q2UhC4v16ZZ7mEgUTs*NZG7*èLj %-hx8%1bw2*ȸM%89'RLShSF;(|E`# MikL,Yn{ӵgWgrhyXU/DNh7xhydP)Dp*,²@_ߠ@K41ׯVhǟz,9A`{'JЂyhat{BOaj5q}eLhFL9h' ?By!6K[e^Jr*˫{1lT꼧$;̺DS)e,}ftfB\u!]Y' Az|`;\#DŊudn&d2%Vw7pcgZ_%deVvg"N=8]\u#N x?܄P_r98)z(9l[l} IKTdtYM䌩)!@Zsj ^S8"ѨH.ΤI8"t<#")5ID>D*3U҈:ifDzÿ]|+KKqD2[~b\57Mlv@C8N֗J\':^>jC(_3gS</8]ޮ}}]q=Mgz6_Yخf 8yQ,m%nR/ZKQ\DJ &u*zCy{FpTjFA"V4! )4 ^11K=$>zY$(6iwӒbIFu3I㥕*Q|5)5Ětʙ83b(0 f#:X1>dh\4H%iX'*/KxcCf&e93H#Q%Xs݃D̾OcE7_"09"ʆC¡9Bn6.UGvCu*US Ion{;F8Z}ûJ͚`V& P I/U4ngCzW)QY4ܮ`a}^1'vzz `gN6)X{rG`6xJU=IM 5_Y}pBko:/pphɳӊ9Ň}=z R,}a? 
` }f}d-Z5a#g%Ԡ uϜ54W?!+w?LN3y|>3o ӕ}F|;؜(@}SZ39|%M4\KByQ{.Av)vLcRkG: "|WF$` "M'[CǡF} e״"P8&iUMUBZTӦaoY@s.jm0p 6oo?&Ϋ?UV?~.%%Cy=LU~2m d[FqBΓ~9D'bO:3|Smrq0+N:_e-#-sQm T$ۤ3DŨsnGi}8uN샽 mHFvmo3.+C6=lxmr@A]ՅzVl #vy] } g!^Z25uپQvۜt([]V'flϔkD4:@'{zi Io|JIN5al9ɨK5dQd=w_'W LUcΐLW=F-/ myxTfmWp7"{s]e'aO?]#Fc'RTO /-TNO:C5$#r'@˫JxDΌbpMͳt>*/IF[rrRj)!u[]l?-4\l"*m> џQ{bԝwKv JZWInxë2ٕ4dv͡Ic;lȫ 'Kh7Jz!<\hcp} kb8hr;CUhlTF@Yx1!QlG.IBUTf2J(z&n2iˎwtʕYr:l$2NEoߟKf3S!B e;Z$\|-91{zvjtv;ْEzu(>(#%:YTy)Ic$} nYF4NT-ZgrK?E]BdC4ЂWRp™q`Gey D#Mv\l0g.B^\Tی6/4 gLha"H9mgwI%UX$N!O.҅E "=h{AD7RvzFU}޻FqRVlZW熪G`dJh }7M?OBBD(Pq԰Z ƕ :׆ hŽYB:V)nHΒDOJ믭;NjWwT6˜Fr j k0fnel\j"͔ 2qI @4ߛK4gC^׀WA aה&v^D1zuG2g3kLH iі]gvk^'ΐ ot+p4HxǙ*4.X[[h6wwvO! b"$Q ^$yC^RaȕhIU5ж5\r5805j;G75K@S]Z^(qᆶIoS (mC7R4]ZeDA,M5#<&}=x7 Hq'qʾwn>@7/߆,,>Kä7և\⚣.SZa] Ѧ uhcJ<"x tW =Ih$7DڴnFN{,3C`9 l? gq nބ~E8c9 e>3*1*%(IJ@cF86Փ mx{7Zw#MiJA [H*dǕSni(%=۔;JRŔBo9ךX6'cR\ |Iݎӈcta_PD;4J}i\g3HJb#YiX)Prew9]2dUN`,p/'s NFHI#"\~ RJv.@2BX+ >fij!i^ 3ij1hė(9o|\bfKod^/-R?=HVe["_-$TRwr?0_>mho1F v<şA[@O?BWj6_塒J% 5RrjQ6П{ %oik8FGθD+W8CsC˄Ah=(w `tTpd^R;55 BKpm! 
3 N1Oq>)U%SYTW3 λExȆ2*FJ'6>^jѹcLvsԽ+!dJ,{a(Pg=!)yPKixt 7u<{owQe;@7㲋gj>aJd l%偱Td܁{Jn*~V'A~Ј q!iyOVd*XCVJ3Y39t {C3y%$ m۷y^ e;:KRYs8^4]1I+4b6HrSŅ ճ{VVgigP,B&e_Ww,&9uDiem`Q ux =nRg6*x!';’0A0B\I0քkVv̫- p-lSV(y-tK?Ϡٴ @iv[hbЯMh`h 95%xO^;"%7U.6..r.q V ^d՛GgDTbzθB: A0 Te䅦u{}bq?"R%oM=DFMzзgLx8Dx\D.RP3KͼRA#g_zkӬD$L H 5V+xeãIRW-' JҜٻ_K6m%k B+:U5=]jDEJ1Q<*TVg7oJ!Ru@X85'_bZM*xs‰N囵E@|W%qBfYY{W;+;9*(޳kP(h{cO8ѩc˫{'\R0*7d!^"O/EXF?+'1н4V?w\KCg멱ONRoWZߜE'cU qW!K&Z'Fsbc M* O،&F2L$Hτ̅v빰tS ba6;f/X>b`3^_Z2H>YiuhIrmACvyZO R :ic2)p_E.E]\?u[[ TRS9={t%=k1h )\^;bFy.upSm6?ZqaoZIYaOрl`œy[ M:$c\^m^[)C栈ΤJ K.P3q4p>΢~.S %9:T/8Ԛ;h6~D/N@5[V͓Gqf%&EWþtx)k=sHp"f?um1ޟ9Vl:lqd;{>hOl=$\AQ T ݂eדBC~;CABËjzY?<4/ 7voz-C8W5U)?$@hEy!_߿MBjl)Bf>Ð𽽹b?.𿅟Rf_^$Yӷ p3^؛eA-{ݏ{8%d(&ͫ7"CcC2 B7퟊zҗAPDR!ZJ &M,-oƍl:_ l@D=~(wE(g%=C0"io ijbΎ䐢c^({E]1] Z(|li uzj7xCF52gih8luuqoؒE\![6[0k4̷(72p/߇(8#qu!"nj0@63;LڬEF7@j `ZUCW`OxuVؤ6v0$}uW,DhR(ystr\#0*eTj|Z59`ݧ!)c{5ATuo%w=*rH䵒kMq T^`NIjR-N4dR?l꾆2k.۳|Bd\%NxلeDMR > kvvsSr LGD=n#y\읙E˶MI4SBvII((FJ`dI/\qET@qͩw8|t'X;]pr5ERՋqvUZNrLO~th pv@]Shv5kjgvۄP*H)R׶ml˖C %7gŻCݛ>jIwv&y-"1jJNn9'-}D.eO=h@E[ᴵD[rF^?[ 4I/[ݛJ)mOqi(qT ^2b}͇bdX{] f/E\zou1FT(H$5LJJ$ю[#eN`SkZD;V>7NX}\lPuԀN2- 2qJByb4%6~dTi˚0xPڦԬP)kb_B墻)ɉXն)- 鷏I-tun&UWy)ۨ 6fd)GRˌNe7A DRDT!S-HPЌ4]즽WP~sO'L|&'YFX&1*I3*LҗPLҘgŘ8]%@XV1 jbY[GECN##UzDP@~yfD;>""Ƞ-D bсՋbc1qŞQT7);nD .-蠌qWZs":z 5`װ.^tZkR!{˺{nZY2}uP@ r[u&zZuPȃ&{IF#ՇOx{澷u\oor\A ١ZԑSsN4hi8kQ [o>F"Aޖ )a;:Hp6 x)Bg]pF^d *\q|0?~ 6~`˪Xjyy.Bӻkx zp?(X]8 -?B)ӜHK'Z0D1n&6UR$'|u9>ίa#ssճ Id->1׶ܡR2/3b2K2>.]mKJ8PMk5C$E8.è$#3)3>MH^<$Rf)*"^NQΩ4oS("q22-0BF!K2 5*RinrJ haGٍ| &6O3&Di@bRF+dCK 5)!9>~ )+=Ly6KIn$K'sĜ9lee7g D"p%磸wg;3DhrkJXVoZ"t疬Rj^5?wkʼHD97a&̬8I 61 vqk7/ݶi.Ϳk{F@F1~5:Q}~x|m!ᕫ6t݃/@5v S|0ȌhyZ}U%ި\4TOUg nM☠% F8u)Єrj!uINY*FZTjvF8'I xG~f̦\eєQGEȈRD3P'IJ3BR)IXqU|ޚWY)90ξWW,QPuo}%zqy~w3_ 1OhBx7lX1"Su2_mz; noA~랷=K;#~x~I7}±H8 "yxav):T}*?(-{W#l `>y~3WsF~y'. 
>Obvr] ˃:LfvtX\n s$ xV3/F4-A[o<ſ5as.&wne_/Mgx]>02#E6juDj,0jȄi&%f"xjB 50Er9.JOe3>YBGrϴYUغ-&VEUwO>toyC Z~幀QnJVAԿvNRF[秢E'Yx«`^h`fӻ\TH"4IDR)M]7WŵG,Cb,w /lQ2/K#45b%Xv`3vzb( @6p3K+ſ[b=og3җ&$(]ŻB jiiX>bqk2]\^LC/܄5)zM T-n ?E8+U3*mzw[g?ѝ&׉k`ng]٨l\swhJr3)٢D/cEĂ?̆:b}C~(H*JU:RNvѢ#l݅f-Λ8ATT\\}]R[ڥv-*ܢv}d-;PY#ߙ7o;rkpʆտ & N(8 |O>|փڔX yDK+oaR҄Һ "$HUdu Nv}|Lh&3ю6Y;u!R KZ!i  OuiR%=XiXr[3:H8"Kqf z̴O iˮ5GJ͛Θ+ Y'ŨNc#9:}mj¹{tt^UA^#JGG KT RR`hm$icBa%cfUw$P(Zh::H,kw: @tAvk#o<#NGjZ6-bmyXe5m7%\)8" . c6*D`RI T3l!pkIblh#Z Wd4v8M֘9lyҭi61*ړq! z<'DȤ\gc!$OhI$Bb}%XC" R*A*JDu)UtL&犕8=m'6\iŖX0&N6_wj3lB$)zN~Kp 'a93XK%&L+H6iiIPCiQb"HABH$;X{7H&ڍhI/LTV})܂rP!)9{҆r- _V*yu1~o]T!]]>psM*ThDg\ ?A|ϣRR <"K~.@W%u%ܾ,=قo*2WlLBTdqTE yc 6j4XA!=3(b=`J=Dx|J|}ɏG:a22 ^=jDm(+wFNq|cč|+-Ei͜ҕ֣`D$;RHSyWs I3Q̬L=裡; ⳵%vV^lfuEdp9JC) Z J*.bXn@ u&zBR(=^b+ߋKQf%\JB=Th_Fm{< ;eAo'{o_nRrR7or.~}ϗȾ$TU1:NB0D+J*!Ff֨*"Yvdi%j4-p4˜m'%mvo%{Tc,Mt)9A vNd^j $2-7CzmK͞o8ahҝŲ<{X۴((0VVqMT|(~Guk=*2'8N<6EU"\y"jc{n]'2Q1 52g]d 9a]i@^/ܮ|y!I(-Zl4-?juz3Nr2_ob X15-RnwEI*96W?]-3AJ I0"o&u^j\7礐_gw{>}pF6Sg rF^88g6$~w4Շ@QCAH{A}AEC60bMٷҨ+b!hWo1HbךLA%t~./&Cu>NT`l^2ا ƒY]y:?с^6]p[VY\v>We,J[s"guJMӏF wR Zg>y4{/m<|Hlzw2Ā>6յCډ`e͠>0bB/*9 P6/㠇Ce8Oޑ8E@:9Ӭx=9ڏ*#_@{9b@dB˲™1D[Y}'Sx!9$/~yx/ >PpDž'C_(aE&῾8h 46oG1i,VlyxbT[ohoˇ,or+RU*>^-G+$aLiߟt4Ljxw"CPpI6wD4.,\gvzN.c'U}/1@PSK &GfݼV;]} x>@lxvvݳѳ#T;r9} VksDa C`C],Ac`FmjҰSZѵ-xٖM +~@U͊iy_`ӳ*Z_|@zz6@ {7`xB -p1H'p0tF2U-ȔL.G.^suk}kI6sԿ!V07y6sV|:O]7\)M( o(Y`c!JD0+E gF)ci SK_<~4'qE靻Y!5>aI ~!v@vL?[7f-8dyX՞oY#P`HfMq6\W^Lk|;8IrjWx}q \HWpׯ h ( eϜ/>v΁kq3+@BGՆ`#<:o&@G-cy݁T.lOIQ0 V* ߎL{t{bْ3ט~v-hhC/VɣdޱV,UeKc\)dr]3}Iv*Lw:lӖwߋ0偾Zad;r`fXRقs~oεbZstӎ(kri` 9iCoZʎ*im:*k &BH ;DԆ(Jj).§#hla8lsVȰZV1-ڂBU̪=PI1ASi= azna.h@0QC`]#bFU4 5ۈșG`K{ SGbNK>ާ|P}Rkm<>RzWvHr`OtFVKoz\ 8e%kOg0mXK=ahOP g3}nj8@,L|3nΘ-qv7gw_}y={?1~_Kmn5aMvrnv2f9{o}:Hj)^x V\WC߅fvDl@ $`i !YC8h)$P2c6@j5Q(bi&<#fBr-]fD%V?OAv ,=>g/@.Q3+T MAꗥ(,)H=HdhrKDh5\bK /$b81èLN1_J+?чi5ʡ! $3ڴQkHEޑ> g uQBDHIZ؈, 6;* 0iT:9'}5%J)2 Pҫf|LSG9-Zl3A䱧 #`4"FM) DM 4 ES)x|l u/֋/~. 
sӀ.(LLg'%@} V}R_ᚮV:?HQ5csggszaVbt.~H.92EK.TuS%OrD떋A}Gv x8uhUք|"%S|Ĺa 9-*:Yp[@Z&$ 5eJXCO│NY0"IA0IƝd|S?M /n%e|4Jw:Gj}CW^)L:B]u ȖZ2r+0AEs̵"c T -GSޱ"{weE92I!;͖zjgdEǓ t;ϴgxt]xn5hqx9W( 1iN4.r (Rh ĢkĨ{(,qARaF&pfDKAYjHzٖVپ^fR"_C( l({7U*;n^y,XW߸@RU-0-K~So_v CWުw:oyхy2-]a/p. tb%wF l~D6ڒjܹ;$Lkj՝}UKJRcNihަ3,1WQ[x8n{LhE]d0Mwß]"Hըl1Qg qUo,:CB&qJ붊YRq< L!j E ۤhC䓺wC҈Hpk*:nh.ύ0(cHXeo{ XW FiÜ(sj\w ;畑r_LR/5PBk*w85QyBYTЄ/]!%l7S?jd TJ[j+r"VbMpL*/8M+3E8ng aL[w}]gR[ۻՂ/&fz,@*ƴƦ3F1v-]"`%9I#~RXVEy={Rb*t#5$}%_w*ׂ|\4D$*aw8Y=KR.Ya$31J ױ2! @`t $aTs QK]\jQ [JJj0+-ZME[JR[a^jq`W)`F@WQk嵠R>\*%T}OV'Xc&nGQ+~kRR,v6KYm+_Jt^)kU Rj1OKޚ~ jCZ 9\ʘԁ%xkSjR1TI?\Kfp3bѣrH̦>ġN"1ͭtzئ Ongu詨-F-]zFJwջw@HffIdeKASڒ8[1'GiM  |o(t?-N96YRnKr{dp\n-dt<,~ b [ٲ~&rSJudCd CLV3)#(f+{t1>|.r',)|B"$in4GgMi;±3tFE<:JE@qbf H _9)||zdg;>{S^{l=P=/KEyV'Rh0It r C3M?,Zm:ˣTc"ؑyrwS5?u&#[{~ ~&||JLOtч^lg Hys+ Xȧs"v=5b6MW_ą8}FpyHEs"<]9K3L{OOQAK .1>w+Q|H3u8VBzu{թ= ̋boy%w= U3ĕOyNBv=|JZkXk"dB5f%&oJ9NxRX_, +oz/#[k^RƙΤQ/"gox{EW|W%7k|Ey, m/6ƕLa0M@(̤hίs  K&k&Qƥ$!'Ŗ5HUJ*B0fO6j(cQBBWaHz1Sz#A1k` *Z@Id9~><}n:k+HRC4 Oh o: ̽^5*G, G~>jE ^(,Ƒ3 f"x5ǖCx<'DZ0  #V i#)O?BkѤCjs>Nd7fEAhZΝ~ܟhey7yN5Hhy7?}E /@ܹwFM,n5y39iZS/{{i霽6[,Geet{A"5^G":C7x|ȪSXR '$Rǒb)-$ ֞CCQY ?!j٪e34Q\˃U*f@f Ŵw@XK&)POZi" f"9D4Bj {DtAygTfر@:4cZ%1 D d$#Gx@1oޜkP '{O귛5ד&WMZe+s;l)"UY,ziSJ[DBlo, A{ PN:AX n ; )MRo#PBDRxrkmH /~?9Yns6$ŁЯ(ERԐR=O4{U]]]]]8ߗ6)%~+3yAe @(# s 4A *&xhe.`" [b.&^!+iڻȕF*,\g1˺Wr1eyYNFK,+pʝOF^"K}q/zm󬇪X5HUJ ҎDj8ï]wAʧ jZ&**^ C{!"-1 U,AC@<IozsPӕkkr-Y㚏pTYZ8CjQCDUR;ɬOѡ3SHN<ɚڻ%5S(UߒMvs-[r2梱iOy|3,{TVz^B)Ƨu\2!@wq \gh gMLNNg۽;ӡ }[Q@(ܿ1FgP$886L)P/$?U\_{cM"'Hqğף; J;\fdʕN\S.K:p|Ok!qv$"Bu~ݖn$a\wJBPIEl- SI'kACx)ےAb b*U3\VC%(L;#B=UӺV-Vʴ"ea(p@u>(JR+ 4"FiN770*=Ve1G$UkSr1-b0*j=K soo>qr*i_?C޼~x~Rh0ϻx@G?On "ƈ׃#z/?:e|L'Bw3faF>3+l~N}$\RA;:\ H x}6 w65% T7ЦT7, E=YBk-oVO ׯ6MXO_4ӛafr,Pſl+-MTVaY~tD4V:\ބ: :ջ_У&n3%,M[O$mȓcIɤ%6-4*z|]V(,4_J*&7baG4T.F49qi>s$:́ F(V{hD^ak֣*qy- ;Cv=}ɹ.T{1xF'/"k_[IS: ֗ZΝj]|n@䩺3Rd#KXex' hRE]ˎ ؆fVݝ2} LIQ(rȯ<}ܕ_~1cw$T˨,f1-ZFce4v-bmBLURߺ +K$߬˩bmMۇbD/m/ҿvvw?kpV3Ofvv=gW豤 
192.168.126.11:17697: read: connection reset by peer" start-of-body=
Mar 09 13:20:59 crc kubenswrapper[4764]: I0309 13:20:59.082935 4764 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:58756->192.168.126.11:17697: read: connection reset by peer"
Mar 09 13:20:59 crc kubenswrapper[4764]: W0309 13:20:59.398539 4764 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": net/http: TLS handshake timeout
Mar 09 13:20:59 crc kubenswrapper[4764]: I0309 13:20:59.398615 4764 trace.go:236] Trace[72767381]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (09-Mar-2026 13:20:49.396) (total time: 10001ms):
Mar 09 13:20:59 crc kubenswrapper[4764]: Trace[72767381]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (13:20:59.398)
Mar 09 13:20:59 crc kubenswrapper[4764]: Trace[72767381]: [10.001664318s] [10.001664318s] END
Mar 09 13:20:59 crc kubenswrapper[4764]: E0309 13:20:59.398633 4764 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError"
Mar 09 13:20:59 crc kubenswrapper[4764]: E0309 13:20:59.399162 4764 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:20:59Z is after 2026-02-23T05:33:13Z" event="&Event{ObjectMeta:{crc.189b2ee66c1ac777 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:20:45.491472247 +0000 UTC m=+0.741644175,LastTimestamp:2026-03-09 13:20:45.491472247 +0000 UTC m=+0.741644175,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 09 13:20:59 crc kubenswrapper[4764]: E0309 13:20:59.402381 4764 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:20:59Z is after 2026-02-23T05:33:13Z" interval="6.4s"
Mar 09 13:20:59 crc kubenswrapper[4764]: I0309 13:20:59.403395 4764 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:20:59Z is after 2026-02-23T05:33:13Z
Mar 09 13:20:59 crc kubenswrapper[4764]: E0309 13:20:59.405654 4764 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:20:59Z is after 2026-02-23T05:33:13Z" node="crc"
Mar 09 13:20:59 crc kubenswrapper[4764]: I0309 13:20:59.408168 4764 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403}
Mar 09 13:20:59 crc kubenswrapper[4764]: I0309 13:20:59.408213 4764 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403"
Mar 09 13:20:59 crc kubenswrapper[4764]: E0309 13:20:59.408547 4764 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:20:59Z is after 2026-02-23T05:33:13Z" logger="UnhandledError"
Mar 09 13:20:59 crc kubenswrapper[4764]: W0309 13:20:59.408729 4764 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:20:59Z is after 2026-02-23T05:33:13Z
Mar 09 13:20:59 crc kubenswrapper[4764]: E0309 13:20:59.408798 4764 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:20:59Z is after 2026-02-23T05:33:13Z" logger="UnhandledError"
Mar 09 13:20:59 crc kubenswrapper[4764]: W0309 13:20:59.413392 4764 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:20:59Z is after 2026-02-23T05:33:13Z
Mar 09 13:20:59 crc kubenswrapper[4764]: E0309 13:20:59.413450 4764 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:20:59Z is after 2026-02-23T05:33:13Z" logger="UnhandledError"
Mar 09 13:20:59 crc kubenswrapper[4764]: I0309 13:20:59.414724 4764 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403}
Mar 09 13:20:59 crc kubenswrapper[4764]: I0309 13:20:59.414802 4764 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403"
Mar 09 13:20:59 crc kubenswrapper[4764]: I0309 13:20:59.496784 4764 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:20:59Z is after 2026-02-23T05:33:13Z
Mar 09 13:20:59 crc kubenswrapper[4764]: I0309 13:20:59.637032 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log"
Mar 09 13:20:59 crc kubenswrapper[4764]: I0309 13:20:59.639263 4764 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="62b986f679483da9ce4281abd51942798908d1dd6876ca39c947d22c8cd8458c" exitCode=255
Mar 09 13:20:59 crc kubenswrapper[4764]: I0309 13:20:59.639350 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"62b986f679483da9ce4281abd51942798908d1dd6876ca39c947d22c8cd8458c"}
Mar 09 13:20:59 crc kubenswrapper[4764]: I0309 13:20:59.639834 4764 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 09 13:20:59 crc kubenswrapper[4764]: I0309 13:20:59.641132 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 13:20:59 crc kubenswrapper[4764]: I0309 13:20:59.641179 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 13:20:59 crc kubenswrapper[4764]: I0309 13:20:59.641199 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 13:20:59 crc kubenswrapper[4764]: I0309 13:20:59.642001 4764 scope.go:117] "RemoveContainer" containerID="62b986f679483da9ce4281abd51942798908d1dd6876ca39c947d22c8cd8458c"
Mar 09 13:21:00 crc kubenswrapper[4764]: I0309 13:21:00.205476 4764 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok
Mar 09 13:21:00 crc kubenswrapper[4764]: [+]log ok
Mar 09 13:21:00 crc kubenswrapper[4764]: [+]etcd ok
Mar 09 13:21:00 crc kubenswrapper[4764]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok
Mar 09 13:21:00 crc kubenswrapper[4764]: [+]poststarthook/openshift.io-api-request-count-filter ok
Mar 09 13:21:00 crc kubenswrapper[4764]: [+]poststarthook/openshift.io-startkubeinformers ok
Mar 09 13:21:00 crc kubenswrapper[4764]: [+]poststarthook/openshift.io-openshift-apiserver-reachable ok
Mar 09 13:21:00 crc kubenswrapper[4764]: [+]poststarthook/openshift.io-oauth-apiserver-reachable ok
Mar 09 13:21:00 crc kubenswrapper[4764]: [+]poststarthook/start-apiserver-admission-initializer ok
Mar 09 13:21:00 crc kubenswrapper[4764]: [+]poststarthook/generic-apiserver-start-informers ok
Mar 09 13:21:00 crc kubenswrapper[4764]: [+]poststarthook/priority-and-fairness-config-consumer ok
Mar 09 13:21:00 crc kubenswrapper[4764]: [+]poststarthook/priority-and-fairness-filter ok
Mar 09 13:21:00 crc kubenswrapper[4764]: [+]poststarthook/storage-object-count-tracker-hook ok
Mar 09 13:21:00 crc kubenswrapper[4764]: [+]poststarthook/start-apiextensions-informers ok
Mar 09 13:21:00 crc kubenswrapper[4764]: [+]poststarthook/start-apiextensions-controllers ok
Mar 09 13:21:00 crc kubenswrapper[4764]: [+]poststarthook/crd-informer-synced ok
Mar 09 13:21:00 crc kubenswrapper[4764]: [+]poststarthook/start-system-namespaces-controller ok
Mar 09 13:21:00 crc kubenswrapper[4764]: [+]poststarthook/start-cluster-authentication-info-controller ok
Mar 09 13:21:00 crc kubenswrapper[4764]: [+]poststarthook/start-kube-apiserver-identity-lease-controller ok
Mar 09 13:21:00 crc kubenswrapper[4764]: [+]poststarthook/start-kube-apiserver-identity-lease-garbage-collector ok
Mar 09 13:21:00 crc kubenswrapper[4764]: [+]poststarthook/start-legacy-token-tracking-controller ok
Mar 09 13:21:00 crc kubenswrapper[4764]: [+]poststarthook/start-service-ip-repair-controllers ok
Mar 09 13:21:00 crc kubenswrapper[4764]: [-]poststarthook/rbac/bootstrap-roles failed: reason withheld
Mar 09 13:21:00 crc kubenswrapper[4764]: [-]poststarthook/scheduling/bootstrap-system-priority-classes failed: reason withheld
Mar 09 13:21:00 crc kubenswrapper[4764]: [+]poststarthook/priority-and-fairness-config-producer ok
Mar 09 13:21:00 crc kubenswrapper[4764]: [+]poststarthook/bootstrap-controller ok
Mar 09 13:21:00 crc kubenswrapper[4764]: [+]poststarthook/aggregator-reload-proxy-client-cert ok
Mar 09 13:21:00 crc kubenswrapper[4764]: [+]poststarthook/start-kube-aggregator-informers ok
Mar 09 13:21:00 crc kubenswrapper[4764]: [+]poststarthook/apiservice-status-local-available-controller ok
Mar 09 13:21:00 crc kubenswrapper[4764]: [+]poststarthook/apiservice-status-remote-available-controller ok
Mar 09 13:21:00 crc kubenswrapper[4764]: [+]poststarthook/apiservice-registration-controller ok
Mar 09 13:21:00 crc kubenswrapper[4764]: [+]poststarthook/apiservice-wait-for-first-sync ok
Mar 09 13:21:00 crc kubenswrapper[4764]: [+]poststarthook/apiservice-discovery-controller ok
Mar 09 13:21:00 crc kubenswrapper[4764]: [+]poststarthook/kube-apiserver-autoregistration ok
Mar 09 13:21:00 crc kubenswrapper[4764]: [+]autoregister-completion ok
Mar 09 13:21:00 crc kubenswrapper[4764]: [+]poststarthook/apiservice-openapi-controller ok
Mar 09 13:21:00 crc kubenswrapper[4764]: [+]poststarthook/apiservice-openapiv3-controller ok
Mar 09 13:21:00 crc kubenswrapper[4764]: livez check failed
Mar 09 13:21:00 crc kubenswrapper[4764]: I0309 13:21:00.206054 4764 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 09 13:21:00 crc kubenswrapper[4764]: I0309 13:21:00.270013 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc"
Mar 09 13:21:00 crc kubenswrapper[4764]: I0309 13:21:00.270434 4764 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 09 13:21:00 crc kubenswrapper[4764]: I0309 13:21:00.272295 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 13:21:00 crc kubenswrapper[4764]: I0309 13:21:00.272347 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 13:21:00 crc kubenswrapper[4764]: I0309 13:21:00.272366 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 13:21:00 crc kubenswrapper[4764]: I0309 13:21:00.306150 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc"
Mar 09 13:21:00 crc kubenswrapper[4764]: I0309 13:21:00.497104 4764 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:00Z is after 2026-02-23T05:33:13Z
Mar 09 13:21:00 crc kubenswrapper[4764]: I0309 13:21:00.644134 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log"
Mar 09 13:21:00 crc kubenswrapper[4764]: I0309 13:21:00.644844 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log"
Mar 09 13:21:00 crc kubenswrapper[4764]: I0309 13:21:00.646308 4764 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="328b7a569c478d1b319039ad9297eb410278ee0ad8035b6d6a7e45dfc164092d" exitCode=255
Mar 09 13:21:00 crc kubenswrapper[4764]: I0309 13:21:00.646387 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"328b7a569c478d1b319039ad9297eb410278ee0ad8035b6d6a7e45dfc164092d"}
Mar 09 13:21:00 crc kubenswrapper[4764]: I0309 13:21:00.646434 4764 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 09 13:21:00 crc kubenswrapper[4764]: I0309 13:21:00.646452 4764 scope.go:117] "RemoveContainer" containerID="62b986f679483da9ce4281abd51942798908d1dd6876ca39c947d22c8cd8458c"
Mar 09 13:21:00 crc kubenswrapper[4764]: I0309 13:21:00.646636 4764 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 09 13:21:00 crc kubenswrapper[4764]: I0309 13:21:00.647388 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 13:21:00 crc kubenswrapper[4764]: I0309 13:21:00.647450 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 13:21:00 crc kubenswrapper[4764]: I0309 13:21:00.647469 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 13:21:00 crc kubenswrapper[4764]: I0309 13:21:00.647975 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 13:21:00 crc kubenswrapper[4764]: I0309 13:21:00.648014 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 13:21:00 crc kubenswrapper[4764]: I0309 13:21:00.648026 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 13:21:00 crc kubenswrapper[4764]: I0309 13:21:00.648922 4764 scope.go:117] "RemoveContainer" containerID="328b7a569c478d1b319039ad9297eb410278ee0ad8035b6d6a7e45dfc164092d"
Mar 09 13:21:00 crc kubenswrapper[4764]: E0309 13:21:00.649156 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Mar 09 13:21:00 crc kubenswrapper[4764]: I0309 13:21:00.662335 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc"
Mar 09 13:21:01 crc kubenswrapper[4764]: I0309 13:21:01.498561 4764 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:01Z is after 2026-02-23T05:33:13Z
Mar 09 13:21:01 crc kubenswrapper[4764]: I0309 13:21:01.650558 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log"
Mar 09 13:21:01 crc kubenswrapper[4764]: I0309 13:21:01.653241 4764 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 09 13:21:01 crc kubenswrapper[4764]: I0309 13:21:01.654063 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 13:21:01 crc kubenswrapper[4764]: I0309 13:21:01.654105 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 13:21:01 crc kubenswrapper[4764]: I0309 13:21:01.654117 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 13:21:02 crc kubenswrapper[4764]: I0309 13:21:02.126276 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 09 13:21:02 crc kubenswrapper[4764]: I0309 13:21:02.126464 4764 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 09 13:21:02 crc kubenswrapper[4764]: I0309 13:21:02.127847 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 13:21:02 crc kubenswrapper[4764]: I0309 13:21:02.127916 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 13:21:02 crc kubenswrapper[4764]: I0309 13:21:02.127940 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 13:21:02 crc kubenswrapper[4764]: I0309 13:21:02.499296 4764 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:02Z is after 2026-02-23T05:33:13Z
Mar 09 13:21:02 crc kubenswrapper[4764]: I0309 13:21:02.918265 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 09 13:21:02 crc kubenswrapper[4764]: I0309 13:21:02.918504 4764 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 09 13:21:02 crc kubenswrapper[4764]: I0309 13:21:02.919638 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 13:21:02 crc kubenswrapper[4764]: I0309 13:21:02.919695 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 13:21:02 crc kubenswrapper[4764]: I0309 13:21:02.919708 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 13:21:02 crc kubenswrapper[4764]: I0309 13:21:02.920278 4764 scope.go:117] "RemoveContainer" containerID="328b7a569c478d1b319039ad9297eb410278ee0ad8035b6d6a7e45dfc164092d"
Mar 09 13:21:02 crc kubenswrapper[4764]: E0309 13:21:02.920455 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Mar 09 13:21:03 crc kubenswrapper[4764]: W0309 13:21:03.250798 4764 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:03Z is after 2026-02-23T05:33:13Z
Mar 09 13:21:03 crc kubenswrapper[4764]: E0309 13:21:03.250867 4764 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:03Z is after 2026-02-23T05:33:13Z" logger="UnhandledError"
Mar 09 13:21:03 crc kubenswrapper[4764]: I0309 13:21:03.496957 4764 csi_plugin.go:884] Failed to contact API server
when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:03Z is after 2026-02-23T05:33:13Z Mar 09 13:21:03 crc kubenswrapper[4764]: W0309 13:21:03.867540 4764 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:03Z is after 2026-02-23T05:33:13Z Mar 09 13:21:03 crc kubenswrapper[4764]: E0309 13:21:03.867626 4764 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:03Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 09 13:21:04 crc kubenswrapper[4764]: W0309 13:21:04.211856 4764 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:04Z is after 2026-02-23T05:33:13Z Mar 09 13:21:04 crc kubenswrapper[4764]: E0309 13:21:04.211924 4764 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not 
yet valid: current time 2026-03-09T13:21:04Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 09 13:21:04 crc kubenswrapper[4764]: I0309 13:21:04.499489 4764 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:04Z is after 2026-02-23T05:33:13Z Mar 09 13:21:04 crc kubenswrapper[4764]: W0309 13:21:04.929352 4764 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:04Z is after 2026-02-23T05:33:13Z Mar 09 13:21:04 crc kubenswrapper[4764]: E0309 13:21:04.930817 4764 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:04Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 09 13:21:05 crc kubenswrapper[4764]: I0309 13:21:05.204985 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 09 13:21:05 crc kubenswrapper[4764]: I0309 13:21:05.205578 4764 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 13:21:05 crc kubenswrapper[4764]: I0309 13:21:05.207002 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:21:05 crc 
kubenswrapper[4764]: I0309 13:21:05.207035 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:21:05 crc kubenswrapper[4764]: I0309 13:21:05.207044 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:21:05 crc kubenswrapper[4764]: I0309 13:21:05.207504 4764 scope.go:117] "RemoveContainer" containerID="328b7a569c478d1b319039ad9297eb410278ee0ad8035b6d6a7e45dfc164092d" Mar 09 13:21:05 crc kubenswrapper[4764]: E0309 13:21:05.207710 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 09 13:21:05 crc kubenswrapper[4764]: I0309 13:21:05.209960 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 09 13:21:05 crc kubenswrapper[4764]: I0309 13:21:05.499387 4764 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:05Z is after 2026-02-23T05:33:13Z Mar 09 13:21:05 crc kubenswrapper[4764]: E0309 13:21:05.620510 4764 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 09 13:21:05 crc kubenswrapper[4764]: I0309 13:21:05.663253 4764 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 13:21:05 crc kubenswrapper[4764]: I0309 13:21:05.664227 4764 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:21:05 crc kubenswrapper[4764]: I0309 13:21:05.664260 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:21:05 crc kubenswrapper[4764]: I0309 13:21:05.664271 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:21:05 crc kubenswrapper[4764]: I0309 13:21:05.664931 4764 scope.go:117] "RemoveContainer" containerID="328b7a569c478d1b319039ad9297eb410278ee0ad8035b6d6a7e45dfc164092d" Mar 09 13:21:05 crc kubenswrapper[4764]: E0309 13:21:05.665155 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 09 13:21:05 crc kubenswrapper[4764]: I0309 13:21:05.805825 4764 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 13:21:05 crc kubenswrapper[4764]: E0309 13:21:05.806912 4764 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:05Z is after 2026-02-23T05:33:13Z" interval="7s" Mar 09 13:21:05 crc kubenswrapper[4764]: I0309 13:21:05.808061 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:21:05 crc kubenswrapper[4764]: I0309 13:21:05.808137 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 09 13:21:05 crc kubenswrapper[4764]: I0309 13:21:05.808159 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:21:05 crc kubenswrapper[4764]: I0309 13:21:05.808194 4764 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 09 13:21:05 crc kubenswrapper[4764]: E0309 13:21:05.813510 4764 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:05Z is after 2026-02-23T05:33:13Z" node="crc" Mar 09 13:21:06 crc kubenswrapper[4764]: I0309 13:21:06.090678 4764 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 09 13:21:06 crc kubenswrapper[4764]: I0309 13:21:06.090755 4764 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 09 13:21:06 crc kubenswrapper[4764]: I0309 13:21:06.500018 4764 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:06Z is after 2026-02-23T05:33:13Z Mar 09 13:21:07 crc 
kubenswrapper[4764]: I0309 13:21:07.496587 4764 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:07Z is after 2026-02-23T05:33:13Z Mar 09 13:21:07 crc kubenswrapper[4764]: I0309 13:21:07.913841 4764 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 09 13:21:07 crc kubenswrapper[4764]: E0309 13:21:07.917877 4764 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:07Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 09 13:21:08 crc kubenswrapper[4764]: I0309 13:21:08.468194 4764 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 09 13:21:08 crc kubenswrapper[4764]: I0309 13:21:08.468501 4764 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 13:21:08 crc kubenswrapper[4764]: I0309 13:21:08.469866 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:21:08 crc kubenswrapper[4764]: I0309 13:21:08.469900 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:21:08 crc kubenswrapper[4764]: I0309 13:21:08.469940 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:21:08 crc 
kubenswrapper[4764]: I0309 13:21:08.470508 4764 scope.go:117] "RemoveContainer" containerID="328b7a569c478d1b319039ad9297eb410278ee0ad8035b6d6a7e45dfc164092d" Mar 09 13:21:08 crc kubenswrapper[4764]: E0309 13:21:08.470687 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 09 13:21:08 crc kubenswrapper[4764]: I0309 13:21:08.496948 4764 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:08Z is after 2026-02-23T05:33:13Z Mar 09 13:21:09 crc kubenswrapper[4764]: E0309 13:21:09.404487 4764 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:09Z is after 2026-02-23T05:33:13Z" event="&Event{ObjectMeta:{crc.189b2ee66c1ac777 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:20:45.491472247 +0000 UTC m=+0.741644175,LastTimestamp:2026-03-09 13:20:45.491472247 +0000 UTC m=+0.741644175,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 13:21:09 crc 
kubenswrapper[4764]: I0309 13:21:09.497735 4764 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:09Z is after 2026-02-23T05:33:13Z Mar 09 13:21:10 crc kubenswrapper[4764]: I0309 13:21:10.497804 4764 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:10Z is after 2026-02-23T05:33:13Z Mar 09 13:21:10 crc kubenswrapper[4764]: W0309 13:21:10.677170 4764 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:10Z is after 2026-02-23T05:33:13Z Mar 09 13:21:10 crc kubenswrapper[4764]: E0309 13:21:10.677231 4764 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:10Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 09 13:21:11 crc kubenswrapper[4764]: I0309 13:21:11.498261 4764 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired 
or is not yet valid: current time 2026-03-09T13:21:11Z is after 2026-02-23T05:33:13Z Mar 09 13:21:12 crc kubenswrapper[4764]: I0309 13:21:12.496844 4764 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:12Z is after 2026-02-23T05:33:13Z Mar 09 13:21:12 crc kubenswrapper[4764]: E0309 13:21:12.813321 4764 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:12Z is after 2026-02-23T05:33:13Z" interval="7s" Mar 09 13:21:12 crc kubenswrapper[4764]: I0309 13:21:12.814363 4764 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 13:21:12 crc kubenswrapper[4764]: I0309 13:21:12.816299 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:21:12 crc kubenswrapper[4764]: I0309 13:21:12.816339 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:21:12 crc kubenswrapper[4764]: I0309 13:21:12.816354 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:21:12 crc kubenswrapper[4764]: I0309 13:21:12.816382 4764 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 09 13:21:12 crc kubenswrapper[4764]: E0309 13:21:12.821306 4764 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet 
valid: current time 2026-03-09T13:21:12Z is after 2026-02-23T05:33:13Z" node="crc" Mar 09 13:21:12 crc kubenswrapper[4764]: W0309 13:21:12.967400 4764 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:12Z is after 2026-02-23T05:33:13Z Mar 09 13:21:12 crc kubenswrapper[4764]: E0309 13:21:12.967710 4764 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:12Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 09 13:21:13 crc kubenswrapper[4764]: I0309 13:21:13.497839 4764 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:13Z is after 2026-02-23T05:33:13Z Mar 09 13:21:14 crc kubenswrapper[4764]: I0309 13:21:14.497472 4764 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:14Z is after 2026-02-23T05:33:13Z Mar 09 13:21:14 crc kubenswrapper[4764]: W0309 13:21:14.754804 4764 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get 
"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:14Z is after 2026-02-23T05:33:13Z Mar 09 13:21:14 crc kubenswrapper[4764]: E0309 13:21:14.754948 4764 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:14Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 09 13:21:15 crc kubenswrapper[4764]: I0309 13:21:15.497174 4764 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:15Z is after 2026-02-23T05:33:13Z Mar 09 13:21:15 crc kubenswrapper[4764]: E0309 13:21:15.620731 4764 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 09 13:21:16 crc kubenswrapper[4764]: I0309 13:21:16.090057 4764 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 09 13:21:16 crc kubenswrapper[4764]: I0309 13:21:16.090218 4764 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" 
podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 09 13:21:16 crc kubenswrapper[4764]: I0309 13:21:16.090431 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 09 13:21:16 crc kubenswrapper[4764]: I0309 13:21:16.090733 4764 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 13:21:16 crc kubenswrapper[4764]: I0309 13:21:16.092394 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:21:16 crc kubenswrapper[4764]: I0309 13:21:16.092445 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:21:16 crc kubenswrapper[4764]: I0309 13:21:16.092460 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:21:16 crc kubenswrapper[4764]: I0309 13:21:16.093080 4764 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="cluster-policy-controller" containerStatusID={"Type":"cri-o","ID":"fb990e0575b0c9fc6674c0828a3e6e8ebaba534f1375aae50660850aefb97d69"} pod="openshift-kube-controller-manager/kube-controller-manager-crc" containerMessage="Container cluster-policy-controller failed startup probe, will be restarted" Mar 09 13:21:16 crc kubenswrapper[4764]: I0309 13:21:16.093286 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" containerID="cri-o://fb990e0575b0c9fc6674c0828a3e6e8ebaba534f1375aae50660850aefb97d69" gracePeriod=30 Mar 09 
13:21:16 crc kubenswrapper[4764]: W0309 13:21:16.153124 4764 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:16Z is after 2026-02-23T05:33:13Z Mar 09 13:21:16 crc kubenswrapper[4764]: E0309 13:21:16.153291 4764 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:16Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 09 13:21:16 crc kubenswrapper[4764]: I0309 13:21:16.497024 4764 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:16Z is after 2026-02-23T05:33:13Z Mar 09 13:21:16 crc kubenswrapper[4764]: I0309 13:21:16.691115 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Mar 09 13:21:16 crc kubenswrapper[4764]: I0309 13:21:16.691419 4764 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="fb990e0575b0c9fc6674c0828a3e6e8ebaba534f1375aae50660850aefb97d69" exitCode=255 Mar 09 13:21:16 crc kubenswrapper[4764]: I0309 13:21:16.691761 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" 
event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"fb990e0575b0c9fc6674c0828a3e6e8ebaba534f1375aae50660850aefb97d69"} Mar 09 13:21:16 crc kubenswrapper[4764]: I0309 13:21:16.691794 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"698e8e53f96938b60a1aefe2ccea945879623e79d8b0f8836a67cb38ec85918f"} Mar 09 13:21:16 crc kubenswrapper[4764]: I0309 13:21:16.691893 4764 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 13:21:16 crc kubenswrapper[4764]: I0309 13:21:16.692794 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:21:16 crc kubenswrapper[4764]: I0309 13:21:16.692818 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:21:16 crc kubenswrapper[4764]: I0309 13:21:16.692829 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:21:17 crc kubenswrapper[4764]: I0309 13:21:17.497245 4764 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:17Z is after 2026-02-23T05:33:13Z Mar 09 13:21:18 crc kubenswrapper[4764]: I0309 13:21:18.498290 4764 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:18Z is after 2026-02-23T05:33:13Z Mar 09 13:21:19 crc kubenswrapper[4764]: 
E0309 13:21:19.409829 4764 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:19Z is after 2026-02-23T05:33:13Z" event="&Event{ObjectMeta:{crc.189b2ee66c1ac777 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:20:45.491472247 +0000 UTC m=+0.741644175,LastTimestamp:2026-03-09 13:20:45.491472247 +0000 UTC m=+0.741644175,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 13:21:19 crc kubenswrapper[4764]: I0309 13:21:19.498786 4764 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:19Z is after 2026-02-23T05:33:13Z Mar 09 13:21:19 crc kubenswrapper[4764]: I0309 13:21:19.559165 4764 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 13:21:19 crc kubenswrapper[4764]: I0309 13:21:19.560696 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:21:19 crc kubenswrapper[4764]: I0309 13:21:19.560736 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:21:19 crc kubenswrapper[4764]: I0309 13:21:19.560745 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 
09 13:21:19 crc kubenswrapper[4764]: I0309 13:21:19.561266 4764 scope.go:117] "RemoveContainer" containerID="328b7a569c478d1b319039ad9297eb410278ee0ad8035b6d6a7e45dfc164092d" Mar 09 13:21:19 crc kubenswrapper[4764]: E0309 13:21:19.816624 4764 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:19Z is after 2026-02-23T05:33:13Z" interval="7s" Mar 09 13:21:19 crc kubenswrapper[4764]: I0309 13:21:19.821836 4764 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 13:21:19 crc kubenswrapper[4764]: I0309 13:21:19.822997 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:21:19 crc kubenswrapper[4764]: I0309 13:21:19.823025 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:21:19 crc kubenswrapper[4764]: I0309 13:21:19.823034 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:21:19 crc kubenswrapper[4764]: I0309 13:21:19.823054 4764 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 09 13:21:19 crc kubenswrapper[4764]: E0309 13:21:19.825417 4764 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:19Z is after 2026-02-23T05:33:13Z" node="crc" Mar 09 13:21:20 crc kubenswrapper[4764]: I0309 13:21:20.496813 4764 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get 
"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:20Z is after 2026-02-23T05:33:13Z Mar 09 13:21:20 crc kubenswrapper[4764]: I0309 13:21:20.702921 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Mar 09 13:21:20 crc kubenswrapper[4764]: I0309 13:21:20.704098 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"9e2c9fa564beaaf635042afaf0505ac0356f3e639d3ce86340bd1e2df8ced0a3"} Mar 09 13:21:20 crc kubenswrapper[4764]: I0309 13:21:20.704232 4764 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 13:21:20 crc kubenswrapper[4764]: I0309 13:21:20.705090 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:21:20 crc kubenswrapper[4764]: I0309 13:21:20.705136 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:21:20 crc kubenswrapper[4764]: I0309 13:21:20.705151 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:21:20 crc kubenswrapper[4764]: I0309 13:21:20.765831 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 09 13:21:20 crc kubenswrapper[4764]: I0309 13:21:20.766047 4764 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 13:21:20 crc kubenswrapper[4764]: I0309 13:21:20.767423 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Mar 09 13:21:20 crc kubenswrapper[4764]: I0309 13:21:20.767463 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:21:20 crc kubenswrapper[4764]: I0309 13:21:20.767471 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:21:21 crc kubenswrapper[4764]: I0309 13:21:21.497823 4764 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:21Z is after 2026-02-23T05:33:13Z Mar 09 13:21:21 crc kubenswrapper[4764]: I0309 13:21:21.709563 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Mar 09 13:21:21 crc kubenswrapper[4764]: I0309 13:21:21.710299 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Mar 09 13:21:21 crc kubenswrapper[4764]: I0309 13:21:21.711949 4764 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="9e2c9fa564beaaf635042afaf0505ac0356f3e639d3ce86340bd1e2df8ced0a3" exitCode=255 Mar 09 13:21:21 crc kubenswrapper[4764]: I0309 13:21:21.711983 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"9e2c9fa564beaaf635042afaf0505ac0356f3e639d3ce86340bd1e2df8ced0a3"} Mar 09 13:21:21 crc kubenswrapper[4764]: I0309 13:21:21.712017 4764 scope.go:117] "RemoveContainer" 
containerID="328b7a569c478d1b319039ad9297eb410278ee0ad8035b6d6a7e45dfc164092d" Mar 09 13:21:21 crc kubenswrapper[4764]: I0309 13:21:21.712147 4764 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 13:21:21 crc kubenswrapper[4764]: I0309 13:21:21.713370 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:21:21 crc kubenswrapper[4764]: I0309 13:21:21.713400 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:21:21 crc kubenswrapper[4764]: I0309 13:21:21.713409 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:21:21 crc kubenswrapper[4764]: I0309 13:21:21.713906 4764 scope.go:117] "RemoveContainer" containerID="9e2c9fa564beaaf635042afaf0505ac0356f3e639d3ce86340bd1e2df8ced0a3" Mar 09 13:21:21 crc kubenswrapper[4764]: E0309 13:21:21.714067 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 09 13:21:22 crc kubenswrapper[4764]: I0309 13:21:22.496725 4764 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:22Z is after 2026-02-23T05:33:13Z Mar 09 13:21:22 crc kubenswrapper[4764]: I0309 13:21:22.715421 4764 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Mar 09 13:21:22 crc kubenswrapper[4764]: I0309 13:21:22.918823 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 09 13:21:22 crc kubenswrapper[4764]: I0309 13:21:22.919128 4764 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 13:21:22 crc kubenswrapper[4764]: I0309 13:21:22.920771 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:21:22 crc kubenswrapper[4764]: I0309 13:21:22.920841 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:21:22 crc kubenswrapper[4764]: I0309 13:21:22.920854 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:21:22 crc kubenswrapper[4764]: I0309 13:21:22.921615 4764 scope.go:117] "RemoveContainer" containerID="9e2c9fa564beaaf635042afaf0505ac0356f3e639d3ce86340bd1e2df8ced0a3" Mar 09 13:21:22 crc kubenswrapper[4764]: E0309 13:21:22.921877 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 09 13:21:23 crc kubenswrapper[4764]: I0309 13:21:23.089624 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 09 13:21:23 crc kubenswrapper[4764]: I0309 13:21:23.089838 4764 kubelet_node_status.go:401] "Setting node annotation to enable volume controller 
attach/detach" Mar 09 13:21:23 crc kubenswrapper[4764]: I0309 13:21:23.091172 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:21:23 crc kubenswrapper[4764]: I0309 13:21:23.091227 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:21:23 crc kubenswrapper[4764]: I0309 13:21:23.091241 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:21:23 crc kubenswrapper[4764]: I0309 13:21:23.496702 4764 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:23Z is after 2026-02-23T05:33:13Z Mar 09 13:21:24 crc kubenswrapper[4764]: I0309 13:21:24.497415 4764 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:24Z is after 2026-02-23T05:33:13Z Mar 09 13:21:24 crc kubenswrapper[4764]: I0309 13:21:24.833345 4764 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 09 13:21:24 crc kubenswrapper[4764]: E0309 13:21:24.838089 4764 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:24Z is after 
2026-02-23T05:33:13Z" logger="UnhandledError" Mar 09 13:21:24 crc kubenswrapper[4764]: E0309 13:21:24.839394 4764 certificate_manager.go:440] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Reached backoff limit, still unable to rotate certs: timed out waiting for the condition" logger="UnhandledError" Mar 09 13:21:25 crc kubenswrapper[4764]: I0309 13:21:25.497076 4764 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:25Z is after 2026-02-23T05:33:13Z Mar 09 13:21:25 crc kubenswrapper[4764]: E0309 13:21:25.621012 4764 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 09 13:21:26 crc kubenswrapper[4764]: I0309 13:21:26.090111 4764 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 09 13:21:26 crc kubenswrapper[4764]: I0309 13:21:26.090195 4764 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 09 13:21:26 crc kubenswrapper[4764]: I0309 13:21:26.496889 4764 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get 
"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:26Z is after 2026-02-23T05:33:13Z Mar 09 13:21:26 crc kubenswrapper[4764]: E0309 13:21:26.822718 4764 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:26Z is after 2026-02-23T05:33:13Z" interval="7s" Mar 09 13:21:26 crc kubenswrapper[4764]: I0309 13:21:26.825840 4764 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 13:21:26 crc kubenswrapper[4764]: I0309 13:21:26.827603 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:21:26 crc kubenswrapper[4764]: I0309 13:21:26.827751 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:21:26 crc kubenswrapper[4764]: I0309 13:21:26.827772 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:21:26 crc kubenswrapper[4764]: I0309 13:21:26.827822 4764 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 09 13:21:26 crc kubenswrapper[4764]: E0309 13:21:26.831058 4764 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:26Z is after 2026-02-23T05:33:13Z" node="crc" Mar 09 13:21:27 crc kubenswrapper[4764]: I0309 13:21:27.497748 4764 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get 
"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:27Z is after 2026-02-23T05:33:13Z Mar 09 13:21:28 crc kubenswrapper[4764]: I0309 13:21:28.468066 4764 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 09 13:21:28 crc kubenswrapper[4764]: I0309 13:21:28.468268 4764 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 13:21:28 crc kubenswrapper[4764]: I0309 13:21:28.469524 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:21:28 crc kubenswrapper[4764]: I0309 13:21:28.469588 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:21:28 crc kubenswrapper[4764]: I0309 13:21:28.469607 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:21:28 crc kubenswrapper[4764]: I0309 13:21:28.470527 4764 scope.go:117] "RemoveContainer" containerID="9e2c9fa564beaaf635042afaf0505ac0356f3e639d3ce86340bd1e2df8ced0a3" Mar 09 13:21:28 crc kubenswrapper[4764]: E0309 13:21:28.470849 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 09 13:21:28 crc kubenswrapper[4764]: I0309 13:21:28.498918 4764 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get 
"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:28Z is after 2026-02-23T05:33:13Z Mar 09 13:21:28 crc kubenswrapper[4764]: W0309 13:21:28.513098 4764 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:28Z is after 2026-02-23T05:33:13Z Mar 09 13:21:28 crc kubenswrapper[4764]: E0309 13:21:28.513173 4764 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:28Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 09 13:21:29 crc kubenswrapper[4764]: E0309 13:21:29.413698 4764 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:29Z is after 2026-02-23T05:33:13Z" event="&Event{ObjectMeta:{crc.189b2ee66c1ac777 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:20:45.491472247 +0000 UTC m=+0.741644175,LastTimestamp:2026-03-09 13:20:45.491472247 +0000 UTC 
m=+0.741644175,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 13:21:29 crc kubenswrapper[4764]: I0309 13:21:29.499973 4764 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:29Z is after 2026-02-23T05:33:13Z Mar 09 13:21:29 crc kubenswrapper[4764]: W0309 13:21:29.679123 4764 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:29Z is after 2026-02-23T05:33:13Z Mar 09 13:21:29 crc kubenswrapper[4764]: E0309 13:21:29.679249 4764 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:29Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 09 13:21:30 crc kubenswrapper[4764]: I0309 13:21:30.497405 4764 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:30Z is after 2026-02-23T05:33:13Z Mar 09 13:21:31 crc kubenswrapper[4764]: I0309 13:21:31.496673 4764 csi_plugin.go:884] Failed to contact 
API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:31Z is after 2026-02-23T05:33:13Z Mar 09 13:21:32 crc kubenswrapper[4764]: I0309 13:21:32.497062 4764 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:32Z is after 2026-02-23T05:33:13Z Mar 09 13:21:33 crc kubenswrapper[4764]: I0309 13:21:33.118692 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 09 13:21:33 crc kubenswrapper[4764]: I0309 13:21:33.118863 4764 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 13:21:33 crc kubenswrapper[4764]: I0309 13:21:33.119987 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:21:33 crc kubenswrapper[4764]: I0309 13:21:33.120028 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:21:33 crc kubenswrapper[4764]: I0309 13:21:33.120038 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:21:33 crc kubenswrapper[4764]: I0309 13:21:33.497696 4764 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:33Z is after 2026-02-23T05:33:13Z Mar 09 13:21:33 crc 
kubenswrapper[4764]: E0309 13:21:33.828292 4764 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:33Z is after 2026-02-23T05:33:13Z" interval="7s" Mar 09 13:21:33 crc kubenswrapper[4764]: I0309 13:21:33.831315 4764 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 13:21:33 crc kubenswrapper[4764]: I0309 13:21:33.832466 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:21:33 crc kubenswrapper[4764]: I0309 13:21:33.832498 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:21:33 crc kubenswrapper[4764]: I0309 13:21:33.832564 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:21:33 crc kubenswrapper[4764]: I0309 13:21:33.832586 4764 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 09 13:21:33 crc kubenswrapper[4764]: E0309 13:21:33.836398 4764 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:33Z is after 2026-02-23T05:33:13Z" node="crc" Mar 09 13:21:34 crc kubenswrapper[4764]: W0309 13:21:34.391418 4764 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:34Z is after 2026-02-23T05:33:13Z 
Mar 09 13:21:34 crc kubenswrapper[4764]: E0309 13:21:34.391521 4764 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:34Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 09 13:21:34 crc kubenswrapper[4764]: I0309 13:21:34.497083 4764 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:34Z is after 2026-02-23T05:33:13Z Mar 09 13:21:35 crc kubenswrapper[4764]: I0309 13:21:35.497123 4764 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:35Z is after 2026-02-23T05:33:13Z Mar 09 13:21:35 crc kubenswrapper[4764]: E0309 13:21:35.621760 4764 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 09 13:21:36 crc kubenswrapper[4764]: I0309 13:21:36.090289 4764 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 09 13:21:36 crc kubenswrapper[4764]: I0309 13:21:36.090410 4764 prober.go:107] "Probe failed" 
probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 09 13:21:36 crc kubenswrapper[4764]: I0309 13:21:36.500311 4764 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:36Z is after 2026-02-23T05:33:13Z Mar 09 13:21:37 crc kubenswrapper[4764]: I0309 13:21:37.497053 4764 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:37Z is after 2026-02-23T05:33:13Z Mar 09 13:21:37 crc kubenswrapper[4764]: W0309 13:21:37.556076 4764 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:37Z is after 2026-02-23T05:33:13Z Mar 09 13:21:37 crc kubenswrapper[4764]: E0309 13:21:37.556141 4764 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:37Z is 
after 2026-02-23T05:33:13Z" logger="UnhandledError"
Mar 09 13:21:38 crc kubenswrapper[4764]: I0309 13:21:38.496435 4764 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:38Z is after 2026-02-23T05:33:13Z
Mar 09 13:21:39 crc kubenswrapper[4764]: E0309 13:21:39.420049 4764 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:39Z is after 2026-02-23T05:33:13Z" event="&Event{ObjectMeta:{crc.189b2ee66c1ac777 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:20:45.491472247 +0000 UTC m=+0.741644175,LastTimestamp:2026-03-09 13:20:45.491472247 +0000 UTC m=+0.741644175,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 09 13:21:39 crc kubenswrapper[4764]: I0309 13:21:39.497080 4764 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:39Z is after 2026-02-23T05:33:13Z
Mar 09 13:21:40 crc kubenswrapper[4764]: I0309 13:21:40.502309 4764 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 09 13:21:40 crc kubenswrapper[4764]: I0309 13:21:40.836993 4764 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 09 13:21:40 crc kubenswrapper[4764]: I0309 13:21:40.838971 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 13:21:40 crc kubenswrapper[4764]: I0309 13:21:40.839049 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 13:21:40 crc kubenswrapper[4764]: I0309 13:21:40.839079 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 13:21:40 crc kubenswrapper[4764]: I0309 13:21:40.839133 4764 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Mar 09 13:21:40 crc kubenswrapper[4764]: E0309 13:21:40.839723 4764 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s"
Mar 09 13:21:40 crc kubenswrapper[4764]: E0309 13:21:40.846128 4764 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc"
Mar 09 13:21:41 crc kubenswrapper[4764]: I0309 13:21:41.498100 4764 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 09 13:21:42 crc kubenswrapper[4764]: I0309 13:21:42.499298 4764 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 09 13:21:43 crc kubenswrapper[4764]: I0309 13:21:43.497777 4764 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 09 13:21:43 crc kubenswrapper[4764]: I0309 13:21:43.559463 4764 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 09 13:21:43 crc kubenswrapper[4764]: I0309 13:21:43.560441 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 13:21:43 crc kubenswrapper[4764]: I0309 13:21:43.560471 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 13:21:43 crc kubenswrapper[4764]: I0309 13:21:43.560481 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 13:21:43 crc kubenswrapper[4764]: I0309 13:21:43.561699 4764 scope.go:117] "RemoveContainer" containerID="9e2c9fa564beaaf635042afaf0505ac0356f3e639d3ce86340bd1e2df8ced0a3"
Mar 09 13:21:43 crc kubenswrapper[4764]: I0309 13:21:43.772179 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log"
Mar 09 13:21:43 crc kubenswrapper[4764]: I0309 13:21:43.774309 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"033930947dc527026e2797f16a6dd9181c8c7d8ec88b09031a132f55b953c356"}
Mar 09 13:21:43 crc kubenswrapper[4764]: I0309 13:21:43.774436 4764 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 09 13:21:43 crc kubenswrapper[4764]: I0309 13:21:43.775248 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 13:21:43 crc kubenswrapper[4764]: I0309 13:21:43.775291 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 13:21:43 crc kubenswrapper[4764]: I0309 13:21:43.775304 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 13:21:44 crc kubenswrapper[4764]: I0309 13:21:44.500349 4764 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 09 13:21:44 crc kubenswrapper[4764]: I0309 13:21:44.777931 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log"
Mar 09 13:21:44 crc kubenswrapper[4764]: I0309 13:21:44.779228 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log"
Mar 09 13:21:44 crc kubenswrapper[4764]: I0309 13:21:44.781458 4764 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="033930947dc527026e2797f16a6dd9181c8c7d8ec88b09031a132f55b953c356" exitCode=255
Mar 09 13:21:44 crc kubenswrapper[4764]: I0309 13:21:44.781497 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"033930947dc527026e2797f16a6dd9181c8c7d8ec88b09031a132f55b953c356"}
Mar 09 13:21:44 crc kubenswrapper[4764]: I0309 13:21:44.781555 4764 scope.go:117] "RemoveContainer" containerID="9e2c9fa564beaaf635042afaf0505ac0356f3e639d3ce86340bd1e2df8ced0a3"
Mar 09 13:21:44 crc kubenswrapper[4764]: I0309 13:21:44.781721 4764 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 09 13:21:44 crc kubenswrapper[4764]: I0309 13:21:44.782628 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 13:21:44 crc kubenswrapper[4764]: I0309 13:21:44.782698 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 13:21:44 crc kubenswrapper[4764]: I0309 13:21:44.782708 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 13:21:44 crc kubenswrapper[4764]: I0309 13:21:44.783665 4764 scope.go:117] "RemoveContainer" containerID="033930947dc527026e2797f16a6dd9181c8c7d8ec88b09031a132f55b953c356"
Mar 09 13:21:44 crc kubenswrapper[4764]: E0309 13:21:44.783889 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Mar 09 13:21:45 crc kubenswrapper[4764]: I0309 13:21:45.500197 4764 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 09 13:21:45 crc kubenswrapper[4764]: E0309 13:21:45.621981 4764 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Mar 09 13:21:45 crc kubenswrapper[4764]: I0309 13:21:45.787554 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log"
Mar 09 13:21:46 crc kubenswrapper[4764]: I0309 13:21:46.089985 4764 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 09 13:21:46 crc kubenswrapper[4764]: I0309 13:21:46.090440 4764 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 09 13:21:46 crc kubenswrapper[4764]: I0309 13:21:46.090751 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 09 13:21:46 crc kubenswrapper[4764]: I0309 13:21:46.091197 4764 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 09 13:21:46 crc kubenswrapper[4764]: I0309 13:21:46.093437 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 13:21:46 crc kubenswrapper[4764]: I0309 13:21:46.093477 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 13:21:46 crc kubenswrapper[4764]: I0309 13:21:46.093487 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 13:21:46 crc kubenswrapper[4764]: I0309 13:21:46.094051 4764 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="cluster-policy-controller" containerStatusID={"Type":"cri-o","ID":"698e8e53f96938b60a1aefe2ccea945879623e79d8b0f8836a67cb38ec85918f"} pod="openshift-kube-controller-manager/kube-controller-manager-crc" containerMessage="Container cluster-policy-controller failed startup probe, will be restarted"
Mar 09 13:21:46 crc kubenswrapper[4764]: I0309 13:21:46.094173 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" containerID="cri-o://698e8e53f96938b60a1aefe2ccea945879623e79d8b0f8836a67cb38ec85918f" gracePeriod=30
Mar 09 13:21:46 crc kubenswrapper[4764]: I0309 13:21:46.497493 4764 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 09 13:21:46 crc kubenswrapper[4764]: I0309 13:21:46.797279 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/1.log"
Mar 09 13:21:46 crc kubenswrapper[4764]: I0309 13:21:46.798361 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log"
Mar 09 13:21:46 crc kubenswrapper[4764]: I0309 13:21:46.798680 4764 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="698e8e53f96938b60a1aefe2ccea945879623e79d8b0f8836a67cb38ec85918f" exitCode=255
Mar 09 13:21:46 crc kubenswrapper[4764]: I0309 13:21:46.798719 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"698e8e53f96938b60a1aefe2ccea945879623e79d8b0f8836a67cb38ec85918f"}
Mar 09 13:21:46 crc kubenswrapper[4764]: I0309 13:21:46.798759 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"98efc0508ec5c94130c0707470eed2e0ae1117e57a8139af476e03a4bc67e788"}
Mar 09 13:21:46 crc kubenswrapper[4764]: I0309 13:21:46.798779 4764 scope.go:117] "RemoveContainer" containerID="fb990e0575b0c9fc6674c0828a3e6e8ebaba534f1375aae50660850aefb97d69"
Mar 09 13:21:46 crc kubenswrapper[4764]: I0309 13:21:46.798886 4764 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 09 13:21:46 crc kubenswrapper[4764]: I0309 13:21:46.799807 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 13:21:46 crc kubenswrapper[4764]: I0309 13:21:46.799838 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 13:21:46 crc kubenswrapper[4764]: I0309 13:21:46.799849 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 13:21:47 crc kubenswrapper[4764]: I0309 13:21:47.498897 4764 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 09 13:21:47 crc kubenswrapper[4764]: I0309 13:21:47.803802 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/1.log"
Mar 09 13:21:47 crc kubenswrapper[4764]: I0309 13:21:47.846441 4764 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 09 13:21:47 crc kubenswrapper[4764]: E0309 13:21:47.847074 4764 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s"
Mar 09 13:21:47 crc kubenswrapper[4764]: I0309 13:21:47.848446 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 13:21:47 crc kubenswrapper[4764]: I0309 13:21:47.848491 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 13:21:47 crc kubenswrapper[4764]: I0309 13:21:47.848504 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 13:21:47 crc kubenswrapper[4764]: I0309 13:21:47.848536 4764 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Mar 09 13:21:47 crc kubenswrapper[4764]: E0309 13:21:47.855240 4764 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc"
Mar 09 13:21:48 crc kubenswrapper[4764]: I0309 13:21:48.467906 4764 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 09 13:21:48 crc kubenswrapper[4764]: I0309 13:21:48.468163 4764 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 09 13:21:48 crc kubenswrapper[4764]: I0309 13:21:48.469804 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 13:21:48 crc kubenswrapper[4764]: I0309 13:21:48.469847 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 13:21:48 crc kubenswrapper[4764]: I0309 13:21:48.469864 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 13:21:48 crc kubenswrapper[4764]: I0309 13:21:48.470582 4764 scope.go:117] "RemoveContainer" containerID="033930947dc527026e2797f16a6dd9181c8c7d8ec88b09031a132f55b953c356"
Mar 09 13:21:48 crc kubenswrapper[4764]: E0309 13:21:48.470824 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Mar 09 13:21:48 crc kubenswrapper[4764]: I0309 13:21:48.500216 4764 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 09 13:21:49 crc kubenswrapper[4764]: E0309 13:21:49.426018 4764 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b2ee66c1ac777 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:20:45.491472247 +0000 UTC m=+0.741644175,LastTimestamp:2026-03-09 13:20:45.491472247 +0000 UTC m=+0.741644175,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 09 13:21:49 crc kubenswrapper[4764]: E0309 13:21:49.429845 4764 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b2ee66e90ae28 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:20:45.532753448 +0000 UTC m=+0.782925356,LastTimestamp:2026-03-09 13:20:45.532753448 +0000 UTC m=+0.782925356,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 09 13:21:49 crc kubenswrapper[4764]: E0309 13:21:49.434856 4764 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b2ee66e90e367 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:20:45.532767079 +0000 UTC m=+0.782938987,LastTimestamp:2026-03-09 13:20:45.532767079 +0000 UTC m=+0.782938987,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 09 13:21:49 crc kubenswrapper[4764]: E0309 13:21:49.439063 4764 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b2ee66e910e0f default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:20:45.532777999 +0000 UTC m=+0.782949907,LastTimestamp:2026-03-09 13:20:45.532777999 +0000 UTC m=+0.782949907,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 09 13:21:49 crc kubenswrapper[4764]: E0309 13:21:49.443318 4764 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b2ee6739ccec1 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeAllocatableEnforced,Message:Updated Node Allocatable limit across pods,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:20:45.617434305 +0000 UTC m=+0.867606213,LastTimestamp:2026-03-09 13:20:45.617434305 +0000 UTC m=+0.867606213,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 09 13:21:49 crc kubenswrapper[4764]: E0309 13:21:49.447625 4764 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b2ee66e90ae28\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b2ee66e90ae28 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:20:45.532753448 +0000 UTC m=+0.782925356,LastTimestamp:2026-03-09 13:20:45.660763503 +0000 UTC m=+0.910935411,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 09 13:21:49 crc kubenswrapper[4764]: E0309 13:21:49.451248 4764 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b2ee66e90e367\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b2ee66e90e367 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:20:45.532767079 +0000 UTC m=+0.782938987,LastTimestamp:2026-03-09 13:20:45.660789644 +0000 UTC m=+0.910961552,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 09 13:21:49 crc kubenswrapper[4764]: E0309 13:21:49.455530 4764 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b2ee66e910e0f\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b2ee66e910e0f default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:20:45.532777999 +0000 UTC m=+0.782949907,LastTimestamp:2026-03-09 13:20:45.660799444 +0000 UTC m=+0.910971352,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 09 13:21:49 crc kubenswrapper[4764]: E0309 13:21:49.459188 4764 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b2ee66e90ae28\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b2ee66e90ae28 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:20:45.532753448 +0000 UTC m=+0.782925356,LastTimestamp:2026-03-09 13:20:45.662210973 +0000 UTC m=+0.912382881,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 09 13:21:49 crc kubenswrapper[4764]: E0309 13:21:49.461709 4764 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b2ee66e90e367\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b2ee66e90e367 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:20:45.532767079 +0000 UTC m=+0.782938987,LastTimestamp:2026-03-09 13:20:45.662228033 +0000 UTC m=+0.912399941,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 09 13:21:49 crc kubenswrapper[4764]: E0309 13:21:49.463307 4764 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b2ee66e910e0f\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b2ee66e910e0f default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:20:45.532777999 +0000 UTC m=+0.782949907,LastTimestamp:2026-03-09 13:20:45.662237653 +0000 UTC m=+0.912409561,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 09 13:21:49 crc kubenswrapper[4764]: E0309 13:21:49.465550 4764 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b2ee66e90ae28\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b2ee66e90ae28 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:20:45.532753448 +0000 UTC m=+0.782925356,LastTimestamp:2026-03-09 13:20:45.662904587 +0000 UTC m=+0.913076495,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 09 13:21:49 crc kubenswrapper[4764]: E0309 13:21:49.467717 4764 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b2ee66e90e367\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b2ee66e90e367 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:20:45.532767079 +0000 UTC m=+0.782938987,LastTimestamp:2026-03-09 13:20:45.662920297 +0000 UTC m=+0.913092195,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 09 13:21:49 crc kubenswrapper[4764]: E0309 13:21:49.472163 4764 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b2ee66e910e0f\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b2ee66e910e0f default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:20:45.532777999 +0000 UTC m=+0.782949907,LastTimestamp:2026-03-09 13:20:45.662928328 +0000 UTC m=+0.913100236,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 09 13:21:49 crc kubenswrapper[4764]: E0309 13:21:49.477270 4764 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b2ee66e90ae28\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b2ee66e90ae28 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:20:45.532753448 +0000 UTC m=+0.782925356,LastTimestamp:2026-03-09 13:20:45.66303954 +0000 UTC m=+0.913211448,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 09 13:21:49 crc kubenswrapper[4764]: E0309 13:21:49.485833 4764 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b2ee66e90e367\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b2ee66e90e367 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:20:45.532767079 +0000 UTC m=+0.782938987,LastTimestamp:2026-03-09 13:20:45.66304797 +0000 UTC m=+0.913219878,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 09 13:21:49 crc kubenswrapper[4764]: E0309 13:21:49.492777 4764 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b2ee66e910e0f\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b2ee66e910e0f default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:20:45.532777999 +0000 UTC m=+0.782949907,LastTimestamp:2026-03-09 13:20:45.66305535 +0000 UTC m=+0.913227258,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 09 13:21:49 crc kubenswrapper[4764]: I0309 13:21:49.497553 4764 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 09 13:21:49 crc kubenswrapper[4764]: E0309 13:21:49.497894 4764 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b2ee66e90ae28\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b2ee66e90ae28 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:20:45.532753448 +0000 UTC m=+0.782925356,LastTimestamp:2026-03-09 13:20:45.663195463 +0000 UTC m=+0.913367371,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 09 13:21:49 crc kubenswrapper[4764]: E0309 13:21:49.511343 4764 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b2ee66e90e367\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b2ee66e90e367 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:20:45.532767079 +0000 UTC m=+0.782938987,LastTimestamp:2026-03-09 13:20:45.663206333 +0000 UTC m=+0.913378241,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 09 13:21:49 crc kubenswrapper[4764]: E0309 13:21:49.517253 4764 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b2ee66e910e0f\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b2ee66e910e0f default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:20:45.532777999 +0000 UTC m=+0.782949907,LastTimestamp:2026-03-09 13:20:45.663230864 +0000 UTC m=+0.913402772,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 09 13:21:49 crc kubenswrapper[4764]: E0309 13:21:49.521909 4764 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b2ee66e90ae28\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b2ee66e90ae28 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:20:45.532753448 +0000 UTC m=+0.782925356,LastTimestamp:2026-03-09 13:20:45.664022 +0000 UTC m=+0.914193908,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 09 13:21:49 crc kubenswrapper[4764]: E0309 13:21:49.525905 4764 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b2ee66e90e367\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b2ee66e90e367 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:20:45.532767079 +0000 UTC m=+0.782938987,LastTimestamp:2026-03-09 13:20:45.664058001 +0000 UTC m=+0.914229899,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 09 13:21:49 crc kubenswrapper[4764]: E0309 13:21:49.530127 4764 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b2ee66e910e0f\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b2ee66e910e0f default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:20:45.532777999 +0000 UTC m=+0.782949907,LastTimestamp:2026-03-09 13:20:45.664065361 +0000 UTC m=+0.914237269,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 09 13:21:49 crc kubenswrapper[4764]: E0309 13:21:49.534960 4764 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b2ee66e90ae28\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b2ee66e90ae28 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:20:45.532753448 +0000 UTC m=+0.782925356,LastTimestamp:2026-03-09 13:20:45.66499769 +0000 UTC m=+0.915169598,Count:8,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 09 13:21:49 crc kubenswrapper[4764]: E0309 13:21:49.540921 4764 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b2ee66e90e367\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b2ee66e90e367 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:20:45.532767079 +0000 UTC m=+0.782938987,LastTimestamp:2026-03-09 13:20:45.66500745 +0000 UTC m=+0.915179358,Count:8,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 13:21:49 crc kubenswrapper[4764]: E0309 13:21:49.546040 4764 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b2ee68d8d4ac9 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:20:46.052625097 +0000 UTC m=+1.302797005,LastTimestamp:2026-03-09 13:20:46.052625097 +0000 UTC m=+1.302797005,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 13:21:49 crc kubenswrapper[4764]: E0309 13:21:49.550135 4764 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b2ee68d8de552 
openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:20:46.052664658 +0000 UTC m=+1.302836566,LastTimestamp:2026-03-09 13:20:46.052664658 +0000 UTC m=+1.302836566,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 13:21:49 crc kubenswrapper[4764]: E0309 13:21:49.554040 4764 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189b2ee68d8d7bf7 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:20:46.052637687 +0000 UTC m=+1.302809635,LastTimestamp:2026-03-09 13:20:46.052637687 +0000 UTC m=+1.302809635,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 13:21:49 crc kubenswrapper[4764]: 
E0309 13:21:49.557572 4764 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b2ee68e94934f openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:20:46.069879631 +0000 UTC m=+1.320051549,LastTimestamp:2026-03-09 13:20:46.069879631 +0000 UTC m=+1.320051549,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 13:21:49 crc kubenswrapper[4764]: E0309 13:21:49.568909 4764 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189b2ee68ed8357d openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on 
machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:20:46.074312061 +0000 UTC m=+1.324483990,LastTimestamp:2026-03-09 13:20:46.074312061 +0000 UTC m=+1.324483990,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 13:21:49 crc kubenswrapper[4764]: E0309 13:21:49.572400 4764 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189b2ee6b427710e openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:20:46.700261646 +0000 UTC m=+1.950433554,LastTimestamp:2026-03-09 13:20:46.700261646 +0000 UTC m=+1.950433554,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 13:21:49 crc kubenswrapper[4764]: E0309 13:21:49.575894 4764 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b2ee6b429075a openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Created,Message:Created container kube-controller-manager,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:20:46.700365658 +0000 UTC m=+1.950537566,LastTimestamp:2026-03-09 13:20:46.700365658 +0000 UTC m=+1.950537566,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 13:21:49 crc kubenswrapper[4764]: E0309 13:21:49.580240 4764 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189b2ee6b42a5c6a openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Created,Message:Created container wait-for-host-port,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:20:46.70045297 +0000 UTC m=+1.950624878,LastTimestamp:2026-03-09 13:20:46.70045297 +0000 UTC m=+1.950624878,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 13:21:49 crc kubenswrapper[4764]: E0309 13:21:49.586499 4764 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" 
event="&Event{ObjectMeta:{kube-apiserver-crc.189b2ee6b42ade38 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:20:46.7004862 +0000 UTC m=+1.950658108,LastTimestamp:2026-03-09 13:20:46.7004862 +0000 UTC m=+1.950658108,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 13:21:49 crc kubenswrapper[4764]: E0309 13:21:49.590918 4764 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b2ee6b42b40e5 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:20:46.700511461 +0000 UTC m=+1.950683369,LastTimestamp:2026-03-09 13:20:46.700511461 +0000 UTC m=+1.950683369,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 13:21:49 crc kubenswrapper[4764]: E0309 13:21:49.594847 4764 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" 
event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189b2ee6b578e8bf openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Started,Message:Started container wait-for-host-port,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:20:46.722377919 +0000 UTC m=+1.972549827,LastTimestamp:2026-03-09 13:20:46.722377919 +0000 UTC m=+1.972549827,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 13:21:49 crc kubenswrapper[4764]: E0309 13:21:49.600233 4764 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b2ee6b5b56cfb openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:20:46.726343931 +0000 UTC m=+1.976515839,LastTimestamp:2026-03-09 13:20:46.726343931 +0000 UTC m=+1.976515839,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 13:21:49 crc kubenswrapper[4764]: E0309 13:21:49.606316 4764 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" 
in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b2ee6b5b6d588 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Started,Message:Started container kube-controller-manager,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:20:46.726436232 +0000 UTC m=+1.976608140,LastTimestamp:2026-03-09 13:20:46.726436232 +0000 UTC m=+1.976608140,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 13:21:49 crc kubenswrapper[4764]: E0309 13:21:49.609913 4764 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b2ee6b5b75e4b openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:20:46.726471243 +0000 UTC m=+1.976643151,LastTimestamp:2026-03-09 13:20:46.726471243 +0000 UTC m=+1.976643151,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 13:21:49 crc kubenswrapper[4764]: E0309 13:21:49.613499 4764 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource 
\"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189b2ee6b5bce909 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:20:46.726834441 +0000 UTC m=+1.977006339,LastTimestamp:2026-03-09 13:20:46.726834441 +0000 UTC m=+1.977006339,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 13:21:49 crc kubenswrapper[4764]: E0309 13:21:49.617938 4764 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b2ee6b5cb3b48 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:20:46.727773 +0000 UTC m=+1.977944908,LastTimestamp:2026-03-09 13:20:46.727773 +0000 UTC m=+1.977944908,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 13:21:49 crc kubenswrapper[4764]: E0309 13:21:49.621242 4764 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b2ee6ca74dc64 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Created,Message:Created container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:20:47.074434148 +0000 UTC m=+2.324606056,LastTimestamp:2026-03-09 13:20:47.074434148 +0000 UTC m=+2.324606056,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 13:21:49 crc kubenswrapper[4764]: E0309 13:21:49.624592 4764 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b2ee6cb076563 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Started,Message:Started container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 
13:20:47.084037475 +0000 UTC m=+2.334209383,LastTimestamp:2026-03-09 13:20:47.084037475 +0000 UTC m=+2.334209383,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 13:21:49 crc kubenswrapper[4764]: E0309 13:21:49.627802 4764 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b2ee6cb16efb6 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:20:47.085055926 +0000 UTC m=+2.335227834,LastTimestamp:2026-03-09 13:20:47.085055926 +0000 UTC m=+2.335227834,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 13:21:49 crc kubenswrapper[4764]: E0309 13:21:49.633996 4764 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b2ee6d595366d openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Created,Message:Created container kube-controller-manager-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:20:47.261103725 +0000 UTC m=+2.511275653,LastTimestamp:2026-03-09 13:20:47.261103725 +0000 UTC m=+2.511275653,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 13:21:49 crc kubenswrapper[4764]: E0309 13:21:49.637559 4764 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b2ee6d647f4f2 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Started,Message:Started container kube-controller-manager-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:20:47.272817906 +0000 UTC m=+2.522989824,LastTimestamp:2026-03-09 13:20:47.272817906 +0000 UTC m=+2.522989824,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 13:21:49 crc kubenswrapper[4764]: E0309 13:21:49.641018 4764 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the 
namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b2ee6d65bd7ca openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:20:47.274121162 +0000 UTC m=+2.524293080,LastTimestamp:2026-03-09 13:20:47.274121162 +0000 UTC m=+2.524293080,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 13:21:49 crc kubenswrapper[4764]: E0309 13:21:49.644316 4764 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b2ee6e07a4c29 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Created,Message:Created container kube-controller-manager-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:20:47.443889193 +0000 UTC m=+2.694061101,LastTimestamp:2026-03-09 13:20:47.443889193 +0000 UTC 
m=+2.694061101,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 13:21:49 crc kubenswrapper[4764]: E0309 13:21:49.648503 4764 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b2ee6e1b42eb3 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Started,Message:Started container kube-controller-manager-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:20:47.464459955 +0000 UTC m=+2.714631873,LastTimestamp:2026-03-09 13:20:47.464459955 +0000 UTC m=+2.714631873,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 13:21:49 crc kubenswrapper[4764]: E0309 13:21:49.652387 4764 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189b2ee6e8616152 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Pulled,Message:Container image 
\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:20:47.576473938 +0000 UTC m=+2.826645846,LastTimestamp:2026-03-09 13:20:47.576473938 +0000 UTC m=+2.826645846,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 13:21:49 crc kubenswrapper[4764]: E0309 13:21:49.656298 4764 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b2ee6e8c58cd7 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:20:47.583038679 +0000 UTC m=+2.833210587,LastTimestamp:2026-03-09 13:20:47.583038679 +0000 UTC m=+2.833210587,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 13:21:49 crc kubenswrapper[4764]: E0309 13:21:49.660018 4764 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b2ee6e9032c5a openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] 
map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:20:47.58707721 +0000 UTC m=+2.837249118,LastTimestamp:2026-03-09 13:20:47.58707721 +0000 UTC m=+2.837249118,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 13:21:49 crc kubenswrapper[4764]: E0309 13:21:49.663166 4764 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189b2ee6e90d5c7f openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:20:47.587744895 +0000 UTC m=+2.837916813,LastTimestamp:2026-03-09 13:20:47.587744895 +0000 UTC m=+2.837916813,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 13:21:49 crc kubenswrapper[4764]: E0309 13:21:49.666386 4764 
event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189b2ee6f446c9ce openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Created,Message:Created container kube-scheduler,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:20:47.776057806 +0000 UTC m=+3.026229704,LastTimestamp:2026-03-09 13:20:47.776057806 +0000 UTC m=+3.026229704,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 13:21:49 crc kubenswrapper[4764]: E0309 13:21:49.669621 4764 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b2ee6f47996d4 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Created,Message:Created container etcd-ensure-env-vars,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:20:47.779387092 +0000 UTC m=+3.029558990,LastTimestamp:2026-03-09 13:20:47.779387092 +0000 UTC m=+3.029558990,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 13:21:49 crc kubenswrapper[4764]: E0309 
13:21:49.672735 4764 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b2ee6f47acc02 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Created,Message:Created container kube-apiserver,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:20:47.779466242 +0000 UTC m=+3.029638150,LastTimestamp:2026-03-09 13:20:47.779466242 +0000 UTC m=+3.029638150,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 13:21:49 crc kubenswrapper[4764]: E0309 13:21:49.677097 4764 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189b2ee6f47b9557 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Created,Message:Created container kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:20:47.779517783 +0000 UTC m=+3.029689691,LastTimestamp:2026-03-09 13:20:47.779517783 +0000 UTC m=+3.029689691,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 13:21:49 crc kubenswrapper[4764]: E0309 13:21:49.680978 4764 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189b2ee6f4e69bfd openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Started,Message:Started container kube-scheduler,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:20:47.786531837 +0000 UTC m=+3.036703745,LastTimestamp:2026-03-09 13:20:47.786531837 +0000 UTC m=+3.036703745,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 13:21:49 crc kubenswrapper[4764]: E0309 13:21:49.684889 4764 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189b2ee6f4fdb55a openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\" already present on 
machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:20:47.788045658 +0000 UTC m=+3.038217566,LastTimestamp:2026-03-09 13:20:47.788045658 +0000 UTC m=+3.038217566,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 13:21:49 crc kubenswrapper[4764]: E0309 13:21:49.689326 4764 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b2ee6f5a52033 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Started,Message:Started container kube-apiserver,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:20:47.799017523 +0000 UTC m=+3.049189431,LastTimestamp:2026-03-09 13:20:47.799017523 +0000 UTC m=+3.049189431,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 13:21:49 crc kubenswrapper[4764]: E0309 13:21:49.693618 4764 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b2ee6f5b44c7d openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:20:47.800011901 +0000 UTC m=+3.050183809,LastTimestamp:2026-03-09 13:20:47.800011901 +0000 UTC m=+3.050183809,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 13:21:49 crc kubenswrapper[4764]: E0309 13:21:49.698605 4764 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189b2ee6f5fd87b4 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Started,Message:Started container kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:20:47.804811188 +0000 UTC m=+3.054983096,LastTimestamp:2026-03-09 13:20:47.804811188 +0000 UTC m=+3.054983096,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 13:21:49 crc kubenswrapper[4764]: E0309 13:21:49.702334 4764 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User 
\"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b2ee6f6b79f2c openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Started,Message:Started container etcd-ensure-env-vars,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:20:47.817006892 +0000 UTC m=+3.067178800,LastTimestamp:2026-03-09 13:20:47.817006892 +0000 UTC m=+3.067178800,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 13:21:49 crc kubenswrapper[4764]: E0309 13:21:49.705770 4764 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189b2ee6feabb9c8 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Created,Message:Created container kube-scheduler-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:20:47.950445 +0000 UTC m=+3.200616908,LastTimestamp:2026-03-09 13:20:47.950445 +0000 UTC m=+3.200616908,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 13:21:49 crc kubenswrapper[4764]: E0309 13:21:49.709690 4764 event.go:359] "Server rejected event (will not 
retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b2ee6fec32ec0 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Created,Message:Created container kube-apiserver-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:20:47.951982272 +0000 UTC m=+3.202154180,LastTimestamp:2026-03-09 13:20:47.951982272 +0000 UTC m=+3.202154180,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 13:21:49 crc kubenswrapper[4764]: E0309 13:21:49.713081 4764 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189b2ee6ff576936 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Started,Message:Started container kube-scheduler-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:20:47.961696566 +0000 UTC m=+3.211868474,LastTimestamp:2026-03-09 13:20:47.961696566 +0000 UTC m=+3.211868474,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 13:21:49 
crc kubenswrapper[4764]: E0309 13:21:49.716725 4764 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189b2ee6ff6870b3 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:20:47.962812595 +0000 UTC m=+3.212984513,LastTimestamp:2026-03-09 13:20:47.962812595 +0000 UTC m=+3.212984513,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 13:21:49 crc kubenswrapper[4764]: E0309 13:21:49.721330 4764 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b2ee6ffbb9367 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Started,Message:Started container kube-apiserver-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:20:47.968260967 +0000 UTC 
m=+3.218432875,LastTimestamp:2026-03-09 13:20:47.968260967 +0000 UTC m=+3.218432875,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 13:21:49 crc kubenswrapper[4764]: E0309 13:21:49.725224 4764 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b2ee6ffcc72e4 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:20:47.969366756 +0000 UTC m=+3.219538664,LastTimestamp:2026-03-09 13:20:47.969366756 +0000 UTC m=+3.219538664,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 13:21:49 crc kubenswrapper[4764]: E0309 13:21:49.729103 4764 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189b2ee7098eab6d openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Created,Message:Created container kube-scheduler-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:20:48.133090157 +0000 UTC m=+3.383262065,LastTimestamp:2026-03-09 13:20:48.133090157 +0000 UTC m=+3.383262065,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 13:21:49 crc kubenswrapper[4764]: E0309 13:21:49.733097 4764 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b2ee7099595f3 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Created,Message:Created container kube-apiserver-cert-regeneration-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:20:48.133543411 +0000 UTC m=+3.383715329,LastTimestamp:2026-03-09 13:20:48.133543411 +0000 UTC m=+3.383715329,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 13:21:49 crc kubenswrapper[4764]: E0309 13:21:49.736708 4764 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" 
event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189b2ee70a6bc347 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Started,Message:Started container kube-scheduler-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:20:48.147579719 +0000 UTC m=+3.397751637,LastTimestamp:2026-03-09 13:20:48.147579719 +0000 UTC m=+3.397751637,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 13:21:49 crc kubenswrapper[4764]: E0309 13:21:49.740735 4764 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b2ee70a91c514 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Started,Message:Started container kube-apiserver-cert-regeneration-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:20:48.150070548 +0000 UTC m=+3.400242466,LastTimestamp:2026-03-09 13:20:48.150070548 +0000 UTC m=+3.400242466,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 13:21:49 crc kubenswrapper[4764]: E0309 13:21:49.744011 4764 event.go:359] "Server rejected event (will not retry!)" 
err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b2ee70aa27556 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:20:48.151164246 +0000 UTC m=+3.401336154,LastTimestamp:2026-03-09 13:20:48.151164246 +0000 UTC m=+3.401336154,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 13:21:49 crc kubenswrapper[4764]: E0309 13:21:49.747398 4764 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b2ee7154fe183 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Created,Message:Created container kube-apiserver-insecure-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:20:48.330301827 +0000 UTC m=+3.580473735,LastTimestamp:2026-03-09 13:20:48.330301827 +0000 UTC m=+3.580473735,Count:1,Type:Normal,EventTime:0001-01-01 
00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 13:21:49 crc kubenswrapper[4764]: E0309 13:21:49.750750 4764 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b2ee715fff996 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Started,Message:Started container kube-apiserver-insecure-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:20:48.341842326 +0000 UTC m=+3.592014244,LastTimestamp:2026-03-09 13:20:48.341842326 +0000 UTC m=+3.592014244,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 13:21:49 crc kubenswrapper[4764]: E0309 13:21:49.754091 4764 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b2ee7161286eb openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on 
machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:20:48.343058155 +0000 UTC m=+3.593230073,LastTimestamp:2026-03-09 13:20:48.343058155 +0000 UTC m=+3.593230073,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 13:21:49 crc kubenswrapper[4764]: E0309 13:21:49.757282 4764 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b2ee71f7d4291 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Created,Message:Created container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:20:48.501047953 +0000 UTC m=+3.751219871,LastTimestamp:2026-03-09 13:20:48.501047953 +0000 UTC m=+3.751219871,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 13:21:49 crc kubenswrapper[4764]: E0309 13:21:49.761159 4764 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b2ee7201be9c3 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Started,Message:Started container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:20:48.511445443 +0000 UTC m=+3.761617351,LastTimestamp:2026-03-09 13:20:48.511445443 +0000 UTC m=+3.761617351,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 13:21:49 crc kubenswrapper[4764]: E0309 13:21:49.764788 4764 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b2ee72590fcd6 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:20:48.603004118 +0000 UTC m=+3.853176026,LastTimestamp:2026-03-09 13:20:48.603004118 +0000 UTC m=+3.853176026,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 13:21:49 crc kubenswrapper[4764]: E0309 13:21:49.768402 4764 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace 
\"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b2ee730b5d425 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Created,Message:Created container etcd-resources-copy,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:20:48.789967909 +0000 UTC m=+4.040139827,LastTimestamp:2026-03-09 13:20:48.789967909 +0000 UTC m=+4.040139827,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 13:21:49 crc kubenswrapper[4764]: E0309 13:21:49.772148 4764 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b2ee7320a4493 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Started,Message:Started container etcd-resources-copy,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:20:48.812278931 +0000 UTC m=+4.062450849,LastTimestamp:2026-03-09 13:20:48.812278931 +0000 UTC m=+4.062450849,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 13:21:49 crc kubenswrapper[4764]: E0309 13:21:49.777150 4764 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" 
event="&Event{ObjectMeta:{etcd-crc.189b2ee761ac119c openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:20:49.611411868 +0000 UTC m=+4.861583776,LastTimestamp:2026-03-09 13:20:49.611411868 +0000 UTC m=+4.861583776,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 13:21:49 crc kubenswrapper[4764]: E0309 13:21:49.780909 4764 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b2ee76b06f41d openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Created,Message:Created container etcdctl,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:20:49.768363037 +0000 UTC m=+5.018534945,LastTimestamp:2026-03-09 13:20:49.768363037 +0000 UTC m=+5.018534945,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 13:21:49 crc kubenswrapper[4764]: E0309 13:21:49.783962 4764 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group 
\"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b2ee76b880026 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Started,Message:Started container etcdctl,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:20:49.776820262 +0000 UTC m=+5.026992170,LastTimestamp:2026-03-09 13:20:49.776820262 +0000 UTC m=+5.026992170,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 13:21:49 crc kubenswrapper[4764]: E0309 13:21:49.787213 4764 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b2ee76b9749e4 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:20:49.77782218 +0000 UTC m=+5.027994088,LastTimestamp:2026-03-09 13:20:49.77782218 +0000 UTC m=+5.027994088,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 13:21:49 crc kubenswrapper[4764]: E0309 13:21:49.790464 4764 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot 
create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b2ee774e9e7e5 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Created,Message:Created container etcd,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:20:49.934231525 +0000 UTC m=+5.184403473,LastTimestamp:2026-03-09 13:20:49.934231525 +0000 UTC m=+5.184403473,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 13:21:49 crc kubenswrapper[4764]: E0309 13:21:49.793756 4764 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b2ee775959394 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Started,Message:Started container etcd,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:20:49.945482132 +0000 UTC m=+5.195654050,LastTimestamp:2026-03-09 13:20:49.945482132 +0000 UTC m=+5.195654050,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 13:21:49 crc kubenswrapper[4764]: E0309 13:21:49.797450 4764 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" 
event="&Event{ObjectMeta:{etcd-crc.189b2ee775a4a94a openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:20:49.94647073 +0000 UTC m=+5.196642648,LastTimestamp:2026-03-09 13:20:49.94647073 +0000 UTC m=+5.196642648,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 13:21:49 crc kubenswrapper[4764]: E0309 13:21:49.801006 4764 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b2ee78113039b openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Created,Message:Created container etcd-metrics,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:20:50.138252187 +0000 UTC m=+5.388424095,LastTimestamp:2026-03-09 13:20:50.138252187 +0000 UTC m=+5.388424095,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 13:21:49 crc kubenswrapper[4764]: E0309 13:21:49.806276 4764 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in 
API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b2ee781cf1248 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Started,Message:Started container etcd-metrics,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:20:50.150576712 +0000 UTC m=+5.400748620,LastTimestamp:2026-03-09 13:20:50.150576712 +0000 UTC m=+5.400748620,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 13:21:49 crc kubenswrapper[4764]: E0309 13:21:49.810216 4764 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b2ee781e3a579 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:20:50.151925113 +0000 UTC m=+5.402097041,LastTimestamp:2026-03-09 13:20:50.151925113 +0000 UTC m=+5.402097041,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 13:21:49 crc kubenswrapper[4764]: E0309 13:21:49.813594 4764 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User 
\"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b2ee78d19debc openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Created,Message:Created container etcd-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:20:50.340028092 +0000 UTC m=+5.590200010,LastTimestamp:2026-03-09 13:20:50.340028092 +0000 UTC m=+5.590200010,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 13:21:49 crc kubenswrapper[4764]: E0309 13:21:49.817625 4764 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b2ee78dcccc17 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Started,Message:Started container etcd-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:20:50.351754263 +0000 UTC m=+5.601926171,LastTimestamp:2026-03-09 13:20:50.351754263 +0000 UTC m=+5.601926171,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 13:21:49 crc kubenswrapper[4764]: E0309 13:21:49.820950 4764 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the 
namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b2ee78ddfdea0 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:20:50.353004192 +0000 UTC m=+5.603176100,LastTimestamp:2026-03-09 13:20:50.353004192 +0000 UTC m=+5.603176100,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 13:21:49 crc kubenswrapper[4764]: E0309 13:21:49.824800 4764 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b2ee796fd4daf openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Created,Message:Created container etcd-rev,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:20:50.505928111 +0000 UTC m=+5.756100019,LastTimestamp:2026-03-09 13:20:50.505928111 +0000 UTC m=+5.756100019,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 13:21:49 crc kubenswrapper[4764]: E0309 13:21:49.827772 4764 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create 
resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b2ee79789d9f6 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Started,Message:Started container etcd-rev,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:20:50.515139062 +0000 UTC m=+5.765310970,LastTimestamp:2026-03-09 13:20:50.515139062 +0000 UTC m=+5.765310970,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 13:21:49 crc kubenswrapper[4764]: E0309 13:21:49.833323 4764 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 09 13:21:49 crc kubenswrapper[4764]: &Event{ObjectMeta:{kube-controller-manager-crc.189b2ee8e3ce739d openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": context deadline exceeded (Client.Timeout exceeded while awaiting headers) Mar 09 13:21:49 crc kubenswrapper[4764]: body: Mar 09 13:21:49 crc kubenswrapper[4764]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:20:56.089670557 +0000 UTC m=+11.339842495,LastTimestamp:2026-03-09 13:20:56.089670557 +0000 UTC m=+11.339842495,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 09 13:21:49 crc kubenswrapper[4764]: > Mar 09 13:21:49 crc kubenswrapper[4764]: E0309 13:21:49.837019 4764 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b2ee8e3cfea81 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:20:56.089766529 +0000 UTC m=+11.339938447,LastTimestamp:2026-03-09 13:20:56.089766529 +0000 UTC m=+11.339938447,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 13:21:49 crc kubenswrapper[4764]: E0309 13:21:49.840567 4764 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event=< Mar 09 13:21:49 crc kubenswrapper[4764]: &Event{ObjectMeta:{kube-apiserver-crc.189b2ee99637bf19 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:ProbeError,Message:Liveness probe error: Get "https://192.168.126.11:17697/healthz": read tcp 192.168.126.11:58756->192.168.126.11:17697: read: connection reset by peer Mar 09 13:21:49 crc kubenswrapper[4764]: body: Mar 09 13:21:49 crc kubenswrapper[4764]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:20:59.082915609 +0000 UTC m=+14.333087537,LastTimestamp:2026-03-09 13:20:59.082915609 +0000 UTC m=+14.333087537,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 09 13:21:49 crc kubenswrapper[4764]: > Mar 09 13:21:49 crc kubenswrapper[4764]: E0309 13:21:49.844400 4764 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b2ee996388eea openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Unhealthy,Message:Liveness probe failed: Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:58756->192.168.126.11:17697: read: connection reset by peer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:20:59.08296881 +0000 UTC m=+14.333140738,LastTimestamp:2026-03-09 13:20:59.08296881 +0000 UTC m=+14.333140738,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 13:21:49 crc kubenswrapper[4764]: E0309 13:21:49.849447 4764 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event=< Mar 09 13:21:49 crc kubenswrapper[4764]: &Event{ObjectMeta:{kube-apiserver-crc.189b2ee9a99b35e1 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:ProbeError,Message:Startup probe error: HTTP probe failed with statuscode: 403 Mar 09 13:21:49 crc kubenswrapper[4764]: body: {"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Mar 09 13:21:49 crc kubenswrapper[4764]: Mar 09 13:21:49 crc kubenswrapper[4764]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:20:59.408201185 +0000 UTC m=+14.658373103,LastTimestamp:2026-03-09 13:20:59.408201185 +0000 UTC m=+14.658373103,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 09 13:21:49 crc kubenswrapper[4764]: > Mar 09 13:21:49 crc kubenswrapper[4764]: E0309 13:21:49.852822 4764 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b2ee9a99bd790 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Unhealthy,Message:Startup probe failed: HTTP probe failed with statuscode: 403,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:20:59.408242576 +0000 UTC m=+14.658414504,LastTimestamp:2026-03-09 13:20:59.408242576 +0000 UTC m=+14.658414504,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 13:21:49 crc kubenswrapper[4764]: E0309 13:21:49.856550 4764 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189b2ee9a99b35e1\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event=< Mar 09 13:21:49 crc kubenswrapper[4764]: &Event{ObjectMeta:{kube-apiserver-crc.189b2ee9a99b35e1 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:ProbeError,Message:Startup probe error: HTTP probe failed with statuscode: 403 Mar 09 13:21:49 crc kubenswrapper[4764]: body: {"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Mar 09 13:21:49 crc kubenswrapper[4764]: Mar 09 13:21:49 crc kubenswrapper[4764]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:20:59.408201185 +0000 UTC m=+14.658373103,LastTimestamp:2026-03-09 13:20:59.414788 +0000 UTC m=+14.664959908,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 09 13:21:49 crc kubenswrapper[4764]: > Mar 09 13:21:49 crc kubenswrapper[4764]: E0309 13:21:49.859696 4764 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189b2ee9a99bd790\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b2ee9a99bd790 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Unhealthy,Message:Startup probe failed: HTTP probe failed with statuscode: 403,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:20:59.408242576 +0000 UTC m=+14.658414504,LastTimestamp:2026-03-09 13:20:59.414822691 +0000 UTC m=+14.664994599,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 13:21:49 crc kubenswrapper[4764]: E0309 13:21:49.863256 4764 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189b2ee7161286eb\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b2ee7161286eb openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Pulled,Message:Container image 
\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:20:48.343058155 +0000 UTC m=+3.593230073,LastTimestamp:2026-03-09 13:20:59.64380987 +0000 UTC m=+14.893981778,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 13:21:49 crc kubenswrapper[4764]: E0309 13:21:49.869451 4764 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 09 13:21:49 crc kubenswrapper[4764]: &Event{ObjectMeta:{kube-controller-manager-crc.189b2eeb37ea9416 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Mar 09 13:21:49 crc kubenswrapper[4764]: body: Mar 09 13:21:49 crc kubenswrapper[4764]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:21:06.090734614 +0000 UTC m=+21.340906542,LastTimestamp:2026-03-09 13:21:06.090734614 +0000 UTC m=+21.340906542,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 09 13:21:49 crc kubenswrapper[4764]: > Mar 09 13:21:49 crc kubenswrapper[4764]: E0309 13:21:49.874126 4764 event.go:359] "Server rejected event (will not 
retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b2eeb37eb4e2f openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:21:06.090782255 +0000 UTC m=+21.340954163,LastTimestamp:2026-03-09 13:21:06.090782255 +0000 UTC m=+21.340954163,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 13:21:49 crc kubenswrapper[4764]: E0309 13:21:49.877727 4764 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189b2eeb37ea9416\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 09 13:21:49 crc kubenswrapper[4764]: &Event{ObjectMeta:{kube-controller-manager-crc.189b2eeb37ea9416 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while 
waiting for connection (Client.Timeout exceeded while awaiting headers) Mar 09 13:21:49 crc kubenswrapper[4764]: body: Mar 09 13:21:49 crc kubenswrapper[4764]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:21:06.090734614 +0000 UTC m=+21.340906542,LastTimestamp:2026-03-09 13:21:16.090163232 +0000 UTC m=+31.340335170,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 09 13:21:49 crc kubenswrapper[4764]: > Mar 09 13:21:49 crc kubenswrapper[4764]: E0309 13:21:49.880662 4764 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189b2eeb37eb4e2f\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b2eeb37eb4e2f openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:21:06.090782255 +0000 UTC m=+21.340954163,LastTimestamp:2026-03-09 13:21:16.090308845 +0000 UTC m=+31.340480793,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 13:21:49 crc kubenswrapper[4764]: E0309 13:21:49.883515 4764 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource 
\"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b2eed8c1cfe41 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Killing,Message:Container cluster-policy-controller failed startup probe, will be restarted,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:21:16.093259329 +0000 UTC m=+31.343431247,LastTimestamp:2026-03-09 13:21:16.093259329 +0000 UTC m=+31.343431247,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 13:21:49 crc kubenswrapper[4764]: E0309 13:21:49.886579 4764 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189b2ee6b5cb3b48\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b2ee6b5cb3b48 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:20:46.727773 +0000 UTC m=+1.977944908,LastTimestamp:2026-03-09 13:21:16.258092657 +0000 
UTC m=+31.508264565,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 13:21:49 crc kubenswrapper[4764]: E0309 13:21:49.889599 4764 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189b2ee6ca74dc64\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b2ee6ca74dc64 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Created,Message:Created container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:20:47.074434148 +0000 UTC m=+2.324606056,LastTimestamp:2026-03-09 13:21:16.443096842 +0000 UTC m=+31.693268740,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 13:21:49 crc kubenswrapper[4764]: E0309 13:21:49.892515 4764 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189b2ee6cb076563\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b2ee6cb076563 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Started,Message:Started container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:20:47.084037475 +0000 UTC m=+2.334209383,LastTimestamp:2026-03-09 13:21:16.454293273 +0000 UTC m=+31.704465181,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 13:21:49 crc kubenswrapper[4764]: E0309 13:21:49.897085 4764 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189b2eeb37ea9416\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 09 13:21:49 crc kubenswrapper[4764]: &Event{ObjectMeta:{kube-controller-manager-crc.189b2eeb37ea9416 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Mar 09 13:21:49 crc kubenswrapper[4764]: body: Mar 09 13:21:49 crc kubenswrapper[4764]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:21:06.090734614 +0000 UTC m=+21.340906542,LastTimestamp:2026-03-09 13:21:26.090168382 +0000 UTC m=+41.340340300,Count:3,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 09 13:21:49 crc kubenswrapper[4764]: > Mar 09 13:21:49 crc kubenswrapper[4764]: E0309 13:21:49.900565 4764 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189b2eeb37eb4e2f\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b2eeb37eb4e2f openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:21:06.090782255 +0000 UTC m=+21.340954163,LastTimestamp:2026-03-09 13:21:26.090219873 +0000 UTC m=+41.340391781,Count:3,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 13:21:49 crc kubenswrapper[4764]: E0309 13:21:49.904958 4764 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189b2eeb37ea9416\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 09 13:21:49 crc kubenswrapper[4764]: &Event{ObjectMeta:{kube-controller-manager-crc.189b2eeb37ea9416 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Mar 09 13:21:49 crc kubenswrapper[4764]: body: Mar 09 13:21:49 crc kubenswrapper[4764]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:21:06.090734614 +0000 UTC m=+21.340906542,LastTimestamp:2026-03-09 13:21:36.090367668 +0000 UTC m=+51.340539616,Count:4,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 09 13:21:49 crc kubenswrapper[4764]: > Mar 09 13:21:50 crc kubenswrapper[4764]: I0309 13:21:50.499240 4764 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 09 13:21:50 crc kubenswrapper[4764]: I0309 13:21:50.766459 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 09 13:21:50 crc kubenswrapper[4764]: I0309 13:21:50.766676 4764 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 13:21:50 crc kubenswrapper[4764]: I0309 13:21:50.767681 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:21:50 crc kubenswrapper[4764]: I0309 13:21:50.767716 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:21:50 crc kubenswrapper[4764]: I0309 13:21:50.767726 4764 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:21:51 crc kubenswrapper[4764]: I0309 13:21:51.499885 4764 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 09 13:21:52 crc kubenswrapper[4764]: I0309 13:21:52.497692 4764 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 09 13:21:52 crc kubenswrapper[4764]: I0309 13:21:52.918854 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 09 13:21:52 crc kubenswrapper[4764]: I0309 13:21:52.919062 4764 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 13:21:52 crc kubenswrapper[4764]: I0309 13:21:52.920194 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:21:52 crc kubenswrapper[4764]: I0309 13:21:52.920236 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:21:52 crc kubenswrapper[4764]: I0309 13:21:52.920248 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:21:52 crc kubenswrapper[4764]: I0309 13:21:52.920987 4764 scope.go:117] "RemoveContainer" containerID="033930947dc527026e2797f16a6dd9181c8c7d8ec88b09031a132f55b953c356" Mar 09 13:21:52 crc kubenswrapper[4764]: E0309 13:21:52.921185 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 09 13:21:53 crc kubenswrapper[4764]: I0309 13:21:53.090110 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 09 13:21:53 crc kubenswrapper[4764]: I0309 13:21:53.090522 4764 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 13:21:53 crc kubenswrapper[4764]: I0309 13:21:53.091735 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:21:53 crc kubenswrapper[4764]: I0309 13:21:53.091766 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:21:53 crc kubenswrapper[4764]: I0309 13:21:53.091778 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:21:53 crc kubenswrapper[4764]: I0309 13:21:53.095218 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 09 13:21:53 crc kubenswrapper[4764]: I0309 13:21:53.497562 4764 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 09 13:21:53 crc kubenswrapper[4764]: I0309 13:21:53.559247 4764 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 13:21:53 crc kubenswrapper[4764]: I0309 13:21:53.560302 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:21:53 crc kubenswrapper[4764]: I0309 
13:21:53.560333 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:21:53 crc kubenswrapper[4764]: I0309 13:21:53.560345 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:21:53 crc kubenswrapper[4764]: I0309 13:21:53.817781 4764 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 13:21:53 crc kubenswrapper[4764]: I0309 13:21:53.818691 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:21:53 crc kubenswrapper[4764]: I0309 13:21:53.818821 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:21:53 crc kubenswrapper[4764]: I0309 13:21:53.818908 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:21:54 crc kubenswrapper[4764]: I0309 13:21:54.497881 4764 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 09 13:21:54 crc kubenswrapper[4764]: E0309 13:21:54.852760 4764 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 09 13:21:54 crc kubenswrapper[4764]: I0309 13:21:54.856133 4764 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 13:21:54 crc kubenswrapper[4764]: I0309 13:21:54.857122 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:21:54 crc kubenswrapper[4764]: I0309 
13:21:54.857148 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:21:54 crc kubenswrapper[4764]: I0309 13:21:54.857157 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:21:54 crc kubenswrapper[4764]: I0309 13:21:54.857180 4764 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 09 13:21:54 crc kubenswrapper[4764]: E0309 13:21:54.861019 4764 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 09 13:21:55 crc kubenswrapper[4764]: I0309 13:21:55.497328 4764 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 09 13:21:55 crc kubenswrapper[4764]: E0309 13:21:55.622316 4764 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 09 13:21:56 crc kubenswrapper[4764]: I0309 13:21:56.498831 4764 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 09 13:21:56 crc kubenswrapper[4764]: I0309 13:21:56.841653 4764 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 09 13:21:56 crc kubenswrapper[4764]: I0309 13:21:56.855604 4764 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Mar 09 13:21:57 crc kubenswrapper[4764]: W0309 13:21:57.237784 4764 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed 
to list *v1.Service: services is forbidden: User "system:anonymous" cannot list resource "services" in API group "" at the cluster scope Mar 09 13:21:57 crc kubenswrapper[4764]: E0309 13:21:57.237862 4764 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" Mar 09 13:21:57 crc kubenswrapper[4764]: I0309 13:21:57.498081 4764 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 09 13:21:58 crc kubenswrapper[4764]: I0309 13:21:58.498915 4764 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 09 13:21:59 crc kubenswrapper[4764]: I0309 13:21:59.499375 4764 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 09 13:22:00 crc kubenswrapper[4764]: I0309 13:22:00.306032 4764 csr.go:261] certificate signing request csr-b28kp is approved, waiting to be issued Mar 09 13:22:00 crc kubenswrapper[4764]: I0309 13:22:00.312984 4764 csr.go:257] certificate signing request csr-b28kp is issued Mar 09 13:22:00 crc kubenswrapper[4764]: I0309 13:22:00.332064 4764 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Mar 09 13:22:00 crc kubenswrapper[4764]: I0309 13:22:00.409881 4764 reconstruct.go:205] "DevicePaths of reconstructed volumes updated" 
Mar 09 13:22:00 crc kubenswrapper[4764]: I0309 13:22:00.769631 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 09 13:22:00 crc kubenswrapper[4764]: I0309 13:22:00.769784 4764 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 13:22:00 crc kubenswrapper[4764]: I0309 13:22:00.770945 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:00 crc kubenswrapper[4764]: I0309 13:22:00.770992 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:00 crc kubenswrapper[4764]: I0309 13:22:00.771004 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:01 crc kubenswrapper[4764]: I0309 13:22:01.314724 4764 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2027-01-18 20:27:13.663276113 +0000 UTC Mar 09 13:22:01 crc kubenswrapper[4764]: I0309 13:22:01.314780 4764 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 7567h5m12.348503808s for next certificate rotation Mar 09 13:22:01 crc kubenswrapper[4764]: I0309 13:22:01.861692 4764 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 13:22:01 crc kubenswrapper[4764]: I0309 13:22:01.863736 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:01 crc kubenswrapper[4764]: I0309 13:22:01.863900 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:01 crc kubenswrapper[4764]: I0309 13:22:01.864019 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Mar 09 13:22:01 crc kubenswrapper[4764]: I0309 13:22:01.864239 4764 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 09 13:22:01 crc kubenswrapper[4764]: I0309 13:22:01.874606 4764 kubelet_node_status.go:115] "Node was previously registered" node="crc" Mar 09 13:22:01 crc kubenswrapper[4764]: I0309 13:22:01.875023 4764 kubelet_node_status.go:79] "Successfully registered node" node="crc" Mar 09 13:22:01 crc kubenswrapper[4764]: E0309 13:22:01.875075 4764 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": node \"crc\" not found" Mar 09 13:22:01 crc kubenswrapper[4764]: I0309 13:22:01.878871 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:01 crc kubenswrapper[4764]: I0309 13:22:01.878902 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:01 crc kubenswrapper[4764]: I0309 13:22:01.878912 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:01 crc kubenswrapper[4764]: I0309 13:22:01.878927 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:01 crc kubenswrapper[4764]: I0309 13:22:01.878937 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:01Z","lastTransitionTime":"2026-03-09T13:22:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:01 crc kubenswrapper[4764]: E0309 13:22:01.897048 4764 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:22:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:22:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:01Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:22:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:22:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:01Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"44d5d5b8-3cb1-4753-9ed2-c6ede7c42d06\\\",\\\"systemUUID\\\":\\\"470a86bd-e6aa-42c1-b220-b7b8c0289210\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:01 crc kubenswrapper[4764]: I0309 13:22:01.909290 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:01 crc kubenswrapper[4764]: I0309 13:22:01.909337 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:01 crc kubenswrapper[4764]: I0309 13:22:01.909351 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:01 crc kubenswrapper[4764]: I0309 13:22:01.909369 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:01 crc kubenswrapper[4764]: I0309 13:22:01.909381 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:01Z","lastTransitionTime":"2026-03-09T13:22:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:01 crc kubenswrapper[4764]: I0309 13:22:01.936250 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:01 crc kubenswrapper[4764]: I0309 13:22:01.936305 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:01 crc kubenswrapper[4764]: I0309 13:22:01.936317 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:01 crc kubenswrapper[4764]: I0309 13:22:01.936335 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:01 crc kubenswrapper[4764]: I0309 13:22:01.936349 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:01Z","lastTransitionTime":"2026-03-09T13:22:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:01 crc kubenswrapper[4764]: I0309 13:22:01.955214 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:01 crc kubenswrapper[4764]: I0309 13:22:01.955283 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:01 crc kubenswrapper[4764]: I0309 13:22:01.955296 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:01 crc kubenswrapper[4764]: I0309 13:22:01.955330 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:01 crc kubenswrapper[4764]: I0309 13:22:01.955343 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:01Z","lastTransitionTime":"2026-03-09T13:22:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:01 crc kubenswrapper[4764]: E0309 13:22:01.966250 4764 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:22:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:22:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:01Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:22:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:22:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:01Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"44d5d5b8-3cb1-4753-9ed2-c6ede7c42d06\\\",\\\"systemUUID\\\":\\\"470a86bd-e6aa-42c1-b220-b7b8c0289210\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:01 crc kubenswrapper[4764]: E0309 13:22:01.966414 4764 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 09 13:22:01 crc kubenswrapper[4764]: E0309 13:22:01.966448 4764 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:22:02 crc kubenswrapper[4764]: E0309 13:22:02.066831 4764 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:22:02 crc kubenswrapper[4764]: E0309 13:22:02.167604 4764 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:22:02 crc kubenswrapper[4764]: E0309 13:22:02.268371 4764 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:22:02 crc kubenswrapper[4764]: E0309 13:22:02.368968 4764 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:22:02 crc kubenswrapper[4764]: E0309 13:22:02.469081 4764 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:22:02 crc kubenswrapper[4764]: E0309 13:22:02.569761 4764 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:22:02 crc kubenswrapper[4764]: E0309 13:22:02.670815 4764 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:22:02 crc kubenswrapper[4764]: E0309 13:22:02.771384 4764 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:22:02 crc kubenswrapper[4764]: E0309 13:22:02.872516 4764 
kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:22:02 crc kubenswrapper[4764]: E0309 13:22:02.972611 4764 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:22:03 crc kubenswrapper[4764]: E0309 13:22:03.072749 4764 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:22:03 crc kubenswrapper[4764]: E0309 13:22:03.173394 4764 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:22:03 crc kubenswrapper[4764]: E0309 13:22:03.273676 4764 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:22:03 crc kubenswrapper[4764]: E0309 13:22:03.374612 4764 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:22:03 crc kubenswrapper[4764]: E0309 13:22:03.475506 4764 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:22:03 crc kubenswrapper[4764]: I0309 13:22:03.559690 4764 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 13:22:03 crc kubenswrapper[4764]: I0309 13:22:03.560815 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:03 crc kubenswrapper[4764]: I0309 13:22:03.560853 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:03 crc kubenswrapper[4764]: I0309 13:22:03.560866 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:03 crc kubenswrapper[4764]: I0309 13:22:03.561587 4764 scope.go:117] "RemoveContainer" containerID="033930947dc527026e2797f16a6dd9181c8c7d8ec88b09031a132f55b953c356" Mar 09 13:22:03 
crc kubenswrapper[4764]: E0309 13:22:03.561785 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 09 13:22:03 crc kubenswrapper[4764]: E0309 13:22:03.576786 4764 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:22:03 crc kubenswrapper[4764]: E0309 13:22:03.677579 4764 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:22:03 crc kubenswrapper[4764]: E0309 13:22:03.778486 4764 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:22:03 crc kubenswrapper[4764]: E0309 13:22:03.879136 4764 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:22:03 crc kubenswrapper[4764]: E0309 13:22:03.979874 4764 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:22:04 crc kubenswrapper[4764]: E0309 13:22:04.080826 4764 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:22:04 crc kubenswrapper[4764]: E0309 13:22:04.181794 4764 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:22:04 crc kubenswrapper[4764]: E0309 13:22:04.282743 4764 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:22:04 crc kubenswrapper[4764]: I0309 13:22:04.378115 4764 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Mar 09 13:22:04 
crc kubenswrapper[4764]: E0309 13:22:04.383005 4764 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:22:04 crc kubenswrapper[4764]: E0309 13:22:04.484049 4764 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:22:04 crc kubenswrapper[4764]: I0309 13:22:04.558906 4764 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 13:22:04 crc kubenswrapper[4764]: I0309 13:22:04.560122 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:04 crc kubenswrapper[4764]: I0309 13:22:04.560178 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:04 crc kubenswrapper[4764]: I0309 13:22:04.560191 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:04 crc kubenswrapper[4764]: E0309 13:22:04.584373 4764 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:22:04 crc kubenswrapper[4764]: E0309 13:22:04.685484 4764 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:22:04 crc kubenswrapper[4764]: E0309 13:22:04.786509 4764 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:22:04 crc kubenswrapper[4764]: E0309 13:22:04.887115 4764 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:22:04 crc kubenswrapper[4764]: E0309 13:22:04.987449 4764 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:22:05 crc kubenswrapper[4764]: E0309 13:22:05.088288 4764 kubelet_node_status.go:503] "Error getting the current node from lister" 
err="node \"crc\" not found" Mar 09 13:22:05 crc kubenswrapper[4764]: E0309 13:22:05.189022 4764 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:22:05 crc kubenswrapper[4764]: E0309 13:22:05.289369 4764 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:22:05 crc kubenswrapper[4764]: E0309 13:22:05.389810 4764 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:22:05 crc kubenswrapper[4764]: E0309 13:22:05.490307 4764 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:22:05 crc kubenswrapper[4764]: E0309 13:22:05.590775 4764 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:22:05 crc kubenswrapper[4764]: E0309 13:22:05.623413 4764 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 09 13:22:05 crc kubenswrapper[4764]: E0309 13:22:05.691529 4764 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:22:05 crc kubenswrapper[4764]: E0309 13:22:05.792315 4764 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:22:05 crc kubenswrapper[4764]: E0309 13:22:05.892722 4764 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:22:05 crc kubenswrapper[4764]: E0309 13:22:05.993514 4764 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:22:06 crc kubenswrapper[4764]: E0309 13:22:06.094343 4764 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:22:06 crc kubenswrapper[4764]: E0309 13:22:06.195119 4764 
kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:22:06 crc kubenswrapper[4764]: E0309 13:22:06.295816 4764 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:22:06 crc kubenswrapper[4764]: E0309 13:22:06.396020 4764 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:22:06 crc kubenswrapper[4764]: E0309 13:22:06.496888 4764 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:22:06 crc kubenswrapper[4764]: E0309 13:22:06.597104 4764 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:22:06 crc kubenswrapper[4764]: E0309 13:22:06.697959 4764 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:22:06 crc kubenswrapper[4764]: E0309 13:22:06.798934 4764 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:22:06 crc kubenswrapper[4764]: E0309 13:22:06.899844 4764 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:22:07 crc kubenswrapper[4764]: E0309 13:22:07.000684 4764 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:22:07 crc kubenswrapper[4764]: E0309 13:22:07.101676 4764 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:22:07 crc kubenswrapper[4764]: E0309 13:22:07.202532 4764 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:22:07 crc kubenswrapper[4764]: E0309 13:22:07.303439 4764 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:22:07 crc 
kubenswrapper[4764]: E0309 13:22:07.404004 4764 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:22:07 crc kubenswrapper[4764]: E0309 13:22:07.504785 4764 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:22:07 crc kubenswrapper[4764]: E0309 13:22:07.605229 4764 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:22:07 crc kubenswrapper[4764]: E0309 13:22:07.706370 4764 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:22:07 crc kubenswrapper[4764]: E0309 13:22:07.807337 4764 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:22:07 crc kubenswrapper[4764]: E0309 13:22:07.908035 4764 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:22:08 crc kubenswrapper[4764]: E0309 13:22:08.008993 4764 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:22:08 crc kubenswrapper[4764]: E0309 13:22:08.109976 4764 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:22:08 crc kubenswrapper[4764]: E0309 13:22:08.211208 4764 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:22:08 crc kubenswrapper[4764]: E0309 13:22:08.312286 4764 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:22:08 crc kubenswrapper[4764]: E0309 13:22:08.413462 4764 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:22:08 crc kubenswrapper[4764]: E0309 13:22:08.513870 4764 kubelet_node_status.go:503] "Error getting the current node from lister" 
err="node \"crc\" not found" Mar 09 13:22:08 crc kubenswrapper[4764]: E0309 13:22:08.614359 4764 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:22:08 crc kubenswrapper[4764]: E0309 13:22:08.715437 4764 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:22:08 crc kubenswrapper[4764]: E0309 13:22:08.816171 4764 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:22:08 crc kubenswrapper[4764]: E0309 13:22:08.917198 4764 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:22:09 crc kubenswrapper[4764]: E0309 13:22:09.017549 4764 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:22:09 crc kubenswrapper[4764]: E0309 13:22:09.118380 4764 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:22:09 crc kubenswrapper[4764]: E0309 13:22:09.218515 4764 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:22:09 crc kubenswrapper[4764]: E0309 13:22:09.318982 4764 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:22:09 crc kubenswrapper[4764]: E0309 13:22:09.419122 4764 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:22:09 crc kubenswrapper[4764]: E0309 13:22:09.519546 4764 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:22:09 crc kubenswrapper[4764]: E0309 13:22:09.620350 4764 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:22:09 crc kubenswrapper[4764]: E0309 13:22:09.721619 4764 kubelet_node_status.go:503] 
"Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:22:09 crc kubenswrapper[4764]: E0309 13:22:09.822814 4764 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:22:09 crc kubenswrapper[4764]: E0309 13:22:09.923315 4764 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:22:10 crc kubenswrapper[4764]: E0309 13:22:10.023827 4764 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:22:10 crc kubenswrapper[4764]: E0309 13:22:10.124932 4764 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:22:10 crc kubenswrapper[4764]: E0309 13:22:10.226081 4764 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:22:10 crc kubenswrapper[4764]: E0309 13:22:10.326781 4764 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:22:10 crc kubenswrapper[4764]: E0309 13:22:10.427248 4764 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:22:10 crc kubenswrapper[4764]: E0309 13:22:10.527969 4764 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:22:10 crc kubenswrapper[4764]: E0309 13:22:10.628824 4764 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:22:10 crc kubenswrapper[4764]: E0309 13:22:10.729967 4764 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:22:10 crc kubenswrapper[4764]: E0309 13:22:10.830349 4764 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:22:10 crc kubenswrapper[4764]: E0309 
13:22:10.931146 4764 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:22:11 crc kubenswrapper[4764]: E0309 13:22:11.031293 4764 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:22:11 crc kubenswrapper[4764]: E0309 13:22:11.131475 4764 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:22:11 crc kubenswrapper[4764]: E0309 13:22:11.232549 4764 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:22:11 crc kubenswrapper[4764]: E0309 13:22:11.333555 4764 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:22:11 crc kubenswrapper[4764]: E0309 13:22:11.434727 4764 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:22:11 crc kubenswrapper[4764]: E0309 13:22:11.535011 4764 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:22:11 crc kubenswrapper[4764]: E0309 13:22:11.635586 4764 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:22:11 crc kubenswrapper[4764]: E0309 13:22:11.736718 4764 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:22:11 crc kubenswrapper[4764]: E0309 13:22:11.837722 4764 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:22:11 crc kubenswrapper[4764]: E0309 13:22:11.938743 4764 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:22:12 crc kubenswrapper[4764]: E0309 13:22:12.039847 4764 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 
13:22:12 crc kubenswrapper[4764]: E0309 13:22:12.140924 4764 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:22:12 crc kubenswrapper[4764]: E0309 13:22:12.241186 4764 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:22:12 crc kubenswrapper[4764]: E0309 13:22:12.342452 4764 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:22:12 crc kubenswrapper[4764]: E0309 13:22:12.360955 4764 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": node \"crc\" not found" Mar 09 13:22:12 crc kubenswrapper[4764]: I0309 13:22:12.364721 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:12 crc kubenswrapper[4764]: I0309 13:22:12.364750 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:12 crc kubenswrapper[4764]: I0309 13:22:12.364759 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:12 crc kubenswrapper[4764]: I0309 13:22:12.364772 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:12 crc kubenswrapper[4764]: I0309 13:22:12.364782 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:12Z","lastTransitionTime":"2026-03-09T13:22:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:12 crc kubenswrapper[4764]: E0309 13:22:12.374238 4764 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:22:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:22:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:12Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:22:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:22:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:12Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"44d5d5b8-3cb1-4753-9ed2-c6ede7c42d06\\\",\\\"systemUUID\\\":\\\"470a86bd-e6aa-42c1-b220-b7b8c0289210\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:12 crc kubenswrapper[4764]: I0309 13:22:12.382506 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:12 crc kubenswrapper[4764]: I0309 13:22:12.382550 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:12 crc kubenswrapper[4764]: I0309 13:22:12.382560 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:12 crc kubenswrapper[4764]: I0309 13:22:12.382579 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:12 crc kubenswrapper[4764]: I0309 13:22:12.382589 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:12Z","lastTransitionTime":"2026-03-09T13:22:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:12 crc kubenswrapper[4764]: E0309 13:22:12.391992 4764 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:22:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:22:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:12Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:22:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:22:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:12Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"44d5d5b8-3cb1-4753-9ed2-c6ede7c42d06\\\",\\\"systemUUID\\\":\\\"470a86bd-e6aa-42c1-b220-b7b8c0289210\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:12 crc kubenswrapper[4764]: I0309 13:22:12.395359 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:12 crc kubenswrapper[4764]: I0309 13:22:12.395408 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:12 crc kubenswrapper[4764]: I0309 13:22:12.395418 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:12 crc kubenswrapper[4764]: I0309 13:22:12.395433 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:12 crc kubenswrapper[4764]: I0309 13:22:12.395462 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:12Z","lastTransitionTime":"2026-03-09T13:22:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:12 crc kubenswrapper[4764]: E0309 13:22:12.404787 4764 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:22:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:22:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:12Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:22:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:22:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:12Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"44d5d5b8-3cb1-4753-9ed2-c6ede7c42d06\\\",\\\"systemUUID\\\":\\\"470a86bd-e6aa-42c1-b220-b7b8c0289210\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:12 crc kubenswrapper[4764]: I0309 13:22:12.407859 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:12 crc kubenswrapper[4764]: I0309 13:22:12.407884 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:12 crc kubenswrapper[4764]: I0309 13:22:12.407892 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:12 crc kubenswrapper[4764]: I0309 13:22:12.407905 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:12 crc kubenswrapper[4764]: I0309 13:22:12.407915 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:12Z","lastTransitionTime":"2026-03-09T13:22:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:12 crc kubenswrapper[4764]: E0309 13:22:12.416860 4764 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:22:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:22:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:12Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:22:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:22:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:12Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"44d5d5b8-3cb1-4753-9ed2-c6ede7c42d06\\\",\\\"systemUUID\\\":\\\"470a86bd-e6aa-42c1-b220-b7b8c0289210\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:12 crc kubenswrapper[4764]: E0309 13:22:12.416959 4764 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 09 13:22:12 crc kubenswrapper[4764]: E0309 13:22:12.443381 4764 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:22:12 crc kubenswrapper[4764]: E0309 13:22:12.544379 4764 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:22:12 crc kubenswrapper[4764]: E0309 13:22:12.644947 4764 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:22:12 crc kubenswrapper[4764]: E0309 13:22:12.745357 4764 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:22:12 crc kubenswrapper[4764]: E0309 13:22:12.845723 4764 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:22:12 crc kubenswrapper[4764]: E0309 13:22:12.946860 4764 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:22:13 crc kubenswrapper[4764]: E0309 13:22:13.047951 4764 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:22:13 crc kubenswrapper[4764]: E0309 13:22:13.148183 4764 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:22:13 crc kubenswrapper[4764]: E0309 13:22:13.248366 4764 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:22:13 crc kubenswrapper[4764]: E0309 13:22:13.348912 4764 
kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:22:13 crc kubenswrapper[4764]: E0309 13:22:13.449597 4764 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:22:13 crc kubenswrapper[4764]: E0309 13:22:13.550314 4764 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:22:13 crc kubenswrapper[4764]: E0309 13:22:13.651271 4764 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:22:13 crc kubenswrapper[4764]: E0309 13:22:13.751815 4764 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:22:13 crc kubenswrapper[4764]: E0309 13:22:13.852762 4764 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:22:13 crc kubenswrapper[4764]: E0309 13:22:13.953479 4764 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:22:14 crc kubenswrapper[4764]: E0309 13:22:14.054428 4764 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:22:14 crc kubenswrapper[4764]: E0309 13:22:14.155799 4764 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:22:14 crc kubenswrapper[4764]: E0309 13:22:14.256394 4764 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:22:14 crc kubenswrapper[4764]: E0309 13:22:14.356860 4764 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:22:14 crc kubenswrapper[4764]: E0309 13:22:14.457245 4764 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:22:14 crc 
kubenswrapper[4764]: I0309 13:22:14.521687 4764 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.531439 4764 apiserver.go:52] "Watching apiserver" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.536886 4764 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.537259 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-node-identity/network-node-identity-vrzqb","openshift-network-operator/iptables-alerter-4ln5h","openshift-network-operator/network-operator-58b4c7f79c-55gtf","openshift-network-console/networking-console-plugin-85b44fc459-gdk6g","openshift-network-diagnostics/network-check-source-55646444c4-trplf","openshift-network-diagnostics/network-check-target-xd92c"] Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.537810 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 13:22:14 crc kubenswrapper[4764]: E0309 13:22:14.537921 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.537980 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.538260 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 09 13:22:14 crc kubenswrapper[4764]: E0309 13:22:14.538310 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.537810 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.538409 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.539278 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 13:22:14 crc kubenswrapper[4764]: E0309 13:22:14.539378 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.539762 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.541763 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.542008 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.542245 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.542346 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.542369 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.543002 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.543153 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.545023 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.560007 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:14 crc 
kubenswrapper[4764]: I0309 13:22:14.560065 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.560078 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.560096 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.560109 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:14Z","lastTransitionTime":"2026-03-09T13:22:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.570942 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.580789 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.591898 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.595327 4764 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.600973 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.606472 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod 
\"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.606525 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.606558 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.606587 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.606614 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.606670 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.606701 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.606729 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.606757 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.606789 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.606823 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.606849 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " 
Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.606878 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.606935 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.606949 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.606949 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.607032 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.607052 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.607069 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.607087 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.607102 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.607163 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.607180 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.607197 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.607213 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.607227 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " 
Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.607242 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.607257 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.607271 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.607285 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.607299 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.607314 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: 
\"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.607331 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.607346 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.607361 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.607376 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.607392 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.607409 4764 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.607427 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.607442 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.607458 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.607477 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.607494 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: 
\"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.607509 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.607547 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.607564 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.607578 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.607594 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") 
" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.607611 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.607625 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.607654 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.607691 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.607706 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.607724 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod 
\"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.607739 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.607754 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.607771 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.607788 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.607804 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.607822 4764 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.607840 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.607855 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.607870 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.607910 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.607950 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod 
\"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.607966 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.607980 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.607762 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.607785 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.608957 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.607943 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.608979 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.609003 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.608041 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.608147 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.608143 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.608263 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.608316 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.608404 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.608563 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.608691 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.608707 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.608762 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.608809 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.608893 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.608984 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.609140 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.609212 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.609220 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.609315 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.609555 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.609602 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.609623 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.609656 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: 
\"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.609672 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.609686 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.609701 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.609719 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.609736 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.609752 
4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.609768 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.609784 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.609800 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.609815 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.609830 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: 
\"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.609845 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.609860 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.609876 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.609890 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.609906 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.609911 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.609921 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.609938 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.609953 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.609968 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.609984 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: 
\"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.609998 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.610015 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.610033 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.610048 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.610063 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 
09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.610077 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.610097 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.610115 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.610131 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.610146 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.610161 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" 
(UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.610176 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.610168 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.610191 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.610208 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.610222 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: 
\"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.610236 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.610255 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.610276 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.610300 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.610316 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.610326 4764 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.610334 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.610387 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.610448 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.610475 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.610496 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: 
\"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.610513 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.610552 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.610584 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.610606 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.610628 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Mar 09 13:22:14 crc 
kubenswrapper[4764]: I0309 13:22:14.610664 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.610685 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.610709 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.610732 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.610753 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.610776 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") 
pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.610799 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.610820 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.610842 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.610865 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.610889 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.610915 4764 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.610933 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.610950 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.610966 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.610981 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.610997 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") 
pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.611013 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.611027 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.611044 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.611060 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.611075 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 
13:22:14.611090 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.611105 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.611121 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.611136 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.611152 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.611168 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod 
\"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.611190 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.611215 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.611236 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.611252 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.611271 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.611288 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.611303 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.611320 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.611337 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.611352 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.611367 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: 
\"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.611383 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.611399 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.611415 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.611431 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.611446 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.611464 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.611485 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.611508 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.611524 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.611549 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.611565 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: 
\"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.611581 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.611597 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.611611 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.611627 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.611656 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.611673 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.611692 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.611716 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.611741 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.611765 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.611792 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Mar 09 13:22:14 crc 
kubenswrapper[4764]: I0309 13:22:14.611813 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.611829 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.611846 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.611863 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.611879 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.611895 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: 
\"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.611912 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.611994 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.612014 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.612033 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.612048 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 09 
13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.612065 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.612082 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.612099 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.612116 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.612200 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.612230 4764 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.612252 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.612273 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.612291 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.612313 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 09 
13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.612351 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.612381 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.612405 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.612430 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.612453 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: 
\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.612477 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.612513 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.612537 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.612596 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.612610 4764 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.612622 4764 
reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.612637 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.612667 4764 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.612682 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.612694 4764 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.612706 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.612718 4764 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.612731 4764 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.612743 4764 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.612756 4764 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.612803 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.612818 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.612832 4764 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.612846 4764 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.612858 4764 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.612868 4764 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.612878 4764 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.612887 4764 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.612896 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.612907 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.612916 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.612925 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node 
\"crc\" DevicePath \"\"" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.612935 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.612946 4764 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.612954 4764 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.612964 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.612973 4764 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.613702 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.610818 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.610867 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.611059 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.611628 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.611626 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.611759 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.612126 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.612169 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.612139 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.612511 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.612523 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.612612 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.612772 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.613439 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.613456 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.613825 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.613894 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.613923 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.613988 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.614158 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.614248 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.614257 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.614415 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.614772 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.614952 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.615148 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.615201 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.615280 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.615398 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.615697 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.615857 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.615884 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.616052 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.613246 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.616224 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.616550 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.617168 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.617184 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.617139 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.617094 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.617333 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.617632 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.617852 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.617958 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.617994 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.618187 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.623224 4764 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.623837 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.623849 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.624292 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.624567 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.624739 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.624922 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.624980 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.625012 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.625190 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.625489 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.625732 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.625757 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.625763 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.625869 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.626138 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.626852 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.627996 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:14 crc kubenswrapper[4764]: E0309 13:22:14.628206 4764 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 09 13:22:14 crc kubenswrapper[4764]: E0309 13:22:14.628268 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-09 13:22:15.128247985 +0000 UTC m=+90.378419893 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.628877 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 09 13:22:14 crc kubenswrapper[4764]: E0309 13:22:14.630475 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 13:22:15.130454773 +0000 UTC m=+90.380626701 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.630883 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.631457 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 09 13:22:14 crc kubenswrapper[4764]: E0309 13:22:14.631487 4764 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 09 13:22:14 crc kubenswrapper[4764]: E0309 13:22:14.634043 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-09 13:22:15.133509113 +0000 UTC m=+90.383681031 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.636915 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.637276 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.640789 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:14 crc kubenswrapper[4764]: E0309 13:22:14.643588 4764 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 09 13:22:14 crc kubenswrapper[4764]: E0309 13:22:14.643614 4764 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 09 13:22:14 crc kubenswrapper[4764]: E0309 13:22:14.643629 4764 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 09 13:22:14 crc kubenswrapper[4764]: E0309 13:22:14.643720 4764 nestedpendingoperations.go:348] Operation 
for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-09 13:22:15.143701191 +0000 UTC m=+90.393873109 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.647067 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.647565 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.647566 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.647893 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.649475 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.649916 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.649946 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.650139 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.650393 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.650894 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.651067 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.655279 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.655359 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.655769 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.655930 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.656916 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.657410 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.655815 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.657510 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.657597 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.658046 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.658433 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.658690 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.658830 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.659038 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.659423 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.659489 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:22:14 crc kubenswrapper[4764]: E0309 13:22:14.659828 4764 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 09 13:22:14 crc kubenswrapper[4764]: E0309 13:22:14.659848 4764 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 09 13:22:14 crc kubenswrapper[4764]: E0309 13:22:14.659859 4764 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 09 13:22:14 crc kubenswrapper[4764]: E0309 13:22:14.659904 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-09 13:22:15.159889156 +0000 UTC m=+90.410061054 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.660086 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.661731 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.661743 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.661821 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.662229 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.662283 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.662866 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.663315 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.663373 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.663357 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.663497 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.663519 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.663529 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.663545 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.663556 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:14Z","lastTransitionTime":"2026-03-09T13:22:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.663773 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.664842 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.665075 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.665134 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.665172 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.665836 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.665908 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.665920 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.665958 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.665508 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.666150 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.664979 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.666696 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.666412 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.666502 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.666939 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.666956 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb"
Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.667042 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf"
Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.666587 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.666895 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.667229 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.667389 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.667574 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.667876 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.667939 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.667949 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.668033 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.668281 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.668288 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.668596 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.668957 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.668909 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.669199 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.669206 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.669277 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.669335 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.669407 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.669523 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.669721 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.669790 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.669489 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.669969 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.670254 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.670706 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.670790 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.673718 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.673857 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.673928 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.673933 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.673956 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.674072 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.674249 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.674461 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.674540 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.674548 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.675163 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.676271 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.676319 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.676504 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.676798 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.677090 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf"
Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.676677 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.677724 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.677968 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.678075 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.683992 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.686412 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.702052 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.705565 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.713625 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h"
Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.713692 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf"
Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.713718 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h"
Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.713773 4764 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.713794 4764 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\""
Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.713803 4764 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\""
Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.713814 4764 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.713823 4764 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\""
Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.713831 4764 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\""
Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.713839 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\""
Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.713849 4764 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\""
Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.713857 4764 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\""
Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.713851 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf"
Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.713865 4764 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.713911 4764 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\""
Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.713924 4764 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\""
Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.713934 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\""
Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.713943 4764 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\""
Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.713952 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\""
Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.713961 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\""
Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.713970 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\""
Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.713978 4764 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\""
Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.713986 4764 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\""
Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.713995 4764 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\""
Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.714003 4764 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.714011 4764 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\""
Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.714020 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\""
Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.714029 4764 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\""
Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.714039 4764 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\""
Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.714049 4764 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\""
Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.714057 4764 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\""
Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.714066 4764 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\""
Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.714076 4764 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\""
Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.714083 4764 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\""
Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.714092 4764 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\""
Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.714100 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\""
Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.714109 4764 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\""
Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.714117 4764 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\""
Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.714125 4764 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\""
Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.714134 4764 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\""
Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.714143 4764
reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.714151 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.714159 4764 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.714167 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.714176 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.714185 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.714194 4764 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.714204 4764 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: 
\"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.714212 4764 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.714221 4764 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.714229 4764 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.714237 4764 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.714245 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.714254 4764 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.714261 4764 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:14 crc 
kubenswrapper[4764]: I0309 13:22:14.714269 4764 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.714277 4764 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.714285 4764 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.714294 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.714306 4764 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.714316 4764 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.714328 4764 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.714339 4764 reconciler_common.go:293] 
"Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.714380 4764 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.714389 4764 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.714400 4764 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.714410 4764 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.714420 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.714430 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.714440 4764 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.714449 4764 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.714457 4764 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.714466 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.714476 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.714485 4764 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.714495 4764 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.714505 4764 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:14 crc 
kubenswrapper[4764]: I0309 13:22:14.714515 4764 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.714524 4764 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.714533 4764 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.714545 4764 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.714555 4764 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.714564 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.714575 4764 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.714584 4764 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: 
\"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.714594 4764 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.714603 4764 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.714611 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.714620 4764 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.714629 4764 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.714637 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.714659 4764 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: 
\"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.714668 4764 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.714677 4764 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.714687 4764 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.714694 4764 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.714702 4764 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.714711 4764 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.714719 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node 
\"crc\" DevicePath \"\"" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.714729 4764 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.714738 4764 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.714751 4764 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.714760 4764 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.714770 4764 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.714779 4764 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.714788 4764 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.714796 4764 
reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.714807 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.714815 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.714823 4764 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.714833 4764 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.714843 4764 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.714853 4764 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.714861 4764 
reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.714869 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.714877 4764 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.714885 4764 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.714893 4764 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.714900 4764 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.714908 4764 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.714916 4764 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: 
\"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.714924 4764 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.714932 4764 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.714939 4764 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.714948 4764 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.714955 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.714964 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.714971 4764 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" 
DevicePath \"\"" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.714979 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.714987 4764 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.714994 4764 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.715002 4764 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.715011 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.715019 4764 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.715027 4764 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Mar 09 13:22:14 crc 
kubenswrapper[4764]: I0309 13:22:14.715036 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\""
Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.715044 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\""
Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.715052 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\""
Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.715061 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\""
Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.715070 4764 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\""
Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.715078 4764 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\""
Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.715100 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\""
Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.715108 4764 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\""
Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.715117 4764 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\""
Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.715125 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\""
Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.715133 4764 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\""
Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.715140 4764 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\""
Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.715149 4764 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\""
Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.715157 4764 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\""
Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.715165 4764 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.715173 4764 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\""
Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.715181 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\""
Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.715189 4764 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.715197 4764 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.715205 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\""
Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.715213 4764 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\""
Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.715221 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\""
Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.715229 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\""
Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.715237 4764 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\""
Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.715244 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\""
Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.715252 4764 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\""
Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.715260 4764 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.715268 4764 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\""
Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.715276 4764 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\""
Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.715284 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\""
Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.715293 4764 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\""
Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.715301 4764 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\""
Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.715309 4764 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.715317 4764 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.769367 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.769571 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.769599 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.769617 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.769633 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:14Z","lastTransitionTime":"2026-03-09T13:22:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.855685 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb"
Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.863746 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf"
Mar 09 13:22:14 crc kubenswrapper[4764]: W0309 13:22:14.866743 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef543e1b_8068_4ea3_b32a_61027b32e95d.slice/crio-e5d77a6f769d325d48bde0d33b59e1563f232c38daf331f3dc8506a16387e41b WatchSource:0}: Error finding container e5d77a6f769d325d48bde0d33b59e1563f232c38daf331f3dc8506a16387e41b: Status 404 returned error can't find the container with id e5d77a6f769d325d48bde0d33b59e1563f232c38daf331f3dc8506a16387e41b
Mar 09 13:22:14 crc kubenswrapper[4764]: E0309 13:22:14.868996 4764 kuberuntime_manager.go:1274] "Unhandled Error" err=<
Mar 09 13:22:14 crc kubenswrapper[4764]: container &Container{Name:webhook,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe
Mar 09 13:22:14 crc kubenswrapper[4764]: if [[ -f "/env/_master" ]]; then
Mar 09 13:22:14 crc kubenswrapper[4764]: set -o allexport
Mar 09 13:22:14 crc kubenswrapper[4764]: source "/env/_master"
Mar 09 13:22:14 crc kubenswrapper[4764]: set +o allexport
Mar 09 13:22:14 crc kubenswrapper[4764]: fi
Mar 09 13:22:14 crc kubenswrapper[4764]: # OVN-K will try to remove hybrid overlay node annotations even when the hybrid overlay is not enabled.
Mar 09 13:22:14 crc kubenswrapper[4764]: # https://github.com/ovn-org/ovn-kubernetes/blob/ac6820df0b338a246f10f412cd5ec903bd234694/go-controller/pkg/ovn/master.go#L791
Mar 09 13:22:14 crc kubenswrapper[4764]: ho_enable="--enable-hybrid-overlay"
Mar 09 13:22:14 crc kubenswrapper[4764]: echo "I$(date "+%m%d %H:%M:%S.%N") - network-node-identity - start webhook"
Mar 09 13:22:14 crc kubenswrapper[4764]: # extra-allowed-user: service account `ovn-kubernetes-control-plane`
Mar 09 13:22:14 crc kubenswrapper[4764]: # sets pod annotations in multi-homing layer3 network controller (cluster-manager)
Mar 09 13:22:14 crc kubenswrapper[4764]: exec /usr/bin/ovnkube-identity --k8s-apiserver=https://api-int.crc.testing:6443 \
Mar 09 13:22:14 crc kubenswrapper[4764]: --webhook-cert-dir="/etc/webhook-cert" \
Mar 09 13:22:14 crc kubenswrapper[4764]: --webhook-host=127.0.0.1 \
Mar 09 13:22:14 crc kubenswrapper[4764]: --webhook-port=9743 \
Mar 09 13:22:14 crc kubenswrapper[4764]: ${ho_enable} \
Mar 09 13:22:14 crc kubenswrapper[4764]: --enable-interconnect \
Mar 09 13:22:14 crc kubenswrapper[4764]: --disable-approver \
Mar 09 13:22:14 crc kubenswrapper[4764]: --extra-allowed-user="system:serviceaccount:openshift-ovn-kubernetes:ovn-kubernetes-control-plane" \
Mar 09 13:22:14 crc kubenswrapper[4764]: --wait-for-kubernetes-api=200s \
Mar 09 13:22:14 crc kubenswrapper[4764]: --pod-admission-conditions="/var/run/ovnkube-identity-config/additional-pod-admission-cond.json" \
Mar 09 13:22:14 crc kubenswrapper[4764]: --loglevel="${LOGLEVEL}"
Mar 09 13:22:14 crc kubenswrapper[4764]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOGLEVEL,Value:2,ValueFrom:nil,},EnvVar{Name:KUBERNETES_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:webhook-cert,ReadOnly:false,MountPath:/etc/webhook-cert/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovnkube-identity-cm,ReadOnly:false,MountPath:/var/run/ovnkube-identity-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s2kz5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000470000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-node-identity-vrzqb_openshift-network-node-identity(ef543e1b-8068-4ea3-b32a-61027b32e95d): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars
Mar 09 13:22:14 crc kubenswrapper[4764]: > logger="UnhandledError"
Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.871693 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h"
Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.871887 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.871929 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.871940 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.871958 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.871970 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:14Z","lastTransitionTime":"2026-03-09T13:22:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 09 13:22:14 crc kubenswrapper[4764]: E0309 13:22:14.872076 4764 kuberuntime_manager.go:1274] "Unhandled Error" err=<
Mar 09 13:22:14 crc kubenswrapper[4764]: container &Container{Name:approver,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe
Mar 09 13:22:14 crc kubenswrapper[4764]: if [[ -f "/env/_master" ]]; then
Mar 09 13:22:14 crc kubenswrapper[4764]: set -o allexport
Mar 09 13:22:14 crc kubenswrapper[4764]: source "/env/_master"
Mar 09 13:22:14 crc kubenswrapper[4764]: set +o allexport
Mar 09 13:22:14 crc kubenswrapper[4764]: fi
Mar 09 13:22:14 crc kubenswrapper[4764]:
Mar 09 13:22:14 crc kubenswrapper[4764]: echo "I$(date "+%m%d %H:%M:%S.%N") - network-node-identity - start approver"
Mar 09 13:22:14 crc kubenswrapper[4764]: exec /usr/bin/ovnkube-identity --k8s-apiserver=https://api-int.crc.testing:6443 \
Mar 09 13:22:14 crc kubenswrapper[4764]: --disable-webhook \
Mar 09 13:22:14 crc kubenswrapper[4764]: --csr-acceptance-conditions="/var/run/ovnkube-identity-config/additional-cert-acceptance-cond.json" \
Mar 09 13:22:14 crc kubenswrapper[4764]: --loglevel="${LOGLEVEL}"
Mar 09 13:22:14 crc kubenswrapper[4764]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOGLEVEL,Value:4,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovnkube-identity-cm,ReadOnly:false,MountPath:/var/run/ovnkube-identity-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s2kz5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000470000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-node-identity-vrzqb_openshift-network-node-identity(ef543e1b-8068-4ea3-b32a-61027b32e95d): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars
Mar 09 13:22:14 crc kubenswrapper[4764]: > logger="UnhandledError"
Mar 09 13:22:14 crc kubenswrapper[4764]: E0309 13:22:14.873891 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"webhook\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\", failed to \"StartContainer\" for \"approver\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"]" pod="openshift-network-node-identity/network-node-identity-vrzqb" podUID="ef543e1b-8068-4ea3-b32a-61027b32e95d"
Mar 09 13:22:14 crc kubenswrapper[4764]: W0309 13:22:14.875048 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod37a5e44f_9a88_4405_be8a_b645485e7312.slice/crio-0b14fc603ec2fd3d4793aad753f5d6dff5d9f5af18473d9a20b44a72babf41b0 WatchSource:0}: Error finding container 0b14fc603ec2fd3d4793aad753f5d6dff5d9f5af18473d9a20b44a72babf41b0: Status 404 returned error can't find the container with id 0b14fc603ec2fd3d4793aad753f5d6dff5d9f5af18473d9a20b44a72babf41b0
Mar 09 13:22:14 crc kubenswrapper[4764]: E0309 13:22:14.877159 4764 kuberuntime_manager.go:1274] "Unhandled Error" err=<
Mar 09 13:22:14 crc kubenswrapper[4764]: container &Container{Name:network-operator,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,Command:[/bin/bash -c #!/bin/bash
Mar 09 13:22:14 crc kubenswrapper[4764]: set -o allexport
Mar 09 13:22:14 crc kubenswrapper[4764]: if [[ -f /etc/kubernetes/apiserver-url.env ]]; then
Mar 09 13:22:14 crc kubenswrapper[4764]: source /etc/kubernetes/apiserver-url.env
Mar 09 13:22:14 crc kubenswrapper[4764]: else
Mar 09 13:22:14 crc kubenswrapper[4764]: echo "Error: /etc/kubernetes/apiserver-url.env is missing"
Mar 09 13:22:14 crc kubenswrapper[4764]: exit 1
Mar 09 13:22:14 crc kubenswrapper[4764]: fi
Mar 09 13:22:14 crc kubenswrapper[4764]: exec /usr/bin/cluster-network-operator start --listen=0.0.0.0:9104
Mar 09 13:22:14 crc kubenswrapper[4764]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:cno,HostPort:9104,ContainerPort:9104,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:RELEASE_VERSION,Value:4.18.1,ValueFrom:nil,},EnvVar{Name:KUBE_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b97554198294bf544fbc116c94a0a1fb2ec8a4de0e926bf9d9e320135f0bee6f,ValueFrom:nil,},EnvVar{Name:KUBE_RBAC_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09,ValueFrom:nil,},EnvVar{Name:MULTUS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26,ValueFrom:nil,},EnvVar{Name:MULTUS_ADMISSION_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317,ValueFrom:nil,},EnvVar{Name:CNI_PLUGINS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc,ValueFrom:nil,},EnvVar{Name:BOND_CNI_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78,ValueFrom:nil,},EnvVar{Name:WHEREABOUTS_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4,ValueFrom:nil,},EnvVar{Name:ROUTE_OVERRRIDE_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa,ValueFrom:nil,},EnvVar{Name:MULTUS_NETWORKPOLICY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:23f833d3738d68706eb2f2868bd76bd71cee016cffa6faf5f045a60cc8c6eddd,ValueFrom:nil,},EnvVar{Name:OVN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,ValueFrom:nil,},EnvVar{Name:OVN_NB_RAFT_ELECTION_TIMER,Value:10,ValueFrom:nil,},EnvVar{Name:OVN_SB_RAFT_ELECTION_TIMER,Value:16,ValueFrom:nil,},EnvVar{Name:OVN_NORTHD_PROBE_INTERVAL,Value:10000,ValueFrom:nil,},EnvVar{Name:OVN_CONTROLLER_INACTIVITY_PROBE,Value:180000,ValueFrom:nil,},EnvVar{Name:OVN_NB_INACTIVITY_PROBE,Value:60000,ValueFrom:nil,},EnvVar{Name:EGRESS_ROUTER_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c,ValueFrom:nil,},EnvVar{Name:NETWORK_METRICS_DAEMON_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_SOURCE_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_TARGET_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:NETWORK_OPERATOR_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:CLOUD_NETWORK_CONFIG_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8048f1cb0be521f09749c0a489503cd56d85b68c6ca93380e082cfd693cd97a8,ValueFrom:nil,},EnvVar{Name:CLI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,ValueFrom:nil,},EnvVar{Name:FRR_K8S_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5dbf844e49bb46b78586930149e5e5f5dc121014c8afd10fe36f3651967cc256,ValueFrom:nil,},EnvVar{Name:NETWORKING_CONSOLE_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd,ValueFrom:nil,},EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:host-etc-kube,ReadOnly:true,MountPath:/etc/kubernetes,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:metrics-tls,ReadOnly:false,MountPath:/var/run/secrets/serving-cert,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rdwmf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-operator-58b4c7f79c-55gtf_openshift-network-operator(37a5e44f-9a88-4405-be8a-b645485e7312): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars
Mar 09 13:22:14 crc kubenswrapper[4764]: > logger="UnhandledError"
Mar 09 13:22:14 crc kubenswrapper[4764]: E0309 13:22:14.878613 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"network-operator\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" podUID="37a5e44f-9a88-4405-be8a-b645485e7312"
Mar 09 13:22:14 crc kubenswrapper[4764]: E0309 13:22:14.885578 4764 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:iptables-alerter,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,Command:[/iptables-alerter/iptables-alerter.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONTAINER_RUNTIME_ENDPOINT,Value:unix:///run/crio/crio.sock,ValueFrom:nil,},EnvVar{Name:ALERTER_POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{68157440 0} {} 65Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:iptables-alerter-script,ReadOnly:false,MountPath:/iptables-alerter,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-slash,ReadOnly:true,MountPath:/host,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rczfb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod iptables-alerter-4ln5h_openshift-network-operator(d75a4c96-2883-4a0b-bab2-0fab2b6c0b49): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError"
Mar 09 13:22:14 crc kubenswrapper[4764]: E0309 13:22:14.886905 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"iptables-alerter\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-network-operator/iptables-alerter-4ln5h" podUID="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49"
Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.973792 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.973825 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.973833 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.973845 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 09 13:22:14 crc kubenswrapper[4764]: I0309 13:22:14.973854 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:14Z","lastTransitionTime":"2026-03-09T13:22:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 09 13:22:15 crc kubenswrapper[4764]: I0309 13:22:15.075688 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 13:22:15 crc kubenswrapper[4764]: I0309 13:22:15.075725 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 13:22:15 crc kubenswrapper[4764]: I0309 13:22:15.075734 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 13:22:15 crc kubenswrapper[4764]: I0309 13:22:15.075747 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 09 13:22:15 crc kubenswrapper[4764]: I0309 13:22:15.075758 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:15Z","lastTransitionTime":"2026-03-09T13:22:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 09 13:22:15 crc kubenswrapper[4764]: I0309 13:22:15.178345 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 13:22:15 crc kubenswrapper[4764]: I0309 13:22:15.178383 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 13:22:15 crc kubenswrapper[4764]: I0309 13:22:15.178392 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 13:22:15 crc kubenswrapper[4764]: I0309 13:22:15.178406 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 09 13:22:15 crc kubenswrapper[4764]: I0309 13:22:15.178415 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:15Z","lastTransitionTime":"2026-03-09T13:22:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 09 13:22:15 crc kubenswrapper[4764]: I0309 13:22:15.218983 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 09 13:22:15 crc kubenswrapper[4764]: I0309 13:22:15.219079 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 09 13:22:15 crc kubenswrapper[4764]: E0309 13:22:15.219154 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 13:22:16.219125445 +0000 UTC m=+91.469297363 (durationBeforeRetry 1s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 09 13:22:15 crc kubenswrapper[4764]: E0309 13:22:15.219167 4764 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered
Mar 09 13:22:15 crc kubenswrapper[4764]: I0309 13:22:15.219212 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 09 13:22:15 crc kubenswrapper[4764]: E0309 13:22:15.219233 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-09 13:22:16.219223498 +0000 UTC m=+91.469395416 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered
Mar 09 13:22:15 crc kubenswrapper[4764]: I0309 13:22:15.219271 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 09 13:22:15 crc kubenswrapper[4764]: I0309 13:22:15.219325 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 09 13:22:15 crc kubenswrapper[4764]: E0309 13:22:15.219337 4764 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Mar 09 13:22:15 crc kubenswrapper[4764]: E0309 13:22:15.219360 4764 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Mar 09 13:22:15 crc kubenswrapper[4764]: E0309 13:22:15.219377 4764 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 09 13:22:15 crc kubenswrapper[4764]: E0309 13:22:15.219428 4764 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered
Mar 09 13:22:15 crc kubenswrapper[4764]: E0309 13:22:15.219441 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-09 13:22:16.219418953 +0000 UTC m=+91.469590901 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 09 13:22:15 crc kubenswrapper[4764]: E0309 13:22:15.219460 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-09 13:22:16.219451364 +0000 UTC m=+91.469623282 (durationBeforeRetry 1s).
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 09 13:22:15 crc kubenswrapper[4764]: E0309 13:22:15.219526 4764 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 09 13:22:15 crc kubenswrapper[4764]: E0309 13:22:15.219540 4764 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 09 13:22:15 crc kubenswrapper[4764]: E0309 13:22:15.219552 4764 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 09 13:22:15 crc kubenswrapper[4764]: E0309 13:22:15.219591 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-09 13:22:16.219578567 +0000 UTC m=+91.469750575 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 09 13:22:15 crc kubenswrapper[4764]: I0309 13:22:15.281173 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:15 crc kubenswrapper[4764]: I0309 13:22:15.281212 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:15 crc kubenswrapper[4764]: I0309 13:22:15.281220 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:15 crc kubenswrapper[4764]: I0309 13:22:15.281234 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:15 crc kubenswrapper[4764]: I0309 13:22:15.281243 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:15Z","lastTransitionTime":"2026-03-09T13:22:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:15 crc kubenswrapper[4764]: I0309 13:22:15.384066 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:15 crc kubenswrapper[4764]: I0309 13:22:15.384179 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:15 crc kubenswrapper[4764]: I0309 13:22:15.384222 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:15 crc kubenswrapper[4764]: I0309 13:22:15.384241 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:15 crc kubenswrapper[4764]: I0309 13:22:15.384253 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:15Z","lastTransitionTime":"2026-03-09T13:22:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:15 crc kubenswrapper[4764]: I0309 13:22:15.486504 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:15 crc kubenswrapper[4764]: I0309 13:22:15.486561 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:15 crc kubenswrapper[4764]: I0309 13:22:15.486574 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:15 crc kubenswrapper[4764]: I0309 13:22:15.486591 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:15 crc kubenswrapper[4764]: I0309 13:22:15.486602 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:15Z","lastTransitionTime":"2026-03-09T13:22:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:15 crc kubenswrapper[4764]: I0309 13:22:15.563075 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Mar 09 13:22:15 crc kubenswrapper[4764]: I0309 13:22:15.563954 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Mar 09 13:22:15 crc kubenswrapper[4764]: I0309 13:22:15.565238 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Mar 09 13:22:15 crc kubenswrapper[4764]: I0309 13:22:15.566026 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Mar 09 13:22:15 crc kubenswrapper[4764]: I0309 13:22:15.567238 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Mar 09 13:22:15 crc kubenswrapper[4764]: I0309 13:22:15.568002 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Mar 09 13:22:15 crc kubenswrapper[4764]: I0309 13:22:15.568710 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Mar 09 13:22:15 crc kubenswrapper[4764]: I0309 13:22:15.569839 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" 
path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Mar 09 13:22:15 crc kubenswrapper[4764]: I0309 13:22:15.570617 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Mar 09 13:22:15 crc kubenswrapper[4764]: I0309 13:22:15.571189 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Mar 09 13:22:15 crc kubenswrapper[4764]: I0309 13:22:15.571859 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Mar 09 13:22:15 crc kubenswrapper[4764]: I0309 13:22:15.572543 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Mar 09 13:22:15 crc kubenswrapper[4764]: I0309 13:22:15.573835 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Mar 09 13:22:15 crc kubenswrapper[4764]: I0309 13:22:15.574451 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Mar 09 13:22:15 crc kubenswrapper[4764]: I0309 13:22:15.575090 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Mar 09 13:22:15 crc kubenswrapper[4764]: I0309 13:22:15.576173 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Mar 09 13:22:15 crc kubenswrapper[4764]: I0309 13:22:15.576940 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Mar 09 13:22:15 crc kubenswrapper[4764]: I0309 13:22:15.578103 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" 
path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Mar 09 13:22:15 crc kubenswrapper[4764]: I0309 13:22:15.578676 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Mar 09 13:22:15 crc kubenswrapper[4764]: I0309 13:22:15.579360 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Mar 09 13:22:15 crc kubenswrapper[4764]: I0309 13:22:15.580817 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Mar 09 13:22:15 crc kubenswrapper[4764]: I0309 13:22:15.581549 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Mar 09 13:22:15 crc kubenswrapper[4764]: I0309 13:22:15.582102 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with 
unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:15 crc kubenswrapper[4764]: I0309 13:22:15.583102 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Mar 09 13:22:15 crc kubenswrapper[4764]: I0309 13:22:15.583890 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" 
path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Mar 09 13:22:15 crc kubenswrapper[4764]: I0309 13:22:15.585585 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Mar 09 13:22:15 crc kubenswrapper[4764]: I0309 13:22:15.586232 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Mar 09 13:22:15 crc kubenswrapper[4764]: I0309 13:22:15.587030 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Mar 09 13:22:15 crc kubenswrapper[4764]: I0309 13:22:15.589714 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Mar 09 13:22:15 crc kubenswrapper[4764]: I0309 13:22:15.589868 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:15 crc kubenswrapper[4764]: I0309 13:22:15.589893 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:15 crc kubenswrapper[4764]: I0309 13:22:15.589904 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:15 crc kubenswrapper[4764]: I0309 13:22:15.589923 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:15 crc kubenswrapper[4764]: I0309 13:22:15.589972 4764 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:15Z","lastTransitionTime":"2026-03-09T13:22:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:22:15 crc kubenswrapper[4764]: I0309 13:22:15.590266 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Mar 09 13:22:15 crc kubenswrapper[4764]: I0309 13:22:15.591337 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Mar 09 13:22:15 crc kubenswrapper[4764]: I0309 13:22:15.591842 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Mar 09 13:22:15 crc kubenswrapper[4764]: I0309 13:22:15.592875 4764 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Mar 09 13:22:15 crc kubenswrapper[4764]: I0309 13:22:15.592981 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Mar 09 13:22:15 crc kubenswrapper[4764]: I0309 13:22:15.595128 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Mar 09 13:22:15 crc kubenswrapper[4764]: I0309 13:22:15.596119 
4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Mar 09 13:22:15 crc kubenswrapper[4764]: I0309 13:22:15.596521 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Mar 09 13:22:15 crc kubenswrapper[4764]: I0309 13:22:15.596821 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:15 crc kubenswrapper[4764]: I0309 13:22:15.598315 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Mar 09 13:22:15 crc kubenswrapper[4764]: I0309 13:22:15.599136 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Mar 09 13:22:15 crc kubenswrapper[4764]: I0309 13:22:15.600266 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Mar 09 13:22:15 crc kubenswrapper[4764]: I0309 13:22:15.601054 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Mar 09 13:22:15 crc 
kubenswrapper[4764]: I0309 13:22:15.602332 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Mar 09 13:22:15 crc kubenswrapper[4764]: I0309 13:22:15.602945 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Mar 09 13:22:15 crc kubenswrapper[4764]: I0309 13:22:15.604122 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Mar 09 13:22:15 crc kubenswrapper[4764]: I0309 13:22:15.604880 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Mar 09 13:22:15 crc kubenswrapper[4764]: I0309 13:22:15.605480 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:15 crc kubenswrapper[4764]: I0309 13:22:15.606086 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Mar 09 13:22:15 crc kubenswrapper[4764]: I0309 13:22:15.606640 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Mar 09 13:22:15 crc kubenswrapper[4764]: I0309 13:22:15.607686 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Mar 09 13:22:15 crc kubenswrapper[4764]: I0309 13:22:15.608338 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Mar 09 13:22:15 crc kubenswrapper[4764]: I0309 13:22:15.609904 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Mar 09 13:22:15 crc kubenswrapper[4764]: I0309 13:22:15.610528 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Mar 09 13:22:15 crc kubenswrapper[4764]: I0309 13:22:15.611396 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Mar 09 13:22:15 crc kubenswrapper[4764]: I0309 13:22:15.611955 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Mar 09 13:22:15 crc kubenswrapper[4764]: I0309 13:22:15.612926 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Mar 09 13:22:15 crc kubenswrapper[4764]: I0309 13:22:15.613484 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Mar 09 13:22:15 crc kubenswrapper[4764]: I0309 13:22:15.613967 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Mar 09 13:22:15 crc kubenswrapper[4764]: I0309 13:22:15.615756 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:15 crc kubenswrapper[4764]: I0309 13:22:15.623943 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:15 crc kubenswrapper[4764]: I0309 13:22:15.691485 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:15 crc kubenswrapper[4764]: I0309 13:22:15.691520 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:15 crc kubenswrapper[4764]: I0309 13:22:15.691529 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:15 crc kubenswrapper[4764]: I0309 13:22:15.691544 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeNotReady" Mar 09 13:22:15 crc kubenswrapper[4764]: I0309 13:22:15.691553 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:15Z","lastTransitionTime":"2026-03-09T13:22:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:22:15 crc kubenswrapper[4764]: I0309 13:22:15.794197 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:15 crc kubenswrapper[4764]: I0309 13:22:15.794272 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:15 crc kubenswrapper[4764]: I0309 13:22:15.794294 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:15 crc kubenswrapper[4764]: I0309 13:22:15.794323 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:15 crc kubenswrapper[4764]: I0309 13:22:15.794351 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:15Z","lastTransitionTime":"2026-03-09T13:22:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:15 crc kubenswrapper[4764]: I0309 13:22:15.868696 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"0b3c4a4cbe4daceca4eb2a00886df91a34c91833b42afef05407e8fa7c8e57a1"} Mar 09 13:22:15 crc kubenswrapper[4764]: I0309 13:22:15.870507 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"0b14fc603ec2fd3d4793aad753f5d6dff5d9f5af18473d9a20b44a72babf41b0"} Mar 09 13:22:15 crc kubenswrapper[4764]: E0309 13:22:15.870698 4764 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:iptables-alerter,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,Command:[/iptables-alerter/iptables-alerter.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONTAINER_RUNTIME_ENDPOINT,Value:unix:///run/crio/crio.sock,ValueFrom:nil,},EnvVar{Name:ALERTER_POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{68157440 0} {} 65Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:iptables-alerter-script,ReadOnly:false,MountPath:/iptables-alerter,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-slash,ReadOnly:true,MountPath:/host,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rczfb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod iptables-alerter-4ln5h_openshift-network-operator(d75a4c96-2883-4a0b-bab2-0fab2b6c0b49): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Mar 09 13:22:15 crc kubenswrapper[4764]: I0309 13:22:15.871502 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"e5d77a6f769d325d48bde0d33b59e1563f232c38daf331f3dc8506a16387e41b"} Mar 09 13:22:15 crc kubenswrapper[4764]: E0309 13:22:15.871798 4764 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 09 13:22:15 crc kubenswrapper[4764]: container 
&Container{Name:network-operator,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,Command:[/bin/bash -c #!/bin/bash Mar 09 13:22:15 crc kubenswrapper[4764]: set -o allexport Mar 09 13:22:15 crc kubenswrapper[4764]: if [[ -f /etc/kubernetes/apiserver-url.env ]]; then Mar 09 13:22:15 crc kubenswrapper[4764]: source /etc/kubernetes/apiserver-url.env Mar 09 13:22:15 crc kubenswrapper[4764]: else Mar 09 13:22:15 crc kubenswrapper[4764]: echo "Error: /etc/kubernetes/apiserver-url.env is missing" Mar 09 13:22:15 crc kubenswrapper[4764]: exit 1 Mar 09 13:22:15 crc kubenswrapper[4764]: fi Mar 09 13:22:15 crc kubenswrapper[4764]: exec /usr/bin/cluster-network-operator start --listen=0.0.0.0:9104 Mar 09 13:22:15 crc kubenswrapper[4764]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:cno,HostPort:9104,ContainerPort:9104,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:RELEASE_VERSION,Value:4.18.1,ValueFrom:nil,},EnvVar{Name:KUBE_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b97554198294bf544fbc116c94a0a1fb2ec8a4de0e926bf9d9e320135f0bee6f,ValueFrom:nil,},EnvVar{Name:KUBE_RBAC_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09,ValueFrom:nil,},EnvVar{Name:MULTUS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26,ValueFrom:nil,},EnvVar{Name:MULTUS_ADMISSION_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317,ValueFrom:nil,},EnvVar{Name:CNI_PLUGINS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc,ValueFrom:nil,},EnvVar{Name:BOND_CNI_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c
64b64d729d7fa93089f249e496fcef0c78,ValueFrom:nil,},EnvVar{Name:WHEREABOUTS_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4,ValueFrom:nil,},EnvVar{Name:ROUTE_OVERRRIDE_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa,ValueFrom:nil,},EnvVar{Name:MULTUS_NETWORKPOLICY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:23f833d3738d68706eb2f2868bd76bd71cee016cffa6faf5f045a60cc8c6eddd,ValueFrom:nil,},EnvVar{Name:OVN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,ValueFrom:nil,},EnvVar{Name:OVN_NB_RAFT_ELECTION_TIMER,Value:10,ValueFrom:nil,},EnvVar{Name:OVN_SB_RAFT_ELECTION_TIMER,Value:16,ValueFrom:nil,},EnvVar{Name:OVN_NORTHD_PROBE_INTERVAL,Value:10000,ValueFrom:nil,},EnvVar{Name:OVN_CONTROLLER_INACTIVITY_PROBE,Value:180000,ValueFrom:nil,},EnvVar{Name:OVN_NB_INACTIVITY_PROBE,Value:60000,ValueFrom:nil,},EnvVar{Name:EGRESS_ROUTER_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c,ValueFrom:nil,},EnvVar{Name:NETWORK_METRICS_DAEMON_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_SOURCE_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_TARGET_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:NETWORK_OPERATOR_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:CLOUD_NETWORK_
CONFIG_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8048f1cb0be521f09749c0a489503cd56d85b68c6ca93380e082cfd693cd97a8,ValueFrom:nil,},EnvVar{Name:CLI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,ValueFrom:nil,},EnvVar{Name:FRR_K8S_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5dbf844e49bb46b78586930149e5e5f5dc121014c8afd10fe36f3651967cc256,ValueFrom:nil,},EnvVar{Name:NETWORKING_CONSOLE_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd,ValueFrom:nil,},EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:host-etc-kube,ReadOnly:true,MountPath:/etc/kubernetes,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:metrics-tls,ReadOnly:false,MountPath:/var/run/secrets/serving-cert,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rdwmf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
network-operator-58b4c7f79c-55gtf_openshift-network-operator(37a5e44f-9a88-4405-be8a-b645485e7312): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 09 13:22:15 crc kubenswrapper[4764]: > logger="UnhandledError" Mar 09 13:22:15 crc kubenswrapper[4764]: E0309 13:22:15.871998 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"iptables-alerter\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-network-operator/iptables-alerter-4ln5h" podUID="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" Mar 09 13:22:15 crc kubenswrapper[4764]: E0309 13:22:15.872867 4764 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 09 13:22:15 crc kubenswrapper[4764]: container &Container{Name:webhook,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Mar 09 13:22:15 crc kubenswrapper[4764]: if [[ -f "/env/_master" ]]; then Mar 09 13:22:15 crc kubenswrapper[4764]: set -o allexport Mar 09 13:22:15 crc kubenswrapper[4764]: source "/env/_master" Mar 09 13:22:15 crc kubenswrapper[4764]: set +o allexport Mar 09 13:22:15 crc kubenswrapper[4764]: fi Mar 09 13:22:15 crc kubenswrapper[4764]: # OVN-K will try to remove hybrid overlay node annotations even when the hybrid overlay is not enabled. 
Mar 09 13:22:15 crc kubenswrapper[4764]: # https://github.com/ovn-org/ovn-kubernetes/blob/ac6820df0b338a246f10f412cd5ec903bd234694/go-controller/pkg/ovn/master.go#L791 Mar 09 13:22:15 crc kubenswrapper[4764]: ho_enable="--enable-hybrid-overlay" Mar 09 13:22:15 crc kubenswrapper[4764]: echo "I$(date "+%m%d %H:%M:%S.%N") - network-node-identity - start webhook" Mar 09 13:22:15 crc kubenswrapper[4764]: # extra-allowed-user: service account `ovn-kubernetes-control-plane` Mar 09 13:22:15 crc kubenswrapper[4764]: # sets pod annotations in multi-homing layer3 network controller (cluster-manager) Mar 09 13:22:15 crc kubenswrapper[4764]: exec /usr/bin/ovnkube-identity --k8s-apiserver=https://api-int.crc.testing:6443 \ Mar 09 13:22:15 crc kubenswrapper[4764]: --webhook-cert-dir="/etc/webhook-cert" \ Mar 09 13:22:15 crc kubenswrapper[4764]: --webhook-host=127.0.0.1 \ Mar 09 13:22:15 crc kubenswrapper[4764]: --webhook-port=9743 \ Mar 09 13:22:15 crc kubenswrapper[4764]: ${ho_enable} \ Mar 09 13:22:15 crc kubenswrapper[4764]: --enable-interconnect \ Mar 09 13:22:15 crc kubenswrapper[4764]: --disable-approver \ Mar 09 13:22:15 crc kubenswrapper[4764]: --extra-allowed-user="system:serviceaccount:openshift-ovn-kubernetes:ovn-kubernetes-control-plane" \ Mar 09 13:22:15 crc kubenswrapper[4764]: --wait-for-kubernetes-api=200s \ Mar 09 13:22:15 crc kubenswrapper[4764]: --pod-admission-conditions="/var/run/ovnkube-identity-config/additional-pod-admission-cond.json" \ Mar 09 13:22:15 crc kubenswrapper[4764]: --loglevel="${LOGLEVEL}" Mar 09 13:22:15 crc kubenswrapper[4764]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOGLEVEL,Value:2,ValueFrom:nil,},EnvVar{Name:KUBERNETES_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: 
{{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:webhook-cert,ReadOnly:false,MountPath:/etc/webhook-cert/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovnkube-identity-cm,ReadOnly:false,MountPath:/var/run/ovnkube-identity-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s2kz5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000470000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-node-identity-vrzqb_openshift-network-node-identity(ef543e1b-8068-4ea3-b32a-61027b32e95d): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 09 13:22:15 crc kubenswrapper[4764]: > logger="UnhandledError" Mar 09 13:22:15 crc kubenswrapper[4764]: E0309 13:22:15.873889 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"network-operator\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" 
podUID="37a5e44f-9a88-4405-be8a-b645485e7312" Mar 09 13:22:15 crc kubenswrapper[4764]: E0309 13:22:15.874628 4764 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 09 13:22:15 crc kubenswrapper[4764]: container &Container{Name:approver,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Mar 09 13:22:15 crc kubenswrapper[4764]: if [[ -f "/env/_master" ]]; then Mar 09 13:22:15 crc kubenswrapper[4764]: set -o allexport Mar 09 13:22:15 crc kubenswrapper[4764]: source "/env/_master" Mar 09 13:22:15 crc kubenswrapper[4764]: set +o allexport Mar 09 13:22:15 crc kubenswrapper[4764]: fi Mar 09 13:22:15 crc kubenswrapper[4764]: Mar 09 13:22:15 crc kubenswrapper[4764]: echo "I$(date "+%m%d %H:%M:%S.%N") - network-node-identity - start approver" Mar 09 13:22:15 crc kubenswrapper[4764]: exec /usr/bin/ovnkube-identity --k8s-apiserver=https://api-int.crc.testing:6443 \ Mar 09 13:22:15 crc kubenswrapper[4764]: --disable-webhook \ Mar 09 13:22:15 crc kubenswrapper[4764]: --csr-acceptance-conditions="/var/run/ovnkube-identity-config/additional-cert-acceptance-cond.json" \ Mar 09 13:22:15 crc kubenswrapper[4764]: --loglevel="${LOGLEVEL}" Mar 09 13:22:15 crc kubenswrapper[4764]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOGLEVEL,Value:4,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovnkube-identity-cm,ReadOnly:false,MountPath:/var/run/ovnkube-identity-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s2kz5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000470000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-node-identity-vrzqb_openshift-network-node-identity(ef543e1b-8068-4ea3-b32a-61027b32e95d): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 09 13:22:15 crc kubenswrapper[4764]: > logger="UnhandledError" Mar 09 13:22:15 crc kubenswrapper[4764]: E0309 13:22:15.876557 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"webhook\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\", failed to \"StartContainer\" for \"approver\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"]" pod="openshift-network-node-identity/network-node-identity-vrzqb" 
podUID="ef543e1b-8068-4ea3-b32a-61027b32e95d" Mar 09 13:22:15 crc kubenswrapper[4764]: I0309 13:22:15.879313 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:15 crc kubenswrapper[4764]: I0309 13:22:15.888572 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:15 crc kubenswrapper[4764]: I0309 13:22:15.896492 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:15 crc kubenswrapper[4764]: I0309 13:22:15.896526 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 09 13:22:15 crc kubenswrapper[4764]: I0309 13:22:15.896536 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:15 crc kubenswrapper[4764]: I0309 13:22:15.896551 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:15 crc kubenswrapper[4764]: I0309 13:22:15.896562 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:15Z","lastTransitionTime":"2026-03-09T13:22:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:22:15 crc kubenswrapper[4764]: I0309 13:22:15.899680 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:15 crc kubenswrapper[4764]: I0309 13:22:15.909766 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:15 crc kubenswrapper[4764]: I0309 13:22:15.921587 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:15 crc kubenswrapper[4764]: I0309 13:22:15.932364 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:15 crc kubenswrapper[4764]: I0309 13:22:15.943375 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:15 crc kubenswrapper[4764]: I0309 13:22:15.952235 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:15 crc kubenswrapper[4764]: I0309 13:22:15.961365 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:15 crc kubenswrapper[4764]: I0309 13:22:15.969743 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:15 crc kubenswrapper[4764]: I0309 13:22:15.977675 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:15 crc kubenswrapper[4764]: I0309 13:22:15.988721 4764 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:15 crc kubenswrapper[4764]: I0309 13:22:15.999041 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:15 crc kubenswrapper[4764]: I0309 13:22:15.999065 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:15 crc kubenswrapper[4764]: I0309 13:22:15.999073 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:15 crc kubenswrapper[4764]: I0309 13:22:15.999103 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:15 crc kubenswrapper[4764]: I0309 13:22:15.999112 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:15Z","lastTransitionTime":"2026-03-09T13:22:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:22:16 crc kubenswrapper[4764]: I0309 13:22:16.101310 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:16 crc kubenswrapper[4764]: I0309 13:22:16.101347 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:16 crc kubenswrapper[4764]: I0309 13:22:16.101357 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:16 crc kubenswrapper[4764]: I0309 13:22:16.101372 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:16 crc kubenswrapper[4764]: I0309 13:22:16.101383 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:16Z","lastTransitionTime":"2026-03-09T13:22:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:16 crc kubenswrapper[4764]: I0309 13:22:16.203714 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:16 crc kubenswrapper[4764]: I0309 13:22:16.203771 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:16 crc kubenswrapper[4764]: I0309 13:22:16.203783 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:16 crc kubenswrapper[4764]: I0309 13:22:16.203795 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:16 crc kubenswrapper[4764]: I0309 13:22:16.203804 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:16Z","lastTransitionTime":"2026-03-09T13:22:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:22:16 crc kubenswrapper[4764]: I0309 13:22:16.228323 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 13:22:16 crc kubenswrapper[4764]: E0309 13:22:16.228412 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-09 13:22:18.228395196 +0000 UTC m=+93.478567104 (durationBeforeRetry 2s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 13:22:16 crc kubenswrapper[4764]: I0309 13:22:16.228465 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 13:22:16 crc kubenswrapper[4764]: I0309 13:22:16.228494 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 13:22:16 crc kubenswrapper[4764]: I0309 13:22:16.228515 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 13:22:16 crc kubenswrapper[4764]: E0309 13:22:16.228584 4764 configmap.go:193] Couldn't get configMap 
openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 09 13:22:16 crc kubenswrapper[4764]: E0309 13:22:16.228605 4764 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 09 13:22:16 crc kubenswrapper[4764]: E0309 13:22:16.228624 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-09 13:22:18.228614352 +0000 UTC m=+93.478786250 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 09 13:22:16 crc kubenswrapper[4764]: E0309 13:22:16.228663 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-09 13:22:18.228633342 +0000 UTC m=+93.478805250 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 09 13:22:16 crc kubenswrapper[4764]: I0309 13:22:16.228919 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 13:22:16 crc kubenswrapper[4764]: E0309 13:22:16.228977 4764 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 09 13:22:16 crc kubenswrapper[4764]: E0309 13:22:16.229010 4764 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 09 13:22:16 crc kubenswrapper[4764]: E0309 13:22:16.229028 4764 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 09 13:22:16 crc kubenswrapper[4764]: E0309 13:22:16.229031 4764 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 09 13:22:16 crc kubenswrapper[4764]: E0309 13:22:16.229043 4764 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object 
"openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 09 13:22:16 crc kubenswrapper[4764]: E0309 13:22:16.229045 4764 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 09 13:22:16 crc kubenswrapper[4764]: E0309 13:22:16.229082 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-09 13:22:18.229070554 +0000 UTC m=+93.479242462 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 09 13:22:16 crc kubenswrapper[4764]: E0309 13:22:16.229100 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-09 13:22:18.229091514 +0000 UTC m=+93.479263422 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 09 13:22:16 crc kubenswrapper[4764]: I0309 13:22:16.306091 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:16 crc kubenswrapper[4764]: I0309 13:22:16.306164 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:16 crc kubenswrapper[4764]: I0309 13:22:16.306178 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:16 crc kubenswrapper[4764]: I0309 13:22:16.306195 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:16 crc kubenswrapper[4764]: I0309 13:22:16.306241 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:16Z","lastTransitionTime":"2026-03-09T13:22:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:16 crc kubenswrapper[4764]: I0309 13:22:16.408664 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:16 crc kubenswrapper[4764]: I0309 13:22:16.408715 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:16 crc kubenswrapper[4764]: I0309 13:22:16.408731 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:16 crc kubenswrapper[4764]: I0309 13:22:16.408749 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:16 crc kubenswrapper[4764]: I0309 13:22:16.408765 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:16Z","lastTransitionTime":"2026-03-09T13:22:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:16 crc kubenswrapper[4764]: I0309 13:22:16.510431 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:16 crc kubenswrapper[4764]: I0309 13:22:16.510454 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:16 crc kubenswrapper[4764]: I0309 13:22:16.510463 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:16 crc kubenswrapper[4764]: I0309 13:22:16.510476 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:16 crc kubenswrapper[4764]: I0309 13:22:16.510484 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:16Z","lastTransitionTime":"2026-03-09T13:22:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:22:16 crc kubenswrapper[4764]: I0309 13:22:16.559019 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 13:22:16 crc kubenswrapper[4764]: I0309 13:22:16.559119 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 13:22:16 crc kubenswrapper[4764]: E0309 13:22:16.559164 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 13:22:16 crc kubenswrapper[4764]: E0309 13:22:16.559290 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 13:22:16 crc kubenswrapper[4764]: I0309 13:22:16.559557 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 13:22:16 crc kubenswrapper[4764]: E0309 13:22:16.559617 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 13:22:16 crc kubenswrapper[4764]: I0309 13:22:16.575360 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 09 13:22:16 crc kubenswrapper[4764]: I0309 13:22:16.576791 4764 scope.go:117] "RemoveContainer" containerID="033930947dc527026e2797f16a6dd9181c8c7d8ec88b09031a132f55b953c356" Mar 09 13:22:16 crc kubenswrapper[4764]: E0309 13:22:16.577141 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 09 13:22:16 crc kubenswrapper[4764]: I0309 13:22:16.612398 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:16 crc kubenswrapper[4764]: I0309 13:22:16.612438 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:16 crc kubenswrapper[4764]: I0309 13:22:16.612449 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:16 crc kubenswrapper[4764]: I0309 13:22:16.612464 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:16 crc kubenswrapper[4764]: I0309 13:22:16.612475 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:16Z","lastTransitionTime":"2026-03-09T13:22:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network 
plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:22:16 crc kubenswrapper[4764]: I0309 13:22:16.714807 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:16 crc kubenswrapper[4764]: I0309 13:22:16.714858 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:16 crc kubenswrapper[4764]: I0309 13:22:16.714868 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:16 crc kubenswrapper[4764]: I0309 13:22:16.714882 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:16 crc kubenswrapper[4764]: I0309 13:22:16.714891 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:16Z","lastTransitionTime":"2026-03-09T13:22:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:16 crc kubenswrapper[4764]: I0309 13:22:16.817567 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:16 crc kubenswrapper[4764]: I0309 13:22:16.817608 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:16 crc kubenswrapper[4764]: I0309 13:22:16.817617 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:16 crc kubenswrapper[4764]: I0309 13:22:16.817635 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:16 crc kubenswrapper[4764]: I0309 13:22:16.817663 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:16Z","lastTransitionTime":"2026-03-09T13:22:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:16 crc kubenswrapper[4764]: I0309 13:22:16.874351 4764 scope.go:117] "RemoveContainer" containerID="033930947dc527026e2797f16a6dd9181c8c7d8ec88b09031a132f55b953c356" Mar 09 13:22:16 crc kubenswrapper[4764]: E0309 13:22:16.874513 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 09 13:22:16 crc kubenswrapper[4764]: I0309 13:22:16.919158 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:16 crc kubenswrapper[4764]: I0309 13:22:16.919187 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:16 crc kubenswrapper[4764]: I0309 13:22:16.919195 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:16 crc kubenswrapper[4764]: I0309 13:22:16.919208 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:16 crc kubenswrapper[4764]: I0309 13:22:16.919217 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:16Z","lastTransitionTime":"2026-03-09T13:22:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:17 crc kubenswrapper[4764]: I0309 13:22:17.021795 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:17 crc kubenswrapper[4764]: I0309 13:22:17.021835 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:17 crc kubenswrapper[4764]: I0309 13:22:17.021847 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:17 crc kubenswrapper[4764]: I0309 13:22:17.021863 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:17 crc kubenswrapper[4764]: I0309 13:22:17.021874 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:17Z","lastTransitionTime":"2026-03-09T13:22:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:17 crc kubenswrapper[4764]: I0309 13:22:17.124575 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:17 crc kubenswrapper[4764]: I0309 13:22:17.124615 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:17 crc kubenswrapper[4764]: I0309 13:22:17.124623 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:17 crc kubenswrapper[4764]: I0309 13:22:17.124637 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:17 crc kubenswrapper[4764]: I0309 13:22:17.124662 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:17Z","lastTransitionTime":"2026-03-09T13:22:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:17 crc kubenswrapper[4764]: I0309 13:22:17.226407 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:17 crc kubenswrapper[4764]: I0309 13:22:17.226444 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:17 crc kubenswrapper[4764]: I0309 13:22:17.226455 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:17 crc kubenswrapper[4764]: I0309 13:22:17.226472 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:17 crc kubenswrapper[4764]: I0309 13:22:17.226485 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:17Z","lastTransitionTime":"2026-03-09T13:22:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:17 crc kubenswrapper[4764]: I0309 13:22:17.329182 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:17 crc kubenswrapper[4764]: I0309 13:22:17.329268 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:17 crc kubenswrapper[4764]: I0309 13:22:17.329293 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:17 crc kubenswrapper[4764]: I0309 13:22:17.329323 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:17 crc kubenswrapper[4764]: I0309 13:22:17.329342 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:17Z","lastTransitionTime":"2026-03-09T13:22:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:17 crc kubenswrapper[4764]: I0309 13:22:17.431218 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:17 crc kubenswrapper[4764]: I0309 13:22:17.431285 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:17 crc kubenswrapper[4764]: I0309 13:22:17.431303 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:17 crc kubenswrapper[4764]: I0309 13:22:17.431328 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:17 crc kubenswrapper[4764]: I0309 13:22:17.431347 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:17Z","lastTransitionTime":"2026-03-09T13:22:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:17 crc kubenswrapper[4764]: I0309 13:22:17.534099 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:17 crc kubenswrapper[4764]: I0309 13:22:17.534155 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:17 crc kubenswrapper[4764]: I0309 13:22:17.534166 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:17 crc kubenswrapper[4764]: I0309 13:22:17.534183 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:17 crc kubenswrapper[4764]: I0309 13:22:17.534195 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:17Z","lastTransitionTime":"2026-03-09T13:22:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:17 crc kubenswrapper[4764]: I0309 13:22:17.636389 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:17 crc kubenswrapper[4764]: I0309 13:22:17.636427 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:17 crc kubenswrapper[4764]: I0309 13:22:17.636438 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:17 crc kubenswrapper[4764]: I0309 13:22:17.636452 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:17 crc kubenswrapper[4764]: I0309 13:22:17.636462 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:17Z","lastTransitionTime":"2026-03-09T13:22:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:17 crc kubenswrapper[4764]: I0309 13:22:17.738388 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:17 crc kubenswrapper[4764]: I0309 13:22:17.738505 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:17 crc kubenswrapper[4764]: I0309 13:22:17.738522 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:17 crc kubenswrapper[4764]: I0309 13:22:17.738538 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:17 crc kubenswrapper[4764]: I0309 13:22:17.738546 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:17Z","lastTransitionTime":"2026-03-09T13:22:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:17 crc kubenswrapper[4764]: I0309 13:22:17.841568 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:17 crc kubenswrapper[4764]: I0309 13:22:17.841685 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:17 crc kubenswrapper[4764]: I0309 13:22:17.841723 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:17 crc kubenswrapper[4764]: I0309 13:22:17.841752 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:17 crc kubenswrapper[4764]: I0309 13:22:17.841771 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:17Z","lastTransitionTime":"2026-03-09T13:22:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:17 crc kubenswrapper[4764]: I0309 13:22:17.944008 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:17 crc kubenswrapper[4764]: I0309 13:22:17.944085 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:17 crc kubenswrapper[4764]: I0309 13:22:17.944107 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:17 crc kubenswrapper[4764]: I0309 13:22:17.944137 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:17 crc kubenswrapper[4764]: I0309 13:22:17.944158 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:17Z","lastTransitionTime":"2026-03-09T13:22:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:18 crc kubenswrapper[4764]: I0309 13:22:18.046179 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:18 crc kubenswrapper[4764]: I0309 13:22:18.046221 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:18 crc kubenswrapper[4764]: I0309 13:22:18.046233 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:18 crc kubenswrapper[4764]: I0309 13:22:18.046249 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:18 crc kubenswrapper[4764]: I0309 13:22:18.046261 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:18Z","lastTransitionTime":"2026-03-09T13:22:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Mar 09 13:22:18 crc kubenswrapper[4764]: I0309 13:22:18.148761 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 13:22:18 crc kubenswrapper[4764]: I0309 13:22:18.148793 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 13:22:18 crc kubenswrapper[4764]: I0309 13:22:18.148804 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 13:22:18 crc kubenswrapper[4764]: I0309 13:22:18.148817 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 09 13:22:18 crc kubenswrapper[4764]: I0309 13:22:18.148828 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:18Z","lastTransitionTime":"2026-03-09T13:22:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 09 13:22:18 crc kubenswrapper[4764]: I0309 13:22:18.246387 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 09 13:22:18 crc kubenswrapper[4764]: I0309 13:22:18.246475 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 09 13:22:18 crc kubenswrapper[4764]: I0309 13:22:18.246508 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 09 13:22:18 crc kubenswrapper[4764]: E0309 13:22:18.246531 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 13:22:22.246512969 +0000 UTC m=+97.496684877 (durationBeforeRetry 4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 09 13:22:18 crc kubenswrapper[4764]: I0309 13:22:18.246555 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 09 13:22:18 crc kubenswrapper[4764]: E0309 13:22:18.246578 4764 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered
Mar 09 13:22:18 crc kubenswrapper[4764]: E0309 13:22:18.246660 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-09 13:22:22.246606961 +0000 UTC m=+97.496778869 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered
Mar 09 13:22:18 crc kubenswrapper[4764]: E0309 13:22:18.246679 4764 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Mar 09 13:22:18 crc kubenswrapper[4764]: E0309 13:22:18.246696 4764 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Mar 09 13:22:18 crc kubenswrapper[4764]: E0309 13:22:18.246708 4764 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 09 13:22:18 crc kubenswrapper[4764]: E0309 13:22:18.246741 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-09 13:22:22.246727575 +0000 UTC m=+97.496899483 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 09 13:22:18 crc kubenswrapper[4764]: I0309 13:22:18.246577 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 09 13:22:18 crc kubenswrapper[4764]: E0309 13:22:18.246812 4764 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Mar 09 13:22:18 crc kubenswrapper[4764]: E0309 13:22:18.246824 4764 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Mar 09 13:22:18 crc kubenswrapper[4764]: E0309 13:22:18.246833 4764 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 09 13:22:18 crc kubenswrapper[4764]: E0309 13:22:18.246853 4764 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered
Mar 09 13:22:18 crc kubenswrapper[4764]: E0309 13:22:18.246856 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-09 13:22:22.246849488 +0000 UTC m=+97.497021396 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 09 13:22:18 crc kubenswrapper[4764]: E0309 13:22:18.247013 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-09 13:22:22.246959191 +0000 UTC m=+97.497131129 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered
Mar 09 13:22:18 crc kubenswrapper[4764]: I0309 13:22:18.251202 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 13:22:18 crc kubenswrapper[4764]: I0309 13:22:18.251271 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 13:22:18 crc kubenswrapper[4764]: I0309 13:22:18.251295 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 13:22:18 crc kubenswrapper[4764]: I0309 13:22:18.251326 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 09 13:22:18 crc kubenswrapper[4764]: I0309 13:22:18.251347 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:18Z","lastTransitionTime":"2026-03-09T13:22:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 09 13:22:18 crc kubenswrapper[4764]: I0309 13:22:18.353742 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 13:22:18 crc kubenswrapper[4764]: I0309 13:22:18.353833 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 13:22:18 crc kubenswrapper[4764]: I0309 13:22:18.353844 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 13:22:18 crc kubenswrapper[4764]: I0309 13:22:18.353861 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 09 13:22:18 crc kubenswrapper[4764]: I0309 13:22:18.353876 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:18Z","lastTransitionTime":"2026-03-09T13:22:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 09 13:22:18 crc kubenswrapper[4764]: I0309 13:22:18.455747 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 13:22:18 crc kubenswrapper[4764]: I0309 13:22:18.456001 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 13:22:18 crc kubenswrapper[4764]: I0309 13:22:18.456068 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 13:22:18 crc kubenswrapper[4764]: I0309 13:22:18.456143 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 09 13:22:18 crc kubenswrapper[4764]: I0309 13:22:18.456240 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:18Z","lastTransitionTime":"2026-03-09T13:22:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 09 13:22:18 crc kubenswrapper[4764]: I0309 13:22:18.558590 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 09 13:22:18 crc kubenswrapper[4764]: I0309 13:22:18.558608 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 09 13:22:18 crc kubenswrapper[4764]: I0309 13:22:18.558590 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 09 13:22:18 crc kubenswrapper[4764]: E0309 13:22:18.558708 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 09 13:22:18 crc kubenswrapper[4764]: E0309 13:22:18.558763 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 09 13:22:18 crc kubenswrapper[4764]: I0309 13:22:18.558797 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 13:22:18 crc kubenswrapper[4764]: E0309 13:22:18.558813 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 09 13:22:18 crc kubenswrapper[4764]: I0309 13:22:18.558820 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 13:22:18 crc kubenswrapper[4764]: I0309 13:22:18.558833 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 13:22:18 crc kubenswrapper[4764]: I0309 13:22:18.558846 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 09 13:22:18 crc kubenswrapper[4764]: I0309 13:22:18.558855 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:18Z","lastTransitionTime":"2026-03-09T13:22:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 09 13:22:18 crc kubenswrapper[4764]: I0309 13:22:18.661234 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 13:22:18 crc kubenswrapper[4764]: I0309 13:22:18.661563 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 13:22:18 crc kubenswrapper[4764]: I0309 13:22:18.661638 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 13:22:18 crc kubenswrapper[4764]: I0309 13:22:18.661734 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 09 13:22:18 crc kubenswrapper[4764]: I0309 13:22:18.661803 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:18Z","lastTransitionTime":"2026-03-09T13:22:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 09 13:22:18 crc kubenswrapper[4764]: I0309 13:22:18.763737 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 13:22:18 crc kubenswrapper[4764]: I0309 13:22:18.763779 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 13:22:18 crc kubenswrapper[4764]: I0309 13:22:18.763791 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 13:22:18 crc kubenswrapper[4764]: I0309 13:22:18.763809 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 09 13:22:18 crc kubenswrapper[4764]: I0309 13:22:18.763822 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:18Z","lastTransitionTime":"2026-03-09T13:22:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 09 13:22:18 crc kubenswrapper[4764]: I0309 13:22:18.868752 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 13:22:18 crc kubenswrapper[4764]: I0309 13:22:18.868816 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 13:22:18 crc kubenswrapper[4764]: I0309 13:22:18.868832 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 13:22:18 crc kubenswrapper[4764]: I0309 13:22:18.868857 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 09 13:22:18 crc kubenswrapper[4764]: I0309 13:22:18.868873 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:18Z","lastTransitionTime":"2026-03-09T13:22:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 09 13:22:18 crc kubenswrapper[4764]: I0309 13:22:18.971482 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 13:22:18 crc kubenswrapper[4764]: I0309 13:22:18.971521 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 13:22:18 crc kubenswrapper[4764]: I0309 13:22:18.971532 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 13:22:18 crc kubenswrapper[4764]: I0309 13:22:18.971547 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 09 13:22:18 crc kubenswrapper[4764]: I0309 13:22:18.971556 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:18Z","lastTransitionTime":"2026-03-09T13:22:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 09 13:22:19 crc kubenswrapper[4764]: I0309 13:22:19.073363 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 13:22:19 crc kubenswrapper[4764]: I0309 13:22:19.073392 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 13:22:19 crc kubenswrapper[4764]: I0309 13:22:19.073400 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 13:22:19 crc kubenswrapper[4764]: I0309 13:22:19.073414 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 09 13:22:19 crc kubenswrapper[4764]: I0309 13:22:19.073423 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:19Z","lastTransitionTime":"2026-03-09T13:22:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 09 13:22:19 crc kubenswrapper[4764]: I0309 13:22:19.175793 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 13:22:19 crc kubenswrapper[4764]: I0309 13:22:19.175838 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 13:22:19 crc kubenswrapper[4764]: I0309 13:22:19.175847 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 13:22:19 crc kubenswrapper[4764]: I0309 13:22:19.175860 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 09 13:22:19 crc kubenswrapper[4764]: I0309 13:22:19.175870 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:19Z","lastTransitionTime":"2026-03-09T13:22:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 09 13:22:19 crc kubenswrapper[4764]: I0309 13:22:19.277701 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 13:22:19 crc kubenswrapper[4764]: I0309 13:22:19.277743 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 13:22:19 crc kubenswrapper[4764]: I0309 13:22:19.277758 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 13:22:19 crc kubenswrapper[4764]: I0309 13:22:19.277773 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 09 13:22:19 crc kubenswrapper[4764]: I0309 13:22:19.277785 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:19Z","lastTransitionTime":"2026-03-09T13:22:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 09 13:22:19 crc kubenswrapper[4764]: I0309 13:22:19.379731 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 13:22:19 crc kubenswrapper[4764]: I0309 13:22:19.379768 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 13:22:19 crc kubenswrapper[4764]: I0309 13:22:19.379780 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 13:22:19 crc kubenswrapper[4764]: I0309 13:22:19.379794 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 09 13:22:19 crc kubenswrapper[4764]: I0309 13:22:19.379803 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:19Z","lastTransitionTime":"2026-03-09T13:22:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 09 13:22:19 crc kubenswrapper[4764]: I0309 13:22:19.482512 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 13:22:19 crc kubenswrapper[4764]: I0309 13:22:19.482772 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 13:22:19 crc kubenswrapper[4764]: I0309 13:22:19.482875 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 13:22:19 crc kubenswrapper[4764]: I0309 13:22:19.482954 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 09 13:22:19 crc kubenswrapper[4764]: I0309 13:22:19.483017 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:19Z","lastTransitionTime":"2026-03-09T13:22:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 09 13:22:19 crc kubenswrapper[4764]: I0309 13:22:19.585177 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 13:22:19 crc kubenswrapper[4764]: I0309 13:22:19.585436 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 13:22:19 crc kubenswrapper[4764]: I0309 13:22:19.585531 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 13:22:19 crc kubenswrapper[4764]: I0309 13:22:19.585606 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 09 13:22:19 crc kubenswrapper[4764]: I0309 13:22:19.585685 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:19Z","lastTransitionTime":"2026-03-09T13:22:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 09 13:22:19 crc kubenswrapper[4764]: I0309 13:22:19.688949 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 13:22:19 crc kubenswrapper[4764]: I0309 13:22:19.689003 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 13:22:19 crc kubenswrapper[4764]: I0309 13:22:19.689018 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 13:22:19 crc kubenswrapper[4764]: I0309 13:22:19.689038 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 09 13:22:19 crc kubenswrapper[4764]: I0309 13:22:19.689056 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:19Z","lastTransitionTime":"2026-03-09T13:22:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 09 13:22:19 crc kubenswrapper[4764]: I0309 13:22:19.792307 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 13:22:19 crc kubenswrapper[4764]: I0309 13:22:19.792380 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 13:22:19 crc kubenswrapper[4764]: I0309 13:22:19.792396 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 13:22:19 crc kubenswrapper[4764]: I0309 13:22:19.792421 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 09 13:22:19 crc kubenswrapper[4764]: I0309 13:22:19.792442 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:19Z","lastTransitionTime":"2026-03-09T13:22:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 09 13:22:19 crc kubenswrapper[4764]: I0309 13:22:19.894962 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 13:22:19 crc kubenswrapper[4764]: I0309 13:22:19.895015 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 13:22:19 crc kubenswrapper[4764]: I0309 13:22:19.895033 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 13:22:19 crc kubenswrapper[4764]: I0309 13:22:19.895059 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 09 13:22:19 crc kubenswrapper[4764]: I0309 13:22:19.895076 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:19Z","lastTransitionTime":"2026-03-09T13:22:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 09 13:22:19 crc kubenswrapper[4764]: I0309 13:22:19.998339 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 13:22:19 crc kubenswrapper[4764]: I0309 13:22:19.998680 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 13:22:19 crc kubenswrapper[4764]: I0309 13:22:19.998863 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 13:22:19 crc kubenswrapper[4764]: I0309 13:22:19.999004 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 09 13:22:19 crc kubenswrapper[4764]: I0309 13:22:19.999150 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:19Z","lastTransitionTime":"2026-03-09T13:22:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 09 13:22:20 crc kubenswrapper[4764]: I0309 13:22:20.101668 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 13:22:20 crc kubenswrapper[4764]: I0309 13:22:20.101696 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 13:22:20 crc kubenswrapper[4764]: I0309 13:22:20.101704 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 13:22:20 crc kubenswrapper[4764]: I0309 13:22:20.101716 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 09 13:22:20 crc kubenswrapper[4764]: I0309 13:22:20.101724 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:20Z","lastTransitionTime":"2026-03-09T13:22:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 09 13:22:20 crc kubenswrapper[4764]: I0309 13:22:20.205095 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 13:22:20 crc kubenswrapper[4764]: I0309 13:22:20.205183 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 13:22:20 crc kubenswrapper[4764]: I0309 13:22:20.205208 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 13:22:20 crc kubenswrapper[4764]: I0309 13:22:20.205242 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 09 13:22:20 crc kubenswrapper[4764]: I0309 13:22:20.205264 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:20Z","lastTransitionTime":"2026-03-09T13:22:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 09 13:22:20 crc kubenswrapper[4764]: I0309 13:22:20.307828 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 13:22:20 crc kubenswrapper[4764]: I0309 13:22:20.307868 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 13:22:20 crc kubenswrapper[4764]: I0309 13:22:20.307882 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 13:22:20 crc kubenswrapper[4764]: I0309 13:22:20.307898 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 09 13:22:20 crc kubenswrapper[4764]: I0309 13:22:20.307908 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:20Z","lastTransitionTime":"2026-03-09T13:22:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 09 13:22:20 crc kubenswrapper[4764]: I0309 13:22:20.409947 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 13:22:20 crc kubenswrapper[4764]: I0309 13:22:20.409997 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 13:22:20 crc kubenswrapper[4764]: I0309 13:22:20.410007 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 13:22:20 crc kubenswrapper[4764]: I0309 13:22:20.410020 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 09 13:22:20 crc kubenswrapper[4764]: I0309 13:22:20.410029 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:20Z","lastTransitionTime":"2026-03-09T13:22:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Mar 09 13:22:20 crc kubenswrapper[4764]: I0309 13:22:20.512235 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 13:22:20 crc kubenswrapper[4764]: I0309 13:22:20.512287 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 13:22:20 crc kubenswrapper[4764]: I0309 13:22:20.512303 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 13:22:20 crc kubenswrapper[4764]: I0309 13:22:20.512326 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 09 13:22:20 crc kubenswrapper[4764]: I0309 13:22:20.512341 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:20Z","lastTransitionTime":"2026-03-09T13:22:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 09 13:22:20 crc kubenswrapper[4764]: I0309 13:22:20.558912 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 09 13:22:20 crc kubenswrapper[4764]: I0309 13:22:20.558918 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 09 13:22:20 crc kubenswrapper[4764]: I0309 13:22:20.558937 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 09 13:22:20 crc kubenswrapper[4764]: E0309 13:22:20.559056 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 09 13:22:20 crc kubenswrapper[4764]: E0309 13:22:20.559340 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 09 13:22:20 crc kubenswrapper[4764]: E0309 13:22:20.559439 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 09 13:22:20 crc kubenswrapper[4764]: I0309 13:22:20.614369 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 13:22:20 crc kubenswrapper[4764]: I0309 13:22:20.614406 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 13:22:20 crc kubenswrapper[4764]: I0309 13:22:20.614420 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 13:22:20 crc kubenswrapper[4764]: I0309 13:22:20.614436 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 09 13:22:20 crc kubenswrapper[4764]: I0309 13:22:20.614447 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:20Z","lastTransitionTime":"2026-03-09T13:22:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Mar 09 13:22:20 crc kubenswrapper[4764]: I0309 13:22:20.716735 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 13:22:20 crc kubenswrapper[4764]: I0309 13:22:20.716770 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 13:22:20 crc kubenswrapper[4764]: I0309 13:22:20.716780 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 13:22:20 crc kubenswrapper[4764]: I0309 13:22:20.716796 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 09 13:22:20 crc kubenswrapper[4764]: I0309 13:22:20.716807 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:20Z","lastTransitionTime":"2026-03-09T13:22:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 09 13:22:20 crc kubenswrapper[4764]: I0309 13:22:20.820584 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 13:22:20 crc kubenswrapper[4764]: I0309 13:22:20.820624 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 13:22:20 crc kubenswrapper[4764]: I0309 13:22:20.820633 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 13:22:20 crc kubenswrapper[4764]: I0309 13:22:20.820663 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 09 13:22:20 crc kubenswrapper[4764]: I0309 13:22:20.820677 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:20Z","lastTransitionTime":"2026-03-09T13:22:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 09 13:22:20 crc kubenswrapper[4764]: I0309 13:22:20.922911 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 13:22:20 crc kubenswrapper[4764]: I0309 13:22:20.922964 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 13:22:20 crc kubenswrapper[4764]: I0309 13:22:20.922972 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 13:22:20 crc kubenswrapper[4764]: I0309 13:22:20.922988 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 09 13:22:20 crc kubenswrapper[4764]: I0309 13:22:20.922999 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:20Z","lastTransitionTime":"2026-03-09T13:22:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 09 13:22:21 crc kubenswrapper[4764]: I0309 13:22:21.025220 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 13:22:21 crc kubenswrapper[4764]: I0309 13:22:21.025258 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 13:22:21 crc kubenswrapper[4764]: I0309 13:22:21.025273 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 13:22:21 crc kubenswrapper[4764]: I0309 13:22:21.025288 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 09 13:22:21 crc kubenswrapper[4764]: I0309 13:22:21.025300 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:21Z","lastTransitionTime":"2026-03-09T13:22:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 09 13:22:21 crc kubenswrapper[4764]: I0309 13:22:21.128087 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 13:22:21 crc kubenswrapper[4764]: I0309 13:22:21.128132 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 13:22:21 crc kubenswrapper[4764]: I0309 13:22:21.128143 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 13:22:21 crc kubenswrapper[4764]: I0309 13:22:21.128158 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 09 13:22:21 crc kubenswrapper[4764]: I0309 13:22:21.128170 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:21Z","lastTransitionTime":"2026-03-09T13:22:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 09 13:22:21 crc kubenswrapper[4764]: I0309 13:22:21.230990 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 13:22:21 crc kubenswrapper[4764]: I0309 13:22:21.231048 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 13:22:21 crc kubenswrapper[4764]: I0309 13:22:21.231061 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 13:22:21 crc kubenswrapper[4764]: I0309 13:22:21.231079 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 09 13:22:21 crc kubenswrapper[4764]: I0309 13:22:21.231095 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:21Z","lastTransitionTime":"2026-03-09T13:22:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 09 13:22:21 crc kubenswrapper[4764]: I0309 13:22:21.334403 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 13:22:21 crc kubenswrapper[4764]: I0309 13:22:21.334447 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 13:22:21 crc kubenswrapper[4764]: I0309 13:22:21.334459 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 13:22:21 crc kubenswrapper[4764]: I0309 13:22:21.334472 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 09 13:22:21 crc kubenswrapper[4764]: I0309 13:22:21.334482 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:21Z","lastTransitionTime":"2026-03-09T13:22:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 09 13:22:21 crc kubenswrapper[4764]: I0309 13:22:21.436862 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 13:22:21 crc kubenswrapper[4764]: I0309 13:22:21.436901 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 13:22:21 crc kubenswrapper[4764]: I0309 13:22:21.436909 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 13:22:21 crc kubenswrapper[4764]: I0309 13:22:21.436923 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 09 13:22:21 crc kubenswrapper[4764]: I0309 13:22:21.436931 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:21Z","lastTransitionTime":"2026-03-09T13:22:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Mar 09 13:22:21 crc kubenswrapper[4764]: I0309 13:22:21.539284 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 13:22:21 crc kubenswrapper[4764]: I0309 13:22:21.539331 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 13:22:21 crc kubenswrapper[4764]: I0309 13:22:21.539342 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 13:22:21 crc kubenswrapper[4764]: I0309 13:22:21.539357 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 09 13:22:21 crc kubenswrapper[4764]: I0309 13:22:21.539377 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:21Z","lastTransitionTime":"2026-03-09T13:22:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 09 13:22:21 crc kubenswrapper[4764]: I0309 13:22:21.641798 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 13:22:21 crc kubenswrapper[4764]: I0309 13:22:21.641840 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 13:22:21 crc kubenswrapper[4764]: I0309 13:22:21.641850 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 13:22:21 crc kubenswrapper[4764]: I0309 13:22:21.641865 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 09 13:22:21 crc kubenswrapper[4764]: I0309 13:22:21.641875 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:21Z","lastTransitionTime":"2026-03-09T13:22:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 09 13:22:21 crc kubenswrapper[4764]: I0309 13:22:21.744435 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 13:22:21 crc kubenswrapper[4764]: I0309 13:22:21.744473 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 13:22:21 crc kubenswrapper[4764]: I0309 13:22:21.744482 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 13:22:21 crc kubenswrapper[4764]: I0309 13:22:21.744495 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 09 13:22:21 crc kubenswrapper[4764]: I0309 13:22:21.744503 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:21Z","lastTransitionTime":"2026-03-09T13:22:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 09 13:22:21 crc kubenswrapper[4764]: I0309 13:22:21.847782 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 13:22:21 crc kubenswrapper[4764]: I0309 13:22:21.847829 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 13:22:21 crc kubenswrapper[4764]: I0309 13:22:21.847840 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 13:22:21 crc kubenswrapper[4764]: I0309 13:22:21.847858 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 09 13:22:21 crc kubenswrapper[4764]: I0309 13:22:21.847870 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:21Z","lastTransitionTime":"2026-03-09T13:22:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 09 13:22:21 crc kubenswrapper[4764]: I0309 13:22:21.950099 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 13:22:21 crc kubenswrapper[4764]: I0309 13:22:21.950158 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 13:22:21 crc kubenswrapper[4764]: I0309 13:22:21.950172 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 13:22:21 crc kubenswrapper[4764]: I0309 13:22:21.950189 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 09 13:22:21 crc kubenswrapper[4764]: I0309 13:22:21.950202 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:21Z","lastTransitionTime":"2026-03-09T13:22:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 09 13:22:22 crc kubenswrapper[4764]: I0309 13:22:22.052382 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 13:22:22 crc kubenswrapper[4764]: I0309 13:22:22.052444 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 13:22:22 crc kubenswrapper[4764]: I0309 13:22:22.052455 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 13:22:22 crc kubenswrapper[4764]: I0309 13:22:22.052472 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 09 13:22:22 crc kubenswrapper[4764]: I0309 13:22:22.052482 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:22Z","lastTransitionTime":"2026-03-09T13:22:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 09 13:22:22 crc kubenswrapper[4764]: I0309 13:22:22.155209 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 13:22:22 crc kubenswrapper[4764]: I0309 13:22:22.155243 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 13:22:22 crc kubenswrapper[4764]: I0309 13:22:22.155250 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 13:22:22 crc kubenswrapper[4764]: I0309 13:22:22.155265 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 09 13:22:22 crc kubenswrapper[4764]: I0309 13:22:22.155302 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:22Z","lastTransitionTime":"2026-03-09T13:22:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 09 13:22:22 crc kubenswrapper[4764]: I0309 13:22:22.257495 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 13:22:22 crc kubenswrapper[4764]: I0309 13:22:22.257586 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 13:22:22 crc kubenswrapper[4764]: I0309 13:22:22.257608 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 13:22:22 crc kubenswrapper[4764]: I0309 13:22:22.257681 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 09 13:22:22 crc kubenswrapper[4764]: I0309 13:22:22.257705 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:22Z","lastTransitionTime":"2026-03-09T13:22:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Mar 09 13:22:22 crc kubenswrapper[4764]: I0309 13:22:22.283059 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 09 13:22:22 crc kubenswrapper[4764]: I0309 13:22:22.283220 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 09 13:22:22 crc kubenswrapper[4764]: I0309 13:22:22.283277 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 09 13:22:22 crc kubenswrapper[4764]: E0309 13:22:22.283334 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 13:22:30.283299938 +0000 UTC m=+105.533471856 (durationBeforeRetry 8s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 09 13:22:22 crc kubenswrapper[4764]: I0309 13:22:22.283401 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 09 13:22:22 crc kubenswrapper[4764]: E0309 13:22:22.283411 4764 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered
Mar 09 13:22:22 crc kubenswrapper[4764]: I0309 13:22:22.283466 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 09 13:22:22 crc kubenswrapper[4764]: E0309 13:22:22.283531 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-09 13:22:30.283495653 +0000 UTC m=+105.533667691 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered
Mar 09 13:22:22 crc kubenswrapper[4764]: E0309 13:22:22.283628 4764 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Mar 09 13:22:22 crc kubenswrapper[4764]: E0309 13:22:22.283701 4764 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Mar 09 13:22:22 crc kubenswrapper[4764]: E0309 13:22:22.283701 4764 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Mar 09 13:22:22 crc kubenswrapper[4764]: E0309 13:22:22.283723 4764 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Mar 09 13:22:22 crc kubenswrapper[4764]: E0309 13:22:22.283738 4764 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 09 13:22:22 crc kubenswrapper[4764]: E0309 13:22:22.283746 4764 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered
Mar 09 13:22:22 crc kubenswrapper[4764]: E0309 13:22:22.283810 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-09 13:22:30.283786651 +0000 UTC m=+105.533958569 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 09 13:22:22 crc kubenswrapper[4764]: E0309 13:22:22.283739 4764 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 09 13:22:22 crc kubenswrapper[4764]: E0309 13:22:22.283900 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-09 13:22:30.283888754 +0000 UTC m=+105.534060672 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 09 13:22:22 crc kubenswrapper[4764]: E0309 13:22:22.283922 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-09 13:22:30.283912214 +0000 UTC m=+105.534084132 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered
Mar 09 13:22:22 crc kubenswrapper[4764]: I0309 13:22:22.360663 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 13:22:22 crc kubenswrapper[4764]: I0309 13:22:22.360725 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 13:22:22 crc kubenswrapper[4764]: I0309 13:22:22.360745 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 13:22:22 crc kubenswrapper[4764]: I0309 13:22:22.360771 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 09 13:22:22 crc kubenswrapper[4764]: I0309 13:22:22.360786 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:22Z","lastTransitionTime":"2026-03-09T13:22:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 09 13:22:22 crc kubenswrapper[4764]: I0309 13:22:22.421610 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 13:22:22 crc kubenswrapper[4764]: I0309 13:22:22.421681 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 13:22:22 crc kubenswrapper[4764]: I0309 13:22:22.421690 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 13:22:22 crc kubenswrapper[4764]: I0309 13:22:22.421706 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 09 13:22:22 crc kubenswrapper[4764]: I0309 13:22:22.421716 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:22Z","lastTransitionTime":"2026-03-09T13:22:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:22 crc kubenswrapper[4764]: E0309 13:22:22.432892 4764 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:22:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:22:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:22Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:22:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:22:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:22Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"44d5d5b8-3cb1-4753-9ed2-c6ede7c42d06\\\",\\\"systemUUID\\\":\\\"470a86bd-e6aa-42c1-b220-b7b8c0289210\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:22 crc kubenswrapper[4764]: I0309 13:22:22.437288 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:22 crc kubenswrapper[4764]: I0309 13:22:22.437357 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:22 crc kubenswrapper[4764]: I0309 13:22:22.437372 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:22 crc kubenswrapper[4764]: I0309 13:22:22.437394 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:22 crc kubenswrapper[4764]: I0309 13:22:22.437408 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:22Z","lastTransitionTime":"2026-03-09T13:22:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:22 crc kubenswrapper[4764]: E0309 13:22:22.450726 4764 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:22:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:22:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:22Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:22:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:22:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:22Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"44d5d5b8-3cb1-4753-9ed2-c6ede7c42d06\\\",\\\"systemUUID\\\":\\\"470a86bd-e6aa-42c1-b220-b7b8c0289210\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:22 crc kubenswrapper[4764]: I0309 13:22:22.456466 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:22 crc kubenswrapper[4764]: I0309 13:22:22.456519 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:22 crc kubenswrapper[4764]: I0309 13:22:22.456532 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:22 crc kubenswrapper[4764]: I0309 13:22:22.456551 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:22 crc kubenswrapper[4764]: I0309 13:22:22.456566 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:22Z","lastTransitionTime":"2026-03-09T13:22:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:22 crc kubenswrapper[4764]: E0309 13:22:22.467952 4764 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:22:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:22:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:22Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:22:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:22:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:22Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"44d5d5b8-3cb1-4753-9ed2-c6ede7c42d06\\\",\\\"systemUUID\\\":\\\"470a86bd-e6aa-42c1-b220-b7b8c0289210\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:22 crc kubenswrapper[4764]: I0309 13:22:22.473487 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:22 crc kubenswrapper[4764]: I0309 13:22:22.473548 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:22 crc kubenswrapper[4764]: I0309 13:22:22.473559 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:22 crc kubenswrapper[4764]: I0309 13:22:22.473584 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:22 crc kubenswrapper[4764]: I0309 13:22:22.473595 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:22Z","lastTransitionTime":"2026-03-09T13:22:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:22 crc kubenswrapper[4764]: E0309 13:22:22.490518 4764 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:22:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:22:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:22Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:22:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:22:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:22Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"44d5d5b8-3cb1-4753-9ed2-c6ede7c42d06\\\",\\\"systemUUID\\\":\\\"470a86bd-e6aa-42c1-b220-b7b8c0289210\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:22 crc kubenswrapper[4764]: I0309 13:22:22.495601 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:22 crc kubenswrapper[4764]: I0309 13:22:22.495704 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:22 crc kubenswrapper[4764]: I0309 13:22:22.495718 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:22 crc kubenswrapper[4764]: I0309 13:22:22.495742 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:22 crc kubenswrapper[4764]: I0309 13:22:22.495757 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:22Z","lastTransitionTime":"2026-03-09T13:22:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:22 crc kubenswrapper[4764]: E0309 13:22:22.510214 4764 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:22:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:22:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:22Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:22:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:22:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:22Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"44d5d5b8-3cb1-4753-9ed2-c6ede7c42d06\\\",\\\"systemUUID\\\":\\\"470a86bd-e6aa-42c1-b220-b7b8c0289210\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Mar 09 13:22:22 crc kubenswrapper[4764]: E0309 13:22:22.510468 4764 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count"
Mar 09 13:22:22 crc kubenswrapper[4764]: I0309 13:22:22.512623 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 13:22:22 crc kubenswrapper[4764]: I0309 13:22:22.512702 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 13:22:22 crc kubenswrapper[4764]: I0309 13:22:22.512713 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 13:22:22 crc kubenswrapper[4764]: I0309 13:22:22.512732 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 09 13:22:22 crc kubenswrapper[4764]: I0309 13:22:22.512746 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:22Z","lastTransitionTime":"2026-03-09T13:22:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 09 13:22:22 crc kubenswrapper[4764]: I0309 13:22:22.558988 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 09 13:22:22 crc kubenswrapper[4764]: I0309 13:22:22.559136 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 09 13:22:22 crc kubenswrapper[4764]: I0309 13:22:22.559251 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 09 13:22:22 crc kubenswrapper[4764]: E0309 13:22:22.559177 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 09 13:22:22 crc kubenswrapper[4764]: E0309 13:22:22.559374 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 09 13:22:22 crc kubenswrapper[4764]: E0309 13:22:22.559482 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 09 13:22:22 crc kubenswrapper[4764]: I0309 13:22:22.615766 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 13:22:22 crc kubenswrapper[4764]: I0309 13:22:22.615819 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 13:22:22 crc kubenswrapper[4764]: I0309 13:22:22.615830 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 13:22:22 crc kubenswrapper[4764]: I0309 13:22:22.615849 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 09 13:22:22 crc kubenswrapper[4764]: I0309 13:22:22.615863 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:22Z","lastTransitionTime":"2026-03-09T13:22:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 09 13:22:22 crc kubenswrapper[4764]: I0309 13:22:22.718291 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 13:22:22 crc kubenswrapper[4764]: I0309 13:22:22.718354 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 13:22:22 crc kubenswrapper[4764]: I0309 13:22:22.718363 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 13:22:22 crc kubenswrapper[4764]: I0309 13:22:22.718388 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 09 13:22:22 crc kubenswrapper[4764]: I0309 13:22:22.718401 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:22Z","lastTransitionTime":"2026-03-09T13:22:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 09 13:22:22 crc kubenswrapper[4764]: I0309 13:22:22.820515 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 13:22:22 crc kubenswrapper[4764]: I0309 13:22:22.820547 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 13:22:22 crc kubenswrapper[4764]: I0309 13:22:22.820555 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 13:22:22 crc kubenswrapper[4764]: I0309 13:22:22.820571 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 09 13:22:22 crc kubenswrapper[4764]: I0309 13:22:22.820580 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:22Z","lastTransitionTime":"2026-03-09T13:22:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 09 13:22:22 crc kubenswrapper[4764]: I0309 13:22:22.924536 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 13:22:22 crc kubenswrapper[4764]: I0309 13:22:22.924591 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 13:22:22 crc kubenswrapper[4764]: I0309 13:22:22.924603 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 13:22:22 crc kubenswrapper[4764]: I0309 13:22:22.924621 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 09 13:22:22 crc kubenswrapper[4764]: I0309 13:22:22.924655 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:22Z","lastTransitionTime":"2026-03-09T13:22:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 09 13:22:23 crc kubenswrapper[4764]: I0309 13:22:23.027250 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 13:22:23 crc kubenswrapper[4764]: I0309 13:22:23.027513 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 13:22:23 crc kubenswrapper[4764]: I0309 13:22:23.027600 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 13:22:23 crc kubenswrapper[4764]: I0309 13:22:23.027794 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 09 13:22:23 crc kubenswrapper[4764]: I0309 13:22:23.027931 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:23Z","lastTransitionTime":"2026-03-09T13:22:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 09 13:22:23 crc kubenswrapper[4764]: I0309 13:22:23.130783 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 13:22:23 crc kubenswrapper[4764]: I0309 13:22:23.130822 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 13:22:23 crc kubenswrapper[4764]: I0309 13:22:23.130834 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 13:22:23 crc kubenswrapper[4764]: I0309 13:22:23.130849 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 09 13:22:23 crc kubenswrapper[4764]: I0309 13:22:23.130859 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:23Z","lastTransitionTime":"2026-03-09T13:22:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 09 13:22:23 crc kubenswrapper[4764]: I0309 13:22:23.233402 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 13:22:23 crc kubenswrapper[4764]: I0309 13:22:23.233456 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 13:22:23 crc kubenswrapper[4764]: I0309 13:22:23.233470 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 13:22:23 crc kubenswrapper[4764]: I0309 13:22:23.233487 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 09 13:22:23 crc kubenswrapper[4764]: I0309 13:22:23.233498 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:23Z","lastTransitionTime":"2026-03-09T13:22:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 09 13:22:23 crc kubenswrapper[4764]: I0309 13:22:23.336421 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 13:22:23 crc kubenswrapper[4764]: I0309 13:22:23.336476 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 13:22:23 crc kubenswrapper[4764]: I0309 13:22:23.336501 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 13:22:23 crc kubenswrapper[4764]: I0309 13:22:23.336526 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 09 13:22:23 crc kubenswrapper[4764]: I0309 13:22:23.336541 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:23Z","lastTransitionTime":"2026-03-09T13:22:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 09 13:22:23 crc kubenswrapper[4764]: I0309 13:22:23.439572 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 13:22:23 crc kubenswrapper[4764]: I0309 13:22:23.439612 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 13:22:23 crc kubenswrapper[4764]: I0309 13:22:23.439622 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 13:22:23 crc kubenswrapper[4764]: I0309 13:22:23.439637 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 09 13:22:23 crc kubenswrapper[4764]: I0309 13:22:23.439664 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:23Z","lastTransitionTime":"2026-03-09T13:22:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 09 13:22:23 crc kubenswrapper[4764]: I0309 13:22:23.541221 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 13:22:23 crc kubenswrapper[4764]: I0309 13:22:23.541255 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 13:22:23 crc kubenswrapper[4764]: I0309 13:22:23.541264 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 13:22:23 crc kubenswrapper[4764]: I0309 13:22:23.541277 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 09 13:22:23 crc kubenswrapper[4764]: I0309 13:22:23.541285 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:23Z","lastTransitionTime":"2026-03-09T13:22:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 09 13:22:23 crc kubenswrapper[4764]: I0309 13:22:23.643199 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 13:22:23 crc kubenswrapper[4764]: I0309 13:22:23.643231 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 13:22:23 crc kubenswrapper[4764]: I0309 13:22:23.643265 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 13:22:23 crc kubenswrapper[4764]: I0309 13:22:23.643287 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 09 13:22:23 crc kubenswrapper[4764]: I0309 13:22:23.643297 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:23Z","lastTransitionTime":"2026-03-09T13:22:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 09 13:22:23 crc kubenswrapper[4764]: I0309 13:22:23.745266 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 13:22:23 crc kubenswrapper[4764]: I0309 13:22:23.745314 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 13:22:23 crc kubenswrapper[4764]: I0309 13:22:23.745328 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 13:22:23 crc kubenswrapper[4764]: I0309 13:22:23.745344 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 09 13:22:23 crc kubenswrapper[4764]: I0309 13:22:23.745356 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:23Z","lastTransitionTime":"2026-03-09T13:22:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 09 13:22:23 crc kubenswrapper[4764]: I0309 13:22:23.847088 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 13:22:23 crc kubenswrapper[4764]: I0309 13:22:23.847118 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 13:22:23 crc kubenswrapper[4764]: I0309 13:22:23.847126 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 13:22:23 crc kubenswrapper[4764]: I0309 13:22:23.847139 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 09 13:22:23 crc kubenswrapper[4764]: I0309 13:22:23.847147 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:23Z","lastTransitionTime":"2026-03-09T13:22:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 09 13:22:23 crc kubenswrapper[4764]: I0309 13:22:23.949438 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 13:22:23 crc kubenswrapper[4764]: I0309 13:22:23.949484 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 13:22:23 crc kubenswrapper[4764]: I0309 13:22:23.949497 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 13:22:23 crc kubenswrapper[4764]: I0309 13:22:23.949515 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 09 13:22:23 crc kubenswrapper[4764]: I0309 13:22:23.949529 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:23Z","lastTransitionTime":"2026-03-09T13:22:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 09 13:22:24 crc kubenswrapper[4764]: I0309 13:22:24.051404 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 13:22:24 crc kubenswrapper[4764]: I0309 13:22:24.051442 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 13:22:24 crc kubenswrapper[4764]: I0309 13:22:24.051457 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 13:22:24 crc kubenswrapper[4764]: I0309 13:22:24.051471 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 09 13:22:24 crc kubenswrapper[4764]: I0309 13:22:24.051480 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:24Z","lastTransitionTime":"2026-03-09T13:22:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 09 13:22:24 crc kubenswrapper[4764]: I0309 13:22:24.154032 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 13:22:24 crc kubenswrapper[4764]: I0309 13:22:24.154261 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 13:22:24 crc kubenswrapper[4764]: I0309 13:22:24.154268 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 13:22:24 crc kubenswrapper[4764]: I0309 13:22:24.154284 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 09 13:22:24 crc kubenswrapper[4764]: I0309 13:22:24.154302 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:24Z","lastTransitionTime":"2026-03-09T13:22:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 09 13:22:24 crc kubenswrapper[4764]: I0309 13:22:24.256233 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 13:22:24 crc kubenswrapper[4764]: I0309 13:22:24.256276 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 13:22:24 crc kubenswrapper[4764]: I0309 13:22:24.256286 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 13:22:24 crc kubenswrapper[4764]: I0309 13:22:24.256303 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 09 13:22:24 crc kubenswrapper[4764]: I0309 13:22:24.256314 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:24Z","lastTransitionTime":"2026-03-09T13:22:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 09 13:22:24 crc kubenswrapper[4764]: I0309 13:22:24.358408 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 13:22:24 crc kubenswrapper[4764]: I0309 13:22:24.358456 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 13:22:24 crc kubenswrapper[4764]: I0309 13:22:24.358465 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 13:22:24 crc kubenswrapper[4764]: I0309 13:22:24.358477 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 09 13:22:24 crc kubenswrapper[4764]: I0309 13:22:24.358487 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:24Z","lastTransitionTime":"2026-03-09T13:22:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 09 13:22:24 crc kubenswrapper[4764]: I0309 13:22:24.461100 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 13:22:24 crc kubenswrapper[4764]: I0309 13:22:24.461164 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 13:22:24 crc kubenswrapper[4764]: I0309 13:22:24.461176 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 13:22:24 crc kubenswrapper[4764]: I0309 13:22:24.461193 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 09 13:22:24 crc kubenswrapper[4764]: I0309 13:22:24.461204 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:24Z","lastTransitionTime":"2026-03-09T13:22:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 09 13:22:24 crc kubenswrapper[4764]: I0309 13:22:24.559551 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 09 13:22:24 crc kubenswrapper[4764]: I0309 13:22:24.559598 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 09 13:22:24 crc kubenswrapper[4764]: I0309 13:22:24.559626 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 09 13:22:24 crc kubenswrapper[4764]: E0309 13:22:24.559765 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 09 13:22:24 crc kubenswrapper[4764]: E0309 13:22:24.559863 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 09 13:22:24 crc kubenswrapper[4764]: E0309 13:22:24.560020 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 09 13:22:24 crc kubenswrapper[4764]: I0309 13:22:24.563676 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 13:22:24 crc kubenswrapper[4764]: I0309 13:22:24.563708 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 13:22:24 crc kubenswrapper[4764]: I0309 13:22:24.563718 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 13:22:24 crc kubenswrapper[4764]: I0309 13:22:24.563733 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 09 13:22:24 crc kubenswrapper[4764]: I0309 13:22:24.563743 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:24Z","lastTransitionTime":"2026-03-09T13:22:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 09 13:22:24 crc kubenswrapper[4764]: I0309 13:22:24.665944 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 13:22:24 crc kubenswrapper[4764]: I0309 13:22:24.665992 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 13:22:24 crc kubenswrapper[4764]: I0309 13:22:24.666003 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 13:22:24 crc kubenswrapper[4764]: I0309 13:22:24.666020 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 09 13:22:24 crc kubenswrapper[4764]: I0309 13:22:24.666030 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:24Z","lastTransitionTime":"2026-03-09T13:22:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 09 13:22:24 crc kubenswrapper[4764]: I0309 13:22:24.768282 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 13:22:24 crc kubenswrapper[4764]: I0309 13:22:24.768322 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 13:22:24 crc kubenswrapper[4764]: I0309 13:22:24.768337 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 13:22:24 crc kubenswrapper[4764]: I0309 13:22:24.768353 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 09 13:22:24 crc kubenswrapper[4764]: I0309 13:22:24.768363 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:24Z","lastTransitionTime":"2026-03-09T13:22:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 09 13:22:24 crc kubenswrapper[4764]: I0309 13:22:24.870235 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 13:22:24 crc kubenswrapper[4764]: I0309 13:22:24.870285 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 13:22:24 crc kubenswrapper[4764]: I0309 13:22:24.870299 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 13:22:24 crc kubenswrapper[4764]: I0309 13:22:24.870316 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 09 13:22:24 crc kubenswrapper[4764]: I0309 13:22:24.870328 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:24Z","lastTransitionTime":"2026-03-09T13:22:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:24 crc kubenswrapper[4764]: I0309 13:22:24.972841 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:24 crc kubenswrapper[4764]: I0309 13:22:24.972896 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:24 crc kubenswrapper[4764]: I0309 13:22:24.972907 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:24 crc kubenswrapper[4764]: I0309 13:22:24.972926 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:24 crc kubenswrapper[4764]: I0309 13:22:24.972937 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:24Z","lastTransitionTime":"2026-03-09T13:22:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:25 crc kubenswrapper[4764]: I0309 13:22:25.075683 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:25 crc kubenswrapper[4764]: I0309 13:22:25.075764 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:25 crc kubenswrapper[4764]: I0309 13:22:25.075777 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:25 crc kubenswrapper[4764]: I0309 13:22:25.075817 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:25 crc kubenswrapper[4764]: I0309 13:22:25.075829 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:25Z","lastTransitionTime":"2026-03-09T13:22:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:25 crc kubenswrapper[4764]: I0309 13:22:25.178682 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:25 crc kubenswrapper[4764]: I0309 13:22:25.178732 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:25 crc kubenswrapper[4764]: I0309 13:22:25.178743 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:25 crc kubenswrapper[4764]: I0309 13:22:25.178760 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:25 crc kubenswrapper[4764]: I0309 13:22:25.178773 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:25Z","lastTransitionTime":"2026-03-09T13:22:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:25 crc kubenswrapper[4764]: I0309 13:22:25.280618 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:25 crc kubenswrapper[4764]: I0309 13:22:25.280694 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:25 crc kubenswrapper[4764]: I0309 13:22:25.280709 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:25 crc kubenswrapper[4764]: I0309 13:22:25.280726 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:25 crc kubenswrapper[4764]: I0309 13:22:25.280736 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:25Z","lastTransitionTime":"2026-03-09T13:22:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:25 crc kubenswrapper[4764]: I0309 13:22:25.383030 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:25 crc kubenswrapper[4764]: I0309 13:22:25.383066 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:25 crc kubenswrapper[4764]: I0309 13:22:25.383073 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:25 crc kubenswrapper[4764]: I0309 13:22:25.383088 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:25 crc kubenswrapper[4764]: I0309 13:22:25.383099 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:25Z","lastTransitionTime":"2026-03-09T13:22:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:25 crc kubenswrapper[4764]: I0309 13:22:25.484931 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:25 crc kubenswrapper[4764]: I0309 13:22:25.484972 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:25 crc kubenswrapper[4764]: I0309 13:22:25.484986 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:25 crc kubenswrapper[4764]: I0309 13:22:25.485003 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:25 crc kubenswrapper[4764]: I0309 13:22:25.485015 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:25Z","lastTransitionTime":"2026-03-09T13:22:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:25 crc kubenswrapper[4764]: I0309 13:22:25.569590 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7c21ab2-1820-47de-a61d-71d81928564a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://482469a933d0aba722e6a341f6934669e0d84998bd2c99d6962a17548ab80ce7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3571aa0b654db21a79a11d6c58af9f858d2493283d9eff7a84f3404c34dcf0b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://4aae266a5256be6c65a7637dfaed552087689f317f7b61dcb52f4aa6d4d4d7e6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://033930947dc527026e2797f16a6dd9181c8c7d8ec88b09031a132f55b953c356\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://033930947dc527026e2797f16a6dd9181c8c7d8ec88b09031a132f55b953c356\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T13:21:44Z\\\",\\\"message\\\":\\\"W0309 13:21:43.803612 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0309 13:21:43.803959 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773062503 cert, and key in /tmp/serving-cert-1644690243/serving-signer.crt, /tmp/serving-cert-1644690243/serving-signer.key\\\\nI0309 13:21:44.168505 1 observer_polling.go:159] Starting file observer\\\\nW0309 13:21:44.177360 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0309 13:21:44.177910 1 builder.go:304] check-endpoints 
version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0309 13:21:44.178828 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1644690243/tls.crt::/tmp/serving-cert-1644690243/tls.key\\\\\\\"\\\\nF0309 13:21:44.724786 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:43Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3a34e47ede9b05cec931ca6337a98381bcceeb735352702d19128e8dfe8476f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:48Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a38de1b3854f33b5728435571b8f0df92aeaae7d8e8a235ed58044fa2a
9e09c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a38de1b3854f33b5728435571b8f0df92aeaae7d8e8a235ed58044fa2a9e09c3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:20:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:25 crc kubenswrapper[4764]: I0309 13:22:25.578912 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:25 crc kubenswrapper[4764]: I0309 13:22:25.587477 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:25 crc kubenswrapper[4764]: I0309 13:22:25.587573 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:25 crc 
kubenswrapper[4764]: I0309 13:22:25.587616 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:25 crc kubenswrapper[4764]: I0309 13:22:25.587629 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:25 crc kubenswrapper[4764]: I0309 13:22:25.587670 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:25 crc kubenswrapper[4764]: I0309 13:22:25.587680 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:25Z","lastTransitionTime":"2026-03-09T13:22:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:22:25 crc kubenswrapper[4764]: I0309 13:22:25.598375 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:25 crc kubenswrapper[4764]: I0309 13:22:25.609590 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:25 crc kubenswrapper[4764]: I0309 13:22:25.619981 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:25 crc kubenswrapper[4764]: I0309 13:22:25.636148 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:25 crc kubenswrapper[4764]: I0309 13:22:25.689394 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:25 crc kubenswrapper[4764]: I0309 13:22:25.689426 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:25 crc kubenswrapper[4764]: I0309 13:22:25.689435 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:25 crc kubenswrapper[4764]: I0309 13:22:25.689447 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:25 crc kubenswrapper[4764]: I0309 13:22:25.689456 4764 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:25Z","lastTransitionTime":"2026-03-09T13:22:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:22:25 crc kubenswrapper[4764]: I0309 13:22:25.791584 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:25 crc kubenswrapper[4764]: I0309 13:22:25.791633 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:25 crc kubenswrapper[4764]: I0309 13:22:25.791666 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:25 crc kubenswrapper[4764]: I0309 13:22:25.791688 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:25 crc kubenswrapper[4764]: I0309 13:22:25.791703 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:25Z","lastTransitionTime":"2026-03-09T13:22:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:25 crc kubenswrapper[4764]: I0309 13:22:25.893629 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:25 crc kubenswrapper[4764]: I0309 13:22:25.893692 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:25 crc kubenswrapper[4764]: I0309 13:22:25.893703 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:25 crc kubenswrapper[4764]: I0309 13:22:25.893718 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:25 crc kubenswrapper[4764]: I0309 13:22:25.893727 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:25Z","lastTransitionTime":"2026-03-09T13:22:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:25 crc kubenswrapper[4764]: I0309 13:22:25.996287 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:25 crc kubenswrapper[4764]: I0309 13:22:25.996325 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:25 crc kubenswrapper[4764]: I0309 13:22:25.996334 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:25 crc kubenswrapper[4764]: I0309 13:22:25.996349 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:25 crc kubenswrapper[4764]: I0309 13:22:25.996360 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:25Z","lastTransitionTime":"2026-03-09T13:22:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:26 crc kubenswrapper[4764]: I0309 13:22:26.098962 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:26 crc kubenswrapper[4764]: I0309 13:22:26.099013 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:26 crc kubenswrapper[4764]: I0309 13:22:26.099025 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:26 crc kubenswrapper[4764]: I0309 13:22:26.099044 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:26 crc kubenswrapper[4764]: I0309 13:22:26.099056 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:26Z","lastTransitionTime":"2026-03-09T13:22:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:26 crc kubenswrapper[4764]: I0309 13:22:26.201514 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:26 crc kubenswrapper[4764]: I0309 13:22:26.201558 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:26 crc kubenswrapper[4764]: I0309 13:22:26.201569 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:26 crc kubenswrapper[4764]: I0309 13:22:26.201584 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:26 crc kubenswrapper[4764]: I0309 13:22:26.201595 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:26Z","lastTransitionTime":"2026-03-09T13:22:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:26 crc kubenswrapper[4764]: I0309 13:22:26.303861 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:26 crc kubenswrapper[4764]: I0309 13:22:26.303895 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:26 crc kubenswrapper[4764]: I0309 13:22:26.303904 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:26 crc kubenswrapper[4764]: I0309 13:22:26.303918 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:26 crc kubenswrapper[4764]: I0309 13:22:26.303929 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:26Z","lastTransitionTime":"2026-03-09T13:22:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:26 crc kubenswrapper[4764]: I0309 13:22:26.406211 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:26 crc kubenswrapper[4764]: I0309 13:22:26.406252 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:26 crc kubenswrapper[4764]: I0309 13:22:26.406261 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:26 crc kubenswrapper[4764]: I0309 13:22:26.406305 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:26 crc kubenswrapper[4764]: I0309 13:22:26.406315 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:26Z","lastTransitionTime":"2026-03-09T13:22:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:26 crc kubenswrapper[4764]: I0309 13:22:26.508121 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:26 crc kubenswrapper[4764]: I0309 13:22:26.508164 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:26 crc kubenswrapper[4764]: I0309 13:22:26.508178 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:26 crc kubenswrapper[4764]: I0309 13:22:26.508198 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:26 crc kubenswrapper[4764]: I0309 13:22:26.508210 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:26Z","lastTransitionTime":"2026-03-09T13:22:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:22:26 crc kubenswrapper[4764]: I0309 13:22:26.559611 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 13:22:26 crc kubenswrapper[4764]: I0309 13:22:26.559702 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 13:22:26 crc kubenswrapper[4764]: I0309 13:22:26.559622 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 13:22:26 crc kubenswrapper[4764]: E0309 13:22:26.559754 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 13:22:26 crc kubenswrapper[4764]: E0309 13:22:26.559835 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 13:22:26 crc kubenswrapper[4764]: E0309 13:22:26.560000 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 13:22:26 crc kubenswrapper[4764]: I0309 13:22:26.610370 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:26 crc kubenswrapper[4764]: I0309 13:22:26.610615 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:26 crc kubenswrapper[4764]: I0309 13:22:26.610631 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:26 crc kubenswrapper[4764]: I0309 13:22:26.610663 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:26 crc kubenswrapper[4764]: I0309 13:22:26.610674 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:26Z","lastTransitionTime":"2026-03-09T13:22:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:26 crc kubenswrapper[4764]: I0309 13:22:26.713243 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:26 crc kubenswrapper[4764]: I0309 13:22:26.713290 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:26 crc kubenswrapper[4764]: I0309 13:22:26.713300 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:26 crc kubenswrapper[4764]: I0309 13:22:26.713314 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:26 crc kubenswrapper[4764]: I0309 13:22:26.713324 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:26Z","lastTransitionTime":"2026-03-09T13:22:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:26 crc kubenswrapper[4764]: I0309 13:22:26.815776 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:26 crc kubenswrapper[4764]: I0309 13:22:26.815813 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:26 crc kubenswrapper[4764]: I0309 13:22:26.815820 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:26 crc kubenswrapper[4764]: I0309 13:22:26.815833 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:26 crc kubenswrapper[4764]: I0309 13:22:26.815842 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:26Z","lastTransitionTime":"2026-03-09T13:22:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:26 crc kubenswrapper[4764]: I0309 13:22:26.918137 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:26 crc kubenswrapper[4764]: I0309 13:22:26.918193 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:26 crc kubenswrapper[4764]: I0309 13:22:26.918202 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:26 crc kubenswrapper[4764]: I0309 13:22:26.918216 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:26 crc kubenswrapper[4764]: I0309 13:22:26.918225 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:26Z","lastTransitionTime":"2026-03-09T13:22:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:27 crc kubenswrapper[4764]: I0309 13:22:27.020702 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:27 crc kubenswrapper[4764]: I0309 13:22:27.020740 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:27 crc kubenswrapper[4764]: I0309 13:22:27.020751 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:27 crc kubenswrapper[4764]: I0309 13:22:27.020766 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:27 crc kubenswrapper[4764]: I0309 13:22:27.020777 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:27Z","lastTransitionTime":"2026-03-09T13:22:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:27 crc kubenswrapper[4764]: I0309 13:22:27.123149 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:27 crc kubenswrapper[4764]: I0309 13:22:27.123191 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:27 crc kubenswrapper[4764]: I0309 13:22:27.123208 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:27 crc kubenswrapper[4764]: I0309 13:22:27.123224 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:27 crc kubenswrapper[4764]: I0309 13:22:27.123234 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:27Z","lastTransitionTime":"2026-03-09T13:22:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:27 crc kubenswrapper[4764]: I0309 13:22:27.225832 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:27 crc kubenswrapper[4764]: I0309 13:22:27.225865 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:27 crc kubenswrapper[4764]: I0309 13:22:27.225875 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:27 crc kubenswrapper[4764]: I0309 13:22:27.225889 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:27 crc kubenswrapper[4764]: I0309 13:22:27.225900 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:27Z","lastTransitionTime":"2026-03-09T13:22:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:27 crc kubenswrapper[4764]: I0309 13:22:27.328130 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:27 crc kubenswrapper[4764]: I0309 13:22:27.328175 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:27 crc kubenswrapper[4764]: I0309 13:22:27.328191 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:27 crc kubenswrapper[4764]: I0309 13:22:27.328210 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:27 crc kubenswrapper[4764]: I0309 13:22:27.328224 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:27Z","lastTransitionTime":"2026-03-09T13:22:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:27 crc kubenswrapper[4764]: I0309 13:22:27.429999 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:27 crc kubenswrapper[4764]: I0309 13:22:27.430035 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:27 crc kubenswrapper[4764]: I0309 13:22:27.430045 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:27 crc kubenswrapper[4764]: I0309 13:22:27.430059 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:27 crc kubenswrapper[4764]: I0309 13:22:27.430069 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:27Z","lastTransitionTime":"2026-03-09T13:22:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:27 crc kubenswrapper[4764]: I0309 13:22:27.532519 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:27 crc kubenswrapper[4764]: I0309 13:22:27.532599 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:27 crc kubenswrapper[4764]: I0309 13:22:27.532613 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:27 crc kubenswrapper[4764]: I0309 13:22:27.532630 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:27 crc kubenswrapper[4764]: I0309 13:22:27.532690 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:27Z","lastTransitionTime":"2026-03-09T13:22:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:27 crc kubenswrapper[4764]: I0309 13:22:27.635515 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:27 crc kubenswrapper[4764]: I0309 13:22:27.635569 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:27 crc kubenswrapper[4764]: I0309 13:22:27.635582 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:27 crc kubenswrapper[4764]: I0309 13:22:27.635607 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:27 crc kubenswrapper[4764]: I0309 13:22:27.635622 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:27Z","lastTransitionTime":"2026-03-09T13:22:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:22:27 crc kubenswrapper[4764]: I0309 13:22:27.656619 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-r5bnx"] Mar 09 13:22:27 crc kubenswrapper[4764]: I0309 13:22:27.656998 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-r5bnx" Mar 09 13:22:27 crc kubenswrapper[4764]: I0309 13:22:27.659632 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Mar 09 13:22:27 crc kubenswrapper[4764]: I0309 13:22:27.659761 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Mar 09 13:22:27 crc kubenswrapper[4764]: I0309 13:22:27.660030 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Mar 09 13:22:27 crc kubenswrapper[4764]: I0309 13:22:27.665496 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:27 crc kubenswrapper[4764]: I0309 13:22:27.674871 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:27 crc kubenswrapper[4764]: I0309 13:22:27.686555 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7c21ab2-1820-47de-a61d-71d81928564a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://482469a933d0aba722e6a341f6934669e0d84998bd2c99d6962a17548ab80ce7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3571aa0b654db21a79a11d6c58af9f858d2493283d9eff7a84f3404c34dcf0b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4aae266a5256be6c65a7637dfaed552087689f317f7b61dcb52f4aa6d4d4d7e6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://033930947dc527026e2797f16a6dd9181c8c7d8ec88b09031a132f55b953c356\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://033930947dc527026e2797f16a6dd9181c8c7d8ec88b09031a132f55b953c356\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T13:21:44Z\\\",\\\"message\\\":\\\"W0309 13:21:43.803612 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0309 13:21:43.803959 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773062503 cert, and key in /tmp/serving-cert-1644690243/serving-signer.crt, /tmp/serving-cert-1644690243/serving-signer.key\\\\nI0309 13:21:44.168505 1 observer_polling.go:159] Starting file observer\\\\nW0309 13:21:44.177360 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0309 13:21:44.177910 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0309 13:21:44.178828 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1644690243/tls.crt::/tmp/serving-cert-1644690243/tls.key\\\\\\\"\\\\nF0309 13:21:44.724786 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:43Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3a34e47ede9b05cec931ca6337a98381bcceeb735352702d19128e8dfe8476f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:48Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a38de1b3854f33b5728435571b8f0df92aeaae7d8e8a235ed58044fa2a9e09c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a38de1b3854f33b5728435571b8f0df92aeaae7d8e8a235ed58044fa2a9e09c3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\
\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:20:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:27 crc kubenswrapper[4764]: I0309 13:22:27.696346 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:27 crc kubenswrapper[4764]: I0309 13:22:27.707608 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:27 crc kubenswrapper[4764]: I0309 13:22:27.713450 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-r5bnx" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7fede188-66d9-4cb1-af19-c94afe7fbcde\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:27Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pjp5c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:27Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-r5bnx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:27 crc kubenswrapper[4764]: I0309 13:22:27.720586 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:27 crc kubenswrapper[4764]: I0309 13:22:27.727263 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:27 crc kubenswrapper[4764]: I0309 13:22:27.732469 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/7fede188-66d9-4cb1-af19-c94afe7fbcde-hosts-file\") pod \"node-resolver-r5bnx\" (UID: \"7fede188-66d9-4cb1-af19-c94afe7fbcde\") " pod="openshift-dns/node-resolver-r5bnx" Mar 09 13:22:27 crc kubenswrapper[4764]: I0309 13:22:27.732538 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pjp5c\" (UniqueName: 
\"kubernetes.io/projected/7fede188-66d9-4cb1-af19-c94afe7fbcde-kube-api-access-pjp5c\") pod \"node-resolver-r5bnx\" (UID: \"7fede188-66d9-4cb1-af19-c94afe7fbcde\") " pod="openshift-dns/node-resolver-r5bnx" Mar 09 13:22:27 crc kubenswrapper[4764]: I0309 13:22:27.737830 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:27 crc kubenswrapper[4764]: I0309 13:22:27.737873 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:27 crc kubenswrapper[4764]: I0309 13:22:27.737883 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:27 crc kubenswrapper[4764]: I0309 13:22:27.737897 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:27 crc kubenswrapper[4764]: I0309 13:22:27.737907 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:27Z","lastTransitionTime":"2026-03-09T13:22:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:27 crc kubenswrapper[4764]: I0309 13:22:27.833722 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pjp5c\" (UniqueName: \"kubernetes.io/projected/7fede188-66d9-4cb1-af19-c94afe7fbcde-kube-api-access-pjp5c\") pod \"node-resolver-r5bnx\" (UID: \"7fede188-66d9-4cb1-af19-c94afe7fbcde\") " pod="openshift-dns/node-resolver-r5bnx" Mar 09 13:22:27 crc kubenswrapper[4764]: I0309 13:22:27.833801 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/7fede188-66d9-4cb1-af19-c94afe7fbcde-hosts-file\") pod \"node-resolver-r5bnx\" (UID: \"7fede188-66d9-4cb1-af19-c94afe7fbcde\") " pod="openshift-dns/node-resolver-r5bnx" Mar 09 13:22:27 crc kubenswrapper[4764]: I0309 13:22:27.833911 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/7fede188-66d9-4cb1-af19-c94afe7fbcde-hosts-file\") pod \"node-resolver-r5bnx\" (UID: \"7fede188-66d9-4cb1-af19-c94afe7fbcde\") " pod="openshift-dns/node-resolver-r5bnx" Mar 09 13:22:27 crc kubenswrapper[4764]: I0309 13:22:27.840942 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:27 crc kubenswrapper[4764]: I0309 13:22:27.840983 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:27 crc kubenswrapper[4764]: I0309 13:22:27.840992 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:27 crc kubenswrapper[4764]: I0309 13:22:27.841006 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:27 crc kubenswrapper[4764]: I0309 13:22:27.841015 4764 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:27Z","lastTransitionTime":"2026-03-09T13:22:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:22:27 crc kubenswrapper[4764]: I0309 13:22:27.851097 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pjp5c\" (UniqueName: \"kubernetes.io/projected/7fede188-66d9-4cb1-af19-c94afe7fbcde-kube-api-access-pjp5c\") pod \"node-resolver-r5bnx\" (UID: \"7fede188-66d9-4cb1-af19-c94afe7fbcde\") " pod="openshift-dns/node-resolver-r5bnx" Mar 09 13:22:27 crc kubenswrapper[4764]: I0309 13:22:27.943019 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:27 crc kubenswrapper[4764]: I0309 13:22:27.943108 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:27 crc kubenswrapper[4764]: I0309 13:22:27.943126 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:27 crc kubenswrapper[4764]: I0309 13:22:27.943151 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:27 crc kubenswrapper[4764]: I0309 13:22:27.943198 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:27Z","lastTransitionTime":"2026-03-09T13:22:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:27 crc kubenswrapper[4764]: I0309 13:22:27.969125 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-r5bnx" Mar 09 13:22:27 crc kubenswrapper[4764]: E0309 13:22:27.984877 4764 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 09 13:22:27 crc kubenswrapper[4764]: container &Container{Name:dns-node-resolver,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,Command:[/bin/bash -c #!/bin/bash Mar 09 13:22:27 crc kubenswrapper[4764]: set -uo pipefail Mar 09 13:22:27 crc kubenswrapper[4764]: Mar 09 13:22:27 crc kubenswrapper[4764]: trap 'jobs -p | xargs kill || true; wait; exit 0' TERM Mar 09 13:22:27 crc kubenswrapper[4764]: Mar 09 13:22:27 crc kubenswrapper[4764]: OPENSHIFT_MARKER="openshift-generated-node-resolver" Mar 09 13:22:27 crc kubenswrapper[4764]: HOSTS_FILE="/etc/hosts" Mar 09 13:22:27 crc kubenswrapper[4764]: TEMP_FILE="/etc/hosts.tmp" Mar 09 13:22:27 crc kubenswrapper[4764]: Mar 09 13:22:27 crc kubenswrapper[4764]: IFS=', ' read -r -a services <<< "${SERVICES}" Mar 09 13:22:27 crc kubenswrapper[4764]: Mar 09 13:22:27 crc kubenswrapper[4764]: # Make a temporary file with the old hosts file's attributes. Mar 09 13:22:27 crc kubenswrapper[4764]: if ! cp -f --attributes-only "${HOSTS_FILE}" "${TEMP_FILE}"; then Mar 09 13:22:27 crc kubenswrapper[4764]: echo "Failed to preserve hosts file. Exiting." Mar 09 13:22:27 crc kubenswrapper[4764]: exit 1 Mar 09 13:22:27 crc kubenswrapper[4764]: fi Mar 09 13:22:27 crc kubenswrapper[4764]: Mar 09 13:22:27 crc kubenswrapper[4764]: while true; do Mar 09 13:22:27 crc kubenswrapper[4764]: declare -A svc_ips Mar 09 13:22:27 crc kubenswrapper[4764]: for svc in "${services[@]}"; do Mar 09 13:22:27 crc kubenswrapper[4764]: # Fetch service IP from cluster dns if present. 
We make several tries Mar 09 13:22:27 crc kubenswrapper[4764]: # to do it: IPv4, IPv6, IPv4 over TCP and IPv6 over TCP. The two last ones Mar 09 13:22:27 crc kubenswrapper[4764]: # are for deployments with Kuryr on older OpenStack (OSP13) - those do not Mar 09 13:22:27 crc kubenswrapper[4764]: # support UDP loadbalancers and require reaching DNS through TCP. Mar 09 13:22:27 crc kubenswrapper[4764]: cmds=('dig -t A @"${NAMESERVER}" +short "${svc}.${CLUSTER_DOMAIN}"|grep -v "^;"' Mar 09 13:22:27 crc kubenswrapper[4764]: 'dig -t AAAA @"${NAMESERVER}" +short "${svc}.${CLUSTER_DOMAIN}"|grep -v "^;"' Mar 09 13:22:27 crc kubenswrapper[4764]: 'dig -t A +tcp +retry=0 @"${NAMESERVER}" +short "${svc}.${CLUSTER_DOMAIN}"|grep -v "^;"' Mar 09 13:22:27 crc kubenswrapper[4764]: 'dig -t AAAA +tcp +retry=0 @"${NAMESERVER}" +short "${svc}.${CLUSTER_DOMAIN}"|grep -v "^;"') Mar 09 13:22:27 crc kubenswrapper[4764]: for i in ${!cmds[*]} Mar 09 13:22:27 crc kubenswrapper[4764]: do Mar 09 13:22:27 crc kubenswrapper[4764]: ips=($(eval "${cmds[i]}")) Mar 09 13:22:27 crc kubenswrapper[4764]: if [[ "$?" -eq 0 && "${#ips[@]}" -ne 0 ]]; then Mar 09 13:22:27 crc kubenswrapper[4764]: svc_ips["${svc}"]="${ips[@]}" Mar 09 13:22:27 crc kubenswrapper[4764]: break Mar 09 13:22:27 crc kubenswrapper[4764]: fi Mar 09 13:22:27 crc kubenswrapper[4764]: done Mar 09 13:22:27 crc kubenswrapper[4764]: done Mar 09 13:22:27 crc kubenswrapper[4764]: Mar 09 13:22:27 crc kubenswrapper[4764]: # Update /etc/hosts only if we get valid service IPs Mar 09 13:22:27 crc kubenswrapper[4764]: # We will not update /etc/hosts when there is coredns service outage or api unavailability Mar 09 13:22:27 crc kubenswrapper[4764]: # Stale entries could exist in /etc/hosts if the service is deleted Mar 09 13:22:27 crc kubenswrapper[4764]: if [[ -n "${svc_ips[*]-}" ]]; then Mar 09 13:22:27 crc kubenswrapper[4764]: # Build a new hosts file from /etc/hosts with our custom entries filtered out Mar 09 13:22:27 crc kubenswrapper[4764]: if ! 
sed --silent "/# ${OPENSHIFT_MARKER}/d; w ${TEMP_FILE}" "${HOSTS_FILE}"; then Mar 09 13:22:27 crc kubenswrapper[4764]: # Only continue rebuilding the hosts entries if its original content is preserved Mar 09 13:22:27 crc kubenswrapper[4764]: sleep 60 & wait Mar 09 13:22:27 crc kubenswrapper[4764]: continue Mar 09 13:22:27 crc kubenswrapper[4764]: fi Mar 09 13:22:27 crc kubenswrapper[4764]: Mar 09 13:22:27 crc kubenswrapper[4764]: # Append resolver entries for services Mar 09 13:22:27 crc kubenswrapper[4764]: rc=0 Mar 09 13:22:27 crc kubenswrapper[4764]: for svc in "${!svc_ips[@]}"; do Mar 09 13:22:27 crc kubenswrapper[4764]: for ip in ${svc_ips[${svc}]}; do Mar 09 13:22:27 crc kubenswrapper[4764]: echo "${ip} ${svc} ${svc}.${CLUSTER_DOMAIN} # ${OPENSHIFT_MARKER}" >> "${TEMP_FILE}" || rc=$? Mar 09 13:22:27 crc kubenswrapper[4764]: done Mar 09 13:22:27 crc kubenswrapper[4764]: done Mar 09 13:22:27 crc kubenswrapper[4764]: if [[ $rc -ne 0 ]]; then Mar 09 13:22:27 crc kubenswrapper[4764]: sleep 60 & wait Mar 09 13:22:27 crc kubenswrapper[4764]: continue Mar 09 13:22:27 crc kubenswrapper[4764]: fi Mar 09 13:22:27 crc kubenswrapper[4764]: Mar 09 13:22:27 crc kubenswrapper[4764]: Mar 09 13:22:27 crc kubenswrapper[4764]: # TODO: Update /etc/hosts atomically to avoid any inconsistent behavior Mar 09 13:22:27 crc kubenswrapper[4764]: # Replace /etc/hosts with our modified version if needed Mar 09 13:22:27 crc kubenswrapper[4764]: cmp "${TEMP_FILE}" "${HOSTS_FILE}" || cp -f "${TEMP_FILE}" "${HOSTS_FILE}" Mar 09 13:22:27 crc kubenswrapper[4764]: # TEMP_FILE is not removed to avoid file create/delete and attributes copy churn Mar 09 13:22:27 crc kubenswrapper[4764]: fi Mar 09 13:22:27 crc kubenswrapper[4764]: sleep 60 & wait Mar 09 13:22:27 crc kubenswrapper[4764]: unset svc_ips Mar 09 13:22:27 crc kubenswrapper[4764]: done Mar 09 13:22:27 crc kubenswrapper[4764]: 
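The node-resolver script captured in the container spec above rebuilds /etc/hosts in three steps: strip previously generated lines by a trailing marker comment with `sed -n '/marker/d; w TEMP'`, append one tagged line per resolved IP, then overwrite the hosts file only when `cmp` reports a difference. A minimal standalone sketch of that rebuild step, using temp files in place of /etc/hosts and a hard-coded IP in place of the `dig` lookup (paths and the sample service entry are illustrative, not from a live cluster):

```shell
#!/usr/bin/env bash
# Sketch of the marker-based /etc/hosts rebuild from the node-resolver
# script in the log above. Temp files stand in for /etc/hosts; the IP is
# hard-coded where the real script resolves it with dig in a 60s loop.
set -u

OPENSHIFT_MARKER="openshift-generated-node-resolver"
HOSTS_FILE="$(mktemp)"   # stand-in for /etc/hosts
TEMP_FILE="$(mktemp)"

# Seed the stand-in hosts file: one static entry plus one stale managed entry.
cat > "${HOSTS_FILE}" <<EOF
127.0.0.1 localhost
10.0.0.9 old-svc old-svc.cluster.local # ${OPENSHIFT_MARKER}
EOF

# 1) With -n (silent), lines matching the marker are deleted before the `w`
#    command runs, so TEMP_FILE receives only the unmanaged lines.
sed --silent "/# ${OPENSHIFT_MARKER}/d; w ${TEMP_FILE}" "${HOSTS_FILE}"

# 2) Append one line per resolved IP, tagged with the marker so the next
#    iteration can strip and regenerate it.
svc="image-registry.openshift-image-registry.svc"
ip="10.217.4.10"   # in the real script: output of dig A/AAAA, UDP then TCP
echo "${ip} ${svc} ${svc}.cluster.local # ${OPENSHIFT_MARKER}" >> "${TEMP_FILE}"

# 3) Overwrite only when the content actually changed, to avoid churn.
cmp -s "${TEMP_FILE}" "${HOSTS_FILE}" || cp -f "${TEMP_FILE}" "${HOSTS_FILE}"

grep -c "${OPENSHIFT_MARKER}" "${HOSTS_FILE}"   # prints 1
```

The marker comment is what makes the loop idempotent: each pass regenerates exactly its own entries while leaving administrator-added lines untouched, which is why the script refuses to proceed (and just sleeps) if the `sed` filtering step fails.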
],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:SERVICES,Value:image-registry.openshift-image-registry.svc,ValueFrom:nil,},EnvVar{Name:NAMESERVER,Value:10.217.4.10,ValueFrom:nil,},EnvVar{Name:CLUSTER_DOMAIN,Value:cluster.local,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{22020096 0} {} 21Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:hosts-file,ReadOnly:false,MountPath:/etc/hosts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-pjp5c,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod node-resolver-r5bnx_openshift-dns(7fede188-66d9-4cb1-af19-c94afe7fbcde): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 09 13:22:27 crc kubenswrapper[4764]: > logger="UnhandledError" Mar 09 13:22:27 crc kubenswrapper[4764]: E0309 13:22:27.985950 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"dns-node-resolver\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-dns/node-resolver-r5bnx" 
podUID="7fede188-66d9-4cb1-af19-c94afe7fbcde" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.023866 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-daemon-xxczl"] Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.024160 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-zmzm7"] Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.024356 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-zmzm7" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.024514 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.024851 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-crvdf"] Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.025716 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-crvdf" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.030238 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.030573 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.030825 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.031094 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.031208 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.031372 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.031775 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.031864 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.031901 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.031910 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.033838 4764 reflector.go:368] 
Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.035754 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.046124 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.046156 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.046165 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.046178 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.046206 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:28Z","lastTransitionTime":"2026-03-09T13:22:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.048172 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7c21ab2-1820-47de-a61d-71d81928564a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://482469a933d0aba722e6a341f6934669e0d84998bd2c99d6962a17548ab80ce7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3571aa0b654db21a79a11d6c58af9f858d2493283d9eff7a84f3404c34dcf0b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://4aae266a5256be6c65a7637dfaed552087689f317f7b61dcb52f4aa6d4d4d7e6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://033930947dc527026e2797f16a6dd9181c8c7d8ec88b09031a132f55b953c356\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://033930947dc527026e2797f16a6dd9181c8c7d8ec88b09031a132f55b953c356\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T13:21:44Z\\\",\\\"message\\\":\\\"W0309 13:21:43.803612 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0309 13:21:43.803959 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773062503 cert, and key in /tmp/serving-cert-1644690243/serving-signer.crt, /tmp/serving-cert-1644690243/serving-signer.key\\\\nI0309 13:21:44.168505 1 observer_polling.go:159] Starting file observer\\\\nW0309 13:21:44.177360 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0309 13:21:44.177910 1 builder.go:304] check-endpoints 
version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0309 13:21:44.178828 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1644690243/tls.crt::/tmp/serving-cert-1644690243/tls.key\\\\\\\"\\\\nF0309 13:21:44.724786 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:43Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3a34e47ede9b05cec931ca6337a98381bcceeb735352702d19128e8dfe8476f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:48Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a38de1b3854f33b5728435571b8f0df92aeaae7d8e8a235ed58044fa2a
9e09c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a38de1b3854f33b5728435571b8f0df92aeaae7d8e8a235ed58044fa2a9e09c3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:20:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.088076 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.111441 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.119850 4764 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.129199 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.135132 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/072442e6-8ece-4f72-a8cb-ad7ef1e3facb-cnibin\") pod \"multus-additional-cni-plugins-crvdf\" (UID: \"072442e6-8ece-4f72-a8cb-ad7ef1e3facb\") " 
pod="openshift-multus/multus-additional-cni-plugins-crvdf" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.135163 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rk5pb\" (UniqueName: \"kubernetes.io/projected/202a1f58-ce83-4374-ac48-dc806f7b9d6b-kube-api-access-rk5pb\") pod \"multus-zmzm7\" (UID: \"202a1f58-ce83-4374-ac48-dc806f7b9d6b\") " pod="openshift-multus/multus-zmzm7" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.135182 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/072442e6-8ece-4f72-a8cb-ad7ef1e3facb-cni-binary-copy\") pod \"multus-additional-cni-plugins-crvdf\" (UID: \"072442e6-8ece-4f72-a8cb-ad7ef1e3facb\") " pod="openshift-multus/multus-additional-cni-plugins-crvdf" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.135198 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/072442e6-8ece-4f72-a8cb-ad7ef1e3facb-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-crvdf\" (UID: \"072442e6-8ece-4f72-a8cb-ad7ef1e3facb\") " pod="openshift-multus/multus-additional-cni-plugins-crvdf" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.135213 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/202a1f58-ce83-4374-ac48-dc806f7b9d6b-cnibin\") pod \"multus-zmzm7\" (UID: \"202a1f58-ce83-4374-ac48-dc806f7b9d6b\") " pod="openshift-multus/multus-zmzm7" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.135228 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/202a1f58-ce83-4374-ac48-dc806f7b9d6b-host-run-netns\") pod \"multus-zmzm7\" (UID: 
\"202a1f58-ce83-4374-ac48-dc806f7b9d6b\") " pod="openshift-multus/multus-zmzm7" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.135244 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/202a1f58-ce83-4374-ac48-dc806f7b9d6b-hostroot\") pod \"multus-zmzm7\" (UID: \"202a1f58-ce83-4374-ac48-dc806f7b9d6b\") " pod="openshift-multus/multus-zmzm7" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.135259 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/6bcdd179-43c2-427c-9fac-7155c122e922-mcd-auth-proxy-config\") pod \"machine-config-daemon-xxczl\" (UID: \"6bcdd179-43c2-427c-9fac-7155c122e922\") " pod="openshift-machine-config-operator/machine-config-daemon-xxczl" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.135272 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-25jms\" (UniqueName: \"kubernetes.io/projected/6bcdd179-43c2-427c-9fac-7155c122e922-kube-api-access-25jms\") pod \"machine-config-daemon-xxczl\" (UID: \"6bcdd179-43c2-427c-9fac-7155c122e922\") " pod="openshift-machine-config-operator/machine-config-daemon-xxczl" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.135302 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/6bcdd179-43c2-427c-9fac-7155c122e922-rootfs\") pod \"machine-config-daemon-xxczl\" (UID: \"6bcdd179-43c2-427c-9fac-7155c122e922\") " pod="openshift-machine-config-operator/machine-config-daemon-xxczl" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.135317 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: 
\"kubernetes.io/host-path/202a1f58-ce83-4374-ac48-dc806f7b9d6b-host-var-lib-cni-multus\") pod \"multus-zmzm7\" (UID: \"202a1f58-ce83-4374-ac48-dc806f7b9d6b\") " pod="openshift-multus/multus-zmzm7" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.135331 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/202a1f58-ce83-4374-ac48-dc806f7b9d6b-multus-daemon-config\") pod \"multus-zmzm7\" (UID: \"202a1f58-ce83-4374-ac48-dc806f7b9d6b\") " pod="openshift-multus/multus-zmzm7" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.135347 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/202a1f58-ce83-4374-ac48-dc806f7b9d6b-etc-kubernetes\") pod \"multus-zmzm7\" (UID: \"202a1f58-ce83-4374-ac48-dc806f7b9d6b\") " pod="openshift-multus/multus-zmzm7" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.135371 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/202a1f58-ce83-4374-ac48-dc806f7b9d6b-host-run-multus-certs\") pod \"multus-zmzm7\" (UID: \"202a1f58-ce83-4374-ac48-dc806f7b9d6b\") " pod="openshift-multus/multus-zmzm7" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.135386 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/072442e6-8ece-4f72-a8cb-ad7ef1e3facb-os-release\") pod \"multus-additional-cni-plugins-crvdf\" (UID: \"072442e6-8ece-4f72-a8cb-ad7ef1e3facb\") " pod="openshift-multus/multus-additional-cni-plugins-crvdf" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.135498 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: 
\"kubernetes.io/host-path/202a1f58-ce83-4374-ac48-dc806f7b9d6b-os-release\") pod \"multus-zmzm7\" (UID: \"202a1f58-ce83-4374-ac48-dc806f7b9d6b\") " pod="openshift-multus/multus-zmzm7" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.135541 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/202a1f58-ce83-4374-ac48-dc806f7b9d6b-cni-binary-copy\") pod \"multus-zmzm7\" (UID: \"202a1f58-ce83-4374-ac48-dc806f7b9d6b\") " pod="openshift-multus/multus-zmzm7" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.135666 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/202a1f58-ce83-4374-ac48-dc806f7b9d6b-host-run-k8s-cni-cncf-io\") pod \"multus-zmzm7\" (UID: \"202a1f58-ce83-4374-ac48-dc806f7b9d6b\") " pod="openshift-multus/multus-zmzm7" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.135694 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/072442e6-8ece-4f72-a8cb-ad7ef1e3facb-system-cni-dir\") pod \"multus-additional-cni-plugins-crvdf\" (UID: \"072442e6-8ece-4f72-a8cb-ad7ef1e3facb\") " pod="openshift-multus/multus-additional-cni-plugins-crvdf" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.135828 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6bcdd179-43c2-427c-9fac-7155c122e922-proxy-tls\") pod \"machine-config-daemon-xxczl\" (UID: \"6bcdd179-43c2-427c-9fac-7155c122e922\") " pod="openshift-machine-config-operator/machine-config-daemon-xxczl" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.135853 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/202a1f58-ce83-4374-ac48-dc806f7b9d6b-multus-socket-dir-parent\") pod \"multus-zmzm7\" (UID: \"202a1f58-ce83-4374-ac48-dc806f7b9d6b\") " pod="openshift-multus/multus-zmzm7" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.135914 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/202a1f58-ce83-4374-ac48-dc806f7b9d6b-system-cni-dir\") pod \"multus-zmzm7\" (UID: \"202a1f58-ce83-4374-ac48-dc806f7b9d6b\") " pod="openshift-multus/multus-zmzm7" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.135939 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/072442e6-8ece-4f72-a8cb-ad7ef1e3facb-tuning-conf-dir\") pod \"multus-additional-cni-plugins-crvdf\" (UID: \"072442e6-8ece-4f72-a8cb-ad7ef1e3facb\") " pod="openshift-multus/multus-additional-cni-plugins-crvdf" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.136056 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/202a1f58-ce83-4374-ac48-dc806f7b9d6b-multus-conf-dir\") pod \"multus-zmzm7\" (UID: \"202a1f58-ce83-4374-ac48-dc806f7b9d6b\") " pod="openshift-multus/multus-zmzm7" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.136095 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qjrkl\" (UniqueName: \"kubernetes.io/projected/072442e6-8ece-4f72-a8cb-ad7ef1e3facb-kube-api-access-qjrkl\") pod \"multus-additional-cni-plugins-crvdf\" (UID: \"072442e6-8ece-4f72-a8cb-ad7ef1e3facb\") " pod="openshift-multus/multus-additional-cni-plugins-crvdf" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.136138 4764 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/202a1f58-ce83-4374-ac48-dc806f7b9d6b-multus-cni-dir\") pod \"multus-zmzm7\" (UID: \"202a1f58-ce83-4374-ac48-dc806f7b9d6b\") " pod="openshift-multus/multus-zmzm7" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.136158 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/202a1f58-ce83-4374-ac48-dc806f7b9d6b-host-var-lib-kubelet\") pod \"multus-zmzm7\" (UID: \"202a1f58-ce83-4374-ac48-dc806f7b9d6b\") " pod="openshift-multus/multus-zmzm7" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.136255 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/202a1f58-ce83-4374-ac48-dc806f7b9d6b-host-var-lib-cni-bin\") pod \"multus-zmzm7\" (UID: \"202a1f58-ce83-4374-ac48-dc806f7b9d6b\") " pod="openshift-multus/multus-zmzm7" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.136553 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-r5bnx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7fede188-66d9-4cb1-af19-c94afe7fbcde\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:27Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pjp5c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:27Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-r5bnx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.144993 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zmzm7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"202a1f58-ce83-4374-ac48-dc806f7b9d6b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rk5pb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zmzm7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.148032 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.148056 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.148066 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.148079 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.148089 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:28Z","lastTransitionTime":"2026-03-09T13:22:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.152163 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6bcdd179-43c2-427c-9fac-7155c122e922\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25jms\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25jms\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:28Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xxczl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.160946 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.169245 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.178412 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.184761 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-r5bnx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7fede188-66d9-4cb1-af19-c94afe7fbcde\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:27Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pjp5c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:27Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-r5bnx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.194157 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zmzm7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"202a1f58-ce83-4374-ac48-dc806f7b9d6b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rk5pb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zmzm7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.206004 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-crvdf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"072442e6-8ece-4f72-a8cb-ad7ef1e3facb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"
recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-de
v@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{
\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-crvdf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.213844 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6bcdd179-43c2-427c-9fac-7155c122e922\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25jms\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25jms\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],
\\\"startTime\\\":\\\"2026-03-09T13:22:28Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xxczl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.222102 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.230561 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.237122 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/6bcdd179-43c2-427c-9fac-7155c122e922-mcd-auth-proxy-config\") pod \"machine-config-daemon-xxczl\" (UID: \"6bcdd179-43c2-427c-9fac-7155c122e922\") " pod="openshift-machine-config-operator/machine-config-daemon-xxczl" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.237188 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-25jms\" (UniqueName: 
\"kubernetes.io/projected/6bcdd179-43c2-427c-9fac-7155c122e922-kube-api-access-25jms\") pod \"machine-config-daemon-xxczl\" (UID: \"6bcdd179-43c2-427c-9fac-7155c122e922\") " pod="openshift-machine-config-operator/machine-config-daemon-xxczl" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.237211 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/202a1f58-ce83-4374-ac48-dc806f7b9d6b-host-run-netns\") pod \"multus-zmzm7\" (UID: \"202a1f58-ce83-4374-ac48-dc806f7b9d6b\") " pod="openshift-multus/multus-zmzm7" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.237250 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/202a1f58-ce83-4374-ac48-dc806f7b9d6b-hostroot\") pod \"multus-zmzm7\" (UID: \"202a1f58-ce83-4374-ac48-dc806f7b9d6b\") " pod="openshift-multus/multus-zmzm7" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.237285 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/202a1f58-ce83-4374-ac48-dc806f7b9d6b-host-var-lib-cni-multus\") pod \"multus-zmzm7\" (UID: \"202a1f58-ce83-4374-ac48-dc806f7b9d6b\") " pod="openshift-multus/multus-zmzm7" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.237307 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/202a1f58-ce83-4374-ac48-dc806f7b9d6b-multus-daemon-config\") pod \"multus-zmzm7\" (UID: \"202a1f58-ce83-4374-ac48-dc806f7b9d6b\") " pod="openshift-multus/multus-zmzm7" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.237357 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/6bcdd179-43c2-427c-9fac-7155c122e922-rootfs\") pod \"machine-config-daemon-xxczl\" (UID: 
\"6bcdd179-43c2-427c-9fac-7155c122e922\") " pod="openshift-machine-config-operator/machine-config-daemon-xxczl" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.237379 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/202a1f58-ce83-4374-ac48-dc806f7b9d6b-host-run-multus-certs\") pod \"multus-zmzm7\" (UID: \"202a1f58-ce83-4374-ac48-dc806f7b9d6b\") " pod="openshift-multus/multus-zmzm7" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.237384 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/202a1f58-ce83-4374-ac48-dc806f7b9d6b-host-var-lib-cni-multus\") pod \"multus-zmzm7\" (UID: \"202a1f58-ce83-4374-ac48-dc806f7b9d6b\") " pod="openshift-multus/multus-zmzm7" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.237391 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/202a1f58-ce83-4374-ac48-dc806f7b9d6b-hostroot\") pod \"multus-zmzm7\" (UID: \"202a1f58-ce83-4374-ac48-dc806f7b9d6b\") " pod="openshift-multus/multus-zmzm7" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.237421 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/202a1f58-ce83-4374-ac48-dc806f7b9d6b-etc-kubernetes\") pod \"multus-zmzm7\" (UID: \"202a1f58-ce83-4374-ac48-dc806f7b9d6b\") " pod="openshift-multus/multus-zmzm7" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.237446 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/202a1f58-ce83-4374-ac48-dc806f7b9d6b-cni-binary-copy\") pod \"multus-zmzm7\" (UID: \"202a1f58-ce83-4374-ac48-dc806f7b9d6b\") " pod="openshift-multus/multus-zmzm7" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 
13:22:28.237462 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/6bcdd179-43c2-427c-9fac-7155c122e922-rootfs\") pod \"machine-config-daemon-xxczl\" (UID: \"6bcdd179-43c2-427c-9fac-7155c122e922\") " pod="openshift-machine-config-operator/machine-config-daemon-xxczl" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.237466 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/072442e6-8ece-4f72-a8cb-ad7ef1e3facb-os-release\") pod \"multus-additional-cni-plugins-crvdf\" (UID: \"072442e6-8ece-4f72-a8cb-ad7ef1e3facb\") " pod="openshift-multus/multus-additional-cni-plugins-crvdf" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.237497 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/202a1f58-ce83-4374-ac48-dc806f7b9d6b-host-run-multus-certs\") pod \"multus-zmzm7\" (UID: \"202a1f58-ce83-4374-ac48-dc806f7b9d6b\") " pod="openshift-multus/multus-zmzm7" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.237346 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/202a1f58-ce83-4374-ac48-dc806f7b9d6b-host-run-netns\") pod \"multus-zmzm7\" (UID: \"202a1f58-ce83-4374-ac48-dc806f7b9d6b\") " pod="openshift-multus/multus-zmzm7" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.237513 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/202a1f58-ce83-4374-ac48-dc806f7b9d6b-os-release\") pod \"multus-zmzm7\" (UID: \"202a1f58-ce83-4374-ac48-dc806f7b9d6b\") " pod="openshift-multus/multus-zmzm7" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.237557 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: 
\"kubernetes.io/secret/6bcdd179-43c2-427c-9fac-7155c122e922-proxy-tls\") pod \"machine-config-daemon-xxczl\" (UID: \"6bcdd179-43c2-427c-9fac-7155c122e922\") " pod="openshift-machine-config-operator/machine-config-daemon-xxczl" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.237576 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/202a1f58-ce83-4374-ac48-dc806f7b9d6b-multus-socket-dir-parent\") pod \"multus-zmzm7\" (UID: \"202a1f58-ce83-4374-ac48-dc806f7b9d6b\") " pod="openshift-multus/multus-zmzm7" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.237593 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/202a1f58-ce83-4374-ac48-dc806f7b9d6b-host-run-k8s-cni-cncf-io\") pod \"multus-zmzm7\" (UID: \"202a1f58-ce83-4374-ac48-dc806f7b9d6b\") " pod="openshift-multus/multus-zmzm7" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.237610 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/072442e6-8ece-4f72-a8cb-ad7ef1e3facb-system-cni-dir\") pod \"multus-additional-cni-plugins-crvdf\" (UID: \"072442e6-8ece-4f72-a8cb-ad7ef1e3facb\") " pod="openshift-multus/multus-additional-cni-plugins-crvdf" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.237627 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/202a1f58-ce83-4374-ac48-dc806f7b9d6b-system-cni-dir\") pod \"multus-zmzm7\" (UID: \"202a1f58-ce83-4374-ac48-dc806f7b9d6b\") " pod="openshift-multus/multus-zmzm7" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.237658 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: 
\"kubernetes.io/host-path/072442e6-8ece-4f72-a8cb-ad7ef1e3facb-tuning-conf-dir\") pod \"multus-additional-cni-plugins-crvdf\" (UID: \"072442e6-8ece-4f72-a8cb-ad7ef1e3facb\") " pod="openshift-multus/multus-additional-cni-plugins-crvdf" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.237609 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/202a1f58-ce83-4374-ac48-dc806f7b9d6b-os-release\") pod \"multus-zmzm7\" (UID: \"202a1f58-ce83-4374-ac48-dc806f7b9d6b\") " pod="openshift-multus/multus-zmzm7" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.237688 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/202a1f58-ce83-4374-ac48-dc806f7b9d6b-multus-conf-dir\") pod \"multus-zmzm7\" (UID: \"202a1f58-ce83-4374-ac48-dc806f7b9d6b\") " pod="openshift-multus/multus-zmzm7" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.237713 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/202a1f58-ce83-4374-ac48-dc806f7b9d6b-etc-kubernetes\") pod \"multus-zmzm7\" (UID: \"202a1f58-ce83-4374-ac48-dc806f7b9d6b\") " pod="openshift-multus/multus-zmzm7" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.237728 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qjrkl\" (UniqueName: \"kubernetes.io/projected/072442e6-8ece-4f72-a8cb-ad7ef1e3facb-kube-api-access-qjrkl\") pod \"multus-additional-cni-plugins-crvdf\" (UID: \"072442e6-8ece-4f72-a8cb-ad7ef1e3facb\") " pod="openshift-multus/multus-additional-cni-plugins-crvdf" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.237768 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/202a1f58-ce83-4374-ac48-dc806f7b9d6b-multus-cni-dir\") pod 
\"multus-zmzm7\" (UID: \"202a1f58-ce83-4374-ac48-dc806f7b9d6b\") " pod="openshift-multus/multus-zmzm7" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.237783 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/6bcdd179-43c2-427c-9fac-7155c122e922-mcd-auth-proxy-config\") pod \"machine-config-daemon-xxczl\" (UID: \"6bcdd179-43c2-427c-9fac-7155c122e922\") " pod="openshift-machine-config-operator/machine-config-daemon-xxczl" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.237801 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/202a1f58-ce83-4374-ac48-dc806f7b9d6b-host-run-k8s-cni-cncf-io\") pod \"multus-zmzm7\" (UID: \"202a1f58-ce83-4374-ac48-dc806f7b9d6b\") " pod="openshift-multus/multus-zmzm7" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.237836 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/202a1f58-ce83-4374-ac48-dc806f7b9d6b-system-cni-dir\") pod \"multus-zmzm7\" (UID: \"202a1f58-ce83-4374-ac48-dc806f7b9d6b\") " pod="openshift-multus/multus-zmzm7" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.237791 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/202a1f58-ce83-4374-ac48-dc806f7b9d6b-host-var-lib-kubelet\") pod \"multus-zmzm7\" (UID: \"202a1f58-ce83-4374-ac48-dc806f7b9d6b\") " pod="openshift-multus/multus-zmzm7" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.237855 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/072442e6-8ece-4f72-a8cb-ad7ef1e3facb-system-cni-dir\") pod \"multus-additional-cni-plugins-crvdf\" (UID: \"072442e6-8ece-4f72-a8cb-ad7ef1e3facb\") " 
pod="openshift-multus/multus-additional-cni-plugins-crvdf" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.237842 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/202a1f58-ce83-4374-ac48-dc806f7b9d6b-host-var-lib-kubelet\") pod \"multus-zmzm7\" (UID: \"202a1f58-ce83-4374-ac48-dc806f7b9d6b\") " pod="openshift-multus/multus-zmzm7" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.237906 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/202a1f58-ce83-4374-ac48-dc806f7b9d6b-multus-conf-dir\") pod \"multus-zmzm7\" (UID: \"202a1f58-ce83-4374-ac48-dc806f7b9d6b\") " pod="openshift-multus/multus-zmzm7" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.237922 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/202a1f58-ce83-4374-ac48-dc806f7b9d6b-multus-socket-dir-parent\") pod \"multus-zmzm7\" (UID: \"202a1f58-ce83-4374-ac48-dc806f7b9d6b\") " pod="openshift-multus/multus-zmzm7" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.237950 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/202a1f58-ce83-4374-ac48-dc806f7b9d6b-multus-cni-dir\") pod \"multus-zmzm7\" (UID: \"202a1f58-ce83-4374-ac48-dc806f7b9d6b\") " pod="openshift-multus/multus-zmzm7" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.237948 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/072442e6-8ece-4f72-a8cb-ad7ef1e3facb-os-release\") pod \"multus-additional-cni-plugins-crvdf\" (UID: \"072442e6-8ece-4f72-a8cb-ad7ef1e3facb\") " pod="openshift-multus/multus-additional-cni-plugins-crvdf" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.238020 4764 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/202a1f58-ce83-4374-ac48-dc806f7b9d6b-host-var-lib-cni-bin\") pod \"multus-zmzm7\" (UID: \"202a1f58-ce83-4374-ac48-dc806f7b9d6b\") " pod="openshift-multus/multus-zmzm7" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.238073 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/202a1f58-ce83-4374-ac48-dc806f7b9d6b-host-var-lib-cni-bin\") pod \"multus-zmzm7\" (UID: \"202a1f58-ce83-4374-ac48-dc806f7b9d6b\") " pod="openshift-multus/multus-zmzm7" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.238084 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rk5pb\" (UniqueName: \"kubernetes.io/projected/202a1f58-ce83-4374-ac48-dc806f7b9d6b-kube-api-access-rk5pb\") pod \"multus-zmzm7\" (UID: \"202a1f58-ce83-4374-ac48-dc806f7b9d6b\") " pod="openshift-multus/multus-zmzm7" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.238110 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/072442e6-8ece-4f72-a8cb-ad7ef1e3facb-cnibin\") pod \"multus-additional-cni-plugins-crvdf\" (UID: \"072442e6-8ece-4f72-a8cb-ad7ef1e3facb\") " pod="openshift-multus/multus-additional-cni-plugins-crvdf" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.238170 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/202a1f58-ce83-4374-ac48-dc806f7b9d6b-multus-daemon-config\") pod \"multus-zmzm7\" (UID: \"202a1f58-ce83-4374-ac48-dc806f7b9d6b\") " pod="openshift-multus/multus-zmzm7" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.238179 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: 
\"kubernetes.io/configmap/072442e6-8ece-4f72-a8cb-ad7ef1e3facb-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-crvdf\" (UID: \"072442e6-8ece-4f72-a8cb-ad7ef1e3facb\") " pod="openshift-multus/multus-additional-cni-plugins-crvdf" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.238186 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/072442e6-8ece-4f72-a8cb-ad7ef1e3facb-cnibin\") pod \"multus-additional-cni-plugins-crvdf\" (UID: \"072442e6-8ece-4f72-a8cb-ad7ef1e3facb\") " pod="openshift-multus/multus-additional-cni-plugins-crvdf" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.238204 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/202a1f58-ce83-4374-ac48-dc806f7b9d6b-cnibin\") pod \"multus-zmzm7\" (UID: \"202a1f58-ce83-4374-ac48-dc806f7b9d6b\") " pod="openshift-multus/multus-zmzm7" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.238249 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/072442e6-8ece-4f72-a8cb-ad7ef1e3facb-cni-binary-copy\") pod \"multus-additional-cni-plugins-crvdf\" (UID: \"072442e6-8ece-4f72-a8cb-ad7ef1e3facb\") " pod="openshift-multus/multus-additional-cni-plugins-crvdf" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.238249 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/202a1f58-ce83-4374-ac48-dc806f7b9d6b-cnibin\") pod \"multus-zmzm7\" (UID: \"202a1f58-ce83-4374-ac48-dc806f7b9d6b\") " pod="openshift-multus/multus-zmzm7" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.238393 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/072442e6-8ece-4f72-a8cb-ad7ef1e3facb-tuning-conf-dir\") pod 
\"multus-additional-cni-plugins-crvdf\" (UID: \"072442e6-8ece-4f72-a8cb-ad7ef1e3facb\") " pod="openshift-multus/multus-additional-cni-plugins-crvdf" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.238581 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/202a1f58-ce83-4374-ac48-dc806f7b9d6b-cni-binary-copy\") pod \"multus-zmzm7\" (UID: \"202a1f58-ce83-4374-ac48-dc806f7b9d6b\") " pod="openshift-multus/multus-zmzm7" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.238720 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/072442e6-8ece-4f72-a8cb-ad7ef1e3facb-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-crvdf\" (UID: \"072442e6-8ece-4f72-a8cb-ad7ef1e3facb\") " pod="openshift-multus/multus-additional-cni-plugins-crvdf" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.238889 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/072442e6-8ece-4f72-a8cb-ad7ef1e3facb-cni-binary-copy\") pod \"multus-additional-cni-plugins-crvdf\" (UID: \"072442e6-8ece-4f72-a8cb-ad7ef1e3facb\") " pod="openshift-multus/multus-additional-cni-plugins-crvdf" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.240096 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7c21ab2-1820-47de-a61d-71d81928564a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://482469a933d0aba722e6a341f6934669e0d84998bd2c99d6962a17548ab80ce7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3571aa0b654db21a79a11d6c58af9f858d2493283d9eff7a84f3404c34dcf0b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4aae266a5256be6c65a7637dfaed552087689f317f7b61dcb52f4aa6d4d4d7e6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://033930947dc527026e2797f16a6dd9181c8c7d8ec88b09031a132f55b953c356\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://033930947dc527026e2797f16a6dd9181c8c7d8ec88b09031a132f55b953c356\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T13:21:44Z\\\",\\\"message\\\":\\\"W0309 13:21:43.803612 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0309 13:21:43.803959 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773062503 cert, and key in /tmp/serving-cert-1644690243/serving-signer.crt, /tmp/serving-cert-1644690243/serving-signer.key\\\\nI0309 13:21:44.168505 1 observer_polling.go:159] Starting file observer\\\\nW0309 13:21:44.177360 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0309 13:21:44.177910 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0309 13:21:44.178828 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1644690243/tls.crt::/tmp/serving-cert-1644690243/tls.key\\\\\\\"\\\\nF0309 13:21:44.724786 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:43Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3a34e47ede9b05cec931ca6337a98381bcceeb735352702d19128e8dfe8476f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:48Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a38de1b3854f33b5728435571b8f0df92aeaae7d8e8a235ed58044fa2a9e09c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a38de1b3854f33b5728435571b8f0df92aeaae7d8e8a235ed58044fa2a9e09c3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\
\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:20:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.240349 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6bcdd179-43c2-427c-9fac-7155c122e922-proxy-tls\") pod \"machine-config-daemon-xxczl\" (UID: \"6bcdd179-43c2-427c-9fac-7155c122e922\") " pod="openshift-machine-config-operator/machine-config-daemon-xxczl" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.249353 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.250554 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.250584 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.250593 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.250609 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.250620 4764 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:28Z","lastTransitionTime":"2026-03-09T13:22:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.253816 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qjrkl\" (UniqueName: \"kubernetes.io/projected/072442e6-8ece-4f72-a8cb-ad7ef1e3facb-kube-api-access-qjrkl\") pod \"multus-additional-cni-plugins-crvdf\" (UID: \"072442e6-8ece-4f72-a8cb-ad7ef1e3facb\") " pod="openshift-multus/multus-additional-cni-plugins-crvdf" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.254025 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-25jms\" (UniqueName: \"kubernetes.io/projected/6bcdd179-43c2-427c-9fac-7155c122e922-kube-api-access-25jms\") pod \"machine-config-daemon-xxczl\" (UID: \"6bcdd179-43c2-427c-9fac-7155c122e922\") " pod="openshift-machine-config-operator/machine-config-daemon-xxczl" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.258539 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rk5pb\" (UniqueName: \"kubernetes.io/projected/202a1f58-ce83-4374-ac48-dc806f7b9d6b-kube-api-access-rk5pb\") pod \"multus-zmzm7\" (UID: \"202a1f58-ce83-4374-ac48-dc806f7b9d6b\") " pod="openshift-multus/multus-zmzm7" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.262625 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.271765 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.345961 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-zmzm7" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.352038 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.352082 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.352095 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.352112 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.352123 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:28Z","lastTransitionTime":"2026-03-09T13:22:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:28 crc kubenswrapper[4764]: W0309 13:22:28.355322 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod202a1f58_ce83_4374_ac48_dc806f7b9d6b.slice/crio-a445c4d4ff8c0e7680f3c86119feb92a65fc69ae77f288e56f271709f496c0ac WatchSource:0}: Error finding container a445c4d4ff8c0e7680f3c86119feb92a65fc69ae77f288e56f271709f496c0ac: Status 404 returned error can't find the container with id a445c4d4ff8c0e7680f3c86119feb92a65fc69ae77f288e56f271709f496c0ac Mar 09 13:22:28 crc kubenswrapper[4764]: E0309 13:22:28.357397 4764 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 09 13:22:28 crc kubenswrapper[4764]: container &Container{Name:kube-multus,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26,Command:[/bin/bash -ec --],Args:[MULTUS_DAEMON_OPT="" Mar 09 13:22:28 crc kubenswrapper[4764]: /entrypoint/cnibincopy.sh; exec /usr/src/multus-cni/bin/multus-daemon $MULTUS_DAEMON_OPT Mar 09 13:22:28 crc kubenswrapper[4764]: 
],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:RHEL8_SOURCE_DIRECTORY,Value:/usr/src/multus-cni/rhel8/bin/,ValueFrom:nil,},EnvVar{Name:RHEL9_SOURCE_DIRECTORY,Value:/usr/src/multus-cni/rhel9/bin/,ValueFrom:nil,},EnvVar{Name:DEFAULT_SOURCE_DIRECTORY,Value:/usr/src/multus-cni/bin/,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:6443,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:api-int.crc.testing,ValueFrom:nil,},EnvVar{Name:MULTUS_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:K8S_NODE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{68157440 0} {} 65Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:cni-binary-copy,ReadOnly:false,MountPath:/entrypoint,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:os-release,ReadOnly:false,MountPath:/host/etc/os-release,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:system-cni-dir,ReadOnly:false,MountPath:/host/etc/cni/net.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:multus-cni-dir,ReadOnly:false,MountPath:/host/run/multus/cni/net.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:cnibin,ReadOnly:false,MountPath:/host/opt/cni/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:multus-socket-dir-parent,ReadOnly:false,MountPath:/host/run/multus,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-run-k8s-cni-cncf-io,ReadOnly:false,MountPath:/run/k8s.cni.cncf.io,SubPath:
,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-run-netns,ReadOnly:false,MountPath:/run/netns,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-var-lib-cni-bin,ReadOnly:false,MountPath:/var/lib/cni/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-var-lib-cni-multus,ReadOnly:false,MountPath:/var/lib/cni/multus,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-var-lib-kubelet,ReadOnly:false,MountPath:/var/lib/kubelet,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:hostroot,ReadOnly:false,MountPath:/hostroot,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:multus-conf-dir,ReadOnly:false,MountPath:/etc/cni/multus/net.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:multus-daemon-config,ReadOnly:true,MountPath:/etc/cni/net.d/multus.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-run-multus-certs,ReadOnly:false,MountPath:/etc/cni/multus/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:etc-kubernetes,ReadOnly:false,MountPath:/etc/kubernetes,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rk5pb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:fal
se,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod multus-zmzm7_openshift-multus(202a1f58-ce83-4374-ac48-dc806f7b9d6b): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 09 13:22:28 crc kubenswrapper[4764]: > logger="UnhandledError" Mar 09 13:22:28 crc kubenswrapper[4764]: E0309 13:22:28.358600 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-multus/multus-zmzm7" podUID="202a1f58-ce83-4374-ac48-dc806f7b9d6b" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.369254 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.375233 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-crvdf" Mar 09 13:22:28 crc kubenswrapper[4764]: W0309 13:22:28.381460 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6bcdd179_43c2_427c_9fac_7155c122e922.slice/crio-53e57fc2015ccb3794b104e1fb3b1e815a6b724ebb5c7cdb477b30d5440dd8ad WatchSource:0}: Error finding container 53e57fc2015ccb3794b104e1fb3b1e815a6b724ebb5c7cdb477b30d5440dd8ad: Status 404 returned error can't find the container with id 53e57fc2015ccb3794b104e1fb3b1e815a6b724ebb5c7cdb477b30d5440dd8ad Mar 09 13:22:28 crc kubenswrapper[4764]: E0309 13:22:28.383252 4764 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:machine-config-daemon,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a,Command:[/usr/bin/machine-config-daemon],Args:[start --payload-version=4.18.1],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:health,HostPort:8798,ContainerPort:8798,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:rootfs,ReadOnly:false,MountPath:/rootfs,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-25jms,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/health,Port:{0 8798 
},Host:127.0.0.1,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:120,TimeoutSeconds:1,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:*false,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod machine-config-daemon-xxczl_openshift-machine-config-operator(6bcdd179-43c2-427c-9fac-7155c122e922): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Mar 09 13:22:28 crc kubenswrapper[4764]: E0309 13:22:28.388304 4764 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09,Command:[],Args:[--secure-listen-address=0.0.0.0:9001 --config-file=/etc/kube-rbac-proxy/config-file.yaml --tls-cipher-suites=TLS_AES_128_GCM_SHA256,TLS_AES_256_GCM_SHA384,TLS_CHACHA20_POLY1305_SHA256,TLS_ECDHE_ECDSA_WITH_AES_128_GCM_SHA256,TLS_ECDHE_RSA_WITH_AES_128_GCM_SHA256,TLS_ECDHE_ECDSA_WITH_AES_256_GCM_SHA384,TLS_ECDHE_RSA_WITH_AES_256_GCM_SHA384,TLS_ECDHE_ECDSA_WITH_CHACHA20_POLY1305_SHA256,TLS_ECDHE_RSA_WITH_CHACHA20_POLY1305_SHA256 --tls-min-version=VersionTLS12 --upstream=http://127.0.0.1:8797 --logtostderr=true --tls-cert-file=/etc/tls/private/tls.crt 
--tls-private-key-file=/etc/tls/private/tls.key],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:9001,ContainerPort:9001,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:proxy-tls,ReadOnly:false,MountPath:/etc/tls/private,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:mcd-auth-proxy-config,ReadOnly:false,MountPath:/etc/kube-rbac-proxy,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-25jms,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod machine-config-daemon-xxczl_openshift-machine-config-operator(6bcdd179-43c2-427c-9fac-7155c122e922): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Mar 09 13:22:28 crc kubenswrapper[4764]: W0309 13:22:28.389470 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod072442e6_8ece_4f72_a8cb_ad7ef1e3facb.slice/crio-40f63e24da2ee3c429a77ccc1ab25e68446cce3675cd0f629706236630ebbef0 WatchSource:0}: Error finding container 40f63e24da2ee3c429a77ccc1ab25e68446cce3675cd0f629706236630ebbef0: Status 404 returned error can't find the container with id 40f63e24da2ee3c429a77ccc1ab25e68446cce3675cd0f629706236630ebbef0 
Mar 09 13:22:28 crc kubenswrapper[4764]: E0309 13:22:28.389522 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"machine-config-daemon\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"]" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922" Mar 09 13:22:28 crc kubenswrapper[4764]: E0309 13:22:28.391565 4764 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:egress-router-binary-copy,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c,Command:[/entrypoint/cnibincopy.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:RHEL8_SOURCE_DIRECTORY,Value:/usr/src/egress-router-cni/rhel8/bin/,ValueFrom:nil,},EnvVar{Name:RHEL9_SOURCE_DIRECTORY,Value:/usr/src/egress-router-cni/rhel9/bin/,ValueFrom:nil,},EnvVar{Name:DEFAULT_SOURCE_DIRECTORY,Value:/usr/src/egress-router-cni/bin/,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:cni-binary-copy,ReadOnly:false,MountPath:/entrypoint,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:cnibin,ReadOnly:false,MountPath:/host/opt/cni/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:os-release,ReadOnly:true,MountPath:/host/etc/os-release,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-qjrkl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessPro
be:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod multus-additional-cni-plugins-crvdf_openshift-multus(072442e6-8ece-4f72-a8cb-ad7ef1e3facb): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Mar 09 13:22:28 crc kubenswrapper[4764]: E0309 13:22:28.394841 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"egress-router-binary-copy\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-multus/multus-additional-cni-plugins-crvdf" podUID="072442e6-8ece-4f72-a8cb-ad7ef1e3facb" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.404167 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-7kggv"] Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.405083 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-7kggv" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.408166 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.408184 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.408936 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.409080 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.409130 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.410365 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.410529 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.417347 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6bcdd179-43c2-427c-9fac-7155c122e922\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25jms\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25jms\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:28Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xxczl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.427017 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.435818 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.444846 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7c21ab2-1820-47de-a61d-71d81928564a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://482469a933d0aba722e6a341f6934669e0d84998bd2c99d6962a17548ab80ce7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3571aa0b654db21a79a11d6c58af9f858d2493283d9eff7a84f3404c34dcf0b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4aae266a5256be6c65a7637dfaed552087689f317f7b61dcb52f4aa6d4d4d7e6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://033930947dc527026e2797f16a6dd9181c8c7d8ec88b09031a132f55b953c356\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://033930947dc527026e2797f16a6dd9181c8c7d8ec88b09031a132f55b953c356\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T13:21:44Z\\\",\\\"message\\\":\\\"W0309 13:21:43.803612 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0309 13:21:43.803959 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773062503 cert, and key in /tmp/serving-cert-1644690243/serving-signer.crt, /tmp/serving-cert-1644690243/serving-signer.key\\\\nI0309 13:21:44.168505 1 observer_polling.go:159] Starting file observer\\\\nW0309 13:21:44.177360 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0309 13:21:44.177910 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0309 13:21:44.178828 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1644690243/tls.crt::/tmp/serving-cert-1644690243/tls.key\\\\\\\"\\\\nF0309 13:21:44.724786 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:43Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3a34e47ede9b05cec931ca6337a98381bcceeb735352702d19128e8dfe8476f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:48Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a38de1b3854f33b5728435571b8f0df92aeaae7d8e8a235ed58044fa2a9e09c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a38de1b3854f33b5728435571b8f0df92aeaae7d8e8a235ed58044fa2a9e09c3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\
\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:20:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.453502 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.455354 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.455385 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.455402 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.455436 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.455450 4764 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:28Z","lastTransitionTime":"2026-03-09T13:22:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.462298 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.471125 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.479972 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.486084 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-r5bnx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7fede188-66d9-4cb1-af19-c94afe7fbcde\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:27Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pjp5c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:27Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-r5bnx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.494139 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zmzm7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"202a1f58-ce83-4374-ac48-dc806f7b9d6b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rk5pb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zmzm7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.504394 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-crvdf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"072442e6-8ece-4f72-a8cb-ad7ef1e3facb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"
recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-de
v@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{
\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-crvdf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.517589 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7kggv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b8ccb4f5-550a-41b2-b39d-201cdd5d902a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node 
kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\
":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\
\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\
"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secr
ets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7kggv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.540814 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/b8ccb4f5-550a-41b2-b39d-201cdd5d902a-run-ovn\") pod \"ovnkube-node-7kggv\" (UID: \"b8ccb4f5-550a-41b2-b39d-201cdd5d902a\") " pod="openshift-ovn-kubernetes/ovnkube-node-7kggv" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.540847 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b8ccb4f5-550a-41b2-b39d-201cdd5d902a-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-7kggv\" (UID: \"b8ccb4f5-550a-41b2-b39d-201cdd5d902a\") " pod="openshift-ovn-kubernetes/ovnkube-node-7kggv" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.540868 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/b8ccb4f5-550a-41b2-b39d-201cdd5d902a-env-overrides\") pod \"ovnkube-node-7kggv\" (UID: \"b8ccb4f5-550a-41b2-b39d-201cdd5d902a\") " pod="openshift-ovn-kubernetes/ovnkube-node-7kggv" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.540882 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" 
(UniqueName: \"kubernetes.io/host-path/b8ccb4f5-550a-41b2-b39d-201cdd5d902a-host-slash\") pod \"ovnkube-node-7kggv\" (UID: \"b8ccb4f5-550a-41b2-b39d-201cdd5d902a\") " pod="openshift-ovn-kubernetes/ovnkube-node-7kggv" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.540896 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/b8ccb4f5-550a-41b2-b39d-201cdd5d902a-run-systemd\") pod \"ovnkube-node-7kggv\" (UID: \"b8ccb4f5-550a-41b2-b39d-201cdd5d902a\") " pod="openshift-ovn-kubernetes/ovnkube-node-7kggv" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.540910 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/b8ccb4f5-550a-41b2-b39d-201cdd5d902a-log-socket\") pod \"ovnkube-node-7kggv\" (UID: \"b8ccb4f5-550a-41b2-b39d-201cdd5d902a\") " pod="openshift-ovn-kubernetes/ovnkube-node-7kggv" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.540923 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b8ccb4f5-550a-41b2-b39d-201cdd5d902a-host-run-ovn-kubernetes\") pod \"ovnkube-node-7kggv\" (UID: \"b8ccb4f5-550a-41b2-b39d-201cdd5d902a\") " pod="openshift-ovn-kubernetes/ovnkube-node-7kggv" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.540938 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/b8ccb4f5-550a-41b2-b39d-201cdd5d902a-host-cni-bin\") pod \"ovnkube-node-7kggv\" (UID: \"b8ccb4f5-550a-41b2-b39d-201cdd5d902a\") " pod="openshift-ovn-kubernetes/ovnkube-node-7kggv" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.541043 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/b8ccb4f5-550a-41b2-b39d-201cdd5d902a-host-run-netns\") pod \"ovnkube-node-7kggv\" (UID: \"b8ccb4f5-550a-41b2-b39d-201cdd5d902a\") " pod="openshift-ovn-kubernetes/ovnkube-node-7kggv" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.541114 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b8ccb4f5-550a-41b2-b39d-201cdd5d902a-run-openvswitch\") pod \"ovnkube-node-7kggv\" (UID: \"b8ccb4f5-550a-41b2-b39d-201cdd5d902a\") " pod="openshift-ovn-kubernetes/ovnkube-node-7kggv" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.541268 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/b8ccb4f5-550a-41b2-b39d-201cdd5d902a-ovn-node-metrics-cert\") pod \"ovnkube-node-7kggv\" (UID: \"b8ccb4f5-550a-41b2-b39d-201cdd5d902a\") " pod="openshift-ovn-kubernetes/ovnkube-node-7kggv" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.541381 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b8ccb4f5-550a-41b2-b39d-201cdd5d902a-etc-openvswitch\") pod \"ovnkube-node-7kggv\" (UID: \"b8ccb4f5-550a-41b2-b39d-201cdd5d902a\") " pod="openshift-ovn-kubernetes/ovnkube-node-7kggv" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.541416 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/b8ccb4f5-550a-41b2-b39d-201cdd5d902a-ovnkube-script-lib\") pod \"ovnkube-node-7kggv\" (UID: \"b8ccb4f5-550a-41b2-b39d-201cdd5d902a\") " pod="openshift-ovn-kubernetes/ovnkube-node-7kggv" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.541442 4764 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/b8ccb4f5-550a-41b2-b39d-201cdd5d902a-systemd-units\") pod \"ovnkube-node-7kggv\" (UID: \"b8ccb4f5-550a-41b2-b39d-201cdd5d902a\") " pod="openshift-ovn-kubernetes/ovnkube-node-7kggv" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.541476 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/b8ccb4f5-550a-41b2-b39d-201cdd5d902a-ovnkube-config\") pod \"ovnkube-node-7kggv\" (UID: \"b8ccb4f5-550a-41b2-b39d-201cdd5d902a\") " pod="openshift-ovn-kubernetes/ovnkube-node-7kggv" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.541498 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r5xrv\" (UniqueName: \"kubernetes.io/projected/b8ccb4f5-550a-41b2-b39d-201cdd5d902a-kube-api-access-r5xrv\") pod \"ovnkube-node-7kggv\" (UID: \"b8ccb4f5-550a-41b2-b39d-201cdd5d902a\") " pod="openshift-ovn-kubernetes/ovnkube-node-7kggv" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.541549 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b8ccb4f5-550a-41b2-b39d-201cdd5d902a-var-lib-openvswitch\") pod \"ovnkube-node-7kggv\" (UID: \"b8ccb4f5-550a-41b2-b39d-201cdd5d902a\") " pod="openshift-ovn-kubernetes/ovnkube-node-7kggv" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.541706 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/b8ccb4f5-550a-41b2-b39d-201cdd5d902a-node-log\") pod \"ovnkube-node-7kggv\" (UID: \"b8ccb4f5-550a-41b2-b39d-201cdd5d902a\") " pod="openshift-ovn-kubernetes/ovnkube-node-7kggv" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.541733 4764 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/b8ccb4f5-550a-41b2-b39d-201cdd5d902a-host-cni-netd\") pod \"ovnkube-node-7kggv\" (UID: \"b8ccb4f5-550a-41b2-b39d-201cdd5d902a\") " pod="openshift-ovn-kubernetes/ovnkube-node-7kggv" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.541763 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/b8ccb4f5-550a-41b2-b39d-201cdd5d902a-host-kubelet\") pod \"ovnkube-node-7kggv\" (UID: \"b8ccb4f5-550a-41b2-b39d-201cdd5d902a\") " pod="openshift-ovn-kubernetes/ovnkube-node-7kggv" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.557445 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.557500 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.557510 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.557526 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.557539 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:28Z","lastTransitionTime":"2026-03-09T13:22:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.558692 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.558704 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.558864 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 13:22:28 crc kubenswrapper[4764]: E0309 13:22:28.559021 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 13:22:28 crc kubenswrapper[4764]: E0309 13:22:28.559277 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 13:22:28 crc kubenswrapper[4764]: E0309 13:22:28.559352 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 13:22:28 crc kubenswrapper[4764]: E0309 13:22:28.560194 4764 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 09 13:22:28 crc kubenswrapper[4764]: container &Container{Name:webhook,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Mar 09 13:22:28 crc kubenswrapper[4764]: if [[ -f "/env/_master" ]]; then Mar 09 13:22:28 crc kubenswrapper[4764]: set -o allexport Mar 09 13:22:28 crc kubenswrapper[4764]: source "/env/_master" Mar 09 13:22:28 crc kubenswrapper[4764]: set +o allexport Mar 09 13:22:28 crc kubenswrapper[4764]: fi Mar 09 13:22:28 crc kubenswrapper[4764]: # OVN-K will try to remove hybrid overlay node annotations even when the hybrid overlay is not enabled. Mar 09 13:22:28 crc kubenswrapper[4764]: # https://github.com/ovn-org/ovn-kubernetes/blob/ac6820df0b338a246f10f412cd5ec903bd234694/go-controller/pkg/ovn/master.go#L791 Mar 09 13:22:28 crc kubenswrapper[4764]: ho_enable="--enable-hybrid-overlay" Mar 09 13:22:28 crc kubenswrapper[4764]: echo "I$(date "+%m%d %H:%M:%S.%N") - network-node-identity - start webhook" Mar 09 13:22:28 crc kubenswrapper[4764]: # extra-allowed-user: service account `ovn-kubernetes-control-plane` Mar 09 13:22:28 crc kubenswrapper[4764]: # sets pod annotations in multi-homing layer3 network controller (cluster-manager) Mar 09 13:22:28 crc kubenswrapper[4764]: exec /usr/bin/ovnkube-identity --k8s-apiserver=https://api-int.crc.testing:6443 \ Mar 09 13:22:28 crc kubenswrapper[4764]: --webhook-cert-dir="/etc/webhook-cert" \ Mar 09 13:22:28 crc kubenswrapper[4764]: --webhook-host=127.0.0.1 \ Mar 09 13:22:28 crc kubenswrapper[4764]: --webhook-port=9743 \ Mar 09 13:22:28 crc kubenswrapper[4764]: ${ho_enable} \ Mar 09 13:22:28 crc kubenswrapper[4764]: --enable-interconnect \ Mar 09 13:22:28 crc kubenswrapper[4764]: 
--disable-approver \ Mar 09 13:22:28 crc kubenswrapper[4764]: --extra-allowed-user="system:serviceaccount:openshift-ovn-kubernetes:ovn-kubernetes-control-plane" \ Mar 09 13:22:28 crc kubenswrapper[4764]: --wait-for-kubernetes-api=200s \ Mar 09 13:22:28 crc kubenswrapper[4764]: --pod-admission-conditions="/var/run/ovnkube-identity-config/additional-pod-admission-cond.json" \ Mar 09 13:22:28 crc kubenswrapper[4764]: --loglevel="${LOGLEVEL}" Mar 09 13:22:28 crc kubenswrapper[4764]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOGLEVEL,Value:2,ValueFrom:nil,},EnvVar{Name:KUBERNETES_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:webhook-cert,ReadOnly:false,MountPath:/etc/webhook-cert/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovnkube-identity-cm,ReadOnly:false,MountPath:/var/run/ovnkube-identity-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s2kz5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000470000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:n
il,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-node-identity-vrzqb_openshift-network-node-identity(ef543e1b-8068-4ea3-b32a-61027b32e95d): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 09 13:22:28 crc kubenswrapper[4764]: > logger="UnhandledError" Mar 09 13:22:28 crc kubenswrapper[4764]: E0309 13:22:28.562313 4764 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 09 13:22:28 crc kubenswrapper[4764]: container &Container{Name:approver,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Mar 09 13:22:28 crc kubenswrapper[4764]: if [[ -f "/env/_master" ]]; then Mar 09 13:22:28 crc kubenswrapper[4764]: set -o allexport Mar 09 13:22:28 crc kubenswrapper[4764]: source "/env/_master" Mar 09 13:22:28 crc kubenswrapper[4764]: set +o allexport Mar 09 13:22:28 crc kubenswrapper[4764]: fi Mar 09 13:22:28 crc kubenswrapper[4764]: Mar 09 13:22:28 crc kubenswrapper[4764]: echo "I$(date "+%m%d %H:%M:%S.%N") - network-node-identity - start approver" Mar 09 13:22:28 crc kubenswrapper[4764]: exec /usr/bin/ovnkube-identity --k8s-apiserver=https://api-int.crc.testing:6443 \ Mar 09 13:22:28 crc kubenswrapper[4764]: --disable-webhook \ Mar 09 13:22:28 crc kubenswrapper[4764]: --csr-acceptance-conditions="/var/run/ovnkube-identity-config/additional-cert-acceptance-cond.json" \ Mar 09 13:22:28 crc kubenswrapper[4764]: --loglevel="${LOGLEVEL}" Mar 09 13:22:28 crc kubenswrapper[4764]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOGLEVEL,Value:4,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: 
{{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovnkube-identity-cm,ReadOnly:false,MountPath:/var/run/ovnkube-identity-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s2kz5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000470000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-node-identity-vrzqb_openshift-network-node-identity(ef543e1b-8068-4ea3-b32a-61027b32e95d): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 09 13:22:28 crc kubenswrapper[4764]: > logger="UnhandledError" Mar 09 13:22:28 crc kubenswrapper[4764]: E0309 13:22:28.563449 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"webhook\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\", failed to \"StartContainer\" for \"approver\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"]" pod="openshift-network-node-identity/network-node-identity-vrzqb" 
podUID="ef543e1b-8068-4ea3-b32a-61027b32e95d" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.642712 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/b8ccb4f5-550a-41b2-b39d-201cdd5d902a-host-kubelet\") pod \"ovnkube-node-7kggv\" (UID: \"b8ccb4f5-550a-41b2-b39d-201cdd5d902a\") " pod="openshift-ovn-kubernetes/ovnkube-node-7kggv" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.642750 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/b8ccb4f5-550a-41b2-b39d-201cdd5d902a-run-ovn\") pod \"ovnkube-node-7kggv\" (UID: \"b8ccb4f5-550a-41b2-b39d-201cdd5d902a\") " pod="openshift-ovn-kubernetes/ovnkube-node-7kggv" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.642771 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b8ccb4f5-550a-41b2-b39d-201cdd5d902a-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-7kggv\" (UID: \"b8ccb4f5-550a-41b2-b39d-201cdd5d902a\") " pod="openshift-ovn-kubernetes/ovnkube-node-7kggv" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.642786 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/b8ccb4f5-550a-41b2-b39d-201cdd5d902a-env-overrides\") pod \"ovnkube-node-7kggv\" (UID: \"b8ccb4f5-550a-41b2-b39d-201cdd5d902a\") " pod="openshift-ovn-kubernetes/ovnkube-node-7kggv" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.642801 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/b8ccb4f5-550a-41b2-b39d-201cdd5d902a-host-slash\") pod \"ovnkube-node-7kggv\" (UID: \"b8ccb4f5-550a-41b2-b39d-201cdd5d902a\") " pod="openshift-ovn-kubernetes/ovnkube-node-7kggv" Mar 09 
13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.642815 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/b8ccb4f5-550a-41b2-b39d-201cdd5d902a-run-systemd\") pod \"ovnkube-node-7kggv\" (UID: \"b8ccb4f5-550a-41b2-b39d-201cdd5d902a\") " pod="openshift-ovn-kubernetes/ovnkube-node-7kggv" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.642829 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/b8ccb4f5-550a-41b2-b39d-201cdd5d902a-log-socket\") pod \"ovnkube-node-7kggv\" (UID: \"b8ccb4f5-550a-41b2-b39d-201cdd5d902a\") " pod="openshift-ovn-kubernetes/ovnkube-node-7kggv" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.642843 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b8ccb4f5-550a-41b2-b39d-201cdd5d902a-host-run-ovn-kubernetes\") pod \"ovnkube-node-7kggv\" (UID: \"b8ccb4f5-550a-41b2-b39d-201cdd5d902a\") " pod="openshift-ovn-kubernetes/ovnkube-node-7kggv" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.642858 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/b8ccb4f5-550a-41b2-b39d-201cdd5d902a-host-cni-bin\") pod \"ovnkube-node-7kggv\" (UID: \"b8ccb4f5-550a-41b2-b39d-201cdd5d902a\") " pod="openshift-ovn-kubernetes/ovnkube-node-7kggv" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.642887 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/b8ccb4f5-550a-41b2-b39d-201cdd5d902a-host-run-netns\") pod \"ovnkube-node-7kggv\" (UID: \"b8ccb4f5-550a-41b2-b39d-201cdd5d902a\") " pod="openshift-ovn-kubernetes/ovnkube-node-7kggv" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.642902 4764 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b8ccb4f5-550a-41b2-b39d-201cdd5d902a-run-openvswitch\") pod \"ovnkube-node-7kggv\" (UID: \"b8ccb4f5-550a-41b2-b39d-201cdd5d902a\") " pod="openshift-ovn-kubernetes/ovnkube-node-7kggv" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.642924 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/b8ccb4f5-550a-41b2-b39d-201cdd5d902a-ovn-node-metrics-cert\") pod \"ovnkube-node-7kggv\" (UID: \"b8ccb4f5-550a-41b2-b39d-201cdd5d902a\") " pod="openshift-ovn-kubernetes/ovnkube-node-7kggv" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.642973 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b8ccb4f5-550a-41b2-b39d-201cdd5d902a-etc-openvswitch\") pod \"ovnkube-node-7kggv\" (UID: \"b8ccb4f5-550a-41b2-b39d-201cdd5d902a\") " pod="openshift-ovn-kubernetes/ovnkube-node-7kggv" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.642988 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/b8ccb4f5-550a-41b2-b39d-201cdd5d902a-ovnkube-script-lib\") pod \"ovnkube-node-7kggv\" (UID: \"b8ccb4f5-550a-41b2-b39d-201cdd5d902a\") " pod="openshift-ovn-kubernetes/ovnkube-node-7kggv" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.643001 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/b8ccb4f5-550a-41b2-b39d-201cdd5d902a-systemd-units\") pod \"ovnkube-node-7kggv\" (UID: \"b8ccb4f5-550a-41b2-b39d-201cdd5d902a\") " pod="openshift-ovn-kubernetes/ovnkube-node-7kggv" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.643025 4764 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/b8ccb4f5-550a-41b2-b39d-201cdd5d902a-ovnkube-config\") pod \"ovnkube-node-7kggv\" (UID: \"b8ccb4f5-550a-41b2-b39d-201cdd5d902a\") " pod="openshift-ovn-kubernetes/ovnkube-node-7kggv" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.643041 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r5xrv\" (UniqueName: \"kubernetes.io/projected/b8ccb4f5-550a-41b2-b39d-201cdd5d902a-kube-api-access-r5xrv\") pod \"ovnkube-node-7kggv\" (UID: \"b8ccb4f5-550a-41b2-b39d-201cdd5d902a\") " pod="openshift-ovn-kubernetes/ovnkube-node-7kggv" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.643084 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b8ccb4f5-550a-41b2-b39d-201cdd5d902a-var-lib-openvswitch\") pod \"ovnkube-node-7kggv\" (UID: \"b8ccb4f5-550a-41b2-b39d-201cdd5d902a\") " pod="openshift-ovn-kubernetes/ovnkube-node-7kggv" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.643099 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/b8ccb4f5-550a-41b2-b39d-201cdd5d902a-node-log\") pod \"ovnkube-node-7kggv\" (UID: \"b8ccb4f5-550a-41b2-b39d-201cdd5d902a\") " pod="openshift-ovn-kubernetes/ovnkube-node-7kggv" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.643114 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/b8ccb4f5-550a-41b2-b39d-201cdd5d902a-host-cni-netd\") pod \"ovnkube-node-7kggv\" (UID: \"b8ccb4f5-550a-41b2-b39d-201cdd5d902a\") " pod="openshift-ovn-kubernetes/ovnkube-node-7kggv" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.643171 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: 
\"kubernetes.io/host-path/b8ccb4f5-550a-41b2-b39d-201cdd5d902a-host-cni-netd\") pod \"ovnkube-node-7kggv\" (UID: \"b8ccb4f5-550a-41b2-b39d-201cdd5d902a\") " pod="openshift-ovn-kubernetes/ovnkube-node-7kggv" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.643205 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/b8ccb4f5-550a-41b2-b39d-201cdd5d902a-host-kubelet\") pod \"ovnkube-node-7kggv\" (UID: \"b8ccb4f5-550a-41b2-b39d-201cdd5d902a\") " pod="openshift-ovn-kubernetes/ovnkube-node-7kggv" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.643226 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/b8ccb4f5-550a-41b2-b39d-201cdd5d902a-run-ovn\") pod \"ovnkube-node-7kggv\" (UID: \"b8ccb4f5-550a-41b2-b39d-201cdd5d902a\") " pod="openshift-ovn-kubernetes/ovnkube-node-7kggv" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.643245 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b8ccb4f5-550a-41b2-b39d-201cdd5d902a-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-7kggv\" (UID: \"b8ccb4f5-550a-41b2-b39d-201cdd5d902a\") " pod="openshift-ovn-kubernetes/ovnkube-node-7kggv" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.643823 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b8ccb4f5-550a-41b2-b39d-201cdd5d902a-etc-openvswitch\") pod \"ovnkube-node-7kggv\" (UID: \"b8ccb4f5-550a-41b2-b39d-201cdd5d902a\") " pod="openshift-ovn-kubernetes/ovnkube-node-7kggv" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.643868 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/b8ccb4f5-550a-41b2-b39d-201cdd5d902a-host-cni-bin\") pod 
\"ovnkube-node-7kggv\" (UID: \"b8ccb4f5-550a-41b2-b39d-201cdd5d902a\") " pod="openshift-ovn-kubernetes/ovnkube-node-7kggv" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.643884 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/b8ccb4f5-550a-41b2-b39d-201cdd5d902a-host-run-netns\") pod \"ovnkube-node-7kggv\" (UID: \"b8ccb4f5-550a-41b2-b39d-201cdd5d902a\") " pod="openshift-ovn-kubernetes/ovnkube-node-7kggv" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.643915 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b8ccb4f5-550a-41b2-b39d-201cdd5d902a-run-openvswitch\") pod \"ovnkube-node-7kggv\" (UID: \"b8ccb4f5-550a-41b2-b39d-201cdd5d902a\") " pod="openshift-ovn-kubernetes/ovnkube-node-7kggv" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.643915 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/b8ccb4f5-550a-41b2-b39d-201cdd5d902a-host-slash\") pod \"ovnkube-node-7kggv\" (UID: \"b8ccb4f5-550a-41b2-b39d-201cdd5d902a\") " pod="openshift-ovn-kubernetes/ovnkube-node-7kggv" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.643948 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/b8ccb4f5-550a-41b2-b39d-201cdd5d902a-env-overrides\") pod \"ovnkube-node-7kggv\" (UID: \"b8ccb4f5-550a-41b2-b39d-201cdd5d902a\") " pod="openshift-ovn-kubernetes/ovnkube-node-7kggv" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.643958 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/b8ccb4f5-550a-41b2-b39d-201cdd5d902a-run-systemd\") pod \"ovnkube-node-7kggv\" (UID: \"b8ccb4f5-550a-41b2-b39d-201cdd5d902a\") " pod="openshift-ovn-kubernetes/ovnkube-node-7kggv" Mar 09 
13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.643978 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/b8ccb4f5-550a-41b2-b39d-201cdd5d902a-log-socket\") pod \"ovnkube-node-7kggv\" (UID: \"b8ccb4f5-550a-41b2-b39d-201cdd5d902a\") " pod="openshift-ovn-kubernetes/ovnkube-node-7kggv" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.644007 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b8ccb4f5-550a-41b2-b39d-201cdd5d902a-host-run-ovn-kubernetes\") pod \"ovnkube-node-7kggv\" (UID: \"b8ccb4f5-550a-41b2-b39d-201cdd5d902a\") " pod="openshift-ovn-kubernetes/ovnkube-node-7kggv" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.644291 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/b8ccb4f5-550a-41b2-b39d-201cdd5d902a-systemd-units\") pod \"ovnkube-node-7kggv\" (UID: \"b8ccb4f5-550a-41b2-b39d-201cdd5d902a\") " pod="openshift-ovn-kubernetes/ovnkube-node-7kggv" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.644599 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b8ccb4f5-550a-41b2-b39d-201cdd5d902a-var-lib-openvswitch\") pod \"ovnkube-node-7kggv\" (UID: \"b8ccb4f5-550a-41b2-b39d-201cdd5d902a\") " pod="openshift-ovn-kubernetes/ovnkube-node-7kggv" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.644634 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/b8ccb4f5-550a-41b2-b39d-201cdd5d902a-node-log\") pod \"ovnkube-node-7kggv\" (UID: \"b8ccb4f5-550a-41b2-b39d-201cdd5d902a\") " pod="openshift-ovn-kubernetes/ovnkube-node-7kggv" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.644858 4764 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/b8ccb4f5-550a-41b2-b39d-201cdd5d902a-ovnkube-config\") pod \"ovnkube-node-7kggv\" (UID: \"b8ccb4f5-550a-41b2-b39d-201cdd5d902a\") " pod="openshift-ovn-kubernetes/ovnkube-node-7kggv" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.645199 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/b8ccb4f5-550a-41b2-b39d-201cdd5d902a-ovnkube-script-lib\") pod \"ovnkube-node-7kggv\" (UID: \"b8ccb4f5-550a-41b2-b39d-201cdd5d902a\") " pod="openshift-ovn-kubernetes/ovnkube-node-7kggv" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.646999 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/b8ccb4f5-550a-41b2-b39d-201cdd5d902a-ovn-node-metrics-cert\") pod \"ovnkube-node-7kggv\" (UID: \"b8ccb4f5-550a-41b2-b39d-201cdd5d902a\") " pod="openshift-ovn-kubernetes/ovnkube-node-7kggv" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.659589 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.659656 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.659675 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.659692 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.659701 4764 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:28Z","lastTransitionTime":"2026-03-09T13:22:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.660833 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r5xrv\" (UniqueName: \"kubernetes.io/projected/b8ccb4f5-550a-41b2-b39d-201cdd5d902a-kube-api-access-r5xrv\") pod \"ovnkube-node-7kggv\" (UID: \"b8ccb4f5-550a-41b2-b39d-201cdd5d902a\") " pod="openshift-ovn-kubernetes/ovnkube-node-7kggv" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.721838 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-7kggv" Mar 09 13:22:28 crc kubenswrapper[4764]: W0309 13:22:28.733189 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb8ccb4f5_550a_41b2_b39d_201cdd5d902a.slice/crio-65054061a375abf79424d7095dced33c492d7e93ee2f22f668abd5a0c5ced956 WatchSource:0}: Error finding container 65054061a375abf79424d7095dced33c492d7e93ee2f22f668abd5a0c5ced956: Status 404 returned error can't find the container with id 65054061a375abf79424d7095dced33c492d7e93ee2f22f668abd5a0c5ced956 Mar 09 13:22:28 crc kubenswrapper[4764]: E0309 13:22:28.735304 4764 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 09 13:22:28 crc kubenswrapper[4764]: init container &Container{Name:kubecfg-setup,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c cat << EOF > /etc/ovn/kubeconfig Mar 09 13:22:28 crc kubenswrapper[4764]: apiVersion: v1 Mar 09 13:22:28 crc kubenswrapper[4764]: clusters: 
Mar 09 13:22:28 crc kubenswrapper[4764]: - cluster: Mar 09 13:22:28 crc kubenswrapper[4764]: certificate-authority: /var/run/secrets/kubernetes.io/serviceaccount/ca.crt Mar 09 13:22:28 crc kubenswrapper[4764]: server: https://api-int.crc.testing:6443 Mar 09 13:22:28 crc kubenswrapper[4764]: name: default-cluster Mar 09 13:22:28 crc kubenswrapper[4764]: contexts: Mar 09 13:22:28 crc kubenswrapper[4764]: - context: Mar 09 13:22:28 crc kubenswrapper[4764]: cluster: default-cluster Mar 09 13:22:28 crc kubenswrapper[4764]: namespace: default Mar 09 13:22:28 crc kubenswrapper[4764]: user: default-auth Mar 09 13:22:28 crc kubenswrapper[4764]: name: default-context Mar 09 13:22:28 crc kubenswrapper[4764]: current-context: default-context Mar 09 13:22:28 crc kubenswrapper[4764]: kind: Config Mar 09 13:22:28 crc kubenswrapper[4764]: preferences: {} Mar 09 13:22:28 crc kubenswrapper[4764]: users: Mar 09 13:22:28 crc kubenswrapper[4764]: - name: default-auth Mar 09 13:22:28 crc kubenswrapper[4764]: user: Mar 09 13:22:28 crc kubenswrapper[4764]: client-certificate: /etc/ovn/ovnkube-node-certs/ovnkube-client-current.pem Mar 09 13:22:28 crc kubenswrapper[4764]: client-key: /etc/ovn/ovnkube-node-certs/ovnkube-client-current.pem Mar 09 13:22:28 crc kubenswrapper[4764]: EOF Mar 09 13:22:28 crc kubenswrapper[4764]: 
],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-openvswitch,ReadOnly:false,MountPath:/etc/ovn/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-r5xrv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovnkube-node-7kggv_openshift-ovn-kubernetes(b8ccb4f5-550a-41b2-b39d-201cdd5d902a): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 09 13:22:28 crc kubenswrapper[4764]: > logger="UnhandledError" Mar 09 13:22:28 crc kubenswrapper[4764]: E0309 13:22:28.736472 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kubecfg-setup\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-ovn-kubernetes/ovnkube-node-7kggv" podUID="b8ccb4f5-550a-41b2-b39d-201cdd5d902a" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.740359 4764 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.761905 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.761944 4764 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.761958 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.761975 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.761986 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:28Z","lastTransitionTime":"2026-03-09T13:22:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.864709 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.864771 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.864786 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.864804 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.864817 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:28Z","lastTransitionTime":"2026-03-09T13:22:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.901185 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" event={"ID":"6bcdd179-43c2-427c-9fac-7155c122e922","Type":"ContainerStarted","Data":"53e57fc2015ccb3794b104e1fb3b1e815a6b724ebb5c7cdb477b30d5440dd8ad"} Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.902153 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-r5bnx" event={"ID":"7fede188-66d9-4cb1-af19-c94afe7fbcde","Type":"ContainerStarted","Data":"78819a166f081d16d2602205a32b059426bbf5489e3535ca772ed8601874dbea"} Mar 09 13:22:28 crc kubenswrapper[4764]: E0309 13:22:28.902925 4764 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:machine-config-daemon,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a,Command:[/usr/bin/machine-config-daemon],Args:[start --payload-version=4.18.1],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:health,HostPort:8798,ContainerPort:8798,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{52428800 0} {} 50Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:rootfs,ReadOnly:false,MountPath:/rootfs,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-25jms,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/health,Port:{0 8798 },Host:127.0.0.1,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:120,TimeoutSeconds:1,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:*false,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod machine-config-daemon-xxczl_openshift-machine-config-operator(6bcdd179-43c2-427c-9fac-7155c122e922): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.903469 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-crvdf" event={"ID":"072442e6-8ece-4f72-a8cb-ad7ef1e3facb","Type":"ContainerStarted","Data":"40f63e24da2ee3c429a77ccc1ab25e68446cce3675cd0f629706236630ebbef0"} Mar 09 13:22:28 crc kubenswrapper[4764]: E0309 13:22:28.904730 4764 kuberuntime_manager.go:1274] "Unhandled Error" err="init container 
&Container{Name:egress-router-binary-copy,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c,Command:[/entrypoint/cnibincopy.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:RHEL8_SOURCE_DIRECTORY,Value:/usr/src/egress-router-cni/rhel8/bin/,ValueFrom:nil,},EnvVar{Name:RHEL9_SOURCE_DIRECTORY,Value:/usr/src/egress-router-cni/rhel9/bin/,ValueFrom:nil,},EnvVar{Name:DEFAULT_SOURCE_DIRECTORY,Value:/usr/src/egress-router-cni/bin/,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:cni-binary-copy,ReadOnly:false,MountPath:/entrypoint,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:cnibin,ReadOnly:false,MountPath:/host/opt/cni/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:os-release,ReadOnly:true,MountPath:/host/etc/os-release,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-qjrkl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod multus-additional-cni-plugins-crvdf_openshift-multus(072442e6-8ece-4f72-a8cb-ad7ef1e3facb): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Mar 09 13:22:28 crc kubenswrapper[4764]: E0309 13:22:28.905177 4764 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:kube-rbac-proxy,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09,Command:[],Args:[--secure-listen-address=0.0.0.0:9001 --config-file=/etc/kube-rbac-proxy/config-file.yaml --tls-cipher-suites=TLS_AES_128_GCM_SHA256,TLS_AES_256_GCM_SHA384,TLS_CHACHA20_POLY1305_SHA256,TLS_ECDHE_ECDSA_WITH_AES_128_GCM_SHA256,TLS_ECDHE_RSA_WITH_AES_128_GCM_SHA256,TLS_ECDHE_ECDSA_WITH_AES_256_GCM_SHA384,TLS_ECDHE_RSA_WITH_AES_256_GCM_SHA384,TLS_ECDHE_ECDSA_WITH_CHACHA20_POLY1305_SHA256,TLS_ECDHE_RSA_WITH_CHACHA20_POLY1305_SHA256 --tls-min-version=VersionTLS12 --upstream=http://127.0.0.1:8797 --logtostderr=true --tls-cert-file=/etc/tls/private/tls.crt --tls-private-key-file=/etc/tls/private/tls.key],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:9001,ContainerPort:9001,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:proxy-tls,ReadOnly:false,MountPath:/etc/tls/private,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:mcd-auth-proxy-config,ReadOnly:false,MountPath:/etc/kube-rbac-proxy,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-25jms,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
machine-config-daemon-xxczl_openshift-machine-config-operator(6bcdd179-43c2-427c-9fac-7155c122e922): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Mar 09 13:22:28 crc kubenswrapper[4764]: E0309 13:22:28.905822 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"egress-router-binary-copy\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-multus/multus-additional-cni-plugins-crvdf" podUID="072442e6-8ece-4f72-a8cb-ad7ef1e3facb" Mar 09 13:22:28 crc kubenswrapper[4764]: E0309 13:22:28.906256 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"machine-config-daemon\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"]" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.906430 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-zmzm7" event={"ID":"202a1f58-ce83-4374-ac48-dc806f7b9d6b","Type":"ContainerStarted","Data":"a445c4d4ff8c0e7680f3c86119feb92a65fc69ae77f288e56f271709f496c0ac"} Mar 09 13:22:28 crc kubenswrapper[4764]: E0309 13:22:28.906836 4764 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 09 13:22:28 crc kubenswrapper[4764]: container &Container{Name:dns-node-resolver,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,Command:[/bin/bash -c #!/bin/bash Mar 09 13:22:28 crc kubenswrapper[4764]: set -uo pipefail Mar 09 13:22:28 crc kubenswrapper[4764]: Mar 09 13:22:28 crc 
kubenswrapper[4764]: trap 'jobs -p | xargs kill || true; wait; exit 0' TERM Mar 09 13:22:28 crc kubenswrapper[4764]: Mar 09 13:22:28 crc kubenswrapper[4764]: OPENSHIFT_MARKER="openshift-generated-node-resolver" Mar 09 13:22:28 crc kubenswrapper[4764]: HOSTS_FILE="/etc/hosts" Mar 09 13:22:28 crc kubenswrapper[4764]: TEMP_FILE="/etc/hosts.tmp" Mar 09 13:22:28 crc kubenswrapper[4764]: Mar 09 13:22:28 crc kubenswrapper[4764]: IFS=', ' read -r -a services <<< "${SERVICES}" Mar 09 13:22:28 crc kubenswrapper[4764]: Mar 09 13:22:28 crc kubenswrapper[4764]: # Make a temporary file with the old hosts file's attributes. Mar 09 13:22:28 crc kubenswrapper[4764]: if ! cp -f --attributes-only "${HOSTS_FILE}" "${TEMP_FILE}"; then Mar 09 13:22:28 crc kubenswrapper[4764]: echo "Failed to preserve hosts file. Exiting." Mar 09 13:22:28 crc kubenswrapper[4764]: exit 1 Mar 09 13:22:28 crc kubenswrapper[4764]: fi Mar 09 13:22:28 crc kubenswrapper[4764]: Mar 09 13:22:28 crc kubenswrapper[4764]: while true; do Mar 09 13:22:28 crc kubenswrapper[4764]: declare -A svc_ips Mar 09 13:22:28 crc kubenswrapper[4764]: for svc in "${services[@]}"; do Mar 09 13:22:28 crc kubenswrapper[4764]: # Fetch service IP from cluster dns if present. We make several tries Mar 09 13:22:28 crc kubenswrapper[4764]: # to do it: IPv4, IPv6, IPv4 over TCP and IPv6 over TCP. The two last ones Mar 09 13:22:28 crc kubenswrapper[4764]: # are for deployments with Kuryr on older OpenStack (OSP13) - those do not Mar 09 13:22:28 crc kubenswrapper[4764]: # support UDP loadbalancers and require reaching DNS through TCP. 
Mar 09 13:22:28 crc kubenswrapper[4764]: cmds=('dig -t A @"${NAMESERVER}" +short "${svc}.${CLUSTER_DOMAIN}"|grep -v "^;"' Mar 09 13:22:28 crc kubenswrapper[4764]: 'dig -t AAAA @"${NAMESERVER}" +short "${svc}.${CLUSTER_DOMAIN}"|grep -v "^;"' Mar 09 13:22:28 crc kubenswrapper[4764]: 'dig -t A +tcp +retry=0 @"${NAMESERVER}" +short "${svc}.${CLUSTER_DOMAIN}"|grep -v "^;"' Mar 09 13:22:28 crc kubenswrapper[4764]: 'dig -t AAAA +tcp +retry=0 @"${NAMESERVER}" +short "${svc}.${CLUSTER_DOMAIN}"|grep -v "^;"') Mar 09 13:22:28 crc kubenswrapper[4764]: for i in ${!cmds[*]} Mar 09 13:22:28 crc kubenswrapper[4764]: do Mar 09 13:22:28 crc kubenswrapper[4764]: ips=($(eval "${cmds[i]}")) Mar 09 13:22:28 crc kubenswrapper[4764]: if [[ "$?" -eq 0 && "${#ips[@]}" -ne 0 ]]; then Mar 09 13:22:28 crc kubenswrapper[4764]: svc_ips["${svc}"]="${ips[@]}" Mar 09 13:22:28 crc kubenswrapper[4764]: break Mar 09 13:22:28 crc kubenswrapper[4764]: fi Mar 09 13:22:28 crc kubenswrapper[4764]: done Mar 09 13:22:28 crc kubenswrapper[4764]: done Mar 09 13:22:28 crc kubenswrapper[4764]: Mar 09 13:22:28 crc kubenswrapper[4764]: # Update /etc/hosts only if we get valid service IPs Mar 09 13:22:28 crc kubenswrapper[4764]: # We will not update /etc/hosts when there is coredns service outage or api unavailability Mar 09 13:22:28 crc kubenswrapper[4764]: # Stale entries could exist in /etc/hosts if the service is deleted Mar 09 13:22:28 crc kubenswrapper[4764]: if [[ -n "${svc_ips[*]-}" ]]; then Mar 09 13:22:28 crc kubenswrapper[4764]: # Build a new hosts file from /etc/hosts with our custom entries filtered out Mar 09 13:22:28 crc kubenswrapper[4764]: if ! 
sed --silent "/# ${OPENSHIFT_MARKER}/d; w ${TEMP_FILE}" "${HOSTS_FILE}"; then Mar 09 13:22:28 crc kubenswrapper[4764]: # Only continue rebuilding the hosts entries if its original content is preserved Mar 09 13:22:28 crc kubenswrapper[4764]: sleep 60 & wait Mar 09 13:22:28 crc kubenswrapper[4764]: continue Mar 09 13:22:28 crc kubenswrapper[4764]: fi Mar 09 13:22:28 crc kubenswrapper[4764]: Mar 09 13:22:28 crc kubenswrapper[4764]: # Append resolver entries for services Mar 09 13:22:28 crc kubenswrapper[4764]: rc=0 Mar 09 13:22:28 crc kubenswrapper[4764]: for svc in "${!svc_ips[@]}"; do Mar 09 13:22:28 crc kubenswrapper[4764]: for ip in ${svc_ips[${svc}]}; do Mar 09 13:22:28 crc kubenswrapper[4764]: echo "${ip} ${svc} ${svc}.${CLUSTER_DOMAIN} # ${OPENSHIFT_MARKER}" >> "${TEMP_FILE}" || rc=$? Mar 09 13:22:28 crc kubenswrapper[4764]: done Mar 09 13:22:28 crc kubenswrapper[4764]: done Mar 09 13:22:28 crc kubenswrapper[4764]: if [[ $rc -ne 0 ]]; then Mar 09 13:22:28 crc kubenswrapper[4764]: sleep 60 & wait Mar 09 13:22:28 crc kubenswrapper[4764]: continue Mar 09 13:22:28 crc kubenswrapper[4764]: fi Mar 09 13:22:28 crc kubenswrapper[4764]: Mar 09 13:22:28 crc kubenswrapper[4764]: Mar 09 13:22:28 crc kubenswrapper[4764]: # TODO: Update /etc/hosts atomically to avoid any inconsistent behavior Mar 09 13:22:28 crc kubenswrapper[4764]: # Replace /etc/hosts with our modified version if needed Mar 09 13:22:28 crc kubenswrapper[4764]: cmp "${TEMP_FILE}" "${HOSTS_FILE}" || cp -f "${TEMP_FILE}" "${HOSTS_FILE}" Mar 09 13:22:28 crc kubenswrapper[4764]: # TEMP_FILE is not removed to avoid file create/delete and attributes copy churn Mar 09 13:22:28 crc kubenswrapper[4764]: fi Mar 09 13:22:28 crc kubenswrapper[4764]: sleep 60 & wait Mar 09 13:22:28 crc kubenswrapper[4764]: unset svc_ips Mar 09 13:22:28 crc kubenswrapper[4764]: done Mar 09 13:22:28 crc kubenswrapper[4764]: 
],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:SERVICES,Value:image-registry.openshift-image-registry.svc,ValueFrom:nil,},EnvVar{Name:NAMESERVER,Value:10.217.4.10,ValueFrom:nil,},EnvVar{Name:CLUSTER_DOMAIN,Value:cluster.local,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{22020096 0} {} 21Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:hosts-file,ReadOnly:false,MountPath:/etc/hosts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-pjp5c,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod node-resolver-r5bnx_openshift-dns(7fede188-66d9-4cb1-af19-c94afe7fbcde): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 09 13:22:28 crc kubenswrapper[4764]: > logger="UnhandledError" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.907368 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7kggv" event={"ID":"b8ccb4f5-550a-41b2-b39d-201cdd5d902a","Type":"ContainerStarted","Data":"65054061a375abf79424d7095dced33c492d7e93ee2f22f668abd5a0c5ced956"} Mar 09 13:22:28 crc 
kubenswrapper[4764]: E0309 13:22:28.907915 4764 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 09 13:22:28 crc kubenswrapper[4764]: container &Container{Name:kube-multus,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26,Command:[/bin/bash -ec --],Args:[MULTUS_DAEMON_OPT="" Mar 09 13:22:28 crc kubenswrapper[4764]: /entrypoint/cnibincopy.sh; exec /usr/src/multus-cni/bin/multus-daemon $MULTUS_DAEMON_OPT Mar 09 13:22:28 crc kubenswrapper[4764]: ],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:RHEL8_SOURCE_DIRECTORY,Value:/usr/src/multus-cni/rhel8/bin/,ValueFrom:nil,},EnvVar{Name:RHEL9_SOURCE_DIRECTORY,Value:/usr/src/multus-cni/rhel9/bin/,ValueFrom:nil,},EnvVar{Name:DEFAULT_SOURCE_DIRECTORY,Value:/usr/src/multus-cni/bin/,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:6443,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:api-int.crc.testing,ValueFrom:nil,},EnvVar{Name:MULTUS_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:K8S_NODE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{68157440 0} {} 65Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:cni-binary-copy,ReadOnly:false,MountPath:/entrypoint,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:os-release,ReadOnly:false,MountPath:/host/etc/os-release,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:system-cni-dir,ReadOnly:false,MountPath:/host/etc/cni/net.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:multus-cni-dir,ReadOnly:false,MountPath:/host/run/multus/cni/net.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:cnibin,ReadOnly:false,MountPath:/host/opt/cni/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:multus-socket-dir-parent,ReadOnly:false,MountPath:/host/run/multus,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-run-k8s-cni-cncf-io,ReadOnly:false,MountPath:/run/k8s.cni.cncf.io,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-run-netns,ReadOnly:false,MountPath:/run/netns,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-var-lib-cni-bin,ReadOnly:false,MountPath:/var/lib/cni/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-var-lib-cni-multus,ReadOnly:false,MountPath:/var/lib/cni/multus,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-var-lib-kubelet,ReadOnly:false,MountPath:/var/lib/kubelet,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:hostroot,ReadOnly:false,MountPath:/hostroot,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:multus-conf-dir,ReadOnly:false,MountPath:/etc/cni/multus/net.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{
Name:multus-daemon-config,ReadOnly:true,MountPath:/etc/cni/net.d/multus.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-run-multus-certs,ReadOnly:false,MountPath:/etc/cni/multus/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:etc-kubernetes,ReadOnly:false,MountPath:/etc/kubernetes,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rk5pb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod multus-zmzm7_openshift-multus(202a1f58-ce83-4374-ac48-dc806f7b9d6b): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 09 13:22:28 crc kubenswrapper[4764]: > logger="UnhandledError" Mar 09 13:22:28 crc kubenswrapper[4764]: E0309 13:22:28.910178 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"dns-node-resolver\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-dns/node-resolver-r5bnx" podUID="7fede188-66d9-4cb1-af19-c94afe7fbcde" Mar 09 13:22:28 crc kubenswrapper[4764]: E0309 13:22:28.910335 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" 
for \"kube-multus\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-multus/multus-zmzm7" podUID="202a1f58-ce83-4374-ac48-dc806f7b9d6b" Mar 09 13:22:28 crc kubenswrapper[4764]: E0309 13:22:28.911135 4764 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 09 13:22:28 crc kubenswrapper[4764]: init container &Container{Name:kubecfg-setup,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c cat << EOF > /etc/ovn/kubeconfig Mar 09 13:22:28 crc kubenswrapper[4764]: apiVersion: v1 Mar 09 13:22:28 crc kubenswrapper[4764]: clusters: Mar 09 13:22:28 crc kubenswrapper[4764]: - cluster: Mar 09 13:22:28 crc kubenswrapper[4764]: certificate-authority: /var/run/secrets/kubernetes.io/serviceaccount/ca.crt Mar 09 13:22:28 crc kubenswrapper[4764]: server: https://api-int.crc.testing:6443 Mar 09 13:22:28 crc kubenswrapper[4764]: name: default-cluster Mar 09 13:22:28 crc kubenswrapper[4764]: contexts: Mar 09 13:22:28 crc kubenswrapper[4764]: - context: Mar 09 13:22:28 crc kubenswrapper[4764]: cluster: default-cluster Mar 09 13:22:28 crc kubenswrapper[4764]: namespace: default Mar 09 13:22:28 crc kubenswrapper[4764]: user: default-auth Mar 09 13:22:28 crc kubenswrapper[4764]: name: default-context Mar 09 13:22:28 crc kubenswrapper[4764]: current-context: default-context Mar 09 13:22:28 crc kubenswrapper[4764]: kind: Config Mar 09 13:22:28 crc kubenswrapper[4764]: preferences: {} Mar 09 13:22:28 crc kubenswrapper[4764]: users: Mar 09 13:22:28 crc kubenswrapper[4764]: - name: default-auth Mar 09 13:22:28 crc kubenswrapper[4764]: user: Mar 09 13:22:28 crc kubenswrapper[4764]: client-certificate: /etc/ovn/ovnkube-node-certs/ovnkube-client-current.pem Mar 09 13:22:28 crc kubenswrapper[4764]: client-key: /etc/ovn/ovnkube-node-certs/ovnkube-client-current.pem Mar 09 13:22:28 crc kubenswrapper[4764]: EOF Mar 09 13:22:28 
crc kubenswrapper[4764]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-openvswitch,ReadOnly:false,MountPath:/etc/ovn/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-r5xrv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovnkube-node-7kggv_openshift-ovn-kubernetes(b8ccb4f5-550a-41b2-b39d-201cdd5d902a): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 09 13:22:28 crc kubenswrapper[4764]: > logger="UnhandledError" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.912237 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6bcdd179-43c2-427c-9fac-7155c122e922\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25jms\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25jms\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:28Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xxczl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:28 crc kubenswrapper[4764]: E0309 13:22:28.912438 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kubecfg-setup\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct 
envvars\"" pod="openshift-ovn-kubernetes/ovnkube-node-7kggv" podUID="b8ccb4f5-550a-41b2-b39d-201cdd5d902a" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.921911 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.932031 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.943334 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.953750 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.962608 4764 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.966840 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.966871 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.966881 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.966895 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.966905 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:28Z","lastTransitionTime":"2026-03-09T13:22:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.972693 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7c21ab2-1820-47de-a61d-71d81928564a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://482469a933d0aba722e6a341f6934669e0d84998bd2c99d6962a17548ab80ce7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3571aa0b654db21a79a11d6c58af9f858d2493283d9eff7a84f3404c34dcf0b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://4aae266a5256be6c65a7637dfaed552087689f317f7b61dcb52f4aa6d4d4d7e6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://033930947dc527026e2797f16a6dd9181c8c7d8ec88b09031a132f55b953c356\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://033930947dc527026e2797f16a6dd9181c8c7d8ec88b09031a132f55b953c356\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T13:21:44Z\\\",\\\"message\\\":\\\"W0309 13:21:43.803612 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0309 13:21:43.803959 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773062503 cert, and key in /tmp/serving-cert-1644690243/serving-signer.crt, /tmp/serving-cert-1644690243/serving-signer.key\\\\nI0309 13:21:44.168505 1 observer_polling.go:159] Starting file observer\\\\nW0309 13:21:44.177360 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0309 13:21:44.177910 1 builder.go:304] check-endpoints 
version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0309 13:21:44.178828 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1644690243/tls.crt::/tmp/serving-cert-1644690243/tls.key\\\\\\\"\\\\nF0309 13:21:44.724786 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:43Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3a34e47ede9b05cec931ca6337a98381bcceeb735352702d19128e8dfe8476f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:48Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a38de1b3854f33b5728435571b8f0df92aeaae7d8e8a235ed58044fa2a
9e09c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a38de1b3854f33b5728435571b8f0df92aeaae7d8e8a235ed58044fa2a9e09c3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:20:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.980043 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-r5bnx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7fede188-66d9-4cb1-af19-c94afe7fbcde\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:27Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pjp5c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:27Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-r5bnx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.988685 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zmzm7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"202a1f58-ce83-4374-ac48-dc806f7b9d6b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rk5pb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zmzm7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:28 crc kubenswrapper[4764]: I0309 13:22:28.999270 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-crvdf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"072442e6-8ece-4f72-a8cb-ad7ef1e3facb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"
recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-de
v@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{
\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-crvdf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:29 crc kubenswrapper[4764]: I0309 13:22:29.013381 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7kggv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b8ccb4f5-550a-41b2-b39d-201cdd5d902a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node 
kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\
":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\
\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\
"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secr
ets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7kggv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:29 crc kubenswrapper[4764]: I0309 13:22:29.022298 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:29 crc kubenswrapper[4764]: I0309 13:22:29.029802 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6bcdd179-43c2-427c-9fac-7155c122e922\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25jms\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25jms\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:28Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xxczl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:29 crc kubenswrapper[4764]: I0309 13:22:29.037607 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:29 crc kubenswrapper[4764]: I0309 13:22:29.045615 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:29 crc kubenswrapper[4764]: I0309 13:22:29.053910 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:29 crc kubenswrapper[4764]: I0309 13:22:29.060993 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:29 crc kubenswrapper[4764]: I0309 13:22:29.069830 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:29 crc 
kubenswrapper[4764]: I0309 13:22:29.069877 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:29 crc kubenswrapper[4764]: I0309 13:22:29.069889 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:29 crc kubenswrapper[4764]: I0309 13:22:29.069909 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:29 crc kubenswrapper[4764]: I0309 13:22:29.069922 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:29Z","lastTransitionTime":"2026-03-09T13:22:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:22:29 crc kubenswrapper[4764]: I0309 13:22:29.070472 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:29 crc kubenswrapper[4764]: I0309 13:22:29.081035 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7c21ab2-1820-47de-a61d-71d81928564a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://482469a933d0aba722e6a341f6934669e0d84998bd2c99d6962a17548ab80ce7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3571aa0b654db21a79a11d6c58af9f858d2493283d9eff7a84f3404c34dcf0b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4aae266a5256be6c65a7637dfaed552087689f317f7b61dcb52f4aa6d4d4d7e6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://033930947dc527026e2797f16a6dd9181c8c7d8ec88b09031a132f55b953c356\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://033930947dc527026e2797f16a6dd9181c8c7d8ec88b09031a132f55b953c356\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T13:21:44Z\\\",\\\"message\\\":\\\"W0309 13:21:43.803612 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0309 13:21:43.803959 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773062503 cert, and key in /tmp/serving-cert-1644690243/serving-signer.crt, /tmp/serving-cert-1644690243/serving-signer.key\\\\nI0309 13:21:44.168505 1 observer_polling.go:159] Starting file observer\\\\nW0309 13:21:44.177360 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0309 13:21:44.177910 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0309 13:21:44.178828 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1644690243/tls.crt::/tmp/serving-cert-1644690243/tls.key\\\\\\\"\\\\nF0309 13:21:44.724786 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:43Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3a34e47ede9b05cec931ca6337a98381bcceeb735352702d19128e8dfe8476f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:48Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a38de1b3854f33b5728435571b8f0df92aeaae7d8e8a235ed58044fa2a9e09c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a38de1b3854f33b5728435571b8f0df92aeaae7d8e8a235ed58044fa2a9e09c3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\
\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:20:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:29 crc kubenswrapper[4764]: I0309 13:22:29.087891 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-r5bnx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7fede188-66d9-4cb1-af19-c94afe7fbcde\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:27Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pjp5c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:27Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-r5bnx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:29 crc kubenswrapper[4764]: I0309 13:22:29.095796 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zmzm7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"202a1f58-ce83-4374-ac48-dc806f7b9d6b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rk5pb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zmzm7\": Internal error occurred: 
failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:29 crc kubenswrapper[4764]: I0309 13:22:29.105874 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-crvdf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"072442e6-8ece-4f72-a8cb-ad7ef1e3facb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\
\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\"
:\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-crvdf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:29 crc kubenswrapper[4764]: I0309 13:22:29.119991 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7kggv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b8ccb4f5-550a-41b2-b39d-201cdd5d902a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:28Z\\\"}}\" for 
pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7kggv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:29 crc kubenswrapper[4764]: I0309 13:22:29.128752 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:29 crc kubenswrapper[4764]: I0309 13:22:29.172540 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:29 crc kubenswrapper[4764]: I0309 13:22:29.172605 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:29 crc kubenswrapper[4764]: I0309 13:22:29.172620 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:29 crc kubenswrapper[4764]: I0309 13:22:29.172675 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:29 crc kubenswrapper[4764]: I0309 13:22:29.172699 4764 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:29Z","lastTransitionTime":"2026-03-09T13:22:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:22:29 crc kubenswrapper[4764]: I0309 13:22:29.274785 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:29 crc kubenswrapper[4764]: I0309 13:22:29.274842 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:29 crc kubenswrapper[4764]: I0309 13:22:29.274860 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:29 crc kubenswrapper[4764]: I0309 13:22:29.274882 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:29 crc kubenswrapper[4764]: I0309 13:22:29.274898 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:29Z","lastTransitionTime":"2026-03-09T13:22:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:29 crc kubenswrapper[4764]: I0309 13:22:29.378110 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:29 crc kubenswrapper[4764]: I0309 13:22:29.378161 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:29 crc kubenswrapper[4764]: I0309 13:22:29.378176 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:29 crc kubenswrapper[4764]: I0309 13:22:29.378199 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:29 crc kubenswrapper[4764]: I0309 13:22:29.378225 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:29Z","lastTransitionTime":"2026-03-09T13:22:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:29 crc kubenswrapper[4764]: I0309 13:22:29.482042 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:29 crc kubenswrapper[4764]: I0309 13:22:29.482086 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:29 crc kubenswrapper[4764]: I0309 13:22:29.482097 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:29 crc kubenswrapper[4764]: I0309 13:22:29.482115 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:29 crc kubenswrapper[4764]: I0309 13:22:29.482127 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:29Z","lastTransitionTime":"2026-03-09T13:22:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:29 crc kubenswrapper[4764]: E0309 13:22:29.561572 4764 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:iptables-alerter,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,Command:[/iptables-alerter/iptables-alerter.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONTAINER_RUNTIME_ENDPOINT,Value:unix:///run/crio/crio.sock,ValueFrom:nil,},EnvVar{Name:ALERTER_POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{68157440 0} {} 65Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:iptables-alerter-script,ReadOnly:false,MountPath:/iptables-alerter,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-slash,ReadOnly:true,MountPath:/host,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rczfb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},Re
startPolicy:nil,} start failed in pod iptables-alerter-4ln5h_openshift-network-operator(d75a4c96-2883-4a0b-bab2-0fab2b6c0b49): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Mar 09 13:22:29 crc kubenswrapper[4764]: E0309 13:22:29.562861 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"iptables-alerter\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-network-operator/iptables-alerter-4ln5h" podUID="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" Mar 09 13:22:29 crc kubenswrapper[4764]: I0309 13:22:29.584873 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:29 crc kubenswrapper[4764]: I0309 13:22:29.584935 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:29 crc kubenswrapper[4764]: I0309 13:22:29.584948 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:29 crc kubenswrapper[4764]: I0309 13:22:29.584973 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:29 crc kubenswrapper[4764]: I0309 13:22:29.584986 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:29Z","lastTransitionTime":"2026-03-09T13:22:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:29 crc kubenswrapper[4764]: I0309 13:22:29.688737 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:29 crc kubenswrapper[4764]: I0309 13:22:29.688796 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:29 crc kubenswrapper[4764]: I0309 13:22:29.688808 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:29 crc kubenswrapper[4764]: I0309 13:22:29.688829 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:29 crc kubenswrapper[4764]: I0309 13:22:29.688842 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:29Z","lastTransitionTime":"2026-03-09T13:22:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:29 crc kubenswrapper[4764]: I0309 13:22:29.791377 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:29 crc kubenswrapper[4764]: I0309 13:22:29.791668 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:29 crc kubenswrapper[4764]: I0309 13:22:29.791784 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:29 crc kubenswrapper[4764]: I0309 13:22:29.791852 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:29 crc kubenswrapper[4764]: I0309 13:22:29.791908 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:29Z","lastTransitionTime":"2026-03-09T13:22:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:29 crc kubenswrapper[4764]: I0309 13:22:29.895656 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:29 crc kubenswrapper[4764]: I0309 13:22:29.895908 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:29 crc kubenswrapper[4764]: I0309 13:22:29.896013 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:29 crc kubenswrapper[4764]: I0309 13:22:29.896130 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:29 crc kubenswrapper[4764]: I0309 13:22:29.896214 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:29Z","lastTransitionTime":"2026-03-09T13:22:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:29 crc kubenswrapper[4764]: I0309 13:22:29.998932 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:29 crc kubenswrapper[4764]: I0309 13:22:29.998990 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:29 crc kubenswrapper[4764]: I0309 13:22:29.999003 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:29 crc kubenswrapper[4764]: I0309 13:22:29.999021 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:29 crc kubenswrapper[4764]: I0309 13:22:29.999035 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:29Z","lastTransitionTime":"2026-03-09T13:22:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:30 crc kubenswrapper[4764]: I0309 13:22:30.102657 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:30 crc kubenswrapper[4764]: I0309 13:22:30.102706 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:30 crc kubenswrapper[4764]: I0309 13:22:30.102720 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:30 crc kubenswrapper[4764]: I0309 13:22:30.102739 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:30 crc kubenswrapper[4764]: I0309 13:22:30.102749 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:30Z","lastTransitionTime":"2026-03-09T13:22:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:30 crc kubenswrapper[4764]: I0309 13:22:30.205211 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:30 crc kubenswrapper[4764]: I0309 13:22:30.205255 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:30 crc kubenswrapper[4764]: I0309 13:22:30.205264 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:30 crc kubenswrapper[4764]: I0309 13:22:30.205278 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:30 crc kubenswrapper[4764]: I0309 13:22:30.205288 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:30Z","lastTransitionTime":"2026-03-09T13:22:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:30 crc kubenswrapper[4764]: I0309 13:22:30.308451 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:30 crc kubenswrapper[4764]: I0309 13:22:30.308498 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:30 crc kubenswrapper[4764]: I0309 13:22:30.308509 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:30 crc kubenswrapper[4764]: I0309 13:22:30.308529 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:30 crc kubenswrapper[4764]: I0309 13:22:30.308539 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:30Z","lastTransitionTime":"2026-03-09T13:22:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:22:30 crc kubenswrapper[4764]: E0309 13:22:30.361489 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 13:22:46.361460036 +0000 UTC m=+121.611631954 (durationBeforeRetry 16s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 13:22:30 crc kubenswrapper[4764]: I0309 13:22:30.361544 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 13:22:30 crc kubenswrapper[4764]: I0309 13:22:30.361788 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 13:22:30 crc kubenswrapper[4764]: I0309 13:22:30.361849 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 13:22:30 crc kubenswrapper[4764]: I0309 13:22:30.361886 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") 
pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 13:22:30 crc kubenswrapper[4764]: I0309 13:22:30.361924 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 13:22:30 crc kubenswrapper[4764]: E0309 13:22:30.362038 4764 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 09 13:22:30 crc kubenswrapper[4764]: E0309 13:22:30.362098 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-09 13:22:46.362087712 +0000 UTC m=+121.612259630 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 09 13:22:30 crc kubenswrapper[4764]: E0309 13:22:30.362265 4764 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 09 13:22:30 crc kubenswrapper[4764]: E0309 13:22:30.362291 4764 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 09 13:22:30 crc kubenswrapper[4764]: E0309 13:22:30.362307 4764 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 09 13:22:30 crc kubenswrapper[4764]: E0309 13:22:30.362350 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-09 13:22:46.362339929 +0000 UTC m=+121.612511847 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 09 13:22:30 crc kubenswrapper[4764]: E0309 13:22:30.362427 4764 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 09 13:22:30 crc kubenswrapper[4764]: E0309 13:22:30.362467 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-09 13:22:46.362456702 +0000 UTC m=+121.612628620 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 09 13:22:30 crc kubenswrapper[4764]: E0309 13:22:30.362551 4764 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 09 13:22:30 crc kubenswrapper[4764]: E0309 13:22:30.362573 4764 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 09 13:22:30 crc kubenswrapper[4764]: E0309 13:22:30.362584 4764 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 09 13:22:30 crc kubenswrapper[4764]: E0309 13:22:30.362615 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-09 13:22:46.362606746 +0000 UTC m=+121.612778664 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 09 13:22:30 crc kubenswrapper[4764]: I0309 13:22:30.413383 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:30 crc kubenswrapper[4764]: I0309 13:22:30.413420 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:30 crc kubenswrapper[4764]: I0309 13:22:30.413429 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:30 crc kubenswrapper[4764]: I0309 13:22:30.413446 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:30 crc kubenswrapper[4764]: I0309 13:22:30.413455 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:30Z","lastTransitionTime":"2026-03-09T13:22:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:30 crc kubenswrapper[4764]: I0309 13:22:30.516199 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:30 crc kubenswrapper[4764]: I0309 13:22:30.516761 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:30 crc kubenswrapper[4764]: I0309 13:22:30.516837 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:30 crc kubenswrapper[4764]: I0309 13:22:30.516929 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:30 crc kubenswrapper[4764]: I0309 13:22:30.517005 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:30Z","lastTransitionTime":"2026-03-09T13:22:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:22:30 crc kubenswrapper[4764]: I0309 13:22:30.558807 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 13:22:30 crc kubenswrapper[4764]: I0309 13:22:30.558953 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 13:22:30 crc kubenswrapper[4764]: I0309 13:22:30.559010 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 13:22:30 crc kubenswrapper[4764]: E0309 13:22:30.559138 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 13:22:30 crc kubenswrapper[4764]: E0309 13:22:30.559227 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 13:22:30 crc kubenswrapper[4764]: E0309 13:22:30.559352 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 13:22:30 crc kubenswrapper[4764]: E0309 13:22:30.560307 4764 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 09 13:22:30 crc kubenswrapper[4764]: container &Container{Name:network-operator,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,Command:[/bin/bash -c #!/bin/bash Mar 09 13:22:30 crc kubenswrapper[4764]: set -o allexport Mar 09 13:22:30 crc kubenswrapper[4764]: if [[ -f /etc/kubernetes/apiserver-url.env ]]; then Mar 09 13:22:30 crc kubenswrapper[4764]: source /etc/kubernetes/apiserver-url.env Mar 09 13:22:30 crc kubenswrapper[4764]: else Mar 09 13:22:30 crc kubenswrapper[4764]: echo "Error: /etc/kubernetes/apiserver-url.env is missing" Mar 09 13:22:30 crc kubenswrapper[4764]: exit 1 Mar 09 13:22:30 crc kubenswrapper[4764]: fi Mar 09 13:22:30 crc kubenswrapper[4764]: exec /usr/bin/cluster-network-operator start --listen=0.0.0.0:9104 Mar 09 13:22:30 crc kubenswrapper[4764]: 
],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:cno,HostPort:9104,ContainerPort:9104,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:RELEASE_VERSION,Value:4.18.1,ValueFrom:nil,},EnvVar{Name:KUBE_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b97554198294bf544fbc116c94a0a1fb2ec8a4de0e926bf9d9e320135f0bee6f,ValueFrom:nil,},EnvVar{Name:KUBE_RBAC_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09,ValueFrom:nil,},EnvVar{Name:MULTUS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26,ValueFrom:nil,},EnvVar{Name:MULTUS_ADMISSION_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317,ValueFrom:nil,},EnvVar{Name:CNI_PLUGINS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc,ValueFrom:nil,},EnvVar{Name:BOND_CNI_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78,ValueFrom:nil,},EnvVar{Name:WHEREABOUTS_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4,ValueFrom:nil,},EnvVar{Name:ROUTE_OVERRRIDE_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa,ValueFrom:nil,},EnvVar{Name:MULTUS_NETWORKPOLICY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:23f833d3738d68706eb2f2868bd76bd71cee016cffa6faf5f045a60cc8c6eddd,ValueFrom:nil,},EnvVar{Name:OVN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,ValueFrom:nil,},EnvVar{Name:OVN_NB_RAFT_ELECTION_TIMER,Value:10,ValueFrom:nil,},
EnvVar{Name:OVN_SB_RAFT_ELECTION_TIMER,Value:16,ValueFrom:nil,},EnvVar{Name:OVN_NORTHD_PROBE_INTERVAL,Value:10000,ValueFrom:nil,},EnvVar{Name:OVN_CONTROLLER_INACTIVITY_PROBE,Value:180000,ValueFrom:nil,},EnvVar{Name:OVN_NB_INACTIVITY_PROBE,Value:60000,ValueFrom:nil,},EnvVar{Name:EGRESS_ROUTER_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c,ValueFrom:nil,},EnvVar{Name:NETWORK_METRICS_DAEMON_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_SOURCE_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_TARGET_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:NETWORK_OPERATOR_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:CLOUD_NETWORK_CONFIG_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8048f1cb0be521f09749c0a489503cd56d85b68c6ca93380e082cfd693cd97a8,ValueFrom:nil,},EnvVar{Name:CLI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,ValueFrom:nil,},EnvVar{Name:FRR_K8S_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5dbf844e49bb46b78586930149e5e5f5dc121014c8afd10fe36f3651967cc256,ValueFrom:nil,},EnvVar{Name:NETWORKING_CONSOLE_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd,ValueFrom:nil,},EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFi
eldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:host-etc-kube,ReadOnly:true,MountPath:/etc/kubernetes,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:metrics-tls,ReadOnly:false,MountPath:/var/run/secrets/serving-cert,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rdwmf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-operator-58b4c7f79c-55gtf_openshift-network-operator(37a5e44f-9a88-4405-be8a-b645485e7312): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 09 13:22:30 crc kubenswrapper[4764]: > logger="UnhandledError" Mar 09 13:22:30 crc kubenswrapper[4764]: E0309 13:22:30.561475 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"network-operator\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" podUID="37a5e44f-9a88-4405-be8a-b645485e7312" Mar 09 13:22:30 crc kubenswrapper[4764]: I0309 13:22:30.621305 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:30 crc kubenswrapper[4764]: I0309 
13:22:30.621347 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:30 crc kubenswrapper[4764]: I0309 13:22:30.621357 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:30 crc kubenswrapper[4764]: I0309 13:22:30.621372 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:30 crc kubenswrapper[4764]: I0309 13:22:30.621382 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:30Z","lastTransitionTime":"2026-03-09T13:22:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:22:30 crc kubenswrapper[4764]: I0309 13:22:30.723559 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:30 crc kubenswrapper[4764]: I0309 13:22:30.723592 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:30 crc kubenswrapper[4764]: I0309 13:22:30.723603 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:30 crc kubenswrapper[4764]: I0309 13:22:30.723616 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:30 crc kubenswrapper[4764]: I0309 13:22:30.723625 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:30Z","lastTransitionTime":"2026-03-09T13:22:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:22:30 crc kubenswrapper[4764]: I0309 13:22:30.826450 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:30 crc kubenswrapper[4764]: I0309 13:22:30.826494 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:30 crc kubenswrapper[4764]: I0309 13:22:30.826504 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:30 crc kubenswrapper[4764]: I0309 13:22:30.826524 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:30 crc kubenswrapper[4764]: I0309 13:22:30.826534 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:30Z","lastTransitionTime":"2026-03-09T13:22:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:30 crc kubenswrapper[4764]: I0309 13:22:30.928879 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:30 crc kubenswrapper[4764]: I0309 13:22:30.928933 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:30 crc kubenswrapper[4764]: I0309 13:22:30.928944 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:30 crc kubenswrapper[4764]: I0309 13:22:30.928958 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:30 crc kubenswrapper[4764]: I0309 13:22:30.928986 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:30Z","lastTransitionTime":"2026-03-09T13:22:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:31 crc kubenswrapper[4764]: I0309 13:22:31.031756 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:31 crc kubenswrapper[4764]: I0309 13:22:31.031797 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:31 crc kubenswrapper[4764]: I0309 13:22:31.031808 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:31 crc kubenswrapper[4764]: I0309 13:22:31.031823 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:31 crc kubenswrapper[4764]: I0309 13:22:31.031834 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:31Z","lastTransitionTime":"2026-03-09T13:22:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:31 crc kubenswrapper[4764]: I0309 13:22:31.135564 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:31 crc kubenswrapper[4764]: I0309 13:22:31.135681 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:31 crc kubenswrapper[4764]: I0309 13:22:31.135709 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:31 crc kubenswrapper[4764]: I0309 13:22:31.135738 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:31 crc kubenswrapper[4764]: I0309 13:22:31.135763 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:31Z","lastTransitionTime":"2026-03-09T13:22:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:31 crc kubenswrapper[4764]: I0309 13:22:31.238862 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:31 crc kubenswrapper[4764]: I0309 13:22:31.238926 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:31 crc kubenswrapper[4764]: I0309 13:22:31.238936 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:31 crc kubenswrapper[4764]: I0309 13:22:31.238952 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:31 crc kubenswrapper[4764]: I0309 13:22:31.238962 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:31Z","lastTransitionTime":"2026-03-09T13:22:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:31 crc kubenswrapper[4764]: I0309 13:22:31.342633 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:31 crc kubenswrapper[4764]: I0309 13:22:31.342695 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:31 crc kubenswrapper[4764]: I0309 13:22:31.342704 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:31 crc kubenswrapper[4764]: I0309 13:22:31.342720 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:31 crc kubenswrapper[4764]: I0309 13:22:31.342729 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:31Z","lastTransitionTime":"2026-03-09T13:22:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:31 crc kubenswrapper[4764]: I0309 13:22:31.445775 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:31 crc kubenswrapper[4764]: I0309 13:22:31.446331 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:31 crc kubenswrapper[4764]: I0309 13:22:31.446399 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:31 crc kubenswrapper[4764]: I0309 13:22:31.446514 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:31 crc kubenswrapper[4764]: I0309 13:22:31.446579 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:31Z","lastTransitionTime":"2026-03-09T13:22:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:31 crc kubenswrapper[4764]: I0309 13:22:31.550129 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:31 crc kubenswrapper[4764]: I0309 13:22:31.550186 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:31 crc kubenswrapper[4764]: I0309 13:22:31.550199 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:31 crc kubenswrapper[4764]: I0309 13:22:31.550219 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:31 crc kubenswrapper[4764]: I0309 13:22:31.550234 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:31Z","lastTransitionTime":"2026-03-09T13:22:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:31 crc kubenswrapper[4764]: I0309 13:22:31.653725 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:31 crc kubenswrapper[4764]: I0309 13:22:31.653781 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:31 crc kubenswrapper[4764]: I0309 13:22:31.653792 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:31 crc kubenswrapper[4764]: I0309 13:22:31.653816 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:31 crc kubenswrapper[4764]: I0309 13:22:31.653831 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:31Z","lastTransitionTime":"2026-03-09T13:22:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:31 crc kubenswrapper[4764]: I0309 13:22:31.758028 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:31 crc kubenswrapper[4764]: I0309 13:22:31.758094 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:31 crc kubenswrapper[4764]: I0309 13:22:31.758116 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:31 crc kubenswrapper[4764]: I0309 13:22:31.758144 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:31 crc kubenswrapper[4764]: I0309 13:22:31.758167 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:31Z","lastTransitionTime":"2026-03-09T13:22:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:31 crc kubenswrapper[4764]: I0309 13:22:31.861328 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:31 crc kubenswrapper[4764]: I0309 13:22:31.861368 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:31 crc kubenswrapper[4764]: I0309 13:22:31.861376 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:31 crc kubenswrapper[4764]: I0309 13:22:31.861391 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:31 crc kubenswrapper[4764]: I0309 13:22:31.861400 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:31Z","lastTransitionTime":"2026-03-09T13:22:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:31 crc kubenswrapper[4764]: I0309 13:22:31.965879 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:31 crc kubenswrapper[4764]: I0309 13:22:31.965978 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:31 crc kubenswrapper[4764]: I0309 13:22:31.966014 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:31 crc kubenswrapper[4764]: I0309 13:22:31.966042 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:31 crc kubenswrapper[4764]: I0309 13:22:31.966062 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:31Z","lastTransitionTime":"2026-03-09T13:22:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:32 crc kubenswrapper[4764]: I0309 13:22:32.070848 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:32 crc kubenswrapper[4764]: I0309 13:22:32.070925 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:32 crc kubenswrapper[4764]: I0309 13:22:32.070949 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:32 crc kubenswrapper[4764]: I0309 13:22:32.070979 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:32 crc kubenswrapper[4764]: I0309 13:22:32.071002 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:32Z","lastTransitionTime":"2026-03-09T13:22:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:32 crc kubenswrapper[4764]: I0309 13:22:32.174214 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:32 crc kubenswrapper[4764]: I0309 13:22:32.174269 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:32 crc kubenswrapper[4764]: I0309 13:22:32.174283 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:32 crc kubenswrapper[4764]: I0309 13:22:32.174303 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:32 crc kubenswrapper[4764]: I0309 13:22:32.174318 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:32Z","lastTransitionTime":"2026-03-09T13:22:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:32 crc kubenswrapper[4764]: I0309 13:22:32.277451 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:32 crc kubenswrapper[4764]: I0309 13:22:32.277527 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:32 crc kubenswrapper[4764]: I0309 13:22:32.277550 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:32 crc kubenswrapper[4764]: I0309 13:22:32.277581 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:32 crc kubenswrapper[4764]: I0309 13:22:32.277603 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:32Z","lastTransitionTime":"2026-03-09T13:22:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:32 crc kubenswrapper[4764]: I0309 13:22:32.379556 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:32 crc kubenswrapper[4764]: I0309 13:22:32.379590 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:32 crc kubenswrapper[4764]: I0309 13:22:32.379598 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:32 crc kubenswrapper[4764]: I0309 13:22:32.379614 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:32 crc kubenswrapper[4764]: I0309 13:22:32.379625 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:32Z","lastTransitionTime":"2026-03-09T13:22:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:32 crc kubenswrapper[4764]: I0309 13:22:32.482067 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:32 crc kubenswrapper[4764]: I0309 13:22:32.482109 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:32 crc kubenswrapper[4764]: I0309 13:22:32.482118 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:32 crc kubenswrapper[4764]: I0309 13:22:32.482133 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:32 crc kubenswrapper[4764]: I0309 13:22:32.482141 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:32Z","lastTransitionTime":"2026-03-09T13:22:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:22:32 crc kubenswrapper[4764]: I0309 13:22:32.559075 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 13:22:32 crc kubenswrapper[4764]: I0309 13:22:32.559124 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 13:22:32 crc kubenswrapper[4764]: E0309 13:22:32.559219 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 13:22:32 crc kubenswrapper[4764]: I0309 13:22:32.559080 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 13:22:32 crc kubenswrapper[4764]: E0309 13:22:32.559491 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 13:22:32 crc kubenswrapper[4764]: E0309 13:22:32.559881 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 13:22:32 crc kubenswrapper[4764]: I0309 13:22:32.559927 4764 scope.go:117] "RemoveContainer" containerID="033930947dc527026e2797f16a6dd9181c8c7d8ec88b09031a132f55b953c356" Mar 09 13:22:32 crc kubenswrapper[4764]: I0309 13:22:32.584942 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:32 crc kubenswrapper[4764]: I0309 13:22:32.584982 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:32 crc kubenswrapper[4764]: I0309 13:22:32.584995 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:32 crc kubenswrapper[4764]: I0309 13:22:32.585011 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:32 crc kubenswrapper[4764]: I0309 13:22:32.585022 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:32Z","lastTransitionTime":"2026-03-09T13:22:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:32 crc kubenswrapper[4764]: I0309 13:22:32.687177 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:32 crc kubenswrapper[4764]: I0309 13:22:32.687781 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:32 crc kubenswrapper[4764]: I0309 13:22:32.687804 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:32 crc kubenswrapper[4764]: I0309 13:22:32.687860 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:32 crc kubenswrapper[4764]: I0309 13:22:32.687876 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:32Z","lastTransitionTime":"2026-03-09T13:22:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:32 crc kubenswrapper[4764]: I0309 13:22:32.790930 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:32 crc kubenswrapper[4764]: I0309 13:22:32.790962 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:32 crc kubenswrapper[4764]: I0309 13:22:32.790972 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:32 crc kubenswrapper[4764]: I0309 13:22:32.790985 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:32 crc kubenswrapper[4764]: I0309 13:22:32.790997 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:32Z","lastTransitionTime":"2026-03-09T13:22:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:32 crc kubenswrapper[4764]: I0309 13:22:32.892985 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:32 crc kubenswrapper[4764]: I0309 13:22:32.893027 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:32 crc kubenswrapper[4764]: I0309 13:22:32.893038 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:32 crc kubenswrapper[4764]: I0309 13:22:32.893056 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:32 crc kubenswrapper[4764]: I0309 13:22:32.893068 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:32Z","lastTransitionTime":"2026-03-09T13:22:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:32 crc kubenswrapper[4764]: I0309 13:22:32.901257 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:32 crc kubenswrapper[4764]: I0309 13:22:32.901282 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:32 crc kubenswrapper[4764]: I0309 13:22:32.901290 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:32 crc kubenswrapper[4764]: I0309 13:22:32.901300 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:32 crc kubenswrapper[4764]: I0309 13:22:32.901307 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:32Z","lastTransitionTime":"2026-03-09T13:22:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:32 crc kubenswrapper[4764]: E0309 13:22:32.912332 4764 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:22:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:22:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:32Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:22:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:22:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:32Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"44d5d5b8-3cb1-4753-9ed2-c6ede7c42d06\\\",\\\"systemUUID\\\":\\\"470a86bd-e6aa-42c1-b220-b7b8c0289210\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:32 crc kubenswrapper[4764]: I0309 13:22:32.915481 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:32 crc kubenswrapper[4764]: I0309 13:22:32.915511 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:32 crc kubenswrapper[4764]: I0309 13:22:32.915522 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:32 crc kubenswrapper[4764]: I0309 13:22:32.915570 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:32 crc kubenswrapper[4764]: I0309 13:22:32.915587 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:32Z","lastTransitionTime":"2026-03-09T13:22:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:32 crc kubenswrapper[4764]: E0309 13:22:32.923606 4764 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:22:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:22:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:32Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:22:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:22:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:32Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"44d5d5b8-3cb1-4753-9ed2-c6ede7c42d06\\\",\\\"systemUUID\\\":\\\"470a86bd-e6aa-42c1-b220-b7b8c0289210\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:32 crc kubenswrapper[4764]: I0309 13:22:32.925694 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 09 13:22:32 crc kubenswrapper[4764]: I0309 13:22:32.926437 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:32 crc kubenswrapper[4764]: I0309 13:22:32.926467 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:32 crc kubenswrapper[4764]: I0309 13:22:32.926478 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:32 crc kubenswrapper[4764]: I0309 13:22:32.926493 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:32 crc kubenswrapper[4764]: I0309 13:22:32.926503 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:32Z","lastTransitionTime":"2026-03-09T13:22:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:32 crc kubenswrapper[4764]: I0309 13:22:32.927694 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"e04ceb7d287648fc77742a0a1b4cf13a56f634301b703291340961730fc73811"} Mar 09 13:22:32 crc kubenswrapper[4764]: I0309 13:22:32.928034 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 09 13:22:32 crc kubenswrapper[4764]: E0309 13:22:32.935698 4764 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:22:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:22:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:32Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:22:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:22:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:32Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"44d5d5b8-3cb1-4753-9ed2-c6ede7c42d06\\\",\\\"systemUUID\\\":\\\"470a86bd-e6aa-42c1-b220-b7b8c0289210\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:32 crc kubenswrapper[4764]: I0309 13:22:32.939172 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:32 crc kubenswrapper[4764]: I0309 13:22:32.939201 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:32 crc kubenswrapper[4764]: I0309 13:22:32.939211 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:32 crc kubenswrapper[4764]: I0309 13:22:32.939226 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:32 crc kubenswrapper[4764]: I0309 13:22:32.939250 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:32Z","lastTransitionTime":"2026-03-09T13:22:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:32 crc kubenswrapper[4764]: I0309 13:22:32.943538 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:32 crc kubenswrapper[4764]: E0309 13:22:32.949810 4764 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:22:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:22:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:32Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:22:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:22:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:32Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2
ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9810067
4616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.
io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a07
2c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa73
83b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"44d5d5b8-3cb1-4753-9ed2-c6ede7c42d06\\\",\\\"systemUUID\\\":\\\"470a86bd-e6aa-42c1-b220-b7b8c0289210\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:32 crc kubenswrapper[4764]: I0309 13:22:32.951885 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:32 crc kubenswrapper[4764]: I0309 13:22:32.952936 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:32 crc kubenswrapper[4764]: I0309 13:22:32.952971 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:32 crc kubenswrapper[4764]: I0309 13:22:32.952981 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:32 crc kubenswrapper[4764]: I0309 13:22:32.952995 4764 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeNotReady" Mar 09 13:22:32 crc kubenswrapper[4764]: I0309 13:22:32.953007 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:32Z","lastTransitionTime":"2026-03-09T13:22:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:22:32 crc kubenswrapper[4764]: E0309 13:22:32.960370 4764 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:22:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:22:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:32Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:22:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:22:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:32Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"44d5d5b8-3cb1-4753-9ed2-c6ede7c42d06\\\",\\\"systemUUID\\\":\\\"470a86bd-e6aa-42c1-b220-b7b8c0289210\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:32 crc kubenswrapper[4764]: E0309 13:22:32.960475 4764 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 09 13:22:32 crc kubenswrapper[4764]: I0309 13:22:32.965697 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7c21ab2-1820-47de-a61d-71d81928564a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://482469a933d0aba722e6a341f6934669e0d84998bd2c99d6962a17548ab80ce7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3571aa0b654db21a79a11d6c58af9f858d2493283d9eff7a84f3404c34dcf0b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://4aae266a5256be6c65a7637dfaed552087689f317f7b61dcb52f4aa6d4d4d7e6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e04ceb7d287648fc77742a0a1b4cf13a56f634301b703291340961730fc73811\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://033930947dc527026e2797f16a6dd9181c8c7d8ec88b09031a132f55b953c356\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T13:21:44Z\\\",\\\"message\\\":\\\"W0309 13:21:43.803612 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0309 13:21:43.803959 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773062503 cert, and key in /tmp/serving-cert-1644690243/serving-signer.crt, /tmp/serving-cert-1644690243/serving-signer.key\\\\nI0309 13:21:44.168505 1 observer_polling.go:159] Starting file observer\\\\nW0309 13:21:44.177360 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0309 13:21:44.177910 1 builder.go:304] check-endpoints 
version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0309 13:21:44.178828 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1644690243/tls.crt::/tmp/serving-cert-1644690243/tls.key\\\\\\\"\\\\nF0309 13:21:44.724786 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:43Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3a34e47ede9b05cec931ca6337a98381bcceeb735352702d19128e8dfe8476f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:48Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a38de1b3854f33b5728435571b8f0df92aeaae7d8e8a235ed58044fa2a9e09c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a38de1b3854f33b5728435571b8f0df92aeaae7d8e8a235ed58044fa2a9e09c3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:20:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:32 crc kubenswrapper[4764]: I0309 13:22:32.973689 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:32 crc kubenswrapper[4764]: I0309 13:22:32.980516 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:32 crc kubenswrapper[4764]: I0309 13:22:32.989839 4764 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:32 crc kubenswrapper[4764]: I0309 13:22:32.995779 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:32 crc kubenswrapper[4764]: I0309 13:22:32.995821 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:32 crc kubenswrapper[4764]: I0309 13:22:32.995831 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:32 crc kubenswrapper[4764]: I0309 13:22:32.995846 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:32 crc kubenswrapper[4764]: I0309 13:22:32.995856 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:32Z","lastTransitionTime":"2026-03-09T13:22:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:22:32 crc kubenswrapper[4764]: I0309 13:22:32.999800 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:33 crc kubenswrapper[4764]: I0309 13:22:33.006874 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-r5bnx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7fede188-66d9-4cb1-af19-c94afe7fbcde\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:27Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:27Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pjp5c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:27Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-r5bnx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:33 crc kubenswrapper[4764]: I0309 13:22:33.016586 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zmzm7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"202a1f58-ce83-4374-ac48-dc806f7b9d6b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath
\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rk5pb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zmzm7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:33 crc kubenswrapper[4764]: I0309 13:22:33.027394 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-crvdf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"072442e6-8ece-4f72-a8cb-ad7ef1e3facb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\
\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\"
:\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-crvdf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:33 crc kubenswrapper[4764]: I0309 13:22:33.040587 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7kggv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b8ccb4f5-550a-41b2-b39d-201cdd5d902a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:28Z\\\"}}\" for 
pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7kggv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:33 crc kubenswrapper[4764]: I0309 13:22:33.048271 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6bcdd179-43c2-427c-9fac-7155c122e922\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25jms\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25jms\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:28Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-xxczl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Mar 09 13:22:33 crc kubenswrapper[4764]: I0309 13:22:33.098164 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 13:22:33 crc kubenswrapper[4764]: I0309 13:22:33.098204 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 13:22:33 crc kubenswrapper[4764]: I0309 13:22:33.098216 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 13:22:33 crc kubenswrapper[4764]: I0309 13:22:33.098231 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 09 13:22:33 crc kubenswrapper[4764]: I0309 13:22:33.098242 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:33Z","lastTransitionTime":"2026-03-09T13:22:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 09 13:22:33 crc kubenswrapper[4764]: I0309 13:22:33.200113 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 13:22:33 crc kubenswrapper[4764]: I0309 13:22:33.200160 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 13:22:33 crc kubenswrapper[4764]: I0309 13:22:33.200173 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 13:22:33 crc kubenswrapper[4764]: I0309 13:22:33.200189 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 09 13:22:33 crc kubenswrapper[4764]: I0309 13:22:33.200202 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:33Z","lastTransitionTime":"2026-03-09T13:22:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 09 13:22:33 crc kubenswrapper[4764]: I0309 13:22:33.302077 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 13:22:33 crc kubenswrapper[4764]: I0309 13:22:33.302116 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 13:22:33 crc kubenswrapper[4764]: I0309 13:22:33.302125 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 13:22:33 crc kubenswrapper[4764]: I0309 13:22:33.302138 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 09 13:22:33 crc kubenswrapper[4764]: I0309 13:22:33.302151 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:33Z","lastTransitionTime":"2026-03-09T13:22:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 09 13:22:33 crc kubenswrapper[4764]: I0309 13:22:33.404898 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 13:22:33 crc kubenswrapper[4764]: I0309 13:22:33.404942 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 13:22:33 crc kubenswrapper[4764]: I0309 13:22:33.404956 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 13:22:33 crc kubenswrapper[4764]: I0309 13:22:33.404977 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 09 13:22:33 crc kubenswrapper[4764]: I0309 13:22:33.404993 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:33Z","lastTransitionTime":"2026-03-09T13:22:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 09 13:22:33 crc kubenswrapper[4764]: I0309 13:22:33.507033 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 13:22:33 crc kubenswrapper[4764]: I0309 13:22:33.507075 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 13:22:33 crc kubenswrapper[4764]: I0309 13:22:33.507088 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 13:22:33 crc kubenswrapper[4764]: I0309 13:22:33.507103 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 09 13:22:33 crc kubenswrapper[4764]: I0309 13:22:33.507112 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:33Z","lastTransitionTime":"2026-03-09T13:22:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 09 13:22:33 crc kubenswrapper[4764]: I0309 13:22:33.609716 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 13:22:33 crc kubenswrapper[4764]: I0309 13:22:33.609760 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 13:22:33 crc kubenswrapper[4764]: I0309 13:22:33.609773 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 13:22:33 crc kubenswrapper[4764]: I0309 13:22:33.609788 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 09 13:22:33 crc kubenswrapper[4764]: I0309 13:22:33.609800 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:33Z","lastTransitionTime":"2026-03-09T13:22:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 09 13:22:33 crc kubenswrapper[4764]: I0309 13:22:33.712235 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 13:22:33 crc kubenswrapper[4764]: I0309 13:22:33.712270 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 13:22:33 crc kubenswrapper[4764]: I0309 13:22:33.712279 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 13:22:33 crc kubenswrapper[4764]: I0309 13:22:33.712292 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 09 13:22:33 crc kubenswrapper[4764]: I0309 13:22:33.712305 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:33Z","lastTransitionTime":"2026-03-09T13:22:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 09 13:22:33 crc kubenswrapper[4764]: I0309 13:22:33.821038 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 13:22:33 crc kubenswrapper[4764]: I0309 13:22:33.821092 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 13:22:33 crc kubenswrapper[4764]: I0309 13:22:33.821103 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 13:22:33 crc kubenswrapper[4764]: I0309 13:22:33.821124 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 09 13:22:33 crc kubenswrapper[4764]: I0309 13:22:33.821137 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:33Z","lastTransitionTime":"2026-03-09T13:22:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 09 13:22:33 crc kubenswrapper[4764]: I0309 13:22:33.926599 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 13:22:33 crc kubenswrapper[4764]: I0309 13:22:33.926687 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 13:22:33 crc kubenswrapper[4764]: I0309 13:22:33.926724 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 13:22:33 crc kubenswrapper[4764]: I0309 13:22:33.926759 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 09 13:22:33 crc kubenswrapper[4764]: I0309 13:22:33.926774 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:33Z","lastTransitionTime":"2026-03-09T13:22:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 09 13:22:34 crc kubenswrapper[4764]: I0309 13:22:34.028695 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 13:22:34 crc kubenswrapper[4764]: I0309 13:22:34.028751 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 13:22:34 crc kubenswrapper[4764]: I0309 13:22:34.028760 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 13:22:34 crc kubenswrapper[4764]: I0309 13:22:34.028773 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 09 13:22:34 crc kubenswrapper[4764]: I0309 13:22:34.028782 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:34Z","lastTransitionTime":"2026-03-09T13:22:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 09 13:22:34 crc kubenswrapper[4764]: I0309 13:22:34.081204 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-hbmjc"]
Mar 09 13:22:34 crc kubenswrapper[4764]: I0309 13:22:34.081609 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-hbmjc"
Mar 09 13:22:34 crc kubenswrapper[4764]: I0309 13:22:34.083873 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates"
Mar 09 13:22:34 crc kubenswrapper[4764]: I0309 13:22:34.084055 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt"
Mar 09 13:22:34 crc kubenswrapper[4764]: I0309 13:22:34.084159 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt"
Mar 09 13:22:34 crc kubenswrapper[4764]: I0309 13:22:34.084472 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p"
Mar 09 13:22:34 crc kubenswrapper[4764]: I0309 13:22:34.096876 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zmzm7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"202a1f58-ce83-4374-ac48-dc806f7b9d6b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath
\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rk5pb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zmzm7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:34 crc kubenswrapper[4764]: I0309 13:22:34.108627 4764 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Mar 09 13:22:34 crc kubenswrapper[4764]: I0309 13:22:34.110175 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-crvdf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"072442e6-8ece-4f72-a8cb-ad7ef1e3facb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers 
with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\
\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\"
:\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-crvdf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:34 crc kubenswrapper[4764]: I0309 13:22:34.126551 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7kggv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b8ccb4f5-550a-41b2-b39d-201cdd5d902a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:28Z\\\"}}\" for 
pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7kggv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Mar 09 13:22:34 crc kubenswrapper[4764]: I0309 13:22:34.131046 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 13:22:34 crc kubenswrapper[4764]: I0309 13:22:34.131072 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 13:22:34 crc kubenswrapper[4764]: I0309 13:22:34.131080 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 13:22:34 crc kubenswrapper[4764]: I0309 13:22:34.131093 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 09 13:22:34 crc kubenswrapper[4764]: I0309 13:22:34.131102 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:34Z","lastTransitionTime":"2026-03-09T13:22:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:34 crc kubenswrapper[4764]: I0309 13:22:34.135014 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-hbmjc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7c2b6620-c8b8-47cf-9f15-883dbf8e34cc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:34Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z4sdq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:34Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-hbmjc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:34 crc kubenswrapper[4764]: I0309 13:22:34.146919 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:34 crc kubenswrapper[4764]: I0309 13:22:34.157192 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-r5bnx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7fede188-66d9-4cb1-af19-c94afe7fbcde\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:27Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:27Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pjp5c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:27Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-r5bnx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:34 crc kubenswrapper[4764]: I0309 13:22:34.170547 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6bcdd179-43c2-427c-9fac-7155c122e922\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25jms\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25jms\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:28Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xxczl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:34 crc kubenswrapper[4764]: I0309 13:22:34.190267 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:34 crc kubenswrapper[4764]: I0309 13:22:34.200332 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/7c2b6620-c8b8-47cf-9f15-883dbf8e34cc-host\") pod \"node-ca-hbmjc\" (UID: \"7c2b6620-c8b8-47cf-9f15-883dbf8e34cc\") " pod="openshift-image-registry/node-ca-hbmjc" Mar 09 13:22:34 crc kubenswrapper[4764]: I0309 13:22:34.200609 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z4sdq\" (UniqueName: \"kubernetes.io/projected/7c2b6620-c8b8-47cf-9f15-883dbf8e34cc-kube-api-access-z4sdq\") pod 
\"node-ca-hbmjc\" (UID: \"7c2b6620-c8b8-47cf-9f15-883dbf8e34cc\") " pod="openshift-image-registry/node-ca-hbmjc" Mar 09 13:22:34 crc kubenswrapper[4764]: I0309 13:22:34.200733 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/7c2b6620-c8b8-47cf-9f15-883dbf8e34cc-serviceca\") pod \"node-ca-hbmjc\" (UID: \"7c2b6620-c8b8-47cf-9f15-883dbf8e34cc\") " pod="openshift-image-registry/node-ca-hbmjc" Mar 09 13:22:34 crc kubenswrapper[4764]: I0309 13:22:34.204729 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:34 crc kubenswrapper[4764]: I0309 13:22:34.226318 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Mar 09 13:22:34 crc kubenswrapper[4764]: I0309 13:22:34.233570 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 13:22:34 crc kubenswrapper[4764]: I0309 13:22:34.233706 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 13:22:34 crc kubenswrapper[4764]: I0309 13:22:34.233792 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 13:22:34 crc kubenswrapper[4764]: I0309 13:22:34.233869 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 09 13:22:34 crc kubenswrapper[4764]: I0309 13:22:34.233932 4764 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:34Z","lastTransitionTime":"2026-03-09T13:22:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:22:34 crc kubenswrapper[4764]: I0309 13:22:34.235848 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:34 crc kubenswrapper[4764]: I0309 13:22:34.246182 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:34 crc kubenswrapper[4764]: I0309 13:22:34.256131 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7c21ab2-1820-47de-a61d-71d81928564a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://482469a933d0aba722e6a341f6934669e0d84998bd2c99d6962a17548ab80ce7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3571aa0b654db21a79a11d6c58af9f858d2493283d9eff7a84f3404c34dcf0b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4aae266a5256be6c65a7637dfaed552087689f317f7b61dcb52f4aa6d4d4d7e6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e04ceb7d287648fc77742a0a1b4cf13a56f634301b703291340961730fc73811\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://033930947dc527026e2797f16a6dd9181c8c7d8ec88b09031a132f55b953c356\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T13:21:44Z\\\",\\\"message\\\":\\\"W0309 13:21:43.803612 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0309 13:21:43.803959 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773062503 cert, and key in /tmp/serving-cert-1644690243/serving-signer.crt, /tmp/serving-cert-1644690243/serving-signer.key\\\\nI0309 13:21:44.168505 1 observer_polling.go:159] Starting file observer\\\\nW0309 13:21:44.177360 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0309 13:21:44.177910 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0309 13:21:44.178828 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1644690243/tls.crt::/tmp/serving-cert-1644690243/tls.key\\\\\\\"\\\\nF0309 13:21:44.724786 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:43Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3a34e47ede9b05cec931ca6337a98381bcceeb735352702d19128e8dfe8476f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:48Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a38de1b3854f33b5728435571b8f0df92aeaae7d8e8a235ed58044fa2a9e09c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a38de1b3854f33b5728435571b8f0df92aeaae7d8e8a235ed58044fa2a9e09c3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:46Z\\\",\\\"reason\\\":\\\"Completed\\\"
,\\\"startedAt\\\":\\\"2026-03-09T13:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:20:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:34 crc kubenswrapper[4764]: I0309 13:22:34.301754 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z4sdq\" (UniqueName: \"kubernetes.io/projected/7c2b6620-c8b8-47cf-9f15-883dbf8e34cc-kube-api-access-z4sdq\") pod \"node-ca-hbmjc\" (UID: \"7c2b6620-c8b8-47cf-9f15-883dbf8e34cc\") " pod="openshift-image-registry/node-ca-hbmjc" Mar 09 13:22:34 crc kubenswrapper[4764]: I0309 13:22:34.301807 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/7c2b6620-c8b8-47cf-9f15-883dbf8e34cc-serviceca\") pod \"node-ca-hbmjc\" (UID: \"7c2b6620-c8b8-47cf-9f15-883dbf8e34cc\") " pod="openshift-image-registry/node-ca-hbmjc" Mar 09 13:22:34 crc kubenswrapper[4764]: I0309 13:22:34.301852 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/7c2b6620-c8b8-47cf-9f15-883dbf8e34cc-host\") pod \"node-ca-hbmjc\" (UID: \"7c2b6620-c8b8-47cf-9f15-883dbf8e34cc\") " pod="openshift-image-registry/node-ca-hbmjc" Mar 09 13:22:34 crc kubenswrapper[4764]: I0309 13:22:34.301921 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/7c2b6620-c8b8-47cf-9f15-883dbf8e34cc-host\") pod \"node-ca-hbmjc\" (UID: 
\"7c2b6620-c8b8-47cf-9f15-883dbf8e34cc\") " pod="openshift-image-registry/node-ca-hbmjc" Mar 09 13:22:34 crc kubenswrapper[4764]: I0309 13:22:34.303078 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/7c2b6620-c8b8-47cf-9f15-883dbf8e34cc-serviceca\") pod \"node-ca-hbmjc\" (UID: \"7c2b6620-c8b8-47cf-9f15-883dbf8e34cc\") " pod="openshift-image-registry/node-ca-hbmjc" Mar 09 13:22:34 crc kubenswrapper[4764]: I0309 13:22:34.319951 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z4sdq\" (UniqueName: \"kubernetes.io/projected/7c2b6620-c8b8-47cf-9f15-883dbf8e34cc-kube-api-access-z4sdq\") pod \"node-ca-hbmjc\" (UID: \"7c2b6620-c8b8-47cf-9f15-883dbf8e34cc\") " pod="openshift-image-registry/node-ca-hbmjc" Mar 09 13:22:34 crc kubenswrapper[4764]: I0309 13:22:34.336730 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:34 crc kubenswrapper[4764]: I0309 13:22:34.336773 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:34 crc kubenswrapper[4764]: I0309 13:22:34.336782 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:34 crc kubenswrapper[4764]: I0309 13:22:34.336799 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:34 crc kubenswrapper[4764]: I0309 13:22:34.336808 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:34Z","lastTransitionTime":"2026-03-09T13:22:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:34 crc kubenswrapper[4764]: I0309 13:22:34.393806 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-hbmjc" Mar 09 13:22:34 crc kubenswrapper[4764]: I0309 13:22:34.439985 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:34 crc kubenswrapper[4764]: I0309 13:22:34.440031 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:34 crc kubenswrapper[4764]: I0309 13:22:34.440043 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:34 crc kubenswrapper[4764]: I0309 13:22:34.440059 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:34 crc kubenswrapper[4764]: I0309 13:22:34.440069 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:34Z","lastTransitionTime":"2026-03-09T13:22:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:34 crc kubenswrapper[4764]: I0309 13:22:34.542758 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:34 crc kubenswrapper[4764]: I0309 13:22:34.542787 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:34 crc kubenswrapper[4764]: I0309 13:22:34.542796 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:34 crc kubenswrapper[4764]: I0309 13:22:34.542809 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:34 crc kubenswrapper[4764]: I0309 13:22:34.542817 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:34Z","lastTransitionTime":"2026-03-09T13:22:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:22:34 crc kubenswrapper[4764]: I0309 13:22:34.559691 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 13:22:34 crc kubenswrapper[4764]: I0309 13:22:34.559739 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 13:22:34 crc kubenswrapper[4764]: I0309 13:22:34.559737 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 13:22:34 crc kubenswrapper[4764]: E0309 13:22:34.559806 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 13:22:34 crc kubenswrapper[4764]: E0309 13:22:34.560074 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 13:22:34 crc kubenswrapper[4764]: E0309 13:22:34.560177 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 13:22:34 crc kubenswrapper[4764]: I0309 13:22:34.645486 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:34 crc kubenswrapper[4764]: I0309 13:22:34.645520 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:34 crc kubenswrapper[4764]: I0309 13:22:34.645527 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:34 crc kubenswrapper[4764]: I0309 13:22:34.645539 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:34 crc kubenswrapper[4764]: I0309 13:22:34.645548 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:34Z","lastTransitionTime":"2026-03-09T13:22:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:34 crc kubenswrapper[4764]: I0309 13:22:34.748149 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:34 crc kubenswrapper[4764]: I0309 13:22:34.748205 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:34 crc kubenswrapper[4764]: I0309 13:22:34.748217 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:34 crc kubenswrapper[4764]: I0309 13:22:34.748235 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:34 crc kubenswrapper[4764]: I0309 13:22:34.748245 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:34Z","lastTransitionTime":"2026-03-09T13:22:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:34 crc kubenswrapper[4764]: I0309 13:22:34.850909 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:34 crc kubenswrapper[4764]: I0309 13:22:34.850971 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:34 crc kubenswrapper[4764]: I0309 13:22:34.850984 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:34 crc kubenswrapper[4764]: I0309 13:22:34.851002 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:34 crc kubenswrapper[4764]: I0309 13:22:34.851015 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:34Z","lastTransitionTime":"2026-03-09T13:22:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:34 crc kubenswrapper[4764]: I0309 13:22:34.934440 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-hbmjc" event={"ID":"7c2b6620-c8b8-47cf-9f15-883dbf8e34cc","Type":"ContainerStarted","Data":"6d8a2c962bc242865fd8c0fce8fc21f9e385e38cc9b2892663a82c02dfe6cac9"} Mar 09 13:22:34 crc kubenswrapper[4764]: I0309 13:22:34.934496 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-hbmjc" event={"ID":"7c2b6620-c8b8-47cf-9f15-883dbf8e34cc","Type":"ContainerStarted","Data":"583d7339a6ff7aca2cee97b44d5ae02c2ed0d21a333b208ac485f770c90332e8"} Mar 09 13:22:34 crc kubenswrapper[4764]: I0309 13:22:34.948009 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7c21ab2-1820-47de-a61d-71d81928564a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://482469a933d0aba722e6a341f6934669e0d84998bd2c99d6962a17548ab80ce7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3571aa0b654db21a79a11d6c58af9f858d2493283d9eff7a84f3404c34dcf0b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://4aae266a5256be6c65a7637dfaed552087689f317f7b61dcb52f4aa6d4d4d7e6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e04ceb7d287648fc77742a0a1b4cf13a56f634301b703291340961730fc73811\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://033930947dc527026e2797f16a6dd9181c8c7d8ec88b09031a132f55b953c356\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T13:21:44Z\\\",\\\"message\\\":\\\"W0309 13:21:43.803612 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0309 13:21:43.803959 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773062503 cert, and key in /tmp/serving-cert-1644690243/serving-signer.crt, /tmp/serving-cert-1644690243/serving-signer.key\\\\nI0309 13:21:44.168505 1 observer_polling.go:159] Starting file observer\\\\nW0309 13:21:44.177360 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0309 13:21:44.177910 1 builder.go:304] check-endpoints 
version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0309 13:21:44.178828 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1644690243/tls.crt::/tmp/serving-cert-1644690243/tls.key\\\\\\\"\\\\nF0309 13:21:44.724786 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:43Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3a34e47ede9b05cec931ca6337a98381bcceeb735352702d19128e8dfe8476f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:48Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a38de1b3854f33b5728435571b8f0df92aeaae7d8e8a235ed58044fa2a9e09c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a38de1b3854f33b5728435571b8f0df92aeaae7d8e8a235ed58044fa2a9e09c3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:20:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:34 crc kubenswrapper[4764]: I0309 13:22:34.956298 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:34 crc kubenswrapper[4764]: I0309 13:22:34.956368 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:34 crc kubenswrapper[4764]: I0309 13:22:34.956383 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:34 crc kubenswrapper[4764]: I0309 13:22:34.956403 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:34 crc kubenswrapper[4764]: I0309 13:22:34.956422 4764 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:34Z","lastTransitionTime":"2026-03-09T13:22:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:22:34 crc kubenswrapper[4764]: I0309 13:22:34.967306 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:34 crc kubenswrapper[4764]: I0309 13:22:34.981288 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:34 crc kubenswrapper[4764]: I0309 13:22:34.994449 4764 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:35 crc kubenswrapper[4764]: I0309 13:22:35.008952 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:35 crc kubenswrapper[4764]: I0309 13:22:35.016968 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-r5bnx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7fede188-66d9-4cb1-af19-c94afe7fbcde\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:27Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:27Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pjp5c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:27Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-r5bnx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:35 crc kubenswrapper[4764]: I0309 13:22:35.028237 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zmzm7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"202a1f58-ce83-4374-ac48-dc806f7b9d6b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath
\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rk5pb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zmzm7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:35 crc kubenswrapper[4764]: I0309 13:22:35.041581 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-crvdf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"072442e6-8ece-4f72-a8cb-ad7ef1e3facb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\
\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\"
:\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-crvdf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:35 crc kubenswrapper[4764]: I0309 13:22:35.056600 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7kggv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b8ccb4f5-550a-41b2-b39d-201cdd5d902a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:28Z\\\"}}\" for 
pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7kggv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:35 crc kubenswrapper[4764]: I0309 13:22:35.058999 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:35 crc kubenswrapper[4764]: I0309 13:22:35.059049 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:35 crc kubenswrapper[4764]: I0309 13:22:35.059063 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:35 crc kubenswrapper[4764]: I0309 13:22:35.059082 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:35 crc kubenswrapper[4764]: I0309 13:22:35.059098 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:35Z","lastTransitionTime":"2026-03-09T13:22:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:35 crc kubenswrapper[4764]: I0309 13:22:35.066383 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-hbmjc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7c2b6620-c8b8-47cf-9f15-883dbf8e34cc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d8a2c962bc242865fd8c0fce8fc21f9e385e38cc9b2892663a82c02dfe6cac9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z4sdq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:34Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-hbmjc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:35 crc kubenswrapper[4764]: I0309 13:22:35.076357 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6bcdd179-43c2-427c-9fac-7155c122e922\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25jms\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25jms\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:28Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-xxczl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:35 crc kubenswrapper[4764]: I0309 13:22:35.087341 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:35 crc kubenswrapper[4764]: I0309 13:22:35.097173 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:35 crc kubenswrapper[4764]: I0309 13:22:35.161558 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:35 crc kubenswrapper[4764]: I0309 13:22:35.161601 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:35 crc kubenswrapper[4764]: I0309 13:22:35.161610 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:35 crc kubenswrapper[4764]: I0309 13:22:35.161624 4764 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeNotReady" Mar 09 13:22:35 crc kubenswrapper[4764]: I0309 13:22:35.161634 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:35Z","lastTransitionTime":"2026-03-09T13:22:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:22:35 crc kubenswrapper[4764]: I0309 13:22:35.265018 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:35 crc kubenswrapper[4764]: I0309 13:22:35.265114 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:35 crc kubenswrapper[4764]: I0309 13:22:35.265129 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:35 crc kubenswrapper[4764]: I0309 13:22:35.265151 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:35 crc kubenswrapper[4764]: I0309 13:22:35.265167 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:35Z","lastTransitionTime":"2026-03-09T13:22:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:35 crc kubenswrapper[4764]: I0309 13:22:35.367231 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:35 crc kubenswrapper[4764]: I0309 13:22:35.367308 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:35 crc kubenswrapper[4764]: I0309 13:22:35.367321 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:35 crc kubenswrapper[4764]: I0309 13:22:35.367338 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:35 crc kubenswrapper[4764]: I0309 13:22:35.367375 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:35Z","lastTransitionTime":"2026-03-09T13:22:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:35 crc kubenswrapper[4764]: I0309 13:22:35.470112 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:35 crc kubenswrapper[4764]: I0309 13:22:35.470143 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:35 crc kubenswrapper[4764]: I0309 13:22:35.470152 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:35 crc kubenswrapper[4764]: I0309 13:22:35.470164 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:35 crc kubenswrapper[4764]: I0309 13:22:35.470173 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:35Z","lastTransitionTime":"2026-03-09T13:22:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:35 crc kubenswrapper[4764]: I0309 13:22:35.572103 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:35 crc kubenswrapper[4764]: I0309 13:22:35.572158 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:35 crc kubenswrapper[4764]: I0309 13:22:35.572173 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:35 crc kubenswrapper[4764]: I0309 13:22:35.572190 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:35 crc kubenswrapper[4764]: I0309 13:22:35.572205 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:35Z","lastTransitionTime":"2026-03-09T13:22:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:35 crc kubenswrapper[4764]: I0309 13:22:35.576309 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:35 crc kubenswrapper[4764]: I0309 13:22:35.589630 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:35 crc kubenswrapper[4764]: I0309 13:22:35.602808 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7c21ab2-1820-47de-a61d-71d81928564a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://482469a933d0aba722e6a341f6934669e0d84998bd2c99d6962a17548ab80ce7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3571aa0b654db21a79a11d6c58af9f858d2493283d9eff7a84f3404c34dcf0b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4aae266a5256be6c65a7637dfaed552087689f317f7b61dcb52f4aa6d4d4d7e6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e04ceb7d287648fc77742a0a1b4cf13a56f634301b703291340961730fc73811\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://033930947dc527026e2797f16a6dd9181c8c7d8ec88b09031a132f55b953c356\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T13:21:44Z\\\",\\\"message\\\":\\\"W0309 13:21:43.803612 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0309 13:21:43.803959 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773062503 cert, and key in /tmp/serving-cert-1644690243/serving-signer.crt, /tmp/serving-cert-1644690243/serving-signer.key\\\\nI0309 13:21:44.168505 1 observer_polling.go:159] Starting file observer\\\\nW0309 13:21:44.177360 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0309 13:21:44.177910 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0309 13:21:44.178828 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1644690243/tls.crt::/tmp/serving-cert-1644690243/tls.key\\\\\\\"\\\\nF0309 13:21:44.724786 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:43Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3a34e47ede9b05cec931ca6337a98381bcceeb735352702d19128e8dfe8476f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:48Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a38de1b3854f33b5728435571b8f0df92aeaae7d8e8a235ed58044fa2a9e09c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a38de1b3854f33b5728435571b8f0df92aeaae7d8e8a235ed58044fa2a9e09c3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:46Z\\\",\\\"reason\\\":\\\"Completed\\\"
,\\\"startedAt\\\":\\\"2026-03-09T13:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:20:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:35 crc kubenswrapper[4764]: I0309 13:22:35.615326 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when 
the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:35 crc kubenswrapper[4764]: I0309 13:22:35.629338 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:35 crc kubenswrapper[4764]: I0309 13:22:35.642860 4764 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:35 crc kubenswrapper[4764]: I0309 13:22:35.655638 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:35 crc kubenswrapper[4764]: I0309 13:22:35.664187 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-r5bnx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7fede188-66d9-4cb1-af19-c94afe7fbcde\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:27Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:27Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pjp5c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:27Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-r5bnx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:35 crc kubenswrapper[4764]: I0309 13:22:35.674148 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:35 crc kubenswrapper[4764]: I0309 13:22:35.674181 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:35 crc kubenswrapper[4764]: I0309 13:22:35.674191 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:35 crc kubenswrapper[4764]: I0309 13:22:35.674207 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:35 crc kubenswrapper[4764]: I0309 13:22:35.674217 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:35Z","lastTransitionTime":"2026-03-09T13:22:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:35 crc kubenswrapper[4764]: I0309 13:22:35.675663 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zmzm7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"202a1f58-ce83-4374-ac48-dc806f7b9d6b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rk5pb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zmzm7\": Internal error occurred: 
failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:35 crc kubenswrapper[4764]: I0309 13:22:35.689861 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-crvdf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"072442e6-8ece-4f72-a8cb-ad7ef1e3facb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\
\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\"
:\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-crvdf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:35 crc kubenswrapper[4764]: I0309 13:22:35.707517 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7kggv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b8ccb4f5-550a-41b2-b39d-201cdd5d902a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:28Z\\\"}}\" for 
pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7kggv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:35 crc kubenswrapper[4764]: I0309 13:22:35.717169 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-hbmjc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7c2b6620-c8b8-47cf-9f15-883dbf8e34cc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d8a2c962bc242865fd8c0fce8fc21f9e385e38cc9b2892663a82c02dfe6cac9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-
09T13:22:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z4sdq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:34Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-hbmjc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:35 crc kubenswrapper[4764]: I0309 13:22:35.726997 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6bcdd179-43c2-427c-9fac-7155c122e922\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25jms\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25jms\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:28Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xxczl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:35 crc kubenswrapper[4764]: I0309 13:22:35.777821 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:35 crc kubenswrapper[4764]: I0309 13:22:35.777888 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:35 crc kubenswrapper[4764]: I0309 13:22:35.777904 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:35 crc kubenswrapper[4764]: I0309 13:22:35.777928 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:35 crc kubenswrapper[4764]: I0309 13:22:35.777943 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:35Z","lastTransitionTime":"2026-03-09T13:22:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:22:35 crc kubenswrapper[4764]: I0309 13:22:35.880862 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:35 crc kubenswrapper[4764]: I0309 13:22:35.881146 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:35 crc kubenswrapper[4764]: I0309 13:22:35.881234 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:35 crc kubenswrapper[4764]: I0309 13:22:35.881344 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:35 crc kubenswrapper[4764]: I0309 13:22:35.881436 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:35Z","lastTransitionTime":"2026-03-09T13:22:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:35 crc kubenswrapper[4764]: I0309 13:22:35.984058 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:35 crc kubenswrapper[4764]: I0309 13:22:35.984127 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:35 crc kubenswrapper[4764]: I0309 13:22:35.984141 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:35 crc kubenswrapper[4764]: I0309 13:22:35.984160 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:35 crc kubenswrapper[4764]: I0309 13:22:35.984176 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:35Z","lastTransitionTime":"2026-03-09T13:22:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:36 crc kubenswrapper[4764]: I0309 13:22:36.087199 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:36 crc kubenswrapper[4764]: I0309 13:22:36.087232 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:36 crc kubenswrapper[4764]: I0309 13:22:36.087243 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:36 crc kubenswrapper[4764]: I0309 13:22:36.087257 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:36 crc kubenswrapper[4764]: I0309 13:22:36.087267 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:36Z","lastTransitionTime":"2026-03-09T13:22:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:36 crc kubenswrapper[4764]: I0309 13:22:36.189574 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:36 crc kubenswrapper[4764]: I0309 13:22:36.189618 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:36 crc kubenswrapper[4764]: I0309 13:22:36.189627 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:36 crc kubenswrapper[4764]: I0309 13:22:36.189655 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:36 crc kubenswrapper[4764]: I0309 13:22:36.189664 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:36Z","lastTransitionTime":"2026-03-09T13:22:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:36 crc kubenswrapper[4764]: I0309 13:22:36.292236 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:36 crc kubenswrapper[4764]: I0309 13:22:36.292270 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:36 crc kubenswrapper[4764]: I0309 13:22:36.292280 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:36 crc kubenswrapper[4764]: I0309 13:22:36.292292 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:36 crc kubenswrapper[4764]: I0309 13:22:36.292300 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:36Z","lastTransitionTime":"2026-03-09T13:22:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:36 crc kubenswrapper[4764]: I0309 13:22:36.394608 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:36 crc kubenswrapper[4764]: I0309 13:22:36.394675 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:36 crc kubenswrapper[4764]: I0309 13:22:36.394687 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:36 crc kubenswrapper[4764]: I0309 13:22:36.394702 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:36 crc kubenswrapper[4764]: I0309 13:22:36.394713 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:36Z","lastTransitionTime":"2026-03-09T13:22:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:36 crc kubenswrapper[4764]: I0309 13:22:36.497513 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:36 crc kubenswrapper[4764]: I0309 13:22:36.497554 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:36 crc kubenswrapper[4764]: I0309 13:22:36.497563 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:36 crc kubenswrapper[4764]: I0309 13:22:36.497577 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:36 crc kubenswrapper[4764]: I0309 13:22:36.497587 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:36Z","lastTransitionTime":"2026-03-09T13:22:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:22:36 crc kubenswrapper[4764]: I0309 13:22:36.559066 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 13:22:36 crc kubenswrapper[4764]: I0309 13:22:36.559110 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 13:22:36 crc kubenswrapper[4764]: I0309 13:22:36.559181 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 13:22:36 crc kubenswrapper[4764]: E0309 13:22:36.559300 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 13:22:36 crc kubenswrapper[4764]: E0309 13:22:36.559408 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 13:22:36 crc kubenswrapper[4764]: E0309 13:22:36.559484 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 13:22:36 crc kubenswrapper[4764]: I0309 13:22:36.599409 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:36 crc kubenswrapper[4764]: I0309 13:22:36.599453 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:36 crc kubenswrapper[4764]: I0309 13:22:36.599465 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:36 crc kubenswrapper[4764]: I0309 13:22:36.599485 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:36 crc kubenswrapper[4764]: I0309 13:22:36.599496 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:36Z","lastTransitionTime":"2026-03-09T13:22:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:36 crc kubenswrapper[4764]: I0309 13:22:36.701933 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:36 crc kubenswrapper[4764]: I0309 13:22:36.701969 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:36 crc kubenswrapper[4764]: I0309 13:22:36.701979 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:36 crc kubenswrapper[4764]: I0309 13:22:36.701993 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:36 crc kubenswrapper[4764]: I0309 13:22:36.702003 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:36Z","lastTransitionTime":"2026-03-09T13:22:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:36 crc kubenswrapper[4764]: I0309 13:22:36.804969 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:36 crc kubenswrapper[4764]: I0309 13:22:36.805011 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:36 crc kubenswrapper[4764]: I0309 13:22:36.805021 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:36 crc kubenswrapper[4764]: I0309 13:22:36.805036 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:36 crc kubenswrapper[4764]: I0309 13:22:36.805048 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:36Z","lastTransitionTime":"2026-03-09T13:22:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:36 crc kubenswrapper[4764]: I0309 13:22:36.907595 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:36 crc kubenswrapper[4764]: I0309 13:22:36.907631 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:36 crc kubenswrapper[4764]: I0309 13:22:36.907670 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:36 crc kubenswrapper[4764]: I0309 13:22:36.907688 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:36 crc kubenswrapper[4764]: I0309 13:22:36.907700 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:36Z","lastTransitionTime":"2026-03-09T13:22:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:37 crc kubenswrapper[4764]: I0309 13:22:37.010501 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:37 crc kubenswrapper[4764]: I0309 13:22:37.010542 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:37 crc kubenswrapper[4764]: I0309 13:22:37.010562 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:37 crc kubenswrapper[4764]: I0309 13:22:37.010581 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:37 crc kubenswrapper[4764]: I0309 13:22:37.010595 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:37Z","lastTransitionTime":"2026-03-09T13:22:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:37 crc kubenswrapper[4764]: I0309 13:22:37.112349 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:37 crc kubenswrapper[4764]: I0309 13:22:37.112400 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:37 crc kubenswrapper[4764]: I0309 13:22:37.112415 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:37 crc kubenswrapper[4764]: I0309 13:22:37.112436 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:37 crc kubenswrapper[4764]: I0309 13:22:37.112452 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:37Z","lastTransitionTime":"2026-03-09T13:22:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:37 crc kubenswrapper[4764]: I0309 13:22:37.215507 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:37 crc kubenswrapper[4764]: I0309 13:22:37.215544 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:37 crc kubenswrapper[4764]: I0309 13:22:37.215554 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:37 crc kubenswrapper[4764]: I0309 13:22:37.215570 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:37 crc kubenswrapper[4764]: I0309 13:22:37.215590 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:37Z","lastTransitionTime":"2026-03-09T13:22:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:37 crc kubenswrapper[4764]: I0309 13:22:37.318194 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:37 crc kubenswrapper[4764]: I0309 13:22:37.318233 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:37 crc kubenswrapper[4764]: I0309 13:22:37.318243 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:37 crc kubenswrapper[4764]: I0309 13:22:37.318260 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:37 crc kubenswrapper[4764]: I0309 13:22:37.318273 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:37Z","lastTransitionTime":"2026-03-09T13:22:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:37 crc kubenswrapper[4764]: I0309 13:22:37.420758 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:37 crc kubenswrapper[4764]: I0309 13:22:37.420790 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:37 crc kubenswrapper[4764]: I0309 13:22:37.420798 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:37 crc kubenswrapper[4764]: I0309 13:22:37.420812 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:37 crc kubenswrapper[4764]: I0309 13:22:37.420821 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:37Z","lastTransitionTime":"2026-03-09T13:22:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:37 crc kubenswrapper[4764]: I0309 13:22:37.523784 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:37 crc kubenswrapper[4764]: I0309 13:22:37.523818 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:37 crc kubenswrapper[4764]: I0309 13:22:37.523826 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:37 crc kubenswrapper[4764]: I0309 13:22:37.523839 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:37 crc kubenswrapper[4764]: I0309 13:22:37.523848 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:37Z","lastTransitionTime":"2026-03-09T13:22:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:37 crc kubenswrapper[4764]: I0309 13:22:37.626767 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:37 crc kubenswrapper[4764]: I0309 13:22:37.626829 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:37 crc kubenswrapper[4764]: I0309 13:22:37.626845 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:37 crc kubenswrapper[4764]: I0309 13:22:37.626871 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:37 crc kubenswrapper[4764]: I0309 13:22:37.626887 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:37Z","lastTransitionTime":"2026-03-09T13:22:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:37 crc kubenswrapper[4764]: I0309 13:22:37.729279 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:37 crc kubenswrapper[4764]: I0309 13:22:37.729331 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:37 crc kubenswrapper[4764]: I0309 13:22:37.729343 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:37 crc kubenswrapper[4764]: I0309 13:22:37.729359 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:37 crc kubenswrapper[4764]: I0309 13:22:37.729370 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:37Z","lastTransitionTime":"2026-03-09T13:22:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:37 crc kubenswrapper[4764]: I0309 13:22:37.832325 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:37 crc kubenswrapper[4764]: I0309 13:22:37.832373 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:37 crc kubenswrapper[4764]: I0309 13:22:37.832387 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:37 crc kubenswrapper[4764]: I0309 13:22:37.832406 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:37 crc kubenswrapper[4764]: I0309 13:22:37.832418 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:37Z","lastTransitionTime":"2026-03-09T13:22:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:37 crc kubenswrapper[4764]: I0309 13:22:37.935251 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:37 crc kubenswrapper[4764]: I0309 13:22:37.935308 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:37 crc kubenswrapper[4764]: I0309 13:22:37.935318 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:37 crc kubenswrapper[4764]: I0309 13:22:37.935335 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:37 crc kubenswrapper[4764]: I0309 13:22:37.935346 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:37Z","lastTransitionTime":"2026-03-09T13:22:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:38 crc kubenswrapper[4764]: I0309 13:22:38.037722 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:38 crc kubenswrapper[4764]: I0309 13:22:38.037822 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:38 crc kubenswrapper[4764]: I0309 13:22:38.037834 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:38 crc kubenswrapper[4764]: I0309 13:22:38.037851 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:38 crc kubenswrapper[4764]: I0309 13:22:38.037863 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:38Z","lastTransitionTime":"2026-03-09T13:22:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:38 crc kubenswrapper[4764]: I0309 13:22:38.140754 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:38 crc kubenswrapper[4764]: I0309 13:22:38.140793 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:38 crc kubenswrapper[4764]: I0309 13:22:38.140803 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:38 crc kubenswrapper[4764]: I0309 13:22:38.140818 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:38 crc kubenswrapper[4764]: I0309 13:22:38.140830 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:38Z","lastTransitionTime":"2026-03-09T13:22:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:38 crc kubenswrapper[4764]: I0309 13:22:38.243881 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:38 crc kubenswrapper[4764]: I0309 13:22:38.243933 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:38 crc kubenswrapper[4764]: I0309 13:22:38.243944 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:38 crc kubenswrapper[4764]: I0309 13:22:38.243964 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:38 crc kubenswrapper[4764]: I0309 13:22:38.243975 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:38Z","lastTransitionTime":"2026-03-09T13:22:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:38 crc kubenswrapper[4764]: I0309 13:22:38.346615 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:38 crc kubenswrapper[4764]: I0309 13:22:38.346682 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:38 crc kubenswrapper[4764]: I0309 13:22:38.346694 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:38 crc kubenswrapper[4764]: I0309 13:22:38.346709 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:38 crc kubenswrapper[4764]: I0309 13:22:38.346720 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:38Z","lastTransitionTime":"2026-03-09T13:22:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:38 crc kubenswrapper[4764]: I0309 13:22:38.449385 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:38 crc kubenswrapper[4764]: I0309 13:22:38.449752 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:38 crc kubenswrapper[4764]: I0309 13:22:38.449864 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:38 crc kubenswrapper[4764]: I0309 13:22:38.449966 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:38 crc kubenswrapper[4764]: I0309 13:22:38.450064 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:38Z","lastTransitionTime":"2026-03-09T13:22:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:38 crc kubenswrapper[4764]: I0309 13:22:38.552956 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:38 crc kubenswrapper[4764]: I0309 13:22:38.553013 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:38 crc kubenswrapper[4764]: I0309 13:22:38.553024 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:38 crc kubenswrapper[4764]: I0309 13:22:38.553043 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:38 crc kubenswrapper[4764]: I0309 13:22:38.553086 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:38Z","lastTransitionTime":"2026-03-09T13:22:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:22:38 crc kubenswrapper[4764]: I0309 13:22:38.559197 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 13:22:38 crc kubenswrapper[4764]: E0309 13:22:38.559346 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 13:22:38 crc kubenswrapper[4764]: I0309 13:22:38.559453 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 13:22:38 crc kubenswrapper[4764]: E0309 13:22:38.559660 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 13:22:38 crc kubenswrapper[4764]: I0309 13:22:38.560059 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 13:22:38 crc kubenswrapper[4764]: E0309 13:22:38.560185 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 13:22:38 crc kubenswrapper[4764]: I0309 13:22:38.655873 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:38 crc kubenswrapper[4764]: I0309 13:22:38.655920 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:38 crc kubenswrapper[4764]: I0309 13:22:38.655929 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:38 crc kubenswrapper[4764]: I0309 13:22:38.655947 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:38 crc kubenswrapper[4764]: I0309 13:22:38.655956 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:38Z","lastTransitionTime":"2026-03-09T13:22:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:38 crc kubenswrapper[4764]: I0309 13:22:38.757965 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:38 crc kubenswrapper[4764]: I0309 13:22:38.758013 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:38 crc kubenswrapper[4764]: I0309 13:22:38.758024 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:38 crc kubenswrapper[4764]: I0309 13:22:38.758041 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:38 crc kubenswrapper[4764]: I0309 13:22:38.758052 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:38Z","lastTransitionTime":"2026-03-09T13:22:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:38 crc kubenswrapper[4764]: I0309 13:22:38.880936 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:38 crc kubenswrapper[4764]: I0309 13:22:38.880973 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:38 crc kubenswrapper[4764]: I0309 13:22:38.880980 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:38 crc kubenswrapper[4764]: I0309 13:22:38.880993 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:38 crc kubenswrapper[4764]: I0309 13:22:38.881001 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:38Z","lastTransitionTime":"2026-03-09T13:22:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:38 crc kubenswrapper[4764]: I0309 13:22:38.983893 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:38 crc kubenswrapper[4764]: I0309 13:22:38.983963 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:38 crc kubenswrapper[4764]: I0309 13:22:38.983977 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:38 crc kubenswrapper[4764]: I0309 13:22:38.984002 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:38 crc kubenswrapper[4764]: I0309 13:22:38.984017 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:38Z","lastTransitionTime":"2026-03-09T13:22:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:39 crc kubenswrapper[4764]: I0309 13:22:39.087038 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:39 crc kubenswrapper[4764]: I0309 13:22:39.087085 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:39 crc kubenswrapper[4764]: I0309 13:22:39.087102 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:39 crc kubenswrapper[4764]: I0309 13:22:39.087121 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:39 crc kubenswrapper[4764]: I0309 13:22:39.087136 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:39Z","lastTransitionTime":"2026-03-09T13:22:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:39 crc kubenswrapper[4764]: I0309 13:22:39.190246 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:39 crc kubenswrapper[4764]: I0309 13:22:39.190292 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:39 crc kubenswrapper[4764]: I0309 13:22:39.190300 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:39 crc kubenswrapper[4764]: I0309 13:22:39.190317 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:39 crc kubenswrapper[4764]: I0309 13:22:39.190327 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:39Z","lastTransitionTime":"2026-03-09T13:22:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:39 crc kubenswrapper[4764]: I0309 13:22:39.293432 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:39 crc kubenswrapper[4764]: I0309 13:22:39.293512 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:39 crc kubenswrapper[4764]: I0309 13:22:39.293525 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:39 crc kubenswrapper[4764]: I0309 13:22:39.293548 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:39 crc kubenswrapper[4764]: I0309 13:22:39.293562 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:39Z","lastTransitionTime":"2026-03-09T13:22:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:39 crc kubenswrapper[4764]: I0309 13:22:39.397023 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:39 crc kubenswrapper[4764]: I0309 13:22:39.397074 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:39 crc kubenswrapper[4764]: I0309 13:22:39.397089 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:39 crc kubenswrapper[4764]: I0309 13:22:39.397110 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:39 crc kubenswrapper[4764]: I0309 13:22:39.397129 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:39Z","lastTransitionTime":"2026-03-09T13:22:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:39 crc kubenswrapper[4764]: I0309 13:22:39.500158 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:39 crc kubenswrapper[4764]: I0309 13:22:39.500217 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:39 crc kubenswrapper[4764]: I0309 13:22:39.500230 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:39 crc kubenswrapper[4764]: I0309 13:22:39.500250 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:39 crc kubenswrapper[4764]: I0309 13:22:39.500265 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:39Z","lastTransitionTime":"2026-03-09T13:22:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:39 crc kubenswrapper[4764]: I0309 13:22:39.603399 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:39 crc kubenswrapper[4764]: I0309 13:22:39.603436 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:39 crc kubenswrapper[4764]: I0309 13:22:39.603444 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:39 crc kubenswrapper[4764]: I0309 13:22:39.603461 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:39 crc kubenswrapper[4764]: I0309 13:22:39.603471 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:39Z","lastTransitionTime":"2026-03-09T13:22:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:39 crc kubenswrapper[4764]: I0309 13:22:39.705953 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:39 crc kubenswrapper[4764]: I0309 13:22:39.705985 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:39 crc kubenswrapper[4764]: I0309 13:22:39.705993 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:39 crc kubenswrapper[4764]: I0309 13:22:39.706006 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:39 crc kubenswrapper[4764]: I0309 13:22:39.706014 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:39Z","lastTransitionTime":"2026-03-09T13:22:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:22:39 crc kubenswrapper[4764]: I0309 13:22:39.795960 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hzjxs"] Mar 09 13:22:39 crc kubenswrapper[4764]: I0309 13:22:39.796795 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hzjxs" Mar 09 13:22:39 crc kubenswrapper[4764]: I0309 13:22:39.799010 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Mar 09 13:22:39 crc kubenswrapper[4764]: I0309 13:22:39.799186 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Mar 09 13:22:39 crc kubenswrapper[4764]: I0309 13:22:39.809107 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:39 crc kubenswrapper[4764]: I0309 13:22:39.809173 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:39 crc kubenswrapper[4764]: I0309 13:22:39.809187 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:39 crc kubenswrapper[4764]: I0309 13:22:39.809206 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:39 crc kubenswrapper[4764]: I0309 13:22:39.809219 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:39Z","lastTransitionTime":"2026-03-09T13:22:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:39 crc kubenswrapper[4764]: I0309 13:22:39.809808 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:39 crc kubenswrapper[4764]: I0309 13:22:39.821021 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:39 crc kubenswrapper[4764]: I0309 13:22:39.829174 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:39 crc kubenswrapper[4764]: I0309 13:22:39.837342 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:39 crc kubenswrapper[4764]: I0309 13:22:39.845826 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7c21ab2-1820-47de-a61d-71d81928564a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://482469a933d0aba722e6a341f6934669e0d84998bd2c99d6962a17548ab80ce7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3571aa0b654db21a79a11d6c58af9f858d2493283d9eff7a84f3404c34dcf0b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4aae266a5256be6c65a7637dfaed552087689f317f7b61dcb52f4aa6d4d4d7e6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e04ceb7d287648fc77742a0a1b4cf13a56f634301b703291340961730fc73811\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://033930947dc527026e2797f16a6dd9181c8c7d8ec88b09031a132f55b953c356\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T13:21:44Z\\\",\\\"message\\\":\\\"W0309 13:21:43.803612 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0309 13:21:43.803959 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773062503 cert, and key in /tmp/serving-cert-1644690243/serving-signer.crt, /tmp/serving-cert-1644690243/serving-signer.key\\\\nI0309 13:21:44.168505 1 observer_polling.go:159] Starting file observer\\\\nW0309 13:21:44.177360 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0309 13:21:44.177910 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0309 13:21:44.178828 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1644690243/tls.crt::/tmp/serving-cert-1644690243/tls.key\\\\\\\"\\\\nF0309 13:21:44.724786 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:43Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3a34e47ede9b05cec931ca6337a98381bcceeb735352702d19128e8dfe8476f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:48Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a38de1b3854f33b5728435571b8f0df92aeaae7d8e8a235ed58044fa2a9e09c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a38de1b3854f33b5728435571b8f0df92aeaae7d8e8a235ed58044fa2a9e09c3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:46Z\\\",\\\"reason\\\":\\\"Completed\\\"
,\\\"startedAt\\\":\\\"2026-03-09T13:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:20:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:39 crc kubenswrapper[4764]: I0309 13:22:39.855267 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when 
the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:39 crc kubenswrapper[4764]: I0309 13:22:39.867116 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-crvdf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"072442e6-8ece-4f72-a8cb-ad7ef1e3facb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with incomplete status: 
[egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\
\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\"
:\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-crvdf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:39 crc kubenswrapper[4764]: I0309 13:22:39.883065 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7kggv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b8ccb4f5-550a-41b2-b39d-201cdd5d902a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:28Z\\\"}}\" for 
pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7kggv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:39 crc kubenswrapper[4764]: I0309 13:22:39.891327 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-hbmjc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7c2b6620-c8b8-47cf-9f15-883dbf8e34cc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d8a2c962bc242865fd8c0fce8fc21f9e385e38cc9b2892663a82c02dfe6cac9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-
09T13:22:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z4sdq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:34Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-hbmjc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:39 crc kubenswrapper[4764]: I0309 13:22:39.900220 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hzjxs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"30a92c50-fe51-40d9-a69c-4b5fd722bfc6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdm9q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdm9q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168
.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-hzjxs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:39 crc kubenswrapper[4764]: I0309 13:22:39.909912 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:39 crc kubenswrapper[4764]: I0309 13:22:39.912618 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:39 crc kubenswrapper[4764]: I0309 13:22:39.912676 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:39 crc kubenswrapper[4764]: I0309 13:22:39.912688 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:39 crc kubenswrapper[4764]: I0309 13:22:39.912704 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:39 crc kubenswrapper[4764]: I0309 13:22:39.912716 4764 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:39Z","lastTransitionTime":"2026-03-09T13:22:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:22:39 crc kubenswrapper[4764]: I0309 13:22:39.920300 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-r5bnx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7fede188-66d9-4cb1-af19-c94afe7fbcde\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:27Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pjp5c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:27Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-r5bnx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:39 crc kubenswrapper[4764]: I0309 13:22:39.930143 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zmzm7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"202a1f58-ce83-4374-ac48-dc806f7b9d6b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rk5pb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zmzm7\": Internal error occurred: 
failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:39 crc kubenswrapper[4764]: I0309 13:22:39.938017 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6bcdd179-43c2-427c-9fac-7155c122e922\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25jms\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25jms\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:28Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-xxczl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:39 crc kubenswrapper[4764]: I0309 13:22:39.947168 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"4dd3c85ee070f78d4d30b9ccad3da35f82313620fffdaf69c3424865843dfa39"} Mar 09 13:22:39 crc kubenswrapper[4764]: I0309 13:22:39.947216 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"d5280ec2d6cc95dbda0281276e900fa2c9d43170f932966688440a55c1deb454"} Mar 09 13:22:39 crc kubenswrapper[4764]: I0309 13:22:39.957444 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:39 crc kubenswrapper[4764]: I0309 13:22:39.966812 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/30a92c50-fe51-40d9-a69c-4b5fd722bfc6-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-hzjxs\" (UID: \"30a92c50-fe51-40d9-a69c-4b5fd722bfc6\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hzjxs" Mar 09 13:22:39 crc kubenswrapper[4764]: I0309 13:22:39.966971 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-mdm9q\" (UniqueName: \"kubernetes.io/projected/30a92c50-fe51-40d9-a69c-4b5fd722bfc6-kube-api-access-mdm9q\") pod \"ovnkube-control-plane-749d76644c-hzjxs\" (UID: \"30a92c50-fe51-40d9-a69c-4b5fd722bfc6\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hzjxs" Mar 09 13:22:39 crc kubenswrapper[4764]: I0309 13:22:39.967014 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/30a92c50-fe51-40d9-a69c-4b5fd722bfc6-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-hzjxs\" (UID: \"30a92c50-fe51-40d9-a69c-4b5fd722bfc6\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hzjxs" Mar 09 13:22:39 crc kubenswrapper[4764]: I0309 13:22:39.967040 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/30a92c50-fe51-40d9-a69c-4b5fd722bfc6-env-overrides\") pod \"ovnkube-control-plane-749d76644c-hzjxs\" (UID: \"30a92c50-fe51-40d9-a69c-4b5fd722bfc6\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hzjxs" Mar 09 13:22:39 crc kubenswrapper[4764]: I0309 13:22:39.968460 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:39 crc kubenswrapper[4764]: I0309 13:22:39.979812 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:39 crc kubenswrapper[4764]: I0309 13:22:39.988103 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:39 crc kubenswrapper[4764]: I0309 13:22:39.999067 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7c21ab2-1820-47de-a61d-71d81928564a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://482469a933d0aba722e6a341f6934669e0d84998bd2c99d6962a17548ab80ce7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3571aa0b654db21a79a11d6c58af9f858d2493283d9eff7a84f3404c34dcf0b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4aae266a5256be6c65a7637dfaed552087689f317f7b61dcb52f4aa6d4d4d7e6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e04ceb7d287648fc77742a0a1b4cf13a56f634301b703291340961730fc73811\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://033930947dc527026e2797f16a6dd9181c8c7d8ec88b09031a132f55b953c356\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T13:21:44Z\\\",\\\"message\\\":\\\"W0309 13:21:43.803612 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0309 13:21:43.803959 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773062503 cert, and key in /tmp/serving-cert-1644690243/serving-signer.crt, /tmp/serving-cert-1644690243/serving-signer.key\\\\nI0309 13:21:44.168505 1 observer_polling.go:159] Starting file observer\\\\nW0309 13:21:44.177360 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0309 13:21:44.177910 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0309 13:21:44.178828 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1644690243/tls.crt::/tmp/serving-cert-1644690243/tls.key\\\\\\\"\\\\nF0309 13:21:44.724786 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:43Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3a34e47ede9b05cec931ca6337a98381bcceeb735352702d19128e8dfe8476f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:48Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a38de1b3854f33b5728435571b8f0df92aeaae7d8e8a235ed58044fa2a9e09c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a38de1b3854f33b5728435571b8f0df92aeaae7d8e8a235ed58044fa2a9e09c3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:46Z\\\",\\\"reason\\\":\\\"Completed\\\"
,\\\"startedAt\\\":\\\"2026-03-09T13:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:20:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:40 crc kubenswrapper[4764]: I0309 13:22:40.008423 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4dd3c85ee070f78d4d30b9ccad3da35f82313620fffdaf69c3424865843dfa39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started
\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5280ec2d6cc95dbda0281276e900fa2c9d43170f932966688440a55c1deb454\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:40 crc kubenswrapper[4764]: I0309 13:22:40.014554 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:40 crc 
kubenswrapper[4764]: I0309 13:22:40.014594 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:40 crc kubenswrapper[4764]: I0309 13:22:40.014608 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:40 crc kubenswrapper[4764]: I0309 13:22:40.014628 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:40 crc kubenswrapper[4764]: I0309 13:22:40.014656 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:40Z","lastTransitionTime":"2026-03-09T13:22:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:22:40 crc kubenswrapper[4764]: I0309 13:22:40.019309 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-crvdf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"072442e6-8ece-4f72-a8cb-ad7ef1e3facb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\
\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\"
:\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-crvdf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:40 crc kubenswrapper[4764]: I0309 13:22:40.035743 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7kggv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b8ccb4f5-550a-41b2-b39d-201cdd5d902a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:28Z\\\"}}\" for 
pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7kggv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:40 crc kubenswrapper[4764]: I0309 13:22:40.044226 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-hbmjc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7c2b6620-c8b8-47cf-9f15-883dbf8e34cc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d8a2c962bc242865fd8c0fce8fc21f9e385e38cc9b2892663a82c02dfe6cac9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-
09T13:22:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z4sdq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:34Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-hbmjc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:40 crc kubenswrapper[4764]: I0309 13:22:40.052558 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hzjxs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"30a92c50-fe51-40d9-a69c-4b5fd722bfc6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdm9q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdm9q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168
.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-hzjxs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:40 crc kubenswrapper[4764]: I0309 13:22:40.064900 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:40 crc kubenswrapper[4764]: I0309 13:22:40.067746 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mdm9q\" (UniqueName: \"kubernetes.io/projected/30a92c50-fe51-40d9-a69c-4b5fd722bfc6-kube-api-access-mdm9q\") pod \"ovnkube-control-plane-749d76644c-hzjxs\" (UID: \"30a92c50-fe51-40d9-a69c-4b5fd722bfc6\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hzjxs" Mar 09 13:22:40 crc kubenswrapper[4764]: I0309 13:22:40.067797 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/30a92c50-fe51-40d9-a69c-4b5fd722bfc6-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-hzjxs\" (UID: \"30a92c50-fe51-40d9-a69c-4b5fd722bfc6\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hzjxs" Mar 
09 13:22:40 crc kubenswrapper[4764]: I0309 13:22:40.067816 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/30a92c50-fe51-40d9-a69c-4b5fd722bfc6-env-overrides\") pod \"ovnkube-control-plane-749d76644c-hzjxs\" (UID: \"30a92c50-fe51-40d9-a69c-4b5fd722bfc6\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hzjxs" Mar 09 13:22:40 crc kubenswrapper[4764]: I0309 13:22:40.067869 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/30a92c50-fe51-40d9-a69c-4b5fd722bfc6-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-hzjxs\" (UID: \"30a92c50-fe51-40d9-a69c-4b5fd722bfc6\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hzjxs" Mar 09 13:22:40 crc kubenswrapper[4764]: I0309 13:22:40.068545 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/30a92c50-fe51-40d9-a69c-4b5fd722bfc6-env-overrides\") pod \"ovnkube-control-plane-749d76644c-hzjxs\" (UID: \"30a92c50-fe51-40d9-a69c-4b5fd722bfc6\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hzjxs" Mar 09 13:22:40 crc kubenswrapper[4764]: I0309 13:22:40.068592 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/30a92c50-fe51-40d9-a69c-4b5fd722bfc6-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-hzjxs\" (UID: \"30a92c50-fe51-40d9-a69c-4b5fd722bfc6\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hzjxs" Mar 09 13:22:40 crc kubenswrapper[4764]: I0309 13:22:40.072231 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/30a92c50-fe51-40d9-a69c-4b5fd722bfc6-ovn-control-plane-metrics-cert\") pod 
\"ovnkube-control-plane-749d76644c-hzjxs\" (UID: \"30a92c50-fe51-40d9-a69c-4b5fd722bfc6\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hzjxs" Mar 09 13:22:40 crc kubenswrapper[4764]: I0309 13:22:40.073978 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-r5bnx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7fede188-66d9-4cb1-af19-c94afe7fbcde\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:27Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:27Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pjp5c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:27Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-r5bnx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:40 crc kubenswrapper[4764]: I0309 13:22:40.085334 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mdm9q\" (UniqueName: \"kubernetes.io/projected/30a92c50-fe51-40d9-a69c-4b5fd722bfc6-kube-api-access-mdm9q\") pod \"ovnkube-control-plane-749d76644c-hzjxs\" (UID: \"30a92c50-fe51-40d9-a69c-4b5fd722bfc6\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hzjxs" Mar 09 13:22:40 crc kubenswrapper[4764]: I0309 13:22:40.085558 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zmzm7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"202a1f58-ce83-4374-ac48-dc806f7b9d6b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rk5pb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zmzm7\": Internal error occurred: 
failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:40 crc kubenswrapper[4764]: I0309 13:22:40.094481 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6bcdd179-43c2-427c-9fac-7155c122e922\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25jms\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25jms\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:28Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-xxczl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:22:40 crc kubenswrapper[4764]: I0309 13:22:40.109880 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hzjxs" Mar 09 13:22:40 crc kubenswrapper[4764]: I0309 13:22:40.117312 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:40 crc kubenswrapper[4764]: I0309 13:22:40.117346 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:40 crc kubenswrapper[4764]: I0309 13:22:40.117355 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:40 crc kubenswrapper[4764]: I0309 13:22:40.117369 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:40 crc kubenswrapper[4764]: I0309 13:22:40.117378 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:40Z","lastTransitionTime":"2026-03-09T13:22:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:40 crc kubenswrapper[4764]: W0309 13:22:40.121348 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod30a92c50_fe51_40d9_a69c_4b5fd722bfc6.slice/crio-b60d1a57310c4eecf9739792746a14c95b7b8a0de18df8ad29785d547b60b123 WatchSource:0}: Error finding container b60d1a57310c4eecf9739792746a14c95b7b8a0de18df8ad29785d547b60b123: Status 404 returned error can't find the container with id b60d1a57310c4eecf9739792746a14c95b7b8a0de18df8ad29785d547b60b123 Mar 09 13:22:40 crc kubenswrapper[4764]: I0309 13:22:40.219801 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:40 crc kubenswrapper[4764]: I0309 13:22:40.219859 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:40 crc kubenswrapper[4764]: I0309 13:22:40.219871 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:40 crc kubenswrapper[4764]: I0309 13:22:40.219888 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:40 crc kubenswrapper[4764]: I0309 13:22:40.219900 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:40Z","lastTransitionTime":"2026-03-09T13:22:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:40 crc kubenswrapper[4764]: I0309 13:22:40.324249 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:40 crc kubenswrapper[4764]: I0309 13:22:40.324283 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:40 crc kubenswrapper[4764]: I0309 13:22:40.324296 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:40 crc kubenswrapper[4764]: I0309 13:22:40.324313 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:40 crc kubenswrapper[4764]: I0309 13:22:40.324459 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:40Z","lastTransitionTime":"2026-03-09T13:22:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:40 crc kubenswrapper[4764]: I0309 13:22:40.426715 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:40 crc kubenswrapper[4764]: I0309 13:22:40.426802 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:40 crc kubenswrapper[4764]: I0309 13:22:40.426851 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:40 crc kubenswrapper[4764]: I0309 13:22:40.426876 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:40 crc kubenswrapper[4764]: I0309 13:22:40.426892 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:40Z","lastTransitionTime":"2026-03-09T13:22:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:22:40 crc kubenswrapper[4764]: I0309 13:22:40.508866 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-wkwdz"] Mar 09 13:22:40 crc kubenswrapper[4764]: I0309 13:22:40.509422 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wkwdz" Mar 09 13:22:40 crc kubenswrapper[4764]: E0309 13:22:40.509518 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-wkwdz" podUID="6597fc34-10ee-4984-9c69-f4b7c0d46e2a" Mar 09 13:22:40 crc kubenswrapper[4764]: I0309 13:22:40.527843 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:40Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:40 crc kubenswrapper[4764]: I0309 13:22:40.529023 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:40 crc kubenswrapper[4764]: I0309 13:22:40.529065 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:40 crc kubenswrapper[4764]: I0309 13:22:40.529076 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:40 crc kubenswrapper[4764]: I0309 13:22:40.529104 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:40 crc kubenswrapper[4764]: I0309 13:22:40.529115 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:40Z","lastTransitionTime":"2026-03-09T13:22:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:22:40 crc kubenswrapper[4764]: I0309 13:22:40.543634 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:40Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:40 crc kubenswrapper[4764]: I0309 13:22:40.557804 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7c21ab2-1820-47de-a61d-71d81928564a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://482469a933d0aba722e6a341f6934669e0d84998bd2c99d6962a17548ab80ce7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3571aa0b654db21a79a11d6c58af9f858d2493283d9eff7a84f3404c34dcf0b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://4aae266a5256be6c65a7637dfaed552087689f317f7b61dcb52f4aa6d4d4d7e6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e04ceb7d287648fc77742a0a1b4cf13a56f634301b703291340961730fc73811\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://033930947dc527026e2797f16a6dd9181c8c7d8ec88b09031a132f55b953c356\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T13:21:44Z\\\",\\\"message\\\":\\\"W0309 13:21:43.803612 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0309 13:21:43.803959 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773062503 cert, and key in /tmp/serving-cert-1644690243/serving-signer.crt, /tmp/serving-cert-1644690243/serving-signer.key\\\\nI0309 13:21:44.168505 1 observer_polling.go:159] Starting file observer\\\\nW0309 13:21:44.177360 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0309 13:21:44.177910 1 builder.go:304] check-endpoints 
version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0309 13:21:44.178828 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1644690243/tls.crt::/tmp/serving-cert-1644690243/tls.key\\\\\\\"\\\\nF0309 13:21:44.724786 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:43Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3a34e47ede9b05cec931ca6337a98381bcceeb735352702d19128e8dfe8476f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:48Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a38de1b3854f33b5728435571b8f0df92aeaae7d8e8a235ed58044fa2a9e09c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a38de1b3854f33b5728435571b8f0df92aeaae7d8e8a235ed58044fa2a9e09c3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:20:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:40Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:40 crc kubenswrapper[4764]: I0309 13:22:40.559096 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 13:22:40 crc kubenswrapper[4764]: I0309 13:22:40.559144 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 13:22:40 crc kubenswrapper[4764]: I0309 13:22:40.559140 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 13:22:40 crc kubenswrapper[4764]: E0309 13:22:40.559194 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 13:22:40 crc kubenswrapper[4764]: E0309 13:22:40.559288 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 13:22:40 crc kubenswrapper[4764]: E0309 13:22:40.559465 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 13:22:40 crc kubenswrapper[4764]: I0309 13:22:40.569753 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4dd3c85ee070f78d4d30b9ccad3da35f82313620fffdaf69c3424865843dfa39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\
\\"cri-o://d5280ec2d6cc95dbda0281276e900fa2c9d43170f932966688440a55c1deb454\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:40Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:40 crc kubenswrapper[4764]: I0309 13:22:40.582950 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:40Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:40 crc kubenswrapper[4764]: I0309 13:22:40.594195 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:40Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:40 crc kubenswrapper[4764]: I0309 13:22:40.604780 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hzjxs" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"30a92c50-fe51-40d9-a69c-4b5fd722bfc6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdm9q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdm9q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-hzjxs\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:40Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:40 crc kubenswrapper[4764]: I0309 13:22:40.614697 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-wkwdz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6597fc34-10ee-4984-9c69-f4b7c0d46e2a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gsbrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gsbrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:40Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-wkwdz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:40Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:40 crc 
kubenswrapper[4764]: I0309 13:22:40.625351 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:40Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:40 crc kubenswrapper[4764]: I0309 13:22:40.631174 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:40 crc kubenswrapper[4764]: I0309 13:22:40.631206 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:40 crc kubenswrapper[4764]: I0309 13:22:40.631215 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:40 crc kubenswrapper[4764]: I0309 13:22:40.631230 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:40 crc kubenswrapper[4764]: I0309 13:22:40.631243 4764 setters.go:603] 
"Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:40Z","lastTransitionTime":"2026-03-09T13:22:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:22:40 crc kubenswrapper[4764]: I0309 13:22:40.633696 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-r5bnx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7fede188-66d9-4cb1-af19-c94afe7fbcde\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:27Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pjp5c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:27Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-r5bnx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:40Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:40 crc kubenswrapper[4764]: I0309 13:22:40.644182 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zmzm7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"202a1f58-ce83-4374-ac48-dc806f7b9d6b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rk5pb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zmzm7\": Internal error occurred: 
failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:40Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:40 crc kubenswrapper[4764]: I0309 13:22:40.657416 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-crvdf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"072442e6-8ece-4f72-a8cb-ad7ef1e3facb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\
\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\"
:\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-crvdf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:40Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:40 crc kubenswrapper[4764]: I0309 13:22:40.675591 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6597fc34-10ee-4984-9c69-f4b7c0d46e2a-metrics-certs\") pod \"network-metrics-daemon-wkwdz\" (UID: \"6597fc34-10ee-4984-9c69-f4b7c0d46e2a\") " pod="openshift-multus/network-metrics-daemon-wkwdz" Mar 09 13:22:40 crc kubenswrapper[4764]: I0309 13:22:40.675713 4764 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gsbrz\" (UniqueName: \"kubernetes.io/projected/6597fc34-10ee-4984-9c69-f4b7c0d46e2a-kube-api-access-gsbrz\") pod \"network-metrics-daemon-wkwdz\" (UID: \"6597fc34-10ee-4984-9c69-f4b7c0d46e2a\") " pod="openshift-multus/network-metrics-daemon-wkwdz" Mar 09 13:22:40 crc kubenswrapper[4764]: I0309 13:22:40.678684 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7kggv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b8ccb4f5-550a-41b2-b39d-201cdd5d902a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:28Z\\\"}}\" for 
pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7kggv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:40Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:40 crc kubenswrapper[4764]: I0309 13:22:40.689834 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-hbmjc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7c2b6620-c8b8-47cf-9f15-883dbf8e34cc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d8a2c962bc242865fd8c0fce8fc21f9e385e38cc9b2892663a82c02dfe6cac9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"resta
rtCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z4sdq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:34Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-hbmjc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:40Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:40 crc kubenswrapper[4764]: I0309 13:22:40.703345 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6bcdd179-43c2-427c-9fac-7155c122e922\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25jms\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25jms\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:28Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xxczl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:40Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:40 crc kubenswrapper[4764]: I0309 13:22:40.733887 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 
13:22:40 crc kubenswrapper[4764]: I0309 13:22:40.733933 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:40 crc kubenswrapper[4764]: I0309 13:22:40.733943 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:40 crc kubenswrapper[4764]: I0309 13:22:40.733975 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:40 crc kubenswrapper[4764]: I0309 13:22:40.733987 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:40Z","lastTransitionTime":"2026-03-09T13:22:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:22:40 crc kubenswrapper[4764]: I0309 13:22:40.776568 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gsbrz\" (UniqueName: \"kubernetes.io/projected/6597fc34-10ee-4984-9c69-f4b7c0d46e2a-kube-api-access-gsbrz\") pod \"network-metrics-daemon-wkwdz\" (UID: \"6597fc34-10ee-4984-9c69-f4b7c0d46e2a\") " pod="openshift-multus/network-metrics-daemon-wkwdz" Mar 09 13:22:40 crc kubenswrapper[4764]: I0309 13:22:40.776636 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6597fc34-10ee-4984-9c69-f4b7c0d46e2a-metrics-certs\") pod \"network-metrics-daemon-wkwdz\" (UID: \"6597fc34-10ee-4984-9c69-f4b7c0d46e2a\") " pod="openshift-multus/network-metrics-daemon-wkwdz" Mar 09 13:22:40 crc kubenswrapper[4764]: E0309 13:22:40.776776 4764 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" 
not registered Mar 09 13:22:40 crc kubenswrapper[4764]: E0309 13:22:40.776859 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6597fc34-10ee-4984-9c69-f4b7c0d46e2a-metrics-certs podName:6597fc34-10ee-4984-9c69-f4b7c0d46e2a nodeName:}" failed. No retries permitted until 2026-03-09 13:22:41.276831472 +0000 UTC m=+116.527003380 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/6597fc34-10ee-4984-9c69-f4b7c0d46e2a-metrics-certs") pod "network-metrics-daemon-wkwdz" (UID: "6597fc34-10ee-4984-9c69-f4b7c0d46e2a") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 09 13:22:40 crc kubenswrapper[4764]: I0309 13:22:40.800181 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gsbrz\" (UniqueName: \"kubernetes.io/projected/6597fc34-10ee-4984-9c69-f4b7c0d46e2a-kube-api-access-gsbrz\") pod \"network-metrics-daemon-wkwdz\" (UID: \"6597fc34-10ee-4984-9c69-f4b7c0d46e2a\") " pod="openshift-multus/network-metrics-daemon-wkwdz" Mar 09 13:22:40 crc kubenswrapper[4764]: I0309 13:22:40.835838 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:40 crc kubenswrapper[4764]: I0309 13:22:40.836094 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:40 crc kubenswrapper[4764]: I0309 13:22:40.836167 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:40 crc kubenswrapper[4764]: I0309 13:22:40.836230 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:40 crc kubenswrapper[4764]: I0309 13:22:40.836332 4764 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:40Z","lastTransitionTime":"2026-03-09T13:22:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:22:40 crc kubenswrapper[4764]: I0309 13:22:40.939115 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:40 crc kubenswrapper[4764]: I0309 13:22:40.939152 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:40 crc kubenswrapper[4764]: I0309 13:22:40.939160 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:40 crc kubenswrapper[4764]: I0309 13:22:40.939174 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:40 crc kubenswrapper[4764]: I0309 13:22:40.939184 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:40Z","lastTransitionTime":"2026-03-09T13:22:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:40 crc kubenswrapper[4764]: I0309 13:22:40.951088 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hzjxs" event={"ID":"30a92c50-fe51-40d9-a69c-4b5fd722bfc6","Type":"ContainerStarted","Data":"10aea1054bea324b596bc250a9d3f7010db0636732f622c0674db83b1f245098"} Mar 09 13:22:40 crc kubenswrapper[4764]: I0309 13:22:40.951348 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hzjxs" event={"ID":"30a92c50-fe51-40d9-a69c-4b5fd722bfc6","Type":"ContainerStarted","Data":"73df6317c77ae8a47acf57b1fdb2710ccefa68c346011b5a98d5a38c775c9727"} Mar 09 13:22:40 crc kubenswrapper[4764]: I0309 13:22:40.951450 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hzjxs" event={"ID":"30a92c50-fe51-40d9-a69c-4b5fd722bfc6","Type":"ContainerStarted","Data":"b60d1a57310c4eecf9739792746a14c95b7b8a0de18df8ad29785d547b60b123"} Mar 09 13:22:40 crc kubenswrapper[4764]: I0309 13:22:40.952323 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-zmzm7" event={"ID":"202a1f58-ce83-4374-ac48-dc806f7b9d6b","Type":"ContainerStarted","Data":"05cbc20845591e82618289ff2509634cc5e2d05362e8414b3ff91a73d0719331"} Mar 09 13:22:40 crc kubenswrapper[4764]: I0309 13:22:40.972260 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6bcdd179-43c2-427c-9fac-7155c122e922\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25jms\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25jms\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:28Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xxczl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:40Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:40 crc kubenswrapper[4764]: I0309 13:22:40.984193 4764 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:40Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:40 crc kubenswrapper[4764]: I0309 13:22:40.996633 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:40Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:41 crc kubenswrapper[4764]: I0309 13:22:41.013400 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4dd3c85ee070f78d4d30b9ccad3da35f82313620fffdaf69c3424865843dfa39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5280ec2d6cc95dbda0281276e900fa2c9d43170f932966688440a55c1deb454\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:41Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:41 crc kubenswrapper[4764]: I0309 13:22:41.030595 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:41Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:41 crc kubenswrapper[4764]: I0309 13:22:41.041504 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:41 crc kubenswrapper[4764]: I0309 13:22:41.041547 4764 kubelet_node_status.go:724] "Recording event message for 
node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:41 crc kubenswrapper[4764]: I0309 13:22:41.041555 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:41 crc kubenswrapper[4764]: I0309 13:22:41.041573 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:41 crc kubenswrapper[4764]: I0309 13:22:41.041582 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:41Z","lastTransitionTime":"2026-03-09T13:22:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:22:41 crc kubenswrapper[4764]: I0309 13:22:41.045966 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:41Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:41 crc kubenswrapper[4764]: I0309 13:22:41.061999 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7c21ab2-1820-47de-a61d-71d81928564a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://482469a933d0aba722e6a341f6934669e0d84998bd2c99d6962a17548ab80ce7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3571aa0b654db21a79a11d6c58af9f858d2493283d9eff7a84f3404c34dcf0b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4aae266a5256be6c65a7637dfaed552087689f317f7b61dcb52f4aa6d4d4d7e6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e04ceb7d287648fc77742a0a1b4cf13a56f634301b703291340961730fc73811\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://033930947dc527026e2797f16a6dd9181c8c7d8ec88b09031a132f55b953c356\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T13:21:44Z\\\",\\\"message\\\":\\\"W0309 13:21:43.803612 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0309 13:21:43.803959 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773062503 cert, and key in /tmp/serving-cert-1644690243/serving-signer.crt, /tmp/serving-cert-1644690243/serving-signer.key\\\\nI0309 13:21:44.168505 1 observer_polling.go:159] Starting file observer\\\\nW0309 13:21:44.177360 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0309 13:21:44.177910 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0309 13:21:44.178828 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1644690243/tls.crt::/tmp/serving-cert-1644690243/tls.key\\\\\\\"\\\\nF0309 13:21:44.724786 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:43Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3a34e47ede9b05cec931ca6337a98381bcceeb735352702d19128e8dfe8476f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:48Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a38de1b3854f33b5728435571b8f0df92aeaae7d8e8a235ed58044fa2a9e09c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a38de1b3854f33b5728435571b8f0df92aeaae7d8e8a235ed58044fa2a9e09c3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:46Z\\\",\\\"reason\\\":\\\"Completed\\\"
,\\\"startedAt\\\":\\\"2026-03-09T13:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:20:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:41Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:41 crc kubenswrapper[4764]: I0309 13:22:41.077156 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-r5bnx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7fede188-66d9-4cb1-af19-c94afe7fbcde\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:27Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pjp5c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:27Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-r5bnx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:41Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:41 crc kubenswrapper[4764]: I0309 13:22:41.090499 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zmzm7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"202a1f58-ce83-4374-ac48-dc806f7b9d6b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rk5pb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zmzm7\": Internal error occurred: 
failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:41Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:41 crc kubenswrapper[4764]: I0309 13:22:41.104966 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-crvdf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"072442e6-8ece-4f72-a8cb-ad7ef1e3facb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\
\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\"
:\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-crvdf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:41Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:41 crc kubenswrapper[4764]: I0309 13:22:41.121838 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7kggv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b8ccb4f5-550a-41b2-b39d-201cdd5d902a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:28Z\\\"}}\" for 
pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7kggv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:41Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:41 crc kubenswrapper[4764]: I0309 13:22:41.131077 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-hbmjc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7c2b6620-c8b8-47cf-9f15-883dbf8e34cc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d8a2c962bc242865fd8c0fce8fc21f9e385e38cc9b2892663a82c02dfe6cac9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"resta
rtCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z4sdq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:34Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-hbmjc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:41Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:41 crc kubenswrapper[4764]: I0309 13:22:41.140708 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hzjxs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"30a92c50-fe51-40d9-a69c-4b5fd722bfc6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73df6317c77ae8a47acf57b1fdb2710ccefa68c346011b5a98d5a38c775c9727\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdm9q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://10aea1054bea324b596bc250a9d3f7010db06
36732f622c0674db83b1f245098\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdm9q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-hzjxs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:41Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:41 crc kubenswrapper[4764]: I0309 13:22:41.143924 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:41 crc kubenswrapper[4764]: I0309 13:22:41.144011 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:41 crc kubenswrapper[4764]: I0309 13:22:41.144033 4764 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:41 crc kubenswrapper[4764]: I0309 13:22:41.144066 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:41 crc kubenswrapper[4764]: I0309 13:22:41.144089 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:41Z","lastTransitionTime":"2026-03-09T13:22:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:22:41 crc kubenswrapper[4764]: I0309 13:22:41.152623 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-wkwdz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6597fc34-10ee-4984-9c69-f4b7c0d46e2a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gsbrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gsbrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:40Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-wkwdz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:41Z is after 2025-08-24T17:21:41Z" Mar 
09 13:22:41 crc kubenswrapper[4764]: I0309 13:22:41.170317 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:41Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:41 crc kubenswrapper[4764]: I0309 13:22:41.183565 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:41Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:41 crc kubenswrapper[4764]: I0309 13:22:41.197018 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:41Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:41 crc kubenswrapper[4764]: I0309 13:22:41.210799 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4dd3c85ee070f78d4d30b9ccad3da35f82313620fffdaf69c3424865843dfa39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5280ec2d6cc95dbda0281276e900fa2c9d43170f932966688440a55c1deb454\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:41Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:41 crc kubenswrapper[4764]: I0309 13:22:41.229569 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:41Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:41 crc kubenswrapper[4764]: I0309 13:22:41.242948 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:41Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:41 crc kubenswrapper[4764]: I0309 13:22:41.246634 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:41 crc kubenswrapper[4764]: I0309 13:22:41.246705 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:41 crc kubenswrapper[4764]: I0309 13:22:41.246718 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:41 crc kubenswrapper[4764]: I0309 13:22:41.246739 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:41 crc kubenswrapper[4764]: I0309 13:22:41.246751 4764 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:41Z","lastTransitionTime":"2026-03-09T13:22:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:22:41 crc kubenswrapper[4764]: I0309 13:22:41.259427 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7c21ab2-1820-47de-a61d-71d81928564a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://482469a933d0aba722e6a341f6934669e0d84998bd2c99d6962a17548ab80ce7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3571aa0b654db21a79a11d6c58af9f858d2493283d9eff7a84f3404c34dcf0b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://4aae266a5256be6c65a7637dfaed552087689f317f7b61dcb52f4aa6d4d4d7e6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e04ceb7d287648fc77742a0a1b4cf13a56f634301b703291340961730fc73811\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://033930947dc527026e2797f16a6dd9181c8c7d8ec88b09031a132f55b953c356\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T13:21:44Z\\\",\\\"message\\\":\\\"W0309 13:21:43.803612 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0309 13:21:43.803959 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773062503 cert, and key in /tmp/serving-cert-1644690243/serving-signer.crt, /tmp/serving-cert-1644690243/serving-signer.key\\\\nI0309 13:21:44.168505 1 observer_polling.go:159] Starting file observer\\\\nW0309 13:21:44.177360 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0309 13:21:44.177910 1 builder.go:304] check-endpoints 
version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0309 13:21:44.178828 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1644690243/tls.crt::/tmp/serving-cert-1644690243/tls.key\\\\\\\"\\\\nF0309 13:21:44.724786 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:43Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3a34e47ede9b05cec931ca6337a98381bcceeb735352702d19128e8dfe8476f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:48Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a38de1b3854f33b5728435571b8f0df92aeaae7d8e8a235ed58044fa2a9e09c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a38de1b3854f33b5728435571b8f0df92aeaae7d8e8a235ed58044fa2a9e09c3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:20:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:41Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:41 crc kubenswrapper[4764]: I0309 13:22:41.274220 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-r5bnx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7fede188-66d9-4cb1-af19-c94afe7fbcde\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:27Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:27Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pjp5c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:27Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-r5bnx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:41Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:41 crc kubenswrapper[4764]: I0309 13:22:41.283057 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6597fc34-10ee-4984-9c69-f4b7c0d46e2a-metrics-certs\") pod \"network-metrics-daemon-wkwdz\" (UID: \"6597fc34-10ee-4984-9c69-f4b7c0d46e2a\") " pod="openshift-multus/network-metrics-daemon-wkwdz" Mar 09 13:22:41 crc kubenswrapper[4764]: E0309 13:22:41.283331 4764 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 09 13:22:41 crc kubenswrapper[4764]: E0309 13:22:41.283443 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6597fc34-10ee-4984-9c69-f4b7c0d46e2a-metrics-certs podName:6597fc34-10ee-4984-9c69-f4b7c0d46e2a nodeName:}" failed. No retries permitted until 2026-03-09 13:22:42.28341776 +0000 UTC m=+117.533589668 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/6597fc34-10ee-4984-9c69-f4b7c0d46e2a-metrics-certs") pod "network-metrics-daemon-wkwdz" (UID: "6597fc34-10ee-4984-9c69-f4b7c0d46e2a") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 09 13:22:41 crc kubenswrapper[4764]: I0309 13:22:41.292540 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zmzm7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"202a1f58-ce83-4374-ac48-dc806f7b9d6b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://05cbc20845591e82618289ff2509634cc5e2d05362e8414b3ff91a73d0719331\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\
":\\\"2026-03-09T13:22:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rk5pb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zmzm7\": Internal error occurred: 
failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:41Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:41 crc kubenswrapper[4764]: I0309 13:22:41.310290 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-crvdf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"072442e6-8ece-4f72-a8cb-ad7ef1e3facb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\
\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\"
:\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-crvdf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:41Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:41 crc kubenswrapper[4764]: I0309 13:22:41.343103 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7kggv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b8ccb4f5-550a-41b2-b39d-201cdd5d902a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:28Z\\\"}}\" for 
pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7kggv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:41Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:41 crc kubenswrapper[4764]: I0309 13:22:41.350618 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:41 crc kubenswrapper[4764]: I0309 13:22:41.350726 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:41 crc kubenswrapper[4764]: I0309 13:22:41.350745 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:41 crc kubenswrapper[4764]: I0309 13:22:41.350774 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:41 crc kubenswrapper[4764]: I0309 13:22:41.350799 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:41Z","lastTransitionTime":"2026-03-09T13:22:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:41 crc kubenswrapper[4764]: I0309 13:22:41.363425 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-hbmjc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7c2b6620-c8b8-47cf-9f15-883dbf8e34cc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d8a2c962bc242865fd8c0fce8fc21f9e385e38cc9b2892663a82c02dfe6cac9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z4sdq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:34Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-hbmjc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:41Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:41 crc kubenswrapper[4764]: I0309 13:22:41.380059 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hzjxs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"30a92c50-fe51-40d9-a69c-4b5fd722bfc6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73df6317c
77ae8a47acf57b1fdb2710ccefa68c346011b5a98d5a38c775c9727\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdm9q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://10aea1054bea324b596bc250a9d3f7010db0636732f622c0674db83b1f245098\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdm9q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-hzjxs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:41Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:41 crc kubenswrapper[4764]: I0309 13:22:41.394504 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-wkwdz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6597fc34-10ee-4984-9c69-f4b7c0d46e2a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gsbrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gsbrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:40Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-wkwdz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:41Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:41 crc 
kubenswrapper[4764]: I0309 13:22:41.410499 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:41Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:41 crc kubenswrapper[4764]: I0309 13:22:41.426939 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6bcdd179-43c2-427c-9fac-7155c122e922\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25jms\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25jms\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:28Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xxczl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:41Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:41 crc kubenswrapper[4764]: I0309 13:22:41.454980 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 
13:22:41 crc kubenswrapper[4764]: I0309 13:22:41.455034 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:41 crc kubenswrapper[4764]: I0309 13:22:41.455045 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:41 crc kubenswrapper[4764]: I0309 13:22:41.455063 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:41 crc kubenswrapper[4764]: I0309 13:22:41.455078 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:41Z","lastTransitionTime":"2026-03-09T13:22:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:22:41 crc kubenswrapper[4764]: I0309 13:22:41.558872 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-wkwdz" Mar 09 13:22:41 crc kubenswrapper[4764]: I0309 13:22:41.559055 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:41 crc kubenswrapper[4764]: I0309 13:22:41.559134 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:41 crc kubenswrapper[4764]: I0309 13:22:41.559146 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:41 crc kubenswrapper[4764]: I0309 13:22:41.559175 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:41 crc kubenswrapper[4764]: I0309 13:22:41.559185 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:41Z","lastTransitionTime":"2026-03-09T13:22:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:22:41 crc kubenswrapper[4764]: E0309 13:22:41.559415 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-wkwdz" podUID="6597fc34-10ee-4984-9c69-f4b7c0d46e2a" Mar 09 13:22:41 crc kubenswrapper[4764]: I0309 13:22:41.662661 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:41 crc kubenswrapper[4764]: I0309 13:22:41.662711 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:41 crc kubenswrapper[4764]: I0309 13:22:41.662722 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:41 crc kubenswrapper[4764]: I0309 13:22:41.662738 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:41 crc kubenswrapper[4764]: I0309 13:22:41.662748 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:41Z","lastTransitionTime":"2026-03-09T13:22:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:41 crc kubenswrapper[4764]: I0309 13:22:41.764867 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:41 crc kubenswrapper[4764]: I0309 13:22:41.764913 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:41 crc kubenswrapper[4764]: I0309 13:22:41.764922 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:41 crc kubenswrapper[4764]: I0309 13:22:41.764940 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:41 crc kubenswrapper[4764]: I0309 13:22:41.764952 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:41Z","lastTransitionTime":"2026-03-09T13:22:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:41 crc kubenswrapper[4764]: I0309 13:22:41.868036 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:41 crc kubenswrapper[4764]: I0309 13:22:41.868092 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:41 crc kubenswrapper[4764]: I0309 13:22:41.868110 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:41 crc kubenswrapper[4764]: I0309 13:22:41.868138 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:41 crc kubenswrapper[4764]: I0309 13:22:41.868159 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:41Z","lastTransitionTime":"2026-03-09T13:22:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:41 crc kubenswrapper[4764]: I0309 13:22:41.958510 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-crvdf" event={"ID":"072442e6-8ece-4f72-a8cb-ad7ef1e3facb","Type":"ContainerStarted","Data":"efdc5099d0e35ffc801fb7fb52979c62ed875cd19c8d0b015350b4906e6ea61f"} Mar 09 13:22:41 crc kubenswrapper[4764]: I0309 13:22:41.970605 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:41 crc kubenswrapper[4764]: I0309 13:22:41.970637 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:41 crc kubenswrapper[4764]: I0309 13:22:41.970661 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:41 crc kubenswrapper[4764]: I0309 13:22:41.970675 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:41 crc kubenswrapper[4764]: I0309 13:22:41.970684 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:41Z","lastTransitionTime":"2026-03-09T13:22:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:41 crc kubenswrapper[4764]: I0309 13:22:41.982670 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zmzm7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"202a1f58-ce83-4374-ac48-dc806f7b9d6b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://05cbc20845591e82618289ff2509634cc5e2d05362e8414b3ff91a73d0719331\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rk5pb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zmzm7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:41Z 
is after 2025-08-24T17:21:41Z" Mar 09 13:22:42 crc kubenswrapper[4764]: I0309 13:22:42.004245 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-crvdf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"072442e6-8ece-4f72-a8cb-ad7ef1e3facb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://efdc5099d0e35ffc801fb7fb52979c62ed875cd19c8d0b015350b4906e6ea61f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"ima
ge\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69
b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volum
eMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-crvdf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:42Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:42 crc kubenswrapper[4764]: I0309 13:22:42.031374 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7kggv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b8ccb4f5-550a-41b2-b39d-201cdd5d902a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-ce
rt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":
\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kub
ernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-open
vswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7kggv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:42Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:42 crc kubenswrapper[4764]: I0309 13:22:42.046191 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-hbmjc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7c2b6620-c8b8-47cf-9f15-883dbf8e34cc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d8a2c962bc242865fd8c0fce8fc21f9e385e38cc9b2892663a82c02dfe6cac9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z4sdq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:34Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-hbmjc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:42Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:42 crc kubenswrapper[4764]: I0309 13:22:42.061662 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hzjxs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"30a92c50-fe51-40d9-a69c-4b5fd722bfc6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73df6317c77ae8a47acf57b1fdb2710ccefa68c346011b5a98d5a38c775c9727\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdm9q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://10aea1054bea324b596bc250a9d3f7010db0636732f622c0674db83b1f245098\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdm9q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-hzjxs\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:42Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:42 crc kubenswrapper[4764]: I0309 13:22:42.073735 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:42 crc kubenswrapper[4764]: I0309 13:22:42.073777 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:42 crc kubenswrapper[4764]: I0309 13:22:42.073790 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:42 crc kubenswrapper[4764]: I0309 13:22:42.073810 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:42 crc kubenswrapper[4764]: I0309 13:22:42.073826 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:42Z","lastTransitionTime":"2026-03-09T13:22:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:42 crc kubenswrapper[4764]: I0309 13:22:42.074860 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-wkwdz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6597fc34-10ee-4984-9c69-f4b7c0d46e2a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gsbrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gsbrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:40Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-wkwdz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:42Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:42 crc 
kubenswrapper[4764]: I0309 13:22:42.094120 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:42Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:42 crc kubenswrapper[4764]: I0309 13:22:42.107427 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-r5bnx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7fede188-66d9-4cb1-af19-c94afe7fbcde\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:27Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:27Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pjp5c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:27Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-r5bnx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:42Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:42 crc kubenswrapper[4764]: I0309 13:22:42.121561 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6bcdd179-43c2-427c-9fac-7155c122e922\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25jms\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25jms\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:28Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xxczl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:42Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:42 crc kubenswrapper[4764]: I0309 13:22:42.136845 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:42Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:42 crc kubenswrapper[4764]: I0309 13:22:42.152077 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:42Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:42 crc kubenswrapper[4764]: I0309 13:22:42.165278 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4dd3c85ee070f78d4d30b9ccad3da35f82313620fffdaf69c3424865843dfa39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5280ec2d6cc95dbda0281276e900fa2c9d43170f932966688440a55c1deb454\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:42Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:42 crc kubenswrapper[4764]: I0309 13:22:42.176737 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:42Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:42 crc kubenswrapper[4764]: I0309 13:22:42.176996 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:42 crc kubenswrapper[4764]: I0309 13:22:42.177031 4764 kubelet_node_status.go:724] "Recording event message for 
node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:42 crc kubenswrapper[4764]: I0309 13:22:42.177045 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:42 crc kubenswrapper[4764]: I0309 13:22:42.177070 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:42 crc kubenswrapper[4764]: I0309 13:22:42.177087 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:42Z","lastTransitionTime":"2026-03-09T13:22:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:22:42 crc kubenswrapper[4764]: I0309 13:22:42.188580 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:42Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:42 crc kubenswrapper[4764]: I0309 13:22:42.202079 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7c21ab2-1820-47de-a61d-71d81928564a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://482469a933d0aba722e6a341f6934669e0d84998bd2c99d6962a17548ab80ce7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3571aa0b654db21a79a11d6c58af9f858d2493283d9eff7a84f3404c34dcf0b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4aae266a5256be6c65a7637dfaed552087689f317f7b61dcb52f4aa6d4d4d7e6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e04ceb7d287648fc77742a0a1b4cf13a56f634301b703291340961730fc73811\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://033930947dc527026e2797f16a6dd9181c8c7d8ec88b09031a132f55b953c356\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T13:21:44Z\\\",\\\"message\\\":\\\"W0309 13:21:43.803612 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0309 13:21:43.803959 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773062503 cert, and key in /tmp/serving-cert-1644690243/serving-signer.crt, /tmp/serving-cert-1644690243/serving-signer.key\\\\nI0309 13:21:44.168505 1 observer_polling.go:159] Starting file observer\\\\nW0309 13:21:44.177360 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0309 13:21:44.177910 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0309 13:21:44.178828 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1644690243/tls.crt::/tmp/serving-cert-1644690243/tls.key\\\\\\\"\\\\nF0309 13:21:44.724786 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:43Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3a34e47ede9b05cec931ca6337a98381bcceeb735352702d19128e8dfe8476f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:48Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a38de1b3854f33b5728435571b8f0df92aeaae7d8e8a235ed58044fa2a9e09c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a38de1b3854f33b5728435571b8f0df92aeaae7d8e8a235ed58044fa2a9e09c3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:46Z\\\",\\\"reason\\\":\\\"Completed\\\"
,\\\"startedAt\\\":\\\"2026-03-09T13:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:20:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:42Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:42 crc kubenswrapper[4764]: I0309 13:22:42.281229 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:42 crc kubenswrapper[4764]: I0309 13:22:42.281331 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:42 crc kubenswrapper[4764]: I0309 13:22:42.281359 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:42 crc kubenswrapper[4764]: I0309 13:22:42.281399 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:42 crc kubenswrapper[4764]: I0309 13:22:42.281438 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:42Z","lastTransitionTime":"2026-03-09T13:22:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:42 crc kubenswrapper[4764]: I0309 13:22:42.293173 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6597fc34-10ee-4984-9c69-f4b7c0d46e2a-metrics-certs\") pod \"network-metrics-daemon-wkwdz\" (UID: \"6597fc34-10ee-4984-9c69-f4b7c0d46e2a\") " pod="openshift-multus/network-metrics-daemon-wkwdz" Mar 09 13:22:42 crc kubenswrapper[4764]: E0309 13:22:42.293376 4764 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 09 13:22:42 crc kubenswrapper[4764]: E0309 13:22:42.293445 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6597fc34-10ee-4984-9c69-f4b7c0d46e2a-metrics-certs podName:6597fc34-10ee-4984-9c69-f4b7c0d46e2a nodeName:}" failed. No retries permitted until 2026-03-09 13:22:44.293423211 +0000 UTC m=+119.543595119 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/6597fc34-10ee-4984-9c69-f4b7c0d46e2a-metrics-certs") pod "network-metrics-daemon-wkwdz" (UID: "6597fc34-10ee-4984-9c69-f4b7c0d46e2a") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 09 13:22:42 crc kubenswrapper[4764]: I0309 13:22:42.384760 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:42 crc kubenswrapper[4764]: I0309 13:22:42.384807 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:42 crc kubenswrapper[4764]: I0309 13:22:42.384822 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:42 crc kubenswrapper[4764]: I0309 13:22:42.384844 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:42 crc kubenswrapper[4764]: I0309 13:22:42.384859 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:42Z","lastTransitionTime":"2026-03-09T13:22:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:42 crc kubenswrapper[4764]: I0309 13:22:42.488863 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:42 crc kubenswrapper[4764]: I0309 13:22:42.488952 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:42 crc kubenswrapper[4764]: I0309 13:22:42.488975 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:42 crc kubenswrapper[4764]: I0309 13:22:42.489016 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:42 crc kubenswrapper[4764]: I0309 13:22:42.489042 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:42Z","lastTransitionTime":"2026-03-09T13:22:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:22:42 crc kubenswrapper[4764]: I0309 13:22:42.559771 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 13:22:42 crc kubenswrapper[4764]: I0309 13:22:42.559837 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 13:22:42 crc kubenswrapper[4764]: I0309 13:22:42.559923 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 13:22:42 crc kubenswrapper[4764]: E0309 13:22:42.560610 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 13:22:42 crc kubenswrapper[4764]: E0309 13:22:42.560727 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 13:22:42 crc kubenswrapper[4764]: E0309 13:22:42.560751 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 13:22:42 crc kubenswrapper[4764]: I0309 13:22:42.593933 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:42 crc kubenswrapper[4764]: I0309 13:22:42.593968 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:42 crc kubenswrapper[4764]: I0309 13:22:42.593979 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:42 crc kubenswrapper[4764]: I0309 13:22:42.593995 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:42 crc kubenswrapper[4764]: I0309 13:22:42.594008 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:42Z","lastTransitionTime":"2026-03-09T13:22:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:42 crc kubenswrapper[4764]: I0309 13:22:42.696501 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:42 crc kubenswrapper[4764]: I0309 13:22:42.696535 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:42 crc kubenswrapper[4764]: I0309 13:22:42.696544 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:42 crc kubenswrapper[4764]: I0309 13:22:42.696558 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:42 crc kubenswrapper[4764]: I0309 13:22:42.696566 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:42Z","lastTransitionTime":"2026-03-09T13:22:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:42 crc kubenswrapper[4764]: I0309 13:22:42.798425 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:42 crc kubenswrapper[4764]: I0309 13:22:42.798459 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:42 crc kubenswrapper[4764]: I0309 13:22:42.798469 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:42 crc kubenswrapper[4764]: I0309 13:22:42.798481 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:42 crc kubenswrapper[4764]: I0309 13:22:42.798490 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:42Z","lastTransitionTime":"2026-03-09T13:22:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:42 crc kubenswrapper[4764]: I0309 13:22:42.900947 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:42 crc kubenswrapper[4764]: I0309 13:22:42.900983 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:42 crc kubenswrapper[4764]: I0309 13:22:42.900993 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:42 crc kubenswrapper[4764]: I0309 13:22:42.901007 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:42 crc kubenswrapper[4764]: I0309 13:22:42.901017 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:42Z","lastTransitionTime":"2026-03-09T13:22:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:42 crc kubenswrapper[4764]: I0309 13:22:42.925027 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 09 13:22:42 crc kubenswrapper[4764]: I0309 13:22:42.944079 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zmzm7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"202a1f58-ce83-4374-ac48-dc806f7b9d6b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://05cbc20845591e82618289ff2509634cc5e2d05362e8414b3ff91a73d0719331\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\
\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rk5pb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zmzm7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed 
to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:42Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:42 crc kubenswrapper[4764]: I0309 13:22:42.961517 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-crvdf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"072442e6-8ece-4f72-a8cb-ad7ef1e3facb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://efdc5099d0e35ffc801fb7fb52979c62ed875cd19c8d0b015350b4906e6ea61f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"ima
ge\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69
b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volum
eMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-crvdf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:42Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:42 crc kubenswrapper[4764]: I0309 13:22:42.963177 4764 generic.go:334] "Generic (PLEG): container finished" podID="072442e6-8ece-4f72-a8cb-ad7ef1e3facb" containerID="efdc5099d0e35ffc801fb7fb52979c62ed875cd19c8d0b015350b4906e6ea61f" exitCode=0 Mar 09 13:22:42 crc kubenswrapper[4764]: I0309 13:22:42.963250 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-crvdf" event={"ID":"072442e6-8ece-4f72-a8cb-ad7ef1e3facb","Type":"ContainerDied","Data":"efdc5099d0e35ffc801fb7fb52979c62ed875cd19c8d0b015350b4906e6ea61f"} Mar 09 13:22:42 crc kubenswrapper[4764]: I0309 13:22:42.965707 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-r5bnx" event={"ID":"7fede188-66d9-4cb1-af19-c94afe7fbcde","Type":"ContainerStarted","Data":"7a63a85aa57ac9496c6745b881ceba7015f610b3ce31c17513724f036bea999f"} Mar 09 13:22:42 crc kubenswrapper[4764]: I0309 13:22:42.967758 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" 
event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"2b3bffb853598e5fea63b5302816325c29f740c9e9e2a7f0ee4f87e810045f97"} Mar 09 13:22:42 crc kubenswrapper[4764]: I0309 13:22:42.983122 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7kggv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b8ccb4f5-550a-41b2-b39d-201cdd5d902a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:28Z\\\"}}\" for 
pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7kggv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:42Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:42 crc kubenswrapper[4764]: I0309 13:22:42.998431 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-hbmjc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7c2b6620-c8b8-47cf-9f15-883dbf8e34cc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d8a2c962bc242865fd8c0fce8fc21f9e385e38cc9b2892663a82c02dfe6cac9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"resta
rtCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z4sdq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:34Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-hbmjc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:42Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:43 crc kubenswrapper[4764]: I0309 13:22:43.003289 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:43 crc kubenswrapper[4764]: I0309 13:22:43.003332 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:43 crc kubenswrapper[4764]: I0309 13:22:43.003341 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:43 crc kubenswrapper[4764]: I0309 13:22:43.003354 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:43 crc kubenswrapper[4764]: I0309 13:22:43.003364 4764 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:43Z","lastTransitionTime":"2026-03-09T13:22:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:22:43 crc kubenswrapper[4764]: I0309 13:22:43.010730 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hzjxs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"30a92c50-fe51-40d9-a69c-4b5fd722bfc6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73df6317c77ae8a47acf57b1fdb2710ccefa68c346011b5a98d5a38c775c9727\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kub
e-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdm9q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://10aea1054bea324b596bc250a9d3f7010db0636732f622c0674db83b1f245098\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdm9q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-hzjxs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:43Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:43 crc kubenswrapper[4764]: I0309 13:22:43.022758 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-wkwdz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6597fc34-10ee-4984-9c69-f4b7c0d46e2a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gsbrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gsbrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:40Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-wkwdz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:43Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:43 crc 
kubenswrapper[4764]: I0309 13:22:43.036520 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:43Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:43 crc kubenswrapper[4764]: I0309 13:22:43.050934 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-r5bnx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7fede188-66d9-4cb1-af19-c94afe7fbcde\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:27Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:27Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pjp5c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:27Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-r5bnx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:43Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:43 crc kubenswrapper[4764]: I0309 13:22:43.063021 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6bcdd179-43c2-427c-9fac-7155c122e922\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25jms\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25jms\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:28Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xxczl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:43Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:43 crc kubenswrapper[4764]: I0309 13:22:43.073126 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:43Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:43 crc kubenswrapper[4764]: I0309 13:22:43.083464 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:43Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:43 crc kubenswrapper[4764]: I0309 13:22:43.099239 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4dd3c85ee070f78d4d30b9ccad3da35f82313620fffdaf69c3424865843dfa39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5280ec2d6cc95dbda0281276e900fa2c9d43170f932966688440a55c1deb454\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:43Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:43 crc kubenswrapper[4764]: I0309 13:22:43.105223 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:43 crc kubenswrapper[4764]: I0309 13:22:43.105261 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:43 crc kubenswrapper[4764]: I0309 13:22:43.105270 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:43 crc kubenswrapper[4764]: I0309 13:22:43.105285 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:43 crc kubenswrapper[4764]: I0309 13:22:43.105293 4764 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:43Z","lastTransitionTime":"2026-03-09T13:22:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:22:43 crc kubenswrapper[4764]: I0309 13:22:43.113867 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:43Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:43 crc kubenswrapper[4764]: I0309 13:22:43.124777 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:43Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:43 crc kubenswrapper[4764]: I0309 13:22:43.149102 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7c21ab2-1820-47de-a61d-71d81928564a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://482469a933d0aba722e6a341f6934669e0d84998bd2c99d6962a17548ab80ce7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3571aa0b654db21a79a11d6c58af9f858d2493283d9eff7a84f3404c34dcf0b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4aae266a5256be6c65a7637dfaed552087689f317f7b61dcb52f4aa6d4d4d7e6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e04ceb7d287648fc77742a0a1b4cf13a56f634301b703291340961730fc73811\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://033930947dc527026e2797f16a6dd9181c8c7d8ec88b09031a132f55b953c356\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T13:21:44Z\\\"
,\\\"message\\\":\\\"W0309 13:21:43.803612 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0309 13:21:43.803959 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773062503 cert, and key in /tmp/serving-cert-1644690243/serving-signer.crt, /tmp/serving-cert-1644690243/serving-signer.key\\\\nI0309 13:21:44.168505 1 observer_polling.go:159] Starting file observer\\\\nW0309 13:21:44.177360 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0309 13:21:44.177910 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0309 13:21:44.178828 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1644690243/tls.crt::/tmp/serving-cert-1644690243/tls.key\\\\\\\"\\\\nF0309 13:21:44.724786 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:43Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3a34e47ede9b05cec931ca6337a98381bcceeb735352702d19128e8dfe8476f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:48Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a38de1b3854f33b5728435571b8f0df92aeaae7d8e8a235ed58044fa2a9e09c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a38de1b3854f33b5728435571b8f0df92aeaae7d8e8a235ed58044fa2a9e09c3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:46Z\\\",\\\"reason\\\":\\\"Completed\\\",
\\\"startedAt\\\":\\\"2026-03-09T13:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:20:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:43Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:43 crc kubenswrapper[4764]: I0309 13:22:43.170997 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4dd3c85ee070f78d4d30b9ccad3da35f82313620fffdaf69c3424865843dfa39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"last
State\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5280ec2d6cc95dbda0281276e900fa2c9d43170f932966688440a55c1deb454\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:43Z is after 2025-08-24T17:21:41Z" 
Mar 09 13:22:43 crc kubenswrapper[4764]: I0309 13:22:43.187344 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:43Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:43 crc kubenswrapper[4764]: I0309 13:22:43.199197 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:43Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:43 crc kubenswrapper[4764]: I0309 13:22:43.207531 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:43 crc kubenswrapper[4764]: I0309 
13:22:43.207564 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:43 crc kubenswrapper[4764]: I0309 13:22:43.207574 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:43 crc kubenswrapper[4764]: I0309 13:22:43.207587 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:43 crc kubenswrapper[4764]: I0309 13:22:43.207596 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:43Z","lastTransitionTime":"2026-03-09T13:22:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:22:43 crc kubenswrapper[4764]: I0309 13:22:43.212599 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7c21ab2-1820-47de-a61d-71d81928564a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://482469a933d0aba722e6a341f6934669e0d84998bd2c99d6962a17548ab80ce7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3571aa0b654db21a79a11d6c58af9f858d2493283d9eff7a84f3404c34dcf0b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4aae266a5256be6c65a7637dfaed552087689f317f7b61dcb52f4aa6d4d4d7e6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e04ceb7d287648fc77742a0a1b4cf13a56f634301b703291340961730fc73811\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://033930947dc527026e2797f16a6dd9181c8c7d8ec88b09031a132f55b953c356\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T13:21:44Z\\\"
,\\\"message\\\":\\\"W0309 13:21:43.803612 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0309 13:21:43.803959 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773062503 cert, and key in /tmp/serving-cert-1644690243/serving-signer.crt, /tmp/serving-cert-1644690243/serving-signer.key\\\\nI0309 13:21:44.168505 1 observer_polling.go:159] Starting file observer\\\\nW0309 13:21:44.177360 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0309 13:21:44.177910 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0309 13:21:44.178828 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1644690243/tls.crt::/tmp/serving-cert-1644690243/tls.key\\\\\\\"\\\\nF0309 13:21:44.724786 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:43Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3a34e47ede9b05cec931ca6337a98381bcceeb735352702d19128e8dfe8476f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:48Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a38de1b3854f33b5728435571b8f0df92aeaae7d8e8a235ed58044fa2a9e09c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a38de1b3854f33b5728435571b8f0df92aeaae7d8e8a235ed58044fa2a9e09c3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:46Z\\\",\\\"reason\\\":\\\"Completed\\\",
\\\"startedAt\\\":\\\"2026-03-09T13:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:20:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:43Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:43 crc kubenswrapper[4764]: I0309 13:22:43.224319 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zmzm7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"202a1f58-ce83-4374-ac48-dc806f7b9d6b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://05cbc20845591e82618289ff2509634cc5e2d05362e8414b3ff91a73d0719331\\\",\\\"image\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rk5pb\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zmzm7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:43Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:43 crc kubenswrapper[4764]: I0309 13:22:43.241334 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-crvdf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"072442e6-8ece-4f72-a8cb-ad7ef1e3facb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://efdc5099d0e35ffc801fb7fb52979c62ed875cd19c8d0b015350b4906e6ea61f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://efdc5099d0e35ffc801fb7fb52979c62ed875cd19c8d0b015350b4906e6ea61f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/b
in\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\
"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-crvdf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:43Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:43 crc kubenswrapper[4764]: I0309 13:22:43.258585 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7kggv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b8ccb4f5-550a-41b2-b39d-201cdd5d902a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:28Z\\\"}}\" for 
pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7kggv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:43Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:43 crc kubenswrapper[4764]: I0309 13:22:43.269422 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-hbmjc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7c2b6620-c8b8-47cf-9f15-883dbf8e34cc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d8a2c962bc242865fd8c0fce8fc21f9e385e38cc9b2892663a82c02dfe6cac9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"resta
rtCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z4sdq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:34Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-hbmjc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:43Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:43 crc kubenswrapper[4764]: I0309 13:22:43.279367 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hzjxs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"30a92c50-fe51-40d9-a69c-4b5fd722bfc6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73df6317c77ae8a47acf57b1fdb2710ccefa68c346011b5a98d5a38c775c9727\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdm9q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://10aea1054bea324b596bc250a9d3f7010db06
36732f622c0674db83b1f245098\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdm9q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-hzjxs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:43Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:43 crc kubenswrapper[4764]: I0309 13:22:43.288103 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-wkwdz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6597fc34-10ee-4984-9c69-f4b7c0d46e2a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gsbrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gsbrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:40Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-wkwdz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:43Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:43 crc 
kubenswrapper[4764]: I0309 13:22:43.298760 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b3bffb853598e5fea63b5302816325c29f740c9e9e2a7f0ee4f87e810045f97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:43Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:43 crc kubenswrapper[4764]: I0309 13:22:43.307318 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-r5bnx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7fede188-66d9-4cb1-af19-c94afe7fbcde\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a63a85aa57ac9496c6745b881ceba7015f610b3ce31c17513724f036bea999f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pjp5c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:27Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-r5bnx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:43Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:43 crc kubenswrapper[4764]: I0309 13:22:43.309445 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:43 crc kubenswrapper[4764]: I0309 13:22:43.309484 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:43 crc kubenswrapper[4764]: I0309 13:22:43.309499 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:43 crc kubenswrapper[4764]: I0309 13:22:43.309520 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:43 crc kubenswrapper[4764]: I0309 13:22:43.309536 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:43Z","lastTransitionTime":"2026-03-09T13:22:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:22:43 crc kubenswrapper[4764]: I0309 13:22:43.319147 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6bcdd179-43c2-427c-9fac-7155c122e922\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read 
at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25jms\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25jms\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:28Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xxczl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:43Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:43 crc kubenswrapper[4764]: I0309 13:22:43.331280 4764 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:43Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:43 crc kubenswrapper[4764]: I0309 13:22:43.344744 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:43Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:43 crc kubenswrapper[4764]: I0309 13:22:43.352789 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:43 crc kubenswrapper[4764]: I0309 13:22:43.352814 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:43 crc kubenswrapper[4764]: I0309 13:22:43.352822 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:43 crc 
kubenswrapper[4764]: I0309 13:22:43.352835 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:43 crc kubenswrapper[4764]: I0309 13:22:43.352845 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:43Z","lastTransitionTime":"2026-03-09T13:22:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:22:43 crc kubenswrapper[4764]: E0309 13:22:43.366257 4764 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:22:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:22:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:43Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:22:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:43Z\\\",\\\"message\\\":\\\"kubelet has 
sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:22:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:43Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff54
7002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\
"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cd
fc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"nam
es\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp
-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"44d5d5b8-3cb1-4753-9ed2-c6ede7c42d06\\\",\\\"systemUUID\\\":\\\"470a86bd-e6aa-42c1-b220-b7b8c0289210\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:43Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:43 crc kubenswrapper[4764]: I0309 13:22:43.369840 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:43 crc kubenswrapper[4764]: I0309 13:22:43.370006 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:43 crc kubenswrapper[4764]: I0309 13:22:43.370102 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:43 crc kubenswrapper[4764]: I0309 13:22:43.370194 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:43 crc kubenswrapper[4764]: I0309 13:22:43.370285 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:43Z","lastTransitionTime":"2026-03-09T13:22:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:43 crc kubenswrapper[4764]: E0309 13:22:43.384006 4764 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:22:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:22:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:43Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:22:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:22:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:43Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"44d5d5b8-3cb1-4753-9ed2-c6ede7c42d06\\\",\\\"systemUUID\\\":\\\"470a86bd-e6aa-42c1-b220-b7b8c0289210\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:43Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:43 crc kubenswrapper[4764]: I0309 13:22:43.387677 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:43 crc kubenswrapper[4764]: I0309 13:22:43.387937 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:43 crc kubenswrapper[4764]: I0309 13:22:43.387955 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:43 crc kubenswrapper[4764]: I0309 13:22:43.387973 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:43 crc kubenswrapper[4764]: I0309 13:22:43.387983 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:43Z","lastTransitionTime":"2026-03-09T13:22:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:43 crc kubenswrapper[4764]: E0309 13:22:43.402876 4764 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:22:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:22:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:43Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:22:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:22:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:43Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"44d5d5b8-3cb1-4753-9ed2-c6ede7c42d06\\\",\\\"systemUUID\\\":\\\"470a86bd-e6aa-42c1-b220-b7b8c0289210\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:43Z is after 2025-08-24T17:21:41Z"
Mar 09 13:22:43 crc kubenswrapper[4764]: I0309 13:22:43.407033 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 13:22:43 crc kubenswrapper[4764]: I0309 13:22:43.407083 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 13:22:43 crc kubenswrapper[4764]: I0309 13:22:43.407099 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 13:22:43 crc kubenswrapper[4764]: I0309 13:22:43.407118 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 09 13:22:43 crc kubenswrapper[4764]: I0309 13:22:43.407131 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:43Z","lastTransitionTime":"2026-03-09T13:22:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:43 crc kubenswrapper[4764]: E0309 13:22:43.435438 4764 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:22:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:22:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:43Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:22:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:22:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:43Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"44d5d5b8-3cb1-4753-9ed2-c6ede7c42d06\\\",\\\"systemUUID\\\":\\\"470a86bd-e6aa-42c1-b220-b7b8c0289210\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:43Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:43 crc kubenswrapper[4764]: E0309 13:22:43.435550 4764 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 09 13:22:43 crc kubenswrapper[4764]: I0309 13:22:43.437569 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:43 crc kubenswrapper[4764]: I0309 13:22:43.437631 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:43 crc kubenswrapper[4764]: I0309 13:22:43.437660 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:43 crc kubenswrapper[4764]: I0309 13:22:43.437681 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:43 crc kubenswrapper[4764]: I0309 13:22:43.437692 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:43Z","lastTransitionTime":"2026-03-09T13:22:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:43 crc kubenswrapper[4764]: I0309 13:22:43.540039 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:43 crc kubenswrapper[4764]: I0309 13:22:43.540289 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:43 crc kubenswrapper[4764]: I0309 13:22:43.540366 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:43 crc kubenswrapper[4764]: I0309 13:22:43.540443 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:43 crc kubenswrapper[4764]: I0309 13:22:43.540504 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:43Z","lastTransitionTime":"2026-03-09T13:22:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:22:43 crc kubenswrapper[4764]: I0309 13:22:43.565881 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wkwdz" Mar 09 13:22:43 crc kubenswrapper[4764]: E0309 13:22:43.566049 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-wkwdz" podUID="6597fc34-10ee-4984-9c69-f4b7c0d46e2a" Mar 09 13:22:43 crc kubenswrapper[4764]: I0309 13:22:43.642548 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:43 crc kubenswrapper[4764]: I0309 13:22:43.642589 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:43 crc kubenswrapper[4764]: I0309 13:22:43.642599 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:43 crc kubenswrapper[4764]: I0309 13:22:43.642611 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:43 crc kubenswrapper[4764]: I0309 13:22:43.642621 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:43Z","lastTransitionTime":"2026-03-09T13:22:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:43 crc kubenswrapper[4764]: I0309 13:22:43.744470 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:43 crc kubenswrapper[4764]: I0309 13:22:43.744535 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:43 crc kubenswrapper[4764]: I0309 13:22:43.744546 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:43 crc kubenswrapper[4764]: I0309 13:22:43.744560 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:43 crc kubenswrapper[4764]: I0309 13:22:43.744568 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:43Z","lastTransitionTime":"2026-03-09T13:22:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:43 crc kubenswrapper[4764]: I0309 13:22:43.846885 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:43 crc kubenswrapper[4764]: I0309 13:22:43.846922 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:43 crc kubenswrapper[4764]: I0309 13:22:43.846962 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:43 crc kubenswrapper[4764]: I0309 13:22:43.846977 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:43 crc kubenswrapper[4764]: I0309 13:22:43.846989 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:43Z","lastTransitionTime":"2026-03-09T13:22:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:43 crc kubenswrapper[4764]: I0309 13:22:43.949948 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:43 crc kubenswrapper[4764]: I0309 13:22:43.949985 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:43 crc kubenswrapper[4764]: I0309 13:22:43.949995 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:43 crc kubenswrapper[4764]: I0309 13:22:43.950008 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:43 crc kubenswrapper[4764]: I0309 13:22:43.950017 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:43Z","lastTransitionTime":"2026-03-09T13:22:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:43 crc kubenswrapper[4764]: I0309 13:22:43.972938 4764 generic.go:334] "Generic (PLEG): container finished" podID="072442e6-8ece-4f72-a8cb-ad7ef1e3facb" containerID="3c57a862615a1748c2abb6ae6147d48398f8246d030b1527963f77743d8d802a" exitCode=0 Mar 09 13:22:43 crc kubenswrapper[4764]: I0309 13:22:43.973000 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-crvdf" event={"ID":"072442e6-8ece-4f72-a8cb-ad7ef1e3facb","Type":"ContainerDied","Data":"3c57a862615a1748c2abb6ae6147d48398f8246d030b1527963f77743d8d802a"} Mar 09 13:22:43 crc kubenswrapper[4764]: I0309 13:22:43.983532 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-r5bnx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7fede188-66d9-4cb1-af19-c94afe7fbcde\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a63a85aa57ac9496c6745b881ceba7015f610b3ce31c17513724f036bea999f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf
2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pjp5c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:27Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-r5bnx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:43Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:43 crc kubenswrapper[4764]: I0309 13:22:43.996777 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zmzm7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"202a1f58-ce83-4374-ac48-dc806f7b9d6b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://05cbc20845591e82618289ff2509634cc5e2d05362e8414b3ff91a73d0719331\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rk5pb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zmzm7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:43Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:44 crc kubenswrapper[4764]: I0309 13:22:44.026329 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-crvdf" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"072442e6-8ece-4f72-a8cb-ad7ef1e3facb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://efdc5099d0e35ffc801fb7fb52979c62ed875cd19c8d0b015350b4906e6ea61f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://efdc5099d0e35ffc801fb7fb52979c62ed875cd19c8d0b015350b4906e6ea61f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c57a862615a1748c2abb6ae6147d48398f8246d030b1527963f77743d8d802a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c57a862615a1748c2abb6ae6147d48398f8246d030b1527963f77743d8d802a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:22:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodIn
itializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-
release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-crvdf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:44Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:44 crc kubenswrapper[4764]: I0309 13:22:44.052896 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7kggv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b8ccb4f5-550a-41b2-b39d-201cdd5d902a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:28Z\\\"}}\" for 
pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7kggv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:44Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:44 crc kubenswrapper[4764]: I0309 13:22:44.064062 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:44 crc kubenswrapper[4764]: I0309 13:22:44.064140 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:44 crc kubenswrapper[4764]: I0309 13:22:44.064165 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:44 crc kubenswrapper[4764]: I0309 13:22:44.064200 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:44 crc kubenswrapper[4764]: I0309 13:22:44.064220 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:44Z","lastTransitionTime":"2026-03-09T13:22:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:44 crc kubenswrapper[4764]: I0309 13:22:44.064977 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-hbmjc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7c2b6620-c8b8-47cf-9f15-883dbf8e34cc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d8a2c962bc242865fd8c0fce8fc21f9e385e38cc9b2892663a82c02dfe6cac9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z4sdq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:34Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-hbmjc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:44Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:44 crc kubenswrapper[4764]: I0309 13:22:44.086422 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hzjxs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"30a92c50-fe51-40d9-a69c-4b5fd722bfc6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73df6317c
77ae8a47acf57b1fdb2710ccefa68c346011b5a98d5a38c775c9727\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdm9q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://10aea1054bea324b596bc250a9d3f7010db0636732f622c0674db83b1f245098\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdm9q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-hzjxs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:44Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:44 crc kubenswrapper[4764]: I0309 13:22:44.101936 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-wkwdz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6597fc34-10ee-4984-9c69-f4b7c0d46e2a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gsbrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gsbrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:40Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-wkwdz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:44Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:44 crc 
kubenswrapper[4764]: I0309 13:22:44.115249 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b3bffb853598e5fea63b5302816325c29f740c9e9e2a7f0ee4f87e810045f97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:44Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:44 crc kubenswrapper[4764]: I0309 13:22:44.125234 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6bcdd179-43c2-427c-9fac-7155c122e922\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25jms\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25jms\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:28Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-xxczl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:44Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:44 crc kubenswrapper[4764]: I0309 13:22:44.137221 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:44Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:44 crc kubenswrapper[4764]: I0309 13:22:44.147819 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:44Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:44 crc kubenswrapper[4764]: I0309 13:22:44.163145 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4dd3c85ee070f78d4d30b9ccad3da35f82313620fffdaf69c3424865843dfa39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5280ec2d6cc95dbda0281276e900fa2c9d43170f932966688440a55c1deb454\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:44Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:44 crc kubenswrapper[4764]: I0309 13:22:44.167146 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:44 crc kubenswrapper[4764]: I0309 13:22:44.167178 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:44 crc kubenswrapper[4764]: I0309 13:22:44.167187 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:44 crc kubenswrapper[4764]: I0309 13:22:44.167199 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:44 crc kubenswrapper[4764]: I0309 13:22:44.167227 4764 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:44Z","lastTransitionTime":"2026-03-09T13:22:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:22:44 crc kubenswrapper[4764]: I0309 13:22:44.176607 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:44Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:44 crc kubenswrapper[4764]: I0309 13:22:44.191952 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:44Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:44 crc kubenswrapper[4764]: I0309 13:22:44.210444 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7c21ab2-1820-47de-a61d-71d81928564a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://482469a933d0aba722e6a341f6934669e0d84998bd2c99d6962a17548ab80ce7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3571aa0b654db21a79a11d6c58af9f858d2493283d9eff7a84f3404c34dcf0b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4aae266a5256be6c65a7637dfaed552087689f317f7b61dcb52f4aa6d4d4d7e6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e04ceb7d287648fc77742a0a1b4cf13a56f634301b703291340961730fc73811\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://033930947dc527026e2797f16a6dd9181c8c7d8ec88b09031a132f55b953c356\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T13:21:44Z\\\"
,\\\"message\\\":\\\"W0309 13:21:43.803612 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0309 13:21:43.803959 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773062503 cert, and key in /tmp/serving-cert-1644690243/serving-signer.crt, /tmp/serving-cert-1644690243/serving-signer.key\\\\nI0309 13:21:44.168505 1 observer_polling.go:159] Starting file observer\\\\nW0309 13:21:44.177360 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0309 13:21:44.177910 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0309 13:21:44.178828 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1644690243/tls.crt::/tmp/serving-cert-1644690243/tls.key\\\\\\\"\\\\nF0309 13:21:44.724786 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:43Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3a34e47ede9b05cec931ca6337a98381bcceeb735352702d19128e8dfe8476f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:48Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a38de1b3854f33b5728435571b8f0df92aeaae7d8e8a235ed58044fa2a9e09c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a38de1b3854f33b5728435571b8f0df92aeaae7d8e8a235ed58044fa2a9e09c3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:46Z\\\",\\\"reason\\\":\\\"Completed\\\",
\\\"startedAt\\\":\\\"2026-03-09T13:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:20:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:44Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:44 crc kubenswrapper[4764]: I0309 13:22:44.269817 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:44 crc kubenswrapper[4764]: I0309 13:22:44.269869 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:44 crc kubenswrapper[4764]: I0309 13:22:44.269886 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:44 crc kubenswrapper[4764]: I0309 13:22:44.269909 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:44 crc kubenswrapper[4764]: I0309 13:22:44.269926 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:44Z","lastTransitionTime":"2026-03-09T13:22:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:44 crc kubenswrapper[4764]: I0309 13:22:44.316128 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6597fc34-10ee-4984-9c69-f4b7c0d46e2a-metrics-certs\") pod \"network-metrics-daemon-wkwdz\" (UID: \"6597fc34-10ee-4984-9c69-f4b7c0d46e2a\") " pod="openshift-multus/network-metrics-daemon-wkwdz" Mar 09 13:22:44 crc kubenswrapper[4764]: E0309 13:22:44.316254 4764 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 09 13:22:44 crc kubenswrapper[4764]: E0309 13:22:44.316297 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6597fc34-10ee-4984-9c69-f4b7c0d46e2a-metrics-certs podName:6597fc34-10ee-4984-9c69-f4b7c0d46e2a nodeName:}" failed. No retries permitted until 2026-03-09 13:22:48.316283947 +0000 UTC m=+123.566455855 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/6597fc34-10ee-4984-9c69-f4b7c0d46e2a-metrics-certs") pod "network-metrics-daemon-wkwdz" (UID: "6597fc34-10ee-4984-9c69-f4b7c0d46e2a") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 09 13:22:44 crc kubenswrapper[4764]: I0309 13:22:44.372371 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:44 crc kubenswrapper[4764]: I0309 13:22:44.372406 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:44 crc kubenswrapper[4764]: I0309 13:22:44.372416 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:44 crc kubenswrapper[4764]: I0309 13:22:44.372431 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:44 crc kubenswrapper[4764]: I0309 13:22:44.372442 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:44Z","lastTransitionTime":"2026-03-09T13:22:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:44 crc kubenswrapper[4764]: I0309 13:22:44.474820 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:44 crc kubenswrapper[4764]: I0309 13:22:44.474852 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:44 crc kubenswrapper[4764]: I0309 13:22:44.474861 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:44 crc kubenswrapper[4764]: I0309 13:22:44.474874 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:44 crc kubenswrapper[4764]: I0309 13:22:44.474881 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:44Z","lastTransitionTime":"2026-03-09T13:22:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:22:44 crc kubenswrapper[4764]: I0309 13:22:44.558739 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 13:22:44 crc kubenswrapper[4764]: I0309 13:22:44.558775 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 13:22:44 crc kubenswrapper[4764]: I0309 13:22:44.558922 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 13:22:44 crc kubenswrapper[4764]: E0309 13:22:44.559036 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 13:22:44 crc kubenswrapper[4764]: E0309 13:22:44.559098 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 13:22:44 crc kubenswrapper[4764]: E0309 13:22:44.559751 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 13:22:44 crc kubenswrapper[4764]: I0309 13:22:44.565362 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Mar 09 13:22:44 crc kubenswrapper[4764]: I0309 13:22:44.577887 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:44 crc kubenswrapper[4764]: I0309 13:22:44.578156 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:44 crc kubenswrapper[4764]: I0309 13:22:44.578167 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:44 crc kubenswrapper[4764]: I0309 13:22:44.578183 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:44 crc kubenswrapper[4764]: I0309 13:22:44.578193 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:44Z","lastTransitionTime":"2026-03-09T13:22:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:44 crc kubenswrapper[4764]: I0309 13:22:44.680524 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:44 crc kubenswrapper[4764]: I0309 13:22:44.680605 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:44 crc kubenswrapper[4764]: I0309 13:22:44.680631 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:44 crc kubenswrapper[4764]: I0309 13:22:44.680698 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:44 crc kubenswrapper[4764]: I0309 13:22:44.680719 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:44Z","lastTransitionTime":"2026-03-09T13:22:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:44 crc kubenswrapper[4764]: I0309 13:22:44.784250 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:44 crc kubenswrapper[4764]: I0309 13:22:44.784281 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:44 crc kubenswrapper[4764]: I0309 13:22:44.784291 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:44 crc kubenswrapper[4764]: I0309 13:22:44.784304 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:44 crc kubenswrapper[4764]: I0309 13:22:44.784313 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:44Z","lastTransitionTime":"2026-03-09T13:22:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:44 crc kubenswrapper[4764]: I0309 13:22:44.886255 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:44 crc kubenswrapper[4764]: I0309 13:22:44.886296 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:44 crc kubenswrapper[4764]: I0309 13:22:44.886305 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:44 crc kubenswrapper[4764]: I0309 13:22:44.886319 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:44 crc kubenswrapper[4764]: I0309 13:22:44.886329 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:44Z","lastTransitionTime":"2026-03-09T13:22:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:44 crc kubenswrapper[4764]: I0309 13:22:44.978891 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" event={"ID":"6bcdd179-43c2-427c-9fac-7155c122e922","Type":"ContainerStarted","Data":"83cca42acbb21422d7ee12f4e9824007c225a563a00ac3769c638770e404ce7b"} Mar 09 13:22:44 crc kubenswrapper[4764]: I0309 13:22:44.978946 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" event={"ID":"6bcdd179-43c2-427c-9fac-7155c122e922","Type":"ContainerStarted","Data":"a541959fd435ba385dfa711ec03b7d78fb75528fed89a7bd3620aa50fbab26ad"} Mar 09 13:22:44 crc kubenswrapper[4764]: I0309 13:22:44.980867 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"acc019f66fb28c4b9a901a359ecf5e127e643482a864c4d9da494a62286c0e3f"} Mar 09 13:22:44 crc kubenswrapper[4764]: I0309 13:22:44.983290 4764 generic.go:334] "Generic (PLEG): container finished" podID="b8ccb4f5-550a-41b2-b39d-201cdd5d902a" containerID="cd65db46dfb71ce3080adad69890c363fb149c5bf12c903e66a212a28888ae66" exitCode=0 Mar 09 13:22:44 crc kubenswrapper[4764]: I0309 13:22:44.983359 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7kggv" event={"ID":"b8ccb4f5-550a-41b2-b39d-201cdd5d902a","Type":"ContainerDied","Data":"cd65db46dfb71ce3080adad69890c363fb149c5bf12c903e66a212a28888ae66"} Mar 09 13:22:44 crc kubenswrapper[4764]: I0309 13:22:44.989239 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:44 crc kubenswrapper[4764]: I0309 13:22:44.989284 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:44 crc kubenswrapper[4764]: 
I0309 13:22:44.989295 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:44 crc kubenswrapper[4764]: I0309 13:22:44.989314 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:44 crc kubenswrapper[4764]: I0309 13:22:44.989325 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:44Z","lastTransitionTime":"2026-03-09T13:22:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:22:44 crc kubenswrapper[4764]: I0309 13:22:44.990964 4764 generic.go:334] "Generic (PLEG): container finished" podID="072442e6-8ece-4f72-a8cb-ad7ef1e3facb" containerID="1a96bc81c6b97f1f03d9cb304263ff47034248b9949771825e1813c284032b91" exitCode=0 Mar 09 13:22:44 crc kubenswrapper[4764]: I0309 13:22:44.991041 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-crvdf" event={"ID":"072442e6-8ece-4f72-a8cb-ad7ef1e3facb","Type":"ContainerDied","Data":"1a96bc81c6b97f1f03d9cb304263ff47034248b9949771825e1813c284032b91"} Mar 09 13:22:44 crc kubenswrapper[4764]: I0309 13:22:44.994033 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6bcdd179-43c2-427c-9fac-7155c122e922\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://83cca42acbb21422d7ee12f4e9824007c225a563a00ac3769c638770e404ce7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25jms\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a541959fd435ba385dfa711ec03b7d78fb75528f
ed89a7bd3620aa50fbab26ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25jms\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:28Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xxczl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:44Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:45 crc kubenswrapper[4764]: I0309 13:22:45.011560 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cc78a2e5-6620-4ea8-8bc7-c62bcc3348cd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e22c92297ebb9853ff582f519aa4c98c1468e757b53b1245f7a7ab0f9fe25344\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://050547213bbff923b18f34b1196952d038bab09162ab803da36c99fa3f769dd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5819fd898c59b4cf866334e40822b6142b014a2e11f4ca75b98d549665b1575a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f97e59c94171eedb27889b9cb72fa1b241a5e09c23310576dfd844db1bfc2292\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://f97e59c94171eedb27889b9cb72fa1b241a5e09c23310576dfd844db1bfc2292\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:20:46Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:20:45Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:45Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:45 crc kubenswrapper[4764]: I0309 13:22:45.026813 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:45Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:45 crc kubenswrapper[4764]: I0309 13:22:45.039751 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:45Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:45 crc kubenswrapper[4764]: I0309 13:22:45.058126 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7c21ab2-1820-47de-a61d-71d81928564a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://482469a933d0aba722e6a341f6934669e0d84998bd2c99d6962a17548ab80ce7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3571aa0b654db21a79a11d6c58af9f858d2493283d9eff7a84f3404c34dcf0b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4aae266a5256be6c65a7637dfaed552087689f317f7b61dcb52f4aa6d4d4d7e6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e04ceb7d287648fc77742a0a1b4cf13a56f634301b703291340961730fc73811\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://033930947dc527026e2797f16a6dd9181c8c7d8ec88b09031a132f55b953c356\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T13:21:44Z\\\"
,\\\"message\\\":\\\"W0309 13:21:43.803612 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0309 13:21:43.803959 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773062503 cert, and key in /tmp/serving-cert-1644690243/serving-signer.crt, /tmp/serving-cert-1644690243/serving-signer.key\\\\nI0309 13:21:44.168505 1 observer_polling.go:159] Starting file observer\\\\nW0309 13:21:44.177360 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0309 13:21:44.177910 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0309 13:21:44.178828 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1644690243/tls.crt::/tmp/serving-cert-1644690243/tls.key\\\\\\\"\\\\nF0309 13:21:44.724786 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:43Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3a34e47ede9b05cec931ca6337a98381bcceeb735352702d19128e8dfe8476f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:48Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a38de1b3854f33b5728435571b8f0df92aeaae7d8e8a235ed58044fa2a9e09c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a38de1b3854f33b5728435571b8f0df92aeaae7d8e8a235ed58044fa2a9e09c3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:46Z\\\",\\\"reason\\\":\\\"Completed\\\",
\\\"startedAt\\\":\\\"2026-03-09T13:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:20:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:45Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:45 crc kubenswrapper[4764]: I0309 13:22:45.074581 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4dd3c85ee070f78d4d30b9ccad3da35f82313620fffdaf69c3424865843dfa39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"last
State\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5280ec2d6cc95dbda0281276e900fa2c9d43170f932966688440a55c1deb454\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:45Z is after 2025-08-24T17:21:41Z" 
Mar 09 13:22:45 crc kubenswrapper[4764]: I0309 13:22:45.089977 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:45Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:45 crc kubenswrapper[4764]: I0309 13:22:45.091555 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:45 crc kubenswrapper[4764]: I0309 13:22:45.091599 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:45 crc kubenswrapper[4764]: I0309 13:22:45.091611 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:45 crc kubenswrapper[4764]: I0309 13:22:45.091682 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:45 crc kubenswrapper[4764]: I0309 13:22:45.091699 4764 setters.go:603] "Node became not ready" 
node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:45Z","lastTransitionTime":"2026-03-09T13:22:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:22:45 crc kubenswrapper[4764]: I0309 13:22:45.111099 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:45Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:45 crc kubenswrapper[4764]: I0309 13:22:45.126118 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hzjxs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"30a92c50-fe51-40d9-a69c-4b5fd722bfc6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73df6317c77ae8a47acf57b1fdb2710ccefa68c346011b5a98d5a38c775c9727\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdm9q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://10aea1054bea324b596bc250a9d3f7010db06
36732f622c0674db83b1f245098\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdm9q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-hzjxs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:45Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:45 crc kubenswrapper[4764]: I0309 13:22:45.138354 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-wkwdz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6597fc34-10ee-4984-9c69-f4b7c0d46e2a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gsbrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gsbrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:40Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-wkwdz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:45Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:45 crc 
kubenswrapper[4764]: I0309 13:22:45.158038 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b3bffb853598e5fea63b5302816325c29f740c9e9e2a7f0ee4f87e810045f97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:45Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:45 crc kubenswrapper[4764]: I0309 13:22:45.178122 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-r5bnx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7fede188-66d9-4cb1-af19-c94afe7fbcde\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a63a85aa57ac9496c6745b881ceba7015f610b3ce31c17513724f036bea999f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pjp5c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:27Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-r5bnx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:45Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:45 crc kubenswrapper[4764]: I0309 13:22:45.194008 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:45 crc kubenswrapper[4764]: I0309 13:22:45.194047 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:45 crc kubenswrapper[4764]: I0309 13:22:45.194057 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:45 crc kubenswrapper[4764]: I0309 13:22:45.194075 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:45 crc kubenswrapper[4764]: I0309 13:22:45.194084 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:45Z","lastTransitionTime":"2026-03-09T13:22:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:22:45 crc kubenswrapper[4764]: I0309 13:22:45.196273 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zmzm7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"202a1f58-ce83-4374-ac48-dc806f7b9d6b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://05cbc20845591e82618289ff2509634cc5e2d05362e8414b3ff91a73d0719331\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\
\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rk5pb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zmzm7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:45Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:45 crc kubenswrapper[4764]: I0309 13:22:45.214324 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-crvdf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"072442e6-8ece-4f72-a8cb-ad7ef1e3facb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://efdc5099d0e35ffc801fb7fb52979c62ed875cd19c8d0b015350b4906e6ea61f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://efdc5099d0e35ffc801fb7fb52979c62ed875cd19c8d0b015350b4906e6ea61f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c57a862615a1748c2abb6ae6147d48398f8246d030b1527963f77743d8d802a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c57a862615a1748c2abb6ae6147d48398f8246d030b1527963f77743d8d802a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:22:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodIn
itializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-
release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-crvdf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:45Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:45 crc kubenswrapper[4764]: I0309 13:22:45.234372 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7kggv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b8ccb4f5-550a-41b2-b39d-201cdd5d902a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:28Z\\\"}}\" for 
pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7kggv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:45Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:45 crc kubenswrapper[4764]: I0309 13:22:45.249001 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-hbmjc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7c2b6620-c8b8-47cf-9f15-883dbf8e34cc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d8a2c962bc242865fd8c0fce8fc21f9e385e38cc9b2892663a82c02dfe6cac9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"resta
rtCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z4sdq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:34Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-hbmjc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:45Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:45 crc kubenswrapper[4764]: I0309 13:22:45.264562 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6bcdd179-43c2-427c-9fac-7155c122e922\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://83cca42acbb21422d7ee12f4e9824007c225a563a00ac3769c638770e404ce7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25jms\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a541959fd435ba385dfa711ec03b7d78fb75528f
ed89a7bd3620aa50fbab26ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25jms\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:28Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xxczl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:45Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:45 crc kubenswrapper[4764]: I0309 13:22:45.278213 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:45Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:45 crc kubenswrapper[4764]: I0309 13:22:45.296065 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:45Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:45 crc kubenswrapper[4764]: I0309 13:22:45.296180 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:45 crc kubenswrapper[4764]: I0309 13:22:45.296196 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:45 crc kubenswrapper[4764]: I0309 13:22:45.296205 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:45 crc 
kubenswrapper[4764]: I0309 13:22:45.296284 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:45 crc kubenswrapper[4764]: I0309 13:22:45.296297 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:45Z","lastTransitionTime":"2026-03-09T13:22:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:22:45 crc kubenswrapper[4764]: I0309 13:22:45.309463 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cc78a2e5-6620-4ea8-8bc7-c62bcc3348cd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e22c92297ebb9853ff582f519aa4c98c1468e757b53b1245f7a7ab0f9fe25344\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID
\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://050547213bbff923b18f34b1196952d038bab09162ab803da36c99fa3f769dd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5819fd898c59b4cf866334e40822b6142b014a2e11f4ca75b98d549665b1575a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountP
ath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f97e59c94171eedb27889b9cb72fa1b241a5e09c23310576dfd844db1bfc2292\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f97e59c94171eedb27889b9cb72fa1b241a5e09c23310576dfd844db1bfc2292\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:20:46Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:20:45Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:45Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:45 crc kubenswrapper[4764]: I0309 13:22:45.323720 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4dd3c85ee070f78d4d30b9ccad3da35f82313620fffdaf69c3424865843dfa39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5280ec2d6cc95dbda0281276e900fa2c9d43170f932966688440a55c1deb454\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:45Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:45 crc kubenswrapper[4764]: I0309 13:22:45.335556 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://acc019f66fb28c4b9a901a359ecf5e127e643482a864c4d9da494a62286c0e3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-09T13:22:45Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:45 crc kubenswrapper[4764]: I0309 13:22:45.346937 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:45Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:45 crc kubenswrapper[4764]: I0309 13:22:45.363814 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7c21ab2-1820-47de-a61d-71d81928564a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://482469a933d0aba722e6a341f6934669e0d84998bd2c99d6962a17548ab80ce7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3571aa0b654db21a79a11d6c58af9f858d2493283d9eff7a84f3404c34dcf0b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4aae266a5256be6c65a7637dfaed552087689f317f7b61dcb52f4aa6d4d4d7e6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e04ceb7d287648fc77742a0a1b4cf13a56f634301b703291340961730fc73811\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://033930947dc527026e2797f16a6dd9181c8c7d8ec88b09031a132f55b953c356\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T13:21:44Z\\\"
,\\\"message\\\":\\\"W0309 13:21:43.803612 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0309 13:21:43.803959 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773062503 cert, and key in /tmp/serving-cert-1644690243/serving-signer.crt, /tmp/serving-cert-1644690243/serving-signer.key\\\\nI0309 13:21:44.168505 1 observer_polling.go:159] Starting file observer\\\\nW0309 13:21:44.177360 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0309 13:21:44.177910 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0309 13:21:44.178828 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1644690243/tls.crt::/tmp/serving-cert-1644690243/tls.key\\\\\\\"\\\\nF0309 13:21:44.724786 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:43Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3a34e47ede9b05cec931ca6337a98381bcceeb735352702d19128e8dfe8476f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:48Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a38de1b3854f33b5728435571b8f0df92aeaae7d8e8a235ed58044fa2a9e09c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a38de1b3854f33b5728435571b8f0df92aeaae7d8e8a235ed58044fa2a9e09c3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:46Z\\\",\\\"reason\\\":\\\"Completed\\\",
\\\"startedAt\\\":\\\"2026-03-09T13:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:20:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:45Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:45 crc kubenswrapper[4764]: I0309 13:22:45.375503 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zmzm7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"202a1f58-ce83-4374-ac48-dc806f7b9d6b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://05cbc20845591e82618289ff2509634cc5e2d05362e8414b3ff91a73d0719331\\\",\\\"image\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rk5pb\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zmzm7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:45Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:45 crc kubenswrapper[4764]: I0309 13:22:45.389684 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-crvdf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"072442e6-8ece-4f72-a8cb-ad7ef1e3facb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://efdc5099d0e35ffc801fb7fb52979c62ed875cd19c8d0b015350b4906e6ea61f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://efdc5099d0e35ffc801fb7fb52979c62ed875cd19c8d0b015350b4906e6ea61f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c57a862615a1748c2abb6ae6147d48398f8246d030b1527963f77743d8d802a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c57a862615a1748c2abb6ae6147d48398f8246d030b1527963f77743d8d802a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:22:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a96bc81c6b97f1f03d9cb304263ff4703424
8b9949771825e1813c284032b91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a96bc81c6b97f1f03d9cb304263ff47034248b9949771825e1813c284032b91\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:22:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"
kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-crvdf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": 
failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:45Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:45 crc kubenswrapper[4764]: I0309 13:22:45.398249 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:45 crc kubenswrapper[4764]: I0309 13:22:45.398276 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:45 crc kubenswrapper[4764]: I0309 13:22:45.398284 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:45 crc kubenswrapper[4764]: I0309 13:22:45.398297 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:45 crc kubenswrapper[4764]: I0309 13:22:45.398308 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:45Z","lastTransitionTime":"2026-03-09T13:22:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:45 crc kubenswrapper[4764]: I0309 13:22:45.407699 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7kggv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b8ccb4f5-550a-41b2-b39d-201cdd5d902a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd65db46dfb71ce3080adad69890c363fb149c5bf12c903e66a212a28888ae66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd65db46dfb71ce3080adad69890c363fb149c5bf12c903e66a212a28888ae66\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:22:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7kggv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:45Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:45 crc kubenswrapper[4764]: I0309 13:22:45.420396 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-hbmjc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7c2b6620-c8b8-47cf-9f15-883dbf8e34cc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d8a2c962bc242865fd8c0fce8fc21f9e385e38cc9b2892663a82c02dfe6cac9\\\",\\\"image\\\":\\\"quay.io/openshift-rele
ase-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z4sdq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:34Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-hbmjc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:45Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:45 crc kubenswrapper[4764]: I0309 13:22:45.432095 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hzjxs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"30a92c50-fe51-40d9-a69c-4b5fd722bfc6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73df6317c77ae8a47acf57b1fdb2710ccefa68c346011b5a98d5a38c775c9727\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdm9q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://10aea1054bea324b596bc250a9d3f7010db06
36732f622c0674db83b1f245098\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdm9q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-hzjxs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:45Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:45 crc kubenswrapper[4764]: I0309 13:22:45.441280 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-wkwdz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6597fc34-10ee-4984-9c69-f4b7c0d46e2a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gsbrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gsbrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:40Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-wkwdz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:45Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:45 crc 
kubenswrapper[4764]: I0309 13:22:45.455160 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b3bffb853598e5fea63b5302816325c29f740c9e9e2a7f0ee4f87e810045f97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:45Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:45 crc kubenswrapper[4764]: I0309 13:22:45.467301 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-r5bnx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7fede188-66d9-4cb1-af19-c94afe7fbcde\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a63a85aa57ac9496c6745b881ceba7015f610b3ce31c17513724f036bea999f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pjp5c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:27Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-r5bnx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:45Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:45 crc kubenswrapper[4764]: E0309 13:22:45.499506 4764 kubelet_node_status.go:497] "Node not becoming ready in time after startup" Mar 09 13:22:45 crc kubenswrapper[4764]: I0309 13:22:45.559299 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wkwdz" Mar 09 13:22:45 crc kubenswrapper[4764]: E0309 13:22:45.559440 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-wkwdz" podUID="6597fc34-10ee-4984-9c69-f4b7c0d46e2a" Mar 09 13:22:45 crc kubenswrapper[4764]: I0309 13:22:45.574789 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4dd3c85ee070f78d4d30b9ccad3da35f82313620fffdaf69c3424865843dfa39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5280ec2d6c
c95dbda0281276e900fa2c9d43170f932966688440a55c1deb454\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:45Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:45 crc kubenswrapper[4764]: I0309 13:22:45.587409 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://acc019f66fb28c4b9a901a359ecf5e127e643482a864c4d9da494a62286c0e3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-09T13:22:45Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:45 crc kubenswrapper[4764]: I0309 13:22:45.598990 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:45Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:45 crc kubenswrapper[4764]: I0309 13:22:45.613867 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7c21ab2-1820-47de-a61d-71d81928564a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://482469a933d0aba722e6a341f6934669e0d84998bd2c99d6962a17548ab80ce7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3571aa0b654db21a79a11d6c58af9f858d2493283d9eff7a84f3404c34dcf0b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4aae266a5256be6c65a7637dfaed552087689f317f7b61dcb52f4aa6d4d4d7e6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e04ceb7d287648fc77742a0a1b4cf13a56f634301b703291340961730fc73811\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://033930947dc527026e2797f16a6dd9181c8c7d8ec88b09031a132f55b953c356\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T13:21:44Z\\\"
,\\\"message\\\":\\\"W0309 13:21:43.803612 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0309 13:21:43.803959 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773062503 cert, and key in /tmp/serving-cert-1644690243/serving-signer.crt, /tmp/serving-cert-1644690243/serving-signer.key\\\\nI0309 13:21:44.168505 1 observer_polling.go:159] Starting file observer\\\\nW0309 13:21:44.177360 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0309 13:21:44.177910 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0309 13:21:44.178828 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1644690243/tls.crt::/tmp/serving-cert-1644690243/tls.key\\\\\\\"\\\\nF0309 13:21:44.724786 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:43Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3a34e47ede9b05cec931ca6337a98381bcceeb735352702d19128e8dfe8476f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:48Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a38de1b3854f33b5728435571b8f0df92aeaae7d8e8a235ed58044fa2a9e09c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a38de1b3854f33b5728435571b8f0df92aeaae7d8e8a235ed58044fa2a9e09c3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:46Z\\\",\\\"reason\\\":\\\"Completed\\\",
\\\"startedAt\\\":\\\"2026-03-09T13:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:20:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:45Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:45 crc kubenswrapper[4764]: I0309 13:22:45.623415 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-r5bnx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7fede188-66d9-4cb1-af19-c94afe7fbcde\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a63a85aa57ac9496c6745b881ceba7015f610b3ce31c17513724f036bea999f\\\",\\\"image\\\":\\\"quay.io/openshift
-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pjp5c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:27Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-r5bnx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:45Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:45 crc kubenswrapper[4764]: I0309 13:22:45.634587 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zmzm7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"202a1f58-ce83-4374-ac48-dc806f7b9d6b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://05cbc20845591e82618289ff2509634cc5e2d05362e8414b3ff91a73d0719331\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rk5pb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zmzm7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:45Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:45 crc kubenswrapper[4764]: E0309 13:22:45.639318 4764 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 09 13:22:45 crc kubenswrapper[4764]: I0309 13:22:45.649783 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-crvdf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"072442e6-8ece-4f72-a8cb-ad7ef1e3facb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://efdc5099d0e35ffc801fb7fb52979c62ed875cd19c8d0b015350b4906e6ea61f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://efdc5099d0e35ffc801fb7fb52979c62ed875cd19c8d0b015350b4906e6ea61f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c57a862615a1748c2abb6ae6147d48398f8246d030b1527963f77743d8d802a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c57a862615a1748c2abb6ae6147d48398f8246d030b1527963f77743d8d802a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:22:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a96bc81c6b97f1f03d9cb304263ff47034248b9949771825e1813c284032b91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a96bc81c6b97f1f03d9cb304263ff47034248b9949771825e1813c284032b91\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:22:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\
\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-crvdf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:45Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:45 crc kubenswrapper[4764]: I0309 
13:22:45.664613 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7kggv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b8ccb4f5-550a-41b2-b39d-201cdd5d902a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd65db46dfb71ce3080adad69890c363fb149c5bf12c903e66a212a28888ae66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd65db46dfb71ce3080adad69890c363fb149c5bf12c903e66a212a28888ae66\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:22:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7kggv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:45Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:45 crc kubenswrapper[4764]: I0309 13:22:45.671990 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-hbmjc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7c2b6620-c8b8-47cf-9f15-883dbf8e34cc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d8a2c962bc242865fd8c0fce8fc21f9e385e38cc9b2892663a82c02dfe6cac9\\\",\\\"image\\\":\\\"quay.io/openshift-rele
ase-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z4sdq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:34Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-hbmjc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:45Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:45 crc kubenswrapper[4764]: I0309 13:22:45.681544 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hzjxs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"30a92c50-fe51-40d9-a69c-4b5fd722bfc6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73df6317c77ae8a47acf57b1fdb2710ccefa68c346011b5a98d5a38c775c9727\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdm9q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://10aea1054bea324b596bc250a9d3f7010db06
36732f622c0674db83b1f245098\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdm9q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-hzjxs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:45Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:45 crc kubenswrapper[4764]: I0309 13:22:45.690103 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-wkwdz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6597fc34-10ee-4984-9c69-f4b7c0d46e2a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gsbrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gsbrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:40Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-wkwdz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:45Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:45 crc 
kubenswrapper[4764]: I0309 13:22:45.701989 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b3bffb853598e5fea63b5302816325c29f740c9e9e2a7f0ee4f87e810045f97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:45Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:45 crc kubenswrapper[4764]: I0309 13:22:45.712093 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6bcdd179-43c2-427c-9fac-7155c122e922\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://83cca42acbb21422d7ee12f4e9824007c225a563a00ac3769c638770e404ce7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\
\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25jms\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a541959fd435ba385dfa711ec03b7d78fb75528fed89a7bd3620aa50fbab26ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25jms\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:28Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xxczl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:45Z is after 2025-08-24T17:21:41Z" Mar 09 
13:22:45 crc kubenswrapper[4764]: I0309 13:22:45.721721 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cc78a2e5-6620-4ea8-8bc7-c62bcc3348cd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e22c92297ebb9853ff582f519aa4c98c1468e757b53b1245f7a7ab0f9fe25344\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://050547213bbff923b18f34b1196952d038bab09162ab803da36c99fa3f769dd3\\\",\\\"i
mage\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5819fd898c59b4cf866334e40822b6142b014a2e11f4ca75b98d549665b1575a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f97e59c94171eedb27889b9cb72fa1b241a5e09c23310576dfd844db1bfc2292\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c
4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f97e59c94171eedb27889b9cb72fa1b241a5e09c23310576dfd844db1bfc2292\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:20:46Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:20:45Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:45Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:45 crc kubenswrapper[4764]: I0309 13:22:45.731811 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:45Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:45 crc kubenswrapper[4764]: I0309 13:22:45.742719 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:45Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:45 crc kubenswrapper[4764]: I0309 13:22:45.997411 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7kggv" event={"ID":"b8ccb4f5-550a-41b2-b39d-201cdd5d902a","Type":"ContainerStarted","Data":"20f0f31508039b62436cf5b317419ffc5ca18d03050bfa830060f40e12684519"} Mar 09 13:22:45 crc kubenswrapper[4764]: I0309 13:22:45.997970 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7kggv" event={"ID":"b8ccb4f5-550a-41b2-b39d-201cdd5d902a","Type":"ContainerStarted","Data":"21feffb4bef29e65989455ec8c5a040a53de90f01ede1a9f1848a67f64a35d29"} Mar 09 13:22:45 crc kubenswrapper[4764]: I0309 13:22:45.997992 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7kggv" event={"ID":"b8ccb4f5-550a-41b2-b39d-201cdd5d902a","Type":"ContainerStarted","Data":"9713dea51462b7a0ce8cb68a64d8707b76dfca5f8c770cac3b54339d538dec01"} Mar 09 13:22:45 crc kubenswrapper[4764]: I0309 13:22:45.998005 4764 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7kggv" event={"ID":"b8ccb4f5-550a-41b2-b39d-201cdd5d902a","Type":"ContainerStarted","Data":"8ceb1c19d0ff9c03c8790b54d5dc48bafd77f6c289fc6225a88d6bfb8e04a8eb"} Mar 09 13:22:45 crc kubenswrapper[4764]: I0309 13:22:45.998016 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7kggv" event={"ID":"b8ccb4f5-550a-41b2-b39d-201cdd5d902a","Type":"ContainerStarted","Data":"9ad4c0df0c830a9ef102a630db33f055925c0b9b1734a792be013f8e261523c4"} Mar 09 13:22:45 crc kubenswrapper[4764]: I0309 13:22:45.999822 4764 generic.go:334] "Generic (PLEG): container finished" podID="072442e6-8ece-4f72-a8cb-ad7ef1e3facb" containerID="885b4c9d85cdb3705865fae2824c36e6440c602deaaf1345fffa0eda8de42a96" exitCode=0 Mar 09 13:22:45 crc kubenswrapper[4764]: I0309 13:22:45.999855 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-crvdf" event={"ID":"072442e6-8ece-4f72-a8cb-ad7ef1e3facb","Type":"ContainerDied","Data":"885b4c9d85cdb3705865fae2824c36e6440c602deaaf1345fffa0eda8de42a96"} Mar 09 13:22:46 crc kubenswrapper[4764]: I0309 13:22:46.014845 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7c21ab2-1820-47de-a61d-71d81928564a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://482469a933d0aba722e6a341f6934669e0d84998bd2c99d6962a17548ab80ce7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3571aa0b654db21a79a11d6c58af9f858d2493283d9eff7a84f3404c34dcf0b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4aae266a5256be6c65a7637dfaed552087689f317f7b61dcb52f4aa6d4d4d7e6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e04ceb7d287648fc77742a0a1b4cf13a56f634301b703291340961730fc73811\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://033930947dc527026e2797f16a6dd9181c8c7d8ec88b09031a132f55b953c356\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T13:21:44Z\\\"
,\\\"message\\\":\\\"W0309 13:21:43.803612 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0309 13:21:43.803959 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773062503 cert, and key in /tmp/serving-cert-1644690243/serving-signer.crt, /tmp/serving-cert-1644690243/serving-signer.key\\\\nI0309 13:21:44.168505 1 observer_polling.go:159] Starting file observer\\\\nW0309 13:21:44.177360 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0309 13:21:44.177910 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0309 13:21:44.178828 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1644690243/tls.crt::/tmp/serving-cert-1644690243/tls.key\\\\\\\"\\\\nF0309 13:21:44.724786 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:43Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3a34e47ede9b05cec931ca6337a98381bcceeb735352702d19128e8dfe8476f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:48Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a38de1b3854f33b5728435571b8f0df92aeaae7d8e8a235ed58044fa2a9e09c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a38de1b3854f33b5728435571b8f0df92aeaae7d8e8a235ed58044fa2a9e09c3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:46Z\\\",\\\"reason\\\":\\\"Completed\\\",
\\\"startedAt\\\":\\\"2026-03-09T13:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:20:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:46Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:46 crc kubenswrapper[4764]: I0309 13:22:46.032684 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4dd3c85ee070f78d4d30b9ccad3da35f82313620fffdaf69c3424865843dfa39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"last
State\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5280ec2d6cc95dbda0281276e900fa2c9d43170f932966688440a55c1deb454\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:46Z is after 2025-08-24T17:21:41Z" 
Mar 09 13:22:46 crc kubenswrapper[4764]: I0309 13:22:46.045317 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://acc019f66fb28c4b9a901a359ecf5e127e643482a864c4d9da494a62286c0e3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:46Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:46 crc kubenswrapper[4764]: I0309 13:22:46.061340 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:46Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:46 crc kubenswrapper[4764]: I0309 13:22:46.075573 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b3bffb853598e5fea63b5302816325c29f740c9e9e2a7f0ee4f87e810045f97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:46Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:46 crc kubenswrapper[4764]: I0309 13:22:46.086179 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-r5bnx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7fede188-66d9-4cb1-af19-c94afe7fbcde\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a63a85aa57ac9496c6745b881ceba7015f610b3ce31c17513724f036bea999f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kub
ernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pjp5c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:27Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-r5bnx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:46Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:46 crc kubenswrapper[4764]: I0309 13:22:46.100780 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zmzm7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"202a1f58-ce83-4374-ac48-dc806f7b9d6b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://05cbc20845591e82618289ff2509634cc5e2d0536
2e8414b3ff91a73d0719331\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubern
etes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rk5pb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zmzm7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:46Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:46 crc kubenswrapper[4764]: I0309 13:22:46.118609 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-crvdf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"072442e6-8ece-4f72-a8cb-ad7ef1e3facb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://efdc5099d0e35ffc801fb7fb52979c62ed875cd19c8d0b015350b4906e6ea61f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://efdc5099d0e35ffc801fb7fb52979c62ed875cd19c8d0b015350b4906e6ea61f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c57a862615a1748c2abb6ae6147d48398f8246d030b1527963f77743d8d802a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c57a862615a1748c2abb6ae6147d48398f8246d030b1527963f77743d8d802a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:22:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a96bc81c6b97f1f03d9cb304263ff4703424
8b9949771825e1813c284032b91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a96bc81c6b97f1f03d9cb304263ff47034248b9949771825e1813c284032b91\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:22:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://885b4c9d85cdb3705865fae2824c36e6440c602deaaf1345fffa0eda8de42a96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://885b4c9d85cdb3705865fae2824c36e6440c602deaaf1345fffa0eda8de42a96\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:22:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-0
3-09T13:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/servi
ceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-crvdf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:46Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:46 crc kubenswrapper[4764]: I0309 13:22:46.143613 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7kggv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b8ccb4f5-550a-41b2-b39d-201cdd5d902a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919
d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCou
nt\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-con
troller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/va
r/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd65db46dfb71ce3080adad69890c363fb149c5bf12c903e66a212a28888ae66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd65db46dfb71ce3080adad69890c363fb149c5bf12c903e66a212a28888ae66\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:22:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:44Z\\\"}},\\\"vol
umeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7kggv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:46Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:46 crc kubenswrapper[4764]: I0309 13:22:46.154558 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-hbmjc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7c2b6620-c8b8-47cf-9f15-883dbf8e34cc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d8a2c962bc
242865fd8c0fce8fc21f9e385e38cc9b2892663a82c02dfe6cac9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z4sdq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:34Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-hbmjc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:46Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:46 crc kubenswrapper[4764]: I0309 13:22:46.172952 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hzjxs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"30a92c50-fe51-40d9-a69c-4b5fd722bfc6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73df6317c77ae8a47acf57b1fdb2710ccefa68c346011b5a98d5a38c775c9727\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdm9q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://10aea1054bea324b596bc250a9d3f7010db06
36732f622c0674db83b1f245098\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdm9q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-hzjxs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:46Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:46 crc kubenswrapper[4764]: I0309 13:22:46.189636 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-wkwdz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6597fc34-10ee-4984-9c69-f4b7c0d46e2a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gsbrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gsbrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:40Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-wkwdz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:46Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:46 crc 
kubenswrapper[4764]: I0309 13:22:46.211189 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6bcdd179-43c2-427c-9fac-7155c122e922\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://83cca42acbb21422d7ee12f4e9824007c225a563a00ac3769c638770e404ce7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/se
rviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25jms\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a541959fd435ba385dfa711ec03b7d78fb75528fed89a7bd3620aa50fbab26ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25jms\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:28Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xxczl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:46Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:46 crc kubenswrapper[4764]: I0309 13:22:46.222310 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cc78a2e5-6620-4ea8-8bc7-c62bcc3348cd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e22c92297ebb9853ff582f519aa4c98c1468e757b53b1245f7a7ab0f9fe25344\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://050547213bbff923b18f34b1196952d038bab09162ab803da36c99fa3f769dd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5819fd898c59b4cf866334e40822b6142b014a2e11f4ca75b98d549665b1575a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f97e59c94171eedb27889b9cb72fa1b241a5e09c23310576dfd844db1bfc2292\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://f97e59c94171eedb27889b9cb72fa1b241a5e09c23310576dfd844db1bfc2292\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:20:46Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:20:45Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:46Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:46 crc kubenswrapper[4764]: I0309 13:22:46.238467 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:46Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:46 crc kubenswrapper[4764]: I0309 13:22:46.250260 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:46Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:46 crc kubenswrapper[4764]: I0309 13:22:46.445618 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 13:22:46 crc kubenswrapper[4764]: E0309 13:22:46.445796 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 13:23:18.445759973 +0000 UTC m=+153.695931881 (durationBeforeRetry 32s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 13:22:46 crc kubenswrapper[4764]: I0309 13:22:46.446053 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 13:22:46 crc kubenswrapper[4764]: I0309 13:22:46.446081 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 13:22:46 crc kubenswrapper[4764]: I0309 13:22:46.446109 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 13:22:46 crc kubenswrapper[4764]: I0309 13:22:46.446130 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: 
\"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 13:22:46 crc kubenswrapper[4764]: E0309 13:22:46.446167 4764 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 09 13:22:46 crc kubenswrapper[4764]: E0309 13:22:46.446217 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-09 13:23:18.446208685 +0000 UTC m=+153.696380593 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 09 13:22:46 crc kubenswrapper[4764]: E0309 13:22:46.446252 4764 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 09 13:22:46 crc kubenswrapper[4764]: E0309 13:22:46.446262 4764 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 09 13:22:46 crc kubenswrapper[4764]: E0309 13:22:46.446274 4764 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 09 13:22:46 crc kubenswrapper[4764]: E0309 
13:22:46.446301 4764 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 09 13:22:46 crc kubenswrapper[4764]: E0309 13:22:46.446264 4764 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 09 13:22:46 crc kubenswrapper[4764]: E0309 13:22:46.446330 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-09 13:23:18.446312068 +0000 UTC m=+153.696484056 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 09 13:22:46 crc kubenswrapper[4764]: E0309 13:22:46.446341 4764 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 09 13:22:46 crc kubenswrapper[4764]: E0309 13:22:46.446349 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-09 13:23:18.446339179 +0000 UTC m=+153.696511207 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 09 13:22:46 crc kubenswrapper[4764]: E0309 13:22:46.446351 4764 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 09 13:22:46 crc kubenswrapper[4764]: E0309 13:22:46.446384 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-09 13:23:18.44637351 +0000 UTC m=+153.696545518 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 09 13:22:46 crc kubenswrapper[4764]: I0309 13:22:46.558853 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 13:22:46 crc kubenswrapper[4764]: I0309 13:22:46.558904 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 13:22:46 crc kubenswrapper[4764]: E0309 13:22:46.558973 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 13:22:46 crc kubenswrapper[4764]: E0309 13:22:46.559056 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 13:22:46 crc kubenswrapper[4764]: I0309 13:22:46.559363 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 13:22:46 crc kubenswrapper[4764]: E0309 13:22:46.559488 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 13:22:47 crc kubenswrapper[4764]: I0309 13:22:47.005882 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7kggv" event={"ID":"b8ccb4f5-550a-41b2-b39d-201cdd5d902a","Type":"ContainerStarted","Data":"6859bb282db6608def0d7983e715cf3f5f5f454bf896bb21dca124a075e5701d"} Mar 09 13:22:47 crc kubenswrapper[4764]: I0309 13:22:47.008295 4764 generic.go:334] "Generic (PLEG): container finished" podID="072442e6-8ece-4f72-a8cb-ad7ef1e3facb" containerID="2ff7f6fa22134725b00be193ba8f58c9ad083335c7f795fa87a1cad202a2496f" exitCode=0 Mar 09 13:22:47 crc kubenswrapper[4764]: I0309 13:22:47.008339 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-crvdf" event={"ID":"072442e6-8ece-4f72-a8cb-ad7ef1e3facb","Type":"ContainerDied","Data":"2ff7f6fa22134725b00be193ba8f58c9ad083335c7f795fa87a1cad202a2496f"} Mar 09 13:22:47 crc kubenswrapper[4764]: I0309 13:22:47.019725 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cc78a2e5-6620-4ea8-8bc7-c62bcc3348cd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e22c92297ebb9853ff582f519aa4c98c1468e757b53b1245f7a7ab0f9fe25344\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://050547213bbff923b18f34b1196952d038bab09162ab803da36c99fa3f769dd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5819fd898c59b4cf866334e40822b6142b014a2e11f4ca75b98d549665b1575a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f97e59c94171eedb27889b9cb72fa1b241a5e09c23310576dfd844db1bfc2292\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://f97e59c94171eedb27889b9cb72fa1b241a5e09c23310576dfd844db1bfc2292\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:20:46Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:20:45Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:47Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:47 crc kubenswrapper[4764]: I0309 13:22:47.036394 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:47Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:47 crc kubenswrapper[4764]: I0309 13:22:47.048498 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:47Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:47 crc kubenswrapper[4764]: I0309 13:22:47.062807 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7c21ab2-1820-47de-a61d-71d81928564a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://482469a933d0aba722e6a341f6934669e0d84998bd2c99d6962a17548ab80ce7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3571aa0b654db21a79a11d6c58af9f858d2493283d9eff7a84f3404c34dcf0b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4aae266a5256be6c65a7637dfaed552087689f317f7b61dcb52f4aa6d4d4d7e6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e04ceb7d287648fc77742a0a1b4cf13a56f634301b703291340961730fc73811\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://033930947dc527026e2797f16a6dd9181c8c7d8ec88b09031a132f55b953c356\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T13:21:44Z\\\"
,\\\"message\\\":\\\"W0309 13:21:43.803612 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0309 13:21:43.803959 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773062503 cert, and key in /tmp/serving-cert-1644690243/serving-signer.crt, /tmp/serving-cert-1644690243/serving-signer.key\\\\nI0309 13:21:44.168505 1 observer_polling.go:159] Starting file observer\\\\nW0309 13:21:44.177360 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0309 13:21:44.177910 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0309 13:21:44.178828 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1644690243/tls.crt::/tmp/serving-cert-1644690243/tls.key\\\\\\\"\\\\nF0309 13:21:44.724786 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:43Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3a34e47ede9b05cec931ca6337a98381bcceeb735352702d19128e8dfe8476f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:48Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a38de1b3854f33b5728435571b8f0df92aeaae7d8e8a235ed58044fa2a9e09c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a38de1b3854f33b5728435571b8f0df92aeaae7d8e8a235ed58044fa2a9e09c3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:46Z\\\",\\\"reason\\\":\\\"Completed\\\",
\\\"startedAt\\\":\\\"2026-03-09T13:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:20:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:47Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:47 crc kubenswrapper[4764]: I0309 13:22:47.075042 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4dd3c85ee070f78d4d30b9ccad3da35f82313620fffdaf69c3424865843dfa39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"last
State\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5280ec2d6cc95dbda0281276e900fa2c9d43170f932966688440a55c1deb454\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:47Z is after 2025-08-24T17:21:41Z" 
Mar 09 13:22:47 crc kubenswrapper[4764]: I0309 13:22:47.086074 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://acc019f66fb28c4b9a901a359ecf5e127e643482a864c4d9da494a62286c0e3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:47Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:47 crc kubenswrapper[4764]: I0309 13:22:47.096533 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:47Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:47 crc kubenswrapper[4764]: I0309 13:22:47.107399 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b3bffb853598e5fea63b5302816325c29f740c9e9e2a7f0ee4f87e810045f97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:47Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:47 crc kubenswrapper[4764]: I0309 13:22:47.116448 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-r5bnx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7fede188-66d9-4cb1-af19-c94afe7fbcde\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a63a85aa57ac9496c6745b881ceba7015f610b3ce31c17513724f036bea999f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kub
ernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pjp5c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:27Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-r5bnx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:47Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:47 crc kubenswrapper[4764]: I0309 13:22:47.127574 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zmzm7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"202a1f58-ce83-4374-ac48-dc806f7b9d6b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://05cbc20845591e82618289ff2509634cc5e2d0536
2e8414b3ff91a73d0719331\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubern
etes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rk5pb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zmzm7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:47Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:47 crc kubenswrapper[4764]: I0309 13:22:47.139913 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-crvdf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"072442e6-8ece-4f72-a8cb-ad7ef1e3facb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://efdc5099d0e35ffc801fb7fb52979c62ed875cd19c8d0b015350b4906e6ea61f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://efdc5099d0e35ffc801fb7fb52979c62ed875cd19c8d0b015350b4906e6ea61f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c57a862615a1748c2abb6ae6147d48398f8246d030b1527963f77743d8d802a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c57a862615a1748c2abb6ae6147d48398f8246d030b1527963f77743d8d802a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:22:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a96bc81c6b97f1f03d9cb304263ff4703424
8b9949771825e1813c284032b91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a96bc81c6b97f1f03d9cb304263ff47034248b9949771825e1813c284032b91\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:22:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://885b4c9d85cdb3705865fae2824c36e6440c602deaaf1345fffa0eda8de42a96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://885b4c9d85cdb3705865fae2824c36e6440c602deaaf1345fffa0eda8de42a96\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:22:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-0
3-09T13:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ff7f6fa22134725b00be193ba8f58c9ad083335c7f795fa87a1cad202a2496f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2ff7f6fa22134725b00be193ba8f58c9ad083335c7f795fa87a1cad202a2496f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:22:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"l
astState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-crvdf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:47Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:47 crc kubenswrapper[4764]: I0309 13:22:47.159934 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7kggv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b8ccb4f5-550a-41b2-b39d-201cdd5d902a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd65db46dfb71ce3080adad69890c363fb149c5bf12c903e66a212a28888ae66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd65db46dfb71ce3080adad69890c363fb149c5bf12c903e66a212a28888ae66\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:22:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7kggv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:47Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:47 crc kubenswrapper[4764]: I0309 13:22:47.171999 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-hbmjc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7c2b6620-c8b8-47cf-9f15-883dbf8e34cc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d8a2c962bc242865fd8c0fce8fc21f9e385e38cc9b2892663a82c02dfe6cac9\\\",\\\"image\\\":\\\"quay.io/openshift-rele
ase-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z4sdq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:34Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-hbmjc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:47Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:47 crc kubenswrapper[4764]: I0309 13:22:47.182724 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hzjxs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"30a92c50-fe51-40d9-a69c-4b5fd722bfc6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73df6317c77ae8a47acf57b1fdb2710ccefa68c346011b5a98d5a38c775c9727\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdm9q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://10aea1054bea324b596bc250a9d3f7010db06
36732f622c0674db83b1f245098\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdm9q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-hzjxs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:47Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:47 crc kubenswrapper[4764]: I0309 13:22:47.191884 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-wkwdz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6597fc34-10ee-4984-9c69-f4b7c0d46e2a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gsbrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gsbrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:40Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-wkwdz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:47Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:47 crc 
kubenswrapper[4764]: I0309 13:22:47.200554 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6bcdd179-43c2-427c-9fac-7155c122e922\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://83cca42acbb21422d7ee12f4e9824007c225a563a00ac3769c638770e404ce7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/se
rviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25jms\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a541959fd435ba385dfa711ec03b7d78fb75528fed89a7bd3620aa50fbab26ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25jms\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:28Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xxczl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:47Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:47 crc kubenswrapper[4764]: I0309 13:22:47.558957 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-wkwdz" Mar 09 13:22:47 crc kubenswrapper[4764]: E0309 13:22:47.559113 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wkwdz" podUID="6597fc34-10ee-4984-9c69-f4b7c0d46e2a" Mar 09 13:22:48 crc kubenswrapper[4764]: I0309 13:22:48.014623 4764 generic.go:334] "Generic (PLEG): container finished" podID="072442e6-8ece-4f72-a8cb-ad7ef1e3facb" containerID="ba4e3985b08c9b6913c4e31f9645f8544fb5f7c0cbd75d116618fd9e479588d0" exitCode=0 Mar 09 13:22:48 crc kubenswrapper[4764]: I0309 13:22:48.014683 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-crvdf" event={"ID":"072442e6-8ece-4f72-a8cb-ad7ef1e3facb","Type":"ContainerDied","Data":"ba4e3985b08c9b6913c4e31f9645f8544fb5f7c0cbd75d116618fd9e479588d0"} Mar 09 13:22:48 crc kubenswrapper[4764]: I0309 13:22:48.033093 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4dd3c85ee070f78d4d30b9ccad3da35f82313620fffdaf69c3424865843dfa39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5280ec2d6cc95dbda0281276e900fa2c9d43170f932966688440a55c1deb454\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:48Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:48 crc kubenswrapper[4764]: I0309 13:22:48.053371 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://acc019f66fb28c4b9a901a359ecf5e127e643482a864c4d9da494a62286c0e3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-09T13:22:48Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:48 crc kubenswrapper[4764]: I0309 13:22:48.072350 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:48Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:48 crc kubenswrapper[4764]: I0309 13:22:48.090729 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7c21ab2-1820-47de-a61d-71d81928564a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://482469a933d0aba722e6a341f6934669e0d84998bd2c99d6962a17548ab80ce7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3571aa0b654db21a79a11d6c58af9f858d2493283d9eff7a84f3404c34dcf0b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4aae266a5256be6c65a7637dfaed552087689f317f7b61dcb52f4aa6d4d4d7e6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e04ceb7d287648fc77742a0a1b4cf13a56f634301b703291340961730fc73811\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://033930947dc527026e2797f16a6dd9181c8c7d8ec88b09031a132f55b953c356\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T13:21:44Z\\\"
,\\\"message\\\":\\\"W0309 13:21:43.803612 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0309 13:21:43.803959 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773062503 cert, and key in /tmp/serving-cert-1644690243/serving-signer.crt, /tmp/serving-cert-1644690243/serving-signer.key\\\\nI0309 13:21:44.168505 1 observer_polling.go:159] Starting file observer\\\\nW0309 13:21:44.177360 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0309 13:21:44.177910 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0309 13:21:44.178828 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1644690243/tls.crt::/tmp/serving-cert-1644690243/tls.key\\\\\\\"\\\\nF0309 13:21:44.724786 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:43Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3a34e47ede9b05cec931ca6337a98381bcceeb735352702d19128e8dfe8476f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:48Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a38de1b3854f33b5728435571b8f0df92aeaae7d8e8a235ed58044fa2a9e09c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a38de1b3854f33b5728435571b8f0df92aeaae7d8e8a235ed58044fa2a9e09c3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:46Z\\\",\\\"reason\\\":\\\"Completed\\\",
\\\"startedAt\\\":\\\"2026-03-09T13:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:20:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:48Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:48 crc kubenswrapper[4764]: I0309 13:22:48.101047 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-r5bnx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7fede188-66d9-4cb1-af19-c94afe7fbcde\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a63a85aa57ac9496c6745b881ceba7015f610b3ce31c17513724f036bea999f\\\",\\\"image\\\":\\\"quay.io/openshift
-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pjp5c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:27Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-r5bnx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:48Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:48 crc kubenswrapper[4764]: I0309 13:22:48.116778 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zmzm7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"202a1f58-ce83-4374-ac48-dc806f7b9d6b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://05cbc20845591e82618289ff2509634cc5e2d05362e8414b3ff91a73d0719331\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rk5pb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zmzm7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:48Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:48 crc kubenswrapper[4764]: I0309 13:22:48.134036 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-crvdf" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"072442e6-8ece-4f72-a8cb-ad7ef1e3facb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://efdc5099d0e35ffc801fb7fb52979c62ed875cd19c8d0b015350b4906e6ea61f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://efdc5099d0e35ffc801fb7fb52979c62ed875cd19c8d0b015350b4906e6ea61f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c57a862615a1748c2abb6ae6147d48398f8246d030b1527963f77743d8d802a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c57a862615a1748c2abb6ae6147d48398f8246d030b1527963f77743d8d802a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:22:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a96bc81c6b97f1f03d9cb304263ff47034248b9949771825e1813c284032b91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a96bc81c6b97f1f03d9cb304263ff47034248b9949771825e1813c284032b91\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:22:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://885b4c9d85cdb3705865fae2824c36e6440c602deaaf1345fffa0eda8de42a96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://885b4c9d85cdb3705865fae2824c36e6440c602deaaf1345fffa0eda8de42a96\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:22:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ff7f6fa22134725b00be193ba8f58c9ad083335c7f795fa87a1cad202a2496f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2ff7f6fa22134725b00be193ba8f58c9ad083335c7f795fa87a1cad202a2496f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:22:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba4e3985b08c9b6913c4e31f9645f8544fb5f7c0cbd75d116618fd9e479588d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ba4e3985b08c9b6913c4e31f9645f8544fb5f7c0cbd75d116618fd9e479588d0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:22:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-crvdf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:48Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:48 crc kubenswrapper[4764]: I0309 13:22:48.154606 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7kggv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b8ccb4f5-550a-41b2-b39d-201cdd5d902a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd65db46dfb71ce3080adad69890c363fb149c5bf12c903e66a212a28888ae66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd65db46dfb71ce3080adad69890c363fb149c5bf12c903e66a212a28888ae66\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:22:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7kggv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:48Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:48 crc kubenswrapper[4764]: I0309 13:22:48.168798 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-hbmjc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7c2b6620-c8b8-47cf-9f15-883dbf8e34cc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d8a2c962bc242865fd8c0fce8fc21f9e385e38cc9b2892663a82c02dfe6cac9\\\",\\\"image\\\":\\\"quay.io/openshift-rele
ase-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z4sdq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:34Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-hbmjc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:48Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:48 crc kubenswrapper[4764]: I0309 13:22:48.182036 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hzjxs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"30a92c50-fe51-40d9-a69c-4b5fd722bfc6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73df6317c77ae8a47acf57b1fdb2710ccefa68c346011b5a98d5a38c775c9727\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdm9q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://10aea1054bea324b596bc250a9d3f7010db06
36732f622c0674db83b1f245098\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdm9q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-hzjxs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:48Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:48 crc kubenswrapper[4764]: I0309 13:22:48.193739 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-wkwdz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6597fc34-10ee-4984-9c69-f4b7c0d46e2a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gsbrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gsbrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:40Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-wkwdz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:48Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:48 crc 
kubenswrapper[4764]: I0309 13:22:48.208565 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b3bffb853598e5fea63b5302816325c29f740c9e9e2a7f0ee4f87e810045f97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:48Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:48 crc kubenswrapper[4764]: I0309 13:22:48.220510 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6bcdd179-43c2-427c-9fac-7155c122e922\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://83cca42acbb21422d7ee12f4e9824007c225a563a00ac3769c638770e404ce7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\
\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25jms\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a541959fd435ba385dfa711ec03b7d78fb75528fed89a7bd3620aa50fbab26ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25jms\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:28Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xxczl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:48Z is after 2025-08-24T17:21:41Z" Mar 09 
13:22:48 crc kubenswrapper[4764]: I0309 13:22:48.232683 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cc78a2e5-6620-4ea8-8bc7-c62bcc3348cd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e22c92297ebb9853ff582f519aa4c98c1468e757b53b1245f7a7ab0f9fe25344\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://050547213bbff923b18f34b1196952d038bab09162ab803da36c99fa3f769dd3\\\",\\\"i
mage\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5819fd898c59b4cf866334e40822b6142b014a2e11f4ca75b98d549665b1575a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f97e59c94171eedb27889b9cb72fa1b241a5e09c23310576dfd844db1bfc2292\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c
4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f97e59c94171eedb27889b9cb72fa1b241a5e09c23310576dfd844db1bfc2292\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:20:46Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:20:45Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:48Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:48 crc kubenswrapper[4764]: I0309 13:22:48.243994 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:48Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:48 crc kubenswrapper[4764]: I0309 13:22:48.256152 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:48Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:48 crc kubenswrapper[4764]: I0309 13:22:48.365298 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6597fc34-10ee-4984-9c69-f4b7c0d46e2a-metrics-certs\") pod \"network-metrics-daemon-wkwdz\" (UID: \"6597fc34-10ee-4984-9c69-f4b7c0d46e2a\") " pod="openshift-multus/network-metrics-daemon-wkwdz" Mar 09 13:22:48 crc kubenswrapper[4764]: E0309 13:22:48.365509 4764 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 09 13:22:48 crc kubenswrapper[4764]: E0309 13:22:48.365599 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6597fc34-10ee-4984-9c69-f4b7c0d46e2a-metrics-certs podName:6597fc34-10ee-4984-9c69-f4b7c0d46e2a nodeName:}" failed. No retries permitted until 2026-03-09 13:22:56.365578455 +0000 UTC m=+131.615750373 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/6597fc34-10ee-4984-9c69-f4b7c0d46e2a-metrics-certs") pod "network-metrics-daemon-wkwdz" (UID: "6597fc34-10ee-4984-9c69-f4b7c0d46e2a") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 09 13:22:48 crc kubenswrapper[4764]: I0309 13:22:48.559713 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 13:22:48 crc kubenswrapper[4764]: I0309 13:22:48.559749 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 13:22:48 crc kubenswrapper[4764]: I0309 13:22:48.559724 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 13:22:48 crc kubenswrapper[4764]: E0309 13:22:48.559844 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 13:22:48 crc kubenswrapper[4764]: E0309 13:22:48.559911 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 13:22:48 crc kubenswrapper[4764]: E0309 13:22:48.559989 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 13:22:49 crc kubenswrapper[4764]: I0309 13:22:49.022343 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-crvdf" event={"ID":"072442e6-8ece-4f72-a8cb-ad7ef1e3facb","Type":"ContainerStarted","Data":"d5c773d25fbc4d390f6da4699cdfa6bac3745eb18288fd5bb885b9a4dca9a519"} Mar 09 13:22:49 crc kubenswrapper[4764]: I0309 13:22:49.029162 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7kggv" event={"ID":"b8ccb4f5-550a-41b2-b39d-201cdd5d902a","Type":"ContainerStarted","Data":"df799033cab3d457eef490d885e4440024ba2c7daa430d489ee9907d908d1017"} Mar 09 13:22:49 crc kubenswrapper[4764]: I0309 13:22:49.041523 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6bcdd179-43c2-427c-9fac-7155c122e922\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://83cca42acbb21422d7ee12f4e9824007c225a563a00ac3769c638770e404ce7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25jms\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a541959fd435ba385dfa711ec03b7d78fb75528f
ed89a7bd3620aa50fbab26ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25jms\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:28Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xxczl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:49Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:49 crc kubenswrapper[4764]: I0309 13:22:49.056183 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:49Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:49 crc kubenswrapper[4764]: I0309 13:22:49.070060 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cc78a2e5-6620-4ea8-8bc7-c62bcc3348cd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e22c92297ebb9853ff582f519aa4c98c1468e757b53b1245f7a7ab0f9fe25344\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://050547213bbff923b18f34b1196952d038bab09162ab803da36c99fa3f769dd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5819fd898c59b4cf866334e40822b6142b014a2e11f4ca75b98d549665b1575a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f97e59c94171eedb27889b9cb72fa1b241a5e09c23310576dfd844db1bfc2292\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://f97e59c94171eedb27889b9cb72fa1b241a5e09c23310576dfd844db1bfc2292\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:20:46Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:20:45Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:49Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:49 crc kubenswrapper[4764]: I0309 13:22:49.082223 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:49Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:49 crc kubenswrapper[4764]: I0309 13:22:49.091720 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://acc019f66fb28c4b9a901a359ecf5e127e643482a864c4d9da494a62286c0e3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-09T13:22:49Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:49 crc kubenswrapper[4764]: I0309 13:22:49.105433 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:49Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:49 crc kubenswrapper[4764]: I0309 13:22:49.120611 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7c21ab2-1820-47de-a61d-71d81928564a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://482469a933d0aba722e6a341f6934669e0d84998bd2c99d6962a17548ab80ce7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3571aa0b654db21a79a11d6c58af9f858d2493283d9eff7a84f3404c34dcf0b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4aae266a5256be6c65a7637dfaed552087689f317f7b61dcb52f4aa6d4d4d7e6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e04ceb7d287648fc77742a0a1b4cf13a56f634301b703291340961730fc73811\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://033930947dc527026e2797f16a6dd9181c8c7d8ec88b09031a132f55b953c356\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T13:21:44Z\\\"
,\\\"message\\\":\\\"W0309 13:21:43.803612 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0309 13:21:43.803959 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773062503 cert, and key in /tmp/serving-cert-1644690243/serving-signer.crt, /tmp/serving-cert-1644690243/serving-signer.key\\\\nI0309 13:21:44.168505 1 observer_polling.go:159] Starting file observer\\\\nW0309 13:21:44.177360 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0309 13:21:44.177910 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0309 13:21:44.178828 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1644690243/tls.crt::/tmp/serving-cert-1644690243/tls.key\\\\\\\"\\\\nF0309 13:21:44.724786 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:43Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3a34e47ede9b05cec931ca6337a98381bcceeb735352702d19128e8dfe8476f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:48Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a38de1b3854f33b5728435571b8f0df92aeaae7d8e8a235ed58044fa2a9e09c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a38de1b3854f33b5728435571b8f0df92aeaae7d8e8a235ed58044fa2a9e09c3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:46Z\\\",\\\"reason\\\":\\\"Completed\\\",
\\\"startedAt\\\":\\\"2026-03-09T13:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:20:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:49Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:49 crc kubenswrapper[4764]: I0309 13:22:49.132891 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4dd3c85ee070f78d4d30b9ccad3da35f82313620fffdaf69c3424865843dfa39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"last
State\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5280ec2d6cc95dbda0281276e900fa2c9d43170f932966688440a55c1deb454\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:49Z is after 2025-08-24T17:21:41Z" 
Mar 09 13:22:49 crc kubenswrapper[4764]: I0309 13:22:49.147083 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-crvdf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"072442e6-8ece-4f72-a8cb-ad7ef1e3facb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5c773d25fbc4d390f6da4699cdfa6bac3745eb18288fd5bb885b9a4dca9a519\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostI
P\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://efdc5099d0e35ffc801fb7fb52979c62ed875cd19c8d0b015350b4906e6ea61f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://efdc5099d0e35ffc801fb7fb52979c62ed875cd19c8d0b015350b4906e6ea61f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c57a862615a1748c2abb6ae6147d48398f8246d030b1527963f77743d8d802a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c57a8626
15a1748c2abb6ae6147d48398f8246d030b1527963f77743d8d802a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:22:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a96bc81c6b97f1f03d9cb304263ff47034248b9949771825e1813c284032b91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a96bc81c6b97f1f03d9cb304263ff47034248b9949771825e1813c284032b91\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:22:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mount
Path\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://885b4c9d85cdb3705865fae2824c36e6440c602deaaf1345fffa0eda8de42a96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://885b4c9d85cdb3705865fae2824c36e6440c602deaaf1345fffa0eda8de42a96\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:22:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ff7f6fa22134725b00be193ba8f58c9ad083335c7f795fa87a1cad202a2496f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\
":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2ff7f6fa22134725b00be193ba8f58c9ad083335c7f795fa87a1cad202a2496f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:22:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba4e3985b08c9b6913c4e31f9645f8544fb5f7c0cbd75d116618fd9e479588d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ba4e3985b08c9b6913c4e31f9645f8544fb5f7c0cbd75d116618fd9e479588d0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:22:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\"
,\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-crvdf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:49Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:49 crc kubenswrapper[4764]: I0309 13:22:49.165936 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7kggv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b8ccb4f5-550a-41b2-b39d-201cdd5d902a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd65db46dfb71ce3080adad69890c363fb149c5bf12c903e66a212a28888ae66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd65db46dfb71ce3080adad69890c363fb149c5bf12c903e66a212a28888ae66\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:22:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7kggv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:49Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:49 crc kubenswrapper[4764]: I0309 13:22:49.174122 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-hbmjc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7c2b6620-c8b8-47cf-9f15-883dbf8e34cc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d8a2c962bc242865fd8c0fce8fc21f9e385e38cc9b2892663a82c02dfe6cac9\\\",\\\"image\\\":\\\"quay.io/openshift-rele
ase-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z4sdq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:34Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-hbmjc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:49Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:49 crc kubenswrapper[4764]: I0309 13:22:49.182557 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hzjxs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"30a92c50-fe51-40d9-a69c-4b5fd722bfc6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73df6317c77ae8a47acf57b1fdb2710ccefa68c346011b5a98d5a38c775c9727\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdm9q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://10aea1054bea324b596bc250a9d3f7010db06
36732f622c0674db83b1f245098\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdm9q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-hzjxs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:49Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:49 crc kubenswrapper[4764]: I0309 13:22:49.191324 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-wkwdz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6597fc34-10ee-4984-9c69-f4b7c0d46e2a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gsbrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gsbrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:40Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-wkwdz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:49Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:49 crc 
kubenswrapper[4764]: I0309 13:22:49.201226 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b3bffb853598e5fea63b5302816325c29f740c9e9e2a7f0ee4f87e810045f97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:49Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:49 crc kubenswrapper[4764]: I0309 13:22:49.208580 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-r5bnx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7fede188-66d9-4cb1-af19-c94afe7fbcde\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a63a85aa57ac9496c6745b881ceba7015f610b3ce31c17513724f036bea999f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pjp5c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:27Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-r5bnx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:49Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:49 crc kubenswrapper[4764]: I0309 13:22:49.217938 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zmzm7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"202a1f58-ce83-4374-ac48-dc806f7b9d6b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://05cbc20845591e82618289ff2509634cc5e2d05362e8414b3ff91a73d0719331\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rk5pb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zmzm7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:49Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:49 crc kubenswrapper[4764]: I0309 13:22:49.559837 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-wkwdz" Mar 09 13:22:49 crc kubenswrapper[4764]: E0309 13:22:49.560072 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wkwdz" podUID="6597fc34-10ee-4984-9c69-f4b7c0d46e2a" Mar 09 13:22:50 crc kubenswrapper[4764]: I0309 13:22:50.559825 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 13:22:50 crc kubenswrapper[4764]: I0309 13:22:50.559972 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 13:22:50 crc kubenswrapper[4764]: E0309 13:22:50.560367 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 13:22:50 crc kubenswrapper[4764]: I0309 13:22:50.560103 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 13:22:50 crc kubenswrapper[4764]: E0309 13:22:50.560698 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 13:22:50 crc kubenswrapper[4764]: E0309 13:22:50.560843 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 13:22:50 crc kubenswrapper[4764]: E0309 13:22:50.641059 4764 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 09 13:22:51 crc kubenswrapper[4764]: I0309 13:22:51.039519 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7kggv" event={"ID":"b8ccb4f5-550a-41b2-b39d-201cdd5d902a","Type":"ContainerStarted","Data":"03b613417c12f1a1de495ba21dfe736f2565fde5ba629162dcc5cbe929369b4b"} Mar 09 13:22:51 crc kubenswrapper[4764]: I0309 13:22:51.039843 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-7kggv" Mar 09 13:22:51 crc kubenswrapper[4764]: I0309 13:22:51.052145 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cc78a2e5-6620-4ea8-8bc7-c62bcc3348cd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e22c92297ebb9853ff582f519aa4c98c1468e757b53b1245f7a7ab0f9fe25344\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://050547213bbff923b18f34b1196952d038bab09162ab803da36c99fa3f769dd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5819fd898c59b4cf866334e40822b6142b014a2e11f4ca75b98d549665b1575a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f97e59c94171eedb27889b9cb72fa1b241a5e09c23310576dfd844db1bfc2292\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://f97e59c94171eedb27889b9cb72fa1b241a5e09c23310576dfd844db1bfc2292\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:20:46Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:20:45Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:51Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:51 crc kubenswrapper[4764]: I0309 13:22:51.064986 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:51Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:51 crc kubenswrapper[4764]: I0309 13:22:51.067989 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-7kggv" Mar 09 13:22:51 crc kubenswrapper[4764]: I0309 13:22:51.076415 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:51Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:51 crc kubenswrapper[4764]: I0309 13:22:51.087730 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4dd3c85ee070f78d4d30b9ccad3da35f82313620fffdaf69c3424865843dfa39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5280ec2d6cc95dbda0281276e900fa2c9d43170f932966688440a55c1deb454\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:51Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:51 crc kubenswrapper[4764]: I0309 13:22:51.098680 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://acc019f66fb28c4b9a901a359ecf5e127e643482a864c4d9da494a62286c0e3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-09T13:22:51Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:51 crc kubenswrapper[4764]: I0309 13:22:51.112499 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:51Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:51 crc kubenswrapper[4764]: I0309 13:22:51.123974 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7c21ab2-1820-47de-a61d-71d81928564a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://482469a933d0aba722e6a341f6934669e0d84998bd2c99d6962a17548ab80ce7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3571aa0b654db21a79a11d6c58af9f858d2493283d9eff7a84f3404c34dcf0b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4aae266a5256be6c65a7637dfaed552087689f317f7b61dcb52f4aa6d4d4d7e6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e04ceb7d287648fc77742a0a1b4cf13a56f634301b703291340961730fc73811\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://033930947dc527026e2797f16a6dd9181c8c7d8ec88b09031a132f55b953c356\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T13:21:44Z\\\"
,\\\"message\\\":\\\"W0309 13:21:43.803612 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0309 13:21:43.803959 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773062503 cert, and key in /tmp/serving-cert-1644690243/serving-signer.crt, /tmp/serving-cert-1644690243/serving-signer.key\\\\nI0309 13:21:44.168505 1 observer_polling.go:159] Starting file observer\\\\nW0309 13:21:44.177360 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0309 13:21:44.177910 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0309 13:21:44.178828 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1644690243/tls.crt::/tmp/serving-cert-1644690243/tls.key\\\\\\\"\\\\nF0309 13:21:44.724786 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:43Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3a34e47ede9b05cec931ca6337a98381bcceeb735352702d19128e8dfe8476f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:48Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a38de1b3854f33b5728435571b8f0df92aeaae7d8e8a235ed58044fa2a9e09c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a38de1b3854f33b5728435571b8f0df92aeaae7d8e8a235ed58044fa2a9e09c3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:46Z\\\",\\\"reason\\\":\\\"Completed\\\",
\\\"startedAt\\\":\\\"2026-03-09T13:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:20:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:51Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:51 crc kubenswrapper[4764]: I0309 13:22:51.132703 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-r5bnx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7fede188-66d9-4cb1-af19-c94afe7fbcde\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a63a85aa57ac9496c6745b881ceba7015f610b3ce31c17513724f036bea999f\\\",\\\"image\\\":\\\"quay.io/openshift
-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pjp5c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:27Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-r5bnx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:51Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:51 crc kubenswrapper[4764]: I0309 13:22:51.142760 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zmzm7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"202a1f58-ce83-4374-ac48-dc806f7b9d6b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://05cbc20845591e82618289ff2509634cc5e2d05362e8414b3ff91a73d0719331\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rk5pb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zmzm7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:51Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:51 crc kubenswrapper[4764]: I0309 13:22:51.156812 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-crvdf" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"072442e6-8ece-4f72-a8cb-ad7ef1e3facb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5c773d25fbc4d390f6da4699cdfa6bac3745eb18288fd5bb885b9a4dca9a519\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://efdc5099d0e35ffc801fb7fb52979c62ed875cd19c8d0b015
350b4906e6ea61f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://efdc5099d0e35ffc801fb7fb52979c62ed875cd19c8d0b015350b4906e6ea61f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c57a862615a1748c2abb6ae6147d48398f8246d030b1527963f77743d8d802a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c57a862615a1748c2abb6ae6147d48398f8246d030b1527963f77743d8d802a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:22:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:
22:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a96bc81c6b97f1f03d9cb304263ff47034248b9949771825e1813c284032b91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a96bc81c6b97f1f03d9cb304263ff47034248b9949771825e1813c284032b91\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:22:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\
\\"cri-o://885b4c9d85cdb3705865fae2824c36e6440c602deaaf1345fffa0eda8de42a96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://885b4c9d85cdb3705865fae2824c36e6440c602deaaf1345fffa0eda8de42a96\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:22:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ff7f6fa22134725b00be193ba8f58c9ad083335c7f795fa87a1cad202a2496f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2ff7f6fa22134725b00be193ba8f58c9ad083335c7f795fa87a1cad202a2496f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:22:46Z\\\",\\\"r
eason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba4e3985b08c9b6913c4e31f9645f8544fb5f7c0cbd75d116618fd9e479588d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ba4e3985b08c9b6913c4e31f9645f8544fb5f7c0cbd75d116618fd9e479588d0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:22:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-crvdf\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:51Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:51 crc kubenswrapper[4764]: I0309 13:22:51.179923 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7kggv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b8ccb4f5-550a-41b2-b39d-201cdd5d902a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9713dea51462b7a0ce8cb68a64d8707b76dfca5f8c770cac3b54339d538dec01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://21feffb4bef29e65989455ec8c5a040a53de90f01ede1a9f1848a67f64a35d29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\
",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6859bb282db6608def0d7983e715cf3f5f5f454bf896bb21dca124a075e5701d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20f0f31508039b62436cf5b317419ffc5ca18d03050bfa830060f40e12684519\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:45Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ceb1c19d0ff9c03c8790b54d5dc48bafd77f6c289fc6225a88d6bfb8e04a8eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ad4c0df0c830a9ef102a630db33f055925c0b9b1734a792be013f8e261523c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://03b613417c12f1a1de495ba21dfe736f2565fde5ba629162dcc5cbe929369b4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://df799033cab3d457eef490d885e4440024ba2c7daa430d489ee9907d908d1017\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd65db46dfb71ce3080adad69890c363fb149c5bf12c903e66a212a28888ae66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd65db46dfb71ce3080adad69890c363fb149c5bf12c903e66a212a28888ae66\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:22:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Ru
nning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7kggv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:51Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:51 crc kubenswrapper[4764]: I0309 13:22:51.189279 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-hbmjc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7c2b6620-c8b8-47cf-9f15-883dbf8e34cc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d8a2c962bc242865fd8c0fce8fc21f9e385e38cc9b2892663a82c02dfe6cac9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev
@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z4sdq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:34Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-hbmjc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:51Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:51 crc kubenswrapper[4764]: I0309 13:22:51.199407 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hzjxs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"30a92c50-fe51-40d9-a69c-4b5fd722bfc6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73df6317c77ae8a47acf57b1fdb2710ccefa68c346011b5a98d5a38c775c9727\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdm9q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://10aea1054bea324b596bc250a9d3f7010db06
36732f622c0674db83b1f245098\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdm9q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-hzjxs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:51Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:51 crc kubenswrapper[4764]: I0309 13:22:51.207529 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-wkwdz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6597fc34-10ee-4984-9c69-f4b7c0d46e2a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gsbrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gsbrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:40Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-wkwdz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:51Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:51 crc 
kubenswrapper[4764]: I0309 13:22:51.218870 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b3bffb853598e5fea63b5302816325c29f740c9e9e2a7f0ee4f87e810045f97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:51Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:51 crc kubenswrapper[4764]: I0309 13:22:51.228858 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6bcdd179-43c2-427c-9fac-7155c122e922\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://83cca42acbb21422d7ee12f4e9824007c225a563a00ac3769c638770e404ce7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\
\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25jms\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a541959fd435ba385dfa711ec03b7d78fb75528fed89a7bd3620aa50fbab26ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25jms\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:28Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xxczl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:51Z is after 2025-08-24T17:21:41Z" Mar 09 
13:22:51 crc kubenswrapper[4764]: I0309 13:22:51.241404 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6bcdd179-43c2-427c-9fac-7155c122e922\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://83cca42acbb21422d7ee12f4e9824007c225a563a00ac3769c638770e404ce7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kub
ernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25jms\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a541959fd435ba385dfa711ec03b7d78fb75528fed89a7bd3620aa50fbab26ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25jms\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:28Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xxczl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:51Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:51 crc kubenswrapper[4764]: I0309 13:22:51.251281 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cc78a2e5-6620-4ea8-8bc7-c62bcc3348cd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e22c92297ebb9853ff582f519aa4c98c1468e757b53b1245f7a7ab0f9fe25344\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://050547213bbff923b18f34b1196952d038bab09162ab803da36c99fa3f769dd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5819fd898c59b4cf866334e40822b6142b014a2e11f4ca75b98d549665b1575a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f97e59c94171eedb27889b9cb72fa1b241a5e09c23310576dfd844db1bfc2292\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://f97e59c94171eedb27889b9cb72fa1b241a5e09c23310576dfd844db1bfc2292\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:20:46Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:20:45Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:51Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:51 crc kubenswrapper[4764]: I0309 13:22:51.261513 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:51Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:51 crc kubenswrapper[4764]: I0309 13:22:51.272101 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:51Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:51 crc kubenswrapper[4764]: I0309 13:22:51.283587 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7c21ab2-1820-47de-a61d-71d81928564a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://482469a933d0aba722e6a341f6934669e0d84998bd2c99d6962a17548ab80ce7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3571aa0b654db21a79a11d6c58af9f858d2493283d9eff7a84f3404c34dcf0b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4aae266a5256be6c65a7637dfaed552087689f317f7b61dcb52f4aa6d4d4d7e6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e04ceb7d287648fc77742a0a1b4cf13a56f634301b703291340961730fc73811\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://033930947dc527026e2797f16a6dd9181c8c7d8ec88b09031a132f55b953c356\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T13:21:44Z\\\"
,\\\"message\\\":\\\"W0309 13:21:43.803612 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0309 13:21:43.803959 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773062503 cert, and key in /tmp/serving-cert-1644690243/serving-signer.crt, /tmp/serving-cert-1644690243/serving-signer.key\\\\nI0309 13:21:44.168505 1 observer_polling.go:159] Starting file observer\\\\nW0309 13:21:44.177360 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0309 13:21:44.177910 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0309 13:21:44.178828 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1644690243/tls.crt::/tmp/serving-cert-1644690243/tls.key\\\\\\\"\\\\nF0309 13:21:44.724786 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:43Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3a34e47ede9b05cec931ca6337a98381bcceeb735352702d19128e8dfe8476f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:48Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a38de1b3854f33b5728435571b8f0df92aeaae7d8e8a235ed58044fa2a9e09c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a38de1b3854f33b5728435571b8f0df92aeaae7d8e8a235ed58044fa2a9e09c3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:46Z\\\",\\\"reason\\\":\\\"Completed\\\",
\\\"startedAt\\\":\\\"2026-03-09T13:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:20:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:51Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:51 crc kubenswrapper[4764]: I0309 13:22:51.295558 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4dd3c85ee070f78d4d30b9ccad3da35f82313620fffdaf69c3424865843dfa39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"last
State\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5280ec2d6cc95dbda0281276e900fa2c9d43170f932966688440a55c1deb454\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:51Z is after 2025-08-24T17:21:41Z" 
Mar 09 13:22:51 crc kubenswrapper[4764]: I0309 13:22:51.305551 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://acc019f66fb28c4b9a901a359ecf5e127e643482a864c4d9da494a62286c0e3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:51Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:51 crc kubenswrapper[4764]: I0309 13:22:51.317429 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:51Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:51 crc kubenswrapper[4764]: I0309 13:22:51.327722 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-wkwdz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6597fc34-10ee-4984-9c69-f4b7c0d46e2a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gsbrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gsbrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:40Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-wkwdz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:51Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:51 crc 
kubenswrapper[4764]: I0309 13:22:51.339303 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b3bffb853598e5fea63b5302816325c29f740c9e9e2a7f0ee4f87e810045f97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:51Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:51 crc kubenswrapper[4764]: I0309 13:22:51.349183 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-r5bnx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7fede188-66d9-4cb1-af19-c94afe7fbcde\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a63a85aa57ac9496c6745b881ceba7015f610b3ce31c17513724f036bea999f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pjp5c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:27Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-r5bnx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:51Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:51 crc kubenswrapper[4764]: I0309 13:22:51.361187 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zmzm7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"202a1f58-ce83-4374-ac48-dc806f7b9d6b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://05cbc20845591e82618289ff2509634cc5e2d05362e8414b3ff91a73d0719331\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rk5pb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zmzm7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:51Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:51 crc kubenswrapper[4764]: I0309 13:22:51.373263 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-crvdf" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"072442e6-8ece-4f72-a8cb-ad7ef1e3facb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5c773d25fbc4d390f6da4699cdfa6bac3745eb18288fd5bb885b9a4dca9a519\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://efdc5099d0e35ffc801fb7fb52979c62ed875cd19c8d0b015
350b4906e6ea61f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://efdc5099d0e35ffc801fb7fb52979c62ed875cd19c8d0b015350b4906e6ea61f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c57a862615a1748c2abb6ae6147d48398f8246d030b1527963f77743d8d802a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c57a862615a1748c2abb6ae6147d48398f8246d030b1527963f77743d8d802a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:22:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:
22:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a96bc81c6b97f1f03d9cb304263ff47034248b9949771825e1813c284032b91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a96bc81c6b97f1f03d9cb304263ff47034248b9949771825e1813c284032b91\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:22:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\
\\"cri-o://885b4c9d85cdb3705865fae2824c36e6440c602deaaf1345fffa0eda8de42a96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://885b4c9d85cdb3705865fae2824c36e6440c602deaaf1345fffa0eda8de42a96\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:22:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ff7f6fa22134725b00be193ba8f58c9ad083335c7f795fa87a1cad202a2496f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2ff7f6fa22134725b00be193ba8f58c9ad083335c7f795fa87a1cad202a2496f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:22:46Z\\\",\\\"r
eason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba4e3985b08c9b6913c4e31f9645f8544fb5f7c0cbd75d116618fd9e479588d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ba4e3985b08c9b6913c4e31f9645f8544fb5f7c0cbd75d116618fd9e479588d0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:22:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-crvdf\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:51Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:51 crc kubenswrapper[4764]: I0309 13:22:51.388604 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7kggv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b8ccb4f5-550a-41b2-b39d-201cdd5d902a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9713dea51462b7a0ce8cb68a64d8707b76dfca5f8c770cac3b54339d538dec01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://21feffb4bef29e65989455ec8c5a040a53de90f01ede1a9f1848a67f64a35d29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\
",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6859bb282db6608def0d7983e715cf3f5f5f454bf896bb21dca124a075e5701d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20f0f31508039b62436cf5b317419ffc5ca18d03050bfa830060f40e12684519\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:45Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ceb1c19d0ff9c03c8790b54d5dc48bafd77f6c289fc6225a88d6bfb8e04a8eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ad4c0df0c830a9ef102a630db33f055925c0b9b1734a792be013f8e261523c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://03b613417c12f1a1de495ba21dfe736f2565fde5ba629162dcc5cbe929369b4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://df799033cab3d457eef490d885e4440024ba2c7daa430d489ee9907d908d1017\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd65db46dfb71ce3080adad69890c363fb149c5bf12c903e66a212a28888ae66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd65db46dfb71ce3080adad69890c363fb149c5bf12c903e66a212a28888ae66\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:22:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7kggv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:51Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:51 crc kubenswrapper[4764]: I0309 13:22:51.397253 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-hbmjc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7c2b6620-c8b8-47cf-9f15-883dbf8e34cc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d8a2c962bc242865fd8c0fce8fc21f9e385e38cc9b2892663a82c02dfe6cac9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@
sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z4sdq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:34Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-hbmjc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:51Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:51 crc kubenswrapper[4764]: I0309 13:22:51.405786 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hzjxs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"30a92c50-fe51-40d9-a69c-4b5fd722bfc6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73df6317c77ae8a47acf57b1fdb2710ccefa68c346011b5a98d5a38c775c9727\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdm9q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://10aea1054bea324b596bc250a9d3f7010db06
36732f622c0674db83b1f245098\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdm9q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-hzjxs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:51Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:51 crc kubenswrapper[4764]: I0309 13:22:51.559074 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-wkwdz" Mar 09 13:22:51 crc kubenswrapper[4764]: E0309 13:22:51.559201 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wkwdz" podUID="6597fc34-10ee-4984-9c69-f4b7c0d46e2a" Mar 09 13:22:52 crc kubenswrapper[4764]: I0309 13:22:52.043186 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-7kggv" Mar 09 13:22:52 crc kubenswrapper[4764]: I0309 13:22:52.043248 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-7kggv" Mar 09 13:22:52 crc kubenswrapper[4764]: I0309 13:22:52.070146 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-7kggv" Mar 09 13:22:52 crc kubenswrapper[4764]: I0309 13:22:52.084598 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6bcdd179-43c2-427c-9fac-7155c122e922\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://83cca42acbb21422d7ee12f4e9824007c225a563a00ac3769c638770e404ce7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25jms\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a541959fd435ba385dfa711ec03b7d78fb75528f
ed89a7bd3620aa50fbab26ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25jms\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:28Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xxczl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:52Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:52 crc kubenswrapper[4764]: I0309 13:22:52.095942 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:52Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:52 crc kubenswrapper[4764]: I0309 13:22:52.106321 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cc78a2e5-6620-4ea8-8bc7-c62bcc3348cd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e22c92297ebb9853ff582f519aa4c98c1468e757b53b1245f7a7ab0f9fe25344\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://050547213bbff923b18f34b1196952d038bab09162ab803da36c99fa3f769dd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5819fd898c59b4cf866334e40822b6142b014a2e11f4ca75b98d549665b1575a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f97e59c94171eedb27889b9cb72fa1b241a5e09c23310576dfd844db1bfc2292\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://f97e59c94171eedb27889b9cb72fa1b241a5e09c23310576dfd844db1bfc2292\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:20:46Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:20:45Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:52Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:52 crc kubenswrapper[4764]: I0309 13:22:52.117584 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:52Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:52 crc kubenswrapper[4764]: I0309 13:22:52.127996 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://acc019f66fb28c4b9a901a359ecf5e127e643482a864c4d9da494a62286c0e3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-09T13:22:52Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:52 crc kubenswrapper[4764]: I0309 13:22:52.140724 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:52Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:52 crc kubenswrapper[4764]: I0309 13:22:52.155053 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7c21ab2-1820-47de-a61d-71d81928564a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://482469a933d0aba722e6a341f6934669e0d84998bd2c99d6962a17548ab80ce7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3571aa0b654db21a79a11d6c58af9f858d2493283d9eff7a84f3404c34dcf0b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4aae266a5256be6c65a7637dfaed552087689f317f7b61dcb52f4aa6d4d4d7e6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e04ceb7d287648fc77742a0a1b4cf13a56f634301b703291340961730fc73811\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://033930947dc527026e2797f16a6dd9181c8c7d8ec88b09031a132f55b953c356\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T13:21:44Z\\\"
,\\\"message\\\":\\\"W0309 13:21:43.803612 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0309 13:21:43.803959 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773062503 cert, and key in /tmp/serving-cert-1644690243/serving-signer.crt, /tmp/serving-cert-1644690243/serving-signer.key\\\\nI0309 13:21:44.168505 1 observer_polling.go:159] Starting file observer\\\\nW0309 13:21:44.177360 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0309 13:21:44.177910 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0309 13:21:44.178828 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1644690243/tls.crt::/tmp/serving-cert-1644690243/tls.key\\\\\\\"\\\\nF0309 13:21:44.724786 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:43Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3a34e47ede9b05cec931ca6337a98381bcceeb735352702d19128e8dfe8476f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:48Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a38de1b3854f33b5728435571b8f0df92aeaae7d8e8a235ed58044fa2a9e09c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a38de1b3854f33b5728435571b8f0df92aeaae7d8e8a235ed58044fa2a9e09c3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:46Z\\\",\\\"reason\\\":\\\"Completed\\\",
\\\"startedAt\\\":\\\"2026-03-09T13:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:20:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:52Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:52 crc kubenswrapper[4764]: I0309 13:22:52.166111 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4dd3c85ee070f78d4d30b9ccad3da35f82313620fffdaf69c3424865843dfa39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"last
State\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5280ec2d6cc95dbda0281276e900fa2c9d43170f932966688440a55c1deb454\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:52Z is after 2025-08-24T17:21:41Z" 
Mar 09 13:22:52 crc kubenswrapper[4764]: I0309 13:22:52.177936 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-crvdf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"072442e6-8ece-4f72-a8cb-ad7ef1e3facb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5c773d25fbc4d390f6da4699cdfa6bac3745eb18288fd5bb885b9a4dca9a519\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostI
P\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://efdc5099d0e35ffc801fb7fb52979c62ed875cd19c8d0b015350b4906e6ea61f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://efdc5099d0e35ffc801fb7fb52979c62ed875cd19c8d0b015350b4906e6ea61f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c57a862615a1748c2abb6ae6147d48398f8246d030b1527963f77743d8d802a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c57a8626
15a1748c2abb6ae6147d48398f8246d030b1527963f77743d8d802a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:22:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a96bc81c6b97f1f03d9cb304263ff47034248b9949771825e1813c284032b91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a96bc81c6b97f1f03d9cb304263ff47034248b9949771825e1813c284032b91\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:22:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mount
Path\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://885b4c9d85cdb3705865fae2824c36e6440c602deaaf1345fffa0eda8de42a96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://885b4c9d85cdb3705865fae2824c36e6440c602deaaf1345fffa0eda8de42a96\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:22:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ff7f6fa22134725b00be193ba8f58c9ad083335c7f795fa87a1cad202a2496f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\
":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2ff7f6fa22134725b00be193ba8f58c9ad083335c7f795fa87a1cad202a2496f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:22:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba4e3985b08c9b6913c4e31f9645f8544fb5f7c0cbd75d116618fd9e479588d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ba4e3985b08c9b6913c4e31f9645f8544fb5f7c0cbd75d116618fd9e479588d0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:22:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\"
,\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-crvdf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:52Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:52 crc kubenswrapper[4764]: I0309 13:22:52.196979 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7kggv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b8ccb4f5-550a-41b2-b39d-201cdd5d902a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9713dea51462b7a0ce8cb68a64d8707b76dfca5f8c770cac3b54339d538dec01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://21feffb4bef29e65989455ec8c5a040a53de90f01ede1a9f1848a67f64a35d29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6859bb282db6608def0d7983e715cf3f5f5f454bf896bb21dca124a075e5701d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20f0f31508039b62436cf5b317419ffc5ca18d03050bfa830060f40e12684519\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:45Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ceb1c19d0ff9c03c8790b54d5dc48bafd77f6c289fc6225a88d6bfb8e04a8eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ad4c0df0c830a9ef102a630db33f055925c0b9b1734a792be013f8e261523c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://03b613417c12f1a1de495ba21dfe736f2565fde5ba629162dcc5cbe929369b4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://df799033cab3d457eef490d885e4440024ba2c7daa430d489ee9907d908d1017\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd65db46dfb71ce3080adad69890c363fb149c5bf12c903e66a212a28888ae66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd65db46dfb71ce3080adad69890c363fb149c5bf12c903e66a212a28888ae66\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:22:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7kggv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:52Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:52 crc kubenswrapper[4764]: I0309 13:22:52.208392 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-hbmjc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7c2b6620-c8b8-47cf-9f15-883dbf8e34cc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d8a2c962bc242865fd8c0fce8fc21f9e385e38cc9b2892663a82c02dfe6cac9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@
sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z4sdq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:34Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-hbmjc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:52Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:52 crc kubenswrapper[4764]: I0309 13:22:52.221267 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hzjxs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"30a92c50-fe51-40d9-a69c-4b5fd722bfc6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73df6317c77ae8a47acf57b1fdb2710ccefa68c346011b5a98d5a38c775c9727\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdm9q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://10aea1054bea324b596bc250a9d3f7010db06
36732f622c0674db83b1f245098\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdm9q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-hzjxs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:52Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:52 crc kubenswrapper[4764]: I0309 13:22:52.230192 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-wkwdz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6597fc34-10ee-4984-9c69-f4b7c0d46e2a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gsbrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gsbrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:40Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-wkwdz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:52Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:52 crc 
kubenswrapper[4764]: I0309 13:22:52.247295 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b3bffb853598e5fea63b5302816325c29f740c9e9e2a7f0ee4f87e810045f97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:52Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:52 crc kubenswrapper[4764]: I0309 13:22:52.258602 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-r5bnx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7fede188-66d9-4cb1-af19-c94afe7fbcde\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a63a85aa57ac9496c6745b881ceba7015f610b3ce31c17513724f036bea999f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pjp5c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:27Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-r5bnx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:52Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:52 crc kubenswrapper[4764]: I0309 13:22:52.278832 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zmzm7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"202a1f58-ce83-4374-ac48-dc806f7b9d6b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://05cbc20845591e82618289ff2509634cc5e2d05362e8414b3ff91a73d0719331\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rk5pb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zmzm7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:52Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:52 crc kubenswrapper[4764]: I0309 13:22:52.559479 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 13:22:52 crc kubenswrapper[4764]: E0309 13:22:52.559616 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 13:22:52 crc kubenswrapper[4764]: I0309 13:22:52.559797 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 13:22:52 crc kubenswrapper[4764]: E0309 13:22:52.559861 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 13:22:52 crc kubenswrapper[4764]: I0309 13:22:52.559974 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 13:22:52 crc kubenswrapper[4764]: E0309 13:22:52.560169 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 13:22:53 crc kubenswrapper[4764]: I0309 13:22:53.486132 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:53 crc kubenswrapper[4764]: I0309 13:22:53.486926 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:53 crc kubenswrapper[4764]: I0309 13:22:53.486961 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:53 crc kubenswrapper[4764]: I0309 13:22:53.486990 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:53 crc kubenswrapper[4764]: I0309 13:22:53.487006 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:53Z","lastTransitionTime":"2026-03-09T13:22:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:53 crc kubenswrapper[4764]: E0309 13:22:53.501848 4764 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:22:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:22:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:53Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:22:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:22:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:53Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"44d5d5b8-3cb1-4753-9ed2-c6ede7c42d06\\\",\\\"systemUUID\\\":\\\"470a86bd-e6aa-42c1-b220-b7b8c0289210\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:53Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:53 crc kubenswrapper[4764]: I0309 13:22:53.513695 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:53 crc kubenswrapper[4764]: I0309 13:22:53.513785 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:53 crc kubenswrapper[4764]: I0309 13:22:53.513807 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:53 crc kubenswrapper[4764]: I0309 13:22:53.513838 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:53 crc kubenswrapper[4764]: I0309 13:22:53.513858 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:53Z","lastTransitionTime":"2026-03-09T13:22:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:53 crc kubenswrapper[4764]: E0309 13:22:53.529323 4764 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:22:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:22:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:53Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:22:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:22:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:53Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"44d5d5b8-3cb1-4753-9ed2-c6ede7c42d06\\\",\\\"systemUUID\\\":\\\"470a86bd-e6aa-42c1-b220-b7b8c0289210\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:53Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:53 crc kubenswrapper[4764]: I0309 13:22:53.533940 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:53 crc kubenswrapper[4764]: I0309 13:22:53.533988 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:53 crc kubenswrapper[4764]: I0309 13:22:53.534005 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:53 crc kubenswrapper[4764]: I0309 13:22:53.534028 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:53 crc kubenswrapper[4764]: I0309 13:22:53.534044 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:53Z","lastTransitionTime":"2026-03-09T13:22:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:53 crc kubenswrapper[4764]: E0309 13:22:53.548591 4764 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:22:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:22:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:53Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:22:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:22:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:53Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"44d5d5b8-3cb1-4753-9ed2-c6ede7c42d06\\\",\\\"systemUUID\\\":\\\"470a86bd-e6aa-42c1-b220-b7b8c0289210\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:53Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:53 crc kubenswrapper[4764]: I0309 13:22:53.553210 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:53 crc kubenswrapper[4764]: I0309 13:22:53.553244 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:53 crc kubenswrapper[4764]: I0309 13:22:53.553254 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:53 crc kubenswrapper[4764]: I0309 13:22:53.553269 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:53 crc kubenswrapper[4764]: I0309 13:22:53.553279 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:53Z","lastTransitionTime":"2026-03-09T13:22:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:22:53 crc kubenswrapper[4764]: I0309 13:22:53.559888 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-wkwdz" Mar 09 13:22:53 crc kubenswrapper[4764]: E0309 13:22:53.560367 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wkwdz" podUID="6597fc34-10ee-4984-9c69-f4b7c0d46e2a" Mar 09 13:22:53 crc kubenswrapper[4764]: E0309 13:22:53.565496 4764 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:22:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:22:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:53Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:22:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:22:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:53Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"44d5d5b8-3cb1-4753-9ed2-c6ede7c42d06\\\",\\\"systemUUID\\\":\\\"470a86bd-e6aa-42c1-b220-b7b8c0289210\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:53Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:53 crc kubenswrapper[4764]: I0309 13:22:53.570451 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:53 crc kubenswrapper[4764]: I0309 13:22:53.570499 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:53 crc kubenswrapper[4764]: I0309 13:22:53.570517 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:53 crc kubenswrapper[4764]: I0309 13:22:53.570537 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:53 crc kubenswrapper[4764]: I0309 13:22:53.570552 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:53Z","lastTransitionTime":"2026-03-09T13:22:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:53 crc kubenswrapper[4764]: E0309 13:22:53.587871 4764 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:22:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:22:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:53Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:22:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:22:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:53Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"44d5d5b8-3cb1-4753-9ed2-c6ede7c42d06\\\",\\\"systemUUID\\\":\\\"470a86bd-e6aa-42c1-b220-b7b8c0289210\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:53Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:53 crc kubenswrapper[4764]: E0309 13:22:53.588127 4764 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 09 13:22:54 crc kubenswrapper[4764]: I0309 13:22:54.051769 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7kggv_b8ccb4f5-550a-41b2-b39d-201cdd5d902a/ovnkube-controller/0.log" Mar 09 13:22:54 crc kubenswrapper[4764]: I0309 13:22:54.055895 4764 generic.go:334] "Generic (PLEG): container finished" podID="b8ccb4f5-550a-41b2-b39d-201cdd5d902a" containerID="03b613417c12f1a1de495ba21dfe736f2565fde5ba629162dcc5cbe929369b4b" exitCode=1 Mar 09 13:22:54 crc kubenswrapper[4764]: I0309 13:22:54.055938 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7kggv" event={"ID":"b8ccb4f5-550a-41b2-b39d-201cdd5d902a","Type":"ContainerDied","Data":"03b613417c12f1a1de495ba21dfe736f2565fde5ba629162dcc5cbe929369b4b"} Mar 09 13:22:54 crc kubenswrapper[4764]: I0309 13:22:54.057637 4764 scope.go:117] "RemoveContainer" containerID="03b613417c12f1a1de495ba21dfe736f2565fde5ba629162dcc5cbe929369b4b" Mar 09 13:22:54 crc kubenswrapper[4764]: I0309 13:22:54.073225 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6bcdd179-43c2-427c-9fac-7155c122e922\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://83cca42acbb21422d7ee12f4e9824007c225a563a00ac3769c638770e404ce7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25jms\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a541959fd435ba385dfa711ec03b7d78fb75528f
ed89a7bd3620aa50fbab26ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25jms\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:28Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xxczl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:54Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:54 crc kubenswrapper[4764]: I0309 13:22:54.089670 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:54Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:54 crc kubenswrapper[4764]: I0309 13:22:54.104869 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:54Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:54 crc kubenswrapper[4764]: I0309 13:22:54.120147 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cc78a2e5-6620-4ea8-8bc7-c62bcc3348cd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e22c92297ebb9853ff582f519aa4c98c1468e757b53b1245f7a7ab0f9fe25344\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://050547213bbff923b18f34b1196952d038bab09162ab803da36c99fa3f769dd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5819fd898c59b4cf866334e40822b6142b014a2e11f4ca75b98d549665b1575a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f97e59c94171eedb27889b9cb72fa1b241a5e09c23310576dfd844db1bfc2292\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://f97e59c94171eedb27889b9cb72fa1b241a5e09c23310576dfd844db1bfc2292\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:20:46Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:20:45Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:54Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:54 crc kubenswrapper[4764]: I0309 13:22:54.137192 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4dd3c85ee070f78d4d30b9ccad3da35f82313620fffdaf69c3424865843dfa39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5280ec2d6cc95dbda0281276e900fa2c9d43170f932966688440a55c1deb454\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:54Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:54 crc kubenswrapper[4764]: I0309 13:22:54.153298 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://acc019f66fb28c4b9a901a359ecf5e127e643482a864c4d9da494a62286c0e3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-09T13:22:54Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:54 crc kubenswrapper[4764]: I0309 13:22:54.167446 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:54Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:54 crc kubenswrapper[4764]: I0309 13:22:54.184484 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7c21ab2-1820-47de-a61d-71d81928564a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://482469a933d0aba722e6a341f6934669e0d84998bd2c99d6962a17548ab80ce7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3571aa0b654db21a79a11d6c58af9f858d2493283d9eff7a84f3404c34dcf0b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4aae266a5256be6c65a7637dfaed552087689f317f7b61dcb52f4aa6d4d4d7e6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e04ceb7d287648fc77742a0a1b4cf13a56f634301b703291340961730fc73811\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://033930947dc527026e2797f16a6dd9181c8c7d8ec88b09031a132f55b953c356\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T13:21:44Z\\\"
,\\\"message\\\":\\\"W0309 13:21:43.803612 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0309 13:21:43.803959 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773062503 cert, and key in /tmp/serving-cert-1644690243/serving-signer.crt, /tmp/serving-cert-1644690243/serving-signer.key\\\\nI0309 13:21:44.168505 1 observer_polling.go:159] Starting file observer\\\\nW0309 13:21:44.177360 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0309 13:21:44.177910 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0309 13:21:44.178828 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1644690243/tls.crt::/tmp/serving-cert-1644690243/tls.key\\\\\\\"\\\\nF0309 13:21:44.724786 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:43Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3a34e47ede9b05cec931ca6337a98381bcceeb735352702d19128e8dfe8476f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:48Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a38de1b3854f33b5728435571b8f0df92aeaae7d8e8a235ed58044fa2a9e09c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a38de1b3854f33b5728435571b8f0df92aeaae7d8e8a235ed58044fa2a9e09c3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:46Z\\\",\\\"reason\\\":\\\"Completed\\\",
\\\"startedAt\\\":\\\"2026-03-09T13:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:20:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:54Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:54 crc kubenswrapper[4764]: I0309 13:22:54.199863 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zmzm7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"202a1f58-ce83-4374-ac48-dc806f7b9d6b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://05cbc20845591e82618289ff2509634cc5e2d05362e8414b3ff91a73d0719331\\\",\\\"image\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rk5pb\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zmzm7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:54Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:54 crc kubenswrapper[4764]: I0309 13:22:54.217256 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-crvdf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"072442e6-8ece-4f72-a8cb-ad7ef1e3facb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5c773d25fbc4d390f6da4699cdfa6bac3745eb18288fd5bb885b9a4dca9a519\\\",\\\"image\\\":\\\"quay.
io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://efdc5099d0e35ffc801fb7fb52979c62ed875cd19c8d0b015350b4906e6ea61f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://efdc5099d0e35ffc801fb7fb52979c62ed875cd19c8d0b015350b4906e6ea61f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccou
nt\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c57a862615a1748c2abb6ae6147d48398f8246d030b1527963f77743d8d802a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c57a862615a1748c2abb6ae6147d48398f8246d030b1527963f77743d8d802a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:22:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a96bc81c6b97f1f03d9cb304263ff47034248b9949771825e1813c284032b91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"
bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a96bc81c6b97f1f03d9cb304263ff47034248b9949771825e1813c284032b91\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:22:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://885b4c9d85cdb3705865fae2824c36e6440c602deaaf1345fffa0eda8de42a96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://885b4c9d85cdb3705865fae2824c36e6440c602deaaf1345fffa0eda8de42a96\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:22:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath
\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ff7f6fa22134725b00be193ba8f58c9ad083335c7f795fa87a1cad202a2496f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2ff7f6fa22134725b00be193ba8f58c9ad083335c7f795fa87a1cad202a2496f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:22:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba4e3985b08c9b6913c4e31f9645f8544fb5f7c0cbd75d116618fd9e479588d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\
"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ba4e3985b08c9b6913c4e31f9645f8544fb5f7c0cbd75d116618fd9e479588d0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:22:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-crvdf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:54Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:54 crc kubenswrapper[4764]: I0309 13:22:54.240839 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7kggv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b8ccb4f5-550a-41b2-b39d-201cdd5d902a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9713dea51462b7a0ce8cb68a64d8707b76dfca5f8c770cac3b54339d538dec01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://21feffb4bef29e65989455ec8c5a040a53de90f01ede1a9f1848a67f64a35d29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6859bb282db6608def0d7983e715cf3f5f5f454bf896bb21dca124a075e5701d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20f0f31508039b62436cf5b317419ffc5ca18d03050bfa830060f40e12684519\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:45Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ceb1c19d0ff9c03c8790b54d5dc48bafd77f6c289fc6225a88d6bfb8e04a8eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ad4c0df0c830a9ef102a630db33f055925c0b9b1734a792be013f8e261523c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://03b613417c12f1a1de495ba21dfe736f2565fde5ba629162dcc5cbe929369b4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://03b613417c12f1a1de495ba21dfe736f2565fde5ba629162dcc5cbe929369b4b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-09T13:22:53Z\\\",\\\"message\\\":\\\"4 for removal\\\\nI0309 13:22:53.071206 6796 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0309 13:22:53.071312 6796 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0309 13:22:53.071247 6796 
handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0309 13:22:53.071352 6796 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0309 13:22:53.071377 6796 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0309 13:22:53.071443 6796 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0309 13:22:53.071459 6796 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0309 13:22:53.071465 6796 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0309 13:22:53.071481 6796 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0309 13:22:53.071492 6796 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0309 13:22:53.071500 6796 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0309 13:22:53.071697 6796 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0309 13:22:53.071710 6796 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0309 13:22:53.071739 6796 factory.go:656] Stopping watch factory\\\\nI0309 13:22:53.071758 6796 ovnkube.go:599] Stopped ovnkube\\\\nI0309 13:22:53.071779 6796 handler.go:208] Removed *v1.Node event handler 
2\\\\nI03\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\
":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://df799033cab3d457eef490d885e4440024ba2c7daa430d489ee9907d908d1017\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd65db46dfb71ce3080adad69890c363fb149c5bf12c903e66a212a28888ae66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd65db46dfb71ce3080adad69890c363fb149c5bf12c903e66a212a288
88ae66\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:22:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7kggv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:54Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:54 crc kubenswrapper[4764]: I0309 13:22:54.253668 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-hbmjc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7c2b6620-c8b8-47cf-9f15-883dbf8e34cc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d8a2c962bc242865fd8c0fce8fc21f9e385e38cc9b2892663a82c02dfe6cac9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z4sdq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:34Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-hbmjc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:54Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:54 crc kubenswrapper[4764]: I0309 13:22:54.264689 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hzjxs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"30a92c50-fe51-40d9-a69c-4b5fd722bfc6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73df6317c77ae8a47acf57b1fdb2710ccefa68c346011b5a98d5a38c775c9727\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdm9q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://10aea1054bea324b596bc250a9d3f7010db0636732f622c0674db83b1f245098\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdm9q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-hzjxs\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:54Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:54 crc kubenswrapper[4764]: I0309 13:22:54.275914 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-wkwdz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6597fc34-10ee-4984-9c69-f4b7c0d46e2a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gsbrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gsbrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:40Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-wkwdz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:54Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:54 crc 
kubenswrapper[4764]: I0309 13:22:54.289537 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b3bffb853598e5fea63b5302816325c29f740c9e9e2a7f0ee4f87e810045f97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:54Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:54 crc kubenswrapper[4764]: I0309 13:22:54.299034 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-r5bnx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7fede188-66d9-4cb1-af19-c94afe7fbcde\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a63a85aa57ac9496c6745b881ceba7015f610b3ce31c17513724f036bea999f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pjp5c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:27Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-r5bnx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:54Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:54 crc kubenswrapper[4764]: I0309 13:22:54.558908 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 13:22:54 crc kubenswrapper[4764]: I0309 13:22:54.558969 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 13:22:54 crc kubenswrapper[4764]: I0309 13:22:54.559145 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 13:22:54 crc kubenswrapper[4764]: E0309 13:22:54.559181 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 13:22:54 crc kubenswrapper[4764]: E0309 13:22:54.559352 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 13:22:54 crc kubenswrapper[4764]: E0309 13:22:54.559473 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 13:22:55 crc kubenswrapper[4764]: I0309 13:22:55.060825 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7kggv_b8ccb4f5-550a-41b2-b39d-201cdd5d902a/ovnkube-controller/1.log" Mar 09 13:22:55 crc kubenswrapper[4764]: I0309 13:22:55.061624 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7kggv_b8ccb4f5-550a-41b2-b39d-201cdd5d902a/ovnkube-controller/0.log" Mar 09 13:22:55 crc kubenswrapper[4764]: I0309 13:22:55.064810 4764 generic.go:334] "Generic (PLEG): container finished" podID="b8ccb4f5-550a-41b2-b39d-201cdd5d902a" containerID="74e2e4b90f919f70c2ae2842193be603c8759f2203043d122ff14addcc7c6de2" exitCode=1 Mar 09 13:22:55 crc kubenswrapper[4764]: I0309 13:22:55.064851 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7kggv" 
event={"ID":"b8ccb4f5-550a-41b2-b39d-201cdd5d902a","Type":"ContainerDied","Data":"74e2e4b90f919f70c2ae2842193be603c8759f2203043d122ff14addcc7c6de2"} Mar 09 13:22:55 crc kubenswrapper[4764]: I0309 13:22:55.064892 4764 scope.go:117] "RemoveContainer" containerID="03b613417c12f1a1de495ba21dfe736f2565fde5ba629162dcc5cbe929369b4b" Mar 09 13:22:55 crc kubenswrapper[4764]: I0309 13:22:55.066095 4764 scope.go:117] "RemoveContainer" containerID="74e2e4b90f919f70c2ae2842193be603c8759f2203043d122ff14addcc7c6de2" Mar 09 13:22:55 crc kubenswrapper[4764]: E0309 13:22:55.066409 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-7kggv_openshift-ovn-kubernetes(b8ccb4f5-550a-41b2-b39d-201cdd5d902a)\"" pod="openshift-ovn-kubernetes/ovnkube-node-7kggv" podUID="b8ccb4f5-550a-41b2-b39d-201cdd5d902a" Mar 09 13:22:55 crc kubenswrapper[4764]: I0309 13:22:55.083876 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6bcdd179-43c2-427c-9fac-7155c122e922\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://83cca42acbb21422d7ee12f4e9824007c225a563a00ac3769c638770e404ce7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25jms\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a541959fd435ba385dfa711ec03b7d78fb75528f
ed89a7bd3620aa50fbab26ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25jms\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:28Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xxczl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:55Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:55 crc kubenswrapper[4764]: I0309 13:22:55.100861 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cc78a2e5-6620-4ea8-8bc7-c62bcc3348cd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e22c92297ebb9853ff582f519aa4c98c1468e757b53b1245f7a7ab0f9fe25344\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://050547213bbff923b18f34b1196952d038bab09162ab803da36c99fa3f769dd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5819fd898c59b4cf866334e40822b6142b014a2e11f4ca75b98d549665b1575a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f97e59c94171eedb27889b9cb72fa1b241a5e09c23310576dfd844db1bfc2292\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://f97e59c94171eedb27889b9cb72fa1b241a5e09c23310576dfd844db1bfc2292\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:20:46Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:20:45Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:55Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:55 crc kubenswrapper[4764]: I0309 13:22:55.118856 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:55Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:55 crc kubenswrapper[4764]: I0309 13:22:55.141086 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:55Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:55 crc kubenswrapper[4764]: I0309 13:22:55.159406 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7c21ab2-1820-47de-a61d-71d81928564a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://482469a933d0aba722e6a341f6934669e0d84998bd2c99d6962a17548ab80ce7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3571aa0b654db21a79a11d6c58af9f858d2493283d9eff7a84f3404c34dcf0b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4aae266a5256be6c65a7637dfaed552087689f317f7b61dcb52f4aa6d4d4d7e6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e04ceb7d287648fc77742a0a1b4cf13a56f634301b703291340961730fc73811\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://033930947dc527026e2797f16a6dd9181c8c7d8ec88b09031a132f55b953c356\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T13:21:44Z\\\"
,\\\"message\\\":\\\"W0309 13:21:43.803612 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0309 13:21:43.803959 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773062503 cert, and key in /tmp/serving-cert-1644690243/serving-signer.crt, /tmp/serving-cert-1644690243/serving-signer.key\\\\nI0309 13:21:44.168505 1 observer_polling.go:159] Starting file observer\\\\nW0309 13:21:44.177360 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0309 13:21:44.177910 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0309 13:21:44.178828 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1644690243/tls.crt::/tmp/serving-cert-1644690243/tls.key\\\\\\\"\\\\nF0309 13:21:44.724786 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:43Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3a34e47ede9b05cec931ca6337a98381bcceeb735352702d19128e8dfe8476f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:48Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a38de1b3854f33b5728435571b8f0df92aeaae7d8e8a235ed58044fa2a9e09c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a38de1b3854f33b5728435571b8f0df92aeaae7d8e8a235ed58044fa2a9e09c3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:46Z\\\",\\\"reason\\\":\\\"Completed\\\",
\\\"startedAt\\\":\\\"2026-03-09T13:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:20:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:55Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:55 crc kubenswrapper[4764]: I0309 13:22:55.177700 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4dd3c85ee070f78d4d30b9ccad3da35f82313620fffdaf69c3424865843dfa39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"last
State\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5280ec2d6cc95dbda0281276e900fa2c9d43170f932966688440a55c1deb454\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:55Z is after 2025-08-24T17:21:41Z" 
Mar 09 13:22:55 crc kubenswrapper[4764]: I0309 13:22:55.195141 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://acc019f66fb28c4b9a901a359ecf5e127e643482a864c4d9da494a62286c0e3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:55Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:55 crc kubenswrapper[4764]: I0309 13:22:55.208717 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:55Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:55 crc kubenswrapper[4764]: I0309 13:22:55.228936 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b3bffb853598e5fea63b5302816325c29f740c9e9e2a7f0ee4f87e810045f97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:55Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:55 crc kubenswrapper[4764]: I0309 13:22:55.244065 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-r5bnx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7fede188-66d9-4cb1-af19-c94afe7fbcde\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a63a85aa57ac9496c6745b881ceba7015f610b3ce31c17513724f036bea999f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kub
ernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pjp5c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:27Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-r5bnx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:55Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:55 crc kubenswrapper[4764]: I0309 13:22:55.257240 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zmzm7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"202a1f58-ce83-4374-ac48-dc806f7b9d6b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://05cbc20845591e82618289ff2509634cc5e2d0536
2e8414b3ff91a73d0719331\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubern
etes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rk5pb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zmzm7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:55Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:55 crc kubenswrapper[4764]: I0309 13:22:55.277612 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-crvdf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"072442e6-8ece-4f72-a8cb-ad7ef1e3facb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5c773d25fbc4d390f6da4699
cdfa6bac3745eb18288fd5bb885b9a4dca9a519\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://efdc5099d0e35ffc801fb7fb52979c62ed875cd19c8d0b015350b4906e6ea61f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://efdc5099d0e35ffc801fb7fb52979c62ed875cd19c8d0b015350b4906e6ea61f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c57a862615a1748c2abb6ae6147d48398f8246d030b1527963f77743d8d802a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c57a862615a1748c2abb6ae6147d48398f8246d030b1527963f77743d8d802a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:22:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a96bc81c6b97f1f03d9cb304263ff47034248b9949771825e1813c284032b91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d
7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a96bc81c6b97f1f03d9cb304263ff47034248b9949771825e1813c284032b91\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:22:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://885b4c9d85cdb3705865fae2824c36e6440c602deaaf1345fffa0eda8de42a96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://885b4c9d85cdb3705865fae2824c36e6440c602deaaf1345fffa0eda8de42a96\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:22:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\
\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ff7f6fa22134725b00be193ba8f58c9ad083335c7f795fa87a1cad202a2496f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2ff7f6fa22134725b00be193ba8f58c9ad083335c7f795fa87a1cad202a2496f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:22:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba4e3985b08c9b6913c4e31f9645f8544fb5f7c0cbd75d116618fd9e479588d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":t
rue,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ba4e3985b08c9b6913c4e31f9645f8544fb5f7c0cbd75d116618fd9e479588d0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:22:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-crvdf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:55Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:55 crc kubenswrapper[4764]: I0309 13:22:55.299335 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7kggv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b8ccb4f5-550a-41b2-b39d-201cdd5d902a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9713dea51462b7a0ce8cb68a64d8707b76dfca5f8c770cac3b54339d538dec01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://21feffb4bef29e65989455ec8c5a040a53de90f01ede1a9f1848a67f64a35d29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6859bb282db6608def0d7983e715cf3f5f5f454bf896bb21dca124a075e5701d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20f0f31508039b62436cf5b317419ffc5ca18d03050bfa830060f40e12684519\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:45Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ceb1c19d0ff9c03c8790b54d5dc48bafd77f6c289fc6225a88d6bfb8e04a8eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ad4c0df0c830a9ef102a630db33f055925c0b9b1734a792be013f8e261523c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74e2e4b90f919f70c2ae2842193be603c8759f2203043d122ff14addcc7c6de2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://03b613417c12f1a1de495ba21dfe736f2565fde5ba629162dcc5cbe929369b4b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-09T13:22:53Z\\\",\\\"message\\\":\\\"4 for removal\\\\nI0309 13:22:53.071206 6796 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0309 13:22:53.071312 6796 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0309 13:22:53.071247 6796 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0309 13:22:53.071352 6796 handler.go:190] Sending *v1.Namespace 
event handler 1 for removal\\\\nI0309 13:22:53.071377 6796 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0309 13:22:53.071443 6796 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0309 13:22:53.071459 6796 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0309 13:22:53.071465 6796 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0309 13:22:53.071481 6796 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0309 13:22:53.071492 6796 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0309 13:22:53.071500 6796 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0309 13:22:53.071697 6796 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0309 13:22:53.071710 6796 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0309 13:22:53.071739 6796 factory.go:656] Stopping watch factory\\\\nI0309 13:22:53.071758 6796 ovnkube.go:599] Stopped ovnkube\\\\nI0309 13:22:53.071779 6796 handler.go:208] Removed *v1.Node event handler 2\\\\nI03\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:50Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://74e2e4b90f919f70c2ae2842193be603c8759f2203043d122ff14addcc7c6de2\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-09T13:22:54Z\\\",\\\"message\\\":\\\"services_controller.go:444] Built service openshift-operator-lifecycle-manager/package-server-manager-metrics LB per-node configs for network=default: []services.lbConfig(nil)\\\\nI0309 13:22:54.899454 6914 services_controller.go:445] Built service openshift-operator-lifecycle-manager/package-server-manager-metrics LB template configs for network=default: []services.lbConfig(nil)\\\\nI0309 13:22:54.899473 6914 services_controller.go:356] Processing sync for service openshift-machine-api/machine-api-operator-webhook for 
network=default\\\\nI0309 13:22:54.899513 6914 ovnkube.go:599] Stopped ovnkube\\\\nI0309 13:22:54.899390 6914 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-oauth-apiserver/api\\\\\\\"}\\\\nI0309 13:22:54.899576 6914 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI0309 13:22:54.899578 6914 services_controller.go:360] Finished syncing service api on namespace openshift-oauth-apiserver for network=default : 1.726105ms\\\\nI0309 13:22:54.899624 6914 services_controller.go:356] Processing sync for service openshift-machine-config-operator/machine-config-daemon for network=default\\\\nF0309 13:22:54.899660 6914 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\"
:\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://df799033cab3d457eef490d885e4440024ba2c7daa430d489ee9907d908d1017\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run
/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd65db46dfb71ce3080adad69890c363fb149c5bf12c903e66a212a28888ae66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd65db46dfb71ce3080adad69890c363fb149c5bf12c903e66a212a28888ae66\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:22:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7kggv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:55Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:55 crc kubenswrapper[4764]: I0309 13:22:55.312828 4764 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-image-registry/node-ca-hbmjc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7c2b6620-c8b8-47cf-9f15-883dbf8e34cc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d8a2c962bc242865fd8c0fce8fc21f9e385e38cc9b2892663a82c02dfe6cac9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z4sdq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"
hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:34Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-hbmjc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:55Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:55 crc kubenswrapper[4764]: I0309 13:22:55.322911 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hzjxs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"30a92c50-fe51-40d9-a69c-4b5fd722bfc6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73df6317c77ae8a47acf57b1fdb2710ccefa68c346011b5a98d5a38c775c9727\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d6
6438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdm9q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://10aea1054bea324b596bc250a9d3f7010db0636732f622c0674db83b1f245098\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdm9q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:39Z\\\"}}
\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-hzjxs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:55Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:55 crc kubenswrapper[4764]: I0309 13:22:55.336211 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-wkwdz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6597fc34-10ee-4984-9c69-f4b7c0d46e2a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gsbrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gsbrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:40Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-wkwdz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:55Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:55 crc 
kubenswrapper[4764]: I0309 13:22:55.559112 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wkwdz" Mar 09 13:22:55 crc kubenswrapper[4764]: E0309 13:22:55.559325 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wkwdz" podUID="6597fc34-10ee-4984-9c69-f4b7c0d46e2a" Mar 09 13:22:55 crc kubenswrapper[4764]: I0309 13:22:55.578869 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7c21ab2-1820-47de-a61d-71d81928564a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://482469a933d0aba722e6a341f6934669e0d84998bd2c99d6962a17548ab80ce7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift
-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3571aa0b654db21a79a11d6c58af9f858d2493283d9eff7a84f3404c34dcf0b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4aae266a5256be6c65a7637dfaed552087689f317f7b61dcb52f4aa6d4d4d7e6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}},\\
\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e04ceb7d287648fc77742a0a1b4cf13a56f634301b703291340961730fc73811\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://033930947dc527026e2797f16a6dd9181c8c7d8ec88b09031a132f55b953c356\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T13:21:44Z\\\",\\\"message\\\":\\\"W0309 13:21:43.803612 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0309 13:21:43.803959 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773062503 cert, and key in /tmp/serving-cert-1644690243/serving-signer.crt, /tmp/serving-cert-1644690243/serving-signer.key\\\\nI0309 13:21:44.168505 1 observer_polling.go:159] Starting file observer\\\\nW0309 13:21:44.177360 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0309 13:21:44.177910 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0309 13:21:44.178828 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1644690243/tls.crt::/tmp/serving-cert-1644690243/tls.key\\\\\\\"\\\\nF0309 13:21:44.724786 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:43Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3a34e47ede9b05cec931ca6337a98381bcceeb735352702d19128e8dfe8476f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:48Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a38de1b3854f33b5728435571b8f0df92aeaae7d8e8a235ed58044fa2a9e09c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a38de1b3854f33b5728435571b8f0df92aeaae7d8e8a235ed58044fa2a9e09c3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:46Z\\\",\\\"reason\\\":\\\"Completed\\\",
\\\"startedAt\\\":\\\"2026-03-09T13:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:20:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:55Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:55 crc kubenswrapper[4764]: I0309 13:22:55.595258 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4dd3c85ee070f78d4d30b9ccad3da35f82313620fffdaf69c3424865843dfa39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"last
State\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5280ec2d6cc95dbda0281276e900fa2c9d43170f932966688440a55c1deb454\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:55Z is after 2025-08-24T17:21:41Z" 
Mar 09 13:22:55 crc kubenswrapper[4764]: I0309 13:22:55.608683 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://acc019f66fb28c4b9a901a359ecf5e127e643482a864c4d9da494a62286c0e3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:55Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:55 crc kubenswrapper[4764]: I0309 13:22:55.620821 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:55Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:55 crc kubenswrapper[4764]: I0309 13:22:55.632532 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-wkwdz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6597fc34-10ee-4984-9c69-f4b7c0d46e2a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gsbrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gsbrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:40Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-wkwdz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:55Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:55 crc 
kubenswrapper[4764]: E0309 13:22:55.641497 4764 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 09 13:22:55 crc kubenswrapper[4764]: I0309 13:22:55.648538 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b3bffb853598e5fea63b5302816325c29f740c9e9e2a7f0ee4f87e810045f97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving
-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:55Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:55 crc kubenswrapper[4764]: I0309 13:22:55.660147 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-r5bnx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7fede188-66d9-4cb1-af19-c94afe7fbcde\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a63a85aa57ac9496c6745b881ceba7015f610b3ce31c17513724f036bea999f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1
b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pjp5c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:27Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-r5bnx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:55Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:55 crc kubenswrapper[4764]: I0309 13:22:55.684752 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zmzm7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"202a1f58-ce83-4374-ac48-dc806f7b9d6b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://05cbc20845591e82618289ff2509634cc5e2d05362e8414b3ff91a73d0719331\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rk5pb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zmzm7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:55Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:55 crc kubenswrapper[4764]: I0309 13:22:55.711701 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-crvdf" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"072442e6-8ece-4f72-a8cb-ad7ef1e3facb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5c773d25fbc4d390f6da4699cdfa6bac3745eb18288fd5bb885b9a4dca9a519\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://efdc5099d0e35ffc801fb7fb52979c62ed875cd19c8d0b015
350b4906e6ea61f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://efdc5099d0e35ffc801fb7fb52979c62ed875cd19c8d0b015350b4906e6ea61f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c57a862615a1748c2abb6ae6147d48398f8246d030b1527963f77743d8d802a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c57a862615a1748c2abb6ae6147d48398f8246d030b1527963f77743d8d802a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:22:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:
22:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a96bc81c6b97f1f03d9cb304263ff47034248b9949771825e1813c284032b91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a96bc81c6b97f1f03d9cb304263ff47034248b9949771825e1813c284032b91\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:22:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\
\\"cri-o://885b4c9d85cdb3705865fae2824c36e6440c602deaaf1345fffa0eda8de42a96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://885b4c9d85cdb3705865fae2824c36e6440c602deaaf1345fffa0eda8de42a96\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:22:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ff7f6fa22134725b00be193ba8f58c9ad083335c7f795fa87a1cad202a2496f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2ff7f6fa22134725b00be193ba8f58c9ad083335c7f795fa87a1cad202a2496f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:22:46Z\\\",\\\"r
eason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba4e3985b08c9b6913c4e31f9645f8544fb5f7c0cbd75d116618fd9e479588d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ba4e3985b08c9b6913c4e31f9645f8544fb5f7c0cbd75d116618fd9e479588d0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:22:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-crvdf\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:55Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:55 crc kubenswrapper[4764]: I0309 13:22:55.732464 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7kggv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b8ccb4f5-550a-41b2-b39d-201cdd5d902a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9713dea51462b7a0ce8cb68a64d8707b76dfca5f8c770cac3b54339d538dec01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://21feffb4bef29e65989455ec8c5a040a53de90f01ede1a9f1848a67f64a35d29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6859bb282db6608def0d7983e715cf3f5f5f454bf896bb21dca124a075e5701d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20f0f31508039b62436cf5b317419ffc5ca18d03050bfa830060f40e12684519\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:45Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ceb1c19d0ff9c03c8790b54d5dc48bafd77f6c289fc6225a88d6bfb8e04a8eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ad4c0df0c830a9ef102a630db33f055925c0b9b1734a792be013f8e261523c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74e2e4b90f919f70c2ae2842193be603c8759f2203043d122ff14addcc7c6de2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://03b613417c12f1a1de495ba21dfe736f2565fde5ba629162dcc5cbe929369b4b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-09T13:22:53Z\\\",\\\"message\\\":\\\"4 for removal\\\\nI0309 13:22:53.071206 6796 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0309 13:22:53.071312 6796 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0309 13:22:53.071247 6796 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0309 13:22:53.071352 6796 handler.go:190] Sending *v1.Namespace 
event handler 1 for removal\\\\nI0309 13:22:53.071377 6796 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0309 13:22:53.071443 6796 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0309 13:22:53.071459 6796 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0309 13:22:53.071465 6796 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0309 13:22:53.071481 6796 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0309 13:22:53.071492 6796 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0309 13:22:53.071500 6796 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0309 13:22:53.071697 6796 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0309 13:22:53.071710 6796 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0309 13:22:53.071739 6796 factory.go:656] Stopping watch factory\\\\nI0309 13:22:53.071758 6796 ovnkube.go:599] Stopped ovnkube\\\\nI0309 13:22:53.071779 6796 handler.go:208] Removed *v1.Node event handler 2\\\\nI03\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:50Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://74e2e4b90f919f70c2ae2842193be603c8759f2203043d122ff14addcc7c6de2\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-09T13:22:54Z\\\",\\\"message\\\":\\\"services_controller.go:444] Built service openshift-operator-lifecycle-manager/package-server-manager-metrics LB per-node configs for network=default: []services.lbConfig(nil)\\\\nI0309 13:22:54.899454 6914 services_controller.go:445] Built service openshift-operator-lifecycle-manager/package-server-manager-metrics LB template configs for network=default: []services.lbConfig(nil)\\\\nI0309 13:22:54.899473 6914 services_controller.go:356] Processing sync for service openshift-machine-api/machine-api-operator-webhook for 
network=default\\\\nI0309 13:22:54.899513 6914 ovnkube.go:599] Stopped ovnkube\\\\nI0309 13:22:54.899390 6914 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-oauth-apiserver/api\\\\\\\"}\\\\nI0309 13:22:54.899576 6914 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI0309 13:22:54.899578 6914 services_controller.go:360] Finished syncing service api on namespace openshift-oauth-apiserver for network=default : 1.726105ms\\\\nI0309 13:22:54.899624 6914 services_controller.go:356] Processing sync for service openshift-machine-config-operator/machine-config-daemon for network=default\\\\nF0309 13:22:54.899660 6914 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\"
:\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://df799033cab3d457eef490d885e4440024ba2c7daa430d489ee9907d908d1017\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run
/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd65db46dfb71ce3080adad69890c363fb149c5bf12c903e66a212a28888ae66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd65db46dfb71ce3080adad69890c363fb149c5bf12c903e66a212a28888ae66\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:22:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7kggv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:55Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:55 crc kubenswrapper[4764]: I0309 13:22:55.740777 4764 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-image-registry/node-ca-hbmjc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7c2b6620-c8b8-47cf-9f15-883dbf8e34cc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d8a2c962bc242865fd8c0fce8fc21f9e385e38cc9b2892663a82c02dfe6cac9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z4sdq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"
hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:34Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-hbmjc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:55Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:55 crc kubenswrapper[4764]: I0309 13:22:55.752008 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hzjxs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"30a92c50-fe51-40d9-a69c-4b5fd722bfc6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73df6317c77ae8a47acf57b1fdb2710ccefa68c346011b5a98d5a38c775c9727\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d6
6438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdm9q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://10aea1054bea324b596bc250a9d3f7010db0636732f622c0674db83b1f245098\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdm9q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:39Z\\\"}}
\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-hzjxs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:55Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:55 crc kubenswrapper[4764]: I0309 13:22:55.763457 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6bcdd179-43c2-427c-9fac-7155c122e922\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://83cca42acbb21422d7ee12f4e9824007c225a563a00ac3769c638770e404ce7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"nam
e\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25jms\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a541959fd435ba385dfa711ec03b7d78fb75528fed89a7bd3620aa50fbab26ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25jms\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:28Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xxczl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet 
valid: current time 2026-03-09T13:22:55Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:55 crc kubenswrapper[4764]: I0309 13:22:55.778682 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cc78a2e5-6620-4ea8-8bc7-c62bcc3348cd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e22c92297ebb9853ff582f519aa4c98c1468e757b53b1245f7a7ab0f9fe25344\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://050547213bbff923b18f34b1196952d038bab09162ab803da36c99fa3f769dd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5819fd898c59b4cf866334e40822b6142b014a2e11f4ca75b98d549665b1575a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f97e59c94171eedb27889b9cb72fa1b241a5e09c23310576dfd844db1bfc2292\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io
/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f97e59c94171eedb27889b9cb72fa1b241a5e09c23310576dfd844db1bfc2292\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:20:46Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:20:45Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:55Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:55 crc kubenswrapper[4764]: I0309 13:22:55.789782 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:55Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:55 crc kubenswrapper[4764]: I0309 13:22:55.801081 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:55Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:56 crc kubenswrapper[4764]: I0309 13:22:56.070862 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7kggv_b8ccb4f5-550a-41b2-b39d-201cdd5d902a/ovnkube-controller/1.log" Mar 09 13:22:56 crc kubenswrapper[4764]: I0309 13:22:56.076015 4764 scope.go:117] "RemoveContainer" containerID="74e2e4b90f919f70c2ae2842193be603c8759f2203043d122ff14addcc7c6de2" Mar 09 13:22:56 crc kubenswrapper[4764]: E0309 13:22:56.076340 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-7kggv_openshift-ovn-kubernetes(b8ccb4f5-550a-41b2-b39d-201cdd5d902a)\"" pod="openshift-ovn-kubernetes/ovnkube-node-7kggv" podUID="b8ccb4f5-550a-41b2-b39d-201cdd5d902a" Mar 09 13:22:56 crc kubenswrapper[4764]: I0309 13:22:56.092041 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-r5bnx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7fede188-66d9-4cb1-af19-c94afe7fbcde\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a63a85aa57ac9496c6745b881ceba7015f610b3ce31c17513724f036bea999f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pjp5c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:27Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-r5bnx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:56Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:56 crc kubenswrapper[4764]: I0309 13:22:56.110759 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zmzm7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"202a1f58-ce83-4374-ac48-dc806f7b9d6b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://05cbc20845591e82618289ff2509634cc5e2d05362e8414b3ff91a73d0719331\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rk5pb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zmzm7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:56Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:56 crc kubenswrapper[4764]: I0309 13:22:56.129268 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-crvdf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"072442e6-8ece-4f72-a8cb-ad7ef1e3facb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5c773d25fbc4d390f6da4699cdfa6bac3745eb18288fd5bb885b9a4dca9a519\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1a
fba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://efdc5099d0e35ffc801fb7fb52979c62ed875cd19c8d0b015350b4906e6ea61f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://efdc5099d0e35ffc801fb7fb52979c62ed875cd19c8d0b015350b4906e6ea61f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c57a862615a1748c2abb6ae6147d48398f8246d030b1527963f77743d8d802a\\\",\\\"image\\\":\\\"quay
.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c57a862615a1748c2abb6ae6147d48398f8246d030b1527963f77743d8d802a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:22:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a96bc81c6b97f1f03d9cb304263ff47034248b9949771825e1813c284032b91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a96bc81c6b97f1f03d9cb304263ff47034248b9949771825e1813c284032b91\\\",\\\"exitCode\\\":0,\\\
"finishedAt\\\":\\\"2026-03-09T13:22:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://885b4c9d85cdb3705865fae2824c36e6440c602deaaf1345fffa0eda8de42a96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://885b4c9d85cdb3705865fae2824c36e6440c602deaaf1345fffa0eda8de42a96\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:22:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ff7f6fa22134725b00be193ba8f58c9ad08333
5c7f795fa87a1cad202a2496f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2ff7f6fa22134725b00be193ba8f58c9ad083335c7f795fa87a1cad202a2496f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:22:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba4e3985b08c9b6913c4e31f9645f8544fb5f7c0cbd75d116618fd9e479588d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ba4e3985b08c9b6913c4e31f9645f8544fb5f7c0cbd75d116618fd9e479588d0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:22:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"20
26-03-09T13:22:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-crvdf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:56Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:56 crc kubenswrapper[4764]: I0309 13:22:56.161560 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7kggv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b8ccb4f5-550a-41b2-b39d-201cdd5d902a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9713dea51462b7a0ce8cb68a64d8707b76dfca5f8c770cac3b54339d538dec01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://21feffb4bef29e65989455ec8c5a040a53de90f01ede1a9f1848a67f64a35d29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6859bb282db6608def0d7983e715cf3f5f5f454bf896bb21dca124a075e5701d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20f0f31508039b62436cf5b317419ffc5ca18d03050bfa830060f40e12684519\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ceb1c19d0ff9c03c8790b54d5dc48bafd77f6c289fc6225a88d6bfb8e04a8eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ad4c0df0c830a9ef102a630db33f055925c0b9b1734a792be013f8e261523c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74e2e4b90f919f70c2ae2842193be603c8759f2203043d122ff14addcc7c6de2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://74e2e4b90f919f70c2ae2842193be603c8759f2203043d122ff14addcc7c6de2\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-09T13:22:54Z\\\",\\\"message\\\":\\\"services_controller.go:444] Built service 
openshift-operator-lifecycle-manager/package-server-manager-metrics LB per-node configs for network=default: []services.lbConfig(nil)\\\\nI0309 13:22:54.899454 6914 services_controller.go:445] Built service openshift-operator-lifecycle-manager/package-server-manager-metrics LB template configs for network=default: []services.lbConfig(nil)\\\\nI0309 13:22:54.899473 6914 services_controller.go:356] Processing sync for service openshift-machine-api/machine-api-operator-webhook for network=default\\\\nI0309 13:22:54.899513 6914 ovnkube.go:599] Stopped ovnkube\\\\nI0309 13:22:54.899390 6914 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-oauth-apiserver/api\\\\\\\"}\\\\nI0309 13:22:54.899576 6914 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI0309 13:22:54.899578 6914 services_controller.go:360] Finished syncing service api on namespace openshift-oauth-apiserver for network=default : 1.726105ms\\\\nI0309 13:22:54.899624 6914 services_controller.go:356] Processing sync for service openshift-machine-config-operator/machine-config-daemon for network=default\\\\nF0309 13:22:54.899660 6914 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:54Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-7kggv_openshift-ovn-kubernetes(b8ccb4f5-550a-41b2-b39d-201cdd5d902a)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://df799033cab3d457eef490d885e4440024ba2c7daa430d489ee9907d908d1017\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd65db46dfb71ce3080adad69890c363fb149c5bf12c903e66a212a28888ae66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd65db46dfb71ce308
0adad69890c363fb149c5bf12c903e66a212a28888ae66\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:22:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7kggv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:56Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:56 crc kubenswrapper[4764]: I0309 13:22:56.176804 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-hbmjc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7c2b6620-c8b8-47cf-9f15-883dbf8e34cc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d8a2c962bc242865fd8c0fce8fc21f9e385e38cc9b2892663a82c02dfe6cac9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z4sdq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:34Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-hbmjc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:56Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:56 crc kubenswrapper[4764]: I0309 13:22:56.191821 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hzjxs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"30a92c50-fe51-40d9-a69c-4b5fd722bfc6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73df6317c77ae8a47acf57b1fdb2710ccefa68c346011b5a98d5a38c775c9727\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdm9q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://10aea1054bea324b596bc250a9d3f7010db0636732f622c0674db83b1f245098\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdm9q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-hzjxs\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:56Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:56 crc kubenswrapper[4764]: I0309 13:22:56.213409 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-wkwdz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6597fc34-10ee-4984-9c69-f4b7c0d46e2a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gsbrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gsbrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:40Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-wkwdz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:56Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:56 crc 
kubenswrapper[4764]: I0309 13:22:56.235404 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b3bffb853598e5fea63b5302816325c29f740c9e9e2a7f0ee4f87e810045f97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:56Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:56 crc kubenswrapper[4764]: I0309 13:22:56.253929 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6bcdd179-43c2-427c-9fac-7155c122e922\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://83cca42acbb21422d7ee12f4e9824007c225a563a00ac3769c638770e404ce7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\
\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25jms\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a541959fd435ba385dfa711ec03b7d78fb75528fed89a7bd3620aa50fbab26ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25jms\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:28Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xxczl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:56Z is after 2025-08-24T17:21:41Z" Mar 09 
13:22:56 crc kubenswrapper[4764]: I0309 13:22:56.268246 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cc78a2e5-6620-4ea8-8bc7-c62bcc3348cd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e22c92297ebb9853ff582f519aa4c98c1468e757b53b1245f7a7ab0f9fe25344\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://050547213bbff923b18f34b1196952d038bab09162ab803da36c99fa3f769dd3\\\",\\\"i
mage\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5819fd898c59b4cf866334e40822b6142b014a2e11f4ca75b98d549665b1575a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f97e59c94171eedb27889b9cb72fa1b241a5e09c23310576dfd844db1bfc2292\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c
4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f97e59c94171eedb27889b9cb72fa1b241a5e09c23310576dfd844db1bfc2292\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:20:46Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:20:45Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:56Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:56 crc kubenswrapper[4764]: I0309 13:22:56.283150 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:56Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:56 crc kubenswrapper[4764]: I0309 13:22:56.297365 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:56Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:56 crc kubenswrapper[4764]: I0309 13:22:56.312707 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4dd3c85ee070f78d4d30b9ccad3da35f82313620fffdaf69c3424865843dfa39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5280ec2d6cc95dbda0281276e900fa2c9d43170f932966688440a55c1deb454\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:56Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:56 crc kubenswrapper[4764]: I0309 13:22:56.324874 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://acc019f66fb28c4b9a901a359ecf5e127e643482a864c4d9da494a62286c0e3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-09T13:22:56Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:56 crc kubenswrapper[4764]: I0309 13:22:56.338064 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:56Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:56 crc kubenswrapper[4764]: I0309 13:22:56.351049 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7c21ab2-1820-47de-a61d-71d81928564a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://482469a933d0aba722e6a341f6934669e0d84998bd2c99d6962a17548ab80ce7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3571aa0b654db21a79a11d6c58af9f858d2493283d9eff7a84f3404c34dcf0b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4aae266a5256be6c65a7637dfaed552087689f317f7b61dcb52f4aa6d4d4d7e6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e04ceb7d287648fc77742a0a1b4cf13a56f634301b703291340961730fc73811\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://033930947dc527026e2797f16a6dd9181c8c7d8ec88b09031a132f55b953c356\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T13:21:44Z\\\"
,\\\"message\\\":\\\"W0309 13:21:43.803612 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0309 13:21:43.803959 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773062503 cert, and key in /tmp/serving-cert-1644690243/serving-signer.crt, /tmp/serving-cert-1644690243/serving-signer.key\\\\nI0309 13:21:44.168505 1 observer_polling.go:159] Starting file observer\\\\nW0309 13:21:44.177360 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0309 13:21:44.177910 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0309 13:21:44.178828 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1644690243/tls.crt::/tmp/serving-cert-1644690243/tls.key\\\\\\\"\\\\nF0309 13:21:44.724786 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:43Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3a34e47ede9b05cec931ca6337a98381bcceeb735352702d19128e8dfe8476f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:48Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a38de1b3854f33b5728435571b8f0df92aeaae7d8e8a235ed58044fa2a9e09c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a38de1b3854f33b5728435571b8f0df92aeaae7d8e8a235ed58044fa2a9e09c3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:46Z\\\",\\\"reason\\\":\\\"Completed\\\",
\\\"startedAt\\\":\\\"2026-03-09T13:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:20:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:56Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:56 crc kubenswrapper[4764]: I0309 13:22:56.453184 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6597fc34-10ee-4984-9c69-f4b7c0d46e2a-metrics-certs\") pod \"network-metrics-daemon-wkwdz\" (UID: \"6597fc34-10ee-4984-9c69-f4b7c0d46e2a\") " pod="openshift-multus/network-metrics-daemon-wkwdz" Mar 09 13:22:56 crc kubenswrapper[4764]: E0309 13:22:56.453372 4764 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 09 13:22:56 crc kubenswrapper[4764]: E0309 13:22:56.453595 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6597fc34-10ee-4984-9c69-f4b7c0d46e2a-metrics-certs podName:6597fc34-10ee-4984-9c69-f4b7c0d46e2a nodeName:}" failed. No retries permitted until 2026-03-09 13:23:12.453571613 +0000 UTC m=+147.703743591 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/6597fc34-10ee-4984-9c69-f4b7c0d46e2a-metrics-certs") pod "network-metrics-daemon-wkwdz" (UID: "6597fc34-10ee-4984-9c69-f4b7c0d46e2a") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 09 13:22:56 crc kubenswrapper[4764]: I0309 13:22:56.559231 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 13:22:56 crc kubenswrapper[4764]: I0309 13:22:56.559271 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 13:22:56 crc kubenswrapper[4764]: I0309 13:22:56.559562 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 13:22:56 crc kubenswrapper[4764]: E0309 13:22:56.559780 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 13:22:56 crc kubenswrapper[4764]: E0309 13:22:56.559924 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 13:22:56 crc kubenswrapper[4764]: E0309 13:22:56.560144 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 13:22:57 crc kubenswrapper[4764]: I0309 13:22:57.559514 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wkwdz" Mar 09 13:22:57 crc kubenswrapper[4764]: E0309 13:22:57.561206 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wkwdz" podUID="6597fc34-10ee-4984-9c69-f4b7c0d46e2a" Mar 09 13:22:58 crc kubenswrapper[4764]: I0309 13:22:58.558887 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 13:22:58 crc kubenswrapper[4764]: I0309 13:22:58.558887 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 13:22:58 crc kubenswrapper[4764]: E0309 13:22:58.559358 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 13:22:58 crc kubenswrapper[4764]: E0309 13:22:58.559266 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 13:22:58 crc kubenswrapper[4764]: I0309 13:22:58.558904 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 13:22:58 crc kubenswrapper[4764]: E0309 13:22:58.559439 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 13:22:58 crc kubenswrapper[4764]: I0309 13:22:58.722190 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-7kggv" Mar 09 13:22:58 crc kubenswrapper[4764]: I0309 13:22:58.722930 4764 scope.go:117] "RemoveContainer" containerID="74e2e4b90f919f70c2ae2842193be603c8759f2203043d122ff14addcc7c6de2" Mar 09 13:22:58 crc kubenswrapper[4764]: E0309 13:22:58.723141 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-7kggv_openshift-ovn-kubernetes(b8ccb4f5-550a-41b2-b39d-201cdd5d902a)\"" pod="openshift-ovn-kubernetes/ovnkube-node-7kggv" podUID="b8ccb4f5-550a-41b2-b39d-201cdd5d902a" Mar 09 13:22:59 crc kubenswrapper[4764]: I0309 13:22:59.559927 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wkwdz" Mar 09 13:22:59 crc kubenswrapper[4764]: E0309 13:22:59.560073 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wkwdz" podUID="6597fc34-10ee-4984-9c69-f4b7c0d46e2a" Mar 09 13:23:00 crc kubenswrapper[4764]: I0309 13:23:00.558686 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 13:23:00 crc kubenswrapper[4764]: E0309 13:23:00.559088 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 13:23:00 crc kubenswrapper[4764]: I0309 13:23:00.558806 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 13:23:00 crc kubenswrapper[4764]: E0309 13:23:00.559402 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 13:23:00 crc kubenswrapper[4764]: I0309 13:23:00.558783 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 13:23:00 crc kubenswrapper[4764]: E0309 13:23:00.559683 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 13:23:00 crc kubenswrapper[4764]: E0309 13:23:00.642693 4764 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 09 13:23:01 crc kubenswrapper[4764]: I0309 13:23:01.559824 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wkwdz" Mar 09 13:23:01 crc kubenswrapper[4764]: E0309 13:23:01.560124 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wkwdz" podUID="6597fc34-10ee-4984-9c69-f4b7c0d46e2a" Mar 09 13:23:02 crc kubenswrapper[4764]: I0309 13:23:02.559284 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 13:23:02 crc kubenswrapper[4764]: E0309 13:23:02.559920 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 13:23:02 crc kubenswrapper[4764]: I0309 13:23:02.559394 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 13:23:02 crc kubenswrapper[4764]: E0309 13:23:02.560094 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 13:23:02 crc kubenswrapper[4764]: I0309 13:23:02.559341 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 13:23:02 crc kubenswrapper[4764]: E0309 13:23:02.560354 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 13:23:03 crc kubenswrapper[4764]: I0309 13:23:03.560051 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wkwdz" Mar 09 13:23:03 crc kubenswrapper[4764]: E0309 13:23:03.560782 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-wkwdz" podUID="6597fc34-10ee-4984-9c69-f4b7c0d46e2a" Mar 09 13:23:03 crc kubenswrapper[4764]: I0309 13:23:03.575049 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Mar 09 13:23:03 crc kubenswrapper[4764]: I0309 13:23:03.619514 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:23:03 crc kubenswrapper[4764]: I0309 13:23:03.619736 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:23:03 crc kubenswrapper[4764]: I0309 13:23:03.619797 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:23:03 crc kubenswrapper[4764]: I0309 13:23:03.619886 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:23:03 crc kubenswrapper[4764]: I0309 13:23:03.619943 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:23:03Z","lastTransitionTime":"2026-03-09T13:23:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:23:03 crc kubenswrapper[4764]: E0309 13:23:03.632292 4764 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:23:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:23:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:23:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:23:03Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:23:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:23:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:23:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:23:03Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"44d5d5b8-3cb1-4753-9ed2-c6ede7c42d06\\\",\\\"systemUUID\\\":\\\"470a86bd-e6aa-42c1-b220-b7b8c0289210\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:03Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:03 crc kubenswrapper[4764]: I0309 13:23:03.637363 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:23:03 crc kubenswrapper[4764]: I0309 13:23:03.637604 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:23:03 crc kubenswrapper[4764]: I0309 13:23:03.637691 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:23:03 crc kubenswrapper[4764]: I0309 13:23:03.637782 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:23:03 crc kubenswrapper[4764]: I0309 13:23:03.637847 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:23:03Z","lastTransitionTime":"2026-03-09T13:23:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:23:03 crc kubenswrapper[4764]: E0309 13:23:03.652509 4764 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:23:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:23:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:23:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:23:03Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:23:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:23:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:23:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:23:03Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"44d5d5b8-3cb1-4753-9ed2-c6ede7c42d06\\\",\\\"systemUUID\\\":\\\"470a86bd-e6aa-42c1-b220-b7b8c0289210\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:03Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:03 crc kubenswrapper[4764]: I0309 13:23:03.656863 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:23:03 crc kubenswrapper[4764]: I0309 13:23:03.656905 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:23:03 crc kubenswrapper[4764]: I0309 13:23:03.656915 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:23:03 crc kubenswrapper[4764]: I0309 13:23:03.656932 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:23:03 crc kubenswrapper[4764]: I0309 13:23:03.656943 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:23:03Z","lastTransitionTime":"2026-03-09T13:23:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:23:03 crc kubenswrapper[4764]: E0309 13:23:03.669514 4764 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:23:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:23:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:23:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:23:03Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:23:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:23:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:23:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:23:03Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"44d5d5b8-3cb1-4753-9ed2-c6ede7c42d06\\\",\\\"systemUUID\\\":\\\"470a86bd-e6aa-42c1-b220-b7b8c0289210\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:03Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:03 crc kubenswrapper[4764]: I0309 13:23:03.673097 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:23:03 crc kubenswrapper[4764]: I0309 13:23:03.673136 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:23:03 crc kubenswrapper[4764]: I0309 13:23:03.673145 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:23:03 crc kubenswrapper[4764]: I0309 13:23:03.673159 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:23:03 crc kubenswrapper[4764]: I0309 13:23:03.673169 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:23:03Z","lastTransitionTime":"2026-03-09T13:23:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:23:03 crc kubenswrapper[4764]: E0309 13:23:03.683624 4764 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:23:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:23:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:23:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:23:03Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:23:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:23:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:23:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:23:03Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"44d5d5b8-3cb1-4753-9ed2-c6ede7c42d06\\\",\\\"systemUUID\\\":\\\"470a86bd-e6aa-42c1-b220-b7b8c0289210\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:03Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:03 crc kubenswrapper[4764]: I0309 13:23:03.690182 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:23:03 crc kubenswrapper[4764]: I0309 13:23:03.690445 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:23:03 crc kubenswrapper[4764]: I0309 13:23:03.690457 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:23:03 crc kubenswrapper[4764]: I0309 13:23:03.690472 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:23:03 crc kubenswrapper[4764]: I0309 13:23:03.690487 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:23:03Z","lastTransitionTime":"2026-03-09T13:23:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:23:03 crc kubenswrapper[4764]: E0309 13:23:03.703295 4764 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:23:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:23:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:23:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:23:03Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:23:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:23:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:23:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:23:03Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"44d5d5b8-3cb1-4753-9ed2-c6ede7c42d06\\\",\\\"systemUUID\\\":\\\"470a86bd-e6aa-42c1-b220-b7b8c0289210\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:03Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:03 crc kubenswrapper[4764]: E0309 13:23:03.703427 4764 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 09 13:23:04 crc kubenswrapper[4764]: I0309 13:23:04.559518 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 13:23:04 crc kubenswrapper[4764]: I0309 13:23:04.559545 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 13:23:04 crc kubenswrapper[4764]: E0309 13:23:04.559666 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 13:23:04 crc kubenswrapper[4764]: I0309 13:23:04.559562 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 13:23:04 crc kubenswrapper[4764]: E0309 13:23:04.559767 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 13:23:04 crc kubenswrapper[4764]: E0309 13:23:04.559897 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 13:23:05 crc kubenswrapper[4764]: I0309 13:23:05.559184 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wkwdz" Mar 09 13:23:05 crc kubenswrapper[4764]: E0309 13:23:05.559349 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-wkwdz" podUID="6597fc34-10ee-4984-9c69-f4b7c0d46e2a" Mar 09 13:23:05 crc kubenswrapper[4764]: I0309 13:23:05.576563 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a8a532f3-0fa0-4125-aeef-fd19fc524647\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2756a837bdc4537e5cf1848d4c91935799827b254edac2f5a67cf8ff7ce886a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.1
68.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4166a7efb6711d50164acbe2944b72105acb42653213963d1e52516f0c43f982\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4166a7efb6711d50164acbe2944b72105acb42653213963d1e52516f0c43f982\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:20:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:05Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:05 crc kubenswrapper[4764]: I0309 13:23:05.590638 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6bcdd179-43c2-427c-9fac-7155c122e922\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://83cca42acbb21422d7ee12f4e9824007c225a563a00ac3769c638770e404ce7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25jms\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a541959fd435ba385dfa711ec03b7d78fb75528f
ed89a7bd3620aa50fbab26ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25jms\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:28Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xxczl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:05Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:05 crc kubenswrapper[4764]: I0309 13:23:05.608555 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:05Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:05 crc kubenswrapper[4764]: I0309 13:23:05.621552 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cc78a2e5-6620-4ea8-8bc7-c62bcc3348cd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e22c92297ebb9853ff582f519aa4c98c1468e757b53b1245f7a7ab0f9fe25344\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://050547213bbff923b18f34b1196952d038bab09162ab803da36c99fa3f769dd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5819fd898c59b4cf866334e40822b6142b014a2e11f4ca75b98d549665b1575a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f97e59c94171eedb27889b9cb72fa1b241a5e09c23310576dfd844db1bfc2292\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://f97e59c94171eedb27889b9cb72fa1b241a5e09c23310576dfd844db1bfc2292\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:20:46Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:20:45Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:05Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:05 crc kubenswrapper[4764]: I0309 13:23:05.638141 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:05Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:05 crc kubenswrapper[4764]: E0309 13:23:05.643486 4764 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 09 13:23:05 crc kubenswrapper[4764]: I0309 13:23:05.658359 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://acc019f66fb28c4b9a901a359ecf5e127e643482a864c4d9da494a62286c0e3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:05Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:05 crc kubenswrapper[4764]: I0309 13:23:05.673674 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:05Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:05 crc kubenswrapper[4764]: I0309 13:23:05.694539 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7c21ab2-1820-47de-a61d-71d81928564a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://482469a933d0aba722e6a341f6934669e0d84998bd2c99d6962a17548ab80ce7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3571aa0b654db21a79a11d6c58af9f858d2493283d9eff7a84f3404c34dcf0b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4aae266a5256be6c65a7637dfaed552087689f317f7b61dcb52f4aa6d4d4d7e6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e04ceb7d287648fc77742a0a1b4cf13a56f634301b703291340961730fc73811\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://033930947dc527026e2797f16a6dd9181c8c7d8ec88b09031a132f55b953c356\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T13:21:44Z\\\"
,\\\"message\\\":\\\"W0309 13:21:43.803612 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0309 13:21:43.803959 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773062503 cert, and key in /tmp/serving-cert-1644690243/serving-signer.crt, /tmp/serving-cert-1644690243/serving-signer.key\\\\nI0309 13:21:44.168505 1 observer_polling.go:159] Starting file observer\\\\nW0309 13:21:44.177360 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0309 13:21:44.177910 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0309 13:21:44.178828 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1644690243/tls.crt::/tmp/serving-cert-1644690243/tls.key\\\\\\\"\\\\nF0309 13:21:44.724786 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:43Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3a34e47ede9b05cec931ca6337a98381bcceeb735352702d19128e8dfe8476f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:48Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a38de1b3854f33b5728435571b8f0df92aeaae7d8e8a235ed58044fa2a9e09c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a38de1b3854f33b5728435571b8f0df92aeaae7d8e8a235ed58044fa2a9e09c3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:46Z\\\",\\\"reason\\\":\\\"Completed\\\",
\\\"startedAt\\\":\\\"2026-03-09T13:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:20:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:05Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:05 crc kubenswrapper[4764]: I0309 13:23:05.711197 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4dd3c85ee070f78d4d30b9ccad3da35f82313620fffdaf69c3424865843dfa39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"last
State\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5280ec2d6cc95dbda0281276e900fa2c9d43170f932966688440a55c1deb454\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:05Z is after 2025-08-24T17:21:41Z" 
Mar 09 13:23:05 crc kubenswrapper[4764]: I0309 13:23:05.725140 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-crvdf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"072442e6-8ece-4f72-a8cb-ad7ef1e3facb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5c773d25fbc4d390f6da4699cdfa6bac3745eb18288fd5bb885b9a4dca9a519\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostI
P\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://efdc5099d0e35ffc801fb7fb52979c62ed875cd19c8d0b015350b4906e6ea61f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://efdc5099d0e35ffc801fb7fb52979c62ed875cd19c8d0b015350b4906e6ea61f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c57a862615a1748c2abb6ae6147d48398f8246d030b1527963f77743d8d802a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c57a8626
15a1748c2abb6ae6147d48398f8246d030b1527963f77743d8d802a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:22:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a96bc81c6b97f1f03d9cb304263ff47034248b9949771825e1813c284032b91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a96bc81c6b97f1f03d9cb304263ff47034248b9949771825e1813c284032b91\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:22:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mount
Path\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://885b4c9d85cdb3705865fae2824c36e6440c602deaaf1345fffa0eda8de42a96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://885b4c9d85cdb3705865fae2824c36e6440c602deaaf1345fffa0eda8de42a96\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:22:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ff7f6fa22134725b00be193ba8f58c9ad083335c7f795fa87a1cad202a2496f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\
":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2ff7f6fa22134725b00be193ba8f58c9ad083335c7f795fa87a1cad202a2496f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:22:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba4e3985b08c9b6913c4e31f9645f8544fb5f7c0cbd75d116618fd9e479588d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ba4e3985b08c9b6913c4e31f9645f8544fb5f7c0cbd75d116618fd9e479588d0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:22:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\"
,\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-crvdf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:05Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:05 crc kubenswrapper[4764]: I0309 13:23:05.745928 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7kggv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b8ccb4f5-550a-41b2-b39d-201cdd5d902a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9713dea51462b7a0ce8cb68a64d8707b76dfca5f8c770cac3b54339d538dec01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://21feffb4bef29e65989455ec8c5a040a53de90f01ede1a9f1848a67f64a35d29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6859bb282db6608def0d7983e715cf3f5f5f454bf896bb21dca124a075e5701d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20f0f31508039b62436cf5b317419ffc5ca18d03050bfa830060f40e12684519\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:45Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ceb1c19d0ff9c03c8790b54d5dc48bafd77f6c289fc6225a88d6bfb8e04a8eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ad4c0df0c830a9ef102a630db33f055925c0b9b1734a792be013f8e261523c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74e2e4b90f919f70c2ae2842193be603c8759f2203043d122ff14addcc7c6de2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://74e2e4b90f919f70c2ae2842193be603c8759f2203043d122ff14addcc7c6de2\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-09T13:22:54Z\\\",\\\"message\\\":\\\"services_controller.go:444] Built service openshift-operator-lifecycle-manager/package-server-manager-metrics LB per-node configs for network=default: []services.lbConfig(nil)\\\\nI0309 13:22:54.899454 6914 services_controller.go:445] Built service openshift-operator-lifecycle-manager/package-server-manager-metrics LB template configs for 
network=default: []services.lbConfig(nil)\\\\nI0309 13:22:54.899473 6914 services_controller.go:356] Processing sync for service openshift-machine-api/machine-api-operator-webhook for network=default\\\\nI0309 13:22:54.899513 6914 ovnkube.go:599] Stopped ovnkube\\\\nI0309 13:22:54.899390 6914 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-oauth-apiserver/api\\\\\\\"}\\\\nI0309 13:22:54.899576 6914 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI0309 13:22:54.899578 6914 services_controller.go:360] Finished syncing service api on namespace openshift-oauth-apiserver for network=default : 1.726105ms\\\\nI0309 13:22:54.899624 6914 services_controller.go:356] Processing sync for service openshift-machine-config-operator/machine-config-daemon for network=default\\\\nF0309 13:22:54.899660 6914 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:54Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-7kggv_openshift-ovn-kubernetes(b8ccb4f5-550a-41b2-b39d-201cdd5d902a)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://df799033cab3d457eef490d885e4440024ba2c7daa430d489ee9907d908d1017\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd65db46dfb71ce3080adad69890c363fb149c5bf12c903e66a212a28888ae66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd65db46dfb71ce308
0adad69890c363fb149c5bf12c903e66a212a28888ae66\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:22:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7kggv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:05Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:05 crc kubenswrapper[4764]: I0309 13:23:05.761820 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-hbmjc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7c2b6620-c8b8-47cf-9f15-883dbf8e34cc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d8a2c962bc242865fd8c0fce8fc21f9e385e38cc9b2892663a82c02dfe6cac9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z4sdq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:34Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-hbmjc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:05Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:05 crc kubenswrapper[4764]: I0309 13:23:05.778555 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hzjxs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"30a92c50-fe51-40d9-a69c-4b5fd722bfc6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73df6317c77ae8a47acf57b1fdb2710ccefa68c346011b5a98d5a38c775c9727\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdm9q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://10aea1054bea324b596bc250a9d3f7010db0636732f622c0674db83b1f245098\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdm9q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-hzjxs\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:05Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:05 crc kubenswrapper[4764]: I0309 13:23:05.792879 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-wkwdz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6597fc34-10ee-4984-9c69-f4b7c0d46e2a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gsbrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gsbrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:40Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-wkwdz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:05Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:05 crc 
kubenswrapper[4764]: I0309 13:23:05.814016 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b3bffb853598e5fea63b5302816325c29f740c9e9e2a7f0ee4f87e810045f97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:05Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:05 crc kubenswrapper[4764]: I0309 13:23:05.825035 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-r5bnx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7fede188-66d9-4cb1-af19-c94afe7fbcde\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a63a85aa57ac9496c6745b881ceba7015f610b3ce31c17513724f036bea999f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pjp5c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:27Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-r5bnx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:05Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:05 crc kubenswrapper[4764]: I0309 13:23:05.838082 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zmzm7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"202a1f58-ce83-4374-ac48-dc806f7b9d6b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://05cbc20845591e82618289ff2509634cc5e2d05362e8414b3ff91a73d0719331\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rk5pb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zmzm7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:05Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:06 crc kubenswrapper[4764]: I0309 13:23:06.559081 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 13:23:06 crc kubenswrapper[4764]: I0309 13:23:06.559217 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 13:23:06 crc kubenswrapper[4764]: E0309 13:23:06.559295 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 13:23:06 crc kubenswrapper[4764]: E0309 13:23:06.559487 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 13:23:06 crc kubenswrapper[4764]: I0309 13:23:06.559763 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 13:23:06 crc kubenswrapper[4764]: E0309 13:23:06.559962 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 13:23:07 crc kubenswrapper[4764]: I0309 13:23:07.559454 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wkwdz" Mar 09 13:23:07 crc kubenswrapper[4764]: E0309 13:23:07.559606 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wkwdz" podUID="6597fc34-10ee-4984-9c69-f4b7c0d46e2a" Mar 09 13:23:07 crc kubenswrapper[4764]: I0309 13:23:07.574854 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Mar 09 13:23:08 crc kubenswrapper[4764]: I0309 13:23:08.559688 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 13:23:08 crc kubenswrapper[4764]: I0309 13:23:08.559689 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 13:23:08 crc kubenswrapper[4764]: E0309 13:23:08.559842 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 13:23:08 crc kubenswrapper[4764]: I0309 13:23:08.559700 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 13:23:08 crc kubenswrapper[4764]: E0309 13:23:08.559998 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 13:23:08 crc kubenswrapper[4764]: E0309 13:23:08.560083 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 13:23:09 crc kubenswrapper[4764]: I0309 13:23:09.558930 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wkwdz" Mar 09 13:23:09 crc kubenswrapper[4764]: E0309 13:23:09.559160 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wkwdz" podUID="6597fc34-10ee-4984-9c69-f4b7c0d46e2a" Mar 09 13:23:10 crc kubenswrapper[4764]: I0309 13:23:10.558934 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 13:23:10 crc kubenswrapper[4764]: E0309 13:23:10.559773 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 13:23:10 crc kubenswrapper[4764]: I0309 13:23:10.559045 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 13:23:10 crc kubenswrapper[4764]: E0309 13:23:10.559967 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 13:23:10 crc kubenswrapper[4764]: I0309 13:23:10.558990 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 13:23:10 crc kubenswrapper[4764]: E0309 13:23:10.560227 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 13:23:10 crc kubenswrapper[4764]: E0309 13:23:10.645346 4764 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 09 13:23:11 crc kubenswrapper[4764]: I0309 13:23:11.559261 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wkwdz" Mar 09 13:23:11 crc kubenswrapper[4764]: E0309 13:23:11.560181 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wkwdz" podUID="6597fc34-10ee-4984-9c69-f4b7c0d46e2a" Mar 09 13:23:12 crc kubenswrapper[4764]: I0309 13:23:12.529977 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6597fc34-10ee-4984-9c69-f4b7c0d46e2a-metrics-certs\") pod \"network-metrics-daemon-wkwdz\" (UID: \"6597fc34-10ee-4984-9c69-f4b7c0d46e2a\") " pod="openshift-multus/network-metrics-daemon-wkwdz" Mar 09 13:23:12 crc kubenswrapper[4764]: E0309 13:23:12.530160 4764 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 09 13:23:12 crc kubenswrapper[4764]: E0309 13:23:12.530258 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6597fc34-10ee-4984-9c69-f4b7c0d46e2a-metrics-certs podName:6597fc34-10ee-4984-9c69-f4b7c0d46e2a nodeName:}" failed. 
No retries permitted until 2026-03-09 13:23:44.530238106 +0000 UTC m=+179.780410014 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/6597fc34-10ee-4984-9c69-f4b7c0d46e2a-metrics-certs") pod "network-metrics-daemon-wkwdz" (UID: "6597fc34-10ee-4984-9c69-f4b7c0d46e2a") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 09 13:23:12 crc kubenswrapper[4764]: I0309 13:23:12.558795 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 13:23:12 crc kubenswrapper[4764]: I0309 13:23:12.558795 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 13:23:12 crc kubenswrapper[4764]: I0309 13:23:12.558845 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 13:23:12 crc kubenswrapper[4764]: E0309 13:23:12.559247 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 13:23:12 crc kubenswrapper[4764]: E0309 13:23:12.559304 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 13:23:12 crc kubenswrapper[4764]: E0309 13:23:12.559352 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 13:23:12 crc kubenswrapper[4764]: I0309 13:23:12.559524 4764 scope.go:117] "RemoveContainer" containerID="74e2e4b90f919f70c2ae2842193be603c8759f2203043d122ff14addcc7c6de2" Mar 09 13:23:13 crc kubenswrapper[4764]: I0309 13:23:13.124500 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7kggv_b8ccb4f5-550a-41b2-b39d-201cdd5d902a/ovnkube-controller/1.log" Mar 09 13:23:13 crc kubenswrapper[4764]: I0309 13:23:13.127496 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7kggv" event={"ID":"b8ccb4f5-550a-41b2-b39d-201cdd5d902a","Type":"ContainerStarted","Data":"850c5b772d45fee8f95d82796bdd0694bc5d2829f80a8e826bd3a428943f08f6"} Mar 09 13:23:13 crc kubenswrapper[4764]: I0309 13:23:13.127925 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-7kggv" Mar 09 13:23:13 crc kubenswrapper[4764]: I0309 13:23:13.144493 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7kggv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b8ccb4f5-550a-41b2-b39d-201cdd5d902a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9713dea51462b7a0ce8cb68a64d8707b76dfca5f8c770cac3b54339d538dec01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://21feffb4bef29e65989455ec8c5a040a53de90f01ede1a9f1848a67f64a35d29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6859bb282db6608def0d7983e715cf3f5f5f454bf896bb21dca124a075e5701d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20f0f31508039b62436cf5b317419ffc5ca18d03050bfa830060f40e12684519\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:45Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ceb1c19d0ff9c03c8790b54d5dc48bafd77f6c289fc6225a88d6bfb8e04a8eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ad4c0df0c830a9ef102a630db33f055925c0b9b1734a792be013f8e261523c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://850c5b772d45fee8f95d82796bdd0694bc5d2829f80a8e826bd3a428943f08f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://74e2e4b90f919f70c2ae2842193be603c8759f2203043d122ff14addcc7c6de2\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-09T13:22:54Z\\\",\\\"message\\\":\\\"services_controller.go:444] Built service openshift-operator-lifecycle-manager/package-server-manager-metrics LB per-node configs for network=default: []services.lbConfig(nil)\\\\nI0309 13:22:54.899454 6914 services_controller.go:445] Built service openshift-operator-lifecycle-manager/package-server-manager-metrics LB template configs for 
network=default: []services.lbConfig(nil)\\\\nI0309 13:22:54.899473 6914 services_controller.go:356] Processing sync for service openshift-machine-api/machine-api-operator-webhook for network=default\\\\nI0309 13:22:54.899513 6914 ovnkube.go:599] Stopped ovnkube\\\\nI0309 13:22:54.899390 6914 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-oauth-apiserver/api\\\\\\\"}\\\\nI0309 13:22:54.899576 6914 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI0309 13:22:54.899578 6914 services_controller.go:360] Finished syncing service api on namespace openshift-oauth-apiserver for network=default : 1.726105ms\\\\nI0309 13:22:54.899624 6914 services_controller.go:356] Processing sync for service openshift-machine-config-operator/machine-config-daemon for network=default\\\\nF0309 13:22:54.899660 6914 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create 
\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:54Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:23:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":
\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://df799033cab3d457eef490d885e4440024ba2c7daa430d489ee9907d908d1017\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd65db46dfb71ce3080adad69890c363fb149c5bf12c903e66a212a28888ae66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\"
:true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd65db46dfb71ce3080adad69890c363fb149c5bf12c903e66a212a28888ae66\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:22:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7kggv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:13Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:13 crc kubenswrapper[4764]: I0309 13:23:13.153923 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-hbmjc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7c2b6620-c8b8-47cf-9f15-883dbf8e34cc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d8a2c962bc242865fd8c0fce8fc21f9e385e38cc9b2892663a82c02dfe6cac9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z4sdq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:34Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-hbmjc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:13Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:13 crc kubenswrapper[4764]: I0309 13:23:13.164932 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hzjxs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"30a92c50-fe51-40d9-a69c-4b5fd722bfc6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73df6317c77ae8a47acf57b1fdb2710ccefa68c346011b5a98d5a38c775c9727\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdm9q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://10aea1054bea324b596bc250a9d3f7010db0636732f622c0674db83b1f245098\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdm9q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-hzjxs\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:13Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:13 crc kubenswrapper[4764]: I0309 13:23:13.175557 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-wkwdz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6597fc34-10ee-4984-9c69-f4b7c0d46e2a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gsbrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gsbrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:40Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-wkwdz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:13Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:13 crc 
kubenswrapper[4764]: I0309 13:23:13.186907 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b3bffb853598e5fea63b5302816325c29f740c9e9e2a7f0ee4f87e810045f97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:13Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:13 crc kubenswrapper[4764]: I0309 13:23:13.197126 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-r5bnx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7fede188-66d9-4cb1-af19-c94afe7fbcde\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a63a85aa57ac9496c6745b881ceba7015f610b3ce31c17513724f036bea999f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pjp5c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:27Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-r5bnx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:13Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:13 crc kubenswrapper[4764]: I0309 13:23:13.208518 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zmzm7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"202a1f58-ce83-4374-ac48-dc806f7b9d6b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://05cbc20845591e82618289ff2509634cc5e2d05362e8414b3ff91a73d0719331\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rk5pb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zmzm7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:13Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:13 crc kubenswrapper[4764]: I0309 13:23:13.220446 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-crvdf" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"072442e6-8ece-4f72-a8cb-ad7ef1e3facb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5c773d25fbc4d390f6da4699cdfa6bac3745eb18288fd5bb885b9a4dca9a519\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://efdc5099d0e35ffc801fb7fb52979c62ed875cd19c8d0b015
350b4906e6ea61f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://efdc5099d0e35ffc801fb7fb52979c62ed875cd19c8d0b015350b4906e6ea61f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c57a862615a1748c2abb6ae6147d48398f8246d030b1527963f77743d8d802a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c57a862615a1748c2abb6ae6147d48398f8246d030b1527963f77743d8d802a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:22:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:
22:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a96bc81c6b97f1f03d9cb304263ff47034248b9949771825e1813c284032b91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a96bc81c6b97f1f03d9cb304263ff47034248b9949771825e1813c284032b91\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:22:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\
\\"cri-o://885b4c9d85cdb3705865fae2824c36e6440c602deaaf1345fffa0eda8de42a96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://885b4c9d85cdb3705865fae2824c36e6440c602deaaf1345fffa0eda8de42a96\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:22:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ff7f6fa22134725b00be193ba8f58c9ad083335c7f795fa87a1cad202a2496f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2ff7f6fa22134725b00be193ba8f58c9ad083335c7f795fa87a1cad202a2496f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:22:46Z\\\",\\\"r
eason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba4e3985b08c9b6913c4e31f9645f8544fb5f7c0cbd75d116618fd9e479588d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ba4e3985b08c9b6913c4e31f9645f8544fb5f7c0cbd75d116618fd9e479588d0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:22:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-crvdf\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:13Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:13 crc kubenswrapper[4764]: I0309 13:23:13.228495 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a8a532f3-0fa0-4125-aeef-fd19fc524647\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2756a837bdc4537e5cf1848d4c91935799827b254edac2f5a67cf8ff7ce886a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}},\\
\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4166a7efb6711d50164acbe2944b72105acb42653213963d1e52516f0c43f982\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4166a7efb6711d50164acbe2944b72105acb42653213963d1e52516f0c43f982\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:20:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:13Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:13 crc kubenswrapper[4764]: I0309 13:23:13.238180 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6bcdd179-43c2-427c-9fac-7155c122e922\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://83cca42acbb21422d7ee12f4e9824007c225a563a00ac3769c638770e404ce7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25jms\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a541959fd435ba385dfa711ec03b7d78fb75528f
ed89a7bd3620aa50fbab26ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25jms\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:28Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xxczl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:13Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:13 crc kubenswrapper[4764]: I0309 13:23:13.246667 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cc78a2e5-6620-4ea8-8bc7-c62bcc3348cd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e22c92297ebb9853ff582f519aa4c98c1468e757b53b1245f7a7ab0f9fe25344\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://050547213bbff923b18f34b1196952d038bab09162ab803da36c99fa3f769dd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5819fd898c59b4cf866334e40822b6142b014a2e11f4ca75b98d549665b1575a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f97e59c94171eedb27889b9cb72fa1b241a5e09c23310576dfd844db1bfc2292\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://f97e59c94171eedb27889b9cb72fa1b241a5e09c23310576dfd844db1bfc2292\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:20:46Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:20:45Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:13Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:13 crc kubenswrapper[4764]: I0309 13:23:13.258589 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:13Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:13 crc kubenswrapper[4764]: I0309 13:23:13.271275 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:13Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:13 crc kubenswrapper[4764]: I0309 13:23:13.284838 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:13Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:13 crc kubenswrapper[4764]: I0309 13:23:13.301406 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7c21ab2-1820-47de-a61d-71d81928564a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://482469a933d0aba722e6a341f6934669e0d84998bd2c99d6962a17548ab80ce7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3571aa0b654db21a79a11d6c58af9f858d2493283d9eff7a84f3404c34dcf0b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4aae266a5256be6c65a7637dfaed552087689f317f7b61dcb52f4aa6d4d4d7e6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e04ceb7d287648fc77742a0a1b4cf13a56f634301b703291340961730fc73811\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://033930947dc527026e2797f16a6dd9181c8c7d8ec88b09031a132f55b953c356\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T13:21:44Z\\\"
,\\\"message\\\":\\\"W0309 13:21:43.803612 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0309 13:21:43.803959 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773062503 cert, and key in /tmp/serving-cert-1644690243/serving-signer.crt, /tmp/serving-cert-1644690243/serving-signer.key\\\\nI0309 13:21:44.168505 1 observer_polling.go:159] Starting file observer\\\\nW0309 13:21:44.177360 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0309 13:21:44.177910 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0309 13:21:44.178828 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1644690243/tls.crt::/tmp/serving-cert-1644690243/tls.key\\\\\\\"\\\\nF0309 13:21:44.724786 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:43Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3a34e47ede9b05cec931ca6337a98381bcceeb735352702d19128e8dfe8476f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:48Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a38de1b3854f33b5728435571b8f0df92aeaae7d8e8a235ed58044fa2a9e09c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a38de1b3854f33b5728435571b8f0df92aeaae7d8e8a235ed58044fa2a9e09c3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:46Z\\\",\\\"reason\\\":\\\"Completed\\\",
\\\"startedAt\\\":\\\"2026-03-09T13:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:20:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:13Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:13 crc kubenswrapper[4764]: I0309 13:23:13.312686 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9db73dd7-a619-4291-a559-aab80ef8e067\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98efc0508ec5c94130c0707470eed2e0ae1117e57a8139af476e03a4bc67e788\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c0737
2b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://698e8e53f96938b60a1aefe2ccea945879623e79d8b0f8836a67cb38ec85918f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T13:21:46Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0309 13:21:16.528766 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. 
Worst graceful lease acquisition is {26s}.\\\\nI0309 13:21:16.529708 1 observer_polling.go:159] Starting file observer\\\\nI0309 13:21:16.533102 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0309 13:21:16.533732 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0309 13:21:42.646426 1 cert_rotation.go:91] certificate rotation detected, shutting down client connections to start using new credentials\\\\nI0309 13:21:46.099155 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0309 13:21:46.099249 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:16Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7857b72b1425eeefd8acf226dd6f0d6c783e14267fcfb79dd038118573a2b26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:46Z\\\"}},\\\"v
olumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4cbb681d47f7be8b418b159d390e6857a8a1f5cff40ac621c892019ceef7828\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5476061465c7cd40b23fd92e34348e1889754bbfabe1e2bc6419637cf70604d5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"host
IPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:20:45Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:13Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:13 crc kubenswrapper[4764]: I0309 13:23:13.323112 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4dd3c85ee070f78d4d30b9ccad3da35f82313620fffdaf69c3424865843dfa39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"star
ted\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5280ec2d6cc95dbda0281276e900fa2c9d43170f932966688440a55c1deb454\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:13Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:13 crc kubenswrapper[4764]: I0309 13:23:13.333733 4764 status_manager.go:875] 
"Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://acc019f66fb28c4b9a901a359ecf5e127e643482a864c4d9da494a62286c0e3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:13Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:13 crc kubenswrapper[4764]: I0309 13:23:13.559059 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wkwdz" Mar 09 13:23:13 crc kubenswrapper[4764]: E0309 13:23:13.559789 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wkwdz" podUID="6597fc34-10ee-4984-9c69-f4b7c0d46e2a" Mar 09 13:23:13 crc kubenswrapper[4764]: I0309 13:23:13.573538 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Mar 09 13:23:13 crc kubenswrapper[4764]: I0309 13:23:13.759274 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:23:13 crc kubenswrapper[4764]: I0309 13:23:13.759314 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:23:13 crc kubenswrapper[4764]: I0309 13:23:13.759325 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:23:13 crc kubenswrapper[4764]: I0309 13:23:13.759342 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:23:13 crc kubenswrapper[4764]: I0309 13:23:13.759355 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:23:13Z","lastTransitionTime":"2026-03-09T13:23:13Z","reason":"KubeletNotReady","message":"container runtime network 
not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:23:13 crc kubenswrapper[4764]: E0309 13:23:13.772169 4764 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:23:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:23:13Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:23:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:23:13Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:23:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:23:13Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:23:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:23:13Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"44d5d5b8-3cb1-4753-9ed2-c6ede7c42d06\\\",\\\"systemUUID\\\":\\\"470a86bd-e6aa-42c1-b220-b7b8c0289210\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:13Z is after 2025-08-24T17:21:41Z"
Mar 09 13:23:13 crc kubenswrapper[4764]: I0309 13:23:13.776168 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 13:23:13 crc kubenswrapper[4764]: I0309 13:23:13.776207 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 13:23:13 crc kubenswrapper[4764]: I0309 13:23:13.776216 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 13:23:13 crc kubenswrapper[4764]: I0309 13:23:13.776232 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 09 13:23:13 crc kubenswrapper[4764]: I0309 13:23:13.776241 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:23:13Z","lastTransitionTime":"2026-03-09T13:23:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:23:13 crc kubenswrapper[4764]: E0309 13:23:13.788149 4764 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:23:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:23:13Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:23:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:23:13Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:23:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:23:13Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:23:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:23:13Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"44d5d5b8-3cb1-4753-9ed2-c6ede7c42d06\\\",\\\"systemUUID\\\":\\\"470a86bd-e6aa-42c1-b220-b7b8c0289210\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:13Z is after 2025-08-24T17:21:41Z"
Mar 09 13:23:13 crc kubenswrapper[4764]: I0309 13:23:13.791871 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 13:23:13 crc kubenswrapper[4764]: I0309 13:23:13.791908 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 13:23:13 crc kubenswrapper[4764]: I0309 13:23:13.791919 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 13:23:13 crc kubenswrapper[4764]: I0309 13:23:13.791936 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 09 13:23:13 crc kubenswrapper[4764]: I0309 13:23:13.791946 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:23:13Z","lastTransitionTime":"2026-03-09T13:23:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:23:13 crc kubenswrapper[4764]: E0309 13:23:13.802465 4764 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:23:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:23:13Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:23:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:23:13Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:23:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:23:13Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:23:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:23:13Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"44d5d5b8-3cb1-4753-9ed2-c6ede7c42d06\\\",\\\"systemUUID\\\":\\\"470a86bd-e6aa-42c1-b220-b7b8c0289210\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:13Z is after 2025-08-24T17:21:41Z"
Mar 09 13:23:13 crc kubenswrapper[4764]: I0309 13:23:13.805600 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 13:23:13 crc kubenswrapper[4764]: I0309 13:23:13.805663 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 13:23:13 crc kubenswrapper[4764]: I0309 13:23:13.805681 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 13:23:13 crc kubenswrapper[4764]: I0309 13:23:13.805702 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 09 13:23:13 crc kubenswrapper[4764]: I0309 13:23:13.805714 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:23:13Z","lastTransitionTime":"2026-03-09T13:23:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:23:13 crc kubenswrapper[4764]: E0309 13:23:13.816426 4764 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:23:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:23:13Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:23:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:23:13Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:23:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:23:13Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:23:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:23:13Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"44d5d5b8-3cb1-4753-9ed2-c6ede7c42d06\\\",\\\"systemUUID\\\":\\\"470a86bd-e6aa-42c1-b220-b7b8c0289210\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:13Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:13 crc kubenswrapper[4764]: I0309 13:23:13.819111 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:23:13 crc kubenswrapper[4764]: I0309 13:23:13.819141 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:23:13 crc kubenswrapper[4764]: I0309 13:23:13.819152 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:23:13 crc kubenswrapper[4764]: I0309 13:23:13.819170 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:23:13 crc kubenswrapper[4764]: I0309 13:23:13.819180 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:23:13Z","lastTransitionTime":"2026-03-09T13:23:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:23:13 crc kubenswrapper[4764]: E0309 13:23:13.830001 4764 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:23:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:23:13Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:23:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:23:13Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:23:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:23:13Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:23:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:23:13Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"44d5d5b8-3cb1-4753-9ed2-c6ede7c42d06\\\",\\\"systemUUID\\\":\\\"470a86bd-e6aa-42c1-b220-b7b8c0289210\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:13Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:13 crc kubenswrapper[4764]: E0309 13:23:13.830132 4764 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 09 13:23:14 crc kubenswrapper[4764]: I0309 13:23:14.131888 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7kggv_b8ccb4f5-550a-41b2-b39d-201cdd5d902a/ovnkube-controller/2.log" Mar 09 13:23:14 crc kubenswrapper[4764]: I0309 13:23:14.133135 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7kggv_b8ccb4f5-550a-41b2-b39d-201cdd5d902a/ovnkube-controller/1.log" Mar 09 13:23:14 crc kubenswrapper[4764]: I0309 13:23:14.136345 4764 generic.go:334] "Generic (PLEG): container finished" podID="b8ccb4f5-550a-41b2-b39d-201cdd5d902a" containerID="850c5b772d45fee8f95d82796bdd0694bc5d2829f80a8e826bd3a428943f08f6" exitCode=1 Mar 09 13:23:14 crc kubenswrapper[4764]: I0309 13:23:14.136435 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7kggv" event={"ID":"b8ccb4f5-550a-41b2-b39d-201cdd5d902a","Type":"ContainerDied","Data":"850c5b772d45fee8f95d82796bdd0694bc5d2829f80a8e826bd3a428943f08f6"} Mar 09 13:23:14 crc kubenswrapper[4764]: I0309 13:23:14.136506 4764 scope.go:117] "RemoveContainer" containerID="74e2e4b90f919f70c2ae2842193be603c8759f2203043d122ff14addcc7c6de2" Mar 09 13:23:14 crc kubenswrapper[4764]: I0309 13:23:14.137842 4764 scope.go:117] "RemoveContainer" containerID="850c5b772d45fee8f95d82796bdd0694bc5d2829f80a8e826bd3a428943f08f6" Mar 09 13:23:14 crc kubenswrapper[4764]: E0309 13:23:14.138157 4764 pod_workers.go:1301] "Error 
syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-7kggv_openshift-ovn-kubernetes(b8ccb4f5-550a-41b2-b39d-201cdd5d902a)\"" pod="openshift-ovn-kubernetes/ovnkube-node-7kggv" podUID="b8ccb4f5-550a-41b2-b39d-201cdd5d902a" Mar 09 13:23:14 crc kubenswrapper[4764]: I0309 13:23:14.154884 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:14Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:14 crc kubenswrapper[4764]: I0309 13:23:14.167601 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:14Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:14 crc kubenswrapper[4764]: I0309 13:23:14.189918 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1507e859-72ea-4c35-a5a7-6d0f48b19b09\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://289f145eb4f8412759fdbbc747534a08534b2f848f18206d9f9c2d03adc86944\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c220cbb5e42a64e54a853893cbdd6e5502e5f1e5cd7fa5aa7fffc3e27c5b2d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://98462dd9c7c97ff5402911d179c7739d92d293c4abacea1ed4a394ec9f33dfed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4039a3daa857df7f6171121f2a936d4893913c8374ad7229c758dd93a8821386\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a9b753c5c47533ae4771e38e6ce4f7a3822bc208557dda27fd1b525c76044f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://36c569656352f45737e634caa991a17028b70ef88ec9449a7f22edd270d63443\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://36c569656352f45737e634caa991a17028b70ef88ec9449a7f22edd270d63443\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-09T13:20:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://573ffaefe818651692cd854f41bd67b933fa6496e3a833873b40c2324b3a09e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://573ffaefe818651692cd854f41bd67b933fa6496e3a833873b40c2324b3a09e1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ae08be6456f1300abf253a9801e2585bed0bb928045bcb96921efa4b5d0200eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae08be6456f1300abf253a9801e2585bed0bb928045bcb96921efa4b5d0200eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:20:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:20:45Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:14Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:14 crc kubenswrapper[4764]: I0309 13:23:14.204517 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cc78a2e5-6620-4ea8-8bc7-c62bcc3348cd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e22c92297ebb9853ff582f519aa4c98c1468e757b53b1245f7a7ab0f9fe25344\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/ope
nshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://050547213bbff923b18f34b1196952d038bab09162ab803da36c99fa3f769dd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5819fd898c59b4cf866334e40822b6142b014a2e11f4ca75b98d549665b1575a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kub
ernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f97e59c94171eedb27889b9cb72fa1b241a5e09c23310576dfd844db1bfc2292\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f97e59c94171eedb27889b9cb72fa1b241a5e09c23310576dfd844db1bfc2292\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:20:46Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:20:45Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:14Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:14 crc kubenswrapper[4764]: I0309 13:23:14.221980 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4dd3c85ee070f78d4d30b9ccad3da35f82313620fffdaf69c3424865843dfa39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5280ec2d6cc95dbda0281276e900fa2c9d43170f932966688440a55c1deb454\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:14Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:14 crc kubenswrapper[4764]: I0309 13:23:14.236329 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://acc019f66fb28c4b9a901a359ecf5e127e643482a864c4d9da494a62286c0e3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-09T13:23:14Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:14 crc kubenswrapper[4764]: I0309 13:23:14.251570 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:14Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:14 crc kubenswrapper[4764]: I0309 13:23:14.267379 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7c21ab2-1820-47de-a61d-71d81928564a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://482469a933d0aba722e6a341f6934669e0d84998bd2c99d6962a17548ab80ce7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3571aa0b654db21a79a11d6c58af9f858d2493283d9eff7a84f3404c34dcf0b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4aae266a5256be6c65a7637dfaed552087689f317f7b61dcb52f4aa6d4d4d7e6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e04ceb7d287648fc77742a0a1b4cf13a56f634301b703291340961730fc73811\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://033930947dc527026e2797f16a6dd9181c8c7d8ec88b09031a132f55b953c356\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T13:21:44Z\\\"
,\\\"message\\\":\\\"W0309 13:21:43.803612 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0309 13:21:43.803959 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773062503 cert, and key in /tmp/serving-cert-1644690243/serving-signer.crt, /tmp/serving-cert-1644690243/serving-signer.key\\\\nI0309 13:21:44.168505 1 observer_polling.go:159] Starting file observer\\\\nW0309 13:21:44.177360 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0309 13:21:44.177910 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0309 13:21:44.178828 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1644690243/tls.crt::/tmp/serving-cert-1644690243/tls.key\\\\\\\"\\\\nF0309 13:21:44.724786 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:43Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3a34e47ede9b05cec931ca6337a98381bcceeb735352702d19128e8dfe8476f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:48Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a38de1b3854f33b5728435571b8f0df92aeaae7d8e8a235ed58044fa2a9e09c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a38de1b3854f33b5728435571b8f0df92aeaae7d8e8a235ed58044fa2a9e09c3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:46Z\\\",\\\"reason\\\":\\\"Completed\\\",
\\\"startedAt\\\":\\\"2026-03-09T13:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:20:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:14Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:14 crc kubenswrapper[4764]: I0309 13:23:14.291407 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9db73dd7-a619-4291-a559-aab80ef8e067\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98efc0508ec5c94130c0707470eed2e0ae1117e57a8139af476e03a4bc67e788\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c0737
2b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://698e8e53f96938b60a1aefe2ccea945879623e79d8b0f8836a67cb38ec85918f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T13:21:46Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0309 13:21:16.528766 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. 
Worst graceful lease acquisition is {26s}.\\\\nI0309 13:21:16.529708 1 observer_polling.go:159] Starting file observer\\\\nI0309 13:21:16.533102 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0309 13:21:16.533732 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0309 13:21:42.646426 1 cert_rotation.go:91] certificate rotation detected, shutting down client connections to start using new credentials\\\\nI0309 13:21:46.099155 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0309 13:21:46.099249 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:16Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7857b72b1425eeefd8acf226dd6f0d6c783e14267fcfb79dd038118573a2b26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:46Z\\\"}},\\\"v
olumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4cbb681d47f7be8b418b159d390e6857a8a1f5cff40ac621c892019ceef7828\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5476061465c7cd40b23fd92e34348e1889754bbfabe1e2bc6419637cf70604d5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"host
IPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:20:45Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:14Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:14 crc kubenswrapper[4764]: I0309 13:23:14.312715 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zmzm7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"202a1f58-ce83-4374-ac48-dc806f7b9d6b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://05cbc20845591e82618289ff2509634cc5e2d05362e8414b3ff91a73d0719331\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b6
7b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rk5pb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"19
2.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zmzm7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:14Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:14 crc kubenswrapper[4764]: I0309 13:23:14.328637 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-crvdf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"072442e6-8ece-4f72-a8cb-ad7ef1e3facb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5c773d25fbc4d390f6da4699cdfa6bac3745eb18288fd5bb885b9a4dca9a519\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f
13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://efdc5099d0e35ffc801fb7fb52979c62ed875cd19c8d0b015350b4906e6ea61f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://efdc5099d0e35ffc801fb7fb52979c62ed875cd19c8d0b015350b4906e6ea61f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly
\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c57a862615a1748c2abb6ae6147d48398f8246d030b1527963f77743d8d802a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c57a862615a1748c2abb6ae6147d48398f8246d030b1527963f77743d8d802a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:22:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a96bc81c6b97f1f03d9cb304263ff47034248b9949771825e1813c284032b91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state
\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a96bc81c6b97f1f03d9cb304263ff47034248b9949771825e1813c284032b91\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:22:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://885b4c9d85cdb3705865fae2824c36e6440c602deaaf1345fffa0eda8de42a96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://885b4c9d85cdb3705865fae2824c36e6440c602deaaf1345fffa0eda8de42a96\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:22:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q
jrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ff7f6fa22134725b00be193ba8f58c9ad083335c7f795fa87a1cad202a2496f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2ff7f6fa22134725b00be193ba8f58c9ad083335c7f795fa87a1cad202a2496f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:22:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba4e3985b08c9b6913c4e31f9645f8544fb5f7c0cbd75d116618fd9e479588d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ba4e3985b08c9b6913c4e31f9645f8544fb5f7c0cbd7
5d116618fd9e479588d0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:22:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-crvdf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:14Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:14 crc kubenswrapper[4764]: I0309 13:23:14.347752 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7kggv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b8ccb4f5-550a-41b2-b39d-201cdd5d902a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9713dea51462b7a0ce8cb68a64d8707b76dfca5f8c770cac3b54339d538dec01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://21feffb4bef29e65989455ec8c5a040a53de90f01ede1a9f1848a67f64a35d29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6859bb282db6608def0d7983e715cf3f5f5f454bf896bb21dca124a075e5701d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20f0f31508039b62436cf5b317419ffc5ca18d03050bfa830060f40e12684519\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:45Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ceb1c19d0ff9c03c8790b54d5dc48bafd77f6c289fc6225a88d6bfb8e04a8eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ad4c0df0c830a9ef102a630db33f055925c0b9b1734a792be013f8e261523c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://850c5b772d45fee8f95d82796bdd0694bc5d2829f80a8e826bd3a428943f08f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://74e2e4b90f919f70c2ae2842193be603c8759f2203043d122ff14addcc7c6de2\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-09T13:22:54Z\\\",\\\"message\\\":\\\"services_controller.go:444] Built service openshift-operator-lifecycle-manager/package-server-manager-metrics LB per-node configs for network=default: []services.lbConfig(nil)\\\\nI0309 13:22:54.899454 6914 services_controller.go:445] Built service openshift-operator-lifecycle-manager/package-server-manager-metrics LB template configs for 
network=default: []services.lbConfig(nil)\\\\nI0309 13:22:54.899473 6914 services_controller.go:356] Processing sync for service openshift-machine-api/machine-api-operator-webhook for network=default\\\\nI0309 13:22:54.899513 6914 ovnkube.go:599] Stopped ovnkube\\\\nI0309 13:22:54.899390 6914 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-oauth-apiserver/api\\\\\\\"}\\\\nI0309 13:22:54.899576 6914 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI0309 13:22:54.899578 6914 services_controller.go:360] Finished syncing service api on namespace openshift-oauth-apiserver for network=default : 1.726105ms\\\\nI0309 13:22:54.899624 6914 services_controller.go:356] Processing sync for service openshift-machine-config-operator/machine-config-daemon for network=default\\\\nF0309 13:22:54.899660 6914 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:54Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://850c5b772d45fee8f95d82796bdd0694bc5d2829f80a8e826bd3a428943f08f6\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-09T13:23:13Z\\\",\\\"message\\\":\\\"n.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-etcd/etcd\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}, built lbs: 
[]services.LB{services.LB{Name:\\\\\\\"Service_openshift-etcd/etcd_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-etcd/etcd\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.253\\\\\\\", Port:2379, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}, services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.253\\\\\\\", Port:9979, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI0309 13:23:13.282082 7129 obj_retry.go:418] Waiting for all the *v1.Pod retry setup to complete in iterateRetryResources\\\\nF0309 13:23:13.282128 7129 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to 
create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T13:23:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\
\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://df799033cab3d457eef490d885e4440024ba2c7daa430d489ee9907d908d1017\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd65db46dfb71ce3080adad69890c363fb149c5bf12c903e66a212a28888ae66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd65db46dfb71ce3080adad69890c363fb149c5bf12c903e66a212a28888a
e66\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:22:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7kggv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:14Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:14 crc kubenswrapper[4764]: I0309 13:23:14.361582 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-hbmjc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7c2b6620-c8b8-47cf-9f15-883dbf8e34cc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d8a2c962bc242865fd8c0fce8fc21f9e385e38cc9b2892663a82c02dfe6cac9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z4sdq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:34Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-hbmjc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:14Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:14 crc kubenswrapper[4764]: I0309 13:23:14.370873 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hzjxs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"30a92c50-fe51-40d9-a69c-4b5fd722bfc6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73df6317c77ae8a47acf57b1fdb2710ccefa68c346011b5a98d5a38c775c9727\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdm9q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://10aea1054bea324b596bc250a9d3f7010db0636732f622c0674db83b1f245098\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdm9q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-hzjxs\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:14Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:14 crc kubenswrapper[4764]: I0309 13:23:14.379993 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-wkwdz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6597fc34-10ee-4984-9c69-f4b7c0d46e2a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gsbrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gsbrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:40Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-wkwdz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:14Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:14 crc 
kubenswrapper[4764]: I0309 13:23:14.392196 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b3bffb853598e5fea63b5302816325c29f740c9e9e2a7f0ee4f87e810045f97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:14Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:14 crc kubenswrapper[4764]: I0309 13:23:14.402083 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-r5bnx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7fede188-66d9-4cb1-af19-c94afe7fbcde\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a63a85aa57ac9496c6745b881ceba7015f610b3ce31c17513724f036bea999f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pjp5c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:27Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-r5bnx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:14Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:14 crc kubenswrapper[4764]: I0309 13:23:14.411160 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a8a532f3-0fa0-4125-aeef-fd19fc524647\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2756a837bdc4537e5cf1848d4c91935799827b254edac2f5a67cf8ff7ce886a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4166a7efb6711d50164acbe2944b72105acb42653213963d1e52516f0c43f982\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4166a7efb6711d50164acbe2944b72105acb42653213963d1e52516f0c43f982\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:20:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:14Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:14 crc kubenswrapper[4764]: I0309 13:23:14.420963 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6bcdd179-43c2-427c-9fac-7155c122e922\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://83cca42acbb21422d7ee12f4e9824007c225a563a00ac3769c638770e404ce7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25jms\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a541959fd435ba385dfa711ec03b7d78fb75528f
ed89a7bd3620aa50fbab26ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25jms\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:28Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xxczl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:14Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:14 crc kubenswrapper[4764]: I0309 13:23:14.559514 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 13:23:14 crc kubenswrapper[4764]: I0309 13:23:14.559611 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 13:23:14 crc kubenswrapper[4764]: I0309 13:23:14.559682 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 13:23:14 crc kubenswrapper[4764]: E0309 13:23:14.559704 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 13:23:14 crc kubenswrapper[4764]: E0309 13:23:14.559805 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 13:23:14 crc kubenswrapper[4764]: E0309 13:23:14.559898 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 13:23:15 crc kubenswrapper[4764]: I0309 13:23:15.144457 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7kggv_b8ccb4f5-550a-41b2-b39d-201cdd5d902a/ovnkube-controller/2.log" Mar 09 13:23:15 crc kubenswrapper[4764]: I0309 13:23:15.152216 4764 scope.go:117] "RemoveContainer" containerID="850c5b772d45fee8f95d82796bdd0694bc5d2829f80a8e826bd3a428943f08f6" Mar 09 13:23:15 crc kubenswrapper[4764]: E0309 13:23:15.152531 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-7kggv_openshift-ovn-kubernetes(b8ccb4f5-550a-41b2-b39d-201cdd5d902a)\"" pod="openshift-ovn-kubernetes/ovnkube-node-7kggv" podUID="b8ccb4f5-550a-41b2-b39d-201cdd5d902a" Mar 09 13:23:15 crc kubenswrapper[4764]: I0309 13:23:15.167626 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a8a532f3-0fa0-4125-aeef-fd19fc524647\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2756a837bdc4537e5cf1848d4c91935799827b254edac2f5a67cf8ff7ce886a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4166a7efb6711d50164acbe2944b72105acb42653213963d1e52516f0c43f982\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4166a7efb6711d50164acbe2944b72105acb42653213963d1e52516f0c43f982\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:20:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:15Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:15 crc kubenswrapper[4764]: I0309 13:23:15.183684 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6bcdd179-43c2-427c-9fac-7155c122e922\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://83cca42acbb21422d7ee12f4e9824007c225a563a00ac3769c638770e404ce7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25jms\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a541959fd435ba385dfa711ec03b7d78fb75528f
ed89a7bd3620aa50fbab26ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25jms\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:28Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xxczl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:15Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:15 crc kubenswrapper[4764]: I0309 13:23:15.199404 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:15Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:15 crc kubenswrapper[4764]: I0309 13:23:15.230446 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1507e859-72ea-4c35-a5a7-6d0f48b19b09\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://289f145eb4f8412759fdbbc747534a08534b2f848f18206d9f9c2d03adc86944\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c220cbb5e42a64e54a853893cbdd6e5502e5f1e5cd7fa5aa7fffc3e27c5b2d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://98462dd9c7c97ff5402911d179c7739d92d293c4abacea1ed4a394ec9f33dfed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4039a3daa857df7f6171121f2a936d4893913c8374ad7229c758dd93a8821386\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a9b753c5c47533ae4771e38e6ce4f7a3822bc208557dda27fd1b525c76044f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://36c569656352f45737e634caa991a17028b70ef88ec9449a7f22edd270d63443\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://36c569656352f45737e634caa991a17028b70ef88ec9449a7f22edd270d63443\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-09T13:20:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://573ffaefe818651692cd854f41bd67b933fa6496e3a833873b40c2324b3a09e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://573ffaefe818651692cd854f41bd67b933fa6496e3a833873b40c2324b3a09e1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ae08be6456f1300abf253a9801e2585bed0bb928045bcb96921efa4b5d0200eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae08be6456f1300abf253a9801e2585bed0bb928045bcb96921efa4b5d0200eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:20:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:20:45Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:15Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:15 crc kubenswrapper[4764]: I0309 13:23:15.249777 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cc78a2e5-6620-4ea8-8bc7-c62bcc3348cd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e22c92297ebb9853ff582f519aa4c98c1468e757b53b1245f7a7ab0f9fe25344\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/ope
nshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://050547213bbff923b18f34b1196952d038bab09162ab803da36c99fa3f769dd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5819fd898c59b4cf866334e40822b6142b014a2e11f4ca75b98d549665b1575a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kub
ernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f97e59c94171eedb27889b9cb72fa1b241a5e09c23310576dfd844db1bfc2292\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f97e59c94171eedb27889b9cb72fa1b241a5e09c23310576dfd844db1bfc2292\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:20:46Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:20:45Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:15Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:15 crc kubenswrapper[4764]: I0309 13:23:15.269704 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:15Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:15 crc kubenswrapper[4764]: I0309 13:23:15.288793 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://acc019f66fb28c4b9a901a359ecf5e127e643482a864c4d9da494a62286c0e3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-09T13:23:15Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:15 crc kubenswrapper[4764]: I0309 13:23:15.304062 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:15Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:15 crc kubenswrapper[4764]: I0309 13:23:15.321447 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7c21ab2-1820-47de-a61d-71d81928564a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://482469a933d0aba722e6a341f6934669e0d84998bd2c99d6962a17548ab80ce7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3571aa0b654db21a79a11d6c58af9f858d2493283d9eff7a84f3404c34dcf0b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4aae266a5256be6c65a7637dfaed552087689f317f7b61dcb52f4aa6d4d4d7e6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e04ceb7d287648fc77742a0a1b4cf13a56f634301b703291340961730fc73811\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://033930947dc527026e2797f16a6dd9181c8c7d8ec88b09031a132f55b953c356\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T13:21:44Z\\\"
,\\\"message\\\":\\\"W0309 13:21:43.803612 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0309 13:21:43.803959 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773062503 cert, and key in /tmp/serving-cert-1644690243/serving-signer.crt, /tmp/serving-cert-1644690243/serving-signer.key\\\\nI0309 13:21:44.168505 1 observer_polling.go:159] Starting file observer\\\\nW0309 13:21:44.177360 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0309 13:21:44.177910 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0309 13:21:44.178828 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1644690243/tls.crt::/tmp/serving-cert-1644690243/tls.key\\\\\\\"\\\\nF0309 13:21:44.724786 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:43Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3a34e47ede9b05cec931ca6337a98381bcceeb735352702d19128e8dfe8476f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:48Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a38de1b3854f33b5728435571b8f0df92aeaae7d8e8a235ed58044fa2a9e09c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a38de1b3854f33b5728435571b8f0df92aeaae7d8e8a235ed58044fa2a9e09c3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:46Z\\\",\\\"reason\\\":\\\"Completed\\\",
\\\"startedAt\\\":\\\"2026-03-09T13:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:20:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:15Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:15 crc kubenswrapper[4764]: I0309 13:23:15.337285 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9db73dd7-a619-4291-a559-aab80ef8e067\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98efc0508ec5c94130c0707470eed2e0ae1117e57a8139af476e03a4bc67e788\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c0737
2b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://698e8e53f96938b60a1aefe2ccea945879623e79d8b0f8836a67cb38ec85918f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T13:21:46Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0309 13:21:16.528766 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. 
Worst graceful lease acquisition is {26s}.\\\\nI0309 13:21:16.529708 1 observer_polling.go:159] Starting file observer\\\\nI0309 13:21:16.533102 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0309 13:21:16.533732 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0309 13:21:42.646426 1 cert_rotation.go:91] certificate rotation detected, shutting down client connections to start using new credentials\\\\nI0309 13:21:46.099155 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0309 13:21:46.099249 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:16Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7857b72b1425eeefd8acf226dd6f0d6c783e14267fcfb79dd038118573a2b26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:46Z\\\"}},\\\"v
olumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4cbb681d47f7be8b418b159d390e6857a8a1f5cff40ac621c892019ceef7828\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5476061465c7cd40b23fd92e34348e1889754bbfabe1e2bc6419637cf70604d5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"host
IPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:20:45Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:15Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:15 crc kubenswrapper[4764]: I0309 13:23:15.353232 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4dd3c85ee070f78d4d30b9ccad3da35f82313620fffdaf69c3424865843dfa39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"star
ted\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5280ec2d6cc95dbda0281276e900fa2c9d43170f932966688440a55c1deb454\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:15Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:15 crc kubenswrapper[4764]: I0309 13:23:15.369318 4764 status_manager.go:875] 
"Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-crvdf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"072442e6-8ece-4f72-a8cb-ad7ef1e3facb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5c773d25fbc4d390f6da4699cdfa6bac3745eb18288fd5bb885b9a4dca9a519\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"init
ContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://efdc5099d0e35ffc801fb7fb52979c62ed875cd19c8d0b015350b4906e6ea61f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://efdc5099d0e35ffc801fb7fb52979c62ed875cd19c8d0b015350b4906e6ea61f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c57a862615a1748c2abb6ae6147d48398f8246d030b1527963f77743d8d802a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c57a862615a1748c2abb6ae6147d48398f8246d030b1527963f77743d8d802a\\\",\\\"exitCode\\\":0,\\\"finished
At\\\":\\\"2026-03-09T13:22:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a96bc81c6b97f1f03d9cb304263ff47034248b9949771825e1813c284032b91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a96bc81c6b97f1f03d9cb304263ff47034248b9949771825e1813c284032b91\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:22:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-ac
cess-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://885b4c9d85cdb3705865fae2824c36e6440c602deaaf1345fffa0eda8de42a96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://885b4c9d85cdb3705865fae2824c36e6440c602deaaf1345fffa0eda8de42a96\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:22:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ff7f6fa22134725b00be193ba8f58c9ad083335c7f795fa87a1cad202a2496f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2ff7f6fa22134725b00be193ba8f58c9ad08
3335c7f795fa87a1cad202a2496f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:22:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba4e3985b08c9b6913c4e31f9645f8544fb5f7c0cbd75d116618fd9e479588d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ba4e3985b08c9b6913c4e31f9645f8544fb5f7c0cbd75d116618fd9e479588d0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:22:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:2
2:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-crvdf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:15Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:15 crc kubenswrapper[4764]: I0309 13:23:15.394110 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7kggv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b8ccb4f5-550a-41b2-b39d-201cdd5d902a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9713dea51462b7a0ce8cb68a64d8707b76dfca5f8c770cac3b54339d538dec01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://21feffb4bef29e65989455ec8c5a040a53de90f01ede1a9f1848a67f64a35d29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6859bb282db6608def0d7983e715cf3f5f5f454bf896bb21dca124a075e5701d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20f0f31508039b62436cf5b317419ffc5ca18d03050bfa830060f40e12684519\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:45Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ceb1c19d0ff9c03c8790b54d5dc48bafd77f6c289fc6225a88d6bfb8e04a8eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ad4c0df0c830a9ef102a630db33f055925c0b9b1734a792be013f8e261523c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://850c5b772d45fee8f95d82796bdd0694bc5d2829f80a8e826bd3a428943f08f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://850c5b772d45fee8f95d82796bdd0694bc5d2829f80a8e826bd3a428943f08f6\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-09T13:23:13Z\\\",\\\"message\\\":\\\"n.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-etcd/etcd\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, 
Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}, built lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-etcd/etcd_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-etcd/etcd\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.253\\\\\\\", Port:2379, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}, services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.253\\\\\\\", Port:9979, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI0309 13:23:13.282082 7129 obj_retry.go:418] Waiting for all the *v1.Pod retry setup to complete in iterateRetryResources\\\\nF0309 13:23:13.282128 7129 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T13:23:12Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-7kggv_openshift-ovn-kubernetes(b8ccb4f5-550a-41b2-b39d-201cdd5d902a)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://df799033cab3d457eef490d885e4440024ba2c7daa430d489ee9907d908d1017\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd65db46dfb71ce3080adad69890c363fb149c5bf12c903e66a212a28888ae66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd65db46dfb71ce308
0adad69890c363fb149c5bf12c903e66a212a28888ae66\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:22:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7kggv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:15Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:15 crc kubenswrapper[4764]: I0309 13:23:15.408365 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-hbmjc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7c2b6620-c8b8-47cf-9f15-883dbf8e34cc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d8a2c962bc242865fd8c0fce8fc21f9e385e38cc9b2892663a82c02dfe6cac9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z4sdq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:34Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-hbmjc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:15Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:15 crc kubenswrapper[4764]: I0309 13:23:15.423836 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hzjxs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"30a92c50-fe51-40d9-a69c-4b5fd722bfc6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73df6317c77ae8a47acf57b1fdb2710ccefa68c346011b5a98d5a38c775c9727\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdm9q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://10aea1054bea324b596bc250a9d3f7010db0636732f622c0674db83b1f245098\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdm9q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-hzjxs\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:15Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:15 crc kubenswrapper[4764]: I0309 13:23:15.435576 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-wkwdz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6597fc34-10ee-4984-9c69-f4b7c0d46e2a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gsbrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gsbrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:40Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-wkwdz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:15Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:15 crc 
kubenswrapper[4764]: I0309 13:23:15.452288 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b3bffb853598e5fea63b5302816325c29f740c9e9e2a7f0ee4f87e810045f97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:15Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:15 crc kubenswrapper[4764]: I0309 13:23:15.462974 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-r5bnx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7fede188-66d9-4cb1-af19-c94afe7fbcde\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a63a85aa57ac9496c6745b881ceba7015f610b3ce31c17513724f036bea999f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pjp5c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:27Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-r5bnx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:15Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:15 crc kubenswrapper[4764]: I0309 13:23:15.480374 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zmzm7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"202a1f58-ce83-4374-ac48-dc806f7b9d6b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://05cbc20845591e82618289ff2509634cc5e2d05362e8414b3ff91a73d0719331\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rk5pb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zmzm7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:15Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:15 crc kubenswrapper[4764]: I0309 13:23:15.559004 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-wkwdz" Mar 09 13:23:15 crc kubenswrapper[4764]: E0309 13:23:15.559135 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wkwdz" podUID="6597fc34-10ee-4984-9c69-f4b7c0d46e2a" Mar 09 13:23:15 crc kubenswrapper[4764]: I0309 13:23:15.575412 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7c21ab2-1820-47de-a61d-71d81928564a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://482469a933d0aba722e6a341f6934669e0d84998bd2c99d6962a17548ab80ce7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd79
1fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3571aa0b654db21a79a11d6c58af9f858d2493283d9eff7a84f3404c34dcf0b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4aae266a5256be6c65a7637dfaed552087689f317f7b61dcb52f4aa6d4d4d7e6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\
\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e04ceb7d287648fc77742a0a1b4cf13a56f634301b703291340961730fc73811\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://033930947dc527026e2797f16a6dd9181c8c7d8ec88b09031a132f55b953c356\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T13:21:44Z\\\",\\\"message\\\":\\\"W0309 13:21:43.803612 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0309 13:21:43.803959 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773062503 cert, and key in /tmp/serving-cert-1644690243/serving-signer.crt, /tmp/serving-cert-1644690243/serving-signer.key\\\\nI0309 13:21:44.168505 1 observer_polling.go:159] Starting file observer\\\\nW0309 13:21:44.177360 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0309 13:21:44.177910 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0309 13:21:44.178828 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1644690243/tls.crt::/tmp/serving-cert-1644690243/tls.key\\\\\\\"\\\\nF0309 13:21:44.724786 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:43Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3a34e47ede9b05cec931ca6337a98381bcceeb735352702d19128e8dfe8476f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:48Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a38de1b3854f33b5728435571b8f0df92aeaae7d8e8a235ed58044fa2a9e09c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a38de1b3854f33b5728435571b8f0df92aeaae7d8e8a235ed58044fa2a9e09c3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:46Z\\\",\\\"reason\\\":\\\"Completed\\\",
\\\"startedAt\\\":\\\"2026-03-09T13:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:20:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:15Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:15 crc kubenswrapper[4764]: I0309 13:23:15.594021 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9db73dd7-a619-4291-a559-aab80ef8e067\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98efc0508ec5c94130c0707470eed2e0ae1117e57a8139af476e03a4bc67e788\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c0737
2b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://698e8e53f96938b60a1aefe2ccea945879623e79d8b0f8836a67cb38ec85918f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T13:21:46Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0309 13:21:16.528766 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. 
Worst graceful lease acquisition is {26s}.\\\\nI0309 13:21:16.529708 1 observer_polling.go:159] Starting file observer\\\\nI0309 13:21:16.533102 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0309 13:21:16.533732 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0309 13:21:42.646426 1 cert_rotation.go:91] certificate rotation detected, shutting down client connections to start using new credentials\\\\nI0309 13:21:46.099155 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0309 13:21:46.099249 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:16Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7857b72b1425eeefd8acf226dd6f0d6c783e14267fcfb79dd038118573a2b26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:46Z\\\"}},\\\"v
olumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4cbb681d47f7be8b418b159d390e6857a8a1f5cff40ac621c892019ceef7828\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5476061465c7cd40b23fd92e34348e1889754bbfabe1e2bc6419637cf70604d5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"host
IPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:20:45Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:15Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:15 crc kubenswrapper[4764]: I0309 13:23:15.608855 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4dd3c85ee070f78d4d30b9ccad3da35f82313620fffdaf69c3424865843dfa39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"star
ted\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5280ec2d6cc95dbda0281276e900fa2c9d43170f932966688440a55c1deb454\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:15Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:15 crc kubenswrapper[4764]: I0309 13:23:15.622914 4764 status_manager.go:875] 
"Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://acc019f66fb28c4b9a901a359ecf5e127e643482a864c4d9da494a62286c0e3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:15Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:15 crc kubenswrapper[4764]: I0309 13:23:15.635144 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:15Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:15 crc kubenswrapper[4764]: E0309 13:23:15.646365 4764 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 09 13:23:15 crc kubenswrapper[4764]: I0309 13:23:15.660867 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hzjxs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"30a92c50-fe51-40d9-a69c-4b5fd722bfc6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73df6317c77ae8a47acf57b1fdb2710ccefa68c346011b5a98d5a38c775c9727\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run
/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdm9q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://10aea1054bea324b596bc250a9d3f7010db0636732f622c0674db83b1f245098\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdm9q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-hzjxs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:15Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:15 crc kubenswrapper[4764]: I0309 13:23:15.676702 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-wkwdz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6597fc34-10ee-4984-9c69-f4b7c0d46e2a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gsbrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gsbrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:40Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-wkwdz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:15Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:15 crc 
kubenswrapper[4764]: I0309 13:23:15.695036 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b3bffb853598e5fea63b5302816325c29f740c9e9e2a7f0ee4f87e810045f97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:15Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:15 crc kubenswrapper[4764]: I0309 13:23:15.708902 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-r5bnx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7fede188-66d9-4cb1-af19-c94afe7fbcde\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a63a85aa57ac9496c6745b881ceba7015f610b3ce31c17513724f036bea999f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pjp5c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:27Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-r5bnx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:15Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:15 crc kubenswrapper[4764]: I0309 13:23:15.725384 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zmzm7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"202a1f58-ce83-4374-ac48-dc806f7b9d6b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://05cbc20845591e82618289ff2509634cc5e2d05362e8414b3ff91a73d0719331\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rk5pb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zmzm7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:15Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:15 crc kubenswrapper[4764]: I0309 13:23:15.742847 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-crvdf" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"072442e6-8ece-4f72-a8cb-ad7ef1e3facb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5c773d25fbc4d390f6da4699cdfa6bac3745eb18288fd5bb885b9a4dca9a519\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://efdc5099d0e35ffc801fb7fb52979c62ed875cd19c8d0b015
350b4906e6ea61f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://efdc5099d0e35ffc801fb7fb52979c62ed875cd19c8d0b015350b4906e6ea61f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c57a862615a1748c2abb6ae6147d48398f8246d030b1527963f77743d8d802a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c57a862615a1748c2abb6ae6147d48398f8246d030b1527963f77743d8d802a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:22:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:
22:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a96bc81c6b97f1f03d9cb304263ff47034248b9949771825e1813c284032b91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a96bc81c6b97f1f03d9cb304263ff47034248b9949771825e1813c284032b91\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:22:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\
\\"cri-o://885b4c9d85cdb3705865fae2824c36e6440c602deaaf1345fffa0eda8de42a96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://885b4c9d85cdb3705865fae2824c36e6440c602deaaf1345fffa0eda8de42a96\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:22:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ff7f6fa22134725b00be193ba8f58c9ad083335c7f795fa87a1cad202a2496f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2ff7f6fa22134725b00be193ba8f58c9ad083335c7f795fa87a1cad202a2496f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:22:46Z\\\",\\\"r
eason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba4e3985b08c9b6913c4e31f9645f8544fb5f7c0cbd75d116618fd9e479588d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ba4e3985b08c9b6913c4e31f9645f8544fb5f7c0cbd75d116618fd9e479588d0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:22:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-crvdf\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:15Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:15 crc kubenswrapper[4764]: I0309 13:23:15.766597 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7kggv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b8ccb4f5-550a-41b2-b39d-201cdd5d902a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9713dea51462b7a0ce8cb68a64d8707b76dfca5f8c770cac3b54339d538dec01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://21feffb4bef29e65989455ec8c5a040a53de90f01ede1a9f1848a67f64a35d29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6859bb282db6608def0d7983e715cf3f5f5f454bf896bb21dca124a075e5701d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20f0f31508039b62436cf5b317419ffc5ca18d03050bfa830060f40e12684519\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:45Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ceb1c19d0ff9c03c8790b54d5dc48bafd77f6c289fc6225a88d6bfb8e04a8eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ad4c0df0c830a9ef102a630db33f055925c0b9b1734a792be013f8e261523c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://850c5b772d45fee8f95d82796bdd0694bc5d2829f80a8e826bd3a428943f08f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://850c5b772d45fee8f95d82796bdd0694bc5d2829f80a8e826bd3a428943f08f6\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-09T13:23:13Z\\\",\\\"message\\\":\\\"n.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-etcd/etcd\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, 
Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}, built lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-etcd/etcd_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-etcd/etcd\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.253\\\\\\\", Port:2379, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}, services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.253\\\\\\\", Port:9979, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI0309 13:23:13.282082 7129 obj_retry.go:418] Waiting for all the *v1.Pod retry setup to complete in iterateRetryResources\\\\nF0309 13:23:13.282128 7129 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T13:23:12Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-7kggv_openshift-ovn-kubernetes(b8ccb4f5-550a-41b2-b39d-201cdd5d902a)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://df799033cab3d457eef490d885e4440024ba2c7daa430d489ee9907d908d1017\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd65db46dfb71ce3080adad69890c363fb149c5bf12c903e66a212a28888ae66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd65db46dfb71ce308
0adad69890c363fb149c5bf12c903e66a212a28888ae66\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:22:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7kggv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:15Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:15 crc kubenswrapper[4764]: I0309 13:23:15.777559 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-hbmjc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7c2b6620-c8b8-47cf-9f15-883dbf8e34cc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d8a2c962bc242865fd8c0fce8fc21f9e385e38cc9b2892663a82c02dfe6cac9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z4sdq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:34Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-hbmjc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:15Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:15 crc kubenswrapper[4764]: I0309 13:23:15.791407 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a8a532f3-0fa0-4125-aeef-fd19fc524647\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2756a837bdc4537e5cf1848d4c91935799827b254edac2f5a67cf8ff7ce886a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a
42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4166a7efb6711d50164acbe2944b72105acb42653213963d1e52516f0c43f982\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4166a7efb6711d50164acbe2944b72105acb42653213963d1e52516f0c43f982\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:20:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:15Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:15 crc 
kubenswrapper[4764]: I0309 13:23:15.803775 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6bcdd179-43c2-427c-9fac-7155c122e922\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://83cca42acbb21422d7ee12f4e9824007c225a563a00ac3769c638770e404ce7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/se
rviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25jms\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a541959fd435ba385dfa711ec03b7d78fb75528fed89a7bd3620aa50fbab26ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25jms\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:28Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xxczl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:15Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:15 crc kubenswrapper[4764]: I0309 13:23:15.824155 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1507e859-72ea-4c35-a5a7-6d0f48b19b09\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://289f145eb4f8412759fdbbc747534a08534b2f848f18206d9f9c2d03adc86944\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c220cbb5e42a64e54a853893cbdd6e5502e5f1e5cd7fa5aa7fffc3e27c5b2d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://98462dd9c7c97ff5402911d179c7739d92d293c4abacea1ed4a394ec9f33dfed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4039a3daa857df7f6171121f2a936d4893913c8374ad7229c758dd93a8821386\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a9b753c5c47533ae4771e38e6ce4f7a3822bc208557dda27fd1b525c76044f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://36c569656352f45737e634caa991a17028b70ef88ec9449a7f22edd270d63443\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://36c569656352f45737e634caa991a17028b70ef88ec9449a7f22edd270d63443\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-09T13:20:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://573ffaefe818651692cd854f41bd67b933fa6496e3a833873b40c2324b3a09e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://573ffaefe818651692cd854f41bd67b933fa6496e3a833873b40c2324b3a09e1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ae08be6456f1300abf253a9801e2585bed0bb928045bcb96921efa4b5d0200eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae08be6456f1300abf253a9801e2585bed0bb928045bcb96921efa4b5d0200eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:20:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:20:45Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:15Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:15 crc kubenswrapper[4764]: I0309 13:23:15.836734 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cc78a2e5-6620-4ea8-8bc7-c62bcc3348cd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e22c92297ebb9853ff582f519aa4c98c1468e757b53b1245f7a7ab0f9fe25344\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/ope
nshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://050547213bbff923b18f34b1196952d038bab09162ab803da36c99fa3f769dd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5819fd898c59b4cf866334e40822b6142b014a2e11f4ca75b98d549665b1575a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kub
ernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f97e59c94171eedb27889b9cb72fa1b241a5e09c23310576dfd844db1bfc2292\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f97e59c94171eedb27889b9cb72fa1b241a5e09c23310576dfd844db1bfc2292\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:20:46Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:20:45Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:15Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:15 crc kubenswrapper[4764]: I0309 13:23:15.849398 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:15Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:15 crc kubenswrapper[4764]: I0309 13:23:15.862607 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:15Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:16 crc kubenswrapper[4764]: I0309 13:23:16.559049 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 13:23:16 crc kubenswrapper[4764]: I0309 13:23:16.559049 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 13:23:16 crc kubenswrapper[4764]: E0309 13:23:16.559705 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 13:23:16 crc kubenswrapper[4764]: I0309 13:23:16.559103 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 13:23:16 crc kubenswrapper[4764]: E0309 13:23:16.559932 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 13:23:16 crc kubenswrapper[4764]: E0309 13:23:16.560051 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 13:23:17 crc kubenswrapper[4764]: I0309 13:23:17.559704 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-wkwdz" Mar 09 13:23:17 crc kubenswrapper[4764]: E0309 13:23:17.559945 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wkwdz" podUID="6597fc34-10ee-4984-9c69-f4b7c0d46e2a" Mar 09 13:23:18 crc kubenswrapper[4764]: I0309 13:23:18.495022 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 13:23:18 crc kubenswrapper[4764]: I0309 13:23:18.495276 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 13:23:18 crc kubenswrapper[4764]: I0309 13:23:18.495326 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 13:23:18 crc kubenswrapper[4764]: I0309 13:23:18.495370 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" 
(UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 13:23:18 crc kubenswrapper[4764]: E0309 13:23:18.495483 4764 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 09 13:23:18 crc kubenswrapper[4764]: E0309 13:23:18.495490 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 13:24:22.495443959 +0000 UTC m=+217.745615917 (durationBeforeRetry 1m4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 13:23:18 crc kubenswrapper[4764]: E0309 13:23:18.495506 4764 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 09 13:23:18 crc kubenswrapper[4764]: E0309 13:23:18.495574 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-09 13:24:22.495546942 +0000 UTC m=+217.745719010 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 09 13:23:18 crc kubenswrapper[4764]: E0309 13:23:18.495599 4764 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 09 13:23:18 crc kubenswrapper[4764]: E0309 13:23:18.495612 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-09 13:24:22.495590034 +0000 UTC m=+217.745762162 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 09 13:23:18 crc kubenswrapper[4764]: E0309 13:23:18.495624 4764 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 09 13:23:18 crc kubenswrapper[4764]: E0309 13:23:18.495663 4764 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 09 13:23:18 crc kubenswrapper[4764]: E0309 13:23:18.495715 
4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-09 13:24:22.495691967 +0000 UTC m=+217.745863875 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 09 13:23:18 crc kubenswrapper[4764]: I0309 13:23:18.495720 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 13:23:18 crc kubenswrapper[4764]: E0309 13:23:18.495999 4764 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 09 13:23:18 crc kubenswrapper[4764]: E0309 13:23:18.496045 4764 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 09 13:23:18 crc kubenswrapper[4764]: E0309 13:23:18.496075 4764 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not 
registered] Mar 09 13:23:18 crc kubenswrapper[4764]: E0309 13:23:18.496172 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-09 13:24:22.496148781 +0000 UTC m=+217.746320859 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 09 13:23:18 crc kubenswrapper[4764]: I0309 13:23:18.558882 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 13:23:18 crc kubenswrapper[4764]: I0309 13:23:18.558903 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 13:23:18 crc kubenswrapper[4764]: E0309 13:23:18.559024 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 13:23:18 crc kubenswrapper[4764]: I0309 13:23:18.559069 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 13:23:18 crc kubenswrapper[4764]: E0309 13:23:18.559199 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 13:23:18 crc kubenswrapper[4764]: E0309 13:23:18.559269 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 13:23:19 crc kubenswrapper[4764]: I0309 13:23:19.559463 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wkwdz" Mar 09 13:23:19 crc kubenswrapper[4764]: E0309 13:23:19.559704 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wkwdz" podUID="6597fc34-10ee-4984-9c69-f4b7c0d46e2a" Mar 09 13:23:20 crc kubenswrapper[4764]: I0309 13:23:20.559422 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 13:23:20 crc kubenswrapper[4764]: I0309 13:23:20.559422 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 13:23:20 crc kubenswrapper[4764]: I0309 13:23:20.559495 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 13:23:20 crc kubenswrapper[4764]: E0309 13:23:20.561401 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 13:23:20 crc kubenswrapper[4764]: E0309 13:23:20.561728 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 13:23:20 crc kubenswrapper[4764]: E0309 13:23:20.562556 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 13:23:20 crc kubenswrapper[4764]: E0309 13:23:20.648454 4764 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 09 13:23:21 crc kubenswrapper[4764]: I0309 13:23:21.559454 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wkwdz" Mar 09 13:23:21 crc kubenswrapper[4764]: E0309 13:23:21.559615 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wkwdz" podUID="6597fc34-10ee-4984-9c69-f4b7c0d46e2a" Mar 09 13:23:22 crc kubenswrapper[4764]: I0309 13:23:22.558893 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 13:23:22 crc kubenswrapper[4764]: I0309 13:23:22.558939 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 13:23:22 crc kubenswrapper[4764]: E0309 13:23:22.559022 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 13:23:22 crc kubenswrapper[4764]: E0309 13:23:22.559088 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 13:23:22 crc kubenswrapper[4764]: I0309 13:23:22.559137 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 13:23:22 crc kubenswrapper[4764]: E0309 13:23:22.559233 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 13:23:23 crc kubenswrapper[4764]: I0309 13:23:23.559406 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wkwdz" Mar 09 13:23:23 crc kubenswrapper[4764]: E0309 13:23:23.559582 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-wkwdz" podUID="6597fc34-10ee-4984-9c69-f4b7c0d46e2a"
Mar 09 13:23:24 crc kubenswrapper[4764]: I0309 13:23:24.132032 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 13:23:24 crc kubenswrapper[4764]: I0309 13:23:24.132094 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 13:23:24 crc kubenswrapper[4764]: I0309 13:23:24.132112 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 13:23:24 crc kubenswrapper[4764]: I0309 13:23:24.132140 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 09 13:23:24 crc kubenswrapper[4764]: I0309 13:23:24.132154 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:23:24Z","lastTransitionTime":"2026-03-09T13:23:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/.
Has your network provider started?"} Mar 09 13:23:24 crc kubenswrapper[4764]: E0309 13:23:24.158950 4764 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:23:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:23:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:23:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:23:24Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:23:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:23:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:23:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:23:24Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"44d5d5b8-3cb1-4753-9ed2-c6ede7c42d06\\\",\\\"systemUUID\\\":\\\"470a86bd-e6aa-42c1-b220-b7b8c0289210\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:24Z is after 2025-08-24T17:21:41Z"
Mar 09 13:23:24 crc kubenswrapper[4764]: I0309 13:23:24.167023 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 13:23:24 crc kubenswrapper[4764]: I0309 13:23:24.167380 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 13:23:24 crc kubenswrapper[4764]: I0309 13:23:24.167450 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 13:23:24 crc kubenswrapper[4764]: I0309 13:23:24.167518 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 09 13:23:24 crc kubenswrapper[4764]: I0309 13:23:24.167582 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:23:24Z","lastTransitionTime":"2026-03-09T13:23:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/.
Has your network provider started?"} Mar 09 13:23:24 crc kubenswrapper[4764]: E0309 13:23:24.182829 4764 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:23:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:23:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:23:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:23:24Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:23:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:23:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:23:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:23:24Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"44d5d5b8-3cb1-4753-9ed2-c6ede7c42d06\\\",\\\"systemUUID\\\":\\\"470a86bd-e6aa-42c1-b220-b7b8c0289210\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:24Z is after 2025-08-24T17:21:41Z"
Mar 09 13:23:24 crc kubenswrapper[4764]: I0309 13:23:24.187753 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 13:23:24 crc kubenswrapper[4764]: I0309 13:23:24.188110 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 13:23:24 crc kubenswrapper[4764]: I0309 13:23:24.188181 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 13:23:24 crc kubenswrapper[4764]: I0309 13:23:24.188253 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 09 13:23:24 crc kubenswrapper[4764]: I0309 13:23:24.188323 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:23:24Z","lastTransitionTime":"2026-03-09T13:23:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/.
Has your network provider started?"} Mar 09 13:23:24 crc kubenswrapper[4764]: E0309 13:23:24.204213 4764 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:23:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:23:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:23:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:23:24Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:23:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:23:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:23:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:23:24Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"44d5d5b8-3cb1-4753-9ed2-c6ede7c42d06\\\",\\\"systemUUID\\\":\\\"470a86bd-e6aa-42c1-b220-b7b8c0289210\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:24Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:24 crc kubenswrapper[4764]: I0309 13:23:24.208909 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:23:24 crc kubenswrapper[4764]: I0309 13:23:24.208972 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:23:24 crc kubenswrapper[4764]: I0309 13:23:24.208985 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:23:24 crc kubenswrapper[4764]: I0309 13:23:24.209022 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:23:24 crc kubenswrapper[4764]: I0309 13:23:24.209038 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:23:24Z","lastTransitionTime":"2026-03-09T13:23:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:23:24 crc kubenswrapper[4764]: E0309 13:23:24.225549 4764 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:23:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:23:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:23:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:23:24Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:23:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:23:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:23:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:23:24Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"44d5d5b8-3cb1-4753-9ed2-c6ede7c42d06\\\",\\\"systemUUID\\\":\\\"470a86bd-e6aa-42c1-b220-b7b8c0289210\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:24Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:24 crc kubenswrapper[4764]: I0309 13:23:24.231471 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:23:24 crc kubenswrapper[4764]: I0309 13:23:24.231515 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:23:24 crc kubenswrapper[4764]: I0309 13:23:24.231526 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:23:24 crc kubenswrapper[4764]: I0309 13:23:24.231544 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:23:24 crc kubenswrapper[4764]: I0309 13:23:24.231554 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:23:24Z","lastTransitionTime":"2026-03-09T13:23:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:23:24 crc kubenswrapper[4764]: E0309 13:23:24.273755 4764 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:23:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:23:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:23:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:23:24Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:23:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:23:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:23:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:23:24Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"44d5d5b8-3cb1-4753-9ed2-c6ede7c42d06\\\",\\\"systemUUID\\\":\\\"470a86bd-e6aa-42c1-b220-b7b8c0289210\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:24Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:24 crc kubenswrapper[4764]: E0309 13:23:24.273884 4764 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 09 13:23:24 crc kubenswrapper[4764]: I0309 13:23:24.558694 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 13:23:24 crc kubenswrapper[4764]: I0309 13:23:24.558764 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 13:23:24 crc kubenswrapper[4764]: I0309 13:23:24.558704 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 13:23:24 crc kubenswrapper[4764]: E0309 13:23:24.558848 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 13:23:24 crc kubenswrapper[4764]: E0309 13:23:24.558918 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 13:23:24 crc kubenswrapper[4764]: E0309 13:23:24.559023 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 13:23:25 crc kubenswrapper[4764]: I0309 13:23:25.559373 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wkwdz" Mar 09 13:23:25 crc kubenswrapper[4764]: E0309 13:23:25.559552 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-wkwdz" podUID="6597fc34-10ee-4984-9c69-f4b7c0d46e2a" Mar 09 13:23:25 crc kubenswrapper[4764]: I0309 13:23:25.584918 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7c21ab2-1820-47de-a61d-71d81928564a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://482469a933d0aba722e6a341f6934669e0d84998bd2c99d6962a17548ab80ce7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath
\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3571aa0b654db21a79a11d6c58af9f858d2493283d9eff7a84f3404c34dcf0b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4aae266a5256be6c65a7637dfaed552087689f317f7b61dcb52f4aa6d4d4d7e6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e04ceb7d287648fc77742a0a1b4cf13a56f634301b703291340961730fc73811\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-ap
iserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://033930947dc527026e2797f16a6dd9181c8c7d8ec88b09031a132f55b953c356\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T13:21:44Z\\\",\\\"message\\\":\\\"W0309 13:21:43.803612 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0309 13:21:43.803959 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773062503 cert, and key in /tmp/serving-cert-1644690243/serving-signer.crt, /tmp/serving-cert-1644690243/serving-signer.key\\\\nI0309 13:21:44.168505 1 observer_polling.go:159] Starting file observer\\\\nW0309 13:21:44.177360 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0309 13:21:44.177910 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0309 13:21:44.178828 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1644690243/tls.crt::/tmp/serving-cert-1644690243/tls.key\\\\\\\"\\\\nF0309 13:21:44.724786 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:43Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3a34e47ede9b05cec931ca6337a98381bcceeb735352702d19128e8dfe8476f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:48Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a38de1b3854f33b5728435571b8f0df92aeaae7d8e8a235ed58044fa2a9e09c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a38de1b3854f33b5728435571b8f0df92aeaae7d8e8a235ed58044fa2a9e09c3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:46Z\\\",\\\"reason\\\":\\\"Completed\\\",
\\\"startedAt\\\":\\\"2026-03-09T13:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:20:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:25Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:25 crc kubenswrapper[4764]: I0309 13:23:25.604825 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9db73dd7-a619-4291-a559-aab80ef8e067\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98efc0508ec5c94130c0707470eed2e0ae1117e57a8139af476e03a4bc67e788\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c0737
2b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://698e8e53f96938b60a1aefe2ccea945879623e79d8b0f8836a67cb38ec85918f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T13:21:46Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0309 13:21:16.528766 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. 
Worst graceful lease acquisition is {26s}.\\\\nI0309 13:21:16.529708 1 observer_polling.go:159] Starting file observer\\\\nI0309 13:21:16.533102 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0309 13:21:16.533732 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0309 13:21:42.646426 1 cert_rotation.go:91] certificate rotation detected, shutting down client connections to start using new credentials\\\\nI0309 13:21:46.099155 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0309 13:21:46.099249 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:16Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7857b72b1425eeefd8acf226dd6f0d6c783e14267fcfb79dd038118573a2b26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:46Z\\\"}},\\\"v
olumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4cbb681d47f7be8b418b159d390e6857a8a1f5cff40ac621c892019ceef7828\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5476061465c7cd40b23fd92e34348e1889754bbfabe1e2bc6419637cf70604d5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"host
IPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:20:45Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:25Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:25 crc kubenswrapper[4764]: I0309 13:23:25.624066 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4dd3c85ee070f78d4d30b9ccad3da35f82313620fffdaf69c3424865843dfa39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"star
ted\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5280ec2d6cc95dbda0281276e900fa2c9d43170f932966688440a55c1deb454\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:25Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:25 crc kubenswrapper[4764]: I0309 13:23:25.638766 4764 status_manager.go:875] 
"Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://acc019f66fb28c4b9a901a359ecf5e127e643482a864c4d9da494a62286c0e3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:25Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:25 crc kubenswrapper[4764]: E0309 13:23:25.649344 4764 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 09 13:23:25 crc kubenswrapper[4764]: I0309 13:23:25.660576 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:25Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:25 crc kubenswrapper[4764]: I0309 13:23:25.678608 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b3bffb853598e5fea63b5302816325c29f740c9e9e2a7f0ee4f87e810045f97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:25Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:25 crc kubenswrapper[4764]: I0309 13:23:25.692323 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-r5bnx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7fede188-66d9-4cb1-af19-c94afe7fbcde\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a63a85aa57ac9496c6745b881ceba7015f610b3ce31c17513724f036bea999f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kub
ernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pjp5c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:27Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-r5bnx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:25Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:25 crc kubenswrapper[4764]: I0309 13:23:25.709879 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zmzm7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"202a1f58-ce83-4374-ac48-dc806f7b9d6b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://05cbc20845591e82618289ff2509634cc5e2d0536
2e8414b3ff91a73d0719331\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubern
etes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rk5pb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zmzm7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:25Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:25 crc kubenswrapper[4764]: I0309 13:23:25.727986 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-crvdf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"072442e6-8ece-4f72-a8cb-ad7ef1e3facb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5c773d25fbc4d390f6da4699
cdfa6bac3745eb18288fd5bb885b9a4dca9a519\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://efdc5099d0e35ffc801fb7fb52979c62ed875cd19c8d0b015350b4906e6ea61f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://efdc5099d0e35ffc801fb7fb52979c62ed875cd19c8d0b015350b4906e6ea61f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c57a862615a1748c2abb6ae6147d48398f8246d030b1527963f77743d8d802a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c57a862615a1748c2abb6ae6147d48398f8246d030b1527963f77743d8d802a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:22:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a96bc81c6b97f1f03d9cb304263ff47034248b9949771825e1813c284032b91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d
7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a96bc81c6b97f1f03d9cb304263ff47034248b9949771825e1813c284032b91\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:22:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://885b4c9d85cdb3705865fae2824c36e6440c602deaaf1345fffa0eda8de42a96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://885b4c9d85cdb3705865fae2824c36e6440c602deaaf1345fffa0eda8de42a96\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:22:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\
\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ff7f6fa22134725b00be193ba8f58c9ad083335c7f795fa87a1cad202a2496f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2ff7f6fa22134725b00be193ba8f58c9ad083335c7f795fa87a1cad202a2496f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:22:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba4e3985b08c9b6913c4e31f9645f8544fb5f7c0cbd75d116618fd9e479588d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":t
rue,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ba4e3985b08c9b6913c4e31f9645f8544fb5f7c0cbd75d116618fd9e479588d0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:22:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-crvdf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:25Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:25 crc kubenswrapper[4764]: I0309 13:23:25.752623 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7kggv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b8ccb4f5-550a-41b2-b39d-201cdd5d902a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9713dea51462b7a0ce8cb68a64d8707b76dfca5f8c770cac3b54339d538dec01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://21feffb4bef29e65989455ec8c5a040a53de90f01ede1a9f1848a67f64a35d29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6859bb282db6608def0d7983e715cf3f5f5f454bf896bb21dca124a075e5701d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20f0f31508039b62436cf5b317419ffc5ca18d03050bfa830060f40e12684519\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:45Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ceb1c19d0ff9c03c8790b54d5dc48bafd77f6c289fc6225a88d6bfb8e04a8eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ad4c0df0c830a9ef102a630db33f055925c0b9b1734a792be013f8e261523c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://850c5b772d45fee8f95d82796bdd0694bc5d2829f80a8e826bd3a428943f08f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://850c5b772d45fee8f95d82796bdd0694bc5d2829f80a8e826bd3a428943f08f6\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-09T13:23:13Z\\\",\\\"message\\\":\\\"n.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-etcd/etcd\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, 
Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}, built lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-etcd/etcd_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-etcd/etcd\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.253\\\\\\\", Port:2379, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}, services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.253\\\\\\\", Port:9979, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI0309 13:23:13.282082 7129 obj_retry.go:418] Waiting for all the *v1.Pod retry setup to complete in iterateRetryResources\\\\nF0309 13:23:13.282128 7129 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T13:23:12Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-7kggv_openshift-ovn-kubernetes(b8ccb4f5-550a-41b2-b39d-201cdd5d902a)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://df799033cab3d457eef490d885e4440024ba2c7daa430d489ee9907d908d1017\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd65db46dfb71ce3080adad69890c363fb149c5bf12c903e66a212a28888ae66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd65db46dfb71ce308
0adad69890c363fb149c5bf12c903e66a212a28888ae66\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:22:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7kggv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:25Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:25 crc kubenswrapper[4764]: I0309 13:23:25.769069 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-hbmjc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7c2b6620-c8b8-47cf-9f15-883dbf8e34cc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d8a2c962bc242865fd8c0fce8fc21f9e385e38cc9b2892663a82c02dfe6cac9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z4sdq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:34Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-hbmjc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:25Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:25 crc kubenswrapper[4764]: I0309 13:23:25.787336 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hzjxs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"30a92c50-fe51-40d9-a69c-4b5fd722bfc6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73df6317c77ae8a47acf57b1fdb2710ccefa68c346011b5a98d5a38c775c9727\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdm9q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://10aea1054bea324b596bc250a9d3f7010db0636732f622c0674db83b1f245098\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdm9q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-hzjxs\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:25Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:25 crc kubenswrapper[4764]: I0309 13:23:25.803417 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-wkwdz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6597fc34-10ee-4984-9c69-f4b7c0d46e2a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gsbrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gsbrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:40Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-wkwdz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:25Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:25 crc 
kubenswrapper[4764]: I0309 13:23:25.814563 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a8a532f3-0fa0-4125-aeef-fd19fc524647\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2756a837bdc4537e5cf1848d4c91935799827b254edac2f5a67cf8ff7ce886a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\
":\\\"cri-o://4166a7efb6711d50164acbe2944b72105acb42653213963d1e52516f0c43f982\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4166a7efb6711d50164acbe2944b72105acb42653213963d1e52516f0c43f982\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:20:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:25Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:25 crc kubenswrapper[4764]: I0309 13:23:25.829829 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6bcdd179-43c2-427c-9fac-7155c122e922\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://83cca42acbb21422d7ee12f4e9824007c225a563a00ac3769c638770e404ce7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25jms\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a541959fd435ba385dfa711ec03b7d78fb75528f
ed89a7bd3620aa50fbab26ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25jms\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:28Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xxczl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:25Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:25 crc kubenswrapper[4764]: I0309 13:23:25.852294 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1507e859-72ea-4c35-a5a7-6d0f48b19b09\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://289f145eb4f8412759fdbbc747534a08534b2f848f18206d9f9c2d03adc86944\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c220cbb5e42a64e54a853893cbdd6e5502e5f1e5cd7fa5aa7fffc3e27c5b2d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://98462dd9c7c97ff5402911d179c7739d92d293c4abacea1ed4a394ec9f33dfed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4039a3daa857df7f6171121f2a936d4893913c8374ad7229c758dd93a8821386\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a9b753c5c47533ae4771e38e6ce4f7a3822bc208557dda27fd1b525c76044f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://36c569656352f45737e634caa991a17028b70ef88ec9449a7f22edd270d63443\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://36c569656352f45737e634caa991a17028b70ef88ec9449a7f22edd270d63443\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-09T13:20:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://573ffaefe818651692cd854f41bd67b933fa6496e3a833873b40c2324b3a09e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://573ffaefe818651692cd854f41bd67b933fa6496e3a833873b40c2324b3a09e1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ae08be6456f1300abf253a9801e2585bed0bb928045bcb96921efa4b5d0200eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae08be6456f1300abf253a9801e2585bed0bb928045bcb96921efa4b5d0200eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:20:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:20:45Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:25Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:25 crc kubenswrapper[4764]: I0309 13:23:25.868437 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cc78a2e5-6620-4ea8-8bc7-c62bcc3348cd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e22c92297ebb9853ff582f519aa4c98c1468e757b53b1245f7a7ab0f9fe25344\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/ope
nshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://050547213bbff923b18f34b1196952d038bab09162ab803da36c99fa3f769dd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5819fd898c59b4cf866334e40822b6142b014a2e11f4ca75b98d549665b1575a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kub
ernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f97e59c94171eedb27889b9cb72fa1b241a5e09c23310576dfd844db1bfc2292\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f97e59c94171eedb27889b9cb72fa1b241a5e09c23310576dfd844db1bfc2292\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:20:46Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:20:45Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:25Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:25 crc kubenswrapper[4764]: I0309 13:23:25.885515 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:25Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:25 crc kubenswrapper[4764]: I0309 13:23:25.910097 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:25Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:26 crc kubenswrapper[4764]: I0309 13:23:26.559279 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 13:23:26 crc kubenswrapper[4764]: I0309 13:23:26.559298 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 13:23:26 crc kubenswrapper[4764]: I0309 13:23:26.559312 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 13:23:26 crc kubenswrapper[4764]: E0309 13:23:26.560902 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 13:23:26 crc kubenswrapper[4764]: E0309 13:23:26.561008 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 13:23:26 crc kubenswrapper[4764]: E0309 13:23:26.561080 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 13:23:27 crc kubenswrapper[4764]: I0309 13:23:27.559161 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-wkwdz" Mar 09 13:23:27 crc kubenswrapper[4764]: E0309 13:23:27.559347 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wkwdz" podUID="6597fc34-10ee-4984-9c69-f4b7c0d46e2a" Mar 09 13:23:28 crc kubenswrapper[4764]: I0309 13:23:28.195094 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-zmzm7_202a1f58-ce83-4374-ac48-dc806f7b9d6b/kube-multus/0.log" Mar 09 13:23:28 crc kubenswrapper[4764]: I0309 13:23:28.195156 4764 generic.go:334] "Generic (PLEG): container finished" podID="202a1f58-ce83-4374-ac48-dc806f7b9d6b" containerID="05cbc20845591e82618289ff2509634cc5e2d05362e8414b3ff91a73d0719331" exitCode=1 Mar 09 13:23:28 crc kubenswrapper[4764]: I0309 13:23:28.195192 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-zmzm7" event={"ID":"202a1f58-ce83-4374-ac48-dc806f7b9d6b","Type":"ContainerDied","Data":"05cbc20845591e82618289ff2509634cc5e2d05362e8414b3ff91a73d0719331"} Mar 09 13:23:28 crc kubenswrapper[4764]: I0309 13:23:28.195669 4764 scope.go:117] "RemoveContainer" containerID="05cbc20845591e82618289ff2509634cc5e2d05362e8414b3ff91a73d0719331" Mar 09 13:23:28 crc kubenswrapper[4764]: I0309 13:23:28.212512 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-crvdf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"072442e6-8ece-4f72-a8cb-ad7ef1e3facb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5c773d25fbc4d390f6da4699cdfa6bac3745eb18288fd5bb885b9a4dca9a519\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://efdc5099d0e35ffc801fb7fb52979c62ed875cd19c8d0b015350b4906e6ea61f\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://efdc5099d0e35ffc801fb7fb52979c62ed875cd19c8d0b015350b4906e6ea61f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c57a862615a1748c2abb6ae6147d48398f8246d030b1527963f77743d8d802a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c57a862615a1748c2abb6ae6147d48398f8246d030b1527963f77743d8d802a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:22:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:43Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a96bc81c6b97f1f03d9cb304263ff47034248b9949771825e1813c284032b91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a96bc81c6b97f1f03d9cb304263ff47034248b9949771825e1813c284032b91\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:22:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://885b4
c9d85cdb3705865fae2824c36e6440c602deaaf1345fffa0eda8de42a96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://885b4c9d85cdb3705865fae2824c36e6440c602deaaf1345fffa0eda8de42a96\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:22:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ff7f6fa22134725b00be193ba8f58c9ad083335c7f795fa87a1cad202a2496f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2ff7f6fa22134725b00be193ba8f58c9ad083335c7f795fa87a1cad202a2496f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:22:46Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba4e3985b08c9b6913c4e31f9645f8544fb5f7c0cbd75d116618fd9e479588d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ba4e3985b08c9b6913c4e31f9645f8544fb5f7c0cbd75d116618fd9e479588d0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:22:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-crvdf\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:28Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:28 crc kubenswrapper[4764]: I0309 13:23:28.239242 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7kggv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b8ccb4f5-550a-41b2-b39d-201cdd5d902a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9713dea51462b7a0ce8cb68a64d8707b76dfca5f8c770cac3b54339d538dec01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://21feffb4bef29e65989455ec8c5a040a53de90f01ede1a9f1848a67f64a35d29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6859bb282db6608def0d7983e715cf3f5f5f454bf896bb21dca124a075e5701d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20f0f31508039b62436cf5b317419ffc5ca18d03050bfa830060f40e12684519\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:45Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ceb1c19d0ff9c03c8790b54d5dc48bafd77f6c289fc6225a88d6bfb8e04a8eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ad4c0df0c830a9ef102a630db33f055925c0b9b1734a792be013f8e261523c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://850c5b772d45fee8f95d82796bdd0694bc5d2829f80a8e826bd3a428943f08f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://850c5b772d45fee8f95d82796bdd0694bc5d2829f80a8e826bd3a428943f08f6\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-09T13:23:13Z\\\",\\\"message\\\":\\\"n.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-etcd/etcd\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, 
Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}, built lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-etcd/etcd_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-etcd/etcd\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.253\\\\\\\", Port:2379, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}, services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.253\\\\\\\", Port:9979, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI0309 13:23:13.282082 7129 obj_retry.go:418] Waiting for all the *v1.Pod retry setup to complete in iterateRetryResources\\\\nF0309 13:23:13.282128 7129 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T13:23:12Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-7kggv_openshift-ovn-kubernetes(b8ccb4f5-550a-41b2-b39d-201cdd5d902a)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://df799033cab3d457eef490d885e4440024ba2c7daa430d489ee9907d908d1017\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd65db46dfb71ce3080adad69890c363fb149c5bf12c903e66a212a28888ae66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd65db46dfb71ce308
0adad69890c363fb149c5bf12c903e66a212a28888ae66\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:22:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7kggv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:28Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:28 crc kubenswrapper[4764]: I0309 13:23:28.250475 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-hbmjc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7c2b6620-c8b8-47cf-9f15-883dbf8e34cc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d8a2c962bc242865fd8c0fce8fc21f9e385e38cc9b2892663a82c02dfe6cac9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z4sdq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:34Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-hbmjc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:28Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:28 crc kubenswrapper[4764]: I0309 13:23:28.261418 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hzjxs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"30a92c50-fe51-40d9-a69c-4b5fd722bfc6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73df6317c77ae8a47acf57b1fdb2710ccefa68c346011b5a98d5a38c775c9727\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdm9q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://10aea1054bea324b596bc250a9d3f7010db0636732f622c0674db83b1f245098\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdm9q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-hzjxs\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:28Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:28 crc kubenswrapper[4764]: I0309 13:23:28.270166 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-wkwdz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6597fc34-10ee-4984-9c69-f4b7c0d46e2a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gsbrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gsbrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:40Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-wkwdz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:28Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:28 crc 
kubenswrapper[4764]: I0309 13:23:28.286357 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b3bffb853598e5fea63b5302816325c29f740c9e9e2a7f0ee4f87e810045f97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:28Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:28 crc kubenswrapper[4764]: I0309 13:23:28.300064 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-r5bnx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7fede188-66d9-4cb1-af19-c94afe7fbcde\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a63a85aa57ac9496c6745b881ceba7015f610b3ce31c17513724f036bea999f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pjp5c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:27Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-r5bnx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:28Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:28 crc kubenswrapper[4764]: I0309 13:23:28.316822 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zmzm7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"202a1f58-ce83-4374-ac48-dc806f7b9d6b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:23:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:23:28Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://05cbc20845591e82618289ff2509634cc5e2d05362e8414b3ff91a73d0719331\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://05cbc20845591e82618289ff2509634cc5e2d05362e8414b3ff91a73d0719331\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-09T13:23:27Z\\\",\\\"message\\\":\\\"2026-03-09T13:22:41+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_a97e94f9-7ade-431c-bdc1-eded365600db\\\\n2026-03-09T13:22:41+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_a97e94f9-7ade-431c-bdc1-eded365600db to /host/opt/cni/bin/\\\\n2026-03-09T13:22:42Z [verbose] multus-daemon started\\\\n2026-03-09T13:22:42Z [verbose] Readiness Indicator file check\\\\n2026-03-09T13:23:27Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rk5pb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\
\":\\\"2026-03-09T13:22:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zmzm7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:28Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:28 crc kubenswrapper[4764]: I0309 13:23:28.325194 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a8a532f3-0fa0-4125-aeef-fd19fc524647\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2756a837bdc4537e5cf1848d4c91935799827b254edac2f5a67cf8ff7ce886a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"st
arted\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4166a7efb6711d50164acbe2944b72105acb42653213963d1e52516f0c43f982\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4166a7efb6711d50164acbe2944b72105acb42653213963d1e52516f0c43f982\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:20:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:28Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:28 crc kubenswrapper[4764]: I0309 13:23:28.335323 4764 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-xxczl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6bcdd179-43c2-427c-9fac-7155c122e922\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://83cca42acbb21422d7ee12f4e9824007c225a563a00ac3769c638770e404ce7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25jms\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\
\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a541959fd435ba385dfa711ec03b7d78fb75528fed89a7bd3620aa50fbab26ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25jms\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:28Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xxczl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:28Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:28 crc kubenswrapper[4764]: I0309 13:23:28.344990 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:28Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:28 crc kubenswrapper[4764]: I0309 13:23:28.362548 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1507e859-72ea-4c35-a5a7-6d0f48b19b09\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://289f145eb4f8412759fdbbc747534a08534b2f848f18206d9f9c2d03adc86944\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c220cbb5e42a64e54a853893cbdd6e5502e5f1e5cd7fa5aa7fffc3e27c5b2d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://98462dd9c7c97ff5402911d179c7739d92d293c4abacea1ed4a394ec9f33dfed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4039a3daa857df7f6171121f2a936d4893913c8374ad7229c758dd93a8821386\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a9b753c5c47533ae4771e38e6ce4f7a3822bc208557dda27fd1b525c76044f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://36c569656352f45737e634caa991a17028b70ef88ec9449a7f22edd270d63443\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://36c569656352f45737e634caa991a17028b70ef88ec9449a7f22edd270d63443\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-09T13:20:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://573ffaefe818651692cd854f41bd67b933fa6496e3a833873b40c2324b3a09e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://573ffaefe818651692cd854f41bd67b933fa6496e3a833873b40c2324b3a09e1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ae08be6456f1300abf253a9801e2585bed0bb928045bcb96921efa4b5d0200eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae08be6456f1300abf253a9801e2585bed0bb928045bcb96921efa4b5d0200eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:20:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:20:45Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:28Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:28 crc kubenswrapper[4764]: I0309 13:23:28.373298 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cc78a2e5-6620-4ea8-8bc7-c62bcc3348cd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e22c92297ebb9853ff582f519aa4c98c1468e757b53b1245f7a7ab0f9fe25344\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/ope
nshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://050547213bbff923b18f34b1196952d038bab09162ab803da36c99fa3f769dd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5819fd898c59b4cf866334e40822b6142b014a2e11f4ca75b98d549665b1575a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kub
ernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f97e59c94171eedb27889b9cb72fa1b241a5e09c23310576dfd844db1bfc2292\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f97e59c94171eedb27889b9cb72fa1b241a5e09c23310576dfd844db1bfc2292\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:20:46Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:20:45Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:28Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:28 crc kubenswrapper[4764]: I0309 13:23:28.384715 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:28Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:28 crc kubenswrapper[4764]: I0309 13:23:28.395218 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://acc019f66fb28c4b9a901a359ecf5e127e643482a864c4d9da494a62286c0e3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-09T13:23:28Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:28 crc kubenswrapper[4764]: I0309 13:23:28.405704 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:28Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:28 crc kubenswrapper[4764]: I0309 13:23:28.418737 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7c21ab2-1820-47de-a61d-71d81928564a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://482469a933d0aba722e6a341f6934669e0d84998bd2c99d6962a17548ab80ce7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3571aa0b654db21a79a11d6c58af9f858d2493283d9eff7a84f3404c34dcf0b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4aae266a5256be6c65a7637dfaed552087689f317f7b61dcb52f4aa6d4d4d7e6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e04ceb7d287648fc77742a0a1b4cf13a56f634301b703291340961730fc73811\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://033930947dc527026e2797f16a6dd9181c8c7d8ec88b09031a132f55b953c356\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T13:21:44Z\\\"
,\\\"message\\\":\\\"W0309 13:21:43.803612 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0309 13:21:43.803959 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773062503 cert, and key in /tmp/serving-cert-1644690243/serving-signer.crt, /tmp/serving-cert-1644690243/serving-signer.key\\\\nI0309 13:21:44.168505 1 observer_polling.go:159] Starting file observer\\\\nW0309 13:21:44.177360 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0309 13:21:44.177910 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0309 13:21:44.178828 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1644690243/tls.crt::/tmp/serving-cert-1644690243/tls.key\\\\\\\"\\\\nF0309 13:21:44.724786 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:43Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3a34e47ede9b05cec931ca6337a98381bcceeb735352702d19128e8dfe8476f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:48Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a38de1b3854f33b5728435571b8f0df92aeaae7d8e8a235ed58044fa2a9e09c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a38de1b3854f33b5728435571b8f0df92aeaae7d8e8a235ed58044fa2a9e09c3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:46Z\\\",\\\"reason\\\":\\\"Completed\\\",
\\\"startedAt\\\":\\\"2026-03-09T13:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:20:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:28Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:28 crc kubenswrapper[4764]: I0309 13:23:28.429267 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9db73dd7-a619-4291-a559-aab80ef8e067\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98efc0508ec5c94130c0707470eed2e0ae1117e57a8139af476e03a4bc67e788\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c0737
2b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://698e8e53f96938b60a1aefe2ccea945879623e79d8b0f8836a67cb38ec85918f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T13:21:46Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0309 13:21:16.528766 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. 
Worst graceful lease acquisition is {26s}.\\\\nI0309 13:21:16.529708 1 observer_polling.go:159] Starting file observer\\\\nI0309 13:21:16.533102 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0309 13:21:16.533732 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0309 13:21:42.646426 1 cert_rotation.go:91] certificate rotation detected, shutting down client connections to start using new credentials\\\\nI0309 13:21:46.099155 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0309 13:21:46.099249 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:16Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7857b72b1425eeefd8acf226dd6f0d6c783e14267fcfb79dd038118573a2b26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:46Z\\\"}},\\\"v
olumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4cbb681d47f7be8b418b159d390e6857a8a1f5cff40ac621c892019ceef7828\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5476061465c7cd40b23fd92e34348e1889754bbfabe1e2bc6419637cf70604d5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"host
IPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:20:45Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:28Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:28 crc kubenswrapper[4764]: I0309 13:23:28.440965 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4dd3c85ee070f78d4d30b9ccad3da35f82313620fffdaf69c3424865843dfa39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"star
ted\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5280ec2d6cc95dbda0281276e900fa2c9d43170f932966688440a55c1deb454\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:28Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:28 crc kubenswrapper[4764]: I0309 13:23:28.559634 4764 util.go:30] "No sandbox 
for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 13:23:28 crc kubenswrapper[4764]: I0309 13:23:28.559687 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 13:23:28 crc kubenswrapper[4764]: I0309 13:23:28.559704 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 13:23:28 crc kubenswrapper[4764]: E0309 13:23:28.559774 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 13:23:28 crc kubenswrapper[4764]: E0309 13:23:28.559960 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 13:23:28 crc kubenswrapper[4764]: E0309 13:23:28.560087 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 13:23:29 crc kubenswrapper[4764]: I0309 13:23:29.200355 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-zmzm7_202a1f58-ce83-4374-ac48-dc806f7b9d6b/kube-multus/0.log" Mar 09 13:23:29 crc kubenswrapper[4764]: I0309 13:23:29.200452 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-zmzm7" event={"ID":"202a1f58-ce83-4374-ac48-dc806f7b9d6b","Type":"ContainerStarted","Data":"ae16a6074b17ec1df33139f5a8f220d15bea5658fc20ad653e2af7b2427dec6a"} Mar 09 13:23:29 crc kubenswrapper[4764]: I0309 13:23:29.212565 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-wkwdz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6597fc34-10ee-4984-9c69-f4b7c0d46e2a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gsbrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gsbrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:40Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-wkwdz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:29Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:29 crc 
kubenswrapper[4764]: I0309 13:23:29.225877 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b3bffb853598e5fea63b5302816325c29f740c9e9e2a7f0ee4f87e810045f97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:29Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:29 crc kubenswrapper[4764]: I0309 13:23:29.237473 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-r5bnx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7fede188-66d9-4cb1-af19-c94afe7fbcde\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a63a85aa57ac9496c6745b881ceba7015f610b3ce31c17513724f036bea999f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pjp5c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:27Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-r5bnx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:29Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:29 crc kubenswrapper[4764]: I0309 13:23:29.250840 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zmzm7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"202a1f58-ce83-4374-ac48-dc806f7b9d6b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:23:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:23:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae16a6074b17ec1df33139f5a8f220d15bea5658fc20ad653e2af7b2427dec6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://05cbc20845591e82618289ff2509634cc5e2d05362e8414b3ff91a73d0719331\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-09T13:23:27Z\\\",\\\"message\\\":\\\"2026-03-09T13:22:41+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_a97e94f9-7ade-431c-bdc1-eded365600db\\\\n2026-03-09T13:22:41+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_a97e94f9-7ade-431c-bdc1-eded365600db to /host/opt/cni/bin/\\\\n2026-03-09T13:22:42Z [verbose] multus-daemon started\\\\n2026-03-09T13:22:42Z [verbose] 
Readiness Indicator file check\\\\n2026-03-09T13:23:27Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:40Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:23:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rk5pb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zmzm7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:29Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:29 crc kubenswrapper[4764]: I0309 13:23:29.266091 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-crvdf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"072442e6-8ece-4f72-a8cb-ad7ef1e3facb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5c
773d25fbc4d390f6da4699cdfa6bac3745eb18288fd5bb885b9a4dca9a519\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://efdc5099d0e35ffc801fb7fb52979c62ed875cd19c8d0b015350b4906e6ea61f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://efdc5099d0e35ffc801fb7fb52979c62ed875cd19c8d0b015350b4906e6ea61f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly
\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c57a862615a1748c2abb6ae6147d48398f8246d030b1527963f77743d8d802a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c57a862615a1748c2abb6ae6147d48398f8246d030b1527963f77743d8d802a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:22:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a96bc81c6b97f1f03d9cb304263ff47034248b9949771825e1813c284032b91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203b
b2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a96bc81c6b97f1f03d9cb304263ff47034248b9949771825e1813c284032b91\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:22:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://885b4c9d85cdb3705865fae2824c36e6440c602deaaf1345fffa0eda8de42a96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://885b4c9d85cdb3705865fae2824c36e6440c602deaaf1345fffa0eda8de42a96\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:22:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-rel
ease\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ff7f6fa22134725b00be193ba8f58c9ad083335c7f795fa87a1cad202a2496f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2ff7f6fa22134725b00be193ba8f58c9ad083335c7f795fa87a1cad202a2496f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:22:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba4e3985b08c9b6913c4e31f9645f8544fb5f7c0cbd75d116618fd9e479588d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-c
ni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ba4e3985b08c9b6913c4e31f9645f8544fb5f7c0cbd75d116618fd9e479588d0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:22:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-crvdf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:29Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:29 crc kubenswrapper[4764]: I0309 13:23:29.283210 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7kggv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b8ccb4f5-550a-41b2-b39d-201cdd5d902a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9713dea51462b7a0ce8cb68a64d8707b76dfca5f8c770cac3b54339d538dec01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://21feffb4bef29e65989455ec8c5a040a53de90f01ede1a9f1848a67f64a35d29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6859bb282db6608def0d7983e715cf3f5f5f454bf896bb21dca124a075e5701d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20f0f31508039b62436cf5b317419ffc5ca18d03050bfa830060f40e12684519\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:45Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ceb1c19d0ff9c03c8790b54d5dc48bafd77f6c289fc6225a88d6bfb8e04a8eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ad4c0df0c830a9ef102a630db33f055925c0b9b1734a792be013f8e261523c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://850c5b772d45fee8f95d82796bdd0694bc5d2829f80a8e826bd3a428943f08f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://850c5b772d45fee8f95d82796bdd0694bc5d2829f80a8e826bd3a428943f08f6\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-09T13:23:13Z\\\",\\\"message\\\":\\\"n.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-etcd/etcd\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, 
Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}, built lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-etcd/etcd_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-etcd/etcd\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.253\\\\\\\", Port:2379, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}, services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.253\\\\\\\", Port:9979, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI0309 13:23:13.282082 7129 obj_retry.go:418] Waiting for all the *v1.Pod retry setup to complete in iterateRetryResources\\\\nF0309 13:23:13.282128 7129 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T13:23:12Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-7kggv_openshift-ovn-kubernetes(b8ccb4f5-550a-41b2-b39d-201cdd5d902a)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://df799033cab3d457eef490d885e4440024ba2c7daa430d489ee9907d908d1017\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd65db46dfb71ce3080adad69890c363fb149c5bf12c903e66a212a28888ae66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd65db46dfb71ce308
0adad69890c363fb149c5bf12c903e66a212a28888ae66\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:22:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7kggv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:29Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:29 crc kubenswrapper[4764]: I0309 13:23:29.295837 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-hbmjc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7c2b6620-c8b8-47cf-9f15-883dbf8e34cc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d8a2c962bc242865fd8c0fce8fc21f9e385e38cc9b2892663a82c02dfe6cac9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z4sdq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:34Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-hbmjc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:29Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:29 crc kubenswrapper[4764]: I0309 13:23:29.307813 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hzjxs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"30a92c50-fe51-40d9-a69c-4b5fd722bfc6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73df6317c77ae8a47acf57b1fdb2710ccefa68c346011b5a98d5a38c775c9727\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdm9q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://10aea1054bea324b596bc250a9d3f7010db0636732f622c0674db83b1f245098\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdm9q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-hzjxs\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:29Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:29 crc kubenswrapper[4764]: I0309 13:23:29.317998 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a8a532f3-0fa0-4125-aeef-fd19fc524647\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2756a837bdc4537e5cf1848d4c91935799827b254edac2f5a67cf8ff7ce886a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\
"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4166a7efb6711d50164acbe2944b72105acb42653213963d1e52516f0c43f982\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4166a7efb6711d50164acbe2944b72105acb42653213963d1e52516f0c43f982\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:20:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:29Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:29 crc kubenswrapper[4764]: I0309 13:23:29.330992 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6bcdd179-43c2-427c-9fac-7155c122e922\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://83cca42acbb21422d7ee12f4e9824007c225a563a00ac3769c638770e404ce7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25jms\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a541959fd435ba385dfa711ec03b7d78fb75528f
ed89a7bd3620aa50fbab26ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25jms\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:28Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xxczl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:29Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:29 crc kubenswrapper[4764]: I0309 13:23:29.354015 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1507e859-72ea-4c35-a5a7-6d0f48b19b09\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://289f145eb4f8412759fdbbc747534a08534b2f848f18206d9f9c2d03adc86944\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c220cbb5e42a64e54a853893cbdd6e5502e5f1e5cd7fa5aa7fffc3e27c5b2d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://98462dd9c7c97ff5402911d179c7739d92d293c4abacea1ed4a394ec9f33dfed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4039a3daa857df7f6171121f2a936d4893913c8374ad7229c758dd93a8821386\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a9b753c5c47533ae4771e38e6ce4f7a3822bc208557dda27fd1b525c76044f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://36c569656352f45737e634caa991a17028b70ef88ec9449a7f22edd270d63443\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://36c569656352f45737e634caa991a17028b70ef88ec9449a7f22edd270d63443\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-09T13:20:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://573ffaefe818651692cd854f41bd67b933fa6496e3a833873b40c2324b3a09e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://573ffaefe818651692cd854f41bd67b933fa6496e3a833873b40c2324b3a09e1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ae08be6456f1300abf253a9801e2585bed0bb928045bcb96921efa4b5d0200eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae08be6456f1300abf253a9801e2585bed0bb928045bcb96921efa4b5d0200eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:20:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:20:45Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:29Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:29 crc kubenswrapper[4764]: I0309 13:23:29.365105 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cc78a2e5-6620-4ea8-8bc7-c62bcc3348cd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e22c92297ebb9853ff582f519aa4c98c1468e757b53b1245f7a7ab0f9fe25344\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/ope
nshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://050547213bbff923b18f34b1196952d038bab09162ab803da36c99fa3f769dd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5819fd898c59b4cf866334e40822b6142b014a2e11f4ca75b98d549665b1575a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kub
ernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f97e59c94171eedb27889b9cb72fa1b241a5e09c23310576dfd844db1bfc2292\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f97e59c94171eedb27889b9cb72fa1b241a5e09c23310576dfd844db1bfc2292\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:20:46Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:20:45Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:29Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:29 crc kubenswrapper[4764]: I0309 13:23:29.375938 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:29Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:29 crc kubenswrapper[4764]: I0309 13:23:29.386192 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:29Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:29 crc kubenswrapper[4764]: I0309 13:23:29.400992 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7c21ab2-1820-47de-a61d-71d81928564a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://482469a933d0aba722e6a341f6934669e0d84998bd2c99d6962a17548ab80ce7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3571aa0b654db21a79a11d6c58af9f858d2493283d9eff7a84f3404c34dcf0b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4aae266a5256be6c65a7637dfaed552087689f317f7b61dcb52f4aa6d4d4d7e6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e04ceb7d287648fc77742a0a1b4cf13a56f634301b703291340961730fc73811\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://033930947dc527026e2797f16a6dd9181c8c7d8ec88b09031a132f55b953c356\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T13:21:44Z\\\"
,\\\"message\\\":\\\"W0309 13:21:43.803612 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0309 13:21:43.803959 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773062503 cert, and key in /tmp/serving-cert-1644690243/serving-signer.crt, /tmp/serving-cert-1644690243/serving-signer.key\\\\nI0309 13:21:44.168505 1 observer_polling.go:159] Starting file observer\\\\nW0309 13:21:44.177360 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0309 13:21:44.177910 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0309 13:21:44.178828 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1644690243/tls.crt::/tmp/serving-cert-1644690243/tls.key\\\\\\\"\\\\nF0309 13:21:44.724786 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:43Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3a34e47ede9b05cec931ca6337a98381bcceeb735352702d19128e8dfe8476f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:48Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a38de1b3854f33b5728435571b8f0df92aeaae7d8e8a235ed58044fa2a9e09c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a38de1b3854f33b5728435571b8f0df92aeaae7d8e8a235ed58044fa2a9e09c3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:46Z\\\",\\\"reason\\\":\\\"Completed\\\",
\\\"startedAt\\\":\\\"2026-03-09T13:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:20:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:29Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:29 crc kubenswrapper[4764]: I0309 13:23:29.415452 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9db73dd7-a619-4291-a559-aab80ef8e067\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98efc0508ec5c94130c0707470eed2e0ae1117e57a8139af476e03a4bc67e788\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c0737
2b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://698e8e53f96938b60a1aefe2ccea945879623e79d8b0f8836a67cb38ec85918f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T13:21:46Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0309 13:21:16.528766 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. 
Worst graceful lease acquisition is {26s}.\\\\nI0309 13:21:16.529708 1 observer_polling.go:159] Starting file observer\\\\nI0309 13:21:16.533102 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0309 13:21:16.533732 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0309 13:21:42.646426 1 cert_rotation.go:91] certificate rotation detected, shutting down client connections to start using new credentials\\\\nI0309 13:21:46.099155 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0309 13:21:46.099249 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:16Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7857b72b1425eeefd8acf226dd6f0d6c783e14267fcfb79dd038118573a2b26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:46Z\\\"}},\\\"v
olumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4cbb681d47f7be8b418b159d390e6857a8a1f5cff40ac621c892019ceef7828\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5476061465c7cd40b23fd92e34348e1889754bbfabe1e2bc6419637cf70604d5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"host
IPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:20:45Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:29Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:29 crc kubenswrapper[4764]: I0309 13:23:29.428977 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4dd3c85ee070f78d4d30b9ccad3da35f82313620fffdaf69c3424865843dfa39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"star
ted\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5280ec2d6cc95dbda0281276e900fa2c9d43170f932966688440a55c1deb454\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:29Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:29 crc kubenswrapper[4764]: I0309 13:23:29.441907 4764 status_manager.go:875] 
"Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://acc019f66fb28c4b9a901a359ecf5e127e643482a864c4d9da494a62286c0e3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:29Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:29 crc kubenswrapper[4764]: I0309 13:23:29.456949 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:29Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:29 crc kubenswrapper[4764]: I0309 13:23:29.559180 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wkwdz" Mar 09 13:23:29 crc kubenswrapper[4764]: E0309 13:23:29.559331 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-wkwdz" podUID="6597fc34-10ee-4984-9c69-f4b7c0d46e2a" Mar 09 13:23:29 crc kubenswrapper[4764]: I0309 13:23:29.559993 4764 scope.go:117] "RemoveContainer" containerID="850c5b772d45fee8f95d82796bdd0694bc5d2829f80a8e826bd3a428943f08f6" Mar 09 13:23:29 crc kubenswrapper[4764]: E0309 13:23:29.560159 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-7kggv_openshift-ovn-kubernetes(b8ccb4f5-550a-41b2-b39d-201cdd5d902a)\"" pod="openshift-ovn-kubernetes/ovnkube-node-7kggv" podUID="b8ccb4f5-550a-41b2-b39d-201cdd5d902a" Mar 09 13:23:30 crc kubenswrapper[4764]: I0309 13:23:30.559676 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 13:23:30 crc kubenswrapper[4764]: I0309 13:23:30.559777 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 13:23:30 crc kubenswrapper[4764]: I0309 13:23:30.559676 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 13:23:30 crc kubenswrapper[4764]: E0309 13:23:30.559800 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 13:23:30 crc kubenswrapper[4764]: E0309 13:23:30.559927 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 13:23:30 crc kubenswrapper[4764]: E0309 13:23:30.559990 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 13:23:30 crc kubenswrapper[4764]: E0309 13:23:30.651361 4764 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 09 13:23:31 crc kubenswrapper[4764]: I0309 13:23:31.558657 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wkwdz" Mar 09 13:23:31 crc kubenswrapper[4764]: E0309 13:23:31.558795 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-wkwdz" podUID="6597fc34-10ee-4984-9c69-f4b7c0d46e2a" Mar 09 13:23:32 crc kubenswrapper[4764]: I0309 13:23:32.559462 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 13:23:32 crc kubenswrapper[4764]: I0309 13:23:32.559583 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 13:23:32 crc kubenswrapper[4764]: I0309 13:23:32.559462 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 13:23:32 crc kubenswrapper[4764]: E0309 13:23:32.559598 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 13:23:32 crc kubenswrapper[4764]: E0309 13:23:32.559760 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 13:23:32 crc kubenswrapper[4764]: E0309 13:23:32.559860 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 13:23:33 crc kubenswrapper[4764]: I0309 13:23:33.559713 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wkwdz" Mar 09 13:23:33 crc kubenswrapper[4764]: E0309 13:23:33.559913 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wkwdz" podUID="6597fc34-10ee-4984-9c69-f4b7c0d46e2a" Mar 09 13:23:34 crc kubenswrapper[4764]: I0309 13:23:34.559487 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 13:23:34 crc kubenswrapper[4764]: I0309 13:23:34.559487 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 13:23:34 crc kubenswrapper[4764]: E0309 13:23:34.560176 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 13:23:34 crc kubenswrapper[4764]: I0309 13:23:34.559575 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 13:23:34 crc kubenswrapper[4764]: E0309 13:23:34.560261 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 13:23:34 crc kubenswrapper[4764]: E0309 13:23:34.560758 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 13:23:34 crc kubenswrapper[4764]: I0309 13:23:34.650034 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:23:34 crc kubenswrapper[4764]: I0309 13:23:34.650123 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:23:34 crc kubenswrapper[4764]: I0309 13:23:34.650145 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:23:34 crc kubenswrapper[4764]: I0309 13:23:34.650176 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:23:34 crc kubenswrapper[4764]: I0309 13:23:34.650198 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:23:34Z","lastTransitionTime":"2026-03-09T13:23:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:23:34 crc kubenswrapper[4764]: E0309 13:23:34.674456 4764 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:23:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:23:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:23:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:23:34Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:23:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:23:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:23:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:23:34Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"44d5d5b8-3cb1-4753-9ed2-c6ede7c42d06\\\",\\\"systemUUID\\\":\\\"470a86bd-e6aa-42c1-b220-b7b8c0289210\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:34Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:34 crc kubenswrapper[4764]: I0309 13:23:34.680285 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:23:34 crc kubenswrapper[4764]: I0309 13:23:34.680360 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:23:34 crc kubenswrapper[4764]: I0309 13:23:34.680383 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:23:34 crc kubenswrapper[4764]: I0309 13:23:34.680412 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:23:34 crc kubenswrapper[4764]: I0309 13:23:34.680433 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:23:34Z","lastTransitionTime":"2026-03-09T13:23:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:23:34 crc kubenswrapper[4764]: E0309 13:23:34.700833 4764 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:23:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:23:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:23:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:23:34Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:23:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:23:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:23:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:23:34Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"44d5d5b8-3cb1-4753-9ed2-c6ede7c42d06\\\",\\\"systemUUID\\\":\\\"470a86bd-e6aa-42c1-b220-b7b8c0289210\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:34Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:34 crc kubenswrapper[4764]: I0309 13:23:34.704572 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:23:34 crc kubenswrapper[4764]: I0309 13:23:34.704605 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:23:34 crc kubenswrapper[4764]: I0309 13:23:34.704624 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:23:34 crc kubenswrapper[4764]: I0309 13:23:34.704638 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:23:34 crc kubenswrapper[4764]: I0309 13:23:34.704683 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:23:34Z","lastTransitionTime":"2026-03-09T13:23:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:23:34 crc kubenswrapper[4764]: E0309 13:23:34.721302 4764 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:23:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:23:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:23:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:23:34Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:23:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:23:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:23:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:23:34Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"44d5d5b8-3cb1-4753-9ed2-c6ede7c42d06\\\",\\\"systemUUID\\\":\\\"470a86bd-e6aa-42c1-b220-b7b8c0289210\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:34Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:34 crc kubenswrapper[4764]: I0309 13:23:34.725756 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:23:34 crc kubenswrapper[4764]: I0309 13:23:34.725807 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:23:34 crc kubenswrapper[4764]: I0309 13:23:34.725821 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:23:34 crc kubenswrapper[4764]: I0309 13:23:34.725840 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:23:34 crc kubenswrapper[4764]: I0309 13:23:34.725853 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:23:34Z","lastTransitionTime":"2026-03-09T13:23:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:23:34 crc kubenswrapper[4764]: E0309 13:23:34.737612 4764 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:23:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:23:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:23:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:23:34Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:23:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:23:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:23:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:23:34Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"44d5d5b8-3cb1-4753-9ed2-c6ede7c42d06\\\",\\\"systemUUID\\\":\\\"470a86bd-e6aa-42c1-b220-b7b8c0289210\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:34Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:34 crc kubenswrapper[4764]: I0309 13:23:34.741158 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:23:34 crc kubenswrapper[4764]: I0309 13:23:34.741194 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:23:34 crc kubenswrapper[4764]: I0309 13:23:34.741207 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:23:34 crc kubenswrapper[4764]: I0309 13:23:34.741223 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:23:34 crc kubenswrapper[4764]: I0309 13:23:34.741234 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:23:34Z","lastTransitionTime":"2026-03-09T13:23:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:23:34 crc kubenswrapper[4764]: E0309 13:23:34.761210 4764 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:23:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:23:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:23:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:23:34Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:23:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:23:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:23:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:23:34Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"44d5d5b8-3cb1-4753-9ed2-c6ede7c42d06\\\",\\\"systemUUID\\\":\\\"470a86bd-e6aa-42c1-b220-b7b8c0289210\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:34Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:34 crc kubenswrapper[4764]: E0309 13:23:34.761368 4764 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 09 13:23:35 crc kubenswrapper[4764]: I0309 13:23:35.559448 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wkwdz" Mar 09 13:23:35 crc kubenswrapper[4764]: E0309 13:23:35.559632 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-wkwdz" podUID="6597fc34-10ee-4984-9c69-f4b7c0d46e2a" Mar 09 13:23:35 crc kubenswrapper[4764]: I0309 13:23:35.583241 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1507e859-72ea-4c35-a5a7-6d0f48b19b09\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://289f145eb4f8412759fdbbc747534a08534b2f848f18206d9f9c2d03adc86944\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static
-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c220cbb5e42a64e54a853893cbdd6e5502e5f1e5cd7fa5aa7fffc3e27c5b2d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://98462dd9c7c97ff5402911d179c7739d92d293c4abacea1ed4a394ec9f33dfed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4039a3daa857df7f6171121f2a936d4893913c8374ad7229c758dd93a8821386\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a
93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a9b753c5c47533ae4771e38e6ce4f7a3822bc208557dda27fd1b525c76044f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://36c569656352f45737e634caa991a17028b70ef88ec9449a7f22edd270d63443\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\
\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://36c569656352f45737e634caa991a17028b70ef88ec9449a7f22edd270d63443\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://573ffaefe818651692cd854f41bd67b933fa6496e3a833873b40c2324b3a09e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://573ffaefe818651692cd854f41bd67b933fa6496e3a833873b40c2324b3a09e1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ae08be6456f1300abf253a9801e2585bed0bb928045bcb96921efa4b5d0200eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae08be6456f1300abf253a9801e2585bed0bb928045bcb96921efa4b5d0200eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:48Z
\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:20:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:20:45Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:35Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:35 crc kubenswrapper[4764]: I0309 13:23:35.597800 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cc78a2e5-6620-4ea8-8bc7-c62bcc3348cd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e22c92297ebb9853ff582f519aa4c98c1468e757b53b1245f7a7ab0f9fe25344\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://050547213bbff923b18f34b1196952d038bab09162ab803da36c99fa3f769dd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5819fd898c59b4cf866334e40822b6142b014a2e11f4ca75b98d549665b1575a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f97e59c94171eedb27889b9cb72fa1b241a5e09c23310576dfd844db1bfc2292\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://f97e59c94171eedb27889b9cb72fa1b241a5e09c23310576dfd844db1bfc2292\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:20:46Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:20:45Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:35Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:35 crc kubenswrapper[4764]: I0309 13:23:35.613982 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:35Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:35 crc kubenswrapper[4764]: I0309 13:23:35.630572 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:35Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:35 crc kubenswrapper[4764]: I0309 13:23:35.645100 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:35Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:35 crc kubenswrapper[4764]: E0309 13:23:35.651956 4764 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 09 13:23:35 crc kubenswrapper[4764]: I0309 13:23:35.665066 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7c21ab2-1820-47de-a61d-71d81928564a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://482469a933d0aba722e6a341f6934669e0d84998bd2c99d6962a17548ab80ce7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cr
i-o://c3571aa0b654db21a79a11d6c58af9f858d2493283d9eff7a84f3404c34dcf0b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4aae266a5256be6c65a7637dfaed552087689f317f7b61dcb52f4aa6d4d4d7e6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e04ceb7d287648fc77742a0a1b4cf13a56f634301b703291340961730fc73811\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"l
astState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://033930947dc527026e2797f16a6dd9181c8c7d8ec88b09031a132f55b953c356\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T13:21:44Z\\\",\\\"message\\\":\\\"W0309 13:21:43.803612 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0309 13:21:43.803959 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773062503 cert, and key in /tmp/serving-cert-1644690243/serving-signer.crt, /tmp/serving-cert-1644690243/serving-signer.key\\\\nI0309 13:21:44.168505 1 observer_polling.go:159] Starting file observer\\\\nW0309 13:21:44.177360 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0309 13:21:44.177910 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0309 13:21:44.178828 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1644690243/tls.crt::/tmp/serving-cert-1644690243/tls.key\\\\\\\"\\\\nF0309 13:21:44.724786 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:43Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3a34e47ede9b05cec931ca6337a98381bcceeb735352702d19128e8dfe8476f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:48Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a38de1b3854f33b5728435571b8f0df92aeaae7d8e8a235ed58044fa2a9e09c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a38de1b3854f33b5728435571b8f0df92aeaae7d8e8a235ed58044fa2a9e09c3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:46Z\\\",\\\"reason\\\":\\\"Completed\\\",
\\\"startedAt\\\":\\\"2026-03-09T13:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:20:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:35Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:35 crc kubenswrapper[4764]: I0309 13:23:35.680991 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9db73dd7-a619-4291-a559-aab80ef8e067\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98efc0508ec5c94130c0707470eed2e0ae1117e57a8139af476e03a4bc67e788\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c0737
2b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://698e8e53f96938b60a1aefe2ccea945879623e79d8b0f8836a67cb38ec85918f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T13:21:46Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0309 13:21:16.528766 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. 
Worst graceful lease acquisition is {26s}.\\\\nI0309 13:21:16.529708 1 observer_polling.go:159] Starting file observer\\\\nI0309 13:21:16.533102 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0309 13:21:16.533732 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0309 13:21:42.646426 1 cert_rotation.go:91] certificate rotation detected, shutting down client connections to start using new credentials\\\\nI0309 13:21:46.099155 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0309 13:21:46.099249 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:16Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7857b72b1425eeefd8acf226dd6f0d6c783e14267fcfb79dd038118573a2b26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:46Z\\\"}},\\\"v
olumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4cbb681d47f7be8b418b159d390e6857a8a1f5cff40ac621c892019ceef7828\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5476061465c7cd40b23fd92e34348e1889754bbfabe1e2bc6419637cf70604d5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"host
IPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:20:45Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:35Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:35 crc kubenswrapper[4764]: I0309 13:23:35.698160 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4dd3c85ee070f78d4d30b9ccad3da35f82313620fffdaf69c3424865843dfa39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"star
ted\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5280ec2d6cc95dbda0281276e900fa2c9d43170f932966688440a55c1deb454\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:35Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:35 crc kubenswrapper[4764]: I0309 13:23:35.710996 4764 status_manager.go:875] 
"Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://acc019f66fb28c4b9a901a359ecf5e127e643482a864c4d9da494a62286c0e3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:35Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:35 crc kubenswrapper[4764]: I0309 13:23:35.728190 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7kggv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b8ccb4f5-550a-41b2-b39d-201cdd5d902a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9713dea51462b7a0ce8cb68a64d8707b76dfca5f8c770cac3b54339d538dec01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://21feffb4bef29e65989455ec8c5a040a53de90f01ede1a9f1848a67f64a35d29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6859bb282db6608def0d7983e715cf3f5f5f454bf896bb21dca124a075e5701d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20f0f31508039b62436cf5b317419ffc5ca18d03050bfa830060f40e12684519\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:45Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ceb1c19d0ff9c03c8790b54d5dc48bafd77f6c289fc6225a88d6bfb8e04a8eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ad4c0df0c830a9ef102a630db33f055925c0b9b1734a792be013f8e261523c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://850c5b772d45fee8f95d82796bdd0694bc5d2829f80a8e826bd3a428943f08f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://850c5b772d45fee8f95d82796bdd0694bc5d2829f80a8e826bd3a428943f08f6\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-09T13:23:13Z\\\",\\\"message\\\":\\\"n.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-etcd/etcd\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, 
Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}, built lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-etcd/etcd_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-etcd/etcd\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.253\\\\\\\", Port:2379, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}, services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.253\\\\\\\", Port:9979, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI0309 13:23:13.282082 7129 obj_retry.go:418] Waiting for all the *v1.Pod retry setup to complete in iterateRetryResources\\\\nF0309 13:23:13.282128 7129 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T13:23:12Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-7kggv_openshift-ovn-kubernetes(b8ccb4f5-550a-41b2-b39d-201cdd5d902a)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://df799033cab3d457eef490d885e4440024ba2c7daa430d489ee9907d908d1017\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd65db46dfb71ce3080adad69890c363fb149c5bf12c903e66a212a28888ae66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd65db46dfb71ce308
0adad69890c363fb149c5bf12c903e66a212a28888ae66\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:22:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7kggv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:35Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:35 crc kubenswrapper[4764]: I0309 13:23:35.738971 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-hbmjc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7c2b6620-c8b8-47cf-9f15-883dbf8e34cc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d8a2c962bc242865fd8c0fce8fc21f9e385e38cc9b2892663a82c02dfe6cac9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z4sdq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:34Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-hbmjc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:35Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:35 crc kubenswrapper[4764]: I0309 13:23:35.750165 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hzjxs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"30a92c50-fe51-40d9-a69c-4b5fd722bfc6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73df6317c77ae8a47acf57b1fdb2710ccefa68c346011b5a98d5a38c775c9727\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdm9q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://10aea1054bea324b596bc250a9d3f7010db0636732f622c0674db83b1f245098\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdm9q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-hzjxs\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:35Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:35 crc kubenswrapper[4764]: I0309 13:23:35.763825 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-wkwdz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6597fc34-10ee-4984-9c69-f4b7c0d46e2a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gsbrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gsbrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:40Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-wkwdz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:35Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:35 crc 
kubenswrapper[4764]: I0309 13:23:35.779376 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b3bffb853598e5fea63b5302816325c29f740c9e9e2a7f0ee4f87e810045f97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:35Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:35 crc kubenswrapper[4764]: I0309 13:23:35.795185 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-r5bnx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7fede188-66d9-4cb1-af19-c94afe7fbcde\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a63a85aa57ac9496c6745b881ceba7015f610b3ce31c17513724f036bea999f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pjp5c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:27Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-r5bnx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:35Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:35 crc kubenswrapper[4764]: I0309 13:23:35.812384 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zmzm7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"202a1f58-ce83-4374-ac48-dc806f7b9d6b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:23:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:23:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae16a6074b17ec1df33139f5a8f220d15bea5658fc20ad653e2af7b2427dec6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://05cbc20845591e82618289ff2509634cc5e2d05362e8414b3ff91a73d0719331\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-09T13:23:27Z\\\",\\\"message\\\":\\\"2026-03-09T13:22:41+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_a97e94f9-7ade-431c-bdc1-eded365600db\\\\n2026-03-09T13:22:41+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_a97e94f9-7ade-431c-bdc1-eded365600db to /host/opt/cni/bin/\\\\n2026-03-09T13:22:42Z [verbose] multus-daemon started\\\\n2026-03-09T13:22:42Z [verbose] 
Readiness Indicator file check\\\\n2026-03-09T13:23:27Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:40Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:23:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rk5pb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zmzm7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:35Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:35 crc kubenswrapper[4764]: I0309 13:23:35.829403 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-crvdf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"072442e6-8ece-4f72-a8cb-ad7ef1e3facb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5c
773d25fbc4d390f6da4699cdfa6bac3745eb18288fd5bb885b9a4dca9a519\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://efdc5099d0e35ffc801fb7fb52979c62ed875cd19c8d0b015350b4906e6ea61f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://efdc5099d0e35ffc801fb7fb52979c62ed875cd19c8d0b015350b4906e6ea61f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly
\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c57a862615a1748c2abb6ae6147d48398f8246d030b1527963f77743d8d802a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c57a862615a1748c2abb6ae6147d48398f8246d030b1527963f77743d8d802a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:22:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a96bc81c6b97f1f03d9cb304263ff47034248b9949771825e1813c284032b91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203b
b2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a96bc81c6b97f1f03d9cb304263ff47034248b9949771825e1813c284032b91\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:22:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://885b4c9d85cdb3705865fae2824c36e6440c602deaaf1345fffa0eda8de42a96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://885b4c9d85cdb3705865fae2824c36e6440c602deaaf1345fffa0eda8de42a96\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:22:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-rel
ease\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ff7f6fa22134725b00be193ba8f58c9ad083335c7f795fa87a1cad202a2496f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2ff7f6fa22134725b00be193ba8f58c9ad083335c7f795fa87a1cad202a2496f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:22:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba4e3985b08c9b6913c4e31f9645f8544fb5f7c0cbd75d116618fd9e479588d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-c
ni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ba4e3985b08c9b6913c4e31f9645f8544fb5f7c0cbd75d116618fd9e479588d0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:22:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-crvdf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:35Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:35 crc kubenswrapper[4764]: I0309 13:23:35.848119 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a8a532f3-0fa0-4125-aeef-fd19fc524647\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2756a837bdc4537e5cf1848d4c91935799827b254edac2f5a67cf8ff7ce886a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4166a7efb6711d50164acbe2944b72105acb42653213963d1e52516f0c43f982\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4166a7efb6711d50164acbe2944b72105acb42653213963d1e52516f0c43f982\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:20:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:35Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:35 crc kubenswrapper[4764]: I0309 13:23:35.862583 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6bcdd179-43c2-427c-9fac-7155c122e922\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://83cca42acbb21422d7ee12f4e9824007c225a563a00ac3769c638770e404ce7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25jms\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a541959fd435ba385dfa711ec03b7d78fb75528f
ed89a7bd3620aa50fbab26ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25jms\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:28Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xxczl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:35Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:36 crc kubenswrapper[4764]: I0309 13:23:36.559526 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 13:23:36 crc kubenswrapper[4764]: I0309 13:23:36.559589 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 13:23:36 crc kubenswrapper[4764]: E0309 13:23:36.559776 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 13:23:36 crc kubenswrapper[4764]: I0309 13:23:36.559820 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 13:23:36 crc kubenswrapper[4764]: E0309 13:23:36.559964 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 13:23:36 crc kubenswrapper[4764]: E0309 13:23:36.560153 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 13:23:37 crc kubenswrapper[4764]: I0309 13:23:37.559564 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-wkwdz" Mar 09 13:23:37 crc kubenswrapper[4764]: E0309 13:23:37.559824 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wkwdz" podUID="6597fc34-10ee-4984-9c69-f4b7c0d46e2a" Mar 09 13:23:38 crc kubenswrapper[4764]: I0309 13:23:38.558727 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 13:23:38 crc kubenswrapper[4764]: I0309 13:23:38.558818 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 13:23:38 crc kubenswrapper[4764]: E0309 13:23:38.558897 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 13:23:38 crc kubenswrapper[4764]: I0309 13:23:38.558961 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 13:23:38 crc kubenswrapper[4764]: E0309 13:23:38.559131 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 13:23:38 crc kubenswrapper[4764]: E0309 13:23:38.559180 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 13:23:39 crc kubenswrapper[4764]: I0309 13:23:39.559880 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wkwdz" Mar 09 13:23:39 crc kubenswrapper[4764]: E0309 13:23:39.560116 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wkwdz" podUID="6597fc34-10ee-4984-9c69-f4b7c0d46e2a" Mar 09 13:23:40 crc kubenswrapper[4764]: I0309 13:23:40.559024 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 13:23:40 crc kubenswrapper[4764]: E0309 13:23:40.559539 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 13:23:40 crc kubenswrapper[4764]: I0309 13:23:40.559099 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 13:23:40 crc kubenswrapper[4764]: I0309 13:23:40.559178 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 13:23:40 crc kubenswrapper[4764]: E0309 13:23:40.559822 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 13:23:40 crc kubenswrapper[4764]: E0309 13:23:40.560012 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 13:23:40 crc kubenswrapper[4764]: I0309 13:23:40.561524 4764 scope.go:117] "RemoveContainer" containerID="850c5b772d45fee8f95d82796bdd0694bc5d2829f80a8e826bd3a428943f08f6" Mar 09 13:23:40 crc kubenswrapper[4764]: E0309 13:23:40.653790 4764 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 09 13:23:41 crc kubenswrapper[4764]: I0309 13:23:41.243567 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7kggv_b8ccb4f5-550a-41b2-b39d-201cdd5d902a/ovnkube-controller/2.log" Mar 09 13:23:41 crc kubenswrapper[4764]: I0309 13:23:41.245565 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7kggv" event={"ID":"b8ccb4f5-550a-41b2-b39d-201cdd5d902a","Type":"ContainerStarted","Data":"9162f20dc3577683eb7f764a353d0245d4748a1f3cdc3d3886dbea7e03e9714f"} Mar 09 13:23:41 crc kubenswrapper[4764]: I0309 13:23:41.246092 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-7kggv" Mar 09 13:23:41 crc kubenswrapper[4764]: I0309 13:23:41.258799 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4dd3c85ee070f78d4d30b9ccad3da35f82313620fffdaf69c3424865843dfa39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5280ec2d6cc95dbda0281276e900fa2c9d43170f932966688440a55c1deb454\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:41Z 
is after 2025-08-24T17:21:41Z" Mar 09 13:23:41 crc kubenswrapper[4764]: I0309 13:23:41.269736 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://acc019f66fb28c4b9a901a359ecf5e127e643482a864c4d9da494a62286c0e3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:41Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:41 crc kubenswrapper[4764]: I0309 13:23:41.280108 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:41Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:41 crc kubenswrapper[4764]: I0309 13:23:41.292549 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7c21ab2-1820-47de-a61d-71d81928564a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://482469a933d0aba722e6a341f6934669e0d84998bd2c99d6962a17548ab80ce7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3571aa0b654db21a79a11d6c58af9f858d2493283d9eff7a84f3404c34dcf0b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4aae266a5256be6c65a7637dfaed552087689f317f7b61dcb52f4aa6d4d4d7e6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e04ceb7d287648fc77742a0a1b4cf13a56f634301b703291340961730fc73811\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://033930947dc527026e2797f16a6dd9181c8c7d8ec88b09031a132f55b953c356\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T13:21:44Z\\\"
,\\\"message\\\":\\\"W0309 13:21:43.803612 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0309 13:21:43.803959 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773062503 cert, and key in /tmp/serving-cert-1644690243/serving-signer.crt, /tmp/serving-cert-1644690243/serving-signer.key\\\\nI0309 13:21:44.168505 1 observer_polling.go:159] Starting file observer\\\\nW0309 13:21:44.177360 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0309 13:21:44.177910 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0309 13:21:44.178828 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1644690243/tls.crt::/tmp/serving-cert-1644690243/tls.key\\\\\\\"\\\\nF0309 13:21:44.724786 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:43Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3a34e47ede9b05cec931ca6337a98381bcceeb735352702d19128e8dfe8476f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:48Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a38de1b3854f33b5728435571b8f0df92aeaae7d8e8a235ed58044fa2a9e09c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a38de1b3854f33b5728435571b8f0df92aeaae7d8e8a235ed58044fa2a9e09c3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:46Z\\\",\\\"reason\\\":\\\"Completed\\\",
\\\"startedAt\\\":\\\"2026-03-09T13:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:20:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:41Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:41 crc kubenswrapper[4764]: I0309 13:23:41.304312 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9db73dd7-a619-4291-a559-aab80ef8e067\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98efc0508ec5c94130c0707470eed2e0ae1117e57a8139af476e03a4bc67e788\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c0737
2b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://698e8e53f96938b60a1aefe2ccea945879623e79d8b0f8836a67cb38ec85918f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T13:21:46Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0309 13:21:16.528766 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. 
Worst graceful lease acquisition is {26s}.\\\\nI0309 13:21:16.529708 1 observer_polling.go:159] Starting file observer\\\\nI0309 13:21:16.533102 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0309 13:21:16.533732 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0309 13:21:42.646426 1 cert_rotation.go:91] certificate rotation detected, shutting down client connections to start using new credentials\\\\nI0309 13:21:46.099155 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0309 13:21:46.099249 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:16Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7857b72b1425eeefd8acf226dd6f0d6c783e14267fcfb79dd038118573a2b26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:46Z\\\"}},\\\"v
olumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4cbb681d47f7be8b418b159d390e6857a8a1f5cff40ac621c892019ceef7828\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5476061465c7cd40b23fd92e34348e1889754bbfabe1e2bc6419637cf70604d5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"host
IPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:20:45Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:41Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:41 crc kubenswrapper[4764]: I0309 13:23:41.315965 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zmzm7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"202a1f58-ce83-4374-ac48-dc806f7b9d6b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:23:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:23:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae16a6074b17ec1df33139f5a8f220d15bea5658fc20ad653e2af7b2427dec6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b6
7b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://05cbc20845591e82618289ff2509634cc5e2d05362e8414b3ff91a73d0719331\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-09T13:23:27Z\\\",\\\"message\\\":\\\"2026-03-09T13:22:41+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_a97e94f9-7ade-431c-bdc1-eded365600db\\\\n2026-03-09T13:22:41+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_a97e94f9-7ade-431c-bdc1-eded365600db to /host/opt/cni/bin/\\\\n2026-03-09T13:22:42Z [verbose] multus-daemon started\\\\n2026-03-09T13:22:42Z [verbose] Readiness Indicator file check\\\\n2026-03-09T13:23:27Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:40Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:23:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rk5pb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zmzm7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:41Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:41 crc kubenswrapper[4764]: I0309 13:23:41.330920 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-crvdf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"072442e6-8ece-4f72-a8cb-ad7ef1e3facb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5c773d25fbc4d390f6da4699cdfa6bac3745eb18288fd5bb885b9a4dca9a519\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"
imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://efdc5099d0e35ffc801fb7fb52979c62ed875cd19c8d0b015350b4906e6ea61f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://efdc5099d0e35ffc801fb7fb52979c62ed875cd19c8d0b015350b4906e6ea61f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\
"containerID\\\":\\\"cri-o://3c57a862615a1748c2abb6ae6147d48398f8246d030b1527963f77743d8d802a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c57a862615a1748c2abb6ae6147d48398f8246d030b1527963f77743d8d802a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:22:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a96bc81c6b97f1f03d9cb304263ff47034248b9949771825e1813c284032b91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\
"containerID\\\":\\\"cri-o://1a96bc81c6b97f1f03d9cb304263ff47034248b9949771825e1813c284032b91\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:22:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://885b4c9d85cdb3705865fae2824c36e6440c602deaaf1345fffa0eda8de42a96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://885b4c9d85cdb3705865fae2824c36e6440c602deaaf1345fffa0eda8de42a96\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:22:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":tru
e,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ff7f6fa22134725b00be193ba8f58c9ad083335c7f795fa87a1cad202a2496f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2ff7f6fa22134725b00be193ba8f58c9ad083335c7f795fa87a1cad202a2496f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:22:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba4e3985b08c9b6913c4e31f9645f8544fb5f7c0cbd75d116618fd9e479588d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ba4e3985b08c9b6913c4e31f9645f8544fb5f7c0cbd75d116618fd9e479588d0\\\",\\\"
exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:22:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-crvdf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:41Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:41 crc kubenswrapper[4764]: I0309 13:23:41.348512 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7kggv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b8ccb4f5-550a-41b2-b39d-201cdd5d902a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9713dea51462b7a0ce8cb68a64d8707b76dfca5f8c770cac3b54339d538dec01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://21feffb4bef29e65989455ec8c5a040a53de90f01ede1a9f1848a67f64a35d29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6859bb282db6608def0d7983e715cf3f5f5f454bf896bb21dca124a075e5701d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20f0f31508039b62436cf5b317419ffc5ca18d03050bfa830060f40e12684519\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:45Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ceb1c19d0ff9c03c8790b54d5dc48bafd77f6c289fc6225a88d6bfb8e04a8eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ad4c0df0c830a9ef102a630db33f055925c0b9b1734a792be013f8e261523c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9162f20dc3577683eb7f764a353d0245d4748a1f3cdc3d3886dbea7e03e9714f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://850c5b772d45fee8f95d82796bdd0694bc5d2829f80a8e826bd3a428943f08f6\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-09T13:23:13Z\\\",\\\"message\\\":\\\"n.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-etcd/etcd\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, 
Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}, built lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-etcd/etcd_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-etcd/etcd\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.253\\\\\\\", Port:2379, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}, services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.253\\\\\\\", Port:9979, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI0309 13:23:13.282082 7129 obj_retry.go:418] Waiting for all the *v1.Pod retry setup to complete in iterateRetryResources\\\\nF0309 13:23:13.282128 7129 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to 
create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T13:23:12Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:23:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"nam
e\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://df799033cab3d457eef490d885e4440024ba2c7daa430d489ee9907d908d1017\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd65db46dfb71ce3080adad69890c363fb149c5bf12c903e66a212a28888ae66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"rea
dy\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd65db46dfb71ce3080adad69890c363fb149c5bf12c903e66a212a28888ae66\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:22:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7kggv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:41Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:41 crc kubenswrapper[4764]: I0309 13:23:41.358876 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-hbmjc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7c2b6620-c8b8-47cf-9f15-883dbf8e34cc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d8a2c962bc242865fd8c0fce8fc21f9e385e38cc9b2892663a82c02dfe6cac9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z4sdq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:34Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-hbmjc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:41Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:41 crc kubenswrapper[4764]: I0309 13:23:41.368925 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hzjxs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"30a92c50-fe51-40d9-a69c-4b5fd722bfc6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73df6317c77ae8a47acf57b1fdb2710ccefa68c346011b5a98d5a38c775c9727\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdm9q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://10aea1054bea324b596bc250a9d3f7010db0636732f622c0674db83b1f245098\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdm9q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-hzjxs\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:41Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:41 crc kubenswrapper[4764]: I0309 13:23:41.379990 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-wkwdz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6597fc34-10ee-4984-9c69-f4b7c0d46e2a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gsbrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gsbrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:40Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-wkwdz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:41Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:41 crc 
kubenswrapper[4764]: I0309 13:23:41.391588 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b3bffb853598e5fea63b5302816325c29f740c9e9e2a7f0ee4f87e810045f97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:41Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:41 crc kubenswrapper[4764]: I0309 13:23:41.401401 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-r5bnx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7fede188-66d9-4cb1-af19-c94afe7fbcde\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a63a85aa57ac9496c6745b881ceba7015f610b3ce31c17513724f036bea999f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pjp5c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:27Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-r5bnx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:41Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:41 crc kubenswrapper[4764]: I0309 13:23:41.411075 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a8a532f3-0fa0-4125-aeef-fd19fc524647\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2756a837bdc4537e5cf1848d4c91935799827b254edac2f5a67cf8ff7ce886a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4166a7efb6711d50164acbe2944b72105acb42653213963d1e52516f0c43f982\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4166a7efb6711d50164acbe2944b72105acb42653213963d1e52516f0c43f982\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:20:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:41Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:41 crc kubenswrapper[4764]: I0309 13:23:41.420588 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6bcdd179-43c2-427c-9fac-7155c122e922\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://83cca42acbb21422d7ee12f4e9824007c225a563a00ac3769c638770e404ce7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25jms\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a541959fd435ba385dfa711ec03b7d78fb75528f
ed89a7bd3620aa50fbab26ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25jms\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:28Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xxczl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:41Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:41 crc kubenswrapper[4764]: I0309 13:23:41.431631 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:41Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:41 crc kubenswrapper[4764]: I0309 13:23:41.443075 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:41Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:41 crc kubenswrapper[4764]: I0309 13:23:41.463912 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1507e859-72ea-4c35-a5a7-6d0f48b19b09\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://289f145eb4f8412759fdbbc747534a08534b2f848f18206d9f9c2d03adc86944\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c220cbb5e42a64e54a853893cbdd6e5502e5f1e5cd7fa5aa7fffc3e27c5b2d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://98462dd9c7c97ff5402911d179c7739d92d293c4abacea1ed4a394ec9f33dfed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4039a3daa857df7f6171121f2a936d4893913c8374ad7229c758dd93a8821386\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a9b753c5c47533ae4771e38e6ce4f7a3822bc208557dda27fd1b525c76044f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://36c569656352f45737e634caa991a17028b70ef88ec9449a7f22edd270d63443\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://36c569656352f45737e634caa991a17028b70ef88ec9449a7f22edd270d63443\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-09T13:20:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://573ffaefe818651692cd854f41bd67b933fa6496e3a833873b40c2324b3a09e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://573ffaefe818651692cd854f41bd67b933fa6496e3a833873b40c2324b3a09e1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ae08be6456f1300abf253a9801e2585bed0bb928045bcb96921efa4b5d0200eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae08be6456f1300abf253a9801e2585bed0bb928045bcb96921efa4b5d0200eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:20:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:20:45Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:41Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:41 crc kubenswrapper[4764]: I0309 13:23:41.476057 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cc78a2e5-6620-4ea8-8bc7-c62bcc3348cd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e22c92297ebb9853ff582f519aa4c98c1468e757b53b1245f7a7ab0f9fe25344\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/ope
nshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://050547213bbff923b18f34b1196952d038bab09162ab803da36c99fa3f769dd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5819fd898c59b4cf866334e40822b6142b014a2e11f4ca75b98d549665b1575a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kub
ernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f97e59c94171eedb27889b9cb72fa1b241a5e09c23310576dfd844db1bfc2292\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f97e59c94171eedb27889b9cb72fa1b241a5e09c23310576dfd844db1bfc2292\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:20:46Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:20:45Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:41Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:41 crc kubenswrapper[4764]: I0309 13:23:41.558846 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-wkwdz" Mar 09 13:23:41 crc kubenswrapper[4764]: E0309 13:23:41.559140 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wkwdz" podUID="6597fc34-10ee-4984-9c69-f4b7c0d46e2a" Mar 09 13:23:42 crc kubenswrapper[4764]: I0309 13:23:42.251525 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7kggv_b8ccb4f5-550a-41b2-b39d-201cdd5d902a/ovnkube-controller/3.log" Mar 09 13:23:42 crc kubenswrapper[4764]: I0309 13:23:42.252270 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7kggv_b8ccb4f5-550a-41b2-b39d-201cdd5d902a/ovnkube-controller/2.log" Mar 09 13:23:42 crc kubenswrapper[4764]: I0309 13:23:42.255351 4764 generic.go:334] "Generic (PLEG): container finished" podID="b8ccb4f5-550a-41b2-b39d-201cdd5d902a" containerID="9162f20dc3577683eb7f764a353d0245d4748a1f3cdc3d3886dbea7e03e9714f" exitCode=1 Mar 09 13:23:42 crc kubenswrapper[4764]: I0309 13:23:42.255419 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7kggv" event={"ID":"b8ccb4f5-550a-41b2-b39d-201cdd5d902a","Type":"ContainerDied","Data":"9162f20dc3577683eb7f764a353d0245d4748a1f3cdc3d3886dbea7e03e9714f"} Mar 09 13:23:42 crc kubenswrapper[4764]: I0309 13:23:42.255495 4764 scope.go:117] "RemoveContainer" containerID="850c5b772d45fee8f95d82796bdd0694bc5d2829f80a8e826bd3a428943f08f6" Mar 09 13:23:42 crc kubenswrapper[4764]: I0309 13:23:42.256405 4764 scope.go:117] "RemoveContainer" containerID="9162f20dc3577683eb7f764a353d0245d4748a1f3cdc3d3886dbea7e03e9714f" Mar 09 13:23:42 crc kubenswrapper[4764]: E0309 13:23:42.256651 
4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-7kggv_openshift-ovn-kubernetes(b8ccb4f5-550a-41b2-b39d-201cdd5d902a)\"" pod="openshift-ovn-kubernetes/ovnkube-node-7kggv" podUID="b8ccb4f5-550a-41b2-b39d-201cdd5d902a" Mar 09 13:23:42 crc kubenswrapper[4764]: I0309 13:23:42.267727 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a8a532f3-0fa0-4125-aeef-fd19fc524647\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2756a837bdc4537e5cf1848d4c91935799827b254edac2f5a67cf8ff7ce886a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":tr
ue,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4166a7efb6711d50164acbe2944b72105acb42653213963d1e52516f0c43f982\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4166a7efb6711d50164acbe2944b72105acb42653213963d1e52516f0c43f982\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:20:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:42Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:42 crc kubenswrapper[4764]: I0309 13:23:42.285405 4764 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-xxczl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6bcdd179-43c2-427c-9fac-7155c122e922\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://83cca42acbb21422d7ee12f4e9824007c225a563a00ac3769c638770e404ce7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25jms\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\
\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a541959fd435ba385dfa711ec03b7d78fb75528fed89a7bd3620aa50fbab26ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25jms\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:28Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xxczl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:42Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:42 crc kubenswrapper[4764]: I0309 13:23:42.302804 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:42Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:42 crc kubenswrapper[4764]: I0309 13:23:42.331577 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1507e859-72ea-4c35-a5a7-6d0f48b19b09\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://289f145eb4f8412759fdbbc747534a08534b2f848f18206d9f9c2d03adc86944\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c220cbb5e42a64e54a853893cbdd6e5502e5f1e5cd7fa5aa7fffc3e27c5b2d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://98462dd9c7c97ff5402911d179c7739d92d293c4abacea1ed4a394ec9f33dfed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4039a3daa857df7f6171121f2a936d4893913c8374ad7229c758dd93a8821386\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a9b753c5c47533ae4771e38e6ce4f7a3822bc208557dda27fd1b525c76044f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://36c569656352f45737e634caa991a17028b70ef88ec9449a7f22edd270d63443\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://36c569656352f45737e634caa991a17028b70ef88ec9449a7f22edd270d63443\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-09T13:20:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://573ffaefe818651692cd854f41bd67b933fa6496e3a833873b40c2324b3a09e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://573ffaefe818651692cd854f41bd67b933fa6496e3a833873b40c2324b3a09e1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ae08be6456f1300abf253a9801e2585bed0bb928045bcb96921efa4b5d0200eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae08be6456f1300abf253a9801e2585bed0bb928045bcb96921efa4b5d0200eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:20:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:20:45Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:42Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:42 crc kubenswrapper[4764]: I0309 13:23:42.344099 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cc78a2e5-6620-4ea8-8bc7-c62bcc3348cd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e22c92297ebb9853ff582f519aa4c98c1468e757b53b1245f7a7ab0f9fe25344\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/ope
nshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://050547213bbff923b18f34b1196952d038bab09162ab803da36c99fa3f769dd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5819fd898c59b4cf866334e40822b6142b014a2e11f4ca75b98d549665b1575a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kub
ernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f97e59c94171eedb27889b9cb72fa1b241a5e09c23310576dfd844db1bfc2292\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f97e59c94171eedb27889b9cb72fa1b241a5e09c23310576dfd844db1bfc2292\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:20:46Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:20:45Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:42Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:42 crc kubenswrapper[4764]: I0309 13:23:42.354998 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:42Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:42 crc kubenswrapper[4764]: I0309 13:23:42.366515 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://acc019f66fb28c4b9a901a359ecf5e127e643482a864c4d9da494a62286c0e3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-09T13:23:42Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:42 crc kubenswrapper[4764]: I0309 13:23:42.381703 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:42Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:42 crc kubenswrapper[4764]: I0309 13:23:42.393781 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7c21ab2-1820-47de-a61d-71d81928564a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://482469a933d0aba722e6a341f6934669e0d84998bd2c99d6962a17548ab80ce7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3571aa0b654db21a79a11d6c58af9f858d2493283d9eff7a84f3404c34dcf0b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4aae266a5256be6c65a7637dfaed552087689f317f7b61dcb52f4aa6d4d4d7e6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e04ceb7d287648fc77742a0a1b4cf13a56f634301b703291340961730fc73811\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://033930947dc527026e2797f16a6dd9181c8c7d8ec88b09031a132f55b953c356\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T13:21:44Z\\\"
,\\\"message\\\":\\\"W0309 13:21:43.803612 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0309 13:21:43.803959 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773062503 cert, and key in /tmp/serving-cert-1644690243/serving-signer.crt, /tmp/serving-cert-1644690243/serving-signer.key\\\\nI0309 13:21:44.168505 1 observer_polling.go:159] Starting file observer\\\\nW0309 13:21:44.177360 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0309 13:21:44.177910 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0309 13:21:44.178828 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1644690243/tls.crt::/tmp/serving-cert-1644690243/tls.key\\\\\\\"\\\\nF0309 13:21:44.724786 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:43Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3a34e47ede9b05cec931ca6337a98381bcceeb735352702d19128e8dfe8476f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:48Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a38de1b3854f33b5728435571b8f0df92aeaae7d8e8a235ed58044fa2a9e09c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a38de1b3854f33b5728435571b8f0df92aeaae7d8e8a235ed58044fa2a9e09c3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:46Z\\\",\\\"reason\\\":\\\"Completed\\\",
\\\"startedAt\\\":\\\"2026-03-09T13:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:20:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:42Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:42 crc kubenswrapper[4764]: I0309 13:23:42.405756 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9db73dd7-a619-4291-a559-aab80ef8e067\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98efc0508ec5c94130c0707470eed2e0ae1117e57a8139af476e03a4bc67e788\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c0737
2b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://698e8e53f96938b60a1aefe2ccea945879623e79d8b0f8836a67cb38ec85918f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T13:21:46Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0309 13:21:16.528766 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. 
Worst graceful lease acquisition is {26s}.\\\\nI0309 13:21:16.529708 1 observer_polling.go:159] Starting file observer\\\\nI0309 13:21:16.533102 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0309 13:21:16.533732 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0309 13:21:42.646426 1 cert_rotation.go:91] certificate rotation detected, shutting down client connections to start using new credentials\\\\nI0309 13:21:46.099155 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0309 13:21:46.099249 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:16Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7857b72b1425eeefd8acf226dd6f0d6c783e14267fcfb79dd038118573a2b26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:46Z\\\"}},\\\"v
olumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4cbb681d47f7be8b418b159d390e6857a8a1f5cff40ac621c892019ceef7828\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5476061465c7cd40b23fd92e34348e1889754bbfabe1e2bc6419637cf70604d5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"host
IPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:20:45Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:42Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:42 crc kubenswrapper[4764]: I0309 13:23:42.423876 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4dd3c85ee070f78d4d30b9ccad3da35f82313620fffdaf69c3424865843dfa39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"star
ted\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5280ec2d6cc95dbda0281276e900fa2c9d43170f932966688440a55c1deb454\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:42Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:42 crc kubenswrapper[4764]: I0309 13:23:42.438729 4764 status_manager.go:875] 
"Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-crvdf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"072442e6-8ece-4f72-a8cb-ad7ef1e3facb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5c773d25fbc4d390f6da4699cdfa6bac3745eb18288fd5bb885b9a4dca9a519\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"init
ContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://efdc5099d0e35ffc801fb7fb52979c62ed875cd19c8d0b015350b4906e6ea61f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://efdc5099d0e35ffc801fb7fb52979c62ed875cd19c8d0b015350b4906e6ea61f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c57a862615a1748c2abb6ae6147d48398f8246d030b1527963f77743d8d802a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c57a862615a1748c2abb6ae6147d48398f8246d030b1527963f77743d8d802a\\\",\\\"exitCode\\\":0,\\\"finished
At\\\":\\\"2026-03-09T13:22:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a96bc81c6b97f1f03d9cb304263ff47034248b9949771825e1813c284032b91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a96bc81c6b97f1f03d9cb304263ff47034248b9949771825e1813c284032b91\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:22:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-ac
cess-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://885b4c9d85cdb3705865fae2824c36e6440c602deaaf1345fffa0eda8de42a96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://885b4c9d85cdb3705865fae2824c36e6440c602deaaf1345fffa0eda8de42a96\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:22:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ff7f6fa22134725b00be193ba8f58c9ad083335c7f795fa87a1cad202a2496f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2ff7f6fa22134725b00be193ba8f58c9ad08
3335c7f795fa87a1cad202a2496f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:22:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba4e3985b08c9b6913c4e31f9645f8544fb5f7c0cbd75d116618fd9e479588d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ba4e3985b08c9b6913c4e31f9645f8544fb5f7c0cbd75d116618fd9e479588d0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:22:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:2
2:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-crvdf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:42Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:42 crc kubenswrapper[4764]: I0309 13:23:42.455626 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7kggv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b8ccb4f5-550a-41b2-b39d-201cdd5d902a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9713dea51462b7a0ce8cb68a64d8707b76dfca5f8c770cac3b54339d538dec01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://21feffb4bef29e65989455ec8c5a040a53de90f01ede1a9f1848a67f64a35d29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6859bb282db6608def0d7983e715cf3f5f5f454bf896bb21dca124a075e5701d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20f0f31508039b62436cf5b317419ffc5ca18d03050bfa830060f40e12684519\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:45Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ceb1c19d0ff9c03c8790b54d5dc48bafd77f6c289fc6225a88d6bfb8e04a8eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ad4c0df0c830a9ef102a630db33f055925c0b9b1734a792be013f8e261523c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9162f20dc3577683eb7f764a353d0245d4748a1f3cdc3d3886dbea7e03e9714f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://850c5b772d45fee8f95d82796bdd0694bc5d2829f80a8e826bd3a428943f08f6\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-09T13:23:13Z\\\",\\\"message\\\":\\\"n.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-etcd/etcd\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, 
Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}, built lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-etcd/etcd_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-etcd/etcd\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.253\\\\\\\", Port:2379, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}, services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.253\\\\\\\", Port:9979, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI0309 13:23:13.282082 7129 obj_retry.go:418] Waiting for all the *v1.Pod retry setup to complete in iterateRetryResources\\\\nF0309 13:23:13.282128 7129 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T13:23:12Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9162f20dc3577683eb7f764a353d0245d4748a1f3cdc3d3886dbea7e03e9714f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-09T13:23:41Z\\\",\\\"message\\\":\\\"NetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0309 13:23:41.398455 7479 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0309 13:23:41.398816 7479 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) 
from k8s.io/client-go/informers/factory.go:160\\\\nI0309 13:23:41.399063 7479 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0309 13:23:41.399497 7479 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0309 13:23:41.399594 7479 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0309 13:23:41.399615 7479 handler.go:208] Removed *v1.Node event handler 2\\\\nI0309 13:23:41.399627 7479 factory.go:656] Stopping watch factory\\\\nI0309 13:23:41.399641 7479 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0309 13:23:41.445020 7479 shared_informer.go:320] Caches are synced for node-tracker-controller\\\\nI0309 13:23:41.445268 7479 services_controller.go:204] Setting up event handlers for services for network=default\\\\nI0309 13:23:41.445408 7479 ovnkube.go:599] Stopped ovnkube\\\\nI0309 13:23:41.445455 7479 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0309 13:23:41.445595 7479 ovnkube.go:137] failed to run 
ov\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T13:23:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"k
ube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://df799033cab3d457eef490d885e4440024ba2c7daa430d489ee9907d908d1017\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd65db46dfb71ce3080adad69890c363fb149c5bf12c903e66a212a28888ae66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd65db46dfb71ce3080adad69890c363fb149c5bf12c903e66a212a28888ae66\
\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:22:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7kggv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:42Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:42 crc kubenswrapper[4764]: I0309 13:23:42.466400 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-hbmjc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7c2b6620-c8b8-47cf-9f15-883dbf8e34cc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d8a2c962bc242865fd8c0fce8fc21f9e385e38cc9b2892663a82c02dfe6cac9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z4sdq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:34Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-hbmjc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:42Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:42 crc kubenswrapper[4764]: I0309 13:23:42.477525 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hzjxs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"30a92c50-fe51-40d9-a69c-4b5fd722bfc6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73df6317c77ae8a47acf57b1fdb2710ccefa68c346011b5a98d5a38c775c9727\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdm9q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://10aea1054bea324b596bc250a9d3f7010db0636732f622c0674db83b1f245098\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdm9q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-hzjxs\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:42Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:42 crc kubenswrapper[4764]: I0309 13:23:42.487982 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-wkwdz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6597fc34-10ee-4984-9c69-f4b7c0d46e2a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gsbrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gsbrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:40Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-wkwdz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:42Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:42 crc 
kubenswrapper[4764]: I0309 13:23:42.500741 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b3bffb853598e5fea63b5302816325c29f740c9e9e2a7f0ee4f87e810045f97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:42Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:42 crc kubenswrapper[4764]: I0309 13:23:42.508842 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-r5bnx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7fede188-66d9-4cb1-af19-c94afe7fbcde\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a63a85aa57ac9496c6745b881ceba7015f610b3ce31c17513724f036bea999f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pjp5c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:27Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-r5bnx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:42Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:42 crc kubenswrapper[4764]: I0309 13:23:42.518984 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zmzm7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"202a1f58-ce83-4374-ac48-dc806f7b9d6b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:23:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:23:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae16a6074b17ec1df33139f5a8f220d15bea5658fc20ad653e2af7b2427dec6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://05cbc20845591e82618289ff2509634cc5e2d05362e8414b3ff91a73d0719331\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-09T13:23:27Z\\\",\\\"message\\\":\\\"2026-03-09T13:22:41+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_a97e94f9-7ade-431c-bdc1-eded365600db\\\\n2026-03-09T13:22:41+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_a97e94f9-7ade-431c-bdc1-eded365600db to /host/opt/cni/bin/\\\\n2026-03-09T13:22:42Z [verbose] multus-daemon started\\\\n2026-03-09T13:22:42Z [verbose] 
Readiness Indicator file check\\\\n2026-03-09T13:23:27Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:40Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:23:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rk5pb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zmzm7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:42Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:42 crc kubenswrapper[4764]: I0309 13:23:42.558808 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 13:23:42 crc kubenswrapper[4764]: I0309 13:23:42.558846 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 13:23:42 crc kubenswrapper[4764]: I0309 13:23:42.558917 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 13:23:42 crc kubenswrapper[4764]: E0309 13:23:42.559051 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 13:23:42 crc kubenswrapper[4764]: E0309 13:23:42.559157 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 13:23:42 crc kubenswrapper[4764]: E0309 13:23:42.559229 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 13:23:43 crc kubenswrapper[4764]: I0309 13:23:43.259632 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7kggv_b8ccb4f5-550a-41b2-b39d-201cdd5d902a/ovnkube-controller/3.log" Mar 09 13:23:43 crc kubenswrapper[4764]: I0309 13:23:43.263152 4764 scope.go:117] "RemoveContainer" containerID="9162f20dc3577683eb7f764a353d0245d4748a1f3cdc3d3886dbea7e03e9714f" Mar 09 13:23:43 crc kubenswrapper[4764]: E0309 13:23:43.263355 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-7kggv_openshift-ovn-kubernetes(b8ccb4f5-550a-41b2-b39d-201cdd5d902a)\"" pod="openshift-ovn-kubernetes/ovnkube-node-7kggv" podUID="b8ccb4f5-550a-41b2-b39d-201cdd5d902a" Mar 09 13:23:43 crc kubenswrapper[4764]: I0309 13:23:43.273070 
4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a8a532f3-0fa0-4125-aeef-fd19fc524647\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2756a837bdc4537e5cf1848d4c91935799827b254edac2f5a67cf8ff7ce886a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4166a7efb6711d50164acbe2944b7
2105acb42653213963d1e52516f0c43f982\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4166a7efb6711d50164acbe2944b72105acb42653213963d1e52516f0c43f982\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:20:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:43Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:43 crc kubenswrapper[4764]: I0309 13:23:43.283981 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6bcdd179-43c2-427c-9fac-7155c122e922\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://83cca42acbb21422d7ee12f4e9824007c225a563a00ac3769c638770e404ce7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25jms\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a541959fd435ba385dfa711ec03b7d78fb75528f
ed89a7bd3620aa50fbab26ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25jms\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:28Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xxczl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:43Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:43 crc kubenswrapper[4764]: I0309 13:23:43.303347 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1507e859-72ea-4c35-a5a7-6d0f48b19b09\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://289f145eb4f8412759fdbbc747534a08534b2f848f18206d9f9c2d03adc86944\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c220cbb5e42a64e54a853893cbdd6e5502e5f1e5cd7fa5aa7fffc3e27c5b2d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://98462dd9c7c97ff5402911d179c7739d92d293c4abacea1ed4a394ec9f33dfed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4039a3daa857df7f6171121f2a936d4893913c8374ad7229c758dd93a8821386\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a9b753c5c47533ae4771e38e6ce4f7a3822bc208557dda27fd1b525c76044f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://36c569656352f45737e634caa991a17028b70ef88ec9449a7f22edd270d63443\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://36c569656352f45737e634caa991a17028b70ef88ec9449a7f22edd270d63443\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-09T13:20:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://573ffaefe818651692cd854f41bd67b933fa6496e3a833873b40c2324b3a09e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://573ffaefe818651692cd854f41bd67b933fa6496e3a833873b40c2324b3a09e1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ae08be6456f1300abf253a9801e2585bed0bb928045bcb96921efa4b5d0200eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae08be6456f1300abf253a9801e2585bed0bb928045bcb96921efa4b5d0200eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:20:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:20:45Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:43Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:43 crc kubenswrapper[4764]: I0309 13:23:43.314301 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cc78a2e5-6620-4ea8-8bc7-c62bcc3348cd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e22c92297ebb9853ff582f519aa4c98c1468e757b53b1245f7a7ab0f9fe25344\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/ope
nshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://050547213bbff923b18f34b1196952d038bab09162ab803da36c99fa3f769dd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5819fd898c59b4cf866334e40822b6142b014a2e11f4ca75b98d549665b1575a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kub
ernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f97e59c94171eedb27889b9cb72fa1b241a5e09c23310576dfd844db1bfc2292\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f97e59c94171eedb27889b9cb72fa1b241a5e09c23310576dfd844db1bfc2292\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:20:46Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:20:45Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:43Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:43 crc kubenswrapper[4764]: I0309 13:23:43.325207 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:43Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:43 crc kubenswrapper[4764]: I0309 13:23:43.338422 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:43Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:43 crc kubenswrapper[4764]: I0309 13:23:43.350047 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7c21ab2-1820-47de-a61d-71d81928564a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://482469a933d0aba722e6a341f6934669e0d84998bd2c99d6962a17548ab80ce7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3571aa0b654db21a79a11d6c58af9f858d2493283d9eff7a84f3404c34dcf0b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4aae266a5256be6c65a7637dfaed552087689f317f7b61dcb52f4aa6d4d4d7e6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e04ceb7d287648fc77742a0a1b4cf13a56f634301b703291340961730fc73811\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://033930947dc527026e2797f16a6dd9181c8c7d8ec88b09031a132f55b953c356\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T13:21:44Z\\\"
,\\\"message\\\":\\\"W0309 13:21:43.803612 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0309 13:21:43.803959 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773062503 cert, and key in /tmp/serving-cert-1644690243/serving-signer.crt, /tmp/serving-cert-1644690243/serving-signer.key\\\\nI0309 13:21:44.168505 1 observer_polling.go:159] Starting file observer\\\\nW0309 13:21:44.177360 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0309 13:21:44.177910 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0309 13:21:44.178828 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1644690243/tls.crt::/tmp/serving-cert-1644690243/tls.key\\\\\\\"\\\\nF0309 13:21:44.724786 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:43Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3a34e47ede9b05cec931ca6337a98381bcceeb735352702d19128e8dfe8476f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:48Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a38de1b3854f33b5728435571b8f0df92aeaae7d8e8a235ed58044fa2a9e09c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a38de1b3854f33b5728435571b8f0df92aeaae7d8e8a235ed58044fa2a9e09c3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:46Z\\\",\\\"reason\\\":\\\"Completed\\\",
\\\"startedAt\\\":\\\"2026-03-09T13:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:20:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:43Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:43 crc kubenswrapper[4764]: I0309 13:23:43.360880 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9db73dd7-a619-4291-a559-aab80ef8e067\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98efc0508ec5c94130c0707470eed2e0ae1117e57a8139af476e03a4bc67e788\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c0737
2b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://698e8e53f96938b60a1aefe2ccea945879623e79d8b0f8836a67cb38ec85918f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T13:21:46Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0309 13:21:16.528766 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. 
Worst graceful lease acquisition is {26s}.\\\\nI0309 13:21:16.529708 1 observer_polling.go:159] Starting file observer\\\\nI0309 13:21:16.533102 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0309 13:21:16.533732 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0309 13:21:42.646426 1 cert_rotation.go:91] certificate rotation detected, shutting down client connections to start using new credentials\\\\nI0309 13:21:46.099155 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0309 13:21:46.099249 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:16Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7857b72b1425eeefd8acf226dd6f0d6c783e14267fcfb79dd038118573a2b26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:46Z\\\"}},\\\"v
olumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4cbb681d47f7be8b418b159d390e6857a8a1f5cff40ac621c892019ceef7828\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5476061465c7cd40b23fd92e34348e1889754bbfabe1e2bc6419637cf70604d5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"host
IPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:20:45Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:43Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:43 crc kubenswrapper[4764]: I0309 13:23:43.370982 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4dd3c85ee070f78d4d30b9ccad3da35f82313620fffdaf69c3424865843dfa39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"star
ted\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5280ec2d6cc95dbda0281276e900fa2c9d43170f932966688440a55c1deb454\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:43Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:43 crc kubenswrapper[4764]: I0309 13:23:43.380610 4764 status_manager.go:875] 
"Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://acc019f66fb28c4b9a901a359ecf5e127e643482a864c4d9da494a62286c0e3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:43Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:43 crc kubenswrapper[4764]: I0309 13:23:43.391943 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:43Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:43 crc kubenswrapper[4764]: I0309 13:23:43.402438 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-hbmjc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7c2b6620-c8b8-47cf-9f15-883dbf8e34cc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d8a2c962bc242865fd8c0fce8fc21f9e385e38cc9b2892663a82c02dfe6cac9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z4sdq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:34Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-hbmjc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:43Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:43 crc kubenswrapper[4764]: I0309 13:23:43.412940 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hzjxs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"30a92c50-fe51-40d9-a69c-4b5fd722bfc6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73df6317c77ae8a47acf57b1fdb2710ccefa68c346011b5a98d5a38c775c9727\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdm9q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://10aea1054bea324b596bc250a9d3f7010db0636732f622c0674db83b1f245098\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdm9q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-hzjxs\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:43Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:43 crc kubenswrapper[4764]: I0309 13:23:43.422259 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-wkwdz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6597fc34-10ee-4984-9c69-f4b7c0d46e2a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gsbrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gsbrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:40Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-wkwdz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:43Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:43 crc 
kubenswrapper[4764]: I0309 13:23:43.433722 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b3bffb853598e5fea63b5302816325c29f740c9e9e2a7f0ee4f87e810045f97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:43Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:43 crc kubenswrapper[4764]: I0309 13:23:43.443858 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-r5bnx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7fede188-66d9-4cb1-af19-c94afe7fbcde\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a63a85aa57ac9496c6745b881ceba7015f610b3ce31c17513724f036bea999f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pjp5c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:27Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-r5bnx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:43Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:43 crc kubenswrapper[4764]: I0309 13:23:43.455365 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zmzm7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"202a1f58-ce83-4374-ac48-dc806f7b9d6b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:23:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:23:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae16a6074b17ec1df33139f5a8f220d15bea5658fc20ad653e2af7b2427dec6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://05cbc20845591e82618289ff2509634cc5e2d05362e8414b3ff91a73d0719331\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-09T13:23:27Z\\\",\\\"message\\\":\\\"2026-03-09T13:22:41+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_a97e94f9-7ade-431c-bdc1-eded365600db\\\\n2026-03-09T13:22:41+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_a97e94f9-7ade-431c-bdc1-eded365600db to /host/opt/cni/bin/\\\\n2026-03-09T13:22:42Z [verbose] multus-daemon started\\\\n2026-03-09T13:22:42Z [verbose] 
Readiness Indicator file check\\\\n2026-03-09T13:23:27Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:40Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:23:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rk5pb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zmzm7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:43Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:43 crc kubenswrapper[4764]: I0309 13:23:43.470824 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-crvdf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"072442e6-8ece-4f72-a8cb-ad7ef1e3facb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5c
773d25fbc4d390f6da4699cdfa6bac3745eb18288fd5bb885b9a4dca9a519\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://efdc5099d0e35ffc801fb7fb52979c62ed875cd19c8d0b015350b4906e6ea61f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://efdc5099d0e35ffc801fb7fb52979c62ed875cd19c8d0b015350b4906e6ea61f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:22:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly
\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c57a862615a1748c2abb6ae6147d48398f8246d030b1527963f77743d8d802a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c57a862615a1748c2abb6ae6147d48398f8246d030b1527963f77743d8d802a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:22:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a96bc81c6b97f1f03d9cb304263ff47034248b9949771825e1813c284032b91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203b
b2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a96bc81c6b97f1f03d9cb304263ff47034248b9949771825e1813c284032b91\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:22:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://885b4c9d85cdb3705865fae2824c36e6440c602deaaf1345fffa0eda8de42a96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://885b4c9d85cdb3705865fae2824c36e6440c602deaaf1345fffa0eda8de42a96\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:22:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-rel
ease\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ff7f6fa22134725b00be193ba8f58c9ad083335c7f795fa87a1cad202a2496f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2ff7f6fa22134725b00be193ba8f58c9ad083335c7f795fa87a1cad202a2496f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:22:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba4e3985b08c9b6913c4e31f9645f8544fb5f7c0cbd75d116618fd9e479588d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-c
ni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ba4e3985b08c9b6913c4e31f9645f8544fb5f7c0cbd75d116618fd9e479588d0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:22:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjrkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-crvdf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:43Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:43 crc kubenswrapper[4764]: I0309 13:23:43.488307 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7kggv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b8ccb4f5-550a-41b2-b39d-201cdd5d902a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9713dea51462b7a0ce8cb68a64d8707b76dfca5f8c770cac3b54339d538dec01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://21feffb4bef29e65989455ec8c5a040a53de90f01ede1a9f1848a67f64a35d29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6859bb282db6608def0d7983e715cf3f5f5f454bf896bb21dca124a075e5701d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20f0f31508039b62436cf5b317419ffc5ca18d03050bfa830060f40e12684519\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:45Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ceb1c19d0ff9c03c8790b54d5dc48bafd77f6c289fc6225a88d6bfb8e04a8eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ad4c0df0c830a9ef102a630db33f055925c0b9b1734a792be013f8e261523c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9162f20dc3577683eb7f764a353d0245d4748a1f3cdc3d3886dbea7e03e9714f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9162f20dc3577683eb7f764a353d0245d4748a1f3cdc3d3886dbea7e03e9714f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-09T13:23:41Z\\\",\\\"message\\\":\\\"NetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0309 13:23:41.398455 7479 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0309 13:23:41.398816 7479 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from 
k8s.io/client-go/informers/factory.go:160\\\\nI0309 13:23:41.399063 7479 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0309 13:23:41.399497 7479 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0309 13:23:41.399594 7479 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0309 13:23:41.399615 7479 handler.go:208] Removed *v1.Node event handler 2\\\\nI0309 13:23:41.399627 7479 factory.go:656] Stopping watch factory\\\\nI0309 13:23:41.399641 7479 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0309 13:23:41.445020 7479 shared_informer.go:320] Caches are synced for node-tracker-controller\\\\nI0309 13:23:41.445268 7479 services_controller.go:204] Setting up event handlers for services for network=default\\\\nI0309 13:23:41.445408 7479 ovnkube.go:599] Stopped ovnkube\\\\nI0309 13:23:41.445455 7479 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0309 13:23:41.445595 7479 ovnkube.go:137] failed to run ov\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T13:23:40Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-7kggv_openshift-ovn-kubernetes(b8ccb4f5-550a-41b2-b39d-201cdd5d902a)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://df799033cab3d457eef490d885e4440024ba2c7daa430d489ee9907d908d1017\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd65db46dfb71ce3080adad69890c363fb149c5bf12c903e66a212a28888ae66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd65db46dfb71ce308
0adad69890c363fb149c5bf12c903e66a212a28888ae66\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:22:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5xrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:22:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7kggv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:43Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:43 crc kubenswrapper[4764]: I0309 13:23:43.559077 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wkwdz" Mar 09 13:23:43 crc kubenswrapper[4764]: E0309 13:23:43.559284 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wkwdz" podUID="6597fc34-10ee-4984-9c69-f4b7c0d46e2a" Mar 09 13:23:44 crc kubenswrapper[4764]: I0309 13:23:44.558871 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 13:23:44 crc kubenswrapper[4764]: I0309 13:23:44.558964 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 13:23:44 crc kubenswrapper[4764]: E0309 13:23:44.559093 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 13:23:44 crc kubenswrapper[4764]: I0309 13:23:44.558920 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 13:23:44 crc kubenswrapper[4764]: E0309 13:23:44.559396 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 13:23:44 crc kubenswrapper[4764]: E0309 13:23:44.559567 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 13:23:44 crc kubenswrapper[4764]: I0309 13:23:44.585949 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6597fc34-10ee-4984-9c69-f4b7c0d46e2a-metrics-certs\") pod \"network-metrics-daemon-wkwdz\" (UID: \"6597fc34-10ee-4984-9c69-f4b7c0d46e2a\") " pod="openshift-multus/network-metrics-daemon-wkwdz" Mar 09 13:23:44 crc kubenswrapper[4764]: E0309 13:23:44.586723 4764 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 09 13:23:44 crc kubenswrapper[4764]: E0309 13:23:44.587020 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6597fc34-10ee-4984-9c69-f4b7c0d46e2a-metrics-certs podName:6597fc34-10ee-4984-9c69-f4b7c0d46e2a nodeName:}" failed. No retries permitted until 2026-03-09 13:24:48.586834751 +0000 UTC m=+243.837006839 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/6597fc34-10ee-4984-9c69-f4b7c0d46e2a-metrics-certs") pod "network-metrics-daemon-wkwdz" (UID: "6597fc34-10ee-4984-9c69-f4b7c0d46e2a") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 09 13:23:44 crc kubenswrapper[4764]: I0309 13:23:44.869023 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:23:44 crc kubenswrapper[4764]: I0309 13:23:44.869069 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:23:44 crc kubenswrapper[4764]: I0309 13:23:44.869078 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:23:44 crc kubenswrapper[4764]: I0309 13:23:44.869095 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:23:44 crc kubenswrapper[4764]: I0309 13:23:44.869107 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:23:44Z","lastTransitionTime":"2026-03-09T13:23:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:23:44 crc kubenswrapper[4764]: E0309 13:23:44.882958 4764 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:23:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:23:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:23:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:23:44Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:23:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:23:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:23:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:23:44Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"44d5d5b8-3cb1-4753-9ed2-c6ede7c42d06\\\",\\\"systemUUID\\\":\\\"470a86bd-e6aa-42c1-b220-b7b8c0289210\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:44Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:44 crc kubenswrapper[4764]: I0309 13:23:44.887161 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:23:44 crc kubenswrapper[4764]: I0309 13:23:44.887201 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:23:44 crc kubenswrapper[4764]: I0309 13:23:44.887210 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:23:44 crc kubenswrapper[4764]: I0309 13:23:44.887223 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:23:44 crc kubenswrapper[4764]: I0309 13:23:44.887231 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:23:44Z","lastTransitionTime":"2026-03-09T13:23:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:23:44 crc kubenswrapper[4764]: E0309 13:23:44.906351 4764 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:23:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:23:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:23:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:23:44Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:23:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:23:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:23:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:23:44Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"44d5d5b8-3cb1-4753-9ed2-c6ede7c42d06\\\",\\\"systemUUID\\\":\\\"470a86bd-e6aa-42c1-b220-b7b8c0289210\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:44Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:44 crc kubenswrapper[4764]: I0309 13:23:44.911080 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:23:44 crc kubenswrapper[4764]: I0309 13:23:44.911184 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:23:44 crc kubenswrapper[4764]: I0309 13:23:44.911231 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:23:44 crc kubenswrapper[4764]: I0309 13:23:44.911260 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:23:44 crc kubenswrapper[4764]: I0309 13:23:44.911280 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:23:44Z","lastTransitionTime":"2026-03-09T13:23:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:23:44 crc kubenswrapper[4764]: E0309 13:23:44.926028 4764 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:23:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:23:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:23:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:23:44Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:23:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:23:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:23:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:23:44Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"44d5d5b8-3cb1-4753-9ed2-c6ede7c42d06\\\",\\\"systemUUID\\\":\\\"470a86bd-e6aa-42c1-b220-b7b8c0289210\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:44Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:44 crc kubenswrapper[4764]: I0309 13:23:44.930338 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:23:44 crc kubenswrapper[4764]: I0309 13:23:44.930380 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:23:44 crc kubenswrapper[4764]: I0309 13:23:44.930389 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:23:44 crc kubenswrapper[4764]: I0309 13:23:44.930406 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:23:44 crc kubenswrapper[4764]: I0309 13:23:44.930427 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:23:44Z","lastTransitionTime":"2026-03-09T13:23:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:23:44 crc kubenswrapper[4764]: E0309 13:23:44.949884 4764 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:23:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:23:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:23:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:23:44Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:23:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:23:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:23:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:23:44Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"44d5d5b8-3cb1-4753-9ed2-c6ede7c42d06\\\",\\\"systemUUID\\\":\\\"470a86bd-e6aa-42c1-b220-b7b8c0289210\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:44Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:44 crc kubenswrapper[4764]: I0309 13:23:44.955044 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:23:44 crc kubenswrapper[4764]: I0309 13:23:44.955088 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:23:44 crc kubenswrapper[4764]: I0309 13:23:44.955096 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:23:44 crc kubenswrapper[4764]: I0309 13:23:44.955111 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:23:44 crc kubenswrapper[4764]: I0309 13:23:44.955123 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:23:44Z","lastTransitionTime":"2026-03-09T13:23:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:23:44 crc kubenswrapper[4764]: E0309 13:23:44.967172 4764 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:23:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:23:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:23:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:23:44Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:23:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:23:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:23:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:23:44Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"44d5d5b8-3cb1-4753-9ed2-c6ede7c42d06\\\",\\\"systemUUID\\\":\\\"470a86bd-e6aa-42c1-b220-b7b8c0289210\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:44Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:44 crc kubenswrapper[4764]: E0309 13:23:44.967299 4764 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 09 13:23:45 crc kubenswrapper[4764]: I0309 13:23:45.559691 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wkwdz" Mar 09 13:23:45 crc kubenswrapper[4764]: E0309 13:23:45.559805 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-wkwdz" podUID="6597fc34-10ee-4984-9c69-f4b7c0d46e2a" Mar 09 13:23:45 crc kubenswrapper[4764]: I0309 13:23:45.579295 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1507e859-72ea-4c35-a5a7-6d0f48b19b09\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://289f145eb4f8412759fdbbc747534a08534b2f848f18206d9f9c2d03adc86944\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static
-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c220cbb5e42a64e54a853893cbdd6e5502e5f1e5cd7fa5aa7fffc3e27c5b2d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://98462dd9c7c97ff5402911d179c7739d92d293c4abacea1ed4a394ec9f33dfed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4039a3daa857df7f6171121f2a936d4893913c8374ad7229c758dd93a8821386\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a
93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a9b753c5c47533ae4771e38e6ce4f7a3822bc208557dda27fd1b525c76044f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://36c569656352f45737e634caa991a17028b70ef88ec9449a7f22edd270d63443\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\
\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://36c569656352f45737e634caa991a17028b70ef88ec9449a7f22edd270d63443\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://573ffaefe818651692cd854f41bd67b933fa6496e3a833873b40c2324b3a09e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://573ffaefe818651692cd854f41bd67b933fa6496e3a833873b40c2324b3a09e1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ae08be6456f1300abf253a9801e2585bed0bb928045bcb96921efa4b5d0200eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae08be6456f1300abf253a9801e2585bed0bb928045bcb96921efa4b5d0200eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:48Z
\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:20:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:20:45Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:45Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:45 crc kubenswrapper[4764]: I0309 13:23:45.590894 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cc78a2e5-6620-4ea8-8bc7-c62bcc3348cd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e22c92297ebb9853ff582f519aa4c98c1468e757b53b1245f7a7ab0f9fe25344\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://050547213bbff923b18f34b1196952d038bab09162ab803da36c99fa3f769dd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5819fd898c59b4cf866334e40822b6142b014a2e11f4ca75b98d549665b1575a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f97e59c94171eedb27889b9cb72fa1b241a5e09c23310576dfd844db1bfc2292\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://f97e59c94171eedb27889b9cb72fa1b241a5e09c23310576dfd844db1bfc2292\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:20:46Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:20:45Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:45Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:45 crc kubenswrapper[4764]: I0309 13:23:45.606486 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:23:45Z is after 2025-08-24T17:21:41Z" Mar 09 13:23:45 crc kubenswrapper[4764]: E0309 13:23:45.654693 4764 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 09 13:23:45 crc kubenswrapper[4764]: I0309 13:23:45.656878 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=89.656861313 podStartE2EDuration="1m29.656861313s" podCreationTimestamp="2026-03-09 13:22:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:23:45.656551303 +0000 UTC m=+180.906723211" watchObservedRunningTime="2026-03-09 13:23:45.656861313 +0000 UTC m=+180.907033221" Mar 09 13:23:45 crc kubenswrapper[4764]: I0309 13:23:45.673370 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=38.673352324 podStartE2EDuration="38.673352324s" podCreationTimestamp="2026-03-09 13:23:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:23:45.673043834 +0000 UTC m=+180.923215742" watchObservedRunningTime="2026-03-09 13:23:45.673352324 +0000 UTC m=+180.923524232" Mar 09 13:23:45 crc kubenswrapper[4764]: I0309 13:23:45.738597 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-hbmjc" podStartSLOduration=115.738571904 podStartE2EDuration="1m55.738571904s" podCreationTimestamp="2026-03-09 13:21:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:23:45.728385158 +0000 UTC m=+180.978557066" watchObservedRunningTime="2026-03-09 13:23:45.738571904 +0000 UTC m=+180.988743812" Mar 09 13:23:45 crc kubenswrapper[4764]: I0309 13:23:45.750944 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hzjxs" podStartSLOduration=114.750924757 
podStartE2EDuration="1m54.750924757s" podCreationTimestamp="2026-03-09 13:21:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:23:45.739474802 +0000 UTC m=+180.989646750" watchObservedRunningTime="2026-03-09 13:23:45.750924757 +0000 UTC m=+181.001096665" Mar 09 13:23:45 crc kubenswrapper[4764]: I0309 13:23:45.776555 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-r5bnx" podStartSLOduration=115.77653816 podStartE2EDuration="1m55.77653816s" podCreationTimestamp="2026-03-09 13:21:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:23:45.772362491 +0000 UTC m=+181.022534399" watchObservedRunningTime="2026-03-09 13:23:45.77653816 +0000 UTC m=+181.026710068" Mar 09 13:23:45 crc kubenswrapper[4764]: I0309 13:23:45.803221 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-zmzm7" podStartSLOduration=115.803202096 podStartE2EDuration="1m55.803202096s" podCreationTimestamp="2026-03-09 13:21:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:23:45.78850186 +0000 UTC m=+181.038673768" watchObservedRunningTime="2026-03-09 13:23:45.803202096 +0000 UTC m=+181.053374004" Mar 09 13:23:45 crc kubenswrapper[4764]: I0309 13:23:45.813546 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-crvdf" podStartSLOduration=115.813525065 podStartE2EDuration="1m55.813525065s" podCreationTimestamp="2026-03-09 13:21:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:23:45.802860115 +0000 UTC m=+181.053032053" 
watchObservedRunningTime="2026-03-09 13:23:45.813525065 +0000 UTC m=+181.063696973" Mar 09 13:23:45 crc kubenswrapper[4764]: I0309 13:23:45.814215 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=42.814209127 podStartE2EDuration="42.814209127s" podCreationTimestamp="2026-03-09 13:23:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:23:45.813431343 +0000 UTC m=+181.063603271" watchObservedRunningTime="2026-03-09 13:23:45.814209127 +0000 UTC m=+181.064381035" Mar 09 13:23:45 crc kubenswrapper[4764]: I0309 13:23:45.824122 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podStartSLOduration=115.824107063 podStartE2EDuration="1m55.824107063s" podCreationTimestamp="2026-03-09 13:21:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:23:45.823271147 +0000 UTC m=+181.073443055" watchObservedRunningTime="2026-03-09 13:23:45.824107063 +0000 UTC m=+181.074278981" Mar 09 13:23:46 crc kubenswrapper[4764]: I0309 13:23:46.559601 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 13:23:46 crc kubenswrapper[4764]: E0309 13:23:46.559817 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 13:23:46 crc kubenswrapper[4764]: I0309 13:23:46.559826 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 13:23:46 crc kubenswrapper[4764]: I0309 13:23:46.559925 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 13:23:46 crc kubenswrapper[4764]: E0309 13:23:46.560088 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 13:23:46 crc kubenswrapper[4764]: E0309 13:23:46.560207 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 13:23:47 crc kubenswrapper[4764]: I0309 13:23:47.559388 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-wkwdz" Mar 09 13:23:47 crc kubenswrapper[4764]: E0309 13:23:47.559776 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wkwdz" podUID="6597fc34-10ee-4984-9c69-f4b7c0d46e2a" Mar 09 13:23:48 crc kubenswrapper[4764]: I0309 13:23:48.558663 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 13:23:48 crc kubenswrapper[4764]: E0309 13:23:48.558825 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 13:23:48 crc kubenswrapper[4764]: I0309 13:23:48.558671 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 13:23:48 crc kubenswrapper[4764]: E0309 13:23:48.558912 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 13:23:48 crc kubenswrapper[4764]: I0309 13:23:48.558672 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 13:23:48 crc kubenswrapper[4764]: E0309 13:23:48.558989 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 13:23:49 crc kubenswrapper[4764]: I0309 13:23:49.558841 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wkwdz" Mar 09 13:23:49 crc kubenswrapper[4764]: E0309 13:23:49.559072 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wkwdz" podUID="6597fc34-10ee-4984-9c69-f4b7c0d46e2a" Mar 09 13:23:50 crc kubenswrapper[4764]: I0309 13:23:50.559217 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 13:23:50 crc kubenswrapper[4764]: I0309 13:23:50.559221 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 13:23:50 crc kubenswrapper[4764]: E0309 13:23:50.559902 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 13:23:50 crc kubenswrapper[4764]: E0309 13:23:50.559818 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 13:23:50 crc kubenswrapper[4764]: I0309 13:23:50.559304 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 13:23:50 crc kubenswrapper[4764]: E0309 13:23:50.560047 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 13:23:50 crc kubenswrapper[4764]: E0309 13:23:50.655882 4764 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 09 13:23:51 crc kubenswrapper[4764]: I0309 13:23:51.559470 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wkwdz" Mar 09 13:23:51 crc kubenswrapper[4764]: E0309 13:23:51.559684 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wkwdz" podUID="6597fc34-10ee-4984-9c69-f4b7c0d46e2a" Mar 09 13:23:52 crc kubenswrapper[4764]: I0309 13:23:52.558929 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 13:23:52 crc kubenswrapper[4764]: I0309 13:23:52.559029 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 13:23:52 crc kubenswrapper[4764]: I0309 13:23:52.558959 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 13:23:52 crc kubenswrapper[4764]: E0309 13:23:52.559121 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 13:23:52 crc kubenswrapper[4764]: E0309 13:23:52.559271 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 13:23:52 crc kubenswrapper[4764]: E0309 13:23:52.559397 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 13:23:53 crc kubenswrapper[4764]: I0309 13:23:53.559459 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-wkwdz" Mar 09 13:23:53 crc kubenswrapper[4764]: E0309 13:23:53.559803 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wkwdz" podUID="6597fc34-10ee-4984-9c69-f4b7c0d46e2a" Mar 09 13:23:54 crc kubenswrapper[4764]: I0309 13:23:54.559592 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 13:23:54 crc kubenswrapper[4764]: I0309 13:23:54.559700 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 13:23:54 crc kubenswrapper[4764]: E0309 13:23:54.559747 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 13:23:54 crc kubenswrapper[4764]: I0309 13:23:54.559800 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 13:23:54 crc kubenswrapper[4764]: E0309 13:23:54.559982 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 13:23:54 crc kubenswrapper[4764]: E0309 13:23:54.560129 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 13:23:55 crc kubenswrapper[4764]: I0309 13:23:55.150370 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:23:55 crc kubenswrapper[4764]: I0309 13:23:55.150428 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:23:55 crc kubenswrapper[4764]: I0309 13:23:55.150440 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:23:55 crc kubenswrapper[4764]: I0309 13:23:55.150463 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:23:55 crc kubenswrapper[4764]: I0309 13:23:55.150477 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:23:55Z","lastTransitionTime":"2026-03-09T13:23:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:23:55 crc kubenswrapper[4764]: I0309 13:23:55.199787 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-lpx7s"] Mar 09 13:23:55 crc kubenswrapper[4764]: I0309 13:23:55.200381 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-lpx7s" Mar 09 13:23:55 crc kubenswrapper[4764]: I0309 13:23:55.202546 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Mar 09 13:23:55 crc kubenswrapper[4764]: I0309 13:23:55.202596 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Mar 09 13:23:55 crc kubenswrapper[4764]: I0309 13:23:55.202562 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Mar 09 13:23:55 crc kubenswrapper[4764]: I0309 13:23:55.205576 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Mar 09 13:23:55 crc kubenswrapper[4764]: I0309 13:23:55.232020 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=42.232001317 podStartE2EDuration="42.232001317s" podCreationTimestamp="2026-03-09 13:23:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:23:55.231237943 +0000 UTC m=+190.481409871" watchObservedRunningTime="2026-03-09 13:23:55.232001317 +0000 UTC m=+190.482173245" Mar 09 13:23:55 crc kubenswrapper[4764]: I0309 13:23:55.245974 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=71.245949129 
podStartE2EDuration="1m11.245949129s" podCreationTimestamp="2026-03-09 13:22:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:23:55.243302347 +0000 UTC m=+190.493474255" watchObservedRunningTime="2026-03-09 13:23:55.245949129 +0000 UTC m=+190.496121037" Mar 09 13:23:55 crc kubenswrapper[4764]: I0309 13:23:55.298456 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c544e05e-a876-4e15-bcfe-947cad49b850-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-lpx7s\" (UID: \"c544e05e-a876-4e15-bcfe-947cad49b850\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-lpx7s" Mar 09 13:23:55 crc kubenswrapper[4764]: I0309 13:23:55.298537 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/c544e05e-a876-4e15-bcfe-947cad49b850-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-lpx7s\" (UID: \"c544e05e-a876-4e15-bcfe-947cad49b850\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-lpx7s" Mar 09 13:23:55 crc kubenswrapper[4764]: I0309 13:23:55.298614 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/c544e05e-a876-4e15-bcfe-947cad49b850-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-lpx7s\" (UID: \"c544e05e-a876-4e15-bcfe-947cad49b850\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-lpx7s" Mar 09 13:23:55 crc kubenswrapper[4764]: I0309 13:23:55.298633 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c544e05e-a876-4e15-bcfe-947cad49b850-serving-cert\") pod 
\"cluster-version-operator-5c965bbfc6-lpx7s\" (UID: \"c544e05e-a876-4e15-bcfe-947cad49b850\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-lpx7s" Mar 09 13:23:55 crc kubenswrapper[4764]: I0309 13:23:55.298698 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c544e05e-a876-4e15-bcfe-947cad49b850-service-ca\") pod \"cluster-version-operator-5c965bbfc6-lpx7s\" (UID: \"c544e05e-a876-4e15-bcfe-947cad49b850\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-lpx7s" Mar 09 13:23:55 crc kubenswrapper[4764]: I0309 13:23:55.400632 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c544e05e-a876-4e15-bcfe-947cad49b850-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-lpx7s\" (UID: \"c544e05e-a876-4e15-bcfe-947cad49b850\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-lpx7s" Mar 09 13:23:55 crc kubenswrapper[4764]: I0309 13:23:55.400766 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/c544e05e-a876-4e15-bcfe-947cad49b850-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-lpx7s\" (UID: \"c544e05e-a876-4e15-bcfe-947cad49b850\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-lpx7s" Mar 09 13:23:55 crc kubenswrapper[4764]: I0309 13:23:55.400844 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/c544e05e-a876-4e15-bcfe-947cad49b850-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-lpx7s\" (UID: \"c544e05e-a876-4e15-bcfe-947cad49b850\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-lpx7s" Mar 09 13:23:55 crc kubenswrapper[4764]: I0309 13:23:55.400874 4764 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c544e05e-a876-4e15-bcfe-947cad49b850-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-lpx7s\" (UID: \"c544e05e-a876-4e15-bcfe-947cad49b850\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-lpx7s" Mar 09 13:23:55 crc kubenswrapper[4764]: I0309 13:23:55.400907 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c544e05e-a876-4e15-bcfe-947cad49b850-service-ca\") pod \"cluster-version-operator-5c965bbfc6-lpx7s\" (UID: \"c544e05e-a876-4e15-bcfe-947cad49b850\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-lpx7s" Mar 09 13:23:55 crc kubenswrapper[4764]: I0309 13:23:55.400950 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/c544e05e-a876-4e15-bcfe-947cad49b850-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-lpx7s\" (UID: \"c544e05e-a876-4e15-bcfe-947cad49b850\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-lpx7s" Mar 09 13:23:55 crc kubenswrapper[4764]: I0309 13:23:55.400961 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/c544e05e-a876-4e15-bcfe-947cad49b850-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-lpx7s\" (UID: \"c544e05e-a876-4e15-bcfe-947cad49b850\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-lpx7s" Mar 09 13:23:55 crc kubenswrapper[4764]: I0309 13:23:55.402530 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c544e05e-a876-4e15-bcfe-947cad49b850-service-ca\") pod \"cluster-version-operator-5c965bbfc6-lpx7s\" (UID: \"c544e05e-a876-4e15-bcfe-947cad49b850\") " 
pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-lpx7s" Mar 09 13:23:55 crc kubenswrapper[4764]: I0309 13:23:55.411317 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c544e05e-a876-4e15-bcfe-947cad49b850-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-lpx7s\" (UID: \"c544e05e-a876-4e15-bcfe-947cad49b850\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-lpx7s" Mar 09 13:23:55 crc kubenswrapper[4764]: I0309 13:23:55.427634 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c544e05e-a876-4e15-bcfe-947cad49b850-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-lpx7s\" (UID: \"c544e05e-a876-4e15-bcfe-947cad49b850\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-lpx7s" Mar 09 13:23:55 crc kubenswrapper[4764]: I0309 13:23:55.515288 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-lpx7s" Mar 09 13:23:55 crc kubenswrapper[4764]: W0309 13:23:55.530738 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc544e05e_a876_4e15_bcfe_947cad49b850.slice/crio-13736dcabd84d0c7eb41312a46b31318b883e83b56a656f38df72ee8b2fe4764 WatchSource:0}: Error finding container 13736dcabd84d0c7eb41312a46b31318b883e83b56a656f38df72ee8b2fe4764: Status 404 returned error can't find the container with id 13736dcabd84d0c7eb41312a46b31318b883e83b56a656f38df72ee8b2fe4764 Mar 09 13:23:55 crc kubenswrapper[4764]: I0309 13:23:55.558950 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-wkwdz" Mar 09 13:23:55 crc kubenswrapper[4764]: E0309 13:23:55.560378 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wkwdz" podUID="6597fc34-10ee-4984-9c69-f4b7c0d46e2a" Mar 09 13:23:55 crc kubenswrapper[4764]: I0309 13:23:55.610735 4764 certificate_manager.go:356] kubernetes.io/kubelet-serving: Rotating certificates Mar 09 13:23:55 crc kubenswrapper[4764]: I0309 13:23:55.620439 4764 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Mar 09 13:23:55 crc kubenswrapper[4764]: E0309 13:23:55.656569 4764 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 09 13:23:56 crc kubenswrapper[4764]: I0309 13:23:56.305982 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-lpx7s" event={"ID":"c544e05e-a876-4e15-bcfe-947cad49b850","Type":"ContainerStarted","Data":"d2b10a3465d5a7b3351bf4f381a7550f42a9f289553c4ac878e37c5c549ade28"} Mar 09 13:23:56 crc kubenswrapper[4764]: I0309 13:23:56.306060 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-lpx7s" event={"ID":"c544e05e-a876-4e15-bcfe-947cad49b850","Type":"ContainerStarted","Data":"13736dcabd84d0c7eb41312a46b31318b883e83b56a656f38df72ee8b2fe4764"} Mar 09 13:23:56 crc kubenswrapper[4764]: I0309 13:23:56.328097 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-lpx7s" podStartSLOduration=126.328075056 podStartE2EDuration="2m6.328075056s" podCreationTimestamp="2026-03-09 13:21:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:23:56.32790183 +0000 UTC m=+191.578073808" watchObservedRunningTime="2026-03-09 13:23:56.328075056 +0000 UTC m=+191.578246954" Mar 09 13:23:56 crc kubenswrapper[4764]: I0309 13:23:56.559683 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 13:23:56 crc kubenswrapper[4764]: I0309 13:23:56.559760 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 13:23:56 crc kubenswrapper[4764]: I0309 13:23:56.559792 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 09 13:23:56 crc kubenswrapper[4764]: E0309 13:23:56.559826 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 09 13:23:56 crc kubenswrapper[4764]: E0309 13:23:56.559870 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 09 13:23:56 crc kubenswrapper[4764]: E0309 13:23:56.559940 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 09 13:23:56 crc kubenswrapper[4764]: I0309 13:23:56.560558 4764 scope.go:117] "RemoveContainer" containerID="9162f20dc3577683eb7f764a353d0245d4748a1f3cdc3d3886dbea7e03e9714f"
Mar 09 13:23:56 crc kubenswrapper[4764]: E0309 13:23:56.560753 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-7kggv_openshift-ovn-kubernetes(b8ccb4f5-550a-41b2-b39d-201cdd5d902a)\"" pod="openshift-ovn-kubernetes/ovnkube-node-7kggv" podUID="b8ccb4f5-550a-41b2-b39d-201cdd5d902a"
Mar 09 13:23:57 crc kubenswrapper[4764]: I0309 13:23:57.559739 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wkwdz"
Mar 09 13:23:57 crc kubenswrapper[4764]: E0309 13:23:57.559922 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wkwdz" podUID="6597fc34-10ee-4984-9c69-f4b7c0d46e2a"
Mar 09 13:23:58 crc kubenswrapper[4764]: I0309 13:23:58.559249 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 09 13:23:58 crc kubenswrapper[4764]: I0309 13:23:58.559303 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 09 13:23:58 crc kubenswrapper[4764]: E0309 13:23:58.559386 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 09 13:23:58 crc kubenswrapper[4764]: I0309 13:23:58.559441 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 09 13:23:58 crc kubenswrapper[4764]: E0309 13:23:58.559574 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 09 13:23:58 crc kubenswrapper[4764]: E0309 13:23:58.559772 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 09 13:23:59 crc kubenswrapper[4764]: I0309 13:23:59.559179 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wkwdz"
Mar 09 13:23:59 crc kubenswrapper[4764]: E0309 13:23:59.559514 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wkwdz" podUID="6597fc34-10ee-4984-9c69-f4b7c0d46e2a"
Mar 09 13:24:00 crc kubenswrapper[4764]: I0309 13:24:00.559335 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 09 13:24:00 crc kubenswrapper[4764]: I0309 13:24:00.559336 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 09 13:24:00 crc kubenswrapper[4764]: I0309 13:24:00.559348 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 09 13:24:00 crc kubenswrapper[4764]: E0309 13:24:00.559526 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 09 13:24:00 crc kubenswrapper[4764]: E0309 13:24:00.559630 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 09 13:24:00 crc kubenswrapper[4764]: E0309 13:24:00.559766 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 09 13:24:00 crc kubenswrapper[4764]: E0309 13:24:00.657825 4764 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Mar 09 13:24:01 crc kubenswrapper[4764]: I0309 13:24:01.559621 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wkwdz"
Mar 09 13:24:01 crc kubenswrapper[4764]: E0309 13:24:01.559904 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wkwdz" podUID="6597fc34-10ee-4984-9c69-f4b7c0d46e2a"
Mar 09 13:24:02 crc kubenswrapper[4764]: I0309 13:24:02.559072 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 09 13:24:02 crc kubenswrapper[4764]: I0309 13:24:02.559089 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 09 13:24:02 crc kubenswrapper[4764]: I0309 13:24:02.559169 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 09 13:24:02 crc kubenswrapper[4764]: E0309 13:24:02.559252 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 09 13:24:02 crc kubenswrapper[4764]: E0309 13:24:02.559419 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 09 13:24:02 crc kubenswrapper[4764]: E0309 13:24:02.559527 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 09 13:24:03 crc kubenswrapper[4764]: I0309 13:24:03.559421 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wkwdz"
Mar 09 13:24:03 crc kubenswrapper[4764]: E0309 13:24:03.559547 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wkwdz" podUID="6597fc34-10ee-4984-9c69-f4b7c0d46e2a"
Mar 09 13:24:04 crc kubenswrapper[4764]: I0309 13:24:04.558978 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 09 13:24:04 crc kubenswrapper[4764]: I0309 13:24:04.559105 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 09 13:24:04 crc kubenswrapper[4764]: E0309 13:24:04.559190 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 09 13:24:04 crc kubenswrapper[4764]: I0309 13:24:04.559152 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 09 13:24:04 crc kubenswrapper[4764]: E0309 13:24:04.559692 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 09 13:24:04 crc kubenswrapper[4764]: E0309 13:24:04.559742 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 09 13:24:05 crc kubenswrapper[4764]: I0309 13:24:05.559364 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wkwdz"
Mar 09 13:24:05 crc kubenswrapper[4764]: E0309 13:24:05.560695 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wkwdz" podUID="6597fc34-10ee-4984-9c69-f4b7c0d46e2a"
Mar 09 13:24:05 crc kubenswrapper[4764]: E0309 13:24:05.658439 4764 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Mar 09 13:24:06 crc kubenswrapper[4764]: I0309 13:24:06.559756 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 09 13:24:06 crc kubenswrapper[4764]: I0309 13:24:06.559834 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 09 13:24:06 crc kubenswrapper[4764]: I0309 13:24:06.559758 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 09 13:24:06 crc kubenswrapper[4764]: E0309 13:24:06.560008 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 09 13:24:06 crc kubenswrapper[4764]: E0309 13:24:06.560231 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 09 13:24:06 crc kubenswrapper[4764]: E0309 13:24:06.560279 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 09 13:24:07 crc kubenswrapper[4764]: I0309 13:24:07.559715 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wkwdz"
Mar 09 13:24:07 crc kubenswrapper[4764]: E0309 13:24:07.560049 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wkwdz" podUID="6597fc34-10ee-4984-9c69-f4b7c0d46e2a"
Mar 09 13:24:08 crc kubenswrapper[4764]: I0309 13:24:08.559261 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 09 13:24:08 crc kubenswrapper[4764]: I0309 13:24:08.559372 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 09 13:24:08 crc kubenswrapper[4764]: E0309 13:24:08.559391 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 09 13:24:08 crc kubenswrapper[4764]: I0309 13:24:08.559487 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 09 13:24:08 crc kubenswrapper[4764]: E0309 13:24:08.559682 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 09 13:24:08 crc kubenswrapper[4764]: E0309 13:24:08.559854 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 09 13:24:09 crc kubenswrapper[4764]: I0309 13:24:09.558896 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wkwdz"
Mar 09 13:24:09 crc kubenswrapper[4764]: E0309 13:24:09.559068 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wkwdz" podUID="6597fc34-10ee-4984-9c69-f4b7c0d46e2a"
Mar 09 13:24:10 crc kubenswrapper[4764]: I0309 13:24:10.559512 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 09 13:24:10 crc kubenswrapper[4764]: I0309 13:24:10.559581 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 09 13:24:10 crc kubenswrapper[4764]: I0309 13:24:10.560153 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 09 13:24:10 crc kubenswrapper[4764]: E0309 13:24:10.560290 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 09 13:24:10 crc kubenswrapper[4764]: E0309 13:24:10.560405 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 09 13:24:10 crc kubenswrapper[4764]: I0309 13:24:10.560458 4764 scope.go:117] "RemoveContainer" containerID="9162f20dc3577683eb7f764a353d0245d4748a1f3cdc3d3886dbea7e03e9714f"
Mar 09 13:24:10 crc kubenswrapper[4764]: E0309 13:24:10.560535 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 09 13:24:10 crc kubenswrapper[4764]: E0309 13:24:10.560899 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-7kggv_openshift-ovn-kubernetes(b8ccb4f5-550a-41b2-b39d-201cdd5d902a)\"" pod="openshift-ovn-kubernetes/ovnkube-node-7kggv" podUID="b8ccb4f5-550a-41b2-b39d-201cdd5d902a"
Mar 09 13:24:10 crc kubenswrapper[4764]: E0309 13:24:10.659684 4764 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Mar 09 13:24:11 crc kubenswrapper[4764]: I0309 13:24:11.558772 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wkwdz"
Mar 09 13:24:11 crc kubenswrapper[4764]: E0309 13:24:11.558901 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wkwdz" podUID="6597fc34-10ee-4984-9c69-f4b7c0d46e2a"
Mar 09 13:24:12 crc kubenswrapper[4764]: I0309 13:24:12.559348 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 09 13:24:12 crc kubenswrapper[4764]: I0309 13:24:12.559430 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 09 13:24:12 crc kubenswrapper[4764]: E0309 13:24:12.559529 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 09 13:24:12 crc kubenswrapper[4764]: I0309 13:24:12.559563 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 09 13:24:12 crc kubenswrapper[4764]: E0309 13:24:12.559829 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 09 13:24:12 crc kubenswrapper[4764]: E0309 13:24:12.560012 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 09 13:24:13 crc kubenswrapper[4764]: I0309 13:24:13.558822 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wkwdz"
Mar 09 13:24:13 crc kubenswrapper[4764]: E0309 13:24:13.558968 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wkwdz" podUID="6597fc34-10ee-4984-9c69-f4b7c0d46e2a"
Mar 09 13:24:14 crc kubenswrapper[4764]: I0309 13:24:14.360339 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-zmzm7_202a1f58-ce83-4374-ac48-dc806f7b9d6b/kube-multus/1.log"
Mar 09 13:24:14 crc kubenswrapper[4764]: I0309 13:24:14.361333 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-zmzm7_202a1f58-ce83-4374-ac48-dc806f7b9d6b/kube-multus/0.log"
Mar 09 13:24:14 crc kubenswrapper[4764]: I0309 13:24:14.361379 4764 generic.go:334] "Generic (PLEG): container finished" podID="202a1f58-ce83-4374-ac48-dc806f7b9d6b" containerID="ae16a6074b17ec1df33139f5a8f220d15bea5658fc20ad653e2af7b2427dec6a" exitCode=1
Mar 09 13:24:14 crc kubenswrapper[4764]: I0309 13:24:14.361406 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-zmzm7" event={"ID":"202a1f58-ce83-4374-ac48-dc806f7b9d6b","Type":"ContainerDied","Data":"ae16a6074b17ec1df33139f5a8f220d15bea5658fc20ad653e2af7b2427dec6a"}
Mar 09 13:24:14 crc kubenswrapper[4764]: I0309 13:24:14.361442 4764 scope.go:117] "RemoveContainer" containerID="05cbc20845591e82618289ff2509634cc5e2d05362e8414b3ff91a73d0719331"
Mar 09 13:24:14 crc kubenswrapper[4764]: I0309 13:24:14.361822 4764 scope.go:117] "RemoveContainer" containerID="ae16a6074b17ec1df33139f5a8f220d15bea5658fc20ad653e2af7b2427dec6a"
Mar 09 13:24:14 crc kubenswrapper[4764]: E0309 13:24:14.362007 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-multus pod=multus-zmzm7_openshift-multus(202a1f58-ce83-4374-ac48-dc806f7b9d6b)\"" pod="openshift-multus/multus-zmzm7" podUID="202a1f58-ce83-4374-ac48-dc806f7b9d6b"
Mar 09 13:24:14 crc kubenswrapper[4764]: I0309 13:24:14.558839 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 09 13:24:14 crc kubenswrapper[4764]: I0309 13:24:14.558928 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 09 13:24:14 crc kubenswrapper[4764]: E0309 13:24:14.558967 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 09 13:24:14 crc kubenswrapper[4764]: I0309 13:24:14.558933 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 09 13:24:14 crc kubenswrapper[4764]: E0309 13:24:14.559072 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 09 13:24:14 crc kubenswrapper[4764]: E0309 13:24:14.559184 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 09 13:24:15 crc kubenswrapper[4764]: I0309 13:24:15.366790 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-zmzm7_202a1f58-ce83-4374-ac48-dc806f7b9d6b/kube-multus/1.log"
Mar 09 13:24:15 crc kubenswrapper[4764]: I0309 13:24:15.558932 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wkwdz"
Mar 09 13:24:15 crc kubenswrapper[4764]: E0309 13:24:15.559963 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wkwdz" podUID="6597fc34-10ee-4984-9c69-f4b7c0d46e2a"
Mar 09 13:24:15 crc kubenswrapper[4764]: E0309 13:24:15.660392 4764 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Mar 09 13:24:16 crc kubenswrapper[4764]: I0309 13:24:16.559085 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 09 13:24:16 crc kubenswrapper[4764]: I0309 13:24:16.559117 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 09 13:24:16 crc kubenswrapper[4764]: I0309 13:24:16.559139 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 09 13:24:16 crc kubenswrapper[4764]: E0309 13:24:16.559222 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 09 13:24:16 crc kubenswrapper[4764]: E0309 13:24:16.559316 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 09 13:24:16 crc kubenswrapper[4764]: E0309 13:24:16.559390 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 09 13:24:17 crc kubenswrapper[4764]: I0309 13:24:17.559753 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wkwdz"
Mar 09 13:24:17 crc kubenswrapper[4764]: E0309 13:24:17.559911 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wkwdz" podUID="6597fc34-10ee-4984-9c69-f4b7c0d46e2a"
Mar 09 13:24:18 crc kubenswrapper[4764]: I0309 13:24:18.558710 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 09 13:24:18 crc kubenswrapper[4764]: I0309 13:24:18.558772 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 09 13:24:18 crc kubenswrapper[4764]: I0309 13:24:18.558729 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 09 13:24:18 crc kubenswrapper[4764]: E0309 13:24:18.558850 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 09 13:24:18 crc kubenswrapper[4764]: E0309 13:24:18.558946 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 09 13:24:18 crc kubenswrapper[4764]: E0309 13:24:18.558980 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 09 13:24:19 crc kubenswrapper[4764]: I0309 13:24:19.559907 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wkwdz"
Mar 09 13:24:19 crc kubenswrapper[4764]: E0309 13:24:19.560110 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wkwdz" podUID="6597fc34-10ee-4984-9c69-f4b7c0d46e2a"
Mar 09 13:24:20 crc kubenswrapper[4764]: I0309 13:24:20.559121 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 09 13:24:20 crc kubenswrapper[4764]: I0309 13:24:20.559217 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 09 13:24:20 crc kubenswrapper[4764]: I0309 13:24:20.559324 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 09 13:24:20 crc kubenswrapper[4764]: E0309 13:24:20.559309 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 09 13:24:20 crc kubenswrapper[4764]: E0309 13:24:20.559508 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 09 13:24:20 crc kubenswrapper[4764]: E0309 13:24:20.559617 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 09 13:24:20 crc kubenswrapper[4764]: E0309 13:24:20.661552 4764 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Mar 09 13:24:21 crc kubenswrapper[4764]: I0309 13:24:21.559116 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wkwdz"
Mar 09 13:24:21 crc kubenswrapper[4764]: E0309 13:24:21.559385 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wkwdz" podUID="6597fc34-10ee-4984-9c69-f4b7c0d46e2a"
Mar 09 13:24:22 crc kubenswrapper[4764]: I0309 13:24:22.559405 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 09 13:24:22 crc kubenswrapper[4764]: I0309 13:24:22.559449 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 09 13:24:22 crc kubenswrapper[4764]: I0309 13:24:22.559421 4764 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 13:24:22 crc kubenswrapper[4764]: E0309 13:24:22.559553 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 13:24:22 crc kubenswrapper[4764]: E0309 13:24:22.559601 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 13:24:22 crc kubenswrapper[4764]: E0309 13:24:22.559665 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 13:24:22 crc kubenswrapper[4764]: I0309 13:24:22.590937 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 13:24:22 crc kubenswrapper[4764]: I0309 13:24:22.591024 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 13:24:22 crc kubenswrapper[4764]: I0309 13:24:22.591055 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 13:24:22 crc kubenswrapper[4764]: I0309 13:24:22.591096 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 13:24:22 crc kubenswrapper[4764]: I0309 13:24:22.591119 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: 
\"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 13:24:22 crc kubenswrapper[4764]: E0309 13:24:22.591168 4764 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 09 13:24:22 crc kubenswrapper[4764]: E0309 13:24:22.591205 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 13:26:24.591172262 +0000 UTC m=+339.841344180 (durationBeforeRetry 2m2s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 13:24:22 crc kubenswrapper[4764]: E0309 13:24:22.591246 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-09 13:26:24.591234864 +0000 UTC m=+339.841406782 (durationBeforeRetry 2m2s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 09 13:24:22 crc kubenswrapper[4764]: E0309 13:24:22.591270 4764 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 09 13:24:22 crc kubenswrapper[4764]: E0309 13:24:22.591347 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-09 13:26:24.591329797 +0000 UTC m=+339.841501705 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 09 13:24:22 crc kubenswrapper[4764]: E0309 13:24:22.591279 4764 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 09 13:24:22 crc kubenswrapper[4764]: E0309 13:24:22.591420 4764 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 09 13:24:22 crc kubenswrapper[4764]: E0309 13:24:22.591447 4764 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod 
openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 09 13:24:22 crc kubenswrapper[4764]: E0309 13:24:22.591207 4764 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 09 13:24:22 crc kubenswrapper[4764]: E0309 13:24:22.592240 4764 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 09 13:24:22 crc kubenswrapper[4764]: E0309 13:24:22.592267 4764 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 09 13:24:22 crc kubenswrapper[4764]: E0309 13:24:22.592359 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-09 13:26:24.591498531 +0000 UTC m=+339.841670519 (durationBeforeRetry 2m2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 09 13:24:22 crc kubenswrapper[4764]: E0309 13:24:22.592409 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-09 13:26:24.592394575 +0000 UTC m=+339.842566503 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 09 13:24:23 crc kubenswrapper[4764]: I0309 13:24:23.559767 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wkwdz" Mar 09 13:24:23 crc kubenswrapper[4764]: E0309 13:24:23.560109 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wkwdz" podUID="6597fc34-10ee-4984-9c69-f4b7c0d46e2a" Mar 09 13:24:24 crc kubenswrapper[4764]: I0309 13:24:24.559280 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 13:24:24 crc kubenswrapper[4764]: I0309 13:24:24.559337 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 13:24:24 crc kubenswrapper[4764]: I0309 13:24:24.559356 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 13:24:24 crc kubenswrapper[4764]: E0309 13:24:24.559454 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 13:24:24 crc kubenswrapper[4764]: E0309 13:24:24.559577 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 13:24:24 crc kubenswrapper[4764]: E0309 13:24:24.559677 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 13:24:25 crc kubenswrapper[4764]: I0309 13:24:25.559066 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wkwdz" Mar 09 13:24:25 crc kubenswrapper[4764]: E0309 13:24:25.560476 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wkwdz" podUID="6597fc34-10ee-4984-9c69-f4b7c0d46e2a" Mar 09 13:24:25 crc kubenswrapper[4764]: I0309 13:24:25.561317 4764 scope.go:117] "RemoveContainer" containerID="9162f20dc3577683eb7f764a353d0245d4748a1f3cdc3d3886dbea7e03e9714f" Mar 09 13:24:25 crc kubenswrapper[4764]: E0309 13:24:25.662206 4764 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 09 13:24:26 crc kubenswrapper[4764]: I0309 13:24:26.399959 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7kggv_b8ccb4f5-550a-41b2-b39d-201cdd5d902a/ovnkube-controller/3.log" Mar 09 13:24:26 crc kubenswrapper[4764]: I0309 13:24:26.402592 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7kggv" event={"ID":"b8ccb4f5-550a-41b2-b39d-201cdd5d902a","Type":"ContainerStarted","Data":"00007c279b62a4251162cd96ec8fdba9360c1525e9c8d4622efd50b129e51d29"} Mar 09 13:24:26 crc kubenswrapper[4764]: I0309 13:24:26.403049 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-7kggv" Mar 09 13:24:26 crc kubenswrapper[4764]: I0309 13:24:26.427099 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-wkwdz"] Mar 09 13:24:26 crc kubenswrapper[4764]: I0309 13:24:26.427229 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wkwdz" Mar 09 13:24:26 crc kubenswrapper[4764]: E0309 13:24:26.427334 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wkwdz" podUID="6597fc34-10ee-4984-9c69-f4b7c0d46e2a" Mar 09 13:24:26 crc kubenswrapper[4764]: I0309 13:24:26.558768 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 13:24:26 crc kubenswrapper[4764]: I0309 13:24:26.558794 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 13:24:26 crc kubenswrapper[4764]: I0309 13:24:26.558795 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 13:24:26 crc kubenswrapper[4764]: E0309 13:24:26.558903 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 13:24:26 crc kubenswrapper[4764]: E0309 13:24:26.559036 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 13:24:26 crc kubenswrapper[4764]: E0309 13:24:26.559134 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 13:24:27 crc kubenswrapper[4764]: I0309 13:24:27.558993 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-wkwdz" Mar 09 13:24:27 crc kubenswrapper[4764]: E0309 13:24:27.559260 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wkwdz" podUID="6597fc34-10ee-4984-9c69-f4b7c0d46e2a" Mar 09 13:24:27 crc kubenswrapper[4764]: I0309 13:24:27.559451 4764 scope.go:117] "RemoveContainer" containerID="ae16a6074b17ec1df33139f5a8f220d15bea5658fc20ad653e2af7b2427dec6a" Mar 09 13:24:27 crc kubenswrapper[4764]: I0309 13:24:27.577129 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-7kggv" podStartSLOduration=157.577113176 podStartE2EDuration="2m37.577113176s" podCreationTimestamp="2026-03-09 13:21:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:24:26.43320687 +0000 UTC m=+221.683378788" watchObservedRunningTime="2026-03-09 13:24:27.577113176 +0000 UTC m=+222.827285084" Mar 09 13:24:28 crc kubenswrapper[4764]: I0309 13:24:28.411496 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-zmzm7_202a1f58-ce83-4374-ac48-dc806f7b9d6b/kube-multus/1.log" Mar 09 13:24:28 crc kubenswrapper[4764]: I0309 13:24:28.411902 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-zmzm7" event={"ID":"202a1f58-ce83-4374-ac48-dc806f7b9d6b","Type":"ContainerStarted","Data":"63894bb3203376238f9013fa9abe0d5e50f67b5675ce93e7ce0b6cdd92b326b3"} Mar 09 13:24:28 crc kubenswrapper[4764]: I0309 13:24:28.559518 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 13:24:28 crc kubenswrapper[4764]: E0309 13:24:28.559881 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 13:24:28 crc kubenswrapper[4764]: I0309 13:24:28.559538 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 13:24:28 crc kubenswrapper[4764]: E0309 13:24:28.560455 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 13:24:28 crc kubenswrapper[4764]: I0309 13:24:28.559518 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 13:24:28 crc kubenswrapper[4764]: E0309 13:24:28.560764 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 13:24:29 crc kubenswrapper[4764]: I0309 13:24:29.558931 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wkwdz" Mar 09 13:24:29 crc kubenswrapper[4764]: E0309 13:24:29.559058 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wkwdz" podUID="6597fc34-10ee-4984-9c69-f4b7c0d46e2a" Mar 09 13:24:30 crc kubenswrapper[4764]: I0309 13:24:30.559492 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 13:24:30 crc kubenswrapper[4764]: E0309 13:24:30.560126 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 13:24:30 crc kubenswrapper[4764]: I0309 13:24:30.559529 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 13:24:30 crc kubenswrapper[4764]: E0309 13:24:30.560384 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 13:24:30 crc kubenswrapper[4764]: I0309 13:24:30.559512 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 13:24:30 crc kubenswrapper[4764]: E0309 13:24:30.560625 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 13:24:31 crc kubenswrapper[4764]: I0309 13:24:31.559128 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wkwdz" Mar 09 13:24:31 crc kubenswrapper[4764]: I0309 13:24:31.561029 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Mar 09 13:24:31 crc kubenswrapper[4764]: I0309 13:24:31.563107 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Mar 09 13:24:32 crc kubenswrapper[4764]: I0309 13:24:32.558678 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 13:24:32 crc kubenswrapper[4764]: I0309 13:24:32.558710 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 13:24:32 crc kubenswrapper[4764]: I0309 13:24:32.558733 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 13:24:32 crc kubenswrapper[4764]: I0309 13:24:32.562861 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Mar 09 13:24:32 crc kubenswrapper[4764]: I0309 13:24:32.563689 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Mar 09 13:24:32 crc kubenswrapper[4764]: I0309 13:24:32.563730 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Mar 09 13:24:32 crc kubenswrapper[4764]: I0309 13:24:32.564120 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Mar 09 13:24:35 crc kubenswrapper[4764]: I0309 13:24:35.849943 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady" Mar 09 13:24:35 crc kubenswrapper[4764]: I0309 13:24:35.893561 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-gst9d"] Mar 09 13:24:35 crc kubenswrapper[4764]: I0309 13:24:35.894432 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-gst9d" Mar 09 13:24:35 crc kubenswrapper[4764]: I0309 13:24:35.903826 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Mar 09 13:24:35 crc kubenswrapper[4764]: I0309 13:24:35.903837 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Mar 09 13:24:35 crc kubenswrapper[4764]: I0309 13:24:35.903948 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Mar 09 13:24:35 crc kubenswrapper[4764]: I0309 13:24:35.904482 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Mar 09 13:24:35 crc kubenswrapper[4764]: I0309 13:24:35.904488 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Mar 09 13:24:35 crc kubenswrapper[4764]: I0309 13:24:35.904632 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Mar 09 13:24:35 crc kubenswrapper[4764]: I0309 13:24:35.905158 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Mar 09 13:24:35 crc kubenswrapper[4764]: I0309 13:24:35.906343 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-tgqwl"] Mar 09 13:24:35 crc kubenswrapper[4764]: I0309 13:24:35.906582 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Mar 09 13:24:35 crc kubenswrapper[4764]: I0309 13:24:35.906805 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Mar 09 13:24:35 crc kubenswrapper[4764]: I0309 13:24:35.907234 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-tgqwl" Mar 09 13:24:35 crc kubenswrapper[4764]: I0309 13:24:35.908712 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-ptjnd"] Mar 09 13:24:35 crc kubenswrapper[4764]: I0309 13:24:35.909283 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-ptjnd" Mar 09 13:24:35 crc kubenswrapper[4764]: I0309 13:24:35.910448 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 09 13:24:35 crc kubenswrapper[4764]: I0309 13:24:35.914042 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Mar 09 13:24:35 crc kubenswrapper[4764]: I0309 13:24:35.914117 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Mar 09 13:24:35 crc kubenswrapper[4764]: I0309 13:24:35.914695 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 09 13:24:35 crc kubenswrapper[4764]: I0309 13:24:35.915385 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Mar 09 13:24:35 crc kubenswrapper[4764]: I0309 13:24:35.915412 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Mar 09 13:24:35 crc kubenswrapper[4764]: I0309 13:24:35.915414 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 09 13:24:35 crc kubenswrapper[4764]: I0309 13:24:35.915530 4764 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-authentication-operator"/"serving-cert" Mar 09 13:24:35 crc kubenswrapper[4764]: I0309 13:24:35.915796 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 09 13:24:35 crc kubenswrapper[4764]: I0309 13:24:35.915862 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 09 13:24:35 crc kubenswrapper[4764]: I0309 13:24:35.916182 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Mar 09 13:24:35 crc kubenswrapper[4764]: I0309 13:24:35.916269 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 09 13:24:35 crc kubenswrapper[4764]: I0309 13:24:35.917308 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Mar 09 13:24:35 crc kubenswrapper[4764]: I0309 13:24:35.929529 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-fxv7j"] Mar 09 13:24:35 crc kubenswrapper[4764]: I0309 13:24:35.929623 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9h4rc\" (UniqueName: \"kubernetes.io/projected/1ffb8d96-e6e4-4859-ae7d-37f900979485-kube-api-access-9h4rc\") pod \"apiserver-76f77b778f-gst9d\" (UID: \"1ffb8d96-e6e4-4859-ae7d-37f900979485\") " pod="openshift-apiserver/apiserver-76f77b778f-gst9d" Mar 09 13:24:35 crc kubenswrapper[4764]: I0309 13:24:35.929937 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1ffb8d96-e6e4-4859-ae7d-37f900979485-image-import-ca\") pod \"apiserver-76f77b778f-gst9d\" (UID: \"1ffb8d96-e6e4-4859-ae7d-37f900979485\") " 
pod="openshift-apiserver/apiserver-76f77b778f-gst9d" Mar 09 13:24:35 crc kubenswrapper[4764]: I0309 13:24:35.929992 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1ffb8d96-e6e4-4859-ae7d-37f900979485-trusted-ca-bundle\") pod \"apiserver-76f77b778f-gst9d\" (UID: \"1ffb8d96-e6e4-4859-ae7d-37f900979485\") " pod="openshift-apiserver/apiserver-76f77b778f-gst9d" Mar 09 13:24:35 crc kubenswrapper[4764]: I0309 13:24:35.930015 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1ffb8d96-e6e4-4859-ae7d-37f900979485-encryption-config\") pod \"apiserver-76f77b778f-gst9d\" (UID: \"1ffb8d96-e6e4-4859-ae7d-37f900979485\") " pod="openshift-apiserver/apiserver-76f77b778f-gst9d" Mar 09 13:24:35 crc kubenswrapper[4764]: I0309 13:24:35.930036 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1ffb8d96-e6e4-4859-ae7d-37f900979485-audit\") pod \"apiserver-76f77b778f-gst9d\" (UID: \"1ffb8d96-e6e4-4859-ae7d-37f900979485\") " pod="openshift-apiserver/apiserver-76f77b778f-gst9d" Mar 09 13:24:35 crc kubenswrapper[4764]: I0309 13:24:35.930067 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1ffb8d96-e6e4-4859-ae7d-37f900979485-etcd-serving-ca\") pod \"apiserver-76f77b778f-gst9d\" (UID: \"1ffb8d96-e6e4-4859-ae7d-37f900979485\") " pod="openshift-apiserver/apiserver-76f77b778f-gst9d" Mar 09 13:24:35 crc kubenswrapper[4764]: I0309 13:24:35.930088 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1ffb8d96-e6e4-4859-ae7d-37f900979485-serving-cert\") pod 
\"apiserver-76f77b778f-gst9d\" (UID: \"1ffb8d96-e6e4-4859-ae7d-37f900979485\") " pod="openshift-apiserver/apiserver-76f77b778f-gst9d" Mar 09 13:24:35 crc kubenswrapper[4764]: I0309 13:24:35.930118 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1ffb8d96-e6e4-4859-ae7d-37f900979485-etcd-client\") pod \"apiserver-76f77b778f-gst9d\" (UID: \"1ffb8d96-e6e4-4859-ae7d-37f900979485\") " pod="openshift-apiserver/apiserver-76f77b778f-gst9d" Mar 09 13:24:35 crc kubenswrapper[4764]: I0309 13:24:35.930147 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/1ffb8d96-e6e4-4859-ae7d-37f900979485-audit-dir\") pod \"apiserver-76f77b778f-gst9d\" (UID: \"1ffb8d96-e6e4-4859-ae7d-37f900979485\") " pod="openshift-apiserver/apiserver-76f77b778f-gst9d" Mar 09 13:24:35 crc kubenswrapper[4764]: I0309 13:24:35.930175 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/1ffb8d96-e6e4-4859-ae7d-37f900979485-node-pullsecrets\") pod \"apiserver-76f77b778f-gst9d\" (UID: \"1ffb8d96-e6e4-4859-ae7d-37f900979485\") " pod="openshift-apiserver/apiserver-76f77b778f-gst9d" Mar 09 13:24:35 crc kubenswrapper[4764]: I0309 13:24:35.930197 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1ffb8d96-e6e4-4859-ae7d-37f900979485-config\") pod \"apiserver-76f77b778f-gst9d\" (UID: \"1ffb8d96-e6e4-4859-ae7d-37f900979485\") " pod="openshift-apiserver/apiserver-76f77b778f-gst9d" Mar 09 13:24:35 crc kubenswrapper[4764]: I0309 13:24:35.930876 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-fxv7j" Mar 09 13:24:35 crc kubenswrapper[4764]: I0309 13:24:35.934515 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-klv4l"] Mar 09 13:24:35 crc kubenswrapper[4764]: I0309 13:24:35.934990 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-klv4l" Mar 09 13:24:35 crc kubenswrapper[4764]: I0309 13:24:35.936911 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-t8ft9"] Mar 09 13:24:35 crc kubenswrapper[4764]: I0309 13:24:35.937843 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-t8ft9" Mar 09 13:24:35 crc kubenswrapper[4764]: I0309 13:24:35.939761 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Mar 09 13:24:35 crc kubenswrapper[4764]: I0309 13:24:35.946436 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-nj856"] Mar 09 13:24:35 crc kubenswrapper[4764]: I0309 13:24:35.962173 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-nj856" Mar 09 13:24:35 crc kubenswrapper[4764]: I0309 13:24:35.965481 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-mvq6r"] Mar 09 13:24:35 crc kubenswrapper[4764]: I0309 13:24:35.966408 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mvq6r" Mar 09 13:24:35 crc kubenswrapper[4764]: I0309 13:24:35.970031 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-sp2mq"] Mar 09 13:24:35 crc kubenswrapper[4764]: I0309 13:24:35.970808 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Mar 09 13:24:35 crc kubenswrapper[4764]: I0309 13:24:35.980782 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Mar 09 13:24:35 crc kubenswrapper[4764]: I0309 13:24:35.982907 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Mar 09 13:24:35 crc kubenswrapper[4764]: I0309 13:24:35.983162 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Mar 09 13:24:35 crc kubenswrapper[4764]: I0309 13:24:35.983439 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Mar 09 13:24:35 crc kubenswrapper[4764]: I0309 13:24:35.983630 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Mar 09 13:24:35 crc kubenswrapper[4764]: I0309 13:24:35.984121 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-sp2mq" Mar 09 13:24:35 crc kubenswrapper[4764]: I0309 13:24:35.984847 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Mar 09 13:24:35 crc kubenswrapper[4764]: I0309 13:24:35.986476 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-84448"] Mar 09 13:24:35 crc kubenswrapper[4764]: I0309 13:24:35.986895 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-8g9lj"] Mar 09 13:24:35 crc kubenswrapper[4764]: I0309 13:24:35.987233 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-c2m6k"] Mar 09 13:24:35 crc kubenswrapper[4764]: I0309 13:24:35.987596 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-c2m6k" Mar 09 13:24:35 crc kubenswrapper[4764]: I0309 13:24:35.988487 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Mar 09 13:24:35 crc kubenswrapper[4764]: I0309 13:24:35.988604 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Mar 09 13:24:35 crc kubenswrapper[4764]: I0309 13:24:35.988730 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Mar 09 13:24:35 crc kubenswrapper[4764]: I0309 13:24:35.988841 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-84448" Mar 09 13:24:35 crc kubenswrapper[4764]: I0309 13:24:35.988852 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Mar 09 13:24:35 crc kubenswrapper[4764]: I0309 13:24:35.989005 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-8g9lj" Mar 09 13:24:35 crc kubenswrapper[4764]: I0309 13:24:35.989171 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Mar 09 13:24:35 crc kubenswrapper[4764]: I0309 13:24:35.989477 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Mar 09 13:24:35 crc kubenswrapper[4764]: I0309 13:24:35.991418 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Mar 09 13:24:35 crc kubenswrapper[4764]: I0309 13:24:35.991507 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-nnrmm"] Mar 09 13:24:35 crc kubenswrapper[4764]: I0309 13:24:35.991542 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 09 13:24:35 crc kubenswrapper[4764]: I0309 13:24:35.992253 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 09 13:24:35 crc kubenswrapper[4764]: I0309 13:24:35.992456 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Mar 09 13:24:35 crc kubenswrapper[4764]: I0309 13:24:35.992821 4764 reflector.go:368] Caches populated for 
*v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 09 13:24:35 crc kubenswrapper[4764]: I0309 13:24:35.992850 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Mar 09 13:24:35 crc kubenswrapper[4764]: I0309 13:24:35.992922 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 09 13:24:35 crc kubenswrapper[4764]: I0309 13:24:35.992969 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Mar 09 13:24:35 crc kubenswrapper[4764]: I0309 13:24:35.993022 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 09 13:24:35 crc kubenswrapper[4764]: I0309 13:24:35.993065 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 09 13:24:35 crc kubenswrapper[4764]: I0309 13:24:35.993149 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Mar 09 13:24:35 crc kubenswrapper[4764]: I0309 13:24:35.993158 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Mar 09 13:24:35 crc kubenswrapper[4764]: I0309 13:24:35.993236 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Mar 09 13:24:35 crc kubenswrapper[4764]: I0309 13:24:35.993258 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Mar 09 13:24:35 crc kubenswrapper[4764]: I0309 13:24:35.996358 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-mp9p7"] Mar 09 13:24:35 
crc kubenswrapper[4764]: I0309 13:24:35.996637 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 09 13:24:35 crc kubenswrapper[4764]: I0309 13:24:35.996799 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Mar 09 13:24:35 crc kubenswrapper[4764]: I0309 13:24:35.996887 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Mar 09 13:24:35 crc kubenswrapper[4764]: I0309 13:24:35.997006 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Mar 09 13:24:35 crc kubenswrapper[4764]: I0309 13:24:35.997108 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-mp9p7" Mar 09 13:24:35 crc kubenswrapper[4764]: I0309 13:24:35.997199 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-nnrmm" Mar 09 13:24:35 crc kubenswrapper[4764]: I0309 13:24:35.998234 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.000286 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-k5dfc"] Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.000998 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-k5dfc" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.003993 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.004295 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.004546 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.007834 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-5g2pp"] Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.008058 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.008274 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.008445 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.008567 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.008684 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.009098 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-5g2pp" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.015131 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-x927s"] Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.015815 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-x927s" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.025895 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.026481 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.028548 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.028889 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.028985 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.029082 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.029146 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.029169 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.029222 4764 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.029259 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.029344 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.029349 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.029426 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.029430 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.029471 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.029503 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.029626 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.029753 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.029768 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.029840 4764 reflector.go:368] Caches populated for *v1.ConfigMap 
from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.029907 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.029949 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.030103 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.030191 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.030289 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.030363 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.030725 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.030985 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ad49b4d8-1218-4a34-8455-831d0f563cbf-client-ca\") pod \"route-controller-manager-6576b87f9c-mvq6r\" (UID: \"ad49b4d8-1218-4a34-8455-831d0f563cbf\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mvq6r" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.031015 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/e025e897-5ff3-476b-81c9-afdd0ae7a25f-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-84448\" (UID: \"e025e897-5ff3-476b-81c9-afdd0ae7a25f\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-84448" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.036016 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e025e897-5ff3-476b-81c9-afdd0ae7a25f-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-84448\" (UID: \"e025e897-5ff3-476b-81c9-afdd0ae7a25f\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-84448" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.036386 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2qxpr\" (UniqueName: \"kubernetes.io/projected/e025e897-5ff3-476b-81c9-afdd0ae7a25f-kube-api-access-2qxpr\") pod \"cluster-image-registry-operator-dc59b4c8b-84448\" (UID: \"e025e897-5ff3-476b-81c9-afdd0ae7a25f\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-84448" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.036422 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/f95ed010-a6a4-49ab-b61b-fc4ee2d856bb-encryption-config\") pod \"apiserver-7bbb656c7d-mp9p7\" (UID: \"f95ed010-a6a4-49ab-b61b-fc4ee2d856bb\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-mp9p7" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.036662 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4951d770-ae8c-470a-982a-807c82112722-service-ca-bundle\") pod \"authentication-operator-69f744f599-ptjnd\" (UID: \"4951d770-ae8c-470a-982a-807c82112722\") 
" pod="openshift-authentication-operator/authentication-operator-69f744f599-ptjnd" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.036714 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9f9zl\" (UniqueName: \"kubernetes.io/projected/a1b4dc0b-edea-4c0d-8d61-3e3d3133605d-kube-api-access-9f9zl\") pod \"console-f9d7485db-8g9lj\" (UID: \"a1b4dc0b-edea-4c0d-8d61-3e3d3133605d\") " pod="openshift-console/console-f9d7485db-8g9lj" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.036861 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/23f87d2b-2a92-4abb-a2a6-2de508837343-serving-cert\") pod \"etcd-operator-b45778765-nnrmm\" (UID: \"23f87d2b-2a92-4abb-a2a6-2de508837343\") " pod="openshift-etcd-operator/etcd-operator-b45778765-nnrmm" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.037001 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b625331d-48ab-4d48-86fd-fe73466305ff-serving-cert\") pod \"controller-manager-879f6c89f-tgqwl\" (UID: \"b625331d-48ab-4d48-86fd-fe73466305ff\") " pod="openshift-controller-manager/controller-manager-879f6c89f-tgqwl" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.037027 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p79zf\" (UniqueName: \"kubernetes.io/projected/b625331d-48ab-4d48-86fd-fe73466305ff-kube-api-access-p79zf\") pod \"controller-manager-879f6c89f-tgqwl\" (UID: \"b625331d-48ab-4d48-86fd-fe73466305ff\") " pod="openshift-controller-manager/controller-manager-879f6c89f-tgqwl" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.037148 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f9b3244b-8df0-4330-9887-4092260d416a-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-nj856\" (UID: \"f9b3244b-8df0-4330-9887-4092260d416a\") " pod="openshift-authentication/oauth-openshift-558db77b4-nj856" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.037301 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b625331d-48ab-4d48-86fd-fe73466305ff-client-ca\") pod \"controller-manager-879f6c89f-tgqwl\" (UID: \"b625331d-48ab-4d48-86fd-fe73466305ff\") " pod="openshift-controller-manager/controller-manager-879f6c89f-tgqwl" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.037329 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/a1b4dc0b-edea-4c0d-8d61-3e3d3133605d-console-oauth-config\") pod \"console-f9d7485db-8g9lj\" (UID: \"a1b4dc0b-edea-4c0d-8d61-3e3d3133605d\") " pod="openshift-console/console-f9d7485db-8g9lj" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.037354 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/f9b3244b-8df0-4330-9887-4092260d416a-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-nj856\" (UID: \"f9b3244b-8df0-4330-9887-4092260d416a\") " pod="openshift-authentication/oauth-openshift-558db77b4-nj856" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.037379 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pgk9n\" (UniqueName: \"kubernetes.io/projected/6dc446a1-b77b-4f15-ae5f-0141bf374cdd-kube-api-access-pgk9n\") pod \"openshift-apiserver-operator-796bbdcf4f-k5dfc\" (UID: 
\"6dc446a1-b77b-4f15-ae5f-0141bf374cdd\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-k5dfc" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.037401 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/b625331d-48ab-4d48-86fd-fe73466305ff-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-tgqwl\" (UID: \"b625331d-48ab-4d48-86fd-fe73466305ff\") " pod="openshift-controller-manager/controller-manager-879f6c89f-tgqwl" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.037420 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ee2ad8bf-7cf9-4bab-9638-b26d9c593188-config\") pod \"console-operator-58897d9998-sp2mq\" (UID: \"ee2ad8bf-7cf9-4bab-9638-b26d9c593188\") " pod="openshift-console-operator/console-operator-58897d9998-sp2mq" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.037439 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/f9b3244b-8df0-4330-9887-4092260d416a-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-nj856\" (UID: \"f9b3244b-8df0-4330-9887-4092260d416a\") " pod="openshift-authentication/oauth-openshift-558db77b4-nj856" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.037457 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j8fzv\" (UniqueName: \"kubernetes.io/projected/23f87d2b-2a92-4abb-a2a6-2de508837343-kube-api-access-j8fzv\") pod \"etcd-operator-b45778765-nnrmm\" (UID: \"23f87d2b-2a92-4abb-a2a6-2de508837343\") " pod="openshift-etcd-operator/etcd-operator-b45778765-nnrmm" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.037569 4764 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/a1b4dc0b-edea-4c0d-8d61-3e3d3133605d-console-config\") pod \"console-f9d7485db-8g9lj\" (UID: \"a1b4dc0b-edea-4c0d-8d61-3e3d3133605d\") " pod="openshift-console/console-f9d7485db-8g9lj" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.037696 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a1b4dc0b-edea-4c0d-8d61-3e3d3133605d-service-ca\") pod \"console-f9d7485db-8g9lj\" (UID: \"a1b4dc0b-edea-4c0d-8d61-3e3d3133605d\") " pod="openshift-console/console-f9d7485db-8g9lj" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.037874 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1ffb8d96-e6e4-4859-ae7d-37f900979485-audit\") pod \"apiserver-76f77b778f-gst9d\" (UID: \"1ffb8d96-e6e4-4859-ae7d-37f900979485\") " pod="openshift-apiserver/apiserver-76f77b778f-gst9d" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.038805 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.037905 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6dc446a1-b77b-4f15-ae5f-0141bf374cdd-config\") pod \"openshift-apiserver-operator-796bbdcf4f-k5dfc\" (UID: \"6dc446a1-b77b-4f15-ae5f-0141bf374cdd\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-k5dfc" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.041073 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ee2ad8bf-7cf9-4bab-9638-b26d9c593188-trusted-ca\") pod 
\"console-operator-58897d9998-sp2mq\" (UID: \"ee2ad8bf-7cf9-4bab-9638-b26d9c593188\") " pod="openshift-console-operator/console-operator-58897d9998-sp2mq" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.041223 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ndfng\" (UniqueName: \"kubernetes.io/projected/61b85db0-a292-42a8-8296-d0e476d80c89-kube-api-access-ndfng\") pod \"openshift-config-operator-7777fb866f-fxv7j\" (UID: \"61b85db0-a292-42a8-8296-d0e476d80c89\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-fxv7j" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.041357 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/f9b3244b-8df0-4330-9887-4092260d416a-audit-policies\") pod \"oauth-openshift-558db77b4-nj856\" (UID: \"f9b3244b-8df0-4330-9887-4092260d416a\") " pod="openshift-authentication/oauth-openshift-558db77b4-nj856" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.041462 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/23f87d2b-2a92-4abb-a2a6-2de508837343-etcd-service-ca\") pod \"etcd-operator-b45778765-nnrmm\" (UID: \"23f87d2b-2a92-4abb-a2a6-2de508837343\") " pod="openshift-etcd-operator/etcd-operator-b45778765-nnrmm" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.041554 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1ffb8d96-e6e4-4859-ae7d-37f900979485-etcd-serving-ca\") pod \"apiserver-76f77b778f-gst9d\" (UID: \"1ffb8d96-e6e4-4859-ae7d-37f900979485\") " pod="openshift-apiserver/apiserver-76f77b778f-gst9d" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.041847 4764 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1ffb8d96-e6e4-4859-ae7d-37f900979485-serving-cert\") pod \"apiserver-76f77b778f-gst9d\" (UID: \"1ffb8d96-e6e4-4859-ae7d-37f900979485\") " pod="openshift-apiserver/apiserver-76f77b778f-gst9d" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.041930 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f95ed010-a6a4-49ab-b61b-fc4ee2d856bb-serving-cert\") pod \"apiserver-7bbb656c7d-mp9p7\" (UID: \"f95ed010-a6a4-49ab-b61b-fc4ee2d856bb\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-mp9p7" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.042279 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1ffb8d96-e6e4-4859-ae7d-37f900979485-etcd-client\") pod \"apiserver-76f77b778f-gst9d\" (UID: \"1ffb8d96-e6e4-4859-ae7d-37f900979485\") " pod="openshift-apiserver/apiserver-76f77b778f-gst9d" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.042374 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a1b4dc0b-edea-4c0d-8d61-3e3d3133605d-trusted-ca-bundle\") pod \"console-f9d7485db-8g9lj\" (UID: \"a1b4dc0b-edea-4c0d-8d61-3e3d3133605d\") " pod="openshift-console/console-f9d7485db-8g9lj" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.042490 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4125448d-5832-43c2-8dba-d95adde7458a-config\") pod \"machine-api-operator-5694c8668f-t8ft9\" (UID: \"4125448d-5832-43c2-8dba-d95adde7458a\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-t8ft9" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.042618 4764 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ad49b4d8-1218-4a34-8455-831d0f563cbf-serving-cert\") pod \"route-controller-manager-6576b87f9c-mvq6r\" (UID: \"ad49b4d8-1218-4a34-8455-831d0f563cbf\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mvq6r" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.042756 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/f9b3244b-8df0-4330-9887-4092260d416a-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-nj856\" (UID: \"f9b3244b-8df0-4330-9887-4092260d416a\") " pod="openshift-authentication/oauth-openshift-558db77b4-nj856" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.043100 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4951d770-ae8c-470a-982a-807c82112722-config\") pod \"authentication-operator-69f744f599-ptjnd\" (UID: \"4951d770-ae8c-470a-982a-807c82112722\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-ptjnd" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.043211 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-48tpb\" (UniqueName: \"kubernetes.io/projected/ad49b4d8-1218-4a34-8455-831d0f563cbf-kube-api-access-48tpb\") pod \"route-controller-manager-6576b87f9c-mvq6r\" (UID: \"ad49b4d8-1218-4a34-8455-831d0f563cbf\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mvq6r" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.043529 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/4125448d-5832-43c2-8dba-d95adde7458a-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-t8ft9\" (UID: \"4125448d-5832-43c2-8dba-d95adde7458a\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-t8ft9" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.043671 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/f95ed010-a6a4-49ab-b61b-fc4ee2d856bb-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-mp9p7\" (UID: \"f95ed010-a6a4-49ab-b61b-fc4ee2d856bb\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-mp9p7" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.043927 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/e025e897-5ff3-476b-81c9-afdd0ae7a25f-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-84448\" (UID: \"e025e897-5ff3-476b-81c9-afdd0ae7a25f\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-84448" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.044035 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9h4rc\" (UniqueName: \"kubernetes.io/projected/1ffb8d96-e6e4-4859-ae7d-37f900979485-kube-api-access-9h4rc\") pod \"apiserver-76f77b778f-gst9d\" (UID: \"1ffb8d96-e6e4-4859-ae7d-37f900979485\") " pod="openshift-apiserver/apiserver-76f77b778f-gst9d" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.044257 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6dc446a1-b77b-4f15-ae5f-0141bf374cdd-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-k5dfc\" (UID: \"6dc446a1-b77b-4f15-ae5f-0141bf374cdd\") " 
pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-k5dfc" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.044521 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/23f87d2b-2a92-4abb-a2a6-2de508837343-etcd-client\") pod \"etcd-operator-b45778765-nnrmm\" (UID: \"23f87d2b-2a92-4abb-a2a6-2de508837343\") " pod="openshift-etcd-operator/etcd-operator-b45778765-nnrmm" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.046906 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1ffb8d96-e6e4-4859-ae7d-37f900979485-audit\") pod \"apiserver-76f77b778f-gst9d\" (UID: \"1ffb8d96-e6e4-4859-ae7d-37f900979485\") " pod="openshift-apiserver/apiserver-76f77b778f-gst9d" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.047764 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1ffb8d96-e6e4-4859-ae7d-37f900979485-etcd-serving-ca\") pod \"apiserver-76f77b778f-gst9d\" (UID: \"1ffb8d96-e6e4-4859-ae7d-37f900979485\") " pod="openshift-apiserver/apiserver-76f77b778f-gst9d" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.049317 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.046749 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.050603 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/f95ed010-a6a4-49ab-b61b-fc4ee2d856bb-etcd-client\") pod \"apiserver-7bbb656c7d-mp9p7\" (UID: 
\"f95ed010-a6a4-49ab-b61b-fc4ee2d856bb\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-mp9p7" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.051264 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cf3d7b2a-75e7-4c07-9211-b66c64c15def-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-klv4l\" (UID: \"cf3d7b2a-75e7-4c07-9211-b66c64c15def\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-klv4l" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.051331 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1ffb8d96-e6e4-4859-ae7d-37f900979485-image-import-ca\") pod \"apiserver-76f77b778f-gst9d\" (UID: \"1ffb8d96-e6e4-4859-ae7d-37f900979485\") " pod="openshift-apiserver/apiserver-76f77b778f-gst9d" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.051394 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ee2ad8bf-7cf9-4bab-9638-b26d9c593188-serving-cert\") pod \"console-operator-58897d9998-sp2mq\" (UID: \"ee2ad8bf-7cf9-4bab-9638-b26d9c593188\") " pod="openshift-console-operator/console-operator-58897d9998-sp2mq" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.051419 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/f9b3244b-8df0-4330-9887-4092260d416a-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-nj856\" (UID: \"f9b3244b-8df0-4330-9887-4092260d416a\") " pod="openshift-authentication/oauth-openshift-558db77b4-nj856" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.051447 4764 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/a1b4dc0b-edea-4c0d-8d61-3e3d3133605d-oauth-serving-cert\") pod \"console-f9d7485db-8g9lj\" (UID: \"a1b4dc0b-edea-4c0d-8d61-3e3d3133605d\") " pod="openshift-console/console-f9d7485db-8g9lj" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.051706 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/23f87d2b-2a92-4abb-a2a6-2de508837343-config\") pod \"etcd-operator-b45778765-nnrmm\" (UID: \"23f87d2b-2a92-4abb-a2a6-2de508837343\") " pod="openshift-etcd-operator/etcd-operator-b45778765-nnrmm" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.051802 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1ffb8d96-e6e4-4859-ae7d-37f900979485-trusted-ca-bundle\") pod \"apiserver-76f77b778f-gst9d\" (UID: \"1ffb8d96-e6e4-4859-ae7d-37f900979485\") " pod="openshift-apiserver/apiserver-76f77b778f-gst9d" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.052176 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k77xq\" (UniqueName: \"kubernetes.io/projected/cf3d7b2a-75e7-4c07-9211-b66c64c15def-kube-api-access-k77xq\") pod \"openshift-controller-manager-operator-756b6f6bc6-klv4l\" (UID: \"cf3d7b2a-75e7-4c07-9211-b66c64c15def\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-klv4l" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.052317 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/f9b3244b-8df0-4330-9887-4092260d416a-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-nj856\" (UID: 
\"f9b3244b-8df0-4330-9887-4092260d416a\") " pod="openshift-authentication/oauth-openshift-558db77b4-nj856" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.052856 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/23f87d2b-2a92-4abb-a2a6-2de508837343-etcd-ca\") pod \"etcd-operator-b45778765-nnrmm\" (UID: \"23f87d2b-2a92-4abb-a2a6-2de508837343\") " pod="openshift-etcd-operator/etcd-operator-b45778765-nnrmm" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.052911 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/a1b4dc0b-edea-4c0d-8d61-3e3d3133605d-console-serving-cert\") pod \"console-f9d7485db-8g9lj\" (UID: \"a1b4dc0b-edea-4c0d-8d61-3e3d3133605d\") " pod="openshift-console/console-f9d7485db-8g9lj" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.052988 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1ffb8d96-e6e4-4859-ae7d-37f900979485-encryption-config\") pod \"apiserver-76f77b778f-gst9d\" (UID: \"1ffb8d96-e6e4-4859-ae7d-37f900979485\") " pod="openshift-apiserver/apiserver-76f77b778f-gst9d" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.053039 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/61b85db0-a292-42a8-8296-d0e476d80c89-available-featuregates\") pod \"openshift-config-operator-7777fb866f-fxv7j\" (UID: \"61b85db0-a292-42a8-8296-d0e476d80c89\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-fxv7j" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.053267 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/ad49b4d8-1218-4a34-8455-831d0f563cbf-config\") pod \"route-controller-manager-6576b87f9c-mvq6r\" (UID: \"ad49b4d8-1218-4a34-8455-831d0f563cbf\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mvq6r" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.053667 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f9b3244b-8df0-4330-9887-4092260d416a-audit-dir\") pod \"oauth-openshift-558db77b4-nj856\" (UID: \"f9b3244b-8df0-4330-9887-4092260d416a\") " pod="openshift-authentication/oauth-openshift-558db77b4-nj856" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.061456 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/f9b3244b-8df0-4330-9887-4092260d416a-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-nj856\" (UID: \"f9b3244b-8df0-4330-9887-4092260d416a\") " pod="openshift-authentication/oauth-openshift-558db77b4-nj856" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.061780 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bgwzj\" (UniqueName: \"kubernetes.io/projected/ee2ad8bf-7cf9-4bab-9638-b26d9c593188-kube-api-access-bgwzj\") pod \"console-operator-58897d9998-sp2mq\" (UID: \"ee2ad8bf-7cf9-4bab-9638-b26d9c593188\") " pod="openshift-console-operator/console-operator-58897d9998-sp2mq" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.061812 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vc9tr\" (UniqueName: \"kubernetes.io/projected/4125448d-5832-43c2-8dba-d95adde7458a-kube-api-access-vc9tr\") pod \"machine-api-operator-5694c8668f-t8ft9\" (UID: \"4125448d-5832-43c2-8dba-d95adde7458a\") " 
pod="openshift-machine-api/machine-api-operator-5694c8668f-t8ft9" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.061837 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4951d770-ae8c-470a-982a-807c82112722-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-ptjnd\" (UID: \"4951d770-ae8c-470a-982a-807c82112722\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-ptjnd" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.061855 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/61b85db0-a292-42a8-8296-d0e476d80c89-serving-cert\") pod \"openshift-config-operator-7777fb866f-fxv7j\" (UID: \"61b85db0-a292-42a8-8296-d0e476d80c89\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-fxv7j" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.061871 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/f9b3244b-8df0-4330-9887-4092260d416a-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-nj856\" (UID: \"f9b3244b-8df0-4330-9887-4092260d416a\") " pod="openshift-authentication/oauth-openshift-558db77b4-nj856" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.061888 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/f9b3244b-8df0-4330-9887-4092260d416a-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-nj856\" (UID: \"f9b3244b-8df0-4330-9887-4092260d416a\") " pod="openshift-authentication/oauth-openshift-558db77b4-nj856" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.061978 4764 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4951d770-ae8c-470a-982a-807c82112722-serving-cert\") pod \"authentication-operator-69f744f599-ptjnd\" (UID: \"4951d770-ae8c-470a-982a-807c82112722\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-ptjnd" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.062003 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cf3d7b2a-75e7-4c07-9211-b66c64c15def-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-klv4l\" (UID: \"cf3d7b2a-75e7-4c07-9211-b66c64c15def\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-klv4l" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.062019 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/f9b3244b-8df0-4330-9887-4092260d416a-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-nj856\" (UID: \"f9b3244b-8df0-4330-9887-4092260d416a\") " pod="openshift-authentication/oauth-openshift-558db77b4-nj856" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.062050 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tdsds\" (UniqueName: \"kubernetes.io/projected/f9b3244b-8df0-4330-9887-4092260d416a-kube-api-access-tdsds\") pod \"oauth-openshift-558db77b4-nj856\" (UID: \"f9b3244b-8df0-4330-9887-4092260d416a\") " pod="openshift-authentication/oauth-openshift-558db77b4-nj856" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.062104 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/1ffb8d96-e6e4-4859-ae7d-37f900979485-audit-dir\") pod 
\"apiserver-76f77b778f-gst9d\" (UID: \"1ffb8d96-e6e4-4859-ae7d-37f900979485\") " pod="openshift-apiserver/apiserver-76f77b778f-gst9d" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.062123 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/f9b3244b-8df0-4330-9887-4092260d416a-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-nj856\" (UID: \"f9b3244b-8df0-4330-9887-4092260d416a\") " pod="openshift-authentication/oauth-openshift-558db77b4-nj856" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.062142 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8x6ch\" (UniqueName: \"kubernetes.io/projected/baee6113-40a8-468e-b343-09e9afd65ce3-kube-api-access-8x6ch\") pod \"dns-operator-744455d44c-c2m6k\" (UID: \"baee6113-40a8-468e-b343-09e9afd65ce3\") " pod="openshift-dns-operator/dns-operator-744455d44c-c2m6k" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.062165 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b625331d-48ab-4d48-86fd-fe73466305ff-config\") pod \"controller-manager-879f6c89f-tgqwl\" (UID: \"b625331d-48ab-4d48-86fd-fe73466305ff\") " pod="openshift-controller-manager/controller-manager-879f6c89f-tgqwl" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.062180 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f95ed010-a6a4-49ab-b61b-fc4ee2d856bb-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-mp9p7\" (UID: \"f95ed010-a6a4-49ab-b61b-fc4ee2d856bb\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-mp9p7" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.062197 4764 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vn7dv\" (UniqueName: \"kubernetes.io/projected/f95ed010-a6a4-49ab-b61b-fc4ee2d856bb-kube-api-access-vn7dv\") pod \"apiserver-7bbb656c7d-mp9p7\" (UID: \"f95ed010-a6a4-49ab-b61b-fc4ee2d856bb\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-mp9p7" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.062214 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/f95ed010-a6a4-49ab-b61b-fc4ee2d856bb-audit-policies\") pod \"apiserver-7bbb656c7d-mp9p7\" (UID: \"f95ed010-a6a4-49ab-b61b-fc4ee2d856bb\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-mp9p7" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.062229 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f95ed010-a6a4-49ab-b61b-fc4ee2d856bb-audit-dir\") pod \"apiserver-7bbb656c7d-mp9p7\" (UID: \"f95ed010-a6a4-49ab-b61b-fc4ee2d856bb\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-mp9p7" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.062246 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lrcvk\" (UniqueName: \"kubernetes.io/projected/4951d770-ae8c-470a-982a-807c82112722-kube-api-access-lrcvk\") pod \"authentication-operator-69f744f599-ptjnd\" (UID: \"4951d770-ae8c-470a-982a-807c82112722\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-ptjnd" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.062265 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/baee6113-40a8-468e-b343-09e9afd65ce3-metrics-tls\") pod \"dns-operator-744455d44c-c2m6k\" (UID: \"baee6113-40a8-468e-b343-09e9afd65ce3\") " 
pod="openshift-dns-operator/dns-operator-744455d44c-c2m6k" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.062287 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/1ffb8d96-e6e4-4859-ae7d-37f900979485-node-pullsecrets\") pod \"apiserver-76f77b778f-gst9d\" (UID: \"1ffb8d96-e6e4-4859-ae7d-37f900979485\") " pod="openshift-apiserver/apiserver-76f77b778f-gst9d" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.062304 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1ffb8d96-e6e4-4859-ae7d-37f900979485-config\") pod \"apiserver-76f77b778f-gst9d\" (UID: \"1ffb8d96-e6e4-4859-ae7d-37f900979485\") " pod="openshift-apiserver/apiserver-76f77b778f-gst9d" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.062320 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/4125448d-5832-43c2-8dba-d95adde7458a-images\") pod \"machine-api-operator-5694c8668f-t8ft9\" (UID: \"4125448d-5832-43c2-8dba-d95adde7458a\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-t8ft9" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.062440 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/1ffb8d96-e6e4-4859-ae7d-37f900979485-audit-dir\") pod \"apiserver-76f77b778f-gst9d\" (UID: \"1ffb8d96-e6e4-4859-ae7d-37f900979485\") " pod="openshift-apiserver/apiserver-76f77b778f-gst9d" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.062519 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/1ffb8d96-e6e4-4859-ae7d-37f900979485-node-pullsecrets\") pod \"apiserver-76f77b778f-gst9d\" (UID: \"1ffb8d96-e6e4-4859-ae7d-37f900979485\") " 
pod="openshift-apiserver/apiserver-76f77b778f-gst9d" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.063015 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1ffb8d96-e6e4-4859-ae7d-37f900979485-config\") pod \"apiserver-76f77b778f-gst9d\" (UID: \"1ffb8d96-e6e4-4859-ae7d-37f900979485\") " pod="openshift-apiserver/apiserver-76f77b778f-gst9d" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.065347 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1ffb8d96-e6e4-4859-ae7d-37f900979485-serving-cert\") pod \"apiserver-76f77b778f-gst9d\" (UID: \"1ffb8d96-e6e4-4859-ae7d-37f900979485\") " pod="openshift-apiserver/apiserver-76f77b778f-gst9d" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.065420 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-cvsb9"] Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.066036 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-w49hn"] Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.066303 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-6brcn"] Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.066603 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-6brcn" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.066694 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1ffb8d96-e6e4-4859-ae7d-37f900979485-image-import-ca\") pod \"apiserver-76f77b778f-gst9d\" (UID: \"1ffb8d96-e6e4-4859-ae7d-37f900979485\") " pod="openshift-apiserver/apiserver-76f77b778f-gst9d" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.066902 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-w49hn" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.067340 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-cvsb9" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.067524 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.067832 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1ffb8d96-e6e4-4859-ae7d-37f900979485-trusted-ca-bundle\") pod \"apiserver-76f77b778f-gst9d\" (UID: \"1ffb8d96-e6e4-4859-ae7d-37f900979485\") " pod="openshift-apiserver/apiserver-76f77b778f-gst9d" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.067865 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.067978 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.068107 4764 reflector.go:368] Caches populated 
for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.068172 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.068215 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.068612 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.069223 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1ffb8d96-e6e4-4859-ae7d-37f900979485-etcd-client\") pod \"apiserver-76f77b778f-gst9d\" (UID: \"1ffb8d96-e6e4-4859-ae7d-37f900979485\") " pod="openshift-apiserver/apiserver-76f77b778f-gst9d" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.070372 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.071125 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.074269 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.074438 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29551035-5fpgn"] Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.075047 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.075782 4764 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-tvsss"] Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.076799 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29551035-5fpgn" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.077378 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.077537 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.079831 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-2pfhk"] Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.080346 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-qddjs"] Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.080523 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-2pfhk" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.080371 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-tvsss" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.080949 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-b2pz8"] Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.081000 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-qddjs" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.081469 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-hf98d"] Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.081603 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-b2pz8" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.082194 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-hf98d" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.082518 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-kt67m"] Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.082707 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1ffb8d96-e6e4-4859-ae7d-37f900979485-encryption-config\") pod \"apiserver-76f77b778f-gst9d\" (UID: \"1ffb8d96-e6e4-4859-ae7d-37f900979485\") " pod="openshift-apiserver/apiserver-76f77b778f-gst9d" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.083062 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-kt67m" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.084742 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-svr2n"] Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.085425 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-svr2n" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.085862 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-gst9d"] Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.087138 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-b6nrf"] Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.087842 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-b6nrf" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.088526 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-9k28f"] Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.089526 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-9k28f" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.089825 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-cd5zc"] Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.090616 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-cd5zc" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.090853 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-xfvtf"] Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.091567 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-xfvtf" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.091681 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-gnnbl"] Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.092674 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-gnnbl" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.092688 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-ptbpd"] Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.093136 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-ptbpd" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.093599 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-d4gwh"] Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.094044 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-d4gwh" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.094527 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-mn5w6"] Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.095063 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-mn5w6" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.095634 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-m4bs6"] Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.096098 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-m4bs6" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.096629 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.096910 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29551044-p748f"] Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.097400 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551044-p748f" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.098559 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-tgqwl"] Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.099464 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-ptjnd"] Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.101679 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-m9kmt"] Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.102115 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-fxv7j"] Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.102176 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-m9kmt" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.102687 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-klv4l"] Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.106051 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-5g2pp"] Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.106698 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-k5dfc"] Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.109554 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-t8ft9"] Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.117806 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.119377 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-c2m6k"] Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.122946 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-k96kg"] Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.126006 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-k96kg" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.128269 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-nnrmm"] Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.131535 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-hf98d"] Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.133491 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-8g9lj"] Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.136146 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.136603 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-qddjs"] Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.138260 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-mvq6r"] Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.139202 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-2pfhk"] Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.140511 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-84448"] Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.141610 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-nj856"] Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.142931 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29551035-5fpgn"] Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 
13:24:36.144072 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-ptbpd"] Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.145554 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-sp2mq"] Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.146947 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-b2pz8"] Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.148254 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-mn5w6"] Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.149419 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-m4bs6"] Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.150506 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-x927s"] Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.151592 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-w49hn"] Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.152974 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-xfvtf"] Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.153985 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-b6nrf"] Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.155045 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-9k28f"] Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.156198 4764 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-mp9p7"] Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.156271 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.157267 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-k96kg"] Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.158354 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551044-p748f"] Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.159452 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-tvsss"] Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.160904 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-d4gwh"] Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.162226 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-cd5zc"] Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.163232 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/61b85db0-a292-42a8-8296-d0e476d80c89-serving-cert\") pod \"openshift-config-operator-7777fb866f-fxv7j\" (UID: \"61b85db0-a292-42a8-8296-d0e476d80c89\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-fxv7j" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.163270 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/f9b3244b-8df0-4330-9887-4092260d416a-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-nj856\" (UID: \"f9b3244b-8df0-4330-9887-4092260d416a\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-nj856" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.163293 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/f9b3244b-8df0-4330-9887-4092260d416a-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-nj856\" (UID: \"f9b3244b-8df0-4330-9887-4092260d416a\") " pod="openshift-authentication/oauth-openshift-558db77b4-nj856" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.163322 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mgbg9\" (UniqueName: \"kubernetes.io/projected/b72bd4db-e5ea-44f6-bdce-81df2966acfb-kube-api-access-mgbg9\") pod \"catalog-operator-68c6474976-m9kmt\" (UID: \"b72bd4db-e5ea-44f6-bdce-81df2966acfb\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-m9kmt" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.163377 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/f9b3244b-8df0-4330-9887-4092260d416a-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-nj856\" (UID: \"f9b3244b-8df0-4330-9887-4092260d416a\") " pod="openshift-authentication/oauth-openshift-558db77b4-nj856" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.163411 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tdsds\" (UniqueName: \"kubernetes.io/projected/f9b3244b-8df0-4330-9887-4092260d416a-kube-api-access-tdsds\") pod \"oauth-openshift-558db77b4-nj856\" (UID: \"f9b3244b-8df0-4330-9887-4092260d416a\") " pod="openshift-authentication/oauth-openshift-558db77b4-nj856" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.163421 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-kt67m"] Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.163445 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/f9b3244b-8df0-4330-9887-4092260d416a-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-nj856\" (UID: \"f9b3244b-8df0-4330-9887-4092260d416a\") " pod="openshift-authentication/oauth-openshift-558db77b4-nj856" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.163549 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8x6ch\" (UniqueName: \"kubernetes.io/projected/baee6113-40a8-468e-b343-09e9afd65ce3-kube-api-access-8x6ch\") pod \"dns-operator-744455d44c-c2m6k\" (UID: \"baee6113-40a8-468e-b343-09e9afd65ce3\") " pod="openshift-dns-operator/dns-operator-744455d44c-c2m6k" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.163579 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b625331d-48ab-4d48-86fd-fe73466305ff-config\") pod \"controller-manager-879f6c89f-tgqwl\" (UID: \"b625331d-48ab-4d48-86fd-fe73466305ff\") " pod="openshift-controller-manager/controller-manager-879f6c89f-tgqwl" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.163606 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8d3de7d9-c3ef-4ead-8ff7-81d34bf2e497-serving-cert\") pod \"service-ca-operator-777779d784-svr2n\" (UID: \"8d3de7d9-c3ef-4ead-8ff7-81d34bf2e497\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-svr2n" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.163633 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: 
\"kubernetes.io/host-path/f95ed010-a6a4-49ab-b61b-fc4ee2d856bb-audit-dir\") pod \"apiserver-7bbb656c7d-mp9p7\" (UID: \"f95ed010-a6a4-49ab-b61b-fc4ee2d856bb\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-mp9p7" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.163668 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lrcvk\" (UniqueName: \"kubernetes.io/projected/4951d770-ae8c-470a-982a-807c82112722-kube-api-access-lrcvk\") pod \"authentication-operator-69f744f599-ptjnd\" (UID: \"4951d770-ae8c-470a-982a-807c82112722\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-ptjnd" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.163701 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/baee6113-40a8-468e-b343-09e9afd65ce3-metrics-tls\") pod \"dns-operator-744455d44c-c2m6k\" (UID: \"baee6113-40a8-468e-b343-09e9afd65ce3\") " pod="openshift-dns-operator/dns-operator-744455d44c-c2m6k" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.163727 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/f95ed010-a6a4-49ab-b61b-fc4ee2d856bb-encryption-config\") pod \"apiserver-7bbb656c7d-mp9p7\" (UID: \"f95ed010-a6a4-49ab-b61b-fc4ee2d856bb\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-mp9p7" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.163745 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ad49b4d8-1218-4a34-8455-831d0f563cbf-client-ca\") pod \"route-controller-manager-6576b87f9c-mvq6r\" (UID: \"ad49b4d8-1218-4a34-8455-831d0f563cbf\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mvq6r" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.163767 4764 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/b9c0d96b-ed96-4925-b890-8743879a8b38-default-certificate\") pod \"router-default-5444994796-gnnbl\" (UID: \"b9c0d96b-ed96-4925-b890-8743879a8b38\") " pod="openshift-ingress/router-default-5444994796-gnnbl" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.163806 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/23f87d2b-2a92-4abb-a2a6-2de508837343-serving-cert\") pod \"etcd-operator-b45778765-nnrmm\" (UID: \"23f87d2b-2a92-4abb-a2a6-2de508837343\") " pod="openshift-etcd-operator/etcd-operator-b45778765-nnrmm" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.163831 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p79zf\" (UniqueName: \"kubernetes.io/projected/b625331d-48ab-4d48-86fd-fe73466305ff-kube-api-access-p79zf\") pod \"controller-manager-879f6c89f-tgqwl\" (UID: \"b625331d-48ab-4d48-86fd-fe73466305ff\") " pod="openshift-controller-manager/controller-manager-879f6c89f-tgqwl" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.163852 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f9b3244b-8df0-4330-9887-4092260d416a-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-nj856\" (UID: \"f9b3244b-8df0-4330-9887-4092260d416a\") " pod="openshift-authentication/oauth-openshift-558db77b4-nj856" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.163871 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b9c0d96b-ed96-4925-b890-8743879a8b38-service-ca-bundle\") pod \"router-default-5444994796-gnnbl\" (UID: \"b9c0d96b-ed96-4925-b890-8743879a8b38\") " 
pod="openshift-ingress/router-default-5444994796-gnnbl" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.163890 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8d3de7d9-c3ef-4ead-8ff7-81d34bf2e497-config\") pod \"service-ca-operator-777779d784-svr2n\" (UID: \"8d3de7d9-c3ef-4ead-8ff7-81d34bf2e497\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-svr2n" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.163907 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b625331d-48ab-4d48-86fd-fe73466305ff-client-ca\") pod \"controller-manager-879f6c89f-tgqwl\" (UID: \"b625331d-48ab-4d48-86fd-fe73466305ff\") " pod="openshift-controller-manager/controller-manager-879f6c89f-tgqwl" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.163927 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b72bd4db-e5ea-44f6-bdce-81df2966acfb-srv-cert\") pod \"catalog-operator-68c6474976-m9kmt\" (UID: \"b72bd4db-e5ea-44f6-bdce-81df2966acfb\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-m9kmt" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.163951 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pgk9n\" (UniqueName: \"kubernetes.io/projected/6dc446a1-b77b-4f15-ae5f-0141bf374cdd-kube-api-access-pgk9n\") pod \"openshift-apiserver-operator-796bbdcf4f-k5dfc\" (UID: \"6dc446a1-b77b-4f15-ae5f-0141bf374cdd\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-k5dfc" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.163973 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: 
\"kubernetes.io/secret/f9b3244b-8df0-4330-9887-4092260d416a-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-nj856\" (UID: \"f9b3244b-8df0-4330-9887-4092260d416a\") " pod="openshift-authentication/oauth-openshift-558db77b4-nj856" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.163996 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/f9b3244b-8df0-4330-9887-4092260d416a-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-nj856\" (UID: \"f9b3244b-8df0-4330-9887-4092260d416a\") " pod="openshift-authentication/oauth-openshift-558db77b4-nj856" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.164015 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j8fzv\" (UniqueName: \"kubernetes.io/projected/23f87d2b-2a92-4abb-a2a6-2de508837343-kube-api-access-j8fzv\") pod \"etcd-operator-b45778765-nnrmm\" (UID: \"23f87d2b-2a92-4abb-a2a6-2de508837343\") " pod="openshift-etcd-operator/etcd-operator-b45778765-nnrmm" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.164047 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/a1b4dc0b-edea-4c0d-8d61-3e3d3133605d-console-config\") pod \"console-f9d7485db-8g9lj\" (UID: \"a1b4dc0b-edea-4c0d-8d61-3e3d3133605d\") " pod="openshift-console/console-f9d7485db-8g9lj" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.164066 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f5qdx\" (UniqueName: \"kubernetes.io/projected/db0ec273-54d8-4753-b519-243b727a9efd-kube-api-access-f5qdx\") pod \"package-server-manager-789f6589d5-b2pz8\" (UID: \"db0ec273-54d8-4753-b519-243b727a9efd\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-b2pz8" Mar 09 13:24:36 
crc kubenswrapper[4764]: I0309 13:24:36.164090 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6dc446a1-b77b-4f15-ae5f-0141bf374cdd-config\") pod \"openshift-apiserver-operator-796bbdcf4f-k5dfc\" (UID: \"6dc446a1-b77b-4f15-ae5f-0141bf374cdd\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-k5dfc" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.164110 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/f9b3244b-8df0-4330-9887-4092260d416a-audit-policies\") pod \"oauth-openshift-558db77b4-nj856\" (UID: \"f9b3244b-8df0-4330-9887-4092260d416a\") " pod="openshift-authentication/oauth-openshift-558db77b4-nj856" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.164127 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/23f87d2b-2a92-4abb-a2a6-2de508837343-etcd-service-ca\") pod \"etcd-operator-b45778765-nnrmm\" (UID: \"23f87d2b-2a92-4abb-a2a6-2de508837343\") " pod="openshift-etcd-operator/etcd-operator-b45778765-nnrmm" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.164144 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4912a02a-743d-4bbe-9063-7d99ccd3329a-proxy-tls\") pod \"machine-config-operator-74547568cd-2pfhk\" (UID: \"4912a02a-743d-4bbe-9063-7d99ccd3329a\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-2pfhk" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.164162 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/39da5087-79bc-4154-b340-22183d9e4417-secret-volume\") pod \"collect-profiles-29551035-5fpgn\" (UID: 
\"39da5087-79bc-4154-b340-22183d9e4417\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551035-5fpgn" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.164182 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f95ed010-a6a4-49ab-b61b-fc4ee2d856bb-serving-cert\") pod \"apiserver-7bbb656c7d-mp9p7\" (UID: \"f95ed010-a6a4-49ab-b61b-fc4ee2d856bb\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-mp9p7" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.164201 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g47h8\" (UniqueName: \"kubernetes.io/projected/0a005f65-920a-4cdd-b4da-a270953113aa-kube-api-access-g47h8\") pod \"auto-csr-approver-29551044-p748f\" (UID: \"0a005f65-920a-4cdd-b4da-a270953113aa\") " pod="openshift-infra/auto-csr-approver-29551044-p748f" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.164222 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4125448d-5832-43c2-8dba-d95adde7458a-config\") pod \"machine-api-operator-5694c8668f-t8ft9\" (UID: \"4125448d-5832-43c2-8dba-d95adde7458a\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-t8ft9" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.164241 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/f9b3244b-8df0-4330-9887-4092260d416a-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-nj856\" (UID: \"f9b3244b-8df0-4330-9887-4092260d416a\") " pod="openshift-authentication/oauth-openshift-558db77b4-nj856" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.164298 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/b9c0d96b-ed96-4925-b890-8743879a8b38-metrics-certs\") pod \"router-default-5444994796-gnnbl\" (UID: \"b9c0d96b-ed96-4925-b890-8743879a8b38\") " pod="openshift-ingress/router-default-5444994796-gnnbl" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.164317 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/f9b3244b-8df0-4330-9887-4092260d416a-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-nj856\" (UID: \"f9b3244b-8df0-4330-9887-4092260d416a\") " pod="openshift-authentication/oauth-openshift-558db77b4-nj856" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.164320 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/4125448d-5832-43c2-8dba-d95adde7458a-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-t8ft9\" (UID: \"4125448d-5832-43c2-8dba-d95adde7458a\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-t8ft9" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.164372 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/f9b3244b-8df0-4330-9887-4092260d416a-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-nj856\" (UID: \"f9b3244b-8df0-4330-9887-4092260d416a\") " pod="openshift-authentication/oauth-openshift-558db77b4-nj856" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.164381 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/14b4aa8c-1066-4388-9442-07722e4c76c2-apiservice-cert\") pod \"packageserver-d55dfcdfc-m4bs6\" (UID: \"14b4aa8c-1066-4388-9442-07722e4c76c2\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-m4bs6" Mar 09 13:24:36 crc kubenswrapper[4764]: 
I0309 13:24:36.164426 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6dc446a1-b77b-4f15-ae5f-0141bf374cdd-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-k5dfc\" (UID: \"6dc446a1-b77b-4f15-ae5f-0141bf374cdd\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-k5dfc" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.164507 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/23f87d2b-2a92-4abb-a2a6-2de508837343-etcd-client\") pod \"etcd-operator-b45778765-nnrmm\" (UID: \"23f87d2b-2a92-4abb-a2a6-2de508837343\") " pod="openshift-etcd-operator/etcd-operator-b45778765-nnrmm" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.164880 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-6brcn"] Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.165915 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6dc446a1-b77b-4f15-ae5f-0141bf374cdd-config\") pod \"openshift-apiserver-operator-796bbdcf4f-k5dfc\" (UID: \"6dc446a1-b77b-4f15-ae5f-0141bf374cdd\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-k5dfc" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.165929 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b625331d-48ab-4d48-86fd-fe73466305ff-config\") pod \"controller-manager-879f6c89f-tgqwl\" (UID: \"b625331d-48ab-4d48-86fd-fe73466305ff\") " pod="openshift-controller-manager/controller-manager-879f6c89f-tgqwl" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.166357 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f9b3244b-8df0-4330-9887-4092260d416a-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-nj856\" (UID: \"f9b3244b-8df0-4330-9887-4092260d416a\") " pod="openshift-authentication/oauth-openshift-558db77b4-nj856" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.167014 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/a1b4dc0b-edea-4c0d-8d61-3e3d3133605d-console-config\") pod \"console-f9d7485db-8g9lj\" (UID: \"a1b4dc0b-edea-4c0d-8d61-3e3d3133605d\") " pod="openshift-console/console-f9d7485db-8g9lj" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.167082 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-svr2n"] Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.167483 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b625331d-48ab-4d48-86fd-fe73466305ff-client-ca\") pod \"controller-manager-879f6c89f-tgqwl\" (UID: \"b625331d-48ab-4d48-86fd-fe73466305ff\") " pod="openshift-controller-manager/controller-manager-879f6c89f-tgqwl" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.167630 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6dc446a1-b77b-4f15-ae5f-0141bf374cdd-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-k5dfc\" (UID: \"6dc446a1-b77b-4f15-ae5f-0141bf374cdd\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-k5dfc" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.167695 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4125448d-5832-43c2-8dba-d95adde7458a-config\") pod \"machine-api-operator-5694c8668f-t8ft9\" (UID: 
\"4125448d-5832-43c2-8dba-d95adde7458a\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-t8ft9" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.167920 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/4125448d-5832-43c2-8dba-d95adde7458a-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-t8ft9\" (UID: \"4125448d-5832-43c2-8dba-d95adde7458a\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-t8ft9" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.168078 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f95ed010-a6a4-49ab-b61b-fc4ee2d856bb-audit-dir\") pod \"apiserver-7bbb656c7d-mp9p7\" (UID: \"f95ed010-a6a4-49ab-b61b-fc4ee2d856bb\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-mp9p7" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.168248 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/f9b3244b-8df0-4330-9887-4092260d416a-audit-policies\") pod \"oauth-openshift-558db77b4-nj856\" (UID: \"f9b3244b-8df0-4330-9887-4092260d416a\") " pod="openshift-authentication/oauth-openshift-558db77b4-nj856" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.168298 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/4ba55602-0e3f-4722-b437-546732351bc4-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-9k28f\" (UID: \"4ba55602-0e3f-4722-b437-546732351bc4\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-9k28f" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.168333 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/f9b3244b-8df0-4330-9887-4092260d416a-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-nj856\" (UID: \"f9b3244b-8df0-4330-9887-4092260d416a\") " pod="openshift-authentication/oauth-openshift-558db77b4-nj856" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.168345 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cf3d7b2a-75e7-4c07-9211-b66c64c15def-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-klv4l\" (UID: \"cf3d7b2a-75e7-4c07-9211-b66c64c15def\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-klv4l" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.168357 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/23f87d2b-2a92-4abb-a2a6-2de508837343-etcd-service-ca\") pod \"etcd-operator-b45778765-nnrmm\" (UID: \"23f87d2b-2a92-4abb-a2a6-2de508837343\") " pod="openshift-etcd-operator/etcd-operator-b45778765-nnrmm" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.168359 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f95ed010-a6a4-49ab-b61b-fc4ee2d856bb-serving-cert\") pod \"apiserver-7bbb656c7d-mp9p7\" (UID: \"f95ed010-a6a4-49ab-b61b-fc4ee2d856bb\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-mp9p7" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.168400 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-2bqx9"] Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.168695 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/f9b3244b-8df0-4330-9887-4092260d416a-v4-0-config-system-session\") pod 
\"oauth-openshift-558db77b4-nj856\" (UID: \"f9b3244b-8df0-4330-9887-4092260d416a\") " pod="openshift-authentication/oauth-openshift-558db77b4-nj856" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.168399 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ee2ad8bf-7cf9-4bab-9638-b26d9c593188-serving-cert\") pod \"console-operator-58897d9998-sp2mq\" (UID: \"ee2ad8bf-7cf9-4bab-9638-b26d9c593188\") " pod="openshift-console-operator/console-operator-58897d9998-sp2mq" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.168960 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/14b4aa8c-1066-4388-9442-07722e4c76c2-tmpfs\") pod \"packageserver-d55dfcdfc-m4bs6\" (UID: \"14b4aa8c-1066-4388-9442-07722e4c76c2\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-m4bs6" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.169040 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/a1b4dc0b-edea-4c0d-8d61-3e3d3133605d-oauth-serving-cert\") pod \"console-f9d7485db-8g9lj\" (UID: \"a1b4dc0b-edea-4c0d-8d61-3e3d3133605d\") " pod="openshift-console/console-f9d7485db-8g9lj" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.169067 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cf3d7b2a-75e7-4c07-9211-b66c64c15def-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-klv4l\" (UID: \"cf3d7b2a-75e7-4c07-9211-b66c64c15def\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-klv4l" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.169114 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/db0ec273-54d8-4753-b519-243b727a9efd-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-b2pz8\" (UID: \"db0ec273-54d8-4753-b519-243b727a9efd\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-b2pz8" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.169206 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/30a07c97-9d99-41be-956e-ba3d6505d318-srv-cert\") pod \"olm-operator-6b444d44fb-ptbpd\" (UID: \"30a07c97-9d99-41be-956e-ba3d6505d318\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-ptbpd" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.169257 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jpcrn\" (UniqueName: \"kubernetes.io/projected/4912a02a-743d-4bbe-9063-7d99ccd3329a-kube-api-access-jpcrn\") pod \"machine-config-operator-74547568cd-2pfhk\" (UID: \"4912a02a-743d-4bbe-9063-7d99ccd3329a\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-2pfhk" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.169307 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k77xq\" (UniqueName: \"kubernetes.io/projected/cf3d7b2a-75e7-4c07-9211-b66c64c15def-kube-api-access-k77xq\") pod \"openshift-controller-manager-operator-756b6f6bc6-klv4l\" (UID: \"cf3d7b2a-75e7-4c07-9211-b66c64c15def\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-klv4l" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.169333 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/23f87d2b-2a92-4abb-a2a6-2de508837343-config\") pod \"etcd-operator-b45778765-nnrmm\" (UID: 
\"23f87d2b-2a92-4abb-a2a6-2de508837343\") " pod="openshift-etcd-operator/etcd-operator-b45778765-nnrmm" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.169355 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/23f87d2b-2a92-4abb-a2a6-2de508837343-etcd-ca\") pod \"etcd-operator-b45778765-nnrmm\" (UID: \"23f87d2b-2a92-4abb-a2a6-2de508837343\") " pod="openshift-etcd-operator/etcd-operator-b45778765-nnrmm" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.169376 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/61b85db0-a292-42a8-8296-d0e476d80c89-available-featuregates\") pod \"openshift-config-operator-7777fb866f-fxv7j\" (UID: \"61b85db0-a292-42a8-8296-d0e476d80c89\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-fxv7j" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.169400 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ad49b4d8-1218-4a34-8455-831d0f563cbf-config\") pod \"route-controller-manager-6576b87f9c-mvq6r\" (UID: \"ad49b4d8-1218-4a34-8455-831d0f563cbf\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mvq6r" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.169422 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f9b3244b-8df0-4330-9887-4092260d416a-audit-dir\") pod \"oauth-openshift-558db77b4-nj856\" (UID: \"f9b3244b-8df0-4330-9887-4092260d416a\") " pod="openshift-authentication/oauth-openshift-558db77b4-nj856" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.169447 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: 
\"kubernetes.io/secret/f9b3244b-8df0-4330-9887-4092260d416a-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-nj856\" (UID: \"f9b3244b-8df0-4330-9887-4092260d416a\") " pod="openshift-authentication/oauth-openshift-558db77b4-nj856" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.169474 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fdqjn\" (UniqueName: \"kubernetes.io/projected/4ba55602-0e3f-4722-b437-546732351bc4-kube-api-access-fdqjn\") pod \"control-plane-machine-set-operator-78cbb6b69f-9k28f\" (UID: \"4ba55602-0e3f-4722-b437-546732351bc4\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-9k28f" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.169501 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vc9tr\" (UniqueName: \"kubernetes.io/projected/4125448d-5832-43c2-8dba-d95adde7458a-kube-api-access-vc9tr\") pod \"machine-api-operator-5694c8668f-t8ft9\" (UID: \"4125448d-5832-43c2-8dba-d95adde7458a\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-t8ft9" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.169525 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/711447e6-e7cf-4577-8050-b5a391f96f6a-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-xfvtf\" (UID: \"711447e6-e7cf-4577-8050-b5a391f96f6a\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-xfvtf" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.169541 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/baee6113-40a8-468e-b343-09e9afd65ce3-metrics-tls\") pod \"dns-operator-744455d44c-c2m6k\" (UID: \"baee6113-40a8-468e-b343-09e9afd65ce3\") " 
pod="openshift-dns-operator/dns-operator-744455d44c-c2m6k" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.169549 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/4912a02a-743d-4bbe-9063-7d99ccd3329a-images\") pod \"machine-config-operator-74547568cd-2pfhk\" (UID: \"4912a02a-743d-4bbe-9063-7d99ccd3329a\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-2pfhk" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.169577 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4951d770-ae8c-470a-982a-807c82112722-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-ptjnd\" (UID: \"4951d770-ae8c-470a-982a-807c82112722\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-ptjnd" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.169607 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b2l2r\" (UniqueName: \"kubernetes.io/projected/ad307984-e46b-466b-8a5c-63a00976fbbf-kube-api-access-b2l2r\") pod \"cluster-samples-operator-665b6dd947-5g2pp\" (UID: \"ad307984-e46b-466b-8a5c-63a00976fbbf\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-5g2pp" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.169635 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/4912a02a-743d-4bbe-9063-7d99ccd3329a-auth-proxy-config\") pod \"machine-config-operator-74547568cd-2pfhk\" (UID: \"4912a02a-743d-4bbe-9063-7d99ccd3329a\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-2pfhk" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.169698 4764 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4951d770-ae8c-470a-982a-807c82112722-serving-cert\") pod \"authentication-operator-69f744f599-ptjnd\" (UID: \"4951d770-ae8c-470a-982a-807c82112722\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-ptjnd" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.169728 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cf3d7b2a-75e7-4c07-9211-b66c64c15def-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-klv4l\" (UID: \"cf3d7b2a-75e7-4c07-9211-b66c64c15def\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-klv4l" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.169758 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1ccc5b44-95ad-4f4c-8086-c176c41bbd19-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-d4gwh\" (UID: \"1ccc5b44-95ad-4f4c-8086-c176c41bbd19\") " pod="openshift-marketplace/marketplace-operator-79b997595-d4gwh" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.169763 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/61b85db0-a292-42a8-8296-d0e476d80c89-serving-cert\") pod \"openshift-config-operator-7777fb866f-fxv7j\" (UID: \"61b85db0-a292-42a8-8296-d0e476d80c89\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-fxv7j" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.169796 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f95ed010-a6a4-49ab-b61b-fc4ee2d856bb-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-mp9p7\" (UID: 
\"f95ed010-a6a4-49ab-b61b-fc4ee2d856bb\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-mp9p7" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.169826 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vn7dv\" (UniqueName: \"kubernetes.io/projected/f95ed010-a6a4-49ab-b61b-fc4ee2d856bb-kube-api-access-vn7dv\") pod \"apiserver-7bbb656c7d-mp9p7\" (UID: \"f95ed010-a6a4-49ab-b61b-fc4ee2d856bb\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-mp9p7" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.169835 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-2bqx9" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.169854 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/f95ed010-a6a4-49ab-b61b-fc4ee2d856bb-audit-policies\") pod \"apiserver-7bbb656c7d-mp9p7\" (UID: \"f95ed010-a6a4-49ab-b61b-fc4ee2d856bb\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-mp9p7" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.169888 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d0959a00-2a83-457f-bcba-7d4af48b11c3-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-tvsss\" (UID: \"d0959a00-2a83-457f-bcba-7d4af48b11c3\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-tvsss" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.169940 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/4125448d-5832-43c2-8dba-d95adde7458a-images\") pod \"machine-api-operator-5694c8668f-t8ft9\" (UID: \"4125448d-5832-43c2-8dba-d95adde7458a\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-t8ft9" Mar 09 13:24:36 crc 
kubenswrapper[4764]: I0309 13:24:36.169970 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-67pl5\" (UniqueName: \"kubernetes.io/projected/b9c0d96b-ed96-4925-b890-8743879a8b38-kube-api-access-67pl5\") pod \"router-default-5444994796-gnnbl\" (UID: \"b9c0d96b-ed96-4925-b890-8743879a8b38\") " pod="openshift-ingress/router-default-5444994796-gnnbl" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.170000 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e025e897-5ff3-476b-81c9-afdd0ae7a25f-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-84448\" (UID: \"e025e897-5ff3-476b-81c9-afdd0ae7a25f\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-84448" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.170026 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e025e897-5ff3-476b-81c9-afdd0ae7a25f-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-84448\" (UID: \"e025e897-5ff3-476b-81c9-afdd0ae7a25f\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-84448" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.170051 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2qxpr\" (UniqueName: \"kubernetes.io/projected/e025e897-5ff3-476b-81c9-afdd0ae7a25f-kube-api-access-2qxpr\") pod \"cluster-image-registry-operator-dc59b4c8b-84448\" (UID: \"e025e897-5ff3-476b-81c9-afdd0ae7a25f\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-84448" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.170082 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/4951d770-ae8c-470a-982a-807c82112722-service-ca-bundle\") pod \"authentication-operator-69f744f599-ptjnd\" (UID: \"4951d770-ae8c-470a-982a-807c82112722\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-ptjnd" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.170109 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9f9zl\" (UniqueName: \"kubernetes.io/projected/a1b4dc0b-edea-4c0d-8d61-3e3d3133605d-kube-api-access-9f9zl\") pod \"console-f9d7485db-8g9lj\" (UID: \"a1b4dc0b-edea-4c0d-8d61-3e3d3133605d\") " pod="openshift-console/console-f9d7485db-8g9lj" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.170121 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/f9b3244b-8df0-4330-9887-4092260d416a-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-nj856\" (UID: \"f9b3244b-8df0-4330-9887-4092260d416a\") " pod="openshift-authentication/oauth-openshift-558db77b4-nj856" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.170139 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b625331d-48ab-4d48-86fd-fe73466305ff-serving-cert\") pod \"controller-manager-879f6c89f-tgqwl\" (UID: \"b625331d-48ab-4d48-86fd-fe73466305ff\") " pod="openshift-controller-manager/controller-manager-879f6c89f-tgqwl" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.170170 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/30a07c97-9d99-41be-956e-ba3d6505d318-profile-collector-cert\") pod \"olm-operator-6b444d44fb-ptbpd\" (UID: \"30a07c97-9d99-41be-956e-ba3d6505d318\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-ptbpd" Mar 09 13:24:36 crc 
kubenswrapper[4764]: I0309 13:24:36.170198 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/a1b4dc0b-edea-4c0d-8d61-3e3d3133605d-console-oauth-config\") pod \"console-f9d7485db-8g9lj\" (UID: \"a1b4dc0b-edea-4c0d-8d61-3e3d3133605d\") " pod="openshift-console/console-f9d7485db-8g9lj"
Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.170228 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/b625331d-48ab-4d48-86fd-fe73466305ff-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-tgqwl\" (UID: \"b625331d-48ab-4d48-86fd-fe73466305ff\") " pod="openshift-controller-manager/controller-manager-879f6c89f-tgqwl"
Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.170255 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ee2ad8bf-7cf9-4bab-9638-b26d9c593188-config\") pod \"console-operator-58897d9998-sp2mq\" (UID: \"ee2ad8bf-7cf9-4bab-9638-b26d9c593188\") " pod="openshift-console-operator/console-operator-58897d9998-sp2mq"
Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.170281 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a1b4dc0b-edea-4c0d-8d61-3e3d3133605d-service-ca\") pod \"console-f9d7485db-8g9lj\" (UID: \"a1b4dc0b-edea-4c0d-8d61-3e3d3133605d\") " pod="openshift-console/console-f9d7485db-8g9lj"
Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.170309 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/1ccc5b44-95ad-4f4c-8086-c176c41bbd19-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-d4gwh\" (UID: \"1ccc5b44-95ad-4f4c-8086-c176c41bbd19\") " pod="openshift-marketplace/marketplace-operator-79b997595-d4gwh"
Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.170334 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/14b4aa8c-1066-4388-9442-07722e4c76c2-webhook-cert\") pod \"packageserver-d55dfcdfc-m4bs6\" (UID: \"14b4aa8c-1066-4388-9442-07722e4c76c2\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-m4bs6"
Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.170364 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/711447e6-e7cf-4577-8050-b5a391f96f6a-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-xfvtf\" (UID: \"711447e6-e7cf-4577-8050-b5a391f96f6a\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-xfvtf"
Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.170387 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/61b85db0-a292-42a8-8296-d0e476d80c89-available-featuregates\") pod \"openshift-config-operator-7777fb866f-fxv7j\" (UID: \"61b85db0-a292-42a8-8296-d0e476d80c89\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-fxv7j"
Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.170392 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ee2ad8bf-7cf9-4bab-9638-b26d9c593188-trusted-ca\") pod \"console-operator-58897d9998-sp2mq\" (UID: \"ee2ad8bf-7cf9-4bab-9638-b26d9c593188\") " pod="openshift-console-operator/console-operator-58897d9998-sp2mq"
Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.170466 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ndfng\" (UniqueName: \"kubernetes.io/projected/61b85db0-a292-42a8-8296-d0e476d80c89-kube-api-access-ndfng\") pod \"openshift-config-operator-7777fb866f-fxv7j\" (UID: \"61b85db0-a292-42a8-8296-d0e476d80c89\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-fxv7j"
Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.170501 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/711447e6-e7cf-4577-8050-b5a391f96f6a-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-xfvtf\" (UID: \"711447e6-e7cf-4577-8050-b5a391f96f6a\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-xfvtf"
Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.170530 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/ad307984-e46b-466b-8a5c-63a00976fbbf-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-5g2pp\" (UID: \"ad307984-e46b-466b-8a5c-63a00976fbbf\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-5g2pp"
Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.170558 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ss45h\" (UniqueName: \"kubernetes.io/projected/39da5087-79bc-4154-b340-22183d9e4417-kube-api-access-ss45h\") pod \"collect-profiles-29551035-5fpgn\" (UID: \"39da5087-79bc-4154-b340-22183d9e4417\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551035-5fpgn"
Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.170588 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s85j9\" (UniqueName: \"kubernetes.io/projected/1ccc5b44-95ad-4f4c-8086-c176c41bbd19-kube-api-access-s85j9\") pod \"marketplace-operator-79b997595-d4gwh\" (UID: \"1ccc5b44-95ad-4f4c-8086-c176c41bbd19\") " pod="openshift-marketplace/marketplace-operator-79b997595-d4gwh"
Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.170620 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ad49b4d8-1218-4a34-8455-831d0f563cbf-serving-cert\") pod \"route-controller-manager-6576b87f9c-mvq6r\" (UID: \"ad49b4d8-1218-4a34-8455-831d0f563cbf\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mvq6r"
Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.170667 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a1b4dc0b-edea-4c0d-8d61-3e3d3133605d-trusted-ca-bundle\") pod \"console-f9d7485db-8g9lj\" (UID: \"a1b4dc0b-edea-4c0d-8d61-3e3d3133605d\") " pod="openshift-console/console-f9d7485db-8g9lj"
Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.170693 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f95ed010-a6a4-49ab-b61b-fc4ee2d856bb-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-mp9p7\" (UID: \"f95ed010-a6a4-49ab-b61b-fc4ee2d856bb\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-mp9p7"
Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.171264 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/23f87d2b-2a92-4abb-a2a6-2de508837343-etcd-ca\") pod \"etcd-operator-b45778765-nnrmm\" (UID: \"23f87d2b-2a92-4abb-a2a6-2de508837343\") " pod="openshift-etcd-operator/etcd-operator-b45778765-nnrmm"
Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.171355 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4951d770-ae8c-470a-982a-807c82112722-config\") pod \"authentication-operator-69f744f599-ptjnd\" (UID: \"4951d770-ae8c-470a-982a-807c82112722\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-ptjnd"
Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.171480 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ee2ad8bf-7cf9-4bab-9638-b26d9c593188-trusted-ca\") pod \"console-operator-58897d9998-sp2mq\" (UID: \"ee2ad8bf-7cf9-4bab-9638-b26d9c593188\") " pod="openshift-console-operator/console-operator-58897d9998-sp2mq"
Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.171496 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e025e897-5ff3-476b-81c9-afdd0ae7a25f-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-84448\" (UID: \"e025e897-5ff3-476b-81c9-afdd0ae7a25f\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-84448"
Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.171592 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-6nhpx"]
Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.171934 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/f95ed010-a6a4-49ab-b61b-fc4ee2d856bb-audit-policies\") pod \"apiserver-7bbb656c7d-mp9p7\" (UID: \"f95ed010-a6a4-49ab-b61b-fc4ee2d856bb\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-mp9p7"
Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.170697 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4951d770-ae8c-470a-982a-807c82112722-config\") pod \"authentication-operator-69f744f599-ptjnd\" (UID: \"4951d770-ae8c-470a-982a-807c82112722\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-ptjnd"
Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.172615 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/a1b4dc0b-edea-4c0d-8d61-3e3d3133605d-oauth-serving-cert\") pod \"console-f9d7485db-8g9lj\" (UID: \"a1b4dc0b-edea-4c0d-8d61-3e3d3133605d\") " pod="openshift-console/console-f9d7485db-8g9lj"
Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.172870 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4951d770-ae8c-470a-982a-807c82112722-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-ptjnd\" (UID: \"4951d770-ae8c-470a-982a-807c82112722\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-ptjnd"
Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.172964 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-m9kmt"]
Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.173093 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-6nhpx"
Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.173700 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ee2ad8bf-7cf9-4bab-9638-b26d9c593188-config\") pod \"console-operator-58897d9998-sp2mq\" (UID: \"ee2ad8bf-7cf9-4bab-9638-b26d9c593188\") " pod="openshift-console-operator/console-operator-58897d9998-sp2mq"
Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.173850 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/b625331d-48ab-4d48-86fd-fe73466305ff-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-tgqwl\" (UID: \"b625331d-48ab-4d48-86fd-fe73466305ff\") " pod="openshift-controller-manager/controller-manager-879f6c89f-tgqwl"
Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.174393 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/4125448d-5832-43c2-8dba-d95adde7458a-images\") pod \"machine-api-operator-5694c8668f-t8ft9\" (UID: \"4125448d-5832-43c2-8dba-d95adde7458a\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-t8ft9"
Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.174464 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-2bqx9"]
Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.174569 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/a1b4dc0b-edea-4c0d-8d61-3e3d3133605d-console-oauth-config\") pod \"console-f9d7485db-8g9lj\" (UID: \"a1b4dc0b-edea-4c0d-8d61-3e3d3133605d\") " pod="openshift-console/console-f9d7485db-8g9lj"
Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.174789 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ad49b4d8-1218-4a34-8455-831d0f563cbf-serving-cert\") pod \"route-controller-manager-6576b87f9c-mvq6r\" (UID: \"ad49b4d8-1218-4a34-8455-831d0f563cbf\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mvq6r"
Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.174854 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-48tpb\" (UniqueName: \"kubernetes.io/projected/ad49b4d8-1218-4a34-8455-831d0f563cbf-kube-api-access-48tpb\") pod \"route-controller-manager-6576b87f9c-mvq6r\" (UID: \"ad49b4d8-1218-4a34-8455-831d0f563cbf\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mvq6r"
Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.174859 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-6nhpx"]
Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.174886 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w9vzq\" (UniqueName: \"kubernetes.io/projected/14b4aa8c-1066-4388-9442-07722e4c76c2-kube-api-access-w9vzq\") pod \"packageserver-d55dfcdfc-m4bs6\" (UID: \"14b4aa8c-1066-4388-9442-07722e4c76c2\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-m4bs6"
Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.174915 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/f95ed010-a6a4-49ab-b61b-fc4ee2d856bb-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-mp9p7\" (UID: \"f95ed010-a6a4-49ab-b61b-fc4ee2d856bb\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-mp9p7"
Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.174938 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/b9c0d96b-ed96-4925-b890-8743879a8b38-stats-auth\") pod \"router-default-5444994796-gnnbl\" (UID: \"b9c0d96b-ed96-4925-b890-8743879a8b38\") " pod="openshift-ingress/router-default-5444994796-gnnbl"
Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.174971 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/e025e897-5ff3-476b-81c9-afdd0ae7a25f-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-84448\" (UID: \"e025e897-5ff3-476b-81c9-afdd0ae7a25f\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-84448"
Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.174998 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d0959a00-2a83-457f-bcba-7d4af48b11c3-config\") pod \"kube-apiserver-operator-766d6c64bb-tvsss\" (UID: \"d0959a00-2a83-457f-bcba-7d4af48b11c3\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-tvsss"
Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.175010 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a1b4dc0b-edea-4c0d-8d61-3e3d3133605d-service-ca\") pod \"console-f9d7485db-8g9lj\" (UID: \"a1b4dc0b-edea-4c0d-8d61-3e3d3133605d\") " pod="openshift-console/console-f9d7485db-8g9lj"
Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.175027 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vnv7g\" (UniqueName: \"kubernetes.io/projected/8d3de7d9-c3ef-4ead-8ff7-81d34bf2e497-kube-api-access-vnv7g\") pod \"service-ca-operator-777779d784-svr2n\" (UID: \"8d3de7d9-c3ef-4ead-8ff7-81d34bf2e497\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-svr2n"
Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.175211 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4951d770-ae8c-470a-982a-807c82112722-service-ca-bundle\") pod \"authentication-operator-69f744f599-ptjnd\" (UID: \"4951d770-ae8c-470a-982a-807c82112722\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-ptjnd"
Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.175295 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cf3d7b2a-75e7-4c07-9211-b66c64c15def-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-klv4l\" (UID: \"cf3d7b2a-75e7-4c07-9211-b66c64c15def\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-klv4l"
Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.175314 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f9b3244b-8df0-4330-9887-4092260d416a-audit-dir\") pod \"oauth-openshift-558db77b4-nj856\" (UID: \"f9b3244b-8df0-4330-9887-4092260d416a\") " pod="openshift-authentication/oauth-openshift-558db77b4-nj856"
Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.175385 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a1b4dc0b-edea-4c0d-8d61-3e3d3133605d-trusted-ca-bundle\") pod \"console-f9d7485db-8g9lj\" (UID: \"a1b4dc0b-edea-4c0d-8d61-3e3d3133605d\") " pod="openshift-console/console-f9d7485db-8g9lj"
Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.175502 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/f95ed010-a6a4-49ab-b61b-fc4ee2d856bb-etcd-client\") pod \"apiserver-7bbb656c7d-mp9p7\" (UID: \"f95ed010-a6a4-49ab-b61b-fc4ee2d856bb\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-mp9p7"
Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.175619 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/f9b3244b-8df0-4330-9887-4092260d416a-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-nj856\" (UID: \"f9b3244b-8df0-4330-9887-4092260d416a\") " pod="openshift-authentication/oauth-openshift-558db77b4-nj856"
Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.175713 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ad49b4d8-1218-4a34-8455-831d0f563cbf-client-ca\") pod \"route-controller-manager-6576b87f9c-mvq6r\" (UID: \"ad49b4d8-1218-4a34-8455-831d0f563cbf\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mvq6r"
Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.175860 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d0959a00-2a83-457f-bcba-7d4af48b11c3-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-tvsss\" (UID: \"d0959a00-2a83-457f-bcba-7d4af48b11c3\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-tvsss"
Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.175962 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/f95ed010-a6a4-49ab-b61b-fc4ee2d856bb-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-mp9p7\" (UID: \"f95ed010-a6a4-49ab-b61b-fc4ee2d856bb\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-mp9p7"
Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.175991 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rdclx\" (UniqueName: \"kubernetes.io/projected/30a07c97-9d99-41be-956e-ba3d6505d318-kube-api-access-rdclx\") pod \"olm-operator-6b444d44fb-ptbpd\" (UID: \"30a07c97-9d99-41be-956e-ba3d6505d318\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-ptbpd"
Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.176039 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/f9b3244b-8df0-4330-9887-4092260d416a-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-nj856\" (UID: \"f9b3244b-8df0-4330-9887-4092260d416a\") " pod="openshift-authentication/oauth-openshift-558db77b4-nj856"
Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.176049 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-kdxg4"]
Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.176100 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/f9b3244b-8df0-4330-9887-4092260d416a-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-nj856\" (UID: \"f9b3244b-8df0-4330-9887-4092260d416a\") " pod="openshift-authentication/oauth-openshift-558db77b4-nj856"
Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.176374 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/a1b4dc0b-edea-4c0d-8d61-3e3d3133605d-console-serving-cert\") pod \"console-f9d7485db-8g9lj\" (UID: \"a1b4dc0b-edea-4c0d-8d61-3e3d3133605d\") " pod="openshift-console/console-f9d7485db-8g9lj"
Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.176482 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b72bd4db-e5ea-44f6-bdce-81df2966acfb-profile-collector-cert\") pod \"catalog-operator-68c6474976-m9kmt\" (UID: \"b72bd4db-e5ea-44f6-bdce-81df2966acfb\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-m9kmt"
Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.176668 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ad49b4d8-1218-4a34-8455-831d0f563cbf-config\") pod \"route-controller-manager-6576b87f9c-mvq6r\" (UID: \"ad49b4d8-1218-4a34-8455-831d0f563cbf\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mvq6r"
Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.176857 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-kdxg4"
Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.177084 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ee2ad8bf-7cf9-4bab-9638-b26d9c593188-serving-cert\") pod \"console-operator-58897d9998-sp2mq\" (UID: \"ee2ad8bf-7cf9-4bab-9638-b26d9c593188\") " pod="openshift-console-operator/console-operator-58897d9998-sp2mq"
Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.177170 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bgwzj\" (UniqueName: \"kubernetes.io/projected/ee2ad8bf-7cf9-4bab-9638-b26d9c593188-kube-api-access-bgwzj\") pod \"console-operator-58897d9998-sp2mq\" (UID: \"ee2ad8bf-7cf9-4bab-9638-b26d9c593188\") " pod="openshift-console-operator/console-operator-58897d9998-sp2mq"
Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.177229 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/39da5087-79bc-4154-b340-22183d9e4417-config-volume\") pod \"collect-profiles-29551035-5fpgn\" (UID: \"39da5087-79bc-4154-b340-22183d9e4417\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551035-5fpgn"
Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.177754 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/e025e897-5ff3-476b-81c9-afdd0ae7a25f-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-84448\" (UID: \"e025e897-5ff3-476b-81c9-afdd0ae7a25f\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-84448"
Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.177811 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/23f87d2b-2a92-4abb-a2a6-2de508837343-etcd-client\") pod \"etcd-operator-b45778765-nnrmm\" (UID: \"23f87d2b-2a92-4abb-a2a6-2de508837343\") " pod="openshift-etcd-operator/etcd-operator-b45778765-nnrmm"
Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.178080 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/f95ed010-a6a4-49ab-b61b-fc4ee2d856bb-encryption-config\") pod \"apiserver-7bbb656c7d-mp9p7\" (UID: \"f95ed010-a6a4-49ab-b61b-fc4ee2d856bb\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-mp9p7"
Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.179272 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/f9b3244b-8df0-4330-9887-4092260d416a-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-nj856\" (UID: \"f9b3244b-8df0-4330-9887-4092260d416a\") " pod="openshift-authentication/oauth-openshift-558db77b4-nj856"
Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.179287 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4951d770-ae8c-470a-982a-807c82112722-serving-cert\") pod \"authentication-operator-69f744f599-ptjnd\" (UID: \"4951d770-ae8c-470a-982a-807c82112722\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-ptjnd"
Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.179372 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/f9b3244b-8df0-4330-9887-4092260d416a-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-nj856\" (UID: \"f9b3244b-8df0-4330-9887-4092260d416a\") " pod="openshift-authentication/oauth-openshift-558db77b4-nj856"
Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.179427 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b625331d-48ab-4d48-86fd-fe73466305ff-serving-cert\") pod \"controller-manager-879f6c89f-tgqwl\" (UID: \"b625331d-48ab-4d48-86fd-fe73466305ff\") " pod="openshift-controller-manager/controller-manager-879f6c89f-tgqwl"
Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.180338 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/f9b3244b-8df0-4330-9887-4092260d416a-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-nj856\" (UID: \"f9b3244b-8df0-4330-9887-4092260d416a\") " pod="openshift-authentication/oauth-openshift-558db77b4-nj856"
Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.180557 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/a1b4dc0b-edea-4c0d-8d61-3e3d3133605d-console-serving-cert\") pod \"console-f9d7485db-8g9lj\" (UID: \"a1b4dc0b-edea-4c0d-8d61-3e3d3133605d\") " pod="openshift-console/console-f9d7485db-8g9lj"
Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.183369 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/23f87d2b-2a92-4abb-a2a6-2de508837343-serving-cert\") pod \"etcd-operator-b45778765-nnrmm\" (UID: \"23f87d2b-2a92-4abb-a2a6-2de508837343\") " pod="openshift-etcd-operator/etcd-operator-b45778765-nnrmm"
Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.183678 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/23f87d2b-2a92-4abb-a2a6-2de508837343-config\") pod \"etcd-operator-b45778765-nnrmm\" (UID: \"23f87d2b-2a92-4abb-a2a6-2de508837343\") " pod="openshift-etcd-operator/etcd-operator-b45778765-nnrmm"
Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.183681 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/f9b3244b-8df0-4330-9887-4092260d416a-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-nj856\" (UID: \"f9b3244b-8df0-4330-9887-4092260d416a\") " pod="openshift-authentication/oauth-openshift-558db77b4-nj856"
Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.187255 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/f95ed010-a6a4-49ab-b61b-fc4ee2d856bb-etcd-client\") pod \"apiserver-7bbb656c7d-mp9p7\" (UID: \"f95ed010-a6a4-49ab-b61b-fc4ee2d856bb\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-mp9p7"
Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.213434 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9h4rc\" (UniqueName: \"kubernetes.io/projected/1ffb8d96-e6e4-4859-ae7d-37f900979485-kube-api-access-9h4rc\") pod \"apiserver-76f77b778f-gst9d\" (UID: \"1ffb8d96-e6e4-4859-ae7d-37f900979485\") " pod="openshift-apiserver/apiserver-76f77b778f-gst9d"
Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.215983 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt"
Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.219267 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-gst9d"
Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.236707 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert"
Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.256430 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config"
Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.276614 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d"
Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.278741 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b72bd4db-e5ea-44f6-bdce-81df2966acfb-profile-collector-cert\") pod \"catalog-operator-68c6474976-m9kmt\" (UID: \"b72bd4db-e5ea-44f6-bdce-81df2966acfb\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-m9kmt"
Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.278780 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/39da5087-79bc-4154-b340-22183d9e4417-config-volume\") pod \"collect-profiles-29551035-5fpgn\" (UID: \"39da5087-79bc-4154-b340-22183d9e4417\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551035-5fpgn"
Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.278804 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mgbg9\" (UniqueName: \"kubernetes.io/projected/b72bd4db-e5ea-44f6-bdce-81df2966acfb-kube-api-access-mgbg9\") pod \"catalog-operator-68c6474976-m9kmt\" (UID: \"b72bd4db-e5ea-44f6-bdce-81df2966acfb\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-m9kmt"
Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.278854 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8d3de7d9-c3ef-4ead-8ff7-81d34bf2e497-serving-cert\") pod \"service-ca-operator-777779d784-svr2n\" (UID: \"8d3de7d9-c3ef-4ead-8ff7-81d34bf2e497\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-svr2n"
Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.278877 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/b9c0d96b-ed96-4925-b890-8743879a8b38-default-certificate\") pod \"router-default-5444994796-gnnbl\" (UID: \"b9c0d96b-ed96-4925-b890-8743879a8b38\") " pod="openshift-ingress/router-default-5444994796-gnnbl"
Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.278900 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b9c0d96b-ed96-4925-b890-8743879a8b38-service-ca-bundle\") pod \"router-default-5444994796-gnnbl\" (UID: \"b9c0d96b-ed96-4925-b890-8743879a8b38\") " pod="openshift-ingress/router-default-5444994796-gnnbl"
Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.278914 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8d3de7d9-c3ef-4ead-8ff7-81d34bf2e497-config\") pod \"service-ca-operator-777779d784-svr2n\" (UID: \"8d3de7d9-c3ef-4ead-8ff7-81d34bf2e497\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-svr2n"
Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.278934 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b72bd4db-e5ea-44f6-bdce-81df2966acfb-srv-cert\") pod \"catalog-operator-68c6474976-m9kmt\" (UID: \"b72bd4db-e5ea-44f6-bdce-81df2966acfb\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-m9kmt"
Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.278963 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f5qdx\" (UniqueName: \"kubernetes.io/projected/db0ec273-54d8-4753-b519-243b727a9efd-kube-api-access-f5qdx\") pod \"package-server-manager-789f6589d5-b2pz8\" (UID: \"db0ec273-54d8-4753-b519-243b727a9efd\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-b2pz8"
Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.279159 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4912a02a-743d-4bbe-9063-7d99ccd3329a-proxy-tls\") pod \"machine-config-operator-74547568cd-2pfhk\" (UID: \"4912a02a-743d-4bbe-9063-7d99ccd3329a\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-2pfhk"
Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.279212 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/39da5087-79bc-4154-b340-22183d9e4417-secret-volume\") pod \"collect-profiles-29551035-5fpgn\" (UID: \"39da5087-79bc-4154-b340-22183d9e4417\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551035-5fpgn"
Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.279240 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g47h8\" (UniqueName: \"kubernetes.io/projected/0a005f65-920a-4cdd-b4da-a270953113aa-kube-api-access-g47h8\") pod \"auto-csr-approver-29551044-p748f\" (UID: \"0a005f65-920a-4cdd-b4da-a270953113aa\") " pod="openshift-infra/auto-csr-approver-29551044-p748f"
Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.279287 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b9c0d96b-ed96-4925-b890-8743879a8b38-metrics-certs\") pod \"router-default-5444994796-gnnbl\" (UID: \"b9c0d96b-ed96-4925-b890-8743879a8b38\") " pod="openshift-ingress/router-default-5444994796-gnnbl"
Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.279305 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/14b4aa8c-1066-4388-9442-07722e4c76c2-apiservice-cert\") pod \"packageserver-d55dfcdfc-m4bs6\" (UID: \"14b4aa8c-1066-4388-9442-07722e4c76c2\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-m4bs6"
Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.279387 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/4ba55602-0e3f-4722-b437-546732351bc4-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-9k28f\" (UID: \"4ba55602-0e3f-4722-b437-546732351bc4\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-9k28f"
Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.279441 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/14b4aa8c-1066-4388-9442-07722e4c76c2-tmpfs\") pod \"packageserver-d55dfcdfc-m4bs6\" (UID: \"14b4aa8c-1066-4388-9442-07722e4c76c2\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-m4bs6"
Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.279490 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/db0ec273-54d8-4753-b519-243b727a9efd-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-b2pz8\" (UID: \"db0ec273-54d8-4753-b519-243b727a9efd\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-b2pz8"
Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.279510 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/30a07c97-9d99-41be-956e-ba3d6505d318-srv-cert\") pod \"olm-operator-6b444d44fb-ptbpd\" (UID: \"30a07c97-9d99-41be-956e-ba3d6505d318\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-ptbpd"
Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.279528 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jpcrn\" (UniqueName: \"kubernetes.io/projected/4912a02a-743d-4bbe-9063-7d99ccd3329a-kube-api-access-jpcrn\") pod \"machine-config-operator-74547568cd-2pfhk\" (UID: \"4912a02a-743d-4bbe-9063-7d99ccd3329a\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-2pfhk"
Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.279615 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fdqjn\" (UniqueName: \"kubernetes.io/projected/4ba55602-0e3f-4722-b437-546732351bc4-kube-api-access-fdqjn\") pod \"control-plane-machine-set-operator-78cbb6b69f-9k28f\" (UID: \"4ba55602-0e3f-4722-b437-546732351bc4\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-9k28f"
Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.279631 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/4912a02a-743d-4bbe-9063-7d99ccd3329a-images\") pod \"machine-config-operator-74547568cd-2pfhk\" (UID: \"4912a02a-743d-4bbe-9063-7d99ccd3329a\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-2pfhk"
Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.279735 4764 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/711447e6-e7cf-4577-8050-b5a391f96f6a-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-xfvtf\" (UID: \"711447e6-e7cf-4577-8050-b5a391f96f6a\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-xfvtf" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.279754 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/4912a02a-743d-4bbe-9063-7d99ccd3329a-auth-proxy-config\") pod \"machine-config-operator-74547568cd-2pfhk\" (UID: \"4912a02a-743d-4bbe-9063-7d99ccd3329a\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-2pfhk" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.279855 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b2l2r\" (UniqueName: \"kubernetes.io/projected/ad307984-e46b-466b-8a5c-63a00976fbbf-kube-api-access-b2l2r\") pod \"cluster-samples-operator-665b6dd947-5g2pp\" (UID: \"ad307984-e46b-466b-8a5c-63a00976fbbf\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-5g2pp" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.279875 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1ccc5b44-95ad-4f4c-8086-c176c41bbd19-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-d4gwh\" (UID: \"1ccc5b44-95ad-4f4c-8086-c176c41bbd19\") " pod="openshift-marketplace/marketplace-operator-79b997595-d4gwh" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.279965 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d0959a00-2a83-457f-bcba-7d4af48b11c3-kube-api-access\") pod 
\"kube-apiserver-operator-766d6c64bb-tvsss\" (UID: \"d0959a00-2a83-457f-bcba-7d4af48b11c3\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-tvsss" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.279986 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-67pl5\" (UniqueName: \"kubernetes.io/projected/b9c0d96b-ed96-4925-b890-8743879a8b38-kube-api-access-67pl5\") pod \"router-default-5444994796-gnnbl\" (UID: \"b9c0d96b-ed96-4925-b890-8743879a8b38\") " pod="openshift-ingress/router-default-5444994796-gnnbl" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.280014 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/14b4aa8c-1066-4388-9442-07722e4c76c2-tmpfs\") pod \"packageserver-d55dfcdfc-m4bs6\" (UID: \"14b4aa8c-1066-4388-9442-07722e4c76c2\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-m4bs6" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.280088 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/30a07c97-9d99-41be-956e-ba3d6505d318-profile-collector-cert\") pod \"olm-operator-6b444d44fb-ptbpd\" (UID: \"30a07c97-9d99-41be-956e-ba3d6505d318\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-ptbpd" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.280109 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/1ccc5b44-95ad-4f4c-8086-c176c41bbd19-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-d4gwh\" (UID: \"1ccc5b44-95ad-4f4c-8086-c176c41bbd19\") " pod="openshift-marketplace/marketplace-operator-79b997595-d4gwh" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.280178 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/14b4aa8c-1066-4388-9442-07722e4c76c2-webhook-cert\") pod \"packageserver-d55dfcdfc-m4bs6\" (UID: \"14b4aa8c-1066-4388-9442-07722e4c76c2\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-m4bs6" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.280196 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/711447e6-e7cf-4577-8050-b5a391f96f6a-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-xfvtf\" (UID: \"711447e6-e7cf-4577-8050-b5a391f96f6a\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-xfvtf" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.280239 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/711447e6-e7cf-4577-8050-b5a391f96f6a-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-xfvtf\" (UID: \"711447e6-e7cf-4577-8050-b5a391f96f6a\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-xfvtf" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.280258 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/ad307984-e46b-466b-8a5c-63a00976fbbf-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-5g2pp\" (UID: \"ad307984-e46b-466b-8a5c-63a00976fbbf\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-5g2pp" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.280348 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ss45h\" (UniqueName: \"kubernetes.io/projected/39da5087-79bc-4154-b340-22183d9e4417-kube-api-access-ss45h\") pod \"collect-profiles-29551035-5fpgn\" (UID: \"39da5087-79bc-4154-b340-22183d9e4417\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29551035-5fpgn" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.280398 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s85j9\" (UniqueName: \"kubernetes.io/projected/1ccc5b44-95ad-4f4c-8086-c176c41bbd19-kube-api-access-s85j9\") pod \"marketplace-operator-79b997595-d4gwh\" (UID: \"1ccc5b44-95ad-4f4c-8086-c176c41bbd19\") " pod="openshift-marketplace/marketplace-operator-79b997595-d4gwh" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.280433 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w9vzq\" (UniqueName: \"kubernetes.io/projected/14b4aa8c-1066-4388-9442-07722e4c76c2-kube-api-access-w9vzq\") pod \"packageserver-d55dfcdfc-m4bs6\" (UID: \"14b4aa8c-1066-4388-9442-07722e4c76c2\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-m4bs6" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.280523 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/b9c0d96b-ed96-4925-b890-8743879a8b38-stats-auth\") pod \"router-default-5444994796-gnnbl\" (UID: \"b9c0d96b-ed96-4925-b890-8743879a8b38\") " pod="openshift-ingress/router-default-5444994796-gnnbl" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.280542 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d0959a00-2a83-457f-bcba-7d4af48b11c3-config\") pod \"kube-apiserver-operator-766d6c64bb-tvsss\" (UID: \"d0959a00-2a83-457f-bcba-7d4af48b11c3\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-tvsss" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.280557 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vnv7g\" (UniqueName: 
\"kubernetes.io/projected/8d3de7d9-c3ef-4ead-8ff7-81d34bf2e497-kube-api-access-vnv7g\") pod \"service-ca-operator-777779d784-svr2n\" (UID: \"8d3de7d9-c3ef-4ead-8ff7-81d34bf2e497\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-svr2n" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.280574 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d0959a00-2a83-457f-bcba-7d4af48b11c3-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-tvsss\" (UID: \"d0959a00-2a83-457f-bcba-7d4af48b11c3\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-tvsss" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.280456 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/4912a02a-743d-4bbe-9063-7d99ccd3329a-auth-proxy-config\") pod \"machine-config-operator-74547568cd-2pfhk\" (UID: \"4912a02a-743d-4bbe-9063-7d99ccd3329a\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-2pfhk" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.280827 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdclx\" (UniqueName: \"kubernetes.io/projected/30a07c97-9d99-41be-956e-ba3d6505d318-kube-api-access-rdclx\") pod \"olm-operator-6b444d44fb-ptbpd\" (UID: \"30a07c97-9d99-41be-956e-ba3d6505d318\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-ptbpd" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.285092 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/ad307984-e46b-466b-8a5c-63a00976fbbf-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-5g2pp\" (UID: \"ad307984-e46b-466b-8a5c-63a00976fbbf\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-5g2pp" Mar 
09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.300192 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.325372 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.339504 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.356919 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.377365 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.396597 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.417180 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.432129 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-gst9d"] Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.437387 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.456372 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.477071 4764 reflector.go:368] Caches populated 
for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.497311 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.517175 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.537780 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.545480 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/30a07c97-9d99-41be-956e-ba3d6505d318-profile-collector-cert\") pod \"olm-operator-6b444d44fb-ptbpd\" (UID: \"30a07c97-9d99-41be-956e-ba3d6505d318\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-ptbpd" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.545502 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/39da5087-79bc-4154-b340-22183d9e4417-secret-volume\") pod \"collect-profiles-29551035-5fpgn\" (UID: \"39da5087-79bc-4154-b340-22183d9e4417\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551035-5fpgn" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.546934 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b72bd4db-e5ea-44f6-bdce-81df2966acfb-profile-collector-cert\") pod \"catalog-operator-68c6474976-m9kmt\" (UID: \"b72bd4db-e5ea-44f6-bdce-81df2966acfb\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-m9kmt" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.557701 4764 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.577841 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.580565 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/39da5087-79bc-4154-b340-22183d9e4417-config-volume\") pod \"collect-profiles-29551035-5fpgn\" (UID: \"39da5087-79bc-4154-b340-22183d9e4417\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551035-5fpgn" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.597780 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.618020 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.623755 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4912a02a-743d-4bbe-9063-7d99ccd3329a-proxy-tls\") pod \"machine-config-operator-74547568cd-2pfhk\" (UID: \"4912a02a-743d-4bbe-9063-7d99ccd3329a\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-2pfhk" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.637390 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.641839 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/4912a02a-743d-4bbe-9063-7d99ccd3329a-images\") pod 
\"machine-config-operator-74547568cd-2pfhk\" (UID: \"4912a02a-743d-4bbe-9063-7d99ccd3329a\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-2pfhk" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.671499 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.676736 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.681980 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d0959a00-2a83-457f-bcba-7d4af48b11c3-config\") pod \"kube-apiserver-operator-766d6c64bb-tvsss\" (UID: \"d0959a00-2a83-457f-bcba-7d4af48b11c3\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-tvsss" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.685130 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d0959a00-2a83-457f-bcba-7d4af48b11c3-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-tvsss\" (UID: \"d0959a00-2a83-457f-bcba-7d4af48b11c3\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-tvsss" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.697132 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.717025 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.738273 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Mar 09 
13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.757760 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.776511 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.796916 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.816870 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.837516 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.858214 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.864309 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/db0ec273-54d8-4753-b519-243b727a9efd-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-b2pz8\" (UID: \"db0ec273-54d8-4753-b519-243b727a9efd\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-b2pz8" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.877219 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.896467 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Mar 09 13:24:36 crc kubenswrapper[4764]: 
I0309 13:24:36.916637 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.936977 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.958432 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.977713 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Mar 09 13:24:36 crc kubenswrapper[4764]: I0309 13:24:36.998511 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Mar 09 13:24:37 crc kubenswrapper[4764]: I0309 13:24:37.002898 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8d3de7d9-c3ef-4ead-8ff7-81d34bf2e497-serving-cert\") pod \"service-ca-operator-777779d784-svr2n\" (UID: \"8d3de7d9-c3ef-4ead-8ff7-81d34bf2e497\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-svr2n" Mar 09 13:24:37 crc kubenswrapper[4764]: I0309 13:24:37.016854 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Mar 09 13:24:37 crc kubenswrapper[4764]: I0309 13:24:37.038324 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Mar 09 13:24:37 crc kubenswrapper[4764]: I0309 13:24:37.040500 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8d3de7d9-c3ef-4ead-8ff7-81d34bf2e497-config\") pod \"service-ca-operator-777779d784-svr2n\" (UID: 
\"8d3de7d9-c3ef-4ead-8ff7-81d34bf2e497\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-svr2n" Mar 09 13:24:37 crc kubenswrapper[4764]: I0309 13:24:37.057350 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Mar 09 13:24:37 crc kubenswrapper[4764]: I0309 13:24:37.077976 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Mar 09 13:24:37 crc kubenswrapper[4764]: I0309 13:24:37.095383 4764 request.go:700] Waited for 1.00718331s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-storage-version-migrator/secrets?fieldSelector=metadata.name%3Dkube-storage-version-migrator-sa-dockercfg-5xfcg&limit=500&resourceVersion=0 Mar 09 13:24:37 crc kubenswrapper[4764]: I0309 13:24:37.097553 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Mar 09 13:24:37 crc kubenswrapper[4764]: I0309 13:24:37.117256 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Mar 09 13:24:37 crc kubenswrapper[4764]: I0309 13:24:37.137106 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Mar 09 13:24:37 crc kubenswrapper[4764]: I0309 13:24:37.142678 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/4ba55602-0e3f-4722-b437-546732351bc4-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-9k28f\" (UID: \"4ba55602-0e3f-4722-b437-546732351bc4\") " 
pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-9k28f" Mar 09 13:24:37 crc kubenswrapper[4764]: I0309 13:24:37.158070 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Mar 09 13:24:37 crc kubenswrapper[4764]: I0309 13:24:37.186795 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Mar 09 13:24:37 crc kubenswrapper[4764]: I0309 13:24:37.197472 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Mar 09 13:24:37 crc kubenswrapper[4764]: I0309 13:24:37.217270 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Mar 09 13:24:37 crc kubenswrapper[4764]: I0309 13:24:37.237083 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Mar 09 13:24:37 crc kubenswrapper[4764]: I0309 13:24:37.257725 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Mar 09 13:24:37 crc kubenswrapper[4764]: I0309 13:24:37.266910 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/711447e6-e7cf-4577-8050-b5a391f96f6a-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-xfvtf\" (UID: \"711447e6-e7cf-4577-8050-b5a391f96f6a\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-xfvtf" Mar 09 13:24:37 crc kubenswrapper[4764]: I0309 13:24:37.277924 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Mar 09 13:24:37 crc kubenswrapper[4764]: E0309 13:24:37.279558 4764 secret.go:188] Couldn't get secret 
openshift-ingress/router-metrics-certs-default: failed to sync secret cache: timed out waiting for the condition
Mar 09 13:24:37 crc kubenswrapper[4764]: E0309 13:24:37.279582 4764 configmap.go:193] Couldn't get configMap openshift-ingress/service-ca-bundle: failed to sync configmap cache: timed out waiting for the condition
Mar 09 13:24:37 crc kubenswrapper[4764]: E0309 13:24:37.279628 4764 secret.go:188] Couldn't get secret openshift-operator-lifecycle-manager/packageserver-service-cert: failed to sync secret cache: timed out waiting for the condition
Mar 09 13:24:37 crc kubenswrapper[4764]: E0309 13:24:37.279670 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b9c0d96b-ed96-4925-b890-8743879a8b38-metrics-certs podName:b9c0d96b-ed96-4925-b890-8743879a8b38 nodeName:}" failed. No retries permitted until 2026-03-09 13:24:37.779619088 +0000 UTC m=+233.029791016 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b9c0d96b-ed96-4925-b890-8743879a8b38-metrics-certs") pod "router-default-5444994796-gnnbl" (UID: "b9c0d96b-ed96-4925-b890-8743879a8b38") : failed to sync secret cache: timed out waiting for the condition
Mar 09 13:24:37 crc kubenswrapper[4764]: E0309 13:24:37.279678 4764 secret.go:188] Couldn't get secret openshift-ingress/router-certs-default: failed to sync secret cache: timed out waiting for the condition
Mar 09 13:24:37 crc kubenswrapper[4764]: E0309 13:24:37.279713 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/b9c0d96b-ed96-4925-b890-8743879a8b38-service-ca-bundle podName:b9c0d96b-ed96-4925-b890-8743879a8b38 nodeName:}" failed. No retries permitted until 2026-03-09 13:24:37.77969103 +0000 UTC m=+233.029862958 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/b9c0d96b-ed96-4925-b890-8743879a8b38-service-ca-bundle") pod "router-default-5444994796-gnnbl" (UID: "b9c0d96b-ed96-4925-b890-8743879a8b38") : failed to sync configmap cache: timed out waiting for the condition
Mar 09 13:24:37 crc kubenswrapper[4764]: E0309 13:24:37.279673 4764 secret.go:188] Couldn't get secret openshift-operator-lifecycle-manager/catalog-operator-serving-cert: failed to sync secret cache: timed out waiting for the condition
Mar 09 13:24:37 crc kubenswrapper[4764]: E0309 13:24:37.279719 4764 secret.go:188] Couldn't get secret openshift-operator-lifecycle-manager/olm-operator-serving-cert: failed to sync secret cache: timed out waiting for the condition
Mar 09 13:24:37 crc kubenswrapper[4764]: E0309 13:24:37.279744 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/14b4aa8c-1066-4388-9442-07722e4c76c2-apiservice-cert podName:14b4aa8c-1066-4388-9442-07722e4c76c2 nodeName:}" failed. No retries permitted until 2026-03-09 13:24:37.779731021 +0000 UTC m=+233.029902939 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "apiservice-cert" (UniqueName: "kubernetes.io/secret/14b4aa8c-1066-4388-9442-07722e4c76c2-apiservice-cert") pod "packageserver-d55dfcdfc-m4bs6" (UID: "14b4aa8c-1066-4388-9442-07722e4c76c2") : failed to sync secret cache: timed out waiting for the condition
Mar 09 13:24:37 crc kubenswrapper[4764]: E0309 13:24:37.279890 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b9c0d96b-ed96-4925-b890-8743879a8b38-default-certificate podName:b9c0d96b-ed96-4925-b890-8743879a8b38 nodeName:}" failed. No retries permitted until 2026-03-09 13:24:37.779871414 +0000 UTC m=+233.030043322 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "default-certificate" (UniqueName: "kubernetes.io/secret/b9c0d96b-ed96-4925-b890-8743879a8b38-default-certificate") pod "router-default-5444994796-gnnbl" (UID: "b9c0d96b-ed96-4925-b890-8743879a8b38") : failed to sync secret cache: timed out waiting for the condition
Mar 09 13:24:37 crc kubenswrapper[4764]: E0309 13:24:37.279919 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b72bd4db-e5ea-44f6-bdce-81df2966acfb-srv-cert podName:b72bd4db-e5ea-44f6-bdce-81df2966acfb nodeName:}" failed. No retries permitted until 2026-03-09 13:24:37.779909355 +0000 UTC m=+233.030081343 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "srv-cert" (UniqueName: "kubernetes.io/secret/b72bd4db-e5ea-44f6-bdce-81df2966acfb-srv-cert") pod "catalog-operator-68c6474976-m9kmt" (UID: "b72bd4db-e5ea-44f6-bdce-81df2966acfb") : failed to sync secret cache: timed out waiting for the condition
Mar 09 13:24:37 crc kubenswrapper[4764]: E0309 13:24:37.279948 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/30a07c97-9d99-41be-956e-ba3d6505d318-srv-cert podName:30a07c97-9d99-41be-956e-ba3d6505d318 nodeName:}" failed. No retries permitted until 2026-03-09 13:24:37.779933616 +0000 UTC m=+233.030105754 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "srv-cert" (UniqueName: "kubernetes.io/secret/30a07c97-9d99-41be-956e-ba3d6505d318-srv-cert") pod "olm-operator-6b444d44fb-ptbpd" (UID: "30a07c97-9d99-41be-956e-ba3d6505d318") : failed to sync secret cache: timed out waiting for the condition
Mar 09 13:24:37 crc kubenswrapper[4764]: E0309 13:24:37.281114 4764 configmap.go:193] Couldn't get configMap openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-config: failed to sync configmap cache: timed out waiting for the condition
Mar 09 13:24:37 crc kubenswrapper[4764]: E0309 13:24:37.281149 4764 secret.go:188] Couldn't get secret openshift-ingress/router-stats-default: failed to sync secret cache: timed out waiting for the condition
Mar 09 13:24:37 crc kubenswrapper[4764]: E0309 13:24:37.281178 4764 secret.go:188] Couldn't get secret openshift-marketplace/marketplace-operator-metrics: failed to sync secret cache: timed out waiting for the condition
Mar 09 13:24:37 crc kubenswrapper[4764]: E0309 13:24:37.281195 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b9c0d96b-ed96-4925-b890-8743879a8b38-stats-auth podName:b9c0d96b-ed96-4925-b890-8743879a8b38 nodeName:}" failed. No retries permitted until 2026-03-09 13:24:37.78118266 +0000 UTC m=+233.031354668 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "stats-auth" (UniqueName: "kubernetes.io/secret/b9c0d96b-ed96-4925-b890-8743879a8b38-stats-auth") pod "router-default-5444994796-gnnbl" (UID: "b9c0d96b-ed96-4925-b890-8743879a8b38") : failed to sync secret cache: timed out waiting for the condition
Mar 09 13:24:37 crc kubenswrapper[4764]: E0309 13:24:37.281207 4764 secret.go:188] Couldn't get secret openshift-operator-lifecycle-manager/packageserver-service-cert: failed to sync secret cache: timed out waiting for the condition
Mar 09 13:24:37 crc kubenswrapper[4764]: E0309 13:24:37.281339 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/711447e6-e7cf-4577-8050-b5a391f96f6a-config podName:711447e6-e7cf-4577-8050-b5a391f96f6a nodeName:}" failed. No retries permitted until 2026-03-09 13:24:37.781285202 +0000 UTC m=+233.031457220 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/configmap/711447e6-e7cf-4577-8050-b5a391f96f6a-config") pod "openshift-kube-scheduler-operator-5fdd9b5758-xfvtf" (UID: "711447e6-e7cf-4577-8050-b5a391f96f6a") : failed to sync configmap cache: timed out waiting for the condition
Mar 09 13:24:37 crc kubenswrapper[4764]: E0309 13:24:37.281210 4764 configmap.go:193] Couldn't get configMap openshift-marketplace/marketplace-trusted-ca: failed to sync configmap cache: timed out waiting for the condition
Mar 09 13:24:37 crc kubenswrapper[4764]: E0309 13:24:37.281382 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/14b4aa8c-1066-4388-9442-07722e4c76c2-webhook-cert podName:14b4aa8c-1066-4388-9442-07722e4c76c2 nodeName:}" failed. No retries permitted until 2026-03-09 13:24:37.781353824 +0000 UTC m=+233.031525732 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "webhook-cert" (UniqueName: "kubernetes.io/secret/14b4aa8c-1066-4388-9442-07722e4c76c2-webhook-cert") pod "packageserver-d55dfcdfc-m4bs6" (UID: "14b4aa8c-1066-4388-9442-07722e4c76c2") : failed to sync secret cache: timed out waiting for the condition
Mar 09 13:24:37 crc kubenswrapper[4764]: E0309 13:24:37.281405 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1ccc5b44-95ad-4f4c-8086-c176c41bbd19-marketplace-operator-metrics podName:1ccc5b44-95ad-4f4c-8086-c176c41bbd19 nodeName:}" failed. No retries permitted until 2026-03-09 13:24:37.781395595 +0000 UTC m=+233.031567493 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "marketplace-operator-metrics" (UniqueName: "kubernetes.io/secret/1ccc5b44-95ad-4f4c-8086-c176c41bbd19-marketplace-operator-metrics") pod "marketplace-operator-79b997595-d4gwh" (UID: "1ccc5b44-95ad-4f4c-8086-c176c41bbd19") : failed to sync secret cache: timed out waiting for the condition
Mar 09 13:24:37 crc kubenswrapper[4764]: E0309 13:24:37.281443 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/1ccc5b44-95ad-4f4c-8086-c176c41bbd19-marketplace-trusted-ca podName:1ccc5b44-95ad-4f4c-8086-c176c41bbd19 nodeName:}" failed. No retries permitted until 2026-03-09 13:24:37.781428076 +0000 UTC m=+233.031599994 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "marketplace-trusted-ca" (UniqueName: "kubernetes.io/configmap/1ccc5b44-95ad-4f4c-8086-c176c41bbd19-marketplace-trusted-ca") pod "marketplace-operator-79b997595-d4gwh" (UID: "1ccc5b44-95ad-4f4c-8086-c176c41bbd19") : failed to sync configmap cache: timed out waiting for the condition
Mar 09 13:24:37 crc kubenswrapper[4764]: I0309 13:24:37.296363 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config"
Mar 09 13:24:37 crc kubenswrapper[4764]: I0309 13:24:37.316191 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt"
Mar 09 13:24:37 crc kubenswrapper[4764]: I0309 13:24:37.337002 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt"
Mar 09 13:24:37 crc kubenswrapper[4764]: I0309 13:24:37.357031 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert"
Mar 09 13:24:37 crc kubenswrapper[4764]: I0309 13:24:37.377319 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default"
Mar 09 13:24:37 crc kubenswrapper[4764]: I0309 13:24:37.397166 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default"
Mar 09 13:24:37 crc kubenswrapper[4764]: I0309 13:24:37.416696 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86"
Mar 09 13:24:37 crc kubenswrapper[4764]: I0309 13:24:37.437216 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle"
Mar 09 13:24:37 crc kubenswrapper[4764]: I0309 13:24:37.442013 4764 generic.go:334] "Generic (PLEG): container finished" podID="1ffb8d96-e6e4-4859-ae7d-37f900979485" containerID="2327438a6c9ed8c1a989acd23de7dd26ca5827f91ab7de12fe2f05d2d7bc5774" exitCode=0
Mar 09 13:24:37 crc kubenswrapper[4764]: I0309 13:24:37.442089 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-gst9d" event={"ID":"1ffb8d96-e6e4-4859-ae7d-37f900979485","Type":"ContainerDied","Data":"2327438a6c9ed8c1a989acd23de7dd26ca5827f91ab7de12fe2f05d2d7bc5774"}
Mar 09 13:24:37 crc kubenswrapper[4764]: I0309 13:24:37.442140 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-gst9d" event={"ID":"1ffb8d96-e6e4-4859-ae7d-37f900979485","Type":"ContainerStarted","Data":"1e3c772518ebee198daf071259de4022c5e3a22024e2e3619714d0e5c88c5454"}
Mar 09 13:24:37 crc kubenswrapper[4764]: I0309 13:24:37.456703 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default"
Mar 09 13:24:37 crc kubenswrapper[4764]: I0309 13:24:37.478768 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt"
Mar 09 13:24:37 crc kubenswrapper[4764]: I0309 13:24:37.498178 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg"
Mar 09 13:24:37 crc kubenswrapper[4764]: I0309 13:24:37.519329 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt"
Mar 09 13:24:37 crc kubenswrapper[4764]: I0309 13:24:37.537284 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt"
Mar 09 13:24:37 crc kubenswrapper[4764]: I0309 13:24:37.564692 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca"
Mar 09 13:24:37 crc kubenswrapper[4764]: I0309 13:24:37.578258 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics"
Mar 09 13:24:37 crc kubenswrapper[4764]: I0309 13:24:37.598037 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt"
Mar 09 13:24:37 crc kubenswrapper[4764]: I0309 13:24:37.617507 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw"
Mar 09 13:24:37 crc kubenswrapper[4764]: I0309 13:24:37.637250 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert"
Mar 09 13:24:37 crc kubenswrapper[4764]: I0309 13:24:37.656796 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config"
Mar 09 13:24:37 crc kubenswrapper[4764]: I0309 13:24:37.677011 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert"
Mar 09 13:24:37 crc kubenswrapper[4764]: I0309 13:24:37.696879 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 09 13:24:37 crc kubenswrapper[4764]: I0309 13:24:37.717174 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 09 13:24:37 crc kubenswrapper[4764]: I0309 13:24:37.736935 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert"
Mar 09 13:24:37 crc kubenswrapper[4764]: I0309 13:24:37.778185 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx"
Mar 09 13:24:37 crc kubenswrapper[4764]: I0309 13:24:37.796874 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt"
Mar 09 13:24:37 crc kubenswrapper[4764]: I0309 13:24:37.807280 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/30a07c97-9d99-41be-956e-ba3d6505d318-srv-cert\") pod \"olm-operator-6b444d44fb-ptbpd\" (UID: \"30a07c97-9d99-41be-956e-ba3d6505d318\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-ptbpd"
Mar 09 13:24:37 crc kubenswrapper[4764]: I0309 13:24:37.807354 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1ccc5b44-95ad-4f4c-8086-c176c41bbd19-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-d4gwh\" (UID: \"1ccc5b44-95ad-4f4c-8086-c176c41bbd19\") " pod="openshift-marketplace/marketplace-operator-79b997595-d4gwh"
Mar 09 13:24:37 crc kubenswrapper[4764]: I0309 13:24:37.807415 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/1ccc5b44-95ad-4f4c-8086-c176c41bbd19-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-d4gwh\" (UID: \"1ccc5b44-95ad-4f4c-8086-c176c41bbd19\") " pod="openshift-marketplace/marketplace-operator-79b997595-d4gwh"
Mar 09 13:24:37 crc kubenswrapper[4764]: I0309 13:24:37.807431 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/14b4aa8c-1066-4388-9442-07722e4c76c2-webhook-cert\") pod \"packageserver-d55dfcdfc-m4bs6\" (UID: \"14b4aa8c-1066-4388-9442-07722e4c76c2\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-m4bs6"
Mar 09 13:24:37 crc kubenswrapper[4764]: I0309 13:24:37.807458 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/711447e6-e7cf-4577-8050-b5a391f96f6a-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-xfvtf\" (UID: \"711447e6-e7cf-4577-8050-b5a391f96f6a\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-xfvtf"
Mar 09 13:24:37 crc kubenswrapper[4764]: I0309 13:24:37.807502 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/b9c0d96b-ed96-4925-b890-8743879a8b38-stats-auth\") pod \"router-default-5444994796-gnnbl\" (UID: \"b9c0d96b-ed96-4925-b890-8743879a8b38\") " pod="openshift-ingress/router-default-5444994796-gnnbl"
Mar 09 13:24:37 crc kubenswrapper[4764]: I0309 13:24:37.808951 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/b9c0d96b-ed96-4925-b890-8743879a8b38-default-certificate\") pod \"router-default-5444994796-gnnbl\" (UID: \"b9c0d96b-ed96-4925-b890-8743879a8b38\") " pod="openshift-ingress/router-default-5444994796-gnnbl"
Mar 09 13:24:37 crc kubenswrapper[4764]: I0309 13:24:37.809003 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b9c0d96b-ed96-4925-b890-8743879a8b38-service-ca-bundle\") pod \"router-default-5444994796-gnnbl\" (UID: \"b9c0d96b-ed96-4925-b890-8743879a8b38\") " pod="openshift-ingress/router-default-5444994796-gnnbl"
Mar 09 13:24:37 crc kubenswrapper[4764]: I0309 13:24:37.809034 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b72bd4db-e5ea-44f6-bdce-81df2966acfb-srv-cert\") pod \"catalog-operator-68c6474976-m9kmt\" (UID: \"b72bd4db-e5ea-44f6-bdce-81df2966acfb\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-m9kmt"
Mar 09 13:24:37 crc kubenswrapper[4764]: I0309 13:24:37.809109 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b9c0d96b-ed96-4925-b890-8743879a8b38-metrics-certs\") pod \"router-default-5444994796-gnnbl\" (UID: \"b9c0d96b-ed96-4925-b890-8743879a8b38\") " pod="openshift-ingress/router-default-5444994796-gnnbl"
Mar 09 13:24:37 crc kubenswrapper[4764]: I0309 13:24:37.809131 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/14b4aa8c-1066-4388-9442-07722e4c76c2-apiservice-cert\") pod \"packageserver-d55dfcdfc-m4bs6\" (UID: \"14b4aa8c-1066-4388-9442-07722e4c76c2\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-m4bs6"
Mar 09 13:24:37 crc kubenswrapper[4764]: I0309 13:24:37.809169 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/711447e6-e7cf-4577-8050-b5a391f96f6a-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-xfvtf\" (UID: \"711447e6-e7cf-4577-8050-b5a391f96f6a\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-xfvtf"
Mar 09 13:24:37 crc kubenswrapper[4764]: I0309 13:24:37.809545 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1ccc5b44-95ad-4f4c-8086-c176c41bbd19-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-d4gwh\" (UID: \"1ccc5b44-95ad-4f4c-8086-c176c41bbd19\") " pod="openshift-marketplace/marketplace-operator-79b997595-d4gwh"
Mar 09 13:24:37 crc kubenswrapper[4764]: I0309 13:24:37.810231 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b9c0d96b-ed96-4925-b890-8743879a8b38-service-ca-bundle\") pod \"router-default-5444994796-gnnbl\" (UID: \"b9c0d96b-ed96-4925-b890-8743879a8b38\") " pod="openshift-ingress/router-default-5444994796-gnnbl"
Mar 09 13:24:37 crc kubenswrapper[4764]: I0309 13:24:37.812599 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/1ccc5b44-95ad-4f4c-8086-c176c41bbd19-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-d4gwh\" (UID: \"1ccc5b44-95ad-4f4c-8086-c176c41bbd19\") " pod="openshift-marketplace/marketplace-operator-79b997595-d4gwh"
Mar 09 13:24:37 crc kubenswrapper[4764]: I0309 13:24:37.811891 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/14b4aa8c-1066-4388-9442-07722e4c76c2-webhook-cert\") pod \"packageserver-d55dfcdfc-m4bs6\" (UID: \"14b4aa8c-1066-4388-9442-07722e4c76c2\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-m4bs6"
Mar 09 13:24:37 crc kubenswrapper[4764]: I0309 13:24:37.812779 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/b9c0d96b-ed96-4925-b890-8743879a8b38-stats-auth\") pod \"router-default-5444994796-gnnbl\" (UID: \"b9c0d96b-ed96-4925-b890-8743879a8b38\") " pod="openshift-ingress/router-default-5444994796-gnnbl"
Mar 09 13:24:37 crc kubenswrapper[4764]: I0309 13:24:37.813175 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/b9c0d96b-ed96-4925-b890-8743879a8b38-default-certificate\") pod \"router-default-5444994796-gnnbl\" (UID: \"b9c0d96b-ed96-4925-b890-8743879a8b38\") " pod="openshift-ingress/router-default-5444994796-gnnbl"
Mar 09 13:24:37 crc kubenswrapper[4764]: I0309 13:24:37.813197 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b72bd4db-e5ea-44f6-bdce-81df2966acfb-srv-cert\") pod \"catalog-operator-68c6474976-m9kmt\" (UID: \"b72bd4db-e5ea-44f6-bdce-81df2966acfb\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-m9kmt"
Mar 09 13:24:37 crc kubenswrapper[4764]: I0309 13:24:37.813266 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/14b4aa8c-1066-4388-9442-07722e4c76c2-apiservice-cert\") pod \"packageserver-d55dfcdfc-m4bs6\" (UID: \"14b4aa8c-1066-4388-9442-07722e4c76c2\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-m4bs6"
Mar 09 13:24:37 crc kubenswrapper[4764]: I0309 13:24:37.813267 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b9c0d96b-ed96-4925-b890-8743879a8b38-metrics-certs\") pod \"router-default-5444994796-gnnbl\" (UID: \"b9c0d96b-ed96-4925-b890-8743879a8b38\") " pod="openshift-ingress/router-default-5444994796-gnnbl"
Mar 09 13:24:37 crc kubenswrapper[4764]: I0309 13:24:37.813990 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/30a07c97-9d99-41be-956e-ba3d6505d318-srv-cert\") pod \"olm-operator-6b444d44fb-ptbpd\" (UID: \"30a07c97-9d99-41be-956e-ba3d6505d318\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-ptbpd"
Mar 09 13:24:37 crc kubenswrapper[4764]: I0309 13:24:37.817611 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt"
Mar 09 13:24:37 crc kubenswrapper[4764]: I0309 13:24:37.836594 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert"
Mar 09 13:24:37 crc kubenswrapper[4764]: I0309 13:24:37.874760 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tdsds\" (UniqueName: \"kubernetes.io/projected/f9b3244b-8df0-4330-9887-4092260d416a-kube-api-access-tdsds\") pod \"oauth-openshift-558db77b4-nj856\" (UID: \"f9b3244b-8df0-4330-9887-4092260d416a\") " pod="openshift-authentication/oauth-openshift-558db77b4-nj856"
Mar 09 13:24:37 crc kubenswrapper[4764]: I0309 13:24:37.891533 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8x6ch\" (UniqueName: \"kubernetes.io/projected/baee6113-40a8-468e-b343-09e9afd65ce3-kube-api-access-8x6ch\") pod \"dns-operator-744455d44c-c2m6k\" (UID: \"baee6113-40a8-468e-b343-09e9afd65ce3\") " pod="openshift-dns-operator/dns-operator-744455d44c-c2m6k"
Mar 09 13:24:37 crc kubenswrapper[4764]: I0309 13:24:37.910439 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pgk9n\" (UniqueName: \"kubernetes.io/projected/6dc446a1-b77b-4f15-ae5f-0141bf374cdd-kube-api-access-pgk9n\") pod \"openshift-apiserver-operator-796bbdcf4f-k5dfc\" (UID: \"6dc446a1-b77b-4f15-ae5f-0141bf374cdd\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-k5dfc"
Mar 09 13:24:37 crc kubenswrapper[4764]: I0309 13:24:37.913530 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-c2m6k"
Mar 09 13:24:37 crc kubenswrapper[4764]: I0309 13:24:37.940064 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p79zf\" (UniqueName: \"kubernetes.io/projected/b625331d-48ab-4d48-86fd-fe73466305ff-kube-api-access-p79zf\") pod \"controller-manager-879f6c89f-tgqwl\" (UID: \"b625331d-48ab-4d48-86fd-fe73466305ff\") " pod="openshift-controller-manager/controller-manager-879f6c89f-tgqwl"
Mar 09 13:24:37 crc kubenswrapper[4764]: I0309 13:24:37.957495 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j8fzv\" (UniqueName: \"kubernetes.io/projected/23f87d2b-2a92-4abb-a2a6-2de508837343-kube-api-access-j8fzv\") pod \"etcd-operator-b45778765-nnrmm\" (UID: \"23f87d2b-2a92-4abb-a2a6-2de508837343\") " pod="openshift-etcd-operator/etcd-operator-b45778765-nnrmm"
Mar 09 13:24:37 crc kubenswrapper[4764]: I0309 13:24:37.958111 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-k5dfc"
Mar 09 13:24:37 crc kubenswrapper[4764]: I0309 13:24:37.971864 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lrcvk\" (UniqueName: \"kubernetes.io/projected/4951d770-ae8c-470a-982a-807c82112722-kube-api-access-lrcvk\") pod \"authentication-operator-69f744f599-ptjnd\" (UID: \"4951d770-ae8c-470a-982a-807c82112722\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-ptjnd"
Mar 09 13:24:37 crc kubenswrapper[4764]: I0309 13:24:37.993802 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k77xq\" (UniqueName: \"kubernetes.io/projected/cf3d7b2a-75e7-4c07-9211-b66c64c15def-kube-api-access-k77xq\") pod \"openshift-controller-manager-operator-756b6f6bc6-klv4l\" (UID: \"cf3d7b2a-75e7-4c07-9211-b66c64c15def\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-klv4l"
Mar 09 13:24:37 crc kubenswrapper[4764]: I0309 13:24:37.997891 4764 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k"
Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.016790 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt"
Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.037001 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-ptjnd"
Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.038469 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt"
Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.068043 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-tgqwl"
Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.078801 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vn7dv\" (UniqueName: \"kubernetes.io/projected/f95ed010-a6a4-49ab-b61b-fc4ee2d856bb-kube-api-access-vn7dv\") pod \"apiserver-7bbb656c7d-mp9p7\" (UID: \"f95ed010-a6a4-49ab-b61b-fc4ee2d856bb\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-mp9p7"
Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.101502 4764 request.go:700] Waited for 1.92812916s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-dns/configmaps?fieldSelector=metadata.name%3Ddns-default&limit=500&resourceVersion=0
Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.103107 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default"
Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.105521 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ndfng\" (UniqueName: \"kubernetes.io/projected/61b85db0-a292-42a8-8296-d0e476d80c89-kube-api-access-ndfng\") pod \"openshift-config-operator-7777fb866f-fxv7j\" (UID: \"61b85db0-a292-42a8-8296-d0e476d80c89\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-fxv7j"
Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.116747 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh"
Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.117788 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-c2m6k"]
Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.120352 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-klv4l"
Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.137397 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls"
Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.151340 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-nj856"
Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.163763 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-k5dfc"]
Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.175035 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-48tpb\" (UniqueName: \"kubernetes.io/projected/ad49b4d8-1218-4a34-8455-831d0f563cbf-kube-api-access-48tpb\") pod \"route-controller-manager-6576b87f9c-mvq6r\" (UID: \"ad49b4d8-1218-4a34-8455-831d0f563cbf\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mvq6r"
Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.187231 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mvq6r"
Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.193007 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vc9tr\" (UniqueName: \"kubernetes.io/projected/4125448d-5832-43c2-8dba-d95adde7458a-kube-api-access-vc9tr\") pod \"machine-api-operator-5694c8668f-t8ft9\" (UID: \"4125448d-5832-43c2-8dba-d95adde7458a\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-t8ft9"
Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.214988 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2qxpr\" (UniqueName: \"kubernetes.io/projected/e025e897-5ff3-476b-81c9-afdd0ae7a25f-kube-api-access-2qxpr\") pod \"cluster-image-registry-operator-dc59b4c8b-84448\" (UID: \"e025e897-5ff3-476b-81c9-afdd0ae7a25f\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-84448"
Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.225134 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-ptjnd"]
Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.231233 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e025e897-5ff3-476b-81c9-afdd0ae7a25f-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-84448\" (UID: \"e025e897-5ff3-476b-81c9-afdd0ae7a25f\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-84448"
Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.241400 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-mp9p7"
Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.250001 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-nnrmm"
Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.257228 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls"
Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.269354 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9f9zl\" (UniqueName: \"kubernetes.io/projected/a1b4dc0b-edea-4c0d-8d61-3e3d3133605d-kube-api-access-9f9zl\") pod \"console-f9d7485db-8g9lj\" (UID: \"a1b4dc0b-edea-4c0d-8d61-3e3d3133605d\") " pod="openshift-console/console-f9d7485db-8g9lj"
Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.278519 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd"
Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.293596 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-tgqwl"]
Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.300305 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token"
Mar 09 13:24:38 crc kubenswrapper[4764]: W0309 13:24:38.317869 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb625331d_48ab_4d48_86fd_fe73466305ff.slice/crio-22fd4fd8035fc57032bbb13916cd7b7736a6fafcf098f02294f9432a8954ab17 WatchSource:0}: Error finding container 22fd4fd8035fc57032bbb13916cd7b7736a6fafcf098f02294f9432a8954ab17: Status 404 returned error can't find the container with id 22fd4fd8035fc57032bbb13916cd7b7736a6fafcf098f02294f9432a8954ab17
Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.334790 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bgwzj\" (UniqueName: \"kubernetes.io/projected/ee2ad8bf-7cf9-4bab-9638-b26d9c593188-kube-api-access-bgwzj\") pod \"console-operator-58897d9998-sp2mq\" (UID: \"ee2ad8bf-7cf9-4bab-9638-b26d9c593188\") " pod="openshift-console-operator/console-operator-58897d9998-sp2mq"
Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.369043 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-klv4l"]
Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.376414 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mgbg9\" (UniqueName: \"kubernetes.io/projected/b72bd4db-e5ea-44f6-bdce-81df2966acfb-kube-api-access-mgbg9\") pod \"catalog-operator-68c6474976-m9kmt\" (UID: \"b72bd4db-e5ea-44f6-bdce-81df2966acfb\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-m9kmt"
Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.394617 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-fxv7j"
Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.396392 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f5qdx\" (UniqueName: \"kubernetes.io/projected/db0ec273-54d8-4753-b519-243b727a9efd-kube-api-access-f5qdx\") pod \"package-server-manager-789f6589d5-b2pz8\" (UID: \"db0ec273-54d8-4753-b519-243b727a9efd\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-b2pz8"
Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.416248 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g47h8\" (UniqueName: \"kubernetes.io/projected/0a005f65-920a-4cdd-b4da-a270953113aa-kube-api-access-g47h8\") pod \"auto-csr-approver-29551044-p748f\" (UID: \"0a005f65-920a-4cdd-b4da-a270953113aa\") " pod="openshift-infra/auto-csr-approver-29551044-p748f"
Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.429209 4764 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-t8ft9" Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.452429 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jpcrn\" (UniqueName: \"kubernetes.io/projected/4912a02a-743d-4bbe-9063-7d99ccd3329a-kube-api-access-jpcrn\") pod \"machine-config-operator-74547568cd-2pfhk\" (UID: \"4912a02a-743d-4bbe-9063-7d99ccd3329a\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-2pfhk" Mar 09 13:24:38 crc kubenswrapper[4764]: W0309 13:24:38.456867 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcf3d7b2a_75e7_4c07_9211_b66c64c15def.slice/crio-3a2f5d25288e8bc86cb3cee224bc80c295dc724258bc8041325b003c8ff187db WatchSource:0}: Error finding container 3a2f5d25288e8bc86cb3cee224bc80c295dc724258bc8041325b003c8ff187db: Status 404 returned error can't find the container with id 3a2f5d25288e8bc86cb3cee224bc80c295dc724258bc8041325b003c8ff187db Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.459280 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fdqjn\" (UniqueName: \"kubernetes.io/projected/4ba55602-0e3f-4722-b437-546732351bc4-kube-api-access-fdqjn\") pod \"control-plane-machine-set-operator-78cbb6b69f-9k28f\" (UID: \"4ba55602-0e3f-4722-b437-546732351bc4\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-9k28f" Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.462908 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-gst9d" event={"ID":"1ffb8d96-e6e4-4859-ae7d-37f900979485","Type":"ContainerStarted","Data":"c416f2dc40385063724840456179604ca4883ff273e05587d4d08e7c6e5aa92a"} Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.462952 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-apiserver/apiserver-76f77b778f-gst9d" event={"ID":"1ffb8d96-e6e4-4859-ae7d-37f900979485","Type":"ContainerStarted","Data":"44e73e4e6b5264ba8304d7c2411b0e137ff9d4aa3edb572b0be2d425d0cdc0e1"} Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.463909 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-c2m6k" event={"ID":"baee6113-40a8-468e-b343-09e9afd65ce3","Type":"ContainerStarted","Data":"6fb4329e2fa9390478b8f6dd1725de8444e1017e3158aebde222bd496b98fb80"} Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.469289 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-ptjnd" event={"ID":"4951d770-ae8c-470a-982a-807c82112722","Type":"ContainerStarted","Data":"71223c683cd0648308c14fd45ca542bb5d131f54ba2d79963157873d6c25e727"} Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.473590 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b2l2r\" (UniqueName: \"kubernetes.io/projected/ad307984-e46b-466b-8a5c-63a00976fbbf-kube-api-access-b2l2r\") pod \"cluster-samples-operator-665b6dd947-5g2pp\" (UID: \"ad307984-e46b-466b-8a5c-63a00976fbbf\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-5g2pp" Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.482552 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-mvq6r"] Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.485171 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-k5dfc" event={"ID":"6dc446a1-b77b-4f15-ae5f-0141bf374cdd","Type":"ContainerStarted","Data":"17d5b491d947e91ddf28c82de6dbc1281f0bec8b013043771e2e286b4c427bba"} Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.489322 4764 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-controller-manager/controller-manager-879f6c89f-tgqwl" event={"ID":"b625331d-48ab-4d48-86fd-fe73466305ff","Type":"ContainerStarted","Data":"22fd4fd8035fc57032bbb13916cd7b7736a6fafcf098f02294f9432a8954ab17"} Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.491633 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d0959a00-2a83-457f-bcba-7d4af48b11c3-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-tvsss\" (UID: \"d0959a00-2a83-457f-bcba-7d4af48b11c3\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-tvsss" Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.497018 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-sp2mq" Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.509298 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-nj856"] Mar 09 13:24:38 crc kubenswrapper[4764]: W0309 13:24:38.511793 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podad49b4d8_1218_4a34_8455_831d0f563cbf.slice/crio-3ebce9784954b6d704d045dc1cbbc83b452607e4967b3e3637a00f82a3e69729 WatchSource:0}: Error finding container 3ebce9784954b6d704d045dc1cbbc83b452607e4967b3e3637a00f82a3e69729: Status 404 returned error can't find the container with id 3ebce9784954b6d704d045dc1cbbc83b452607e4967b3e3637a00f82a3e69729 Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.518962 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-84448" Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.519595 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-67pl5\" (UniqueName: \"kubernetes.io/projected/b9c0d96b-ed96-4925-b890-8743879a8b38-kube-api-access-67pl5\") pod \"router-default-5444994796-gnnbl\" (UID: \"b9c0d96b-ed96-4925-b890-8743879a8b38\") " pod="openshift-ingress/router-default-5444994796-gnnbl" Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.529157 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-8g9lj" Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.540891 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/711447e6-e7cf-4577-8050-b5a391f96f6a-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-xfvtf\" (UID: \"711447e6-e7cf-4577-8050-b5a391f96f6a\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-xfvtf" Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.541091 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551044-p748f" Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.550829 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-m9kmt" Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.556456 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ss45h\" (UniqueName: \"kubernetes.io/projected/39da5087-79bc-4154-b340-22183d9e4417-kube-api-access-ss45h\") pod \"collect-profiles-29551035-5fpgn\" (UID: \"39da5087-79bc-4154-b340-22183d9e4417\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551035-5fpgn" Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.562936 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-nnrmm"] Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.566463 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-5g2pp" Mar 09 13:24:38 crc kubenswrapper[4764]: W0309 13:24:38.573330 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf9b3244b_8df0_4330_9887_4092260d416a.slice/crio-42c43e046cf7b3c37122d69fdadfdf37b294da8c4d73ad8e7c4ac09039e43f1c WatchSource:0}: Error finding container 42c43e046cf7b3c37122d69fdadfdf37b294da8c4d73ad8e7c4ac09039e43f1c: Status 404 returned error can't find the container with id 42c43e046cf7b3c37122d69fdadfdf37b294da8c4d73ad8e7c4ac09039e43f1c Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.582873 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s85j9\" (UniqueName: \"kubernetes.io/projected/1ccc5b44-95ad-4f4c-8086-c176c41bbd19-kube-api-access-s85j9\") pod \"marketplace-operator-79b997595-d4gwh\" (UID: \"1ccc5b44-95ad-4f4c-8086-c176c41bbd19\") " pod="openshift-marketplace/marketplace-operator-79b997595-d4gwh" Mar 09 13:24:38 crc kubenswrapper[4764]: W0309 13:24:38.593141 4764 manager.go:1169] Failed to process 
watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod23f87d2b_2a92_4abb_a2a6_2de508837343.slice/crio-06f569219e270618320b5e9344203a70dabe7cff34d14271dabece2d5bfeef8b WatchSource:0}: Error finding container 06f569219e270618320b5e9344203a70dabe7cff34d14271dabece2d5bfeef8b: Status 404 returned error can't find the container with id 06f569219e270618320b5e9344203a70dabe7cff34d14271dabece2d5bfeef8b Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.594495 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w9vzq\" (UniqueName: \"kubernetes.io/projected/14b4aa8c-1066-4388-9442-07722e4c76c2-kube-api-access-w9vzq\") pod \"packageserver-d55dfcdfc-m4bs6\" (UID: \"14b4aa8c-1066-4388-9442-07722e4c76c2\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-m4bs6" Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.617035 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-mp9p7"] Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.634244 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdclx\" (UniqueName: \"kubernetes.io/projected/30a07c97-9d99-41be-956e-ba3d6505d318-kube-api-access-rdclx\") pod \"olm-operator-6b444d44fb-ptbpd\" (UID: \"30a07c97-9d99-41be-956e-ba3d6505d318\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-ptbpd" Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.667783 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-fxv7j"] Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.668924 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29551035-5fpgn" Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.669465 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-2pfhk" Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.669877 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-tvsss" Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.673840 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vnv7g\" (UniqueName: \"kubernetes.io/projected/8d3de7d9-c3ef-4ead-8ff7-81d34bf2e497-kube-api-access-vnv7g\") pod \"service-ca-operator-777779d784-svr2n\" (UID: \"8d3de7d9-c3ef-4ead-8ff7-81d34bf2e497\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-svr2n" Mar 09 13:24:38 crc kubenswrapper[4764]: W0309 13:24:38.683129 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf95ed010_a6a4_49ab_b61b_fc4ee2d856bb.slice/crio-cfd2c52e9451edb42c1574c3bfe44c1c8e89acf647b496a5ee4bfe6db1b87112 WatchSource:0}: Error finding container cfd2c52e9451edb42c1574c3bfe44c1c8e89acf647b496a5ee4bfe6db1b87112: Status 404 returned error can't find the container with id cfd2c52e9451edb42c1574c3bfe44c1c8e89acf647b496a5ee4bfe6db1b87112 Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.691297 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-b2pz8" Mar 09 13:24:38 crc kubenswrapper[4764]: W0309 13:24:38.718976 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod61b85db0_a292_42a8_8296_d0e476d80c89.slice/crio-f998595181102b6c527dff000f09d875d8d715e5310d9ab7d8ba0e72dc742c89 WatchSource:0}: Error finding container f998595181102b6c527dff000f09d875d8d715e5310d9ab7d8ba0e72dc742c89: Status 404 returned error can't find the container with id f998595181102b6c527dff000f09d875d8d715e5310d9ab7d8ba0e72dc742c89 Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.723408 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-t8ft9"] Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.726075 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/f4f31f9d-d2da-4e3d-b002-2f477d344fc0-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-kt67m\" (UID: \"f4f31f9d-d2da-4e3d-b002-2f477d344fc0\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-kt67m" Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.726123 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gpnqx\" (UniqueName: \"kubernetes.io/projected/2c795327-bff9-453d-b1b2-d48a2ef2b48f-kube-api-access-gpnqx\") pod \"migrator-59844c95c7-b6nrf\" (UID: \"2c795327-bff9-453d-b1b2-d48a2ef2b48f\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-b6nrf" Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.726143 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/1fe7eff7-ce11-4d46-bf25-06162522c1ff-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-6brcn\" (UID: \"1fe7eff7-ce11-4d46-bf25-06162522c1ff\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-6brcn" Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.726161 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bdhqw\" (UniqueName: \"kubernetes.io/projected/2968d8f2-48fa-470b-b90e-41bb83bd77c4-kube-api-access-bdhqw\") pod \"machine-approver-56656f9798-cvsb9\" (UID: \"2968d8f2-48fa-470b-b90e-41bb83bd77c4\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-cvsb9" Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.726177 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1fe7eff7-ce11-4d46-bf25-06162522c1ff-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-6brcn\" (UID: \"1fe7eff7-ce11-4d46-bf25-06162522c1ff\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-6brcn" Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.726252 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-st8mw\" (UniqueName: \"kubernetes.io/projected/d245b116-e47d-4b15-a40d-0a9fa34cf1df-kube-api-access-st8mw\") pod \"downloads-7954f5f757-x927s\" (UID: \"d245b116-e47d-4b15-a40d-0a9fa34cf1df\") " pod="openshift-console/downloads-7954f5f757-x927s" Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.726297 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/24df0ad8-c9b4-46b6-8751-23e26fc391c5-signing-cabundle\") pod \"service-ca-9c57cc56f-qddjs\" (UID: 
\"24df0ad8-c9b4-46b6-8751-23e26fc391c5\") " pod="openshift-service-ca/service-ca-9c57cc56f-qddjs" Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.726328 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/d3652fe0-4889-432f-af3f-787dd19c60d6-installation-pull-secrets\") pod \"image-registry-697d97f7c8-w49hn\" (UID: \"d3652fe0-4889-432f-af3f-787dd19c60d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-w49hn" Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.726361 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/24df0ad8-c9b4-46b6-8751-23e26fc391c5-signing-key\") pod \"service-ca-9c57cc56f-qddjs\" (UID: \"24df0ad8-c9b4-46b6-8751-23e26fc391c5\") " pod="openshift-service-ca/service-ca-9c57cc56f-qddjs" Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.726386 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d3652fe0-4889-432f-af3f-787dd19c60d6-bound-sa-token\") pod \"image-registry-697d97f7c8-w49hn\" (UID: \"d3652fe0-4889-432f-af3f-787dd19c60d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-w49hn" Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.726401 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7tw8z\" (UniqueName: \"kubernetes.io/projected/f4f31f9d-d2da-4e3d-b002-2f477d344fc0-kube-api-access-7tw8z\") pod \"machine-config-controller-84d6567774-kt67m\" (UID: \"f4f31f9d-d2da-4e3d-b002-2f477d344fc0\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-kt67m" Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.726431 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d3652fe0-4889-432f-af3f-787dd19c60d6-trusted-ca\") pod \"image-registry-697d97f7c8-w49hn\" (UID: \"d3652fe0-4889-432f-af3f-787dd19c60d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-w49hn" Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.726469 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f4f31f9d-d2da-4e3d-b002-2f477d344fc0-proxy-tls\") pod \"machine-config-controller-84d6567774-kt67m\" (UID: \"f4f31f9d-d2da-4e3d-b002-2f477d344fc0\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-kt67m" Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.726486 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/36228b47-79c7-484d-9753-6f36806aa344-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-mn5w6\" (UID: \"36228b47-79c7-484d-9753-6f36806aa344\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-mn5w6" Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.726525 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/39249ace-8681-491c-a547-869e72d297a7-metrics-tls\") pod \"ingress-operator-5b745b69d9-cd5zc\" (UID: \"39249ace-8681-491c-a547-869e72d297a7\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-cd5zc" Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.726550 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/d3652fe0-4889-432f-af3f-787dd19c60d6-registry-certificates\") pod \"image-registry-697d97f7c8-w49hn\" (UID: 
\"d3652fe0-4889-432f-af3f-787dd19c60d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-w49hn" Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.726575 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-w49hn\" (UID: \"d3652fe0-4889-432f-af3f-787dd19c60d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-w49hn" Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.726606 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/36228b47-79c7-484d-9753-6f36806aa344-config\") pod \"kube-controller-manager-operator-78b949d7b-mn5w6\" (UID: \"36228b47-79c7-484d-9753-6f36806aa344\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-mn5w6" Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.726636 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/957cfae4-9b1a-4e7f-b526-60bc1ee1a1b4-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-hf98d\" (UID: \"957cfae4-9b1a-4e7f-b526-60bc1ee1a1b4\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-hf98d" Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.726688 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4js2s\" (UniqueName: \"kubernetes.io/projected/957cfae4-9b1a-4e7f-b526-60bc1ee1a1b4-kube-api-access-4js2s\") pod \"multus-admission-controller-857f4d67dd-hf98d\" (UID: \"957cfae4-9b1a-4e7f-b526-60bc1ee1a1b4\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-hf98d" Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.726724 4764 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ftcvk\" (UniqueName: \"kubernetes.io/projected/39249ace-8681-491c-a547-869e72d297a7-kube-api-access-ftcvk\") pod \"ingress-operator-5b745b69d9-cd5zc\" (UID: \"39249ace-8681-491c-a547-869e72d297a7\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-cd5zc" Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.726743 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xsz44\" (UniqueName: \"kubernetes.io/projected/d3652fe0-4889-432f-af3f-787dd19c60d6-kube-api-access-xsz44\") pod \"image-registry-697d97f7c8-w49hn\" (UID: \"d3652fe0-4889-432f-af3f-787dd19c60d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-w49hn" Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.726769 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l68gm\" (UniqueName: \"kubernetes.io/projected/24df0ad8-c9b4-46b6-8751-23e26fc391c5-kube-api-access-l68gm\") pod \"service-ca-9c57cc56f-qddjs\" (UID: \"24df0ad8-c9b4-46b6-8751-23e26fc391c5\") " pod="openshift-service-ca/service-ca-9c57cc56f-qddjs" Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.726783 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/2968d8f2-48fa-470b-b90e-41bb83bd77c4-auth-proxy-config\") pod \"machine-approver-56656f9798-cvsb9\" (UID: \"2968d8f2-48fa-470b-b90e-41bb83bd77c4\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-cvsb9" Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.726817 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/2968d8f2-48fa-470b-b90e-41bb83bd77c4-machine-approver-tls\") pod 
\"machine-approver-56656f9798-cvsb9\" (UID: \"2968d8f2-48fa-470b-b90e-41bb83bd77c4\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-cvsb9" Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.726834 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2968d8f2-48fa-470b-b90e-41bb83bd77c4-config\") pod \"machine-approver-56656f9798-cvsb9\" (UID: \"2968d8f2-48fa-470b-b90e-41bb83bd77c4\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-cvsb9" Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.726852 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/39249ace-8681-491c-a547-869e72d297a7-trusted-ca\") pod \"ingress-operator-5b745b69d9-cd5zc\" (UID: \"39249ace-8681-491c-a547-869e72d297a7\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-cd5zc" Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.726866 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/39249ace-8681-491c-a547-869e72d297a7-bound-sa-token\") pod \"ingress-operator-5b745b69d9-cd5zc\" (UID: \"39249ace-8681-491c-a547-869e72d297a7\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-cd5zc" Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.726883 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q6ztl\" (UniqueName: \"kubernetes.io/projected/1fe7eff7-ce11-4d46-bf25-06162522c1ff-kube-api-access-q6ztl\") pod \"kube-storage-version-migrator-operator-b67b599dd-6brcn\" (UID: \"1fe7eff7-ce11-4d46-bf25-06162522c1ff\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-6brcn" Mar 09 13:24:38 crc 
kubenswrapper[4764]: I0309 13:24:38.726899 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/36228b47-79c7-484d-9753-6f36806aa344-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-mn5w6\" (UID: \"36228b47-79c7-484d-9753-6f36806aa344\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-mn5w6" Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.726924 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/d3652fe0-4889-432f-af3f-787dd19c60d6-registry-tls\") pod \"image-registry-697d97f7c8-w49hn\" (UID: \"d3652fe0-4889-432f-af3f-787dd19c60d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-w49hn" Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.726945 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/d3652fe0-4889-432f-af3f-787dd19c60d6-ca-trust-extracted\") pod \"image-registry-697d97f7c8-w49hn\" (UID: \"d3652fe0-4889-432f-af3f-787dd19c60d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-w49hn" Mar 09 13:24:38 crc kubenswrapper[4764]: E0309 13:24:38.730464 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 13:24:39.230451675 +0000 UTC m=+234.480623583 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-w49hn" (UID: "d3652fe0-4889-432f-af3f-787dd19c60d6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.735864 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-svr2n"
Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.755265 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-9k28f"
Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.772214 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-xfvtf"
Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.783362 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-sp2mq"]
Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.786970 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-gnnbl"
Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.796310 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-ptbpd"
Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.804086 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-d4gwh"
Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.819872 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-m4bs6"
Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.830834 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 09 13:24:38 crc kubenswrapper[4764]: E0309 13:24:38.830953 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 13:24:39.330927416 +0000 UTC m=+234.581099324 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.831103 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/0c464093-f30e-4c79-84cd-98a6d49b813c-metrics-tls\") pod \"dns-default-6nhpx\" (UID: \"0c464093-f30e-4c79-84cd-98a6d49b813c\") " pod="openshift-dns/dns-default-6nhpx"
Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.831155 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/2968d8f2-48fa-470b-b90e-41bb83bd77c4-machine-approver-tls\") pod \"machine-approver-56656f9798-cvsb9\" (UID: \"2968d8f2-48fa-470b-b90e-41bb83bd77c4\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-cvsb9"
Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.831177 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2968d8f2-48fa-470b-b90e-41bb83bd77c4-config\") pod \"machine-approver-56656f9798-cvsb9\" (UID: \"2968d8f2-48fa-470b-b90e-41bb83bd77c4\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-cvsb9"
Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.831219 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kxvw8\" (UniqueName: \"kubernetes.io/projected/0c464093-f30e-4c79-84cd-98a6d49b813c-kube-api-access-kxvw8\") pod \"dns-default-6nhpx\" (UID: \"0c464093-f30e-4c79-84cd-98a6d49b813c\") " pod="openshift-dns/dns-default-6nhpx"
Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.831246 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/39249ace-8681-491c-a547-869e72d297a7-trusted-ca\") pod \"ingress-operator-5b745b69d9-cd5zc\" (UID: \"39249ace-8681-491c-a547-869e72d297a7\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-cd5zc"
Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.831945 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/39249ace-8681-491c-a547-869e72d297a7-bound-sa-token\") pod \"ingress-operator-5b745b69d9-cd5zc\" (UID: \"39249ace-8681-491c-a547-869e72d297a7\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-cd5zc"
Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.831968 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q6ztl\" (UniqueName: \"kubernetes.io/projected/1fe7eff7-ce11-4d46-bf25-06162522c1ff-kube-api-access-q6ztl\") pod \"kube-storage-version-migrator-operator-b67b599dd-6brcn\" (UID: \"1fe7eff7-ce11-4d46-bf25-06162522c1ff\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-6brcn"
Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.831994 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/36228b47-79c7-484d-9753-6f36806aa344-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-mn5w6\" (UID: \"36228b47-79c7-484d-9753-6f36806aa344\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-mn5w6"
Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.832037 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/d3652fe0-4889-432f-af3f-787dd19c60d6-registry-tls\") pod \"image-registry-697d97f7c8-w49hn\" (UID: \"d3652fe0-4889-432f-af3f-787dd19c60d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-w49hn"
Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.832065 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/d3652fe0-4889-432f-af3f-787dd19c60d6-ca-trust-extracted\") pod \"image-registry-697d97f7c8-w49hn\" (UID: \"d3652fe0-4889-432f-af3f-787dd19c60d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-w49hn"
Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.832127 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qvm6v\" (UniqueName: \"kubernetes.io/projected/e56f29e6-292c-48d2-a5d7-531cfa6689f5-kube-api-access-qvm6v\") pod \"csi-hostpathplugin-2bqx9\" (UID: \"e56f29e6-292c-48d2-a5d7-531cfa6689f5\") " pod="hostpath-provisioner/csi-hostpathplugin-2bqx9"
Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.832213 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/f4f31f9d-d2da-4e3d-b002-2f477d344fc0-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-kt67m\" (UID: \"f4f31f9d-d2da-4e3d-b002-2f477d344fc0\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-kt67m"
Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.832375 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/e56f29e6-292c-48d2-a5d7-531cfa6689f5-socket-dir\") pod \"csi-hostpathplugin-2bqx9\" (UID: \"e56f29e6-292c-48d2-a5d7-531cfa6689f5\") " pod="hostpath-provisioner/csi-hostpathplugin-2bqx9"
Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.832404 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gpnqx\" (UniqueName: \"kubernetes.io/projected/2c795327-bff9-453d-b1b2-d48a2ef2b48f-kube-api-access-gpnqx\") pod \"migrator-59844c95c7-b6nrf\" (UID: \"2c795327-bff9-453d-b1b2-d48a2ef2b48f\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-b6nrf"
Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.832428 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1fe7eff7-ce11-4d46-bf25-06162522c1ff-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-6brcn\" (UID: \"1fe7eff7-ce11-4d46-bf25-06162522c1ff\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-6brcn"
Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.832452 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0c464093-f30e-4c79-84cd-98a6d49b813c-config-volume\") pod \"dns-default-6nhpx\" (UID: \"0c464093-f30e-4c79-84cd-98a6d49b813c\") " pod="openshift-dns/dns-default-6nhpx"
Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.832531 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/e56f29e6-292c-48d2-a5d7-531cfa6689f5-mountpoint-dir\") pod \"csi-hostpathplugin-2bqx9\" (UID: \"e56f29e6-292c-48d2-a5d7-531cfa6689f5\") " pod="hostpath-provisioner/csi-hostpathplugin-2bqx9"
Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.832554 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1fe7eff7-ce11-4d46-bf25-06162522c1ff-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-6brcn\" (UID: \"1fe7eff7-ce11-4d46-bf25-06162522c1ff\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-6brcn"
Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.832607 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bdhqw\" (UniqueName: \"kubernetes.io/projected/2968d8f2-48fa-470b-b90e-41bb83bd77c4-kube-api-access-bdhqw\") pod \"machine-approver-56656f9798-cvsb9\" (UID: \"2968d8f2-48fa-470b-b90e-41bb83bd77c4\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-cvsb9"
Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.832685 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-st8mw\" (UniqueName: \"kubernetes.io/projected/d245b116-e47d-4b15-a40d-0a9fa34cf1df-kube-api-access-st8mw\") pod \"downloads-7954f5f757-x927s\" (UID: \"d245b116-e47d-4b15-a40d-0a9fa34cf1df\") " pod="openshift-console/downloads-7954f5f757-x927s"
Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.832734 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mrqh5\" (UniqueName: \"kubernetes.io/projected/b7bf830c-e91a-4dd1-a3b5-64ca95d57e44-kube-api-access-mrqh5\") pod \"ingress-canary-k96kg\" (UID: \"b7bf830c-e91a-4dd1-a3b5-64ca95d57e44\") " pod="openshift-ingress-canary/ingress-canary-k96kg"
Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.832762 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/24df0ad8-c9b4-46b6-8751-23e26fc391c5-signing-cabundle\") pod \"service-ca-9c57cc56f-qddjs\" (UID: \"24df0ad8-c9b4-46b6-8751-23e26fc391c5\") " pod="openshift-service-ca/service-ca-9c57cc56f-qddjs"
Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.832841 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/d3652fe0-4889-432f-af3f-787dd19c60d6-installation-pull-secrets\") pod \"image-registry-697d97f7c8-w49hn\" (UID: \"d3652fe0-4889-432f-af3f-787dd19c60d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-w49hn"
Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.832877 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/e56f29e6-292c-48d2-a5d7-531cfa6689f5-csi-data-dir\") pod \"csi-hostpathplugin-2bqx9\" (UID: \"e56f29e6-292c-48d2-a5d7-531cfa6689f5\") " pod="hostpath-provisioner/csi-hostpathplugin-2bqx9"
Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.832925 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/24df0ad8-c9b4-46b6-8751-23e26fc391c5-signing-key\") pod \"service-ca-9c57cc56f-qddjs\" (UID: \"24df0ad8-c9b4-46b6-8751-23e26fc391c5\") " pod="openshift-service-ca/service-ca-9c57cc56f-qddjs"
Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.832951 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d3652fe0-4889-432f-af3f-787dd19c60d6-bound-sa-token\") pod \"image-registry-697d97f7c8-w49hn\" (UID: \"d3652fe0-4889-432f-af3f-787dd19c60d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-w49hn"
Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.832972 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7tw8z\" (UniqueName: \"kubernetes.io/projected/f4f31f9d-d2da-4e3d-b002-2f477d344fc0-kube-api-access-7tw8z\") pod \"machine-config-controller-84d6567774-kt67m\" (UID: \"f4f31f9d-d2da-4e3d-b002-2f477d344fc0\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-kt67m"
Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.833040 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b7bf830c-e91a-4dd1-a3b5-64ca95d57e44-cert\") pod \"ingress-canary-k96kg\" (UID: \"b7bf830c-e91a-4dd1-a3b5-64ca95d57e44\") " pod="openshift-ingress-canary/ingress-canary-k96kg"
Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.833145 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d3652fe0-4889-432f-af3f-787dd19c60d6-trusted-ca\") pod \"image-registry-697d97f7c8-w49hn\" (UID: \"d3652fe0-4889-432f-af3f-787dd19c60d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-w49hn"
Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.833170 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f4f31f9d-d2da-4e3d-b002-2f477d344fc0-proxy-tls\") pod \"machine-config-controller-84d6567774-kt67m\" (UID: \"f4f31f9d-d2da-4e3d-b002-2f477d344fc0\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-kt67m"
Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.833241 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/36228b47-79c7-484d-9753-6f36806aa344-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-mn5w6\" (UID: \"36228b47-79c7-484d-9753-6f36806aa344\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-mn5w6"
Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.833277 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/e56f29e6-292c-48d2-a5d7-531cfa6689f5-plugins-dir\") pod \"csi-hostpathplugin-2bqx9\" (UID: \"e56f29e6-292c-48d2-a5d7-531cfa6689f5\") " pod="hostpath-provisioner/csi-hostpathplugin-2bqx9"
Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.833322 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/8e5162b2-c722-4535-9adf-3af0eee24211-node-bootstrap-token\") pod \"machine-config-server-kdxg4\" (UID: \"8e5162b2-c722-4535-9adf-3af0eee24211\") " pod="openshift-machine-config-operator/machine-config-server-kdxg4"
Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.833343 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/8e5162b2-c722-4535-9adf-3af0eee24211-certs\") pod \"machine-config-server-kdxg4\" (UID: \"8e5162b2-c722-4535-9adf-3af0eee24211\") " pod="openshift-machine-config-operator/machine-config-server-kdxg4"
Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.833362 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nhg8d\" (UniqueName: \"kubernetes.io/projected/8e5162b2-c722-4535-9adf-3af0eee24211-kube-api-access-nhg8d\") pod \"machine-config-server-kdxg4\" (UID: \"8e5162b2-c722-4535-9adf-3af0eee24211\") " pod="openshift-machine-config-operator/machine-config-server-kdxg4"
Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.833399 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/39249ace-8681-491c-a547-869e72d297a7-metrics-tls\") pod \"ingress-operator-5b745b69d9-cd5zc\" (UID: \"39249ace-8681-491c-a547-869e72d297a7\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-cd5zc"
Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.833434 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/d3652fe0-4889-432f-af3f-787dd19c60d6-registry-certificates\") pod \"image-registry-697d97f7c8-w49hn\" (UID: \"d3652fe0-4889-432f-af3f-787dd19c60d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-w49hn"
Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.833464 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/36228b47-79c7-484d-9753-6f36806aa344-config\") pod \"kube-controller-manager-operator-78b949d7b-mn5w6\" (UID: \"36228b47-79c7-484d-9753-6f36806aa344\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-mn5w6"
Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.833516 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-w49hn\" (UID: \"d3652fe0-4889-432f-af3f-787dd19c60d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-w49hn"
Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.833552 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/957cfae4-9b1a-4e7f-b526-60bc1ee1a1b4-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-hf98d\" (UID: \"957cfae4-9b1a-4e7f-b526-60bc1ee1a1b4\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-hf98d"
Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.833620 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4js2s\" (UniqueName: \"kubernetes.io/projected/957cfae4-9b1a-4e7f-b526-60bc1ee1a1b4-kube-api-access-4js2s\") pod \"multus-admission-controller-857f4d67dd-hf98d\" (UID: \"957cfae4-9b1a-4e7f-b526-60bc1ee1a1b4\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-hf98d"
Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.833689 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ftcvk\" (UniqueName: \"kubernetes.io/projected/39249ace-8681-491c-a547-869e72d297a7-kube-api-access-ftcvk\") pod \"ingress-operator-5b745b69d9-cd5zc\" (UID: \"39249ace-8681-491c-a547-869e72d297a7\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-cd5zc"
Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.833726 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xsz44\" (UniqueName: \"kubernetes.io/projected/d3652fe0-4889-432f-af3f-787dd19c60d6-kube-api-access-xsz44\") pod \"image-registry-697d97f7c8-w49hn\" (UID: \"d3652fe0-4889-432f-af3f-787dd19c60d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-w49hn"
Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.833775 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/e56f29e6-292c-48d2-a5d7-531cfa6689f5-registration-dir\") pod \"csi-hostpathplugin-2bqx9\" (UID: \"e56f29e6-292c-48d2-a5d7-531cfa6689f5\") " pod="hostpath-provisioner/csi-hostpathplugin-2bqx9"
Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.833797 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/2968d8f2-48fa-470b-b90e-41bb83bd77c4-auth-proxy-config\") pod \"machine-approver-56656f9798-cvsb9\" (UID: \"2968d8f2-48fa-470b-b90e-41bb83bd77c4\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-cvsb9"
Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.833834 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l68gm\" (UniqueName: \"kubernetes.io/projected/24df0ad8-c9b4-46b6-8751-23e26fc391c5-kube-api-access-l68gm\") pod \"service-ca-9c57cc56f-qddjs\" (UID: \"24df0ad8-c9b4-46b6-8751-23e26fc391c5\") " pod="openshift-service-ca/service-ca-9c57cc56f-qddjs"
Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.835150 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2968d8f2-48fa-470b-b90e-41bb83bd77c4-config\") pod \"machine-approver-56656f9798-cvsb9\" (UID: \"2968d8f2-48fa-470b-b90e-41bb83bd77c4\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-cvsb9"
Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.835204 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/f4f31f9d-d2da-4e3d-b002-2f477d344fc0-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-kt67m\" (UID: \"f4f31f9d-d2da-4e3d-b002-2f477d344fc0\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-kt67m"
Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.835770 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/2968d8f2-48fa-470b-b90e-41bb83bd77c4-machine-approver-tls\") pod \"machine-approver-56656f9798-cvsb9\" (UID: \"2968d8f2-48fa-470b-b90e-41bb83bd77c4\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-cvsb9"
Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.837936 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1fe7eff7-ce11-4d46-bf25-06162522c1ff-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-6brcn\" (UID: \"1fe7eff7-ce11-4d46-bf25-06162522c1ff\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-6brcn"
Mar 09 13:24:38 crc kubenswrapper[4764]: E0309 13:24:38.844018 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 13:24:39.344004098 +0000 UTC m=+234.594176006 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-w49hn" (UID: "d3652fe0-4889-432f-af3f-787dd19c60d6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.857555 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/d3652fe0-4889-432f-af3f-787dd19c60d6-ca-trust-extracted\") pod \"image-registry-697d97f7c8-w49hn\" (UID: \"d3652fe0-4889-432f-af3f-787dd19c60d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-w49hn"
Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.862088 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/36228b47-79c7-484d-9753-6f36806aa344-config\") pod \"kube-controller-manager-operator-78b949d7b-mn5w6\" (UID: \"36228b47-79c7-484d-9753-6f36806aa344\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-mn5w6"
Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.868423 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/24df0ad8-c9b4-46b6-8751-23e26fc391c5-signing-cabundle\") pod \"service-ca-9c57cc56f-qddjs\" (UID: \"24df0ad8-c9b4-46b6-8751-23e26fc391c5\") " pod="openshift-service-ca/service-ca-9c57cc56f-qddjs"
Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.869522 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/36228b47-79c7-484d-9753-6f36806aa344-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-mn5w6\" (UID: \"36228b47-79c7-484d-9753-6f36806aa344\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-mn5w6"
Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.870569 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f4f31f9d-d2da-4e3d-b002-2f477d344fc0-proxy-tls\") pod \"machine-config-controller-84d6567774-kt67m\" (UID: \"f4f31f9d-d2da-4e3d-b002-2f477d344fc0\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-kt67m"
Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.879969 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l68gm\" (UniqueName: \"kubernetes.io/projected/24df0ad8-c9b4-46b6-8751-23e26fc391c5-kube-api-access-l68gm\") pod \"service-ca-9c57cc56f-qddjs\" (UID: \"24df0ad8-c9b4-46b6-8751-23e26fc391c5\") " pod="openshift-service-ca/service-ca-9c57cc56f-qddjs"
Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.880226 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/d3652fe0-4889-432f-af3f-787dd19c60d6-registry-certificates\") pod \"image-registry-697d97f7c8-w49hn\" (UID: \"d3652fe0-4889-432f-af3f-787dd19c60d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-w49hn"
Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.880573 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1fe7eff7-ce11-4d46-bf25-06162522c1ff-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-6brcn\" (UID: \"1fe7eff7-ce11-4d46-bf25-06162522c1ff\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-6brcn"
Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.881006 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/2968d8f2-48fa-470b-b90e-41bb83bd77c4-auth-proxy-config\") pod \"machine-approver-56656f9798-cvsb9\" (UID: \"2968d8f2-48fa-470b-b90e-41bb83bd77c4\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-cvsb9"
Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.881431 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/d3652fe0-4889-432f-af3f-787dd19c60d6-registry-tls\") pod \"image-registry-697d97f7c8-w49hn\" (UID: \"d3652fe0-4889-432f-af3f-787dd19c60d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-w49hn"
Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.881576 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/39249ace-8681-491c-a547-869e72d297a7-trusted-ca\") pod \"ingress-operator-5b745b69d9-cd5zc\" (UID: \"39249ace-8681-491c-a547-869e72d297a7\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-cd5zc"
Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.882402 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/39249ace-8681-491c-a547-869e72d297a7-metrics-tls\") pod \"ingress-operator-5b745b69d9-cd5zc\" (UID: \"39249ace-8681-491c-a547-869e72d297a7\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-cd5zc"
Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.882804 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/39249ace-8681-491c-a547-869e72d297a7-bound-sa-token\") pod \"ingress-operator-5b745b69d9-cd5zc\" (UID: \"39249ace-8681-491c-a547-869e72d297a7\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-cd5zc"
Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.883072 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d3652fe0-4889-432f-af3f-787dd19c60d6-trusted-ca\") pod \"image-registry-697d97f7c8-w49hn\" (UID: \"d3652fe0-4889-432f-af3f-787dd19c60d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-w49hn"
Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.884823 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/24df0ad8-c9b4-46b6-8751-23e26fc391c5-signing-key\") pod \"service-ca-9c57cc56f-qddjs\" (UID: \"24df0ad8-c9b4-46b6-8751-23e26fc391c5\") " pod="openshift-service-ca/service-ca-9c57cc56f-qddjs"
Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.885587 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/d3652fe0-4889-432f-af3f-787dd19c60d6-installation-pull-secrets\") pod \"image-registry-697d97f7c8-w49hn\" (UID: \"d3652fe0-4889-432f-af3f-787dd19c60d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-w49hn"
Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.905108 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/957cfae4-9b1a-4e7f-b526-60bc1ee1a1b4-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-hf98d\" (UID: \"957cfae4-9b1a-4e7f-b526-60bc1ee1a1b4\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-hf98d"
Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.925218 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q6ztl\" (UniqueName: \"kubernetes.io/projected/1fe7eff7-ce11-4d46-bf25-06162522c1ff-kube-api-access-q6ztl\") pod \"kube-storage-version-migrator-operator-b67b599dd-6brcn\" (UID: \"1fe7eff7-ce11-4d46-bf25-06162522c1ff\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-6brcn"
Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.954622 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gpnqx\" (UniqueName: \"kubernetes.io/projected/2c795327-bff9-453d-b1b2-d48a2ef2b48f-kube-api-access-gpnqx\") pod \"migrator-59844c95c7-b6nrf\" (UID: \"2c795327-bff9-453d-b1b2-d48a2ef2b48f\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-b6nrf"
Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.955127 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.955354 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/e56f29e6-292c-48d2-a5d7-531cfa6689f5-socket-dir\") pod \"csi-hostpathplugin-2bqx9\" (UID: \"e56f29e6-292c-48d2-a5d7-531cfa6689f5\") " pod="hostpath-provisioner/csi-hostpathplugin-2bqx9"
Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.955380 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0c464093-f30e-4c79-84cd-98a6d49b813c-config-volume\") pod \"dns-default-6nhpx\" (UID: \"0c464093-f30e-4c79-84cd-98a6d49b813c\") " pod="openshift-dns/dns-default-6nhpx"
Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.955403 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/e56f29e6-292c-48d2-a5d7-531cfa6689f5-mountpoint-dir\") pod \"csi-hostpathplugin-2bqx9\" (UID: \"e56f29e6-292c-48d2-a5d7-531cfa6689f5\") " pod="hostpath-provisioner/csi-hostpathplugin-2bqx9"
Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.955446 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mrqh5\" (UniqueName: \"kubernetes.io/projected/b7bf830c-e91a-4dd1-a3b5-64ca95d57e44-kube-api-access-mrqh5\") pod \"ingress-canary-k96kg\" (UID: \"b7bf830c-e91a-4dd1-a3b5-64ca95d57e44\") " pod="openshift-ingress-canary/ingress-canary-k96kg"
Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.955472 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/e56f29e6-292c-48d2-a5d7-531cfa6689f5-csi-data-dir\") pod \"csi-hostpathplugin-2bqx9\" (UID: \"e56f29e6-292c-48d2-a5d7-531cfa6689f5\") " pod="hostpath-provisioner/csi-hostpathplugin-2bqx9"
Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.955512 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b7bf830c-e91a-4dd1-a3b5-64ca95d57e44-cert\") pod \"ingress-canary-k96kg\" (UID: \"b7bf830c-e91a-4dd1-a3b5-64ca95d57e44\") " pod="openshift-ingress-canary/ingress-canary-k96kg"
Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.955537 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/e56f29e6-292c-48d2-a5d7-531cfa6689f5-plugins-dir\") pod \"csi-hostpathplugin-2bqx9\" (UID: \"e56f29e6-292c-48d2-a5d7-531cfa6689f5\") " pod="hostpath-provisioner/csi-hostpathplugin-2bqx9"
Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.955562 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/8e5162b2-c722-4535-9adf-3af0eee24211-node-bootstrap-token\") pod \"machine-config-server-kdxg4\" (UID: \"8e5162b2-c722-4535-9adf-3af0eee24211\") " pod="openshift-machine-config-operator/machine-config-server-kdxg4"
Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.955581 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/8e5162b2-c722-4535-9adf-3af0eee24211-certs\") pod \"machine-config-server-kdxg4\" (UID: \"8e5162b2-c722-4535-9adf-3af0eee24211\") " pod="openshift-machine-config-operator/machine-config-server-kdxg4"
Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.955603 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nhg8d\" (UniqueName: \"kubernetes.io/projected/8e5162b2-c722-4535-9adf-3af0eee24211-kube-api-access-nhg8d\") pod \"machine-config-server-kdxg4\" (UID: \"8e5162b2-c722-4535-9adf-3af0eee24211\") " pod="openshift-machine-config-operator/machine-config-server-kdxg4"
Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.955709 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/e56f29e6-292c-48d2-a5d7-531cfa6689f5-registration-dir\") pod \"csi-hostpathplugin-2bqx9\" (UID: \"e56f29e6-292c-48d2-a5d7-531cfa6689f5\") " pod="hostpath-provisioner/csi-hostpathplugin-2bqx9"
Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.955736 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/0c464093-f30e-4c79-84cd-98a6d49b813c-metrics-tls\") pod \"dns-default-6nhpx\" (UID: \"0c464093-f30e-4c79-84cd-98a6d49b813c\") " pod="openshift-dns/dns-default-6nhpx"
Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.955759 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kxvw8\" (UniqueName: \"kubernetes.io/projected/0c464093-f30e-4c79-84cd-98a6d49b813c-kube-api-access-kxvw8\") pod \"dns-default-6nhpx\" (UID: \"0c464093-f30e-4c79-84cd-98a6d49b813c\") " pod="openshift-dns/dns-default-6nhpx"
Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.955805 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qvm6v\" (UniqueName: \"kubernetes.io/projected/e56f29e6-292c-48d2-a5d7-531cfa6689f5-kube-api-access-qvm6v\") pod \"csi-hostpathplugin-2bqx9\" (UID: \"e56f29e6-292c-48d2-a5d7-531cfa6689f5\") " pod="hostpath-provisioner/csi-hostpathplugin-2bqx9"
Mar 09 13:24:38 crc kubenswrapper[4764]: E0309 13:24:38.956051 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 13:24:39.4560324 +0000 UTC m=+234.706204308 (durationBeforeRetry 500ms).
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.956260 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/e56f29e6-292c-48d2-a5d7-531cfa6689f5-socket-dir\") pod \"csi-hostpathplugin-2bqx9\" (UID: \"e56f29e6-292c-48d2-a5d7-531cfa6689f5\") " pod="hostpath-provisioner/csi-hostpathplugin-2bqx9" Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.956925 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0c464093-f30e-4c79-84cd-98a6d49b813c-config-volume\") pod \"dns-default-6nhpx\" (UID: \"0c464093-f30e-4c79-84cd-98a6d49b813c\") " pod="openshift-dns/dns-default-6nhpx" Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.956943 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/e56f29e6-292c-48d2-a5d7-531cfa6689f5-csi-data-dir\") pod \"csi-hostpathplugin-2bqx9\" (UID: \"e56f29e6-292c-48d2-a5d7-531cfa6689f5\") " pod="hostpath-provisioner/csi-hostpathplugin-2bqx9" Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.957149 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/e56f29e6-292c-48d2-a5d7-531cfa6689f5-mountpoint-dir\") pod \"csi-hostpathplugin-2bqx9\" (UID: \"e56f29e6-292c-48d2-a5d7-531cfa6689f5\") " pod="hostpath-provisioner/csi-hostpathplugin-2bqx9" Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 
13:24:38.973484 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/e56f29e6-292c-48d2-a5d7-531cfa6689f5-plugins-dir\") pod \"csi-hostpathplugin-2bqx9\" (UID: \"e56f29e6-292c-48d2-a5d7-531cfa6689f5\") " pod="hostpath-provisioner/csi-hostpathplugin-2bqx9" Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.974276 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/8e5162b2-c722-4535-9adf-3af0eee24211-certs\") pod \"machine-config-server-kdxg4\" (UID: \"8e5162b2-c722-4535-9adf-3af0eee24211\") " pod="openshift-machine-config-operator/machine-config-server-kdxg4" Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.974977 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/8e5162b2-c722-4535-9adf-3af0eee24211-node-bootstrap-token\") pod \"machine-config-server-kdxg4\" (UID: \"8e5162b2-c722-4535-9adf-3af0eee24211\") " pod="openshift-machine-config-operator/machine-config-server-kdxg4" Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.976117 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b7bf830c-e91a-4dd1-a3b5-64ca95d57e44-cert\") pod \"ingress-canary-k96kg\" (UID: \"b7bf830c-e91a-4dd1-a3b5-64ca95d57e44\") " pod="openshift-ingress-canary/ingress-canary-k96kg" Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.978835 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-qddjs" Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.979098 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/e56f29e6-292c-48d2-a5d7-531cfa6689f5-registration-dir\") pod \"csi-hostpathplugin-2bqx9\" (UID: \"e56f29e6-292c-48d2-a5d7-531cfa6689f5\") " pod="hostpath-provisioner/csi-hostpathplugin-2bqx9" Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.990321 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d3652fe0-4889-432f-af3f-787dd19c60d6-bound-sa-token\") pod \"image-registry-697d97f7c8-w49hn\" (UID: \"d3652fe0-4889-432f-af3f-787dd19c60d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-w49hn" Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.990328 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-st8mw\" (UniqueName: \"kubernetes.io/projected/d245b116-e47d-4b15-a40d-0a9fa34cf1df-kube-api-access-st8mw\") pod \"downloads-7954f5f757-x927s\" (UID: \"d245b116-e47d-4b15-a40d-0a9fa34cf1df\") " pod="openshift-console/downloads-7954f5f757-x927s" Mar 09 13:24:38 crc kubenswrapper[4764]: I0309 13:24:38.997525 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/0c464093-f30e-4c79-84cd-98a6d49b813c-metrics-tls\") pod \"dns-default-6nhpx\" (UID: \"0c464093-f30e-4c79-84cd-98a6d49b813c\") " pod="openshift-dns/dns-default-6nhpx" Mar 09 13:24:39 crc kubenswrapper[4764]: I0309 13:24:39.002124 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bdhqw\" (UniqueName: \"kubernetes.io/projected/2968d8f2-48fa-470b-b90e-41bb83bd77c4-kube-api-access-bdhqw\") pod \"machine-approver-56656f9798-cvsb9\" (UID: \"2968d8f2-48fa-470b-b90e-41bb83bd77c4\") " 
pod="openshift-cluster-machine-approver/machine-approver-56656f9798-cvsb9" Mar 09 13:24:39 crc kubenswrapper[4764]: I0309 13:24:39.012164 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-8g9lj"] Mar 09 13:24:39 crc kubenswrapper[4764]: I0309 13:24:39.018122 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7tw8z\" (UniqueName: \"kubernetes.io/projected/f4f31f9d-d2da-4e3d-b002-2f477d344fc0-kube-api-access-7tw8z\") pod \"machine-config-controller-84d6567774-kt67m\" (UID: \"f4f31f9d-d2da-4e3d-b002-2f477d344fc0\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-kt67m" Mar 09 13:24:39 crc kubenswrapper[4764]: I0309 13:24:39.023248 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-kt67m" Mar 09 13:24:39 crc kubenswrapper[4764]: I0309 13:24:39.031090 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/36228b47-79c7-484d-9753-6f36806aa344-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-mn5w6\" (UID: \"36228b47-79c7-484d-9753-6f36806aa344\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-mn5w6" Mar 09 13:24:39 crc kubenswrapper[4764]: I0309 13:24:39.045884 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-b6nrf" Mar 09 13:24:39 crc kubenswrapper[4764]: I0309 13:24:39.060560 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-w49hn\" (UID: \"d3652fe0-4889-432f-af3f-787dd19c60d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-w49hn" Mar 09 13:24:39 crc kubenswrapper[4764]: E0309 13:24:39.061682 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 13:24:39.56166625 +0000 UTC m=+234.811838158 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-w49hn" (UID: "d3652fe0-4889-432f-af3f-787dd19c60d6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 13:24:39 crc kubenswrapper[4764]: I0309 13:24:39.075870 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4js2s\" (UniqueName: \"kubernetes.io/projected/957cfae4-9b1a-4e7f-b526-60bc1ee1a1b4-kube-api-access-4js2s\") pod \"multus-admission-controller-857f4d67dd-hf98d\" (UID: \"957cfae4-9b1a-4e7f-b526-60bc1ee1a1b4\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-hf98d" Mar 09 13:24:39 crc kubenswrapper[4764]: I0309 13:24:39.087963 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ftcvk\" (UniqueName: 
\"kubernetes.io/projected/39249ace-8681-491c-a547-869e72d297a7-kube-api-access-ftcvk\") pod \"ingress-operator-5b745b69d9-cd5zc\" (UID: \"39249ace-8681-491c-a547-869e72d297a7\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-cd5zc" Mar 09 13:24:39 crc kubenswrapper[4764]: I0309 13:24:39.089667 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-m9kmt"] Mar 09 13:24:39 crc kubenswrapper[4764]: I0309 13:24:39.108694 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xsz44\" (UniqueName: \"kubernetes.io/projected/d3652fe0-4889-432f-af3f-787dd19c60d6-kube-api-access-xsz44\") pod \"image-registry-697d97f7c8-w49hn\" (UID: \"d3652fe0-4889-432f-af3f-787dd19c60d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-w49hn" Mar 09 13:24:39 crc kubenswrapper[4764]: I0309 13:24:39.114172 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-mn5w6" Mar 09 13:24:39 crc kubenswrapper[4764]: W0309 13:24:39.128922 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda1b4dc0b_edea_4c0d_8d61_3e3d3133605d.slice/crio-74b142593d2c61fe165941847cdc507401c02d77a28ca82096827ce65a85b3e5 WatchSource:0}: Error finding container 74b142593d2c61fe165941847cdc507401c02d77a28ca82096827ce65a85b3e5: Status 404 returned error can't find the container with id 74b142593d2c61fe165941847cdc507401c02d77a28ca82096827ce65a85b3e5 Mar 09 13:24:39 crc kubenswrapper[4764]: I0309 13:24:39.148343 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qvm6v\" (UniqueName: \"kubernetes.io/projected/e56f29e6-292c-48d2-a5d7-531cfa6689f5-kube-api-access-qvm6v\") pod \"csi-hostpathplugin-2bqx9\" (UID: \"e56f29e6-292c-48d2-a5d7-531cfa6689f5\") " 
pod="hostpath-provisioner/csi-hostpathplugin-2bqx9" Mar 09 13:24:39 crc kubenswrapper[4764]: I0309 13:24:39.162601 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 13:24:39 crc kubenswrapper[4764]: E0309 13:24:39.162581 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 13:24:39.662556842 +0000 UTC m=+234.912728750 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 13:24:39 crc kubenswrapper[4764]: I0309 13:24:39.162872 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mrqh5\" (UniqueName: \"kubernetes.io/projected/b7bf830c-e91a-4dd1-a3b5-64ca95d57e44-kube-api-access-mrqh5\") pod \"ingress-canary-k96kg\" (UID: \"b7bf830c-e91a-4dd1-a3b5-64ca95d57e44\") " pod="openshift-ingress-canary/ingress-canary-k96kg" Mar 09 13:24:39 crc kubenswrapper[4764]: I0309 13:24:39.163119 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"image-registry-697d97f7c8-w49hn\" (UID: \"d3652fe0-4889-432f-af3f-787dd19c60d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-w49hn" Mar 09 13:24:39 crc kubenswrapper[4764]: E0309 13:24:39.164336 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 13:24:39.66432523 +0000 UTC m=+234.914497148 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-w49hn" (UID: "d3652fe0-4889-432f-af3f-787dd19c60d6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 13:24:39 crc kubenswrapper[4764]: I0309 13:24:39.165911 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-2bqx9" Mar 09 13:24:39 crc kubenswrapper[4764]: I0309 13:24:39.187396 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nhg8d\" (UniqueName: \"kubernetes.io/projected/8e5162b2-c722-4535-9adf-3af0eee24211-kube-api-access-nhg8d\") pod \"machine-config-server-kdxg4\" (UID: \"8e5162b2-c722-4535-9adf-3af0eee24211\") " pod="openshift-machine-config-operator/machine-config-server-kdxg4" Mar 09 13:24:39 crc kubenswrapper[4764]: I0309 13:24:39.191346 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-kdxg4" Mar 09 13:24:39 crc kubenswrapper[4764]: I0309 13:24:39.215465 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-7954f5f757-x927s" Mar 09 13:24:39 crc kubenswrapper[4764]: I0309 13:24:39.221558 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-6brcn" Mar 09 13:24:39 crc kubenswrapper[4764]: I0309 13:24:39.228483 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kxvw8\" (UniqueName: \"kubernetes.io/projected/0c464093-f30e-4c79-84cd-98a6d49b813c-kube-api-access-kxvw8\") pod \"dns-default-6nhpx\" (UID: \"0c464093-f30e-4c79-84cd-98a6d49b813c\") " pod="openshift-dns/dns-default-6nhpx" Mar 09 13:24:39 crc kubenswrapper[4764]: I0309 13:24:39.242203 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-cvsb9" Mar 09 13:24:39 crc kubenswrapper[4764]: I0309 13:24:39.265682 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 13:24:39 crc kubenswrapper[4764]: E0309 13:24:39.265816 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 13:24:39.765785118 +0000 UTC m=+235.015957026 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 13:24:39 crc kubenswrapper[4764]: I0309 13:24:39.266324 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-w49hn\" (UID: \"d3652fe0-4889-432f-af3f-787dd19c60d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-w49hn" Mar 09 13:24:39 crc kubenswrapper[4764]: E0309 13:24:39.266954 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 13:24:39.766943059 +0000 UTC m=+235.017114967 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-w49hn" (UID: "d3652fe0-4889-432f-af3f-787dd19c60d6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 13:24:39 crc kubenswrapper[4764]: I0309 13:24:39.282993 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-hf98d" Mar 09 13:24:39 crc kubenswrapper[4764]: I0309 13:24:39.363524 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-cd5zc" Mar 09 13:24:39 crc kubenswrapper[4764]: I0309 13:24:39.367694 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 13:24:39 crc kubenswrapper[4764]: E0309 13:24:39.368203 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 13:24:39.868187911 +0000 UTC m=+235.118359819 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 13:24:39 crc kubenswrapper[4764]: W0309 13:24:39.373423 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb72bd4db_e5ea_44f6_bdce_81df2966acfb.slice/crio-c18c57b8ec43425645ca5f35f11b40303b55e73502767f0420d6ee129a178222 WatchSource:0}: Error finding container c18c57b8ec43425645ca5f35f11b40303b55e73502767f0420d6ee129a178222: Status 404 returned error can't find the container with id c18c57b8ec43425645ca5f35f11b40303b55e73502767f0420d6ee129a178222 Mar 09 13:24:39 crc kubenswrapper[4764]: I0309 13:24:39.460071 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-k96kg" Mar 09 13:24:39 crc kubenswrapper[4764]: I0309 13:24:39.477069 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-w49hn\" (UID: \"d3652fe0-4889-432f-af3f-787dd19c60d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-w49hn" Mar 09 13:24:39 crc kubenswrapper[4764]: E0309 13:24:39.477448 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 13:24:39.977433508 +0000 UTC m=+235.227605416 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-w49hn" (UID: "d3652fe0-4889-432f-af3f-787dd19c60d6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 13:24:39 crc kubenswrapper[4764]: I0309 13:24:39.482352 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-6nhpx" Mar 09 13:24:39 crc kubenswrapper[4764]: I0309 13:24:39.498628 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551044-p748f"] Mar 09 13:24:39 crc kubenswrapper[4764]: I0309 13:24:39.578194 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 13:24:39 crc kubenswrapper[4764]: E0309 13:24:39.579019 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 13:24:40.079000999 +0000 UTC m=+235.329172917 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 13:24:39 crc kubenswrapper[4764]: I0309 13:24:39.579235 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-w49hn\" (UID: \"d3652fe0-4889-432f-af3f-787dd19c60d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-w49hn" Mar 09 13:24:39 crc kubenswrapper[4764]: E0309 13:24:39.579564 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 13:24:40.079544484 +0000 UTC m=+235.329716392 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-w49hn" (UID: "d3652fe0-4889-432f-af3f-787dd19c60d6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 13:24:39 crc kubenswrapper[4764]: I0309 13:24:39.580739 4764 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-tgqwl container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.6:8443/healthz\": dial tcp 10.217.0.6:8443: connect: connection refused" start-of-body= Mar 09 13:24:39 crc kubenswrapper[4764]: I0309 13:24:39.580792 4764 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-tgqwl" podUID="b625331d-48ab-4d48-86fd-fe73466305ff" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.6:8443/healthz\": dial tcp 10.217.0.6:8443: connect: connection refused" Mar 09 13:24:39 crc kubenswrapper[4764]: I0309 13:24:39.592982 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-tgqwl" event={"ID":"b625331d-48ab-4d48-86fd-fe73466305ff","Type":"ContainerStarted","Data":"878ba1fbacc5d195f2c3f5d1f7ed102a9da213e083eb5f15b669f308393abf9d"} Mar 09 13:24:39 crc kubenswrapper[4764]: I0309 13:24:39.593204 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-nj856" event={"ID":"f9b3244b-8df0-4330-9887-4092260d416a","Type":"ContainerStarted","Data":"42c43e046cf7b3c37122d69fdadfdf37b294da8c4d73ad8e7c4ac09039e43f1c"} Mar 09 13:24:39 crc kubenswrapper[4764]: I0309 13:24:39.593332 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-5g2pp"]
Mar 09 13:24:39 crc kubenswrapper[4764]: I0309 13:24:39.593452 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-tgqwl"
Mar 09 13:24:39 crc kubenswrapper[4764]: I0309 13:24:39.593602 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-2pfhk"]
Mar 09 13:24:39 crc kubenswrapper[4764]: I0309 13:24:39.593773 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-84448"]
Mar 09 13:24:39 crc kubenswrapper[4764]: I0309 13:24:39.596638 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-sp2mq" event={"ID":"ee2ad8bf-7cf9-4bab-9638-b26d9c593188","Type":"ContainerStarted","Data":"a839636883aa4f201eff1e2771eff45834e9fd880c3edbdfd1093253a12cbf1a"}
Mar 09 13:24:39 crc kubenswrapper[4764]: I0309 13:24:39.608604 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-klv4l" event={"ID":"cf3d7b2a-75e7-4c07-9211-b66c64c15def","Type":"ContainerStarted","Data":"f44602f0fd06fc5b4a5856699a9ae61fb2b61106450e70eb3f6104cfbee6dfe5"}
Mar 09 13:24:39 crc kubenswrapper[4764]: I0309 13:24:39.608719 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-klv4l" event={"ID":"cf3d7b2a-75e7-4c07-9211-b66c64c15def","Type":"ContainerStarted","Data":"3a2f5d25288e8bc86cb3cee224bc80c295dc724258bc8041325b003c8ff187db"}
Mar 09 13:24:39 crc kubenswrapper[4764]: I0309 13:24:39.610278 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29551035-5fpgn"]
Mar 09 13:24:39 crc kubenswrapper[4764]: I0309 13:24:39.613094 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-fxv7j" event={"ID":"61b85db0-a292-42a8-8296-d0e476d80c89","Type":"ContainerStarted","Data":"f998595181102b6c527dff000f09d875d8d715e5310d9ab7d8ba0e72dc742c89"}
Mar 09 13:24:39 crc kubenswrapper[4764]: I0309 13:24:39.615695 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-mp9p7" event={"ID":"f95ed010-a6a4-49ab-b61b-fc4ee2d856bb","Type":"ContainerStarted","Data":"cfd2c52e9451edb42c1574c3bfe44c1c8e89acf647b496a5ee4bfe6db1b87112"}
Mar 09 13:24:39 crc kubenswrapper[4764]: I0309 13:24:39.621089 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-nnrmm" event={"ID":"23f87d2b-2a92-4abb-a2a6-2de508837343","Type":"ContainerStarted","Data":"06f569219e270618320b5e9344203a70dabe7cff34d14271dabece2d5bfeef8b"}
Mar 09 13:24:39 crc kubenswrapper[4764]: I0309 13:24:39.647058 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-c2m6k" event={"ID":"baee6113-40a8-468e-b343-09e9afd65ce3","Type":"ContainerStarted","Data":"ef865bd11e36a99798afe2b99b9e97135ef2eab7e755d81a456c363efcf457a1"}
Mar 09 13:24:39 crc kubenswrapper[4764]: I0309 13:24:39.650513 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-m9kmt" event={"ID":"b72bd4db-e5ea-44f6-bdce-81df2966acfb","Type":"ContainerStarted","Data":"c18c57b8ec43425645ca5f35f11b40303b55e73502767f0420d6ee129a178222"}
Mar 09 13:24:39 crc kubenswrapper[4764]: I0309 13:24:39.652417 4764 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Mar 09 13:24:39 crc kubenswrapper[4764]: I0309 13:24:39.654313 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mvq6r" event={"ID":"ad49b4d8-1218-4a34-8455-831d0f563cbf","Type":"ContainerStarted","Data":"87b7e8df716d6db2234cb0d9f4b5435733eebf3453983d174cfd8ed4a517ff0c"}
Mar 09 13:24:39 crc kubenswrapper[4764]: I0309 13:24:39.654370 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mvq6r" event={"ID":"ad49b4d8-1218-4a34-8455-831d0f563cbf","Type":"ContainerStarted","Data":"3ebce9784954b6d704d045dc1cbbc83b452607e4967b3e3637a00f82a3e69729"}
Mar 09 13:24:39 crc kubenswrapper[4764]: I0309 13:24:39.655250 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mvq6r"
Mar 09 13:24:39 crc kubenswrapper[4764]: I0309 13:24:39.662067 4764 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-mvq6r container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.16:8443/healthz\": dial tcp 10.217.0.16:8443: connect: connection refused" start-of-body=
Mar 09 13:24:39 crc kubenswrapper[4764]: I0309 13:24:39.662131 4764 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mvq6r" podUID="ad49b4d8-1218-4a34-8455-831d0f563cbf" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.16:8443/healthz\": dial tcp 10.217.0.16:8443: connect: connection refused"
Mar 09 13:24:39 crc kubenswrapper[4764]: I0309 13:24:39.662639 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-t8ft9" event={"ID":"4125448d-5832-43c2-8dba-d95adde7458a","Type":"ContainerStarted","Data":"9dbae435a7eb2868f80bef0b5a8774f6a265623101848e72196b767d1eafe722"}
Mar 09 13:24:39 crc kubenswrapper[4764]: I0309 13:24:39.667256 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-ptjnd" event={"ID":"4951d770-ae8c-470a-982a-807c82112722","Type":"ContainerStarted","Data":"71eee8dfc7e240ec93ee7fb35f30941ec5a5be0356b37b1332a7ecab80f5293c"}
Mar 09 13:24:39 crc kubenswrapper[4764]: I0309 13:24:39.676046 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-k5dfc" event={"ID":"6dc446a1-b77b-4f15-ae5f-0141bf374cdd","Type":"ContainerStarted","Data":"263920c9620557294e3ee0141e1581d377009113573b651bea6f701776236ba7"}
Mar 09 13:24:39 crc kubenswrapper[4764]: I0309 13:24:39.677854 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-gnnbl" event={"ID":"b9c0d96b-ed96-4925-b890-8743879a8b38","Type":"ContainerStarted","Data":"e698665a8d11a1629142c5086f14482532d66300a5375d8c56a007f248de02c1"}
Mar 09 13:24:39 crc kubenswrapper[4764]: I0309 13:24:39.691271 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 09 13:24:39 crc kubenswrapper[4764]: E0309 13:24:39.695160 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 13:24:40.195124471 +0000 UTC m=+235.445296369 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 09 13:24:39 crc kubenswrapper[4764]: I0309 13:24:39.697949 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-8g9lj" event={"ID":"a1b4dc0b-edea-4c0d-8d61-3e3d3133605d","Type":"ContainerStarted","Data":"74b142593d2c61fe165941847cdc507401c02d77a28ca82096827ce65a85b3e5"}
Mar 09 13:24:39 crc kubenswrapper[4764]: I0309 13:24:39.793546 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-w49hn\" (UID: \"d3652fe0-4889-432f-af3f-787dd19c60d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-w49hn"
Mar 09 13:24:39 crc kubenswrapper[4764]: E0309 13:24:39.794907 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 13:24:40.294892484 +0000 UTC m=+235.545064462 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-w49hn" (UID: "d3652fe0-4889-432f-af3f-787dd19c60d6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 09 13:24:39 crc kubenswrapper[4764]: W0309 13:24:39.871898 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode025e897_5ff3_476b_81c9_afdd0ae7a25f.slice/crio-096479a1922d69082b1ef9b9e58ac118c8a36513f53b16f4ee961276e252ac7b WatchSource:0}: Error finding container 096479a1922d69082b1ef9b9e58ac118c8a36513f53b16f4ee961276e252ac7b: Status 404 returned error can't find the container with id 096479a1922d69082b1ef9b9e58ac118c8a36513f53b16f4ee961276e252ac7b
Mar 09 13:24:39 crc kubenswrapper[4764]: I0309 13:24:39.897580 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 09 13:24:39 crc kubenswrapper[4764]: E0309 13:24:39.898406 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 13:24:40.398380986 +0000 UTC m=+235.648552894 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 09 13:24:39 crc kubenswrapper[4764]: I0309 13:24:39.999720 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-w49hn\" (UID: \"d3652fe0-4889-432f-af3f-787dd19c60d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-w49hn"
Mar 09 13:24:40 crc kubenswrapper[4764]: E0309 13:24:40.000090 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 13:24:40.50007578 +0000 UTC m=+235.750247708 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-w49hn" (UID: "d3652fe0-4889-432f-af3f-787dd19c60d6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 09 13:24:40 crc kubenswrapper[4764]: I0309 13:24:40.101129 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 09 13:24:40 crc kubenswrapper[4764]: E0309 13:24:40.101848 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 13:24:40.601833766 +0000 UTC m=+235.852005674 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 09 13:24:40 crc kubenswrapper[4764]: I0309 13:24:40.202707 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-w49hn\" (UID: \"d3652fe0-4889-432f-af3f-787dd19c60d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-w49hn"
Mar 09 13:24:40 crc kubenswrapper[4764]: E0309 13:24:40.203093 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 13:24:40.703073838 +0000 UTC m=+235.953245746 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-w49hn" (UID: "d3652fe0-4889-432f-af3f-787dd19c60d6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 09 13:24:40 crc kubenswrapper[4764]: I0309 13:24:40.309149 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 09 13:24:40 crc kubenswrapper[4764]: E0309 13:24:40.309520 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 13:24:40.80950095 +0000 UTC m=+236.059672878 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 09 13:24:40 crc kubenswrapper[4764]: I0309 13:24:40.434337 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-w49hn\" (UID: \"d3652fe0-4889-432f-af3f-787dd19c60d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-w49hn"
Mar 09 13:24:40 crc kubenswrapper[4764]: E0309 13:24:40.434863 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 13:24:40.93485121 +0000 UTC m=+236.185023118 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-w49hn" (UID: "d3652fe0-4889-432f-af3f-787dd19c60d6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 09 13:24:40 crc kubenswrapper[4764]: I0309 13:24:40.485593 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-gst9d" podStartSLOduration=170.485573094 podStartE2EDuration="2m50.485573094s" podCreationTimestamp="2026-03-09 13:21:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:24:40.42703911 +0000 UTC m=+235.677211028" watchObservedRunningTime="2026-03-09 13:24:40.485573094 +0000 UTC m=+235.735745002"
Mar 09 13:24:40 crc kubenswrapper[4764]: I0309 13:24:40.536067 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 09 13:24:40 crc kubenswrapper[4764]: E0309 13:24:40.538119 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 13:24:41.038100186 +0000 UTC m=+236.288272094 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 09 13:24:40 crc kubenswrapper[4764]: I0309 13:24:40.639902 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-w49hn\" (UID: \"d3652fe0-4889-432f-af3f-787dd19c60d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-w49hn"
Mar 09 13:24:40 crc kubenswrapper[4764]: E0309 13:24:40.640210 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 13:24:41.140200191 +0000 UTC m=+236.390372099 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-w49hn" (UID: "d3652fe0-4889-432f-af3f-787dd19c60d6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 09 13:24:40 crc kubenswrapper[4764]: I0309 13:24:40.749844 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 09 13:24:40 crc kubenswrapper[4764]: E0309 13:24:40.750566 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 13:24:41.250547598 +0000 UTC m=+236.500719516 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 09 13:24:40 crc kubenswrapper[4764]: I0309 13:24:40.767472 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-sp2mq" event={"ID":"ee2ad8bf-7cf9-4bab-9638-b26d9c593188","Type":"ContainerStarted","Data":"6406604b5450268574899e64582810b5f907ff3c88fc83afff57a81b94a53144"}
Mar 09 13:24:40 crc kubenswrapper[4764]: I0309 13:24:40.769534 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-k5dfc" podStartSLOduration=170.769507208 podStartE2EDuration="2m50.769507208s" podCreationTimestamp="2026-03-09 13:21:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:24:40.755033129 +0000 UTC m=+236.005205037" watchObservedRunningTime="2026-03-09 13:24:40.769507208 +0000 UTC m=+236.019679106"
Mar 09 13:24:40 crc kubenswrapper[4764]: I0309 13:24:40.772458 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-sp2mq"
Mar 09 13:24:40 crc kubenswrapper[4764]: I0309 13:24:40.786533 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-2pfhk" event={"ID":"4912a02a-743d-4bbe-9063-7d99ccd3329a","Type":"ContainerStarted","Data":"5ba2f46b011491b44b263563cb6458b4ff96a75dc2fc530aa089b2237d16c9dc"}
Mar 09 13:24:40 crc kubenswrapper[4764]: I0309 13:24:40.830805 4764 patch_prober.go:28] interesting pod/console-operator-58897d9998-sp2mq container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.18:8443/readyz\": dial tcp 10.217.0.18:8443: connect: connection refused" start-of-body=
Mar 09 13:24:40 crc kubenswrapper[4764]: I0309 13:24:40.830868 4764 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-sp2mq" podUID="ee2ad8bf-7cf9-4bab-9638-b26d9c593188" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.18:8443/readyz\": dial tcp 10.217.0.18:8443: connect: connection refused"
Mar 09 13:24:40 crc kubenswrapper[4764]: I0309 13:24:40.831064 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-84448" event={"ID":"e025e897-5ff3-476b-81c9-afdd0ae7a25f","Type":"ContainerStarted","Data":"096479a1922d69082b1ef9b9e58ac118c8a36513f53b16f4ee961276e252ac7b"}
Mar 09 13:24:40 crc kubenswrapper[4764]: I0309 13:24:40.854389 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mvq6r" podStartSLOduration=169.85437101 podStartE2EDuration="2m49.85437101s" podCreationTimestamp="2026-03-09 13:21:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:24:40.803926834 +0000 UTC m=+236.054098752" watchObservedRunningTime="2026-03-09 13:24:40.85437101 +0000 UTC m=+236.104542918"
Mar 09 13:24:40 crc kubenswrapper[4764]: I0309 13:24:40.855481 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-ptjnd" podStartSLOduration=170.8554762 podStartE2EDuration="2m50.8554762s" podCreationTimestamp="2026-03-09 13:21:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:24:40.854199745 +0000 UTC m=+236.104371653" watchObservedRunningTime="2026-03-09 13:24:40.8554762 +0000 UTC m=+236.105648118"
Mar 09 13:24:40 crc kubenswrapper[4764]: I0309 13:24:40.872131 4764 generic.go:334] "Generic (PLEG): container finished" podID="f95ed010-a6a4-49ab-b61b-fc4ee2d856bb" containerID="58c3f7cb0255d51f3d1c36ba3c3a9cc17e6a287531939e7bdc0a0ed62bdddec0" exitCode=0
Mar 09 13:24:40 crc kubenswrapper[4764]: I0309 13:24:40.872456 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-mp9p7" event={"ID":"f95ed010-a6a4-49ab-b61b-fc4ee2d856bb","Type":"ContainerDied","Data":"58c3f7cb0255d51f3d1c36ba3c3a9cc17e6a287531939e7bdc0a0ed62bdddec0"}
Mar 09 13:24:40 crc kubenswrapper[4764]: I0309 13:24:40.883475 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-w49hn\" (UID: \"d3652fe0-4889-432f-af3f-787dd19c60d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-w49hn"
Mar 09 13:24:40 crc kubenswrapper[4764]: E0309 13:24:40.891592 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 13:24:41.39156239 +0000 UTC m=+236.641734298 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-w49hn" (UID: "d3652fe0-4889-432f-af3f-787dd19c60d6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 09 13:24:40 crc kubenswrapper[4764]: I0309 13:24:40.900802 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-tgqwl" podStartSLOduration=170.900781128 podStartE2EDuration="2m50.900781128s" podCreationTimestamp="2026-03-09 13:21:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:24:40.900273504 +0000 UTC m=+236.150445442" watchObservedRunningTime="2026-03-09 13:24:40.900781128 +0000 UTC m=+236.150953036"
Mar 09 13:24:40 crc kubenswrapper[4764]: I0309 13:24:40.913120 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-cvsb9" event={"ID":"2968d8f2-48fa-470b-b90e-41bb83bd77c4","Type":"ContainerStarted","Data":"b271e38c47490532d225833222b6ca5e4a305ef1f97ca9ce9dcd15c0dd256421"}
Mar 09 13:24:40 crc kubenswrapper[4764]: I0309 13:24:40.964460 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-5g2pp" event={"ID":"ad307984-e46b-466b-8a5c-63a00976fbbf","Type":"ContainerStarted","Data":"cc1cf6f09872939d79376154021ce0b2d674d8e6ef7e53ddd1bd64b7e2f6ebe6"}
Mar 09 13:24:40 crc kubenswrapper[4764]: I0309 13:24:40.965880 4764 generic.go:334] "Generic (PLEG): container finished" podID="61b85db0-a292-42a8-8296-d0e476d80c89" containerID="f192a03e8cad6434074c3bf8b0d1239551233c346c1a5a801901b4e502485f85" exitCode=0
Mar 09 13:24:40 crc kubenswrapper[4764]: I0309 13:24:40.965966 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-fxv7j" event={"ID":"61b85db0-a292-42a8-8296-d0e476d80c89","Type":"ContainerDied","Data":"f192a03e8cad6434074c3bf8b0d1239551233c346c1a5a801901b4e502485f85"}
Mar 09 13:24:40 crc kubenswrapper[4764]: I0309 13:24:40.967310 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-klv4l" podStartSLOduration=170.967283806 podStartE2EDuration="2m50.967283806s" podCreationTimestamp="2026-03-09 13:21:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:24:40.963473243 +0000 UTC m=+236.213645151" watchObservedRunningTime="2026-03-09 13:24:40.967283806 +0000 UTC m=+236.217455714"
Mar 09 13:24:40 crc kubenswrapper[4764]: I0309 13:24:40.989199 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 09 13:24:40 crc kubenswrapper[4764]: E0309 13:24:40.990403 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 13:24:41.490388707 +0000 UTC m=+236.740560615 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 09 13:24:41 crc kubenswrapper[4764]: I0309 13:24:41.004910 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-sp2mq" podStartSLOduration=171.004887507 podStartE2EDuration="2m51.004887507s" podCreationTimestamp="2026-03-09 13:21:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:24:41.003274413 +0000 UTC m=+236.253446321" watchObservedRunningTime="2026-03-09 13:24:41.004887507 +0000 UTC m=+236.255059415"
Mar 09 13:24:41 crc kubenswrapper[4764]: I0309 13:24:41.014611 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551044-p748f" event={"ID":"0a005f65-920a-4cdd-b4da-a270953113aa","Type":"ContainerStarted","Data":"0090a60e7aca5b3b5065eab933849dd34829a2b082555fb9f3ff31c4a933640e"}
Mar 09 13:24:41 crc kubenswrapper[4764]: I0309 13:24:41.038470 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-gnnbl" event={"ID":"b9c0d96b-ed96-4925-b890-8743879a8b38","Type":"ContainerStarted","Data":"dc0224d4c710f59d5ae4d9c8ebaf5e2202f8d17f96530cfbc95c1c94218d479e"}
Mar 09 13:24:41 crc kubenswrapper[4764]: I0309 13:24:41.061779 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-kdxg4" event={"ID":"8e5162b2-c722-4535-9adf-3af0eee24211","Type":"ContainerStarted","Data":"f37b9faf1705a2e566b1a64d30c1be15f3c6b2bdccb7212ed794a08b63dc64b0"}
Mar 09 13:24:41 crc kubenswrapper[4764]: I0309 13:24:41.061858 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-kdxg4" event={"ID":"8e5162b2-c722-4535-9adf-3af0eee24211","Type":"ContainerStarted","Data":"99260d16307bfad0173f86d572b75d711422195fab1bc9319bafc392c5caee3f"}
Mar 09 13:24:41 crc kubenswrapper[4764]: I0309 13:24:41.071044 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29551035-5fpgn" event={"ID":"39da5087-79bc-4154-b340-22183d9e4417","Type":"ContainerStarted","Data":"c05318f0e67358bb015e1f76742485abff9f53469c740dc411704a1af2febb07"}
Mar 09 13:24:41 crc kubenswrapper[4764]: I0309 13:24:41.084732 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-svr2n"]
Mar 09 13:24:41 crc kubenswrapper[4764]: I0309 13:24:41.092852 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-84448" podStartSLOduration=171.092827441 podStartE2EDuration="2m51.092827441s" podCreationTimestamp="2026-03-09 13:21:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:24:41.088768352 +0000 UTC m=+236.338940270" watchObservedRunningTime="2026-03-09 13:24:41.092827441 +0000 UTC m=+236.342999349"
Mar 09 13:24:41 crc kubenswrapper[4764]: I0309 13:24:41.094027 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-w49hn\" (UID: \"d3652fe0-4889-432f-af3f-787dd19c60d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-w49hn"
Mar 09 13:24:41 crc kubenswrapper[4764]: E0309 13:24:41.094443 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 13:24:41.594428654 +0000 UTC m=+236.844600562 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-w49hn" (UID: "d3652fe0-4889-432f-af3f-787dd19c60d6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 09 13:24:41 crc kubenswrapper[4764]: I0309 13:24:41.111084 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-nnrmm" event={"ID":"23f87d2b-2a92-4abb-a2a6-2de508837343","Type":"ContainerStarted","Data":"9e34b8440c3bc84a76b19e43ca39644e8f0942f7bbfffcb2eb8834dd406ad88b"}
Mar 09 13:24:41 crc kubenswrapper[4764]: I0309 13:24:41.139108 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-c2m6k" event={"ID":"baee6113-40a8-468e-b343-09e9afd65ce3","Type":"ContainerStarted","Data":"ab46b2a08c4693388a36bd0c17784e529e27cba686dcd4cbbefb51c733f08617"}
Mar 09 13:24:41 crc kubenswrapper[4764]: I0309 13:24:41.165493 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29551035-5fpgn" podStartSLOduration=171.165463274 podStartE2EDuration="2m51.165463274s" podCreationTimestamp="2026-03-09 13:21:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:24:41.144259504 +0000 UTC m=+236.394431422" watchObservedRunningTime="2026-03-09 13:24:41.165463274 +0000 UTC m=+236.415635182"
Mar 09 13:24:41 crc kubenswrapper[4764]: I0309 13:24:41.167043 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-9k28f"]
Mar 09 13:24:41 crc kubenswrapper[4764]: I0309 13:24:41.195922 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 09 13:24:41 crc kubenswrapper[4764]: E0309 13:24:41.199061 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 13:24:41.699031087 +0000 UTC m=+236.949202995 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 09 13:24:41 crc kubenswrapper[4764]: I0309 13:24:41.221289 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-gst9d"
Mar 09 13:24:41 crc kubenswrapper[4764]: I0309 13:24:41.221763 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-gst9d"
Mar 09 13:24:41 crc kubenswrapper[4764]: I0309 13:24:41.229801 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-gnnbl" podStartSLOduration=171.229779093 podStartE2EDuration="2m51.229779093s" podCreationTimestamp="2026-03-09 13:21:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:24:41.217721369 +0000 UTC m=+236.467893277" watchObservedRunningTime="2026-03-09 13:24:41.229779093 +0000 UTC m=+236.479951011"
Mar 09 13:24:41 crc kubenswrapper[4764]: I0309 13:24:41.242848 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-8g9lj" event={"ID":"a1b4dc0b-edea-4c0d-8d61-3e3d3133605d","Type":"ContainerStarted","Data":"472485e262204415197653c119165e15c1dde77e4c46e1bf7f2b023ba959af99"}
Mar 09 13:24:41 crc kubenswrapper[4764]: I0309 13:24:41.249419 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-ptbpd"]
Mar 09 13:24:41 crc kubenswrapper[4764]: I0309 13:24:41.266196 4764
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-nj856" event={"ID":"f9b3244b-8df0-4330-9887-4092260d416a","Type":"ContainerStarted","Data":"5e8fcdb2f8638bd16f9f163bfced3472de42f2c6547d60287a9a8804c7cc74e3"} Mar 09 13:24:41 crc kubenswrapper[4764]: I0309 13:24:41.267415 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-nj856" Mar 09 13:24:41 crc kubenswrapper[4764]: I0309 13:24:41.274452 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-b2pz8"] Mar 09 13:24:41 crc kubenswrapper[4764]: I0309 13:24:41.277143 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-6brcn"] Mar 09 13:24:41 crc kubenswrapper[4764]: I0309 13:24:41.279044 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-kdxg4" podStartSLOduration=5.279018277 podStartE2EDuration="5.279018277s" podCreationTimestamp="2026-03-09 13:24:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:24:41.268674699 +0000 UTC m=+236.518846607" watchObservedRunningTime="2026-03-09 13:24:41.279018277 +0000 UTC m=+236.529190175" Mar 09 13:24:41 crc kubenswrapper[4764]: I0309 13:24:41.282615 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-xfvtf"] Mar 09 13:24:41 crc kubenswrapper[4764]: I0309 13:24:41.284713 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-d4gwh"] Mar 09 13:24:41 crc kubenswrapper[4764]: I0309 13:24:41.297405 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-api/machine-api-operator-5694c8668f-t8ft9" event={"ID":"4125448d-5832-43c2-8dba-d95adde7458a","Type":"ContainerStarted","Data":"974a6a0feae12692003e99871ae67b10d7691df07927796cce4716a1be304c39"} Mar 09 13:24:41 crc kubenswrapper[4764]: I0309 13:24:41.297533 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-t8ft9" event={"ID":"4125448d-5832-43c2-8dba-d95adde7458a","Type":"ContainerStarted","Data":"b690fd6d76bc82d1b0b1264df224431c765be00012875b5c40608249817b21f6"} Mar 09 13:24:41 crc kubenswrapper[4764]: I0309 13:24:41.304055 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-w49hn\" (UID: \"d3652fe0-4889-432f-af3f-787dd19c60d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-w49hn" Mar 09 13:24:41 crc kubenswrapper[4764]: E0309 13:24:41.304477 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 13:24:41.804460631 +0000 UTC m=+237.054632539 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-w49hn" (UID: "d3652fe0-4889-432f-af3f-787dd19c60d6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 13:24:41 crc kubenswrapper[4764]: I0309 13:24:41.316209 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-tgqwl" Mar 09 13:24:41 crc kubenswrapper[4764]: I0309 13:24:41.316288 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mvq6r" Mar 09 13:24:41 crc kubenswrapper[4764]: I0309 13:24:41.319614 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-tvsss"] Mar 09 13:24:41 crc kubenswrapper[4764]: I0309 13:24:41.356635 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-c2m6k" podStartSLOduration=171.356610893 podStartE2EDuration="2m51.356610893s" podCreationTimestamp="2026-03-09 13:21:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:24:41.33677884 +0000 UTC m=+236.586950758" watchObservedRunningTime="2026-03-09 13:24:41.356610893 +0000 UTC m=+236.606782811" Mar 09 13:24:41 crc kubenswrapper[4764]: I0309 13:24:41.388685 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-8g9lj" podStartSLOduration=171.388671455 podStartE2EDuration="2m51.388671455s" podCreationTimestamp="2026-03-09 13:21:50 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:24:41.38809454 +0000 UTC m=+236.638266438" watchObservedRunningTime="2026-03-09 13:24:41.388671455 +0000 UTC m=+236.638843363" Mar 09 13:24:41 crc kubenswrapper[4764]: I0309 13:24:41.410870 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 13:24:41 crc kubenswrapper[4764]: E0309 13:24:41.412415 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 13:24:41.912399973 +0000 UTC m=+237.162571881 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 13:24:41 crc kubenswrapper[4764]: I0309 13:24:41.513254 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-w49hn\" (UID: \"d3652fe0-4889-432f-af3f-787dd19c60d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-w49hn" Mar 09 13:24:41 crc kubenswrapper[4764]: E0309 13:24:41.513907 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 13:24:42.013894162 +0000 UTC m=+237.264066070 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-w49hn" (UID: "d3652fe0-4889-432f-af3f-787dd19c60d6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 13:24:41 crc kubenswrapper[4764]: I0309 13:24:41.518245 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-nnrmm" podStartSLOduration=171.518220379 podStartE2EDuration="2m51.518220379s" podCreationTimestamp="2026-03-09 13:21:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:24:41.454559717 +0000 UTC m=+236.704731625" watchObservedRunningTime="2026-03-09 13:24:41.518220379 +0000 UTC m=+236.768392287" Mar 09 13:24:41 crc kubenswrapper[4764]: I0309 13:24:41.519449 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-nj856" podStartSLOduration=171.519441941 podStartE2EDuration="2m51.519441941s" podCreationTimestamp="2026-03-09 13:21:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:24:41.517370486 +0000 UTC m=+236.767542414" watchObservedRunningTime="2026-03-09 13:24:41.519441941 +0000 UTC m=+236.769613849" Mar 09 13:24:41 crc kubenswrapper[4764]: I0309 13:24:41.553940 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-b6nrf"] Mar 09 13:24:41 crc kubenswrapper[4764]: I0309 13:24:41.561883 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-machine-api/machine-api-operator-5694c8668f-t8ft9" podStartSLOduration=170.561839701 podStartE2EDuration="2m50.561839701s" podCreationTimestamp="2026-03-09 13:21:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:24:41.561841501 +0000 UTC m=+236.812013409" watchObservedRunningTime="2026-03-09 13:24:41.561839701 +0000 UTC m=+236.812011609" Mar 09 13:24:41 crc kubenswrapper[4764]: I0309 13:24:41.586809 4764 patch_prober.go:28] interesting pod/apiserver-76f77b778f-gst9d container/openshift-apiserver namespace/openshift-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Mar 09 13:24:41 crc kubenswrapper[4764]: [+]log ok Mar 09 13:24:41 crc kubenswrapper[4764]: [+]etcd ok Mar 09 13:24:41 crc kubenswrapper[4764]: [+]poststarthook/start-apiserver-admission-initializer ok Mar 09 13:24:41 crc kubenswrapper[4764]: [+]poststarthook/generic-apiserver-start-informers ok Mar 09 13:24:41 crc kubenswrapper[4764]: [+]poststarthook/max-in-flight-filter ok Mar 09 13:24:41 crc kubenswrapper[4764]: [+]poststarthook/storage-object-count-tracker-hook ok Mar 09 13:24:41 crc kubenswrapper[4764]: [+]poststarthook/image.openshift.io-apiserver-caches ok Mar 09 13:24:41 crc kubenswrapper[4764]: [-]poststarthook/authorization.openshift.io-bootstrapclusterroles failed: reason withheld Mar 09 13:24:41 crc kubenswrapper[4764]: [-]poststarthook/authorization.openshift.io-ensurenodebootstrap-sa failed: reason withheld Mar 09 13:24:41 crc kubenswrapper[4764]: [+]poststarthook/project.openshift.io-projectcache ok Mar 09 13:24:41 crc kubenswrapper[4764]: [+]poststarthook/project.openshift.io-projectauthorizationcache ok Mar 09 13:24:41 crc kubenswrapper[4764]: [+]poststarthook/openshift.io-startinformers ok Mar 09 13:24:41 crc kubenswrapper[4764]: [+]poststarthook/openshift.io-restmapperupdater ok Mar 09 13:24:41 crc 
kubenswrapper[4764]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Mar 09 13:24:41 crc kubenswrapper[4764]: livez check failed Mar 09 13:24:41 crc kubenswrapper[4764]: I0309 13:24:41.586877 4764 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-apiserver/apiserver-76f77b778f-gst9d" podUID="1ffb8d96-e6e4-4859-ae7d-37f900979485" containerName="openshift-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 09 13:24:41 crc kubenswrapper[4764]: I0309 13:24:41.619167 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 13:24:41 crc kubenswrapper[4764]: E0309 13:24:41.619512 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 13:24:42.119497142 +0000 UTC m=+237.369669040 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 13:24:41 crc kubenswrapper[4764]: I0309 13:24:41.720832 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-w49hn\" (UID: \"d3652fe0-4889-432f-af3f-787dd19c60d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-w49hn" Mar 09 13:24:41 crc kubenswrapper[4764]: E0309 13:24:41.721251 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 13:24:42.221235697 +0000 UTC m=+237.471407605 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-w49hn" (UID: "d3652fe0-4889-432f-af3f-787dd19c60d6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 13:24:41 crc kubenswrapper[4764]: I0309 13:24:41.730577 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-hf98d"] Mar 09 13:24:41 crc kubenswrapper[4764]: I0309 13:24:41.731041 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-cd5zc"] Mar 09 13:24:41 crc kubenswrapper[4764]: I0309 13:24:41.731213 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-2bqx9"] Mar 09 13:24:41 crc kubenswrapper[4764]: I0309 13:24:41.731299 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-mn5w6"] Mar 09 13:24:41 crc kubenswrapper[4764]: I0309 13:24:41.737850 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-m4bs6"] Mar 09 13:24:41 crc kubenswrapper[4764]: I0309 13:24:41.740824 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-6nhpx"] Mar 09 13:24:41 crc kubenswrapper[4764]: I0309 13:24:41.745399 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-qddjs"] Mar 09 13:24:41 crc kubenswrapper[4764]: I0309 13:24:41.752687 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-kt67m"] Mar 09 13:24:41 crc kubenswrapper[4764]: I0309 
13:24:41.785430 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-x927s"] Mar 09 13:24:41 crc kubenswrapper[4764]: I0309 13:24:41.796046 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-gnnbl" Mar 09 13:24:41 crc kubenswrapper[4764]: I0309 13:24:41.803061 4764 patch_prober.go:28] interesting pod/router-default-5444994796-gnnbl container/router namespace/openshift-ingress: Startup probe status=failure output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused" start-of-body= Mar 09 13:24:41 crc kubenswrapper[4764]: I0309 13:24:41.803235 4764 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-gnnbl" podUID="b9c0d96b-ed96-4925-b890-8743879a8b38" containerName="router" probeResult="failure" output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused" Mar 09 13:24:41 crc kubenswrapper[4764]: I0309 13:24:41.821815 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 13:24:41 crc kubenswrapper[4764]: E0309 13:24:41.822178 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 13:24:42.32215105 +0000 UTC m=+237.572322958 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 13:24:41 crc kubenswrapper[4764]: I0309 13:24:41.822270 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-w49hn\" (UID: \"d3652fe0-4889-432f-af3f-787dd19c60d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-w49hn" Mar 09 13:24:41 crc kubenswrapper[4764]: E0309 13:24:41.824622 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 13:24:42.324607616 +0000 UTC m=+237.574779524 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-w49hn" (UID: "d3652fe0-4889-432f-af3f-787dd19c60d6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 13:24:41 crc kubenswrapper[4764]: I0309 13:24:41.892769 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-k96kg"] Mar 09 13:24:41 crc kubenswrapper[4764]: I0309 13:24:41.923208 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 13:24:41 crc kubenswrapper[4764]: E0309 13:24:41.923548 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 13:24:42.423533836 +0000 UTC m=+237.673705744 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 13:24:41 crc kubenswrapper[4764]: W0309 13:24:41.997101 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb7bf830c_e91a_4dd1_a3b5_64ca95d57e44.slice/crio-51e42dcb16123e5cf2fd2b0d634046ecad5b3c0788df6e54acad60f0073d6310 WatchSource:0}: Error finding container 51e42dcb16123e5cf2fd2b0d634046ecad5b3c0788df6e54acad60f0073d6310: Status 404 returned error can't find the container with id 51e42dcb16123e5cf2fd2b0d634046ecad5b3c0788df6e54acad60f0073d6310 Mar 09 13:24:42 crc kubenswrapper[4764]: I0309 13:24:42.043466 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-w49hn\" (UID: \"d3652fe0-4889-432f-af3f-787dd19c60d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-w49hn" Mar 09 13:24:42 crc kubenswrapper[4764]: E0309 13:24:42.043765 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 13:24:42.543754198 +0000 UTC m=+237.793926106 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-w49hn" (UID: "d3652fe0-4889-432f-af3f-787dd19c60d6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 13:24:42 crc kubenswrapper[4764]: I0309 13:24:42.144266 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 13:24:42 crc kubenswrapper[4764]: E0309 13:24:42.144461 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 13:24:42.644428975 +0000 UTC m=+237.894600893 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 13:24:42 crc kubenswrapper[4764]: I0309 13:24:42.144579 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-w49hn\" (UID: \"d3652fe0-4889-432f-af3f-787dd19c60d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-w49hn" Mar 09 13:24:42 crc kubenswrapper[4764]: E0309 13:24:42.144949 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 13:24:42.644933859 +0000 UTC m=+237.895105767 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-w49hn" (UID: "d3652fe0-4889-432f-af3f-787dd19c60d6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 13:24:42 crc kubenswrapper[4764]: I0309 13:24:42.246742 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 13:24:42 crc kubenswrapper[4764]: E0309 13:24:42.246841 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 13:24:42.746822257 +0000 UTC m=+237.996994165 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 13:24:42 crc kubenswrapper[4764]: I0309 13:24:42.247274 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-w49hn\" (UID: \"d3652fe0-4889-432f-af3f-787dd19c60d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-w49hn" Mar 09 13:24:42 crc kubenswrapper[4764]: E0309 13:24:42.247609 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 13:24:42.747602048 +0000 UTC m=+237.997773956 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-w49hn" (UID: "d3652fe0-4889-432f-af3f-787dd19c60d6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 13:24:42 crc kubenswrapper[4764]: I0309 13:24:42.257323 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-tgqwl"] Mar 09 13:24:42 crc kubenswrapper[4764]: I0309 13:24:42.265013 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-nj856" Mar 09 13:24:42 crc kubenswrapper[4764]: I0309 13:24:42.327486 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-cvsb9" event={"ID":"2968d8f2-48fa-470b-b90e-41bb83bd77c4","Type":"ContainerStarted","Data":"3c841173303d5e946e28133a9f7209f8258a96e0d394cc6cb05219cd1ad5cd80"} Mar 09 13:24:42 crc kubenswrapper[4764]: I0309 13:24:42.327544 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-cvsb9" event={"ID":"2968d8f2-48fa-470b-b90e-41bb83bd77c4","Type":"ContainerStarted","Data":"e882a241a6b64098b307bd65302f593b109c7616d0dfcd5af3432be62bac1490"} Mar 09 13:24:42 crc kubenswrapper[4764]: I0309 13:24:42.329181 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-9k28f" event={"ID":"4ba55602-0e3f-4722-b437-546732351bc4","Type":"ContainerStarted","Data":"ce0e28aa70d3b1872a1159ed0a983773af51f49c62d7d29ccea3a54c85809714"} Mar 09 13:24:42 crc kubenswrapper[4764]: I0309 13:24:42.329201 4764 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-9k28f" event={"ID":"4ba55602-0e3f-4722-b437-546732351bc4","Type":"ContainerStarted","Data":"ce323e500dd6968c2c9c8d88fa53de8fd906e1738bf2fd3a70a4cc1dbd027d43"} Mar 09 13:24:42 crc kubenswrapper[4764]: I0309 13:24:42.330462 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-xfvtf" event={"ID":"711447e6-e7cf-4577-8050-b5a391f96f6a","Type":"ContainerStarted","Data":"d1aeac6ab405adbedcfbf26d14673b870c7935fc9017a2f997ff017a10884246"} Mar 09 13:24:42 crc kubenswrapper[4764]: I0309 13:24:42.332230 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-mn5w6" event={"ID":"36228b47-79c7-484d-9753-6f36806aa344","Type":"ContainerStarted","Data":"2a85ff48150941711bc5dc6f6e1946e85fc51e5e1ee8355d96efd19f94408a1b"} Mar 09 13:24:42 crc kubenswrapper[4764]: I0309 13:24:42.347814 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 13:24:42 crc kubenswrapper[4764]: E0309 13:24:42.348293 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 13:24:42.848278355 +0000 UTC m=+238.098450263 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 13:24:42 crc kubenswrapper[4764]: I0309 13:24:42.356824 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-84448" event={"ID":"e025e897-5ff3-476b-81c9-afdd0ae7a25f","Type":"ContainerStarted","Data":"683a86869b3cf186005a2d8d66a9152062ee1aa672b9fadbcc0e02bd95576ecb"} Mar 09 13:24:42 crc kubenswrapper[4764]: I0309 13:24:42.421483 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-cd5zc" event={"ID":"39249ace-8681-491c-a547-869e72d297a7","Type":"ContainerStarted","Data":"e130962f3c9e2e9cc9876329a3337db4ebea068c866a0f3f7b18814bee37cbac"} Mar 09 13:24:42 crc kubenswrapper[4764]: I0309 13:24:42.434862 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-tvsss" event={"ID":"d0959a00-2a83-457f-bcba-7d4af48b11c3","Type":"ContainerStarted","Data":"46c216dd80fead93f3b79f0eb2a4d786abd149af408a05f24bc985c61274f03d"} Mar 09 13:24:42 crc kubenswrapper[4764]: I0309 13:24:42.439546 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-mp9p7" event={"ID":"f95ed010-a6a4-49ab-b61b-fc4ee2d856bb","Type":"ContainerStarted","Data":"4f9046706a3f827f5fd453c79a14e6921bf3de7ff8fea0a8fcd6339057fbf22a"} Mar 09 13:24:42 crc kubenswrapper[4764]: I0309 13:24:42.447960 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operator-lifecycle-manager/collect-profiles-29551035-5fpgn" event={"ID":"39da5087-79bc-4154-b340-22183d9e4417","Type":"ContainerStarted","Data":"45383c97959a17ffdeaa9f0ab6e8a1110b113c90d84de0fa490663169c04fa26"} Mar 09 13:24:42 crc kubenswrapper[4764]: I0309 13:24:42.449023 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-w49hn\" (UID: \"d3652fe0-4889-432f-af3f-787dd19c60d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-w49hn" Mar 09 13:24:42 crc kubenswrapper[4764]: E0309 13:24:42.450960 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 13:24:42.950948325 +0000 UTC m=+238.201120233 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-w49hn" (UID: "d3652fe0-4889-432f-af3f-787dd19c60d6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 13:24:42 crc kubenswrapper[4764]: I0309 13:24:42.493933 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-k96kg" event={"ID":"b7bf830c-e91a-4dd1-a3b5-64ca95d57e44","Type":"ContainerStarted","Data":"51e42dcb16123e5cf2fd2b0d634046ecad5b3c0788df6e54acad60f0073d6310"} Mar 09 13:24:42 crc kubenswrapper[4764]: I0309 13:24:42.494719 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-mvq6r"] Mar 09 13:24:42 crc kubenswrapper[4764]: I0309 13:24:42.520604 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-cvsb9" podStartSLOduration=172.520582147 podStartE2EDuration="2m52.520582147s" podCreationTimestamp="2026-03-09 13:21:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:24:42.494838185 +0000 UTC m=+237.745010093" watchObservedRunningTime="2026-03-09 13:24:42.520582147 +0000 UTC m=+237.770754055" Mar 09 13:24:42 crc kubenswrapper[4764]: I0309 13:24:42.530951 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-d4gwh" event={"ID":"1ccc5b44-95ad-4f4c-8086-c176c41bbd19","Type":"ContainerStarted","Data":"771cd63965fde5f5f03cba604e9f4e1989cf6a4881a27fbd710be5727898d90a"} Mar 09 13:24:42 crc kubenswrapper[4764]: I0309 13:24:42.545827 4764 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-9k28f" podStartSLOduration=171.545798206 podStartE2EDuration="2m51.545798206s" podCreationTimestamp="2026-03-09 13:21:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:24:42.542353013 +0000 UTC m=+237.792524921" watchObservedRunningTime="2026-03-09 13:24:42.545798206 +0000 UTC m=+237.795970114" Mar 09 13:24:42 crc kubenswrapper[4764]: I0309 13:24:42.546279 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-6brcn" event={"ID":"1fe7eff7-ce11-4d46-bf25-06162522c1ff","Type":"ContainerStarted","Data":"8e8672c3608b3ac01b268f2222a4a8cd4ad621c1058e37144ca6a9c8fec3428d"} Mar 09 13:24:42 crc kubenswrapper[4764]: I0309 13:24:42.546339 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-6brcn" event={"ID":"1fe7eff7-ce11-4d46-bf25-06162522c1ff","Type":"ContainerStarted","Data":"b76cc8470833290a8cdc0af534890facef6330d24592cf5cab7eff90a97b01b4"} Mar 09 13:24:42 crc kubenswrapper[4764]: I0309 13:24:42.568803 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 13:24:42 crc kubenswrapper[4764]: E0309 13:24:42.569901 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-09 13:24:43.069885673 +0000 UTC m=+238.320057581 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 13:24:42 crc kubenswrapper[4764]: I0309 13:24:42.586726 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-2pfhk" event={"ID":"4912a02a-743d-4bbe-9063-7d99ccd3329a","Type":"ContainerStarted","Data":"bd09da2955adc131fc7c3ddd3df7a6b960386a79e7cc0b7bb14222b804484b3c"} Mar 09 13:24:42 crc kubenswrapper[4764]: I0309 13:24:42.586770 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-2pfhk" event={"ID":"4912a02a-743d-4bbe-9063-7d99ccd3329a","Type":"ContainerStarted","Data":"9ba7188c7b70176f3445c4dce5f61edb07ec68251679c54544de9639e45e6110"} Mar 09 13:24:42 crc kubenswrapper[4764]: I0309 13:24:42.597225 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-ptbpd" event={"ID":"30a07c97-9d99-41be-956e-ba3d6505d318","Type":"ContainerStarted","Data":"52ff6a9b11622efde32cfd2c095ae29f67209b68a6062d6f9f3d39a5e5954bcb"} Mar 09 13:24:42 crc kubenswrapper[4764]: I0309 13:24:42.612332 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-m4bs6" event={"ID":"14b4aa8c-1066-4388-9442-07722e4c76c2","Type":"ContainerStarted","Data":"6e0609643514d25b899aa0fec5020ebe1e5f97db34a9c6358382637eb561b8bf"} Mar 09 13:24:42 crc kubenswrapper[4764]: I0309 
13:24:42.613623 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-6nhpx" event={"ID":"0c464093-f30e-4c79-84cd-98a6d49b813c","Type":"ContainerStarted","Data":"6fda0449cd3de2ac61b5306b39c5053b3087e83da3f97ebe92a6989af5624216"} Mar 09 13:24:42 crc kubenswrapper[4764]: I0309 13:24:42.629068 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-mp9p7" podStartSLOduration=171.629050654 podStartE2EDuration="2m51.629050654s" podCreationTimestamp="2026-03-09 13:21:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:24:42.626733391 +0000 UTC m=+237.876905309" watchObservedRunningTime="2026-03-09 13:24:42.629050654 +0000 UTC m=+237.879222562" Mar 09 13:24:42 crc kubenswrapper[4764]: I0309 13:24:42.630785 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-kt67m" event={"ID":"f4f31f9d-d2da-4e3d-b002-2f477d344fc0","Type":"ContainerStarted","Data":"018ed989a89604caa549e9933a3e4afacd7cdbe9d8fb62d4a234b3d49348aead"} Mar 09 13:24:42 crc kubenswrapper[4764]: I0309 13:24:42.666770 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-fxv7j" event={"ID":"61b85db0-a292-42a8-8296-d0e476d80c89","Type":"ContainerStarted","Data":"dc18497145249da5d49ff9ecb83795ba7f30271084e17b522ebdc49b11a91a79"} Mar 09 13:24:42 crc kubenswrapper[4764]: I0309 13:24:42.667401 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-fxv7j" Mar 09 13:24:42 crc kubenswrapper[4764]: I0309 13:24:42.670619 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-w49hn\" (UID: \"d3652fe0-4889-432f-af3f-787dd19c60d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-w49hn" Mar 09 13:24:42 crc kubenswrapper[4764]: E0309 13:24:42.688263 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 13:24:43.188233235 +0000 UTC m=+238.438405143 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-w49hn" (UID: "d3652fe0-4889-432f-af3f-787dd19c60d6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 13:24:42 crc kubenswrapper[4764]: I0309 13:24:42.688432 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-b2pz8" event={"ID":"db0ec273-54d8-4753-b519-243b727a9efd","Type":"ContainerStarted","Data":"85384ed843ba7525776bc0e37762a1506fee5aed8c8b4de5a63a312c01aa5397"} Mar 09 13:24:42 crc kubenswrapper[4764]: I0309 13:24:42.708554 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-b6nrf" event={"ID":"2c795327-bff9-453d-b1b2-d48a2ef2b48f","Type":"ContainerStarted","Data":"0e8d6fc6921c311c06fdbd8c2b577d743a2f36ee5852e92052d728ca4e7e7b9e"} Mar 09 13:24:42 crc kubenswrapper[4764]: I0309 13:24:42.709577 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-2bqx9" 
event={"ID":"e56f29e6-292c-48d2-a5d7-531cfa6689f5","Type":"ContainerStarted","Data":"4b71e28be35b18ba52af2a6b9e7dfe5e124ed335ec95f1aabd4ac0cb31e12461"} Mar 09 13:24:42 crc kubenswrapper[4764]: I0309 13:24:42.707867 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-2pfhk" podStartSLOduration=171.707843802 podStartE2EDuration="2m51.707843802s" podCreationTimestamp="2026-03-09 13:21:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:24:42.706778294 +0000 UTC m=+237.956950202" watchObservedRunningTime="2026-03-09 13:24:42.707843802 +0000 UTC m=+237.958015710" Mar 09 13:24:42 crc kubenswrapper[4764]: I0309 13:24:42.712934 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-6brcn" podStartSLOduration=171.712914439 podStartE2EDuration="2m51.712914439s" podCreationTimestamp="2026-03-09 13:21:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:24:42.666280325 +0000 UTC m=+237.916452233" watchObservedRunningTime="2026-03-09 13:24:42.712914439 +0000 UTC m=+237.963086357" Mar 09 13:24:42 crc kubenswrapper[4764]: I0309 13:24:42.755902 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-fxv7j" podStartSLOduration=172.755881114 podStartE2EDuration="2m52.755881114s" podCreationTimestamp="2026-03-09 13:21:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:24:42.755248807 +0000 UTC m=+238.005420735" watchObservedRunningTime="2026-03-09 13:24:42.755881114 +0000 UTC 
m=+238.006053022" Mar 09 13:24:42 crc kubenswrapper[4764]: I0309 13:24:42.769336 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-hf98d" event={"ID":"957cfae4-9b1a-4e7f-b526-60bc1ee1a1b4","Type":"ContainerStarted","Data":"fb969b9bf90fa52524547e416ef64f049b665c02a0cee1fda383ad6afb853030"} Mar 09 13:24:42 crc kubenswrapper[4764]: I0309 13:24:42.788472 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 13:24:42 crc kubenswrapper[4764]: E0309 13:24:42.789316 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 13:24:43.289263301 +0000 UTC m=+238.539435209 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 13:24:42 crc kubenswrapper[4764]: I0309 13:24:42.802939 4764 patch_prober.go:28] interesting pod/router-default-5444994796-gnnbl container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 09 13:24:42 crc kubenswrapper[4764]: [-]has-synced failed: reason withheld Mar 09 13:24:42 crc kubenswrapper[4764]: [+]process-running ok Mar 09 13:24:42 crc kubenswrapper[4764]: healthz check failed Mar 09 13:24:42 crc kubenswrapper[4764]: I0309 13:24:42.803068 4764 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-gnnbl" podUID="b9c0d96b-ed96-4925-b890-8743879a8b38" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 09 13:24:42 crc kubenswrapper[4764]: I0309 13:24:42.823993 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-x927s" event={"ID":"d245b116-e47d-4b15-a40d-0a9fa34cf1df","Type":"ContainerStarted","Data":"fb8983fcce0c65bb236e4bf6391a7232453b84690f04abd146b7159e644824b5"} Mar 09 13:24:42 crc kubenswrapper[4764]: I0309 13:24:42.837131 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-qddjs" event={"ID":"24df0ad8-c9b4-46b6-8751-23e26fc391c5","Type":"ContainerStarted","Data":"aa04e64eb67113f09ca7b1961f1e7e1abe40f9ad1e80aa1a5c846db9de1e02ba"} Mar 09 13:24:42 crc kubenswrapper[4764]: I0309 13:24:42.847683 4764 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-svr2n" event={"ID":"8d3de7d9-c3ef-4ead-8ff7-81d34bf2e497","Type":"ContainerStarted","Data":"bc0446f3c4009aefe59e54fcf22af3e8b6492a5f7a324e98c3ceaef95c9107b9"} Mar 09 13:24:42 crc kubenswrapper[4764]: I0309 13:24:42.847998 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-svr2n" event={"ID":"8d3de7d9-c3ef-4ead-8ff7-81d34bf2e497","Type":"ContainerStarted","Data":"acc1adb0f9a7d2151a697693548447578320be7a5c1a7959581039acf6e2722f"} Mar 09 13:24:42 crc kubenswrapper[4764]: I0309 13:24:42.856466 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-5g2pp" event={"ID":"ad307984-e46b-466b-8a5c-63a00976fbbf","Type":"ContainerStarted","Data":"6ec0ff7f4f3afbfa3ba79233bf901cbdeab145e0b36f5500af8fdfe0fae981f4"} Mar 09 13:24:42 crc kubenswrapper[4764]: I0309 13:24:42.856509 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-5g2pp" event={"ID":"ad307984-e46b-466b-8a5c-63a00976fbbf","Type":"ContainerStarted","Data":"a77f85524f39dfdce8514bbbba00b0efcc11abc1b64d546d6fe4425c458d4b10"} Mar 09 13:24:42 crc kubenswrapper[4764]: I0309 13:24:42.866185 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-m9kmt" event={"ID":"b72bd4db-e5ea-44f6-bdce-81df2966acfb","Type":"ContainerStarted","Data":"b9c2d68b1b05bc39c19dc9c0d1b5c51309a4b0fc69bc58ae3f2b3b3980758063"} Mar 09 13:24:42 crc kubenswrapper[4764]: I0309 13:24:42.868859 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-m9kmt" Mar 09 13:24:42 crc kubenswrapper[4764]: I0309 13:24:42.893056 4764 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-svr2n" podStartSLOduration=171.893037092 podStartE2EDuration="2m51.893037092s" podCreationTimestamp="2026-03-09 13:21:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:24:42.883564507 +0000 UTC m=+238.133736425" watchObservedRunningTime="2026-03-09 13:24:42.893037092 +0000 UTC m=+238.143209000" Mar 09 13:24:42 crc kubenswrapper[4764]: I0309 13:24:42.893441 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-w49hn\" (UID: \"d3652fe0-4889-432f-af3f-787dd19c60d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-w49hn" Mar 09 13:24:42 crc kubenswrapper[4764]: E0309 13:24:42.907667 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 13:24:43.407651525 +0000 UTC m=+238.657823433 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-w49hn" (UID: "d3652fe0-4889-432f-af3f-787dd19c60d6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 13:24:42 crc kubenswrapper[4764]: I0309 13:24:42.943086 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-sp2mq" Mar 09 13:24:42 crc kubenswrapper[4764]: I0309 13:24:42.960194 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-m9kmt" Mar 09 13:24:42 crc kubenswrapper[4764]: I0309 13:24:42.997755 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-m9kmt" podStartSLOduration=171.997738907 podStartE2EDuration="2m51.997738907s" podCreationTimestamp="2026-03-09 13:21:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:24:42.997043388 +0000 UTC m=+238.247215296" watchObservedRunningTime="2026-03-09 13:24:42.997738907 +0000 UTC m=+238.247910815" Mar 09 13:24:42 crc kubenswrapper[4764]: I0309 13:24:42.999937 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 13:24:43 crc kubenswrapper[4764]: E0309 13:24:43.000111 4764 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 13:24:43.50008812 +0000 UTC m=+238.750260038 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 13:24:43 crc kubenswrapper[4764]: I0309 13:24:43.000734 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-w49hn\" (UID: \"d3652fe0-4889-432f-af3f-787dd19c60d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-w49hn" Mar 09 13:24:43 crc kubenswrapper[4764]: E0309 13:24:43.001067 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 13:24:43.501054216 +0000 UTC m=+238.751226124 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-w49hn" (UID: "d3652fe0-4889-432f-af3f-787dd19c60d6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 13:24:43 crc kubenswrapper[4764]: I0309 13:24:43.105801 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 13:24:43 crc kubenswrapper[4764]: E0309 13:24:43.106010 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 13:24:43.605980607 +0000 UTC m=+238.856152515 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 13:24:43 crc kubenswrapper[4764]: I0309 13:24:43.106473 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-w49hn\" (UID: \"d3652fe0-4889-432f-af3f-787dd19c60d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-w49hn" Mar 09 13:24:43 crc kubenswrapper[4764]: E0309 13:24:43.106828 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 13:24:43.606815959 +0000 UTC m=+238.856987857 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-w49hn" (UID: "d3652fe0-4889-432f-af3f-787dd19c60d6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 09 13:24:43 crc kubenswrapper[4764]: I0309 13:24:43.124240 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-5g2pp" podStartSLOduration=173.124223257 podStartE2EDuration="2m53.124223257s" podCreationTimestamp="2026-03-09 13:21:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:24:43.032909242 +0000 UTC m=+238.283081150" watchObservedRunningTime="2026-03-09 13:24:43.124223257 +0000 UTC m=+238.374395165"
Mar 09 13:24:43 crc kubenswrapper[4764]: I0309 13:24:43.208485 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 09 13:24:43 crc kubenswrapper[4764]: E0309 13:24:43.208955 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 13:24:43.708941455 +0000 UTC m=+238.959113363 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 09 13:24:43 crc kubenswrapper[4764]: I0309 13:24:43.242503 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-mp9p7"
Mar 09 13:24:43 crc kubenswrapper[4764]: I0309 13:24:43.242928 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-mp9p7"
Mar 09 13:24:43 crc kubenswrapper[4764]: I0309 13:24:43.287314 4764 patch_prober.go:28] interesting pod/apiserver-7bbb656c7d-mp9p7 container/oauth-apiserver namespace/openshift-oauth-apiserver: Startup probe status=failure output="Get \"https://10.217.0.32:8443/livez\": dial tcp 10.217.0.32:8443: connect: connection refused" start-of-body=
Mar 09 13:24:43 crc kubenswrapper[4764]: I0309 13:24:43.287391 4764 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-mp9p7" podUID="f95ed010-a6a4-49ab-b61b-fc4ee2d856bb" containerName="oauth-apiserver" probeResult="failure" output="Get \"https://10.217.0.32:8443/livez\": dial tcp 10.217.0.32:8443: connect: connection refused"
Mar 09 13:24:43 crc kubenswrapper[4764]: I0309 13:24:43.309842 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-w49hn\" (UID: \"d3652fe0-4889-432f-af3f-787dd19c60d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-w49hn"
Mar 09 13:24:43 crc kubenswrapper[4764]: E0309 13:24:43.310195 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 13:24:43.810181247 +0000 UTC m=+239.060353155 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-w49hn" (UID: "d3652fe0-4889-432f-af3f-787dd19c60d6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 09 13:24:43 crc kubenswrapper[4764]: I0309 13:24:43.411637 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 09 13:24:43 crc kubenswrapper[4764]: E0309 13:24:43.412094 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 13:24:43.912077747 +0000 UTC m=+239.162249655 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 09 13:24:43 crc kubenswrapper[4764]: I0309 13:24:43.416891 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-w49hn\" (UID: \"d3652fe0-4889-432f-af3f-787dd19c60d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-w49hn"
Mar 09 13:24:43 crc kubenswrapper[4764]: E0309 13:24:43.417433 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 13:24:43.91741677 +0000 UTC m=+239.167588678 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-w49hn" (UID: "d3652fe0-4889-432f-af3f-787dd19c60d6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 09 13:24:43 crc kubenswrapper[4764]: I0309 13:24:43.518256 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 09 13:24:43 crc kubenswrapper[4764]: E0309 13:24:43.518695 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 13:24:44.018660843 +0000 UTC m=+239.268832751 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 09 13:24:43 crc kubenswrapper[4764]: I0309 13:24:43.623480 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-w49hn\" (UID: \"d3652fe0-4889-432f-af3f-787dd19c60d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-w49hn"
Mar 09 13:24:43 crc kubenswrapper[4764]: E0309 13:24:43.624203 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 13:24:44.12418878 +0000 UTC m=+239.374360688 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-w49hn" (UID: "d3652fe0-4889-432f-af3f-787dd19c60d6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 09 13:24:43 crc kubenswrapper[4764]: I0309 13:24:43.725005 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 09 13:24:43 crc kubenswrapper[4764]: E0309 13:24:43.725465 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 13:24:44.225448593 +0000 UTC m=+239.475620501 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 09 13:24:43 crc kubenswrapper[4764]: I0309 13:24:43.814405 4764 patch_prober.go:28] interesting pod/router-default-5444994796-gnnbl container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 09 13:24:43 crc kubenswrapper[4764]: [-]has-synced failed: reason withheld
Mar 09 13:24:43 crc kubenswrapper[4764]: [+]process-running ok
Mar 09 13:24:43 crc kubenswrapper[4764]: healthz check failed
Mar 09 13:24:43 crc kubenswrapper[4764]: I0309 13:24:43.814448 4764 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-gnnbl" podUID="b9c0d96b-ed96-4925-b890-8743879a8b38" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 09 13:24:43 crc kubenswrapper[4764]: I0309 13:24:43.827317 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-w49hn\" (UID: \"d3652fe0-4889-432f-af3f-787dd19c60d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-w49hn"
Mar 09 13:24:43 crc kubenswrapper[4764]: E0309 13:24:43.827652 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 13:24:44.32763511 +0000 UTC m=+239.577807018 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-w49hn" (UID: "d3652fe0-4889-432f-af3f-787dd19c60d6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 09 13:24:43 crc kubenswrapper[4764]: I0309 13:24:43.898064 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-k96kg" event={"ID":"b7bf830c-e91a-4dd1-a3b5-64ca95d57e44","Type":"ContainerStarted","Data":"5a4c9d2a72e910ac851b9d01334a9a52e320d29fcd19cbd0b7048b2d972bdb8b"}
Mar 09 13:24:43 crc kubenswrapper[4764]: I0309 13:24:43.914147 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-cd5zc" event={"ID":"39249ace-8681-491c-a547-869e72d297a7","Type":"ContainerStarted","Data":"9dbfdb319cfbda4d2363de45293073dc4f68963073184760bddb42b83b7fd959"}
Mar 09 13:24:43 crc kubenswrapper[4764]: I0309 13:24:43.928461 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 09 13:24:43 crc kubenswrapper[4764]: E0309 13:24:43.929140 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 13:24:44.429109028 +0000 UTC m=+239.679280936 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 09 13:24:43 crc kubenswrapper[4764]: I0309 13:24:43.949683 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-m4bs6" event={"ID":"14b4aa8c-1066-4388-9442-07722e4c76c2","Type":"ContainerStarted","Data":"239a4bfbac809dccd0d77ee5ebd983674866554d54f688f9884e5532b5af4bf3"}
Mar 09 13:24:43 crc kubenswrapper[4764]: I0309 13:24:43.950926 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-m4bs6"
Mar 09 13:24:43 crc kubenswrapper[4764]: I0309 13:24:43.952290 4764 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-m4bs6 container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.27:5443/healthz\": dial tcp 10.217.0.27:5443: connect: connection refused" start-of-body=
Mar 09 13:24:43 crc kubenswrapper[4764]: I0309 13:24:43.952332 4764 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-m4bs6" podUID="14b4aa8c-1066-4388-9442-07722e4c76c2" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.27:5443/healthz\": dial tcp 10.217.0.27:5443: connect: connection refused"
Mar 09 13:24:44 crc kubenswrapper[4764]: I0309 13:24:44.002047 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-m4bs6" podStartSLOduration=173.002030439 podStartE2EDuration="2m53.002030439s" podCreationTimestamp="2026-03-09 13:21:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:24:44.000685593 +0000 UTC m=+239.250857511" watchObservedRunningTime="2026-03-09 13:24:44.002030439 +0000 UTC m=+239.252202347"
Mar 09 13:24:44 crc kubenswrapper[4764]: I0309 13:24:44.003189 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-k96kg" podStartSLOduration=9.00318289 podStartE2EDuration="9.00318289s" podCreationTimestamp="2026-03-09 13:24:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:24:43.945750686 +0000 UTC m=+239.195922594" watchObservedRunningTime="2026-03-09 13:24:44.00318289 +0000 UTC m=+239.253354798"
Mar 09 13:24:44 crc kubenswrapper[4764]: I0309 13:24:44.008911 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-b6nrf" event={"ID":"2c795327-bff9-453d-b1b2-d48a2ef2b48f","Type":"ContainerStarted","Data":"0092591463bed38f0d9307f831756d3403c2c467907ed9ed4adf7df61b3e8e83"}
Mar 09 13:24:44 crc kubenswrapper[4764]: I0309 13:24:44.037991 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-w49hn\" (UID: \"d3652fe0-4889-432f-af3f-787dd19c60d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-w49hn"
Mar 09 13:24:44 crc kubenswrapper[4764]: E0309 13:24:44.039764 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 13:24:44.539747823 +0000 UTC m=+239.789919731 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-w49hn" (UID: "d3652fe0-4889-432f-af3f-787dd19c60d6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 09 13:24:44 crc kubenswrapper[4764]: I0309 13:24:44.042336 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-hf98d" event={"ID":"957cfae4-9b1a-4e7f-b526-60bc1ee1a1b4","Type":"ContainerStarted","Data":"0247f0ecbef2c5254057c30c71649d8fc73c61d8130087fef46f695974b0c5d0"}
Mar 09 13:24:44 crc kubenswrapper[4764]: I0309 13:24:44.083310 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-ptbpd" event={"ID":"30a07c97-9d99-41be-956e-ba3d6505d318","Type":"ContainerStarted","Data":"9ab3fcffeccd0977d687bad1b407d063e8a5dbd95b934ebdbc0879da24a9d363"}
Mar 09 13:24:44 crc kubenswrapper[4764]: I0309 13:24:44.084388 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-ptbpd"
Mar 09 13:24:44 crc kubenswrapper[4764]: I0309 13:24:44.091565 4764 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-ptbpd container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.25:8443/healthz\": dial tcp 10.217.0.25:8443: connect: connection refused" start-of-body=
Mar 09 13:24:44 crc kubenswrapper[4764]: I0309 13:24:44.091614 4764 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-ptbpd" podUID="30a07c97-9d99-41be-956e-ba3d6505d318" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.25:8443/healthz\": dial tcp 10.217.0.25:8443: connect: connection refused"
Mar 09 13:24:44 crc kubenswrapper[4764]: I0309 13:24:44.129750 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-d4gwh" event={"ID":"1ccc5b44-95ad-4f4c-8086-c176c41bbd19","Type":"ContainerStarted","Data":"2b23f753224c479f8f3f33754ceac00343e6409f6988048cb245d41d456b1666"}
Mar 09 13:24:44 crc kubenswrapper[4764]: I0309 13:24:44.129810 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-d4gwh"
Mar 09 13:24:44 crc kubenswrapper[4764]: I0309 13:24:44.141196 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 09 13:24:44 crc kubenswrapper[4764]: E0309 13:24:44.141871 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 13:24:44.641855198 +0000 UTC m=+239.892027106 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 09 13:24:44 crc kubenswrapper[4764]: I0309 13:24:44.142661 4764 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-d4gwh container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.33:8080/healthz\": dial tcp 10.217.0.33:8080: connect: connection refused" start-of-body=
Mar 09 13:24:44 crc kubenswrapper[4764]: I0309 13:24:44.142718 4764 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-d4gwh" podUID="1ccc5b44-95ad-4f4c-8086-c176c41bbd19" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.33:8080/healthz\": dial tcp 10.217.0.33:8080: connect: connection refused"
Mar 09 13:24:44 crc kubenswrapper[4764]: I0309 13:24:44.147294 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-qddjs" event={"ID":"24df0ad8-c9b4-46b6-8751-23e26fc391c5","Type":"ContainerStarted","Data":"c0a5e0c4e983568b143fbc0bc2bb99af436b4e6317ddf3b8ba9ae7a67dfad358"}
Mar 09 13:24:44 crc kubenswrapper[4764]: I0309 13:24:44.150395 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-mn5w6" podStartSLOduration=173.150382698 podStartE2EDuration="2m53.150382698s" podCreationTimestamp="2026-03-09 13:21:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:24:44.038487849 +0000 UTC m=+239.288659757" watchObservedRunningTime="2026-03-09 13:24:44.150382698 +0000 UTC m=+239.400554606"
Mar 09 13:24:44 crc kubenswrapper[4764]: I0309 13:24:44.151844 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-ptbpd" podStartSLOduration=173.151837497 podStartE2EDuration="2m53.151837497s" podCreationTimestamp="2026-03-09 13:21:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:24:44.146654637 +0000 UTC m=+239.396826555" watchObservedRunningTime="2026-03-09 13:24:44.151837497 +0000 UTC m=+239.402009405"
Mar 09 13:24:44 crc kubenswrapper[4764]: I0309 13:24:44.194316 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-xfvtf" event={"ID":"711447e6-e7cf-4577-8050-b5a391f96f6a","Type":"ContainerStarted","Data":"cb07cbdd389629d8ee18f07cc12b6c2c3505d455786f0a1b1e67b9ed8a5d4b78"}
Mar 09 13:24:44 crc kubenswrapper[4764]: I0309 13:24:44.220404 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-qddjs" podStartSLOduration=173.22038963 podStartE2EDuration="2m53.22038963s" podCreationTimestamp="2026-03-09 13:21:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:24:44.218686534 +0000 UTC m=+239.468858442" watchObservedRunningTime="2026-03-09 13:24:44.22038963 +0000 UTC m=+239.470561538"
Mar 09 13:24:44 crc kubenswrapper[4764]: I0309 13:24:44.220834 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-d4gwh" podStartSLOduration=173.220826212 podStartE2EDuration="2m53.220826212s" podCreationTimestamp="2026-03-09 13:21:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:24:44.178091893 +0000 UTC m=+239.428263811" watchObservedRunningTime="2026-03-09 13:24:44.220826212 +0000 UTC m=+239.470998110"
Mar 09 13:24:44 crc kubenswrapper[4764]: I0309 13:24:44.243215 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-w49hn\" (UID: \"d3652fe0-4889-432f-af3f-787dd19c60d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-w49hn"
Mar 09 13:24:44 crc kubenswrapper[4764]: E0309 13:24:44.244650 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 13:24:44.744631922 +0000 UTC m=+239.994803830 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-w49hn" (UID: "d3652fe0-4889-432f-af3f-787dd19c60d6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 09 13:24:44 crc kubenswrapper[4764]: I0309 13:24:44.260078 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-xfvtf" podStartSLOduration=173.260061117 podStartE2EDuration="2m53.260061117s" podCreationTimestamp="2026-03-09 13:21:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:24:44.257516828 +0000 UTC m=+239.507688736" watchObservedRunningTime="2026-03-09 13:24:44.260061117 +0000 UTC m=+239.510233025"
Mar 09 13:24:44 crc kubenswrapper[4764]: I0309 13:24:44.286457 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-x927s" event={"ID":"d245b116-e47d-4b15-a40d-0a9fa34cf1df","Type":"ContainerStarted","Data":"4ed619baf68f65746ed98427321c72cd4adc6131be0c5ab50bca4ff635aa848c"}
Mar 09 13:24:44 crc kubenswrapper[4764]: I0309 13:24:44.286528 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-x927s"
Mar 09 13:24:44 crc kubenswrapper[4764]: I0309 13:24:44.311403 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-tvsss" event={"ID":"d0959a00-2a83-457f-bcba-7d4af48b11c3","Type":"ContainerStarted","Data":"650632f786d1cc04ce84b188d86dd1f51067920d278adb844500f8bb3c442e34"}
Mar 09 13:24:44 crc kubenswrapper[4764]: I0309 13:24:44.320626 4764 patch_prober.go:28] interesting pod/downloads-7954f5f757-x927s container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.29:8080/\": dial tcp 10.217.0.29:8080: connect: connection refused" start-of-body=
Mar 09 13:24:44 crc kubenswrapper[4764]: I0309 13:24:44.320710 4764 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-x927s" podUID="d245b116-e47d-4b15-a40d-0a9fa34cf1df" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.29:8080/\": dial tcp 10.217.0.29:8080: connect: connection refused"
Mar 09 13:24:44 crc kubenswrapper[4764]: I0309 13:24:44.323872 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-x927s" podStartSLOduration=174.323853342 podStartE2EDuration="2m54.323853342s" podCreationTimestamp="2026-03-09 13:21:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:24:44.312010103 +0000 UTC m=+239.562182011" watchObservedRunningTime="2026-03-09 13:24:44.323853342 +0000 UTC m=+239.574025260"
Mar 09 13:24:44 crc kubenswrapper[4764]: I0309 13:24:44.344727 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 09 13:24:44 crc kubenswrapper[4764]: E0309 13:24:44.345441 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 13:24:44.845387111 +0000 UTC m=+240.095559019 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 09 13:24:44 crc kubenswrapper[4764]: I0309 13:24:44.347333 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-kt67m" event={"ID":"f4f31f9d-d2da-4e3d-b002-2f477d344fc0","Type":"ContainerStarted","Data":"4b48ee04fb4d9cd72ec995aff59d66de6aa64b9977090159021e71e6e46f1c0a"}
Mar 09 13:24:44 crc kubenswrapper[4764]: I0309 13:24:44.347374 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-kt67m" event={"ID":"f4f31f9d-d2da-4e3d-b002-2f477d344fc0","Type":"ContainerStarted","Data":"228541715f137f4256dfda489265aa94898f79ab8cc9bbc96f152ad8f9b36b00"}
Mar 09 13:24:44 crc kubenswrapper[4764]: I0309 13:24:44.370471 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-b2pz8" event={"ID":"db0ec273-54d8-4753-b519-243b727a9efd","Type":"ContainerStarted","Data":"297d9673d6c5a5a84640d3d03df6d12bc6a8bee1a2f228d1b78cc69911a31952"}
Mar 09 13:24:44 crc kubenswrapper[4764]: I0309 13:24:44.370526 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-b2pz8" event={"ID":"db0ec273-54d8-4753-b519-243b727a9efd","Type":"ContainerStarted","Data":"103dbe40f99ec961f84f53a9e285ee4190ff7195b2a42f0029837297da92234c"}
Mar 09 13:24:44 crc kubenswrapper[4764]: I0309 13:24:44.370542 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-b2pz8"
Mar 09 13:24:44 crc kubenswrapper[4764]: I0309 13:24:44.372120 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-tgqwl" podUID="b625331d-48ab-4d48-86fd-fe73466305ff" containerName="controller-manager" containerID="cri-o://878ba1fbacc5d195f2c3f5d1f7ed102a9da213e083eb5f15b669f308393abf9d" gracePeriod=30
Mar 09 13:24:44 crc kubenswrapper[4764]: I0309 13:24:44.373673 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mvq6r" podUID="ad49b4d8-1218-4a34-8455-831d0f563cbf" containerName="route-controller-manager" containerID="cri-o://87b7e8df716d6db2234cb0d9f4b5435733eebf3453983d174cfd8ed4a517ff0c" gracePeriod=30
Mar 09 13:24:44 crc kubenswrapper[4764]: I0309 13:24:44.386765 4764 patch_prober.go:28] interesting pod/openshift-config-operator-7777fb866f-fxv7j container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.217.0.8:8443/healthz\": dial tcp 10.217.0.8:8443: connect: connection refused" start-of-body=
Mar 09 13:24:44 crc kubenswrapper[4764]: I0309 13:24:44.386817 4764 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-7777fb866f-fxv7j" podUID="61b85db0-a292-42a8-8296-d0e476d80c89" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.217.0.8:8443/healthz\": dial tcp 10.217.0.8:8443: connect: connection refused"
Mar 09 13:24:44 crc kubenswrapper[4764]: I0309 13:24:44.396180 4764 patch_prober.go:28] interesting pod/openshift-config-operator-7777fb866f-fxv7j container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.217.0.8:8443/healthz\": dial tcp 10.217.0.8:8443: connect: connection refused" start-of-body=
Mar 09 13:24:44 crc kubenswrapper[4764]: I0309 13:24:44.396224 4764 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-7777fb866f-fxv7j" podUID="61b85db0-a292-42a8-8296-d0e476d80c89" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.217.0.8:8443/healthz\": dial tcp 10.217.0.8:8443: connect: connection refused"
Mar 09 13:24:44 crc kubenswrapper[4764]: I0309 13:24:44.396656 4764 patch_prober.go:28] interesting pod/openshift-config-operator-7777fb866f-fxv7j container/openshift-config-operator namespace/openshift-config-operator: Liveness probe status=failure output="Get \"https://10.217.0.8:8443/healthz\": dial tcp 10.217.0.8:8443: connect: connection refused" start-of-body=
Mar 09 13:24:44 crc kubenswrapper[4764]: I0309 13:24:44.396688 4764 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-config-operator/openshift-config-operator-7777fb866f-fxv7j" podUID="61b85db0-a292-42a8-8296-d0e476d80c89" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.217.0.8:8443/healthz\": dial tcp 10.217.0.8:8443: connect: connection refused"
Mar 09 13:24:44 crc kubenswrapper[4764]: I0309 13:24:44.410976 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-tvsss" podStartSLOduration=173.410959464 podStartE2EDuration="2m53.410959464s" podCreationTimestamp="2026-03-09 13:21:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:24:44.371817991 +0000 UTC m=+239.621989899" watchObservedRunningTime="2026-03-09 13:24:44.410959464 +0000 UTC m=+239.661131372"
Mar 09 13:24:44 crc kubenswrapper[4764]: I0309 13:24:44.411071 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-kt67m" podStartSLOduration=173.411066727 podStartE2EDuration="2m53.411066727s" podCreationTimestamp="2026-03-09 13:21:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:24:44.410540832 +0000 UTC m=+239.660712740" watchObservedRunningTime="2026-03-09 13:24:44.411066727 +0000 UTC m=+239.661238645"
Mar 09 13:24:44 crc kubenswrapper[4764]: I0309 13:24:44.447903 4764 ???:1] "http: TLS handshake error from 192.168.126.11:48390: no serving certificate available for the kubelet"
Mar 09 13:24:44 crc kubenswrapper[4764]: I0309 13:24:44.457215 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-w49hn\" (UID: \"d3652fe0-4889-432f-af3f-787dd19c60d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-w49hn"
Mar 09 13:24:44 crc kubenswrapper[4764]: E0309 13:24:44.458339 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 13:24:44.958325287 +0000 UTC m=+240.208497285 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-w49hn" (UID: "d3652fe0-4889-432f-af3f-787dd19c60d6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 09 13:24:44 crc kubenswrapper[4764]: I0309 13:24:44.493267 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-b2pz8" podStartSLOduration=173.493253516 podStartE2EDuration="2m53.493253516s" podCreationTimestamp="2026-03-09 13:21:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:24:44.486737101 +0000 UTC m=+239.736909009" watchObservedRunningTime="2026-03-09 13:24:44.493253516 +0000 UTC m=+239.743425424"
Mar 09 13:24:44 crc kubenswrapper[4764]: I0309 13:24:44.507293 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-nrc8s"]
Mar 09 13:24:44 crc kubenswrapper[4764]: I0309 13:24:44.514152 4764 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-marketplace/community-operators-nrc8s" Mar 09 13:24:44 crc kubenswrapper[4764]: I0309 13:24:44.517020 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Mar 09 13:24:44 crc kubenswrapper[4764]: I0309 13:24:44.517633 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-nrc8s"] Mar 09 13:24:44 crc kubenswrapper[4764]: I0309 13:24:44.536990 4764 ???:1] "http: TLS handshake error from 192.168.126.11:48400: no serving certificate available for the kubelet" Mar 09 13:24:44 crc kubenswrapper[4764]: I0309 13:24:44.563960 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 13:24:44 crc kubenswrapper[4764]: I0309 13:24:44.564107 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/be22cbfb-d3e7-43c1-be38-f6fcadeb2c97-utilities\") pod \"community-operators-nrc8s\" (UID: \"be22cbfb-d3e7-43c1-be38-f6fcadeb2c97\") " pod="openshift-marketplace/community-operators-nrc8s" Mar 09 13:24:44 crc kubenswrapper[4764]: I0309 13:24:44.564203 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ff292\" (UniqueName: \"kubernetes.io/projected/be22cbfb-d3e7-43c1-be38-f6fcadeb2c97-kube-api-access-ff292\") pod \"community-operators-nrc8s\" (UID: \"be22cbfb-d3e7-43c1-be38-f6fcadeb2c97\") " pod="openshift-marketplace/community-operators-nrc8s" Mar 09 13:24:44 crc kubenswrapper[4764]: I0309 13:24:44.564221 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/be22cbfb-d3e7-43c1-be38-f6fcadeb2c97-catalog-content\") pod \"community-operators-nrc8s\" (UID: \"be22cbfb-d3e7-43c1-be38-f6fcadeb2c97\") " pod="openshift-marketplace/community-operators-nrc8s" Mar 09 13:24:44 crc kubenswrapper[4764]: E0309 13:24:44.564309 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 13:24:45.064295046 +0000 UTC m=+240.314466954 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 13:24:44 crc kubenswrapper[4764]: I0309 13:24:44.622982 4764 ???:1] "http: TLS handshake error from 192.168.126.11:48414: no serving certificate available for the kubelet" Mar 09 13:24:44 crc kubenswrapper[4764]: I0309 13:24:44.665630 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/be22cbfb-d3e7-43c1-be38-f6fcadeb2c97-utilities\") pod \"community-operators-nrc8s\" (UID: \"be22cbfb-d3e7-43c1-be38-f6fcadeb2c97\") " pod="openshift-marketplace/community-operators-nrc8s" Mar 09 13:24:44 crc kubenswrapper[4764]: I0309 13:24:44.666024 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"image-registry-697d97f7c8-w49hn\" (UID: \"d3652fe0-4889-432f-af3f-787dd19c60d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-w49hn" Mar 09 13:24:44 crc kubenswrapper[4764]: I0309 13:24:44.666119 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ff292\" (UniqueName: \"kubernetes.io/projected/be22cbfb-d3e7-43c1-be38-f6fcadeb2c97-kube-api-access-ff292\") pod \"community-operators-nrc8s\" (UID: \"be22cbfb-d3e7-43c1-be38-f6fcadeb2c97\") " pod="openshift-marketplace/community-operators-nrc8s" Mar 09 13:24:44 crc kubenswrapper[4764]: I0309 13:24:44.666147 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/be22cbfb-d3e7-43c1-be38-f6fcadeb2c97-catalog-content\") pod \"community-operators-nrc8s\" (UID: \"be22cbfb-d3e7-43c1-be38-f6fcadeb2c97\") " pod="openshift-marketplace/community-operators-nrc8s" Mar 09 13:24:44 crc kubenswrapper[4764]: I0309 13:24:44.666942 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/be22cbfb-d3e7-43c1-be38-f6fcadeb2c97-catalog-content\") pod \"community-operators-nrc8s\" (UID: \"be22cbfb-d3e7-43c1-be38-f6fcadeb2c97\") " pod="openshift-marketplace/community-operators-nrc8s" Mar 09 13:24:44 crc kubenswrapper[4764]: I0309 13:24:44.667167 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/be22cbfb-d3e7-43c1-be38-f6fcadeb2c97-utilities\") pod \"community-operators-nrc8s\" (UID: \"be22cbfb-d3e7-43c1-be38-f6fcadeb2c97\") " pod="openshift-marketplace/community-operators-nrc8s" Mar 09 13:24:44 crc kubenswrapper[4764]: E0309 13:24:44.667451 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-03-09 13:24:45.16743924 +0000 UTC m=+240.417611148 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-w49hn" (UID: "d3652fe0-4889-432f-af3f-787dd19c60d6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 13:24:44 crc kubenswrapper[4764]: I0309 13:24:44.700551 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-8d627"] Mar 09 13:24:44 crc kubenswrapper[4764]: I0309 13:24:44.702127 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-8d627" Mar 09 13:24:44 crc kubenswrapper[4764]: I0309 13:24:44.705345 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ff292\" (UniqueName: \"kubernetes.io/projected/be22cbfb-d3e7-43c1-be38-f6fcadeb2c97-kube-api-access-ff292\") pod \"community-operators-nrc8s\" (UID: \"be22cbfb-d3e7-43c1-be38-f6fcadeb2c97\") " pod="openshift-marketplace/community-operators-nrc8s" Mar 09 13:24:44 crc kubenswrapper[4764]: I0309 13:24:44.709711 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Mar 09 13:24:44 crc kubenswrapper[4764]: I0309 13:24:44.712937 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-8d627"] Mar 09 13:24:44 crc kubenswrapper[4764]: I0309 13:24:44.725002 4764 ???:1] "http: TLS handshake error from 192.168.126.11:48430: no serving certificate available for the kubelet" Mar 09 13:24:44 crc kubenswrapper[4764]: I0309 13:24:44.769139 4764 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 13:24:44 crc kubenswrapper[4764]: I0309 13:24:44.769421 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/88ba6041-7f8f-48f0-840c-8ea2a9bdc87b-catalog-content\") pod \"certified-operators-8d627\" (UID: \"88ba6041-7f8f-48f0-840c-8ea2a9bdc87b\") " pod="openshift-marketplace/certified-operators-8d627" Mar 09 13:24:44 crc kubenswrapper[4764]: I0309 13:24:44.769530 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5fsvt\" (UniqueName: \"kubernetes.io/projected/88ba6041-7f8f-48f0-840c-8ea2a9bdc87b-kube-api-access-5fsvt\") pod \"certified-operators-8d627\" (UID: \"88ba6041-7f8f-48f0-840c-8ea2a9bdc87b\") " pod="openshift-marketplace/certified-operators-8d627" Mar 09 13:24:44 crc kubenswrapper[4764]: I0309 13:24:44.769560 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/88ba6041-7f8f-48f0-840c-8ea2a9bdc87b-utilities\") pod \"certified-operators-8d627\" (UID: \"88ba6041-7f8f-48f0-840c-8ea2a9bdc87b\") " pod="openshift-marketplace/certified-operators-8d627" Mar 09 13:24:44 crc kubenswrapper[4764]: E0309 13:24:44.769696 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 13:24:45.269661828 +0000 UTC m=+240.519833736 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 13:24:44 crc kubenswrapper[4764]: I0309 13:24:44.793906 4764 patch_prober.go:28] interesting pod/router-default-5444994796-gnnbl container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 09 13:24:44 crc kubenswrapper[4764]: [-]has-synced failed: reason withheld Mar 09 13:24:44 crc kubenswrapper[4764]: [+]process-running ok Mar 09 13:24:44 crc kubenswrapper[4764]: healthz check failed Mar 09 13:24:44 crc kubenswrapper[4764]: I0309 13:24:44.793956 4764 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-gnnbl" podUID="b9c0d96b-ed96-4925-b890-8743879a8b38" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 09 13:24:44 crc kubenswrapper[4764]: I0309 13:24:44.817958 4764 ???:1] "http: TLS handshake error from 192.168.126.11:48446: no serving certificate available for the kubelet" Mar 09 13:24:44 crc kubenswrapper[4764]: I0309 13:24:44.870508 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/88ba6041-7f8f-48f0-840c-8ea2a9bdc87b-catalog-content\") pod \"certified-operators-8d627\" (UID: \"88ba6041-7f8f-48f0-840c-8ea2a9bdc87b\") " pod="openshift-marketplace/certified-operators-8d627" Mar 09 13:24:44 crc kubenswrapper[4764]: I0309 13:24:44.870955 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/88ba6041-7f8f-48f0-840c-8ea2a9bdc87b-catalog-content\") pod \"certified-operators-8d627\" (UID: \"88ba6041-7f8f-48f0-840c-8ea2a9bdc87b\") " pod="openshift-marketplace/certified-operators-8d627" Mar 09 13:24:44 crc kubenswrapper[4764]: I0309 13:24:44.871057 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-w49hn\" (UID: \"d3652fe0-4889-432f-af3f-787dd19c60d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-w49hn" Mar 09 13:24:44 crc kubenswrapper[4764]: I0309 13:24:44.871509 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5fsvt\" (UniqueName: \"kubernetes.io/projected/88ba6041-7f8f-48f0-840c-8ea2a9bdc87b-kube-api-access-5fsvt\") pod \"certified-operators-8d627\" (UID: \"88ba6041-7f8f-48f0-840c-8ea2a9bdc87b\") " pod="openshift-marketplace/certified-operators-8d627" Mar 09 13:24:44 crc kubenswrapper[4764]: I0309 13:24:44.871540 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/88ba6041-7f8f-48f0-840c-8ea2a9bdc87b-utilities\") pod \"certified-operators-8d627\" (UID: \"88ba6041-7f8f-48f0-840c-8ea2a9bdc87b\") " pod="openshift-marketplace/certified-operators-8d627" Mar 09 13:24:44 crc kubenswrapper[4764]: E0309 13:24:44.871763 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 13:24:45.371718722 +0000 UTC m=+240.621890630 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-w49hn" (UID: "d3652fe0-4889-432f-af3f-787dd19c60d6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 13:24:44 crc kubenswrapper[4764]: I0309 13:24:44.872071 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/88ba6041-7f8f-48f0-840c-8ea2a9bdc87b-utilities\") pod \"certified-operators-8d627\" (UID: \"88ba6041-7f8f-48f0-840c-8ea2a9bdc87b\") " pod="openshift-marketplace/certified-operators-8d627" Mar 09 13:24:44 crc kubenswrapper[4764]: I0309 13:24:44.912586 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-mbm5b"] Mar 09 13:24:44 crc kubenswrapper[4764]: I0309 13:24:44.914247 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-mbm5b" Mar 09 13:24:44 crc kubenswrapper[4764]: I0309 13:24:44.917853 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-nrc8s" Mar 09 13:24:44 crc kubenswrapper[4764]: I0309 13:24:44.918199 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-mbm5b"] Mar 09 13:24:44 crc kubenswrapper[4764]: I0309 13:24:44.923103 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5fsvt\" (UniqueName: \"kubernetes.io/projected/88ba6041-7f8f-48f0-840c-8ea2a9bdc87b-kube-api-access-5fsvt\") pod \"certified-operators-8d627\" (UID: \"88ba6041-7f8f-48f0-840c-8ea2a9bdc87b\") " pod="openshift-marketplace/certified-operators-8d627" Mar 09 13:24:44 crc kubenswrapper[4764]: I0309 13:24:44.948177 4764 ???:1] "http: TLS handshake error from 192.168.126.11:48452: no serving certificate available for the kubelet" Mar 09 13:24:44 crc kubenswrapper[4764]: I0309 13:24:44.974084 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 13:24:44 crc kubenswrapper[4764]: I0309 13:24:44.974304 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/20acdcb5-ea78-435e-b472-e102d5553c75-catalog-content\") pod \"community-operators-mbm5b\" (UID: \"20acdcb5-ea78-435e-b472-e102d5553c75\") " pod="openshift-marketplace/community-operators-mbm5b" Mar 09 13:24:44 crc kubenswrapper[4764]: I0309 13:24:44.974363 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f4s9m\" (UniqueName: \"kubernetes.io/projected/20acdcb5-ea78-435e-b472-e102d5553c75-kube-api-access-f4s9m\") pod \"community-operators-mbm5b\" (UID: 
\"20acdcb5-ea78-435e-b472-e102d5553c75\") " pod="openshift-marketplace/community-operators-mbm5b" Mar 09 13:24:44 crc kubenswrapper[4764]: I0309 13:24:44.974387 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/20acdcb5-ea78-435e-b472-e102d5553c75-utilities\") pod \"community-operators-mbm5b\" (UID: \"20acdcb5-ea78-435e-b472-e102d5553c75\") " pod="openshift-marketplace/community-operators-mbm5b" Mar 09 13:24:44 crc kubenswrapper[4764]: E0309 13:24:44.974522 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 13:24:45.474507035 +0000 UTC m=+240.724678943 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 13:24:45 crc kubenswrapper[4764]: I0309 13:24:45.047033 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-8d627" Mar 09 13:24:45 crc kubenswrapper[4764]: I0309 13:24:45.052336 4764 ???:1] "http: TLS handshake error from 192.168.126.11:48460: no serving certificate available for the kubelet" Mar 09 13:24:45 crc kubenswrapper[4764]: I0309 13:24:45.080617 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-w49hn\" (UID: \"d3652fe0-4889-432f-af3f-787dd19c60d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-w49hn" Mar 09 13:24:45 crc kubenswrapper[4764]: I0309 13:24:45.080692 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/20acdcb5-ea78-435e-b472-e102d5553c75-catalog-content\") pod \"community-operators-mbm5b\" (UID: \"20acdcb5-ea78-435e-b472-e102d5553c75\") " pod="openshift-marketplace/community-operators-mbm5b" Mar 09 13:24:45 crc kubenswrapper[4764]: I0309 13:24:45.080724 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f4s9m\" (UniqueName: \"kubernetes.io/projected/20acdcb5-ea78-435e-b472-e102d5553c75-kube-api-access-f4s9m\") pod \"community-operators-mbm5b\" (UID: \"20acdcb5-ea78-435e-b472-e102d5553c75\") " pod="openshift-marketplace/community-operators-mbm5b" Mar 09 13:24:45 crc kubenswrapper[4764]: I0309 13:24:45.080741 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/20acdcb5-ea78-435e-b472-e102d5553c75-utilities\") pod \"community-operators-mbm5b\" (UID: \"20acdcb5-ea78-435e-b472-e102d5553c75\") " pod="openshift-marketplace/community-operators-mbm5b" Mar 09 13:24:45 crc kubenswrapper[4764]: I0309 13:24:45.081537 4764 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/20acdcb5-ea78-435e-b472-e102d5553c75-utilities\") pod \"community-operators-mbm5b\" (UID: \"20acdcb5-ea78-435e-b472-e102d5553c75\") " pod="openshift-marketplace/community-operators-mbm5b" Mar 09 13:24:45 crc kubenswrapper[4764]: I0309 13:24:45.081784 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/20acdcb5-ea78-435e-b472-e102d5553c75-catalog-content\") pod \"community-operators-mbm5b\" (UID: \"20acdcb5-ea78-435e-b472-e102d5553c75\") " pod="openshift-marketplace/community-operators-mbm5b" Mar 09 13:24:45 crc kubenswrapper[4764]: E0309 13:24:45.082012 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 13:24:45.581999126 +0000 UTC m=+240.832171034 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-w49hn" (UID: "d3652fe0-4889-432f-af3f-787dd19c60d6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 13:24:45 crc kubenswrapper[4764]: I0309 13:24:45.121787 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-jn8f5"] Mar 09 13:24:45 crc kubenswrapper[4764]: I0309 13:24:45.122919 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-jn8f5" Mar 09 13:24:45 crc kubenswrapper[4764]: I0309 13:24:45.127271 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-jn8f5"] Mar 09 13:24:45 crc kubenswrapper[4764]: I0309 13:24:45.150006 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f4s9m\" (UniqueName: \"kubernetes.io/projected/20acdcb5-ea78-435e-b472-e102d5553c75-kube-api-access-f4s9m\") pod \"community-operators-mbm5b\" (UID: \"20acdcb5-ea78-435e-b472-e102d5553c75\") " pod="openshift-marketplace/community-operators-mbm5b" Mar 09 13:24:45 crc kubenswrapper[4764]: I0309 13:24:45.161637 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-tgqwl" Mar 09 13:24:45 crc kubenswrapper[4764]: I0309 13:24:45.182032 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 13:24:45 crc kubenswrapper[4764]: E0309 13:24:45.182228 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 13:24:45.68220007 +0000 UTC m=+240.932371988 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 13:24:45 crc kubenswrapper[4764]: I0309 13:24:45.182653 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-w49hn\" (UID: \"d3652fe0-4889-432f-af3f-787dd19c60d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-w49hn" Mar 09 13:24:45 crc kubenswrapper[4764]: I0309 13:24:45.182704 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a76121be-d090-4f2a-9e57-1a160a4bb4f2-utilities\") pod \"certified-operators-jn8f5\" (UID: \"a76121be-d090-4f2a-9e57-1a160a4bb4f2\") " pod="openshift-marketplace/certified-operators-jn8f5" Mar 09 13:24:45 crc kubenswrapper[4764]: I0309 13:24:45.182759 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a76121be-d090-4f2a-9e57-1a160a4bb4f2-catalog-content\") pod \"certified-operators-jn8f5\" (UID: \"a76121be-d090-4f2a-9e57-1a160a4bb4f2\") " pod="openshift-marketplace/certified-operators-jn8f5" Mar 09 13:24:45 crc kubenswrapper[4764]: I0309 13:24:45.182777 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dc5m2\" (UniqueName: 
\"kubernetes.io/projected/a76121be-d090-4f2a-9e57-1a160a4bb4f2-kube-api-access-dc5m2\") pod \"certified-operators-jn8f5\" (UID: \"a76121be-d090-4f2a-9e57-1a160a4bb4f2\") " pod="openshift-marketplace/certified-operators-jn8f5" Mar 09 13:24:45 crc kubenswrapper[4764]: E0309 13:24:45.183050 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 13:24:45.683038072 +0000 UTC m=+240.933209980 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-w49hn" (UID: "d3652fe0-4889-432f-af3f-787dd19c60d6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 13:24:45 crc kubenswrapper[4764]: I0309 13:24:45.203613 4764 ???:1] "http: TLS handshake error from 192.168.126.11:48476: no serving certificate available for the kubelet" Mar 09 13:24:45 crc kubenswrapper[4764]: I0309 13:24:45.213823 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-5d896c677f-8g9dz"] Mar 09 13:24:45 crc kubenswrapper[4764]: E0309 13:24:45.214023 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b625331d-48ab-4d48-86fd-fe73466305ff" containerName="controller-manager" Mar 09 13:24:45 crc kubenswrapper[4764]: I0309 13:24:45.214033 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="b625331d-48ab-4d48-86fd-fe73466305ff" containerName="controller-manager" Mar 09 13:24:45 crc kubenswrapper[4764]: I0309 13:24:45.214117 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="b625331d-48ab-4d48-86fd-fe73466305ff" 
containerName="controller-manager" Mar 09 13:24:45 crc kubenswrapper[4764]: I0309 13:24:45.214456 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5d896c677f-8g9dz" Mar 09 13:24:45 crc kubenswrapper[4764]: I0309 13:24:45.241505 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-5d896c677f-8g9dz"] Mar 09 13:24:45 crc kubenswrapper[4764]: I0309 13:24:45.273161 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-mbm5b" Mar 09 13:24:45 crc kubenswrapper[4764]: I0309 13:24:45.284044 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b625331d-48ab-4d48-86fd-fe73466305ff-client-ca\") pod \"b625331d-48ab-4d48-86fd-fe73466305ff\" (UID: \"b625331d-48ab-4d48-86fd-fe73466305ff\") " Mar 09 13:24:45 crc kubenswrapper[4764]: I0309 13:24:45.284129 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b625331d-48ab-4d48-86fd-fe73466305ff-config\") pod \"b625331d-48ab-4d48-86fd-fe73466305ff\" (UID: \"b625331d-48ab-4d48-86fd-fe73466305ff\") " Mar 09 13:24:45 crc kubenswrapper[4764]: I0309 13:24:45.284148 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b625331d-48ab-4d48-86fd-fe73466305ff-serving-cert\") pod \"b625331d-48ab-4d48-86fd-fe73466305ff\" (UID: \"b625331d-48ab-4d48-86fd-fe73466305ff\") " Mar 09 13:24:45 crc kubenswrapper[4764]: I0309 13:24:45.284181 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/b625331d-48ab-4d48-86fd-fe73466305ff-proxy-ca-bundles\") pod \"b625331d-48ab-4d48-86fd-fe73466305ff\" (UID: 
\"b625331d-48ab-4d48-86fd-fe73466305ff\") "
Mar 09 13:24:45 crc kubenswrapper[4764]: I0309 13:24:45.284212 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p79zf\" (UniqueName: \"kubernetes.io/projected/b625331d-48ab-4d48-86fd-fe73466305ff-kube-api-access-p79zf\") pod \"b625331d-48ab-4d48-86fd-fe73466305ff\" (UID: \"b625331d-48ab-4d48-86fd-fe73466305ff\") "
Mar 09 13:24:45 crc kubenswrapper[4764]: I0309 13:24:45.284351 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 09 13:24:45 crc kubenswrapper[4764]: I0309 13:24:45.284503 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jp2v7\" (UniqueName: \"kubernetes.io/projected/ef84d4f2-b722-415f-bc23-d472e00474b4-kube-api-access-jp2v7\") pod \"controller-manager-5d896c677f-8g9dz\" (UID: \"ef84d4f2-b722-415f-bc23-d472e00474b4\") " pod="openshift-controller-manager/controller-manager-5d896c677f-8g9dz"
Mar 09 13:24:45 crc kubenswrapper[4764]: I0309 13:24:45.284539 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a76121be-d090-4f2a-9e57-1a160a4bb4f2-catalog-content\") pod \"certified-operators-jn8f5\" (UID: \"a76121be-d090-4f2a-9e57-1a160a4bb4f2\") " pod="openshift-marketplace/certified-operators-jn8f5"
Mar 09 13:24:45 crc kubenswrapper[4764]: I0309 13:24:45.284560 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dc5m2\" (UniqueName: \"kubernetes.io/projected/a76121be-d090-4f2a-9e57-1a160a4bb4f2-kube-api-access-dc5m2\") pod \"certified-operators-jn8f5\" (UID:
\"a76121be-d090-4f2a-9e57-1a160a4bb4f2\") " pod="openshift-marketplace/certified-operators-jn8f5"
Mar 09 13:24:45 crc kubenswrapper[4764]: I0309 13:24:45.284585 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ef84d4f2-b722-415f-bc23-d472e00474b4-config\") pod \"controller-manager-5d896c677f-8g9dz\" (UID: \"ef84d4f2-b722-415f-bc23-d472e00474b4\") " pod="openshift-controller-manager/controller-manager-5d896c677f-8g9dz"
Mar 09 13:24:45 crc kubenswrapper[4764]: I0309 13:24:45.284616 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ef84d4f2-b722-415f-bc23-d472e00474b4-proxy-ca-bundles\") pod \"controller-manager-5d896c677f-8g9dz\" (UID: \"ef84d4f2-b722-415f-bc23-d472e00474b4\") " pod="openshift-controller-manager/controller-manager-5d896c677f-8g9dz"
Mar 09 13:24:45 crc kubenswrapper[4764]: I0309 13:24:45.284682 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ef84d4f2-b722-415f-bc23-d472e00474b4-client-ca\") pod \"controller-manager-5d896c677f-8g9dz\" (UID: \"ef84d4f2-b722-415f-bc23-d472e00474b4\") " pod="openshift-controller-manager/controller-manager-5d896c677f-8g9dz"
Mar 09 13:24:45 crc kubenswrapper[4764]: I0309 13:24:45.284701 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a76121be-d090-4f2a-9e57-1a160a4bb4f2-utilities\") pod \"certified-operators-jn8f5\" (UID: \"a76121be-d090-4f2a-9e57-1a160a4bb4f2\") " pod="openshift-marketplace/certified-operators-jn8f5"
Mar 09 13:24:45 crc kubenswrapper[4764]: I0309 13:24:45.284719 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName:
\"kubernetes.io/secret/ef84d4f2-b722-415f-bc23-d472e00474b4-serving-cert\") pod \"controller-manager-5d896c677f-8g9dz\" (UID: \"ef84d4f2-b722-415f-bc23-d472e00474b4\") " pod="openshift-controller-manager/controller-manager-5d896c677f-8g9dz"
Mar 09 13:24:45 crc kubenswrapper[4764]: I0309 13:24:45.285471 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b625331d-48ab-4d48-86fd-fe73466305ff-client-ca" (OuterVolumeSpecName: "client-ca") pod "b625331d-48ab-4d48-86fd-fe73466305ff" (UID: "b625331d-48ab-4d48-86fd-fe73466305ff"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 13:24:45 crc kubenswrapper[4764]: I0309 13:24:45.286031 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a76121be-d090-4f2a-9e57-1a160a4bb4f2-catalog-content\") pod \"certified-operators-jn8f5\" (UID: \"a76121be-d090-4f2a-9e57-1a160a4bb4f2\") " pod="openshift-marketplace/certified-operators-jn8f5"
Mar 09 13:24:45 crc kubenswrapper[4764]: I0309 13:24:45.290954 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b625331d-48ab-4d48-86fd-fe73466305ff-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "b625331d-48ab-4d48-86fd-fe73466305ff" (UID: "b625331d-48ab-4d48-86fd-fe73466305ff"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 13:24:45 crc kubenswrapper[4764]: E0309 13:24:45.291071 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 13:24:45.791047216 +0000 UTC m=+241.041219744 (durationBeforeRetry 500ms).
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 09 13:24:45 crc kubenswrapper[4764]: I0309 13:24:45.291166 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a76121be-d090-4f2a-9e57-1a160a4bb4f2-utilities\") pod \"certified-operators-jn8f5\" (UID: \"a76121be-d090-4f2a-9e57-1a160a4bb4f2\") " pod="openshift-marketplace/certified-operators-jn8f5"
Mar 09 13:24:45 crc kubenswrapper[4764]: I0309 13:24:45.306486 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b625331d-48ab-4d48-86fd-fe73466305ff-kube-api-access-p79zf" (OuterVolumeSpecName: "kube-api-access-p79zf") pod "b625331d-48ab-4d48-86fd-fe73466305ff" (UID: "b625331d-48ab-4d48-86fd-fe73466305ff"). InnerVolumeSpecName "kube-api-access-p79zf". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 13:24:45 crc kubenswrapper[4764]: I0309 13:24:45.310963 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b625331d-48ab-4d48-86fd-fe73466305ff-config" (OuterVolumeSpecName: "config") pod "b625331d-48ab-4d48-86fd-fe73466305ff" (UID: "b625331d-48ab-4d48-86fd-fe73466305ff"). InnerVolumeSpecName "config".
PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 13:24:45 crc kubenswrapper[4764]: I0309 13:24:45.312239 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b625331d-48ab-4d48-86fd-fe73466305ff-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "b625331d-48ab-4d48-86fd-fe73466305ff" (UID: "b625331d-48ab-4d48-86fd-fe73466305ff"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 13:24:45 crc kubenswrapper[4764]: I0309 13:24:45.327529 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dc5m2\" (UniqueName: \"kubernetes.io/projected/a76121be-d090-4f2a-9e57-1a160a4bb4f2-kube-api-access-dc5m2\") pod \"certified-operators-jn8f5\" (UID: \"a76121be-d090-4f2a-9e57-1a160a4bb4f2\") " pod="openshift-marketplace/certified-operators-jn8f5"
Mar 09 13:24:45 crc kubenswrapper[4764]: I0309 13:24:45.387376 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ef84d4f2-b722-415f-bc23-d472e00474b4-config\") pod \"controller-manager-5d896c677f-8g9dz\" (UID: \"ef84d4f2-b722-415f-bc23-d472e00474b4\") " pod="openshift-controller-manager/controller-manager-5d896c677f-8g9dz"
Mar 09 13:24:45 crc kubenswrapper[4764]: I0309 13:24:45.387425 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ef84d4f2-b722-415f-bc23-d472e00474b4-proxy-ca-bundles\") pod \"controller-manager-5d896c677f-8g9dz\" (UID: \"ef84d4f2-b722-415f-bc23-d472e00474b4\") " pod="openshift-controller-manager/controller-manager-5d896c677f-8g9dz"
Mar 09 13:24:45 crc kubenswrapper[4764]: I0309 13:24:45.387502 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName:
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-w49hn\" (UID: \"d3652fe0-4889-432f-af3f-787dd19c60d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-w49hn"
Mar 09 13:24:45 crc kubenswrapper[4764]: I0309 13:24:45.387538 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ef84d4f2-b722-415f-bc23-d472e00474b4-client-ca\") pod \"controller-manager-5d896c677f-8g9dz\" (UID: \"ef84d4f2-b722-415f-bc23-d472e00474b4\") " pod="openshift-controller-manager/controller-manager-5d896c677f-8g9dz"
Mar 09 13:24:45 crc kubenswrapper[4764]: I0309 13:24:45.387566 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ef84d4f2-b722-415f-bc23-d472e00474b4-serving-cert\") pod \"controller-manager-5d896c677f-8g9dz\" (UID: \"ef84d4f2-b722-415f-bc23-d472e00474b4\") " pod="openshift-controller-manager/controller-manager-5d896c677f-8g9dz"
Mar 09 13:24:45 crc kubenswrapper[4764]: I0309 13:24:45.387618 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jp2v7\" (UniqueName: \"kubernetes.io/projected/ef84d4f2-b722-415f-bc23-d472e00474b4-kube-api-access-jp2v7\") pod \"controller-manager-5d896c677f-8g9dz\" (UID: \"ef84d4f2-b722-415f-bc23-d472e00474b4\") " pod="openshift-controller-manager/controller-manager-5d896c677f-8g9dz"
Mar 09 13:24:45 crc kubenswrapper[4764]: I0309 13:24:45.387638 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mvq6r"
Mar 09 13:24:45 crc kubenswrapper[4764]: E0309 13:24:45.387948 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed.
No retries permitted until 2026-03-09 13:24:45.887932501 +0000 UTC m=+241.138104409 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-w49hn" (UID: "d3652fe0-4889-432f-af3f-787dd19c60d6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 09 13:24:45 crc kubenswrapper[4764]: I0309 13:24:45.389819 4764 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b625331d-48ab-4d48-86fd-fe73466305ff-client-ca\") on node \"crc\" DevicePath \"\""
Mar 09 13:24:45 crc kubenswrapper[4764]: I0309 13:24:45.389830 4764 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b625331d-48ab-4d48-86fd-fe73466305ff-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 09 13:24:45 crc kubenswrapper[4764]: I0309 13:24:45.389839 4764 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b625331d-48ab-4d48-86fd-fe73466305ff-config\") on node \"crc\" DevicePath \"\""
Mar 09 13:24:45 crc kubenswrapper[4764]: I0309 13:24:45.389847 4764 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/b625331d-48ab-4d48-86fd-fe73466305ff-proxy-ca-bundles\") on node \"crc\" DevicePath \"\""
Mar 09 13:24:45 crc kubenswrapper[4764]: I0309 13:24:45.389856 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p79zf\" (UniqueName: \"kubernetes.io/projected/b625331d-48ab-4d48-86fd-fe73466305ff-kube-api-access-p79zf\") on node \"crc\" DevicePath \"\""
Mar 09 13:24:45 crc kubenswrapper[4764]: I0309 13:24:45.388953 4764 operation_generator.go:637]
"MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ef84d4f2-b722-415f-bc23-d472e00474b4-client-ca\") pod \"controller-manager-5d896c677f-8g9dz\" (UID: \"ef84d4f2-b722-415f-bc23-d472e00474b4\") " pod="openshift-controller-manager/controller-manager-5d896c677f-8g9dz"
Mar 09 13:24:45 crc kubenswrapper[4764]: I0309 13:24:45.392045 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ef84d4f2-b722-415f-bc23-d472e00474b4-serving-cert\") pod \"controller-manager-5d896c677f-8g9dz\" (UID: \"ef84d4f2-b722-415f-bc23-d472e00474b4\") " pod="openshift-controller-manager/controller-manager-5d896c677f-8g9dz"
Mar 09 13:24:45 crc kubenswrapper[4764]: I0309 13:24:45.395232 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ef84d4f2-b722-415f-bc23-d472e00474b4-proxy-ca-bundles\") pod \"controller-manager-5d896c677f-8g9dz\" (UID: \"ef84d4f2-b722-415f-bc23-d472e00474b4\") " pod="openshift-controller-manager/controller-manager-5d896c677f-8g9dz"
Mar 09 13:24:45 crc kubenswrapper[4764]: I0309 13:24:45.398041 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ef84d4f2-b722-415f-bc23-d472e00474b4-config\") pod \"controller-manager-5d896c677f-8g9dz\" (UID: \"ef84d4f2-b722-415f-bc23-d472e00474b4\") " pod="openshift-controller-manager/controller-manager-5d896c677f-8g9dz"
Mar 09 13:24:45 crc kubenswrapper[4764]: I0309 13:24:45.420785 4764 generic.go:334] "Generic (PLEG): container finished" podID="ad49b4d8-1218-4a34-8455-831d0f563cbf" containerID="87b7e8df716d6db2234cb0d9f4b5435733eebf3453983d174cfd8ed4a517ff0c" exitCode=0
Mar 09 13:24:45 crc kubenswrapper[4764]: I0309 13:24:45.421014 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mvq6r"
event={"ID":"ad49b4d8-1218-4a34-8455-831d0f563cbf","Type":"ContainerDied","Data":"87b7e8df716d6db2234cb0d9f4b5435733eebf3453983d174cfd8ed4a517ff0c"}
Mar 09 13:24:45 crc kubenswrapper[4764]: I0309 13:24:45.421273 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mvq6r" event={"ID":"ad49b4d8-1218-4a34-8455-831d0f563cbf","Type":"ContainerDied","Data":"3ebce9784954b6d704d045dc1cbbc83b452607e4967b3e3637a00f82a3e69729"}
Mar 09 13:24:45 crc kubenswrapper[4764]: I0309 13:24:45.421299 4764 scope.go:117] "RemoveContainer" containerID="87b7e8df716d6db2234cb0d9f4b5435733eebf3453983d174cfd8ed4a517ff0c"
Mar 09 13:24:45 crc kubenswrapper[4764]: I0309 13:24:45.426409 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jp2v7\" (UniqueName: \"kubernetes.io/projected/ef84d4f2-b722-415f-bc23-d472e00474b4-kube-api-access-jp2v7\") pod \"controller-manager-5d896c677f-8g9dz\" (UID: \"ef84d4f2-b722-415f-bc23-d472e00474b4\") " pod="openshift-controller-manager/controller-manager-5d896c677f-8g9dz"
Mar 09 13:24:45 crc kubenswrapper[4764]: I0309 13:24:45.481489 4764 generic.go:334] "Generic (PLEG): container finished" podID="b625331d-48ab-4d48-86fd-fe73466305ff" containerID="878ba1fbacc5d195f2c3f5d1f7ed102a9da213e083eb5f15b669f308393abf9d" exitCode=0
Mar 09 13:24:45 crc kubenswrapper[4764]: I0309 13:24:45.481696 4764 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-tgqwl"
Mar 09 13:24:45 crc kubenswrapper[4764]: I0309 13:24:45.482294 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-tgqwl" event={"ID":"b625331d-48ab-4d48-86fd-fe73466305ff","Type":"ContainerDied","Data":"878ba1fbacc5d195f2c3f5d1f7ed102a9da213e083eb5f15b669f308393abf9d"}
Mar 09 13:24:45 crc kubenswrapper[4764]: I0309 13:24:45.482341 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-tgqwl" event={"ID":"b625331d-48ab-4d48-86fd-fe73466305ff","Type":"ContainerDied","Data":"22fd4fd8035fc57032bbb13916cd7b7736a6fafcf098f02294f9432a8954ab17"}
Mar 09 13:24:45 crc kubenswrapper[4764]: I0309 13:24:45.490295 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-mn5w6" event={"ID":"36228b47-79c7-484d-9753-6f36806aa344","Type":"ContainerStarted","Data":"692e2aeac6cc8771f4971ec37256a0c666f20ffafaa1026487679c570aa94d67"}
Mar 09 13:24:45 crc kubenswrapper[4764]: I0309 13:24:45.491588 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-48tpb\" (UniqueName: \"kubernetes.io/projected/ad49b4d8-1218-4a34-8455-831d0f563cbf-kube-api-access-48tpb\") pod \"ad49b4d8-1218-4a34-8455-831d0f563cbf\" (UID: \"ad49b4d8-1218-4a34-8455-831d0f563cbf\") "
Mar 09 13:24:45 crc kubenswrapper[4764]: I0309 13:24:45.491678 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ad49b4d8-1218-4a34-8455-831d0f563cbf-config\") pod \"ad49b4d8-1218-4a34-8455-831d0f563cbf\" (UID: \"ad49b4d8-1218-4a34-8455-831d0f563cbf\") "
Mar 09 13:24:45 crc kubenswrapper[4764]: I0309 13:24:45.491860 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 09 13:24:45 crc kubenswrapper[4764]: I0309 13:24:45.491916 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ad49b4d8-1218-4a34-8455-831d0f563cbf-client-ca\") pod \"ad49b4d8-1218-4a34-8455-831d0f563cbf\" (UID: \"ad49b4d8-1218-4a34-8455-831d0f563cbf\") "
Mar 09 13:24:45 crc kubenswrapper[4764]: I0309 13:24:45.491978 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ad49b4d8-1218-4a34-8455-831d0f563cbf-serving-cert\") pod \"ad49b4d8-1218-4a34-8455-831d0f563cbf\" (UID: \"ad49b4d8-1218-4a34-8455-831d0f563cbf\") "
Mar 09 13:24:45 crc kubenswrapper[4764]: E0309 13:24:45.492718 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 13:24:45.992691208 +0000 UTC m=+241.242863116 (durationBeforeRetry 500ms).
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 09 13:24:45 crc kubenswrapper[4764]: I0309 13:24:45.493580 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ad49b4d8-1218-4a34-8455-831d0f563cbf-config" (OuterVolumeSpecName: "config") pod "ad49b4d8-1218-4a34-8455-831d0f563cbf" (UID: "ad49b4d8-1218-4a34-8455-831d0f563cbf"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 13:24:45 crc kubenswrapper[4764]: I0309 13:24:45.494222 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ad49b4d8-1218-4a34-8455-831d0f563cbf-client-ca" (OuterVolumeSpecName: "client-ca") pod "ad49b4d8-1218-4a34-8455-831d0f563cbf" (UID: "ad49b4d8-1218-4a34-8455-831d0f563cbf"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 13:24:45 crc kubenswrapper[4764]: I0309 13:24:45.494406 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ad49b4d8-1218-4a34-8455-831d0f563cbf-kube-api-access-48tpb" (OuterVolumeSpecName: "kube-api-access-48tpb") pod "ad49b4d8-1218-4a34-8455-831d0f563cbf" (UID: "ad49b4d8-1218-4a34-8455-831d0f563cbf"). InnerVolumeSpecName "kube-api-access-48tpb". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 13:24:45 crc kubenswrapper[4764]: I0309 13:24:45.495945 4764 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-marketplace/certified-operators-jn8f5"
Mar 09 13:24:45 crc kubenswrapper[4764]: I0309 13:24:45.502616 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ad49b4d8-1218-4a34-8455-831d0f563cbf-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "ad49b4d8-1218-4a34-8455-831d0f563cbf" (UID: "ad49b4d8-1218-4a34-8455-831d0f563cbf"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 13:24:45 crc kubenswrapper[4764]: I0309 13:24:45.523381 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-b6nrf" event={"ID":"2c795327-bff9-453d-b1b2-d48a2ef2b48f","Type":"ContainerStarted","Data":"046d9711018c15dc8f5ae3ada6a1affa5fbeb6686e7e7b8ab7b0558e72873174"}
Mar 09 13:24:45 crc kubenswrapper[4764]: I0309 13:24:45.569847 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-tgqwl"]
Mar 09 13:24:45 crc kubenswrapper[4764]: I0309 13:24:45.582637 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-b6nrf" podStartSLOduration=174.582619335 podStartE2EDuration="2m54.582619335s" podCreationTimestamp="2026-03-09 13:21:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:24:45.570399757 +0000 UTC m=+240.820571665" watchObservedRunningTime="2026-03-09 13:24:45.582619335 +0000 UTC m=+240.832791243"
Mar 09 13:24:45 crc kubenswrapper[4764]: I0309 13:24:45.594720 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-w49hn\" (UID:
\"d3652fe0-4889-432f-af3f-787dd19c60d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-w49hn"
Mar 09 13:24:45 crc kubenswrapper[4764]: I0309 13:24:45.594819 4764 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ad49b4d8-1218-4a34-8455-831d0f563cbf-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 09 13:24:45 crc kubenswrapper[4764]: I0309 13:24:45.594842 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-48tpb\" (UniqueName: \"kubernetes.io/projected/ad49b4d8-1218-4a34-8455-831d0f563cbf-kube-api-access-48tpb\") on node \"crc\" DevicePath \"\""
Mar 09 13:24:45 crc kubenswrapper[4764]: I0309 13:24:45.594858 4764 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ad49b4d8-1218-4a34-8455-831d0f563cbf-config\") on node \"crc\" DevicePath \"\""
Mar 09 13:24:45 crc kubenswrapper[4764]: I0309 13:24:45.594869 4764 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ad49b4d8-1218-4a34-8455-831d0f563cbf-client-ca\") on node \"crc\" DevicePath \"\""
Mar 09 13:24:45 crc kubenswrapper[4764]: I0309 13:24:45.600624 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-tgqwl"]
Mar 09 13:24:45 crc kubenswrapper[4764]: I0309 13:24:45.604474 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5d896c677f-8g9dz"
Mar 09 13:24:45 crc kubenswrapper[4764]: E0309 13:24:45.616160 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 13:24:46.116121806 +0000 UTC m=+241.366293714 (durationBeforeRetry 500ms).
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-w49hn" (UID: "d3652fe0-4889-432f-af3f-787dd19c60d6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 09 13:24:45 crc kubenswrapper[4764]: I0309 13:24:45.642506 4764 scope.go:117] "RemoveContainer" containerID="87b7e8df716d6db2234cb0d9f4b5435733eebf3453983d174cfd8ed4a517ff0c"
Mar 09 13:24:45 crc kubenswrapper[4764]: E0309 13:24:45.660508 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"87b7e8df716d6db2234cb0d9f4b5435733eebf3453983d174cfd8ed4a517ff0c\": container with ID starting with 87b7e8df716d6db2234cb0d9f4b5435733eebf3453983d174cfd8ed4a517ff0c not found: ID does not exist" containerID="87b7e8df716d6db2234cb0d9f4b5435733eebf3453983d174cfd8ed4a517ff0c"
Mar 09 13:24:45 crc kubenswrapper[4764]: I0309 13:24:45.660562 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"87b7e8df716d6db2234cb0d9f4b5435733eebf3453983d174cfd8ed4a517ff0c"} err="failed to get container status \"87b7e8df716d6db2234cb0d9f4b5435733eebf3453983d174cfd8ed4a517ff0c\": rpc error: code = NotFound desc = could not find container \"87b7e8df716d6db2234cb0d9f4b5435733eebf3453983d174cfd8ed4a517ff0c\": container with ID starting with 87b7e8df716d6db2234cb0d9f4b5435733eebf3453983d174cfd8ed4a517ff0c not found: ID does not exist"
Mar 09 13:24:45 crc kubenswrapper[4764]: I0309 13:24:45.660588 4764 scope.go:117] "RemoveContainer" containerID="878ba1fbacc5d195f2c3f5d1f7ed102a9da213e083eb5f15b669f308393abf9d"
Mar 09 13:24:45 crc kubenswrapper[4764]: I0309 13:24:45.697238 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume
started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 09 13:24:45 crc kubenswrapper[4764]: E0309 13:24:45.698193 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 13:24:46.198176042 +0000 UTC m=+241.448347940 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 09 13:24:45 crc kubenswrapper[4764]: I0309 13:24:45.732053 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b625331d-48ab-4d48-86fd-fe73466305ff" path="/var/lib/kubelet/pods/b625331d-48ab-4d48-86fd-fe73466305ff/volumes"
Mar 09 13:24:45 crc kubenswrapper[4764]: I0309 13:24:45.732624 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-2bqx9" event={"ID":"e56f29e6-292c-48d2-a5d7-531cfa6689f5","Type":"ContainerStarted","Data":"cb937b57e16e88d19c62c3e17a738b76cd16d7fabca3520c97db154048373031"}
Mar 09 13:24:45 crc kubenswrapper[4764]: I0309 13:24:45.732716 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-hf98d"
event={"ID":"957cfae4-9b1a-4e7f-b526-60bc1ee1a1b4","Type":"ContainerStarted","Data":"9736db70b45f7e6f3f750ae091237076377a90d5b47d639816161eca9a1ab904"}
Mar 09 13:24:45 crc kubenswrapper[4764]: I0309 13:24:45.732737 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-nrc8s"]
Mar 09 13:24:45 crc kubenswrapper[4764]: I0309 13:24:45.775121 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-cd5zc" event={"ID":"39249ace-8681-491c-a547-869e72d297a7","Type":"ContainerStarted","Data":"36c78f51d09476173476e5ef9244865c7c61164271085db7b7de7ac5ba60c53f"}
Mar 09 13:24:45 crc kubenswrapper[4764]: I0309 13:24:45.795169 4764 patch_prober.go:28] interesting pod/router-default-5444994796-gnnbl container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 09 13:24:45 crc kubenswrapper[4764]: [-]has-synced failed: reason withheld
Mar 09 13:24:45 crc kubenswrapper[4764]: [+]process-running ok
Mar 09 13:24:45 crc kubenswrapper[4764]: healthz check failed
Mar 09 13:24:45 crc kubenswrapper[4764]: I0309 13:24:45.795230 4764 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-gnnbl" podUID="b9c0d96b-ed96-4925-b890-8743879a8b38" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 09 13:24:45 crc kubenswrapper[4764]: I0309 13:24:45.798894 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-w49hn\" (UID: \"d3652fe0-4889-432f-af3f-787dd19c60d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-w49hn"
Mar 09 13:24:45 crc kubenswrapper[4764]: E0309
13:24:45.799346 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 13:24:46.299331461 +0000 UTC m=+241.549503369 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-w49hn" (UID: "d3652fe0-4889-432f-af3f-787dd19c60d6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 09 13:24:45 crc kubenswrapper[4764]: I0309 13:24:45.828112 4764 scope.go:117] "RemoveContainer" containerID="878ba1fbacc5d195f2c3f5d1f7ed102a9da213e083eb5f15b669f308393abf9d"
Mar 09 13:24:45 crc kubenswrapper[4764]: I0309 13:24:45.829748 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-6nhpx" event={"ID":"0c464093-f30e-4c79-84cd-98a6d49b813c","Type":"ContainerStarted","Data":"e2bb9836472cc420ee3d690a74378dd966f0cdbcabd65d5906255c9b974e9a59"}
Mar 09 13:24:45 crc kubenswrapper[4764]: I0309 13:24:45.829773 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-6nhpx"
Mar 09 13:24:45 crc kubenswrapper[4764]: I0309 13:24:45.829783 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-6nhpx" event={"ID":"0c464093-f30e-4c79-84cd-98a6d49b813c","Type":"ContainerStarted","Data":"c03e462730d7f8fe62be91c1a0cefc3fe604ceae04ca465686d1c317eddc75b9"}
Mar 09 13:24:45 crc kubenswrapper[4764]: I0309 13:24:45.831196 4764 patch_prober.go:28] interesting pod/downloads-7954f5f757-x927s container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.29:8080/\": dial
tcp 10.217.0.29:8080: connect: connection refused" start-of-body= Mar 09 13:24:45 crc kubenswrapper[4764]: I0309 13:24:45.831233 4764 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-x927s" podUID="d245b116-e47d-4b15-a40d-0a9fa34cf1df" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.29:8080/\": dial tcp 10.217.0.29:8080: connect: connection refused" Mar 09 13:24:45 crc kubenswrapper[4764]: E0309 13:24:45.834577 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"878ba1fbacc5d195f2c3f5d1f7ed102a9da213e083eb5f15b669f308393abf9d\": container with ID starting with 878ba1fbacc5d195f2c3f5d1f7ed102a9da213e083eb5f15b669f308393abf9d not found: ID does not exist" containerID="878ba1fbacc5d195f2c3f5d1f7ed102a9da213e083eb5f15b669f308393abf9d" Mar 09 13:24:45 crc kubenswrapper[4764]: I0309 13:24:45.834609 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"878ba1fbacc5d195f2c3f5d1f7ed102a9da213e083eb5f15b669f308393abf9d"} err="failed to get container status \"878ba1fbacc5d195f2c3f5d1f7ed102a9da213e083eb5f15b669f308393abf9d\": rpc error: code = NotFound desc = could not find container \"878ba1fbacc5d195f2c3f5d1f7ed102a9da213e083eb5f15b669f308393abf9d\": container with ID starting with 878ba1fbacc5d195f2c3f5d1f7ed102a9da213e083eb5f15b669f308393abf9d not found: ID does not exist" Mar 09 13:24:45 crc kubenswrapper[4764]: I0309 13:24:45.834670 4764 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-d4gwh container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.33:8080/healthz\": dial tcp 10.217.0.33:8080: connect: connection refused" start-of-body= Mar 09 13:24:45 crc kubenswrapper[4764]: I0309 13:24:45.834691 4764 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openshift-marketplace/marketplace-operator-79b997595-d4gwh" podUID="1ccc5b44-95ad-4f4c-8086-c176c41bbd19" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.33:8080/healthz\": dial tcp 10.217.0.33:8080: connect: connection refused" Mar 09 13:24:45 crc kubenswrapper[4764]: I0309 13:24:45.854484 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-ptbpd" Mar 09 13:24:45 crc kubenswrapper[4764]: I0309 13:24:45.901043 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 13:24:45 crc kubenswrapper[4764]: E0309 13:24:45.902090 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 13:24:46.402072904 +0000 UTC m=+241.652244812 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 13:24:45 crc kubenswrapper[4764]: I0309 13:24:45.913449 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-cd5zc" podStartSLOduration=175.913426518 podStartE2EDuration="2m55.913426518s" podCreationTimestamp="2026-03-09 13:21:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:24:45.908435585 +0000 UTC m=+241.158607503" watchObservedRunningTime="2026-03-09 13:24:45.913426518 +0000 UTC m=+241.163598426" Mar 09 13:24:45 crc kubenswrapper[4764]: I0309 13:24:45.933111 4764 ???:1] "http: TLS handshake error from 192.168.126.11:48480: no serving certificate available for the kubelet" Mar 09 13:24:45 crc kubenswrapper[4764]: I0309 13:24:45.952632 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-6nhpx" podStartSLOduration=9.952613262 podStartE2EDuration="9.952613262s" podCreationTimestamp="2026-03-09 13:24:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:24:45.952380176 +0000 UTC m=+241.202552104" watchObservedRunningTime="2026-03-09 13:24:45.952613262 +0000 UTC m=+241.202785170" Mar 09 13:24:46 crc kubenswrapper[4764]: I0309 13:24:46.002904 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-w49hn\" (UID: \"d3652fe0-4889-432f-af3f-787dd19c60d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-w49hn" Mar 09 13:24:46 crc kubenswrapper[4764]: E0309 13:24:46.006701 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 13:24:46.506681966 +0000 UTC m=+241.756853954 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-w49hn" (UID: "d3652fe0-4889-432f-af3f-787dd19c60d6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 13:24:46 crc kubenswrapper[4764]: I0309 13:24:46.107187 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 13:24:46 crc kubenswrapper[4764]: E0309 13:24:46.107841 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 13:24:46.607824875 +0000 UTC m=+241.857996783 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 13:24:46 crc kubenswrapper[4764]: W0309 13:24:46.154107 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod88ba6041_7f8f_48f0_840c_8ea2a9bdc87b.slice/crio-afb89024f1733e90994f05a617baca8bd2578c08f57794c20e8e0031fa2f63e4 WatchSource:0}: Error finding container afb89024f1733e90994f05a617baca8bd2578c08f57794c20e8e0031fa2f63e4: Status 404 returned error can't find the container with id afb89024f1733e90994f05a617baca8bd2578c08f57794c20e8e0031fa2f63e4 Mar 09 13:24:46 crc kubenswrapper[4764]: I0309 13:24:46.155756 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-8d627"] Mar 09 13:24:46 crc kubenswrapper[4764]: I0309 13:24:46.158074 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-hf98d" podStartSLOduration=175.158061216 podStartE2EDuration="2m55.158061216s" podCreationTimestamp="2026-03-09 13:21:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:24:46.106403857 +0000 UTC m=+241.356575765" watchObservedRunningTime="2026-03-09 13:24:46.158061216 +0000 UTC m=+241.408233124" Mar 09 13:24:46 crc kubenswrapper[4764]: I0309 13:24:46.208897 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-w49hn\" (UID: \"d3652fe0-4889-432f-af3f-787dd19c60d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-w49hn" Mar 09 13:24:46 crc kubenswrapper[4764]: E0309 13:24:46.209266 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 13:24:46.709254943 +0000 UTC m=+241.959426851 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-w49hn" (UID: "d3652fe0-4889-432f-af3f-787dd19c60d6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 13:24:46 crc kubenswrapper[4764]: I0309 13:24:46.258722 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-gst9d" Mar 09 13:24:46 crc kubenswrapper[4764]: I0309 13:24:46.276870 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-gst9d" Mar 09 13:24:46 crc kubenswrapper[4764]: I0309 13:24:46.311173 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 13:24:46 crc kubenswrapper[4764]: E0309 13:24:46.311364 4764 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 13:24:46.811333917 +0000 UTC m=+242.061505825 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 13:24:46 crc kubenswrapper[4764]: I0309 13:24:46.311503 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-w49hn\" (UID: \"d3652fe0-4889-432f-af3f-787dd19c60d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-w49hn" Mar 09 13:24:46 crc kubenswrapper[4764]: E0309 13:24:46.311817 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 13:24:46.81180549 +0000 UTC m=+242.061977398 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-w49hn" (UID: "d3652fe0-4889-432f-af3f-787dd19c60d6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 13:24:46 crc kubenswrapper[4764]: I0309 13:24:46.412126 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 13:24:46 crc kubenswrapper[4764]: E0309 13:24:46.424924 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 13:24:46.924900491 +0000 UTC m=+242.175072399 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 13:24:46 crc kubenswrapper[4764]: I0309 13:24:46.529885 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-w49hn\" (UID: \"d3652fe0-4889-432f-af3f-787dd19c60d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-w49hn" Mar 09 13:24:46 crc kubenswrapper[4764]: E0309 13:24:46.530581 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 13:24:47.030566762 +0000 UTC m=+242.280738670 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-w49hn" (UID: "d3652fe0-4889-432f-af3f-787dd19c60d6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 13:24:46 crc kubenswrapper[4764]: I0309 13:24:46.535294 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-mbm5b"] Mar 09 13:24:46 crc kubenswrapper[4764]: I0309 13:24:46.548234 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-jn8f5"] Mar 09 13:24:46 crc kubenswrapper[4764]: I0309 13:24:46.632791 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 13:24:46 crc kubenswrapper[4764]: E0309 13:24:46.633247 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 13:24:47.133231172 +0000 UTC m=+242.383403080 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 13:24:46 crc kubenswrapper[4764]: I0309 13:24:46.657480 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-fxv7j" Mar 09 13:24:46 crc kubenswrapper[4764]: I0309 13:24:46.734345 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-w49hn\" (UID: \"d3652fe0-4889-432f-af3f-787dd19c60d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-w49hn" Mar 09 13:24:46 crc kubenswrapper[4764]: E0309 13:24:46.734871 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 13:24:47.234856914 +0000 UTC m=+242.485028822 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-w49hn" (UID: "d3652fe0-4889-432f-af3f-787dd19c60d6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 13:24:46 crc kubenswrapper[4764]: I0309 13:24:46.776450 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-qhs57"] Mar 09 13:24:46 crc kubenswrapper[4764]: E0309 13:24:46.782849 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ad49b4d8-1218-4a34-8455-831d0f563cbf" containerName="route-controller-manager" Mar 09 13:24:46 crc kubenswrapper[4764]: I0309 13:24:46.782940 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad49b4d8-1218-4a34-8455-831d0f563cbf" containerName="route-controller-manager" Mar 09 13:24:46 crc kubenswrapper[4764]: I0309 13:24:46.783105 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="ad49b4d8-1218-4a34-8455-831d0f563cbf" containerName="route-controller-manager" Mar 09 13:24:46 crc kubenswrapper[4764]: I0309 13:24:46.784070 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-qhs57" Mar 09 13:24:46 crc kubenswrapper[4764]: I0309 13:24:46.786401 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Mar 09 13:24:46 crc kubenswrapper[4764]: I0309 13:24:46.800857 4764 patch_prober.go:28] interesting pod/router-default-5444994796-gnnbl container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 09 13:24:46 crc kubenswrapper[4764]: [-]has-synced failed: reason withheld Mar 09 13:24:46 crc kubenswrapper[4764]: [+]process-running ok Mar 09 13:24:46 crc kubenswrapper[4764]: healthz check failed Mar 09 13:24:46 crc kubenswrapper[4764]: I0309 13:24:46.800912 4764 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-gnnbl" podUID="b9c0d96b-ed96-4925-b890-8743879a8b38" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 09 13:24:46 crc kubenswrapper[4764]: I0309 13:24:46.820484 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-5d896c677f-8g9dz"] Mar 09 13:24:46 crc kubenswrapper[4764]: I0309 13:24:46.830778 4764 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-m4bs6 container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.27:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 09 13:24:46 crc kubenswrapper[4764]: I0309 13:24:46.831114 4764 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-m4bs6" podUID="14b4aa8c-1066-4388-9442-07722e4c76c2" containerName="packageserver" probeResult="failure" output="Get 
\"https://10.217.0.27:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 09 13:24:46 crc kubenswrapper[4764]: I0309 13:24:46.833562 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-qhs57"] Mar 09 13:24:46 crc kubenswrapper[4764]: I0309 13:24:46.845802 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 13:24:46 crc kubenswrapper[4764]: E0309 13:24:46.846183 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 13:24:47.346169037 +0000 UTC m=+242.596340945 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 13:24:46 crc kubenswrapper[4764]: I0309 13:24:46.947367 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j4z7x\" (UniqueName: \"kubernetes.io/projected/691ffa6f-3ee6-47fa-bcef-9fdd74ac86df-kube-api-access-j4z7x\") pod \"redhat-marketplace-qhs57\" (UID: \"691ffa6f-3ee6-47fa-bcef-9fdd74ac86df\") " pod="openshift-marketplace/redhat-marketplace-qhs57" Mar 09 13:24:46 crc kubenswrapper[4764]: I0309 13:24:46.947439 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/691ffa6f-3ee6-47fa-bcef-9fdd74ac86df-catalog-content\") pod \"redhat-marketplace-qhs57\" (UID: \"691ffa6f-3ee6-47fa-bcef-9fdd74ac86df\") " pod="openshift-marketplace/redhat-marketplace-qhs57" Mar 09 13:24:46 crc kubenswrapper[4764]: I0309 13:24:46.947461 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-w49hn\" (UID: \"d3652fe0-4889-432f-af3f-787dd19c60d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-w49hn" Mar 09 13:24:46 crc kubenswrapper[4764]: I0309 13:24:46.947498 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/691ffa6f-3ee6-47fa-bcef-9fdd74ac86df-utilities\") pod \"redhat-marketplace-qhs57\" (UID: \"691ffa6f-3ee6-47fa-bcef-9fdd74ac86df\") " pod="openshift-marketplace/redhat-marketplace-qhs57" Mar 09 13:24:46 crc kubenswrapper[4764]: E0309 13:24:46.947834 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 13:24:47.44781972 +0000 UTC m=+242.697991628 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-w49hn" (UID: "d3652fe0-4889-432f-af3f-787dd19c60d6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 13:24:46 crc kubenswrapper[4764]: I0309 13:24:46.979569 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mbm5b" event={"ID":"20acdcb5-ea78-435e-b472-e102d5553c75","Type":"ContainerStarted","Data":"36b8a908fc96eec5fd19468146038ec7f847f96484b3a606a41defe1a23a894e"} Mar 09 13:24:46 crc kubenswrapper[4764]: I0309 13:24:46.985799 4764 generic.go:334] "Generic (PLEG): container finished" podID="be22cbfb-d3e7-43c1-be38-f6fcadeb2c97" containerID="fbe32af2436a64ccf4bd2766ca3822d5d50061459cac93fe26813bb870fd850a" exitCode=0 Mar 09 13:24:46 crc kubenswrapper[4764]: I0309 13:24:46.986935 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nrc8s" event={"ID":"be22cbfb-d3e7-43c1-be38-f6fcadeb2c97","Type":"ContainerDied","Data":"fbe32af2436a64ccf4bd2766ca3822d5d50061459cac93fe26813bb870fd850a"} Mar 09 13:24:46 crc kubenswrapper[4764]: I0309 13:24:46.987086 4764 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nrc8s" event={"ID":"be22cbfb-d3e7-43c1-be38-f6fcadeb2c97","Type":"ContainerStarted","Data":"32332cee515b03550931490beaabd836e1f122b91e9186c7afe19395bde21caa"} Mar 09 13:24:47 crc kubenswrapper[4764]: I0309 13:24:47.016678 4764 generic.go:334] "Generic (PLEG): container finished" podID="88ba6041-7f8f-48f0-840c-8ea2a9bdc87b" containerID="1ac1423999a9eaea944b592473f076b1d7f0bf8296be44b2cb8bd7cc41e23c1f" exitCode=0 Mar 09 13:24:47 crc kubenswrapper[4764]: I0309 13:24:47.016787 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8d627" event={"ID":"88ba6041-7f8f-48f0-840c-8ea2a9bdc87b","Type":"ContainerDied","Data":"1ac1423999a9eaea944b592473f076b1d7f0bf8296be44b2cb8bd7cc41e23c1f"} Mar 09 13:24:47 crc kubenswrapper[4764]: I0309 13:24:47.016820 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8d627" event={"ID":"88ba6041-7f8f-48f0-840c-8ea2a9bdc87b","Type":"ContainerStarted","Data":"afb89024f1733e90994f05a617baca8bd2578c08f57794c20e8e0031fa2f63e4"} Mar 09 13:24:47 crc kubenswrapper[4764]: I0309 13:24:47.054730 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jn8f5" event={"ID":"a76121be-d090-4f2a-9e57-1a160a4bb4f2","Type":"ContainerStarted","Data":"a07c170a29ea8bcf9be266201f1dd0580d7bdb690c3b989b62809138bb677d6e"} Mar 09 13:24:47 crc kubenswrapper[4764]: I0309 13:24:47.055891 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 13:24:47 crc kubenswrapper[4764]: I0309 13:24:47.055982 4764 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-j4z7x\" (UniqueName: \"kubernetes.io/projected/691ffa6f-3ee6-47fa-bcef-9fdd74ac86df-kube-api-access-j4z7x\") pod \"redhat-marketplace-qhs57\" (UID: \"691ffa6f-3ee6-47fa-bcef-9fdd74ac86df\") " pod="openshift-marketplace/redhat-marketplace-qhs57" Mar 09 13:24:47 crc kubenswrapper[4764]: I0309 13:24:47.056034 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/691ffa6f-3ee6-47fa-bcef-9fdd74ac86df-catalog-content\") pod \"redhat-marketplace-qhs57\" (UID: \"691ffa6f-3ee6-47fa-bcef-9fdd74ac86df\") " pod="openshift-marketplace/redhat-marketplace-qhs57" Mar 09 13:24:47 crc kubenswrapper[4764]: I0309 13:24:47.056078 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/691ffa6f-3ee6-47fa-bcef-9fdd74ac86df-utilities\") pod \"redhat-marketplace-qhs57\" (UID: \"691ffa6f-3ee6-47fa-bcef-9fdd74ac86df\") " pod="openshift-marketplace/redhat-marketplace-qhs57" Mar 09 13:24:47 crc kubenswrapper[4764]: I0309 13:24:47.057542 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/691ffa6f-3ee6-47fa-bcef-9fdd74ac86df-catalog-content\") pod \"redhat-marketplace-qhs57\" (UID: \"691ffa6f-3ee6-47fa-bcef-9fdd74ac86df\") " pod="openshift-marketplace/redhat-marketplace-qhs57" Mar 09 13:24:47 crc kubenswrapper[4764]: I0309 13:24:47.057566 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/691ffa6f-3ee6-47fa-bcef-9fdd74ac86df-utilities\") pod \"redhat-marketplace-qhs57\" (UID: \"691ffa6f-3ee6-47fa-bcef-9fdd74ac86df\") " pod="openshift-marketplace/redhat-marketplace-qhs57" Mar 09 13:24:47 crc kubenswrapper[4764]: E0309 13:24:47.057802 4764 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 13:24:47.557781857 +0000 UTC m=+242.807953765 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 13:24:47 crc kubenswrapper[4764]: I0309 13:24:47.078979 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mvq6r" Mar 09 13:24:47 crc kubenswrapper[4764]: I0309 13:24:47.114973 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-g7k9k"] Mar 09 13:24:47 crc kubenswrapper[4764]: I0309 13:24:47.120253 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-g7k9k" Mar 09 13:24:47 crc kubenswrapper[4764]: I0309 13:24:47.131914 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j4z7x\" (UniqueName: \"kubernetes.io/projected/691ffa6f-3ee6-47fa-bcef-9fdd74ac86df-kube-api-access-j4z7x\") pod \"redhat-marketplace-qhs57\" (UID: \"691ffa6f-3ee6-47fa-bcef-9fdd74ac86df\") " pod="openshift-marketplace/redhat-marketplace-qhs57" Mar 09 13:24:47 crc kubenswrapper[4764]: I0309 13:24:47.148172 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-g7k9k"] Mar 09 13:24:47 crc kubenswrapper[4764]: I0309 13:24:47.156908 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-qhs57" Mar 09 13:24:47 crc kubenswrapper[4764]: I0309 13:24:47.164984 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-w49hn\" (UID: \"d3652fe0-4889-432f-af3f-787dd19c60d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-w49hn" Mar 09 13:24:47 crc kubenswrapper[4764]: E0309 13:24:47.165296 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 13:24:47.665281727 +0000 UTC m=+242.915453635 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-w49hn" (UID: "d3652fe0-4889-432f-af3f-787dd19c60d6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 13:24:47 crc kubenswrapper[4764]: I0309 13:24:47.209968 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-mvq6r"] Mar 09 13:24:47 crc kubenswrapper[4764]: I0309 13:24:47.220810 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-mvq6r"] Mar 09 13:24:47 crc kubenswrapper[4764]: I0309 13:24:47.252939 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-m4bs6" Mar 09 13:24:47 crc kubenswrapper[4764]: I0309 13:24:47.270997 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 13:24:47 crc kubenswrapper[4764]: I0309 13:24:47.271276 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7a967c79-e11e-4c58-b42e-652d1406ac88-catalog-content\") pod \"redhat-marketplace-g7k9k\" (UID: \"7a967c79-e11e-4c58-b42e-652d1406ac88\") " pod="openshift-marketplace/redhat-marketplace-g7k9k" Mar 09 13:24:47 crc kubenswrapper[4764]: I0309 13:24:47.271407 4764 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7a967c79-e11e-4c58-b42e-652d1406ac88-utilities\") pod \"redhat-marketplace-g7k9k\" (UID: \"7a967c79-e11e-4c58-b42e-652d1406ac88\") " pod="openshift-marketplace/redhat-marketplace-g7k9k" Mar 09 13:24:47 crc kubenswrapper[4764]: I0309 13:24:47.271582 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gx2jl\" (UniqueName: \"kubernetes.io/projected/7a967c79-e11e-4c58-b42e-652d1406ac88-kube-api-access-gx2jl\") pod \"redhat-marketplace-g7k9k\" (UID: \"7a967c79-e11e-4c58-b42e-652d1406ac88\") " pod="openshift-marketplace/redhat-marketplace-g7k9k" Mar 09 13:24:47 crc kubenswrapper[4764]: E0309 13:24:47.272255 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 13:24:47.772230612 +0000 UTC m=+243.022402560 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 13:24:47 crc kubenswrapper[4764]: I0309 13:24:47.324309 4764 ???:1] "http: TLS handshake error from 192.168.126.11:48496: no serving certificate available for the kubelet" Mar 09 13:24:47 crc kubenswrapper[4764]: I0309 13:24:47.372655 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7a967c79-e11e-4c58-b42e-652d1406ac88-utilities\") pod \"redhat-marketplace-g7k9k\" (UID: \"7a967c79-e11e-4c58-b42e-652d1406ac88\") " pod="openshift-marketplace/redhat-marketplace-g7k9k" Mar 09 13:24:47 crc kubenswrapper[4764]: I0309 13:24:47.372830 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-w49hn\" (UID: \"d3652fe0-4889-432f-af3f-787dd19c60d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-w49hn" Mar 09 13:24:47 crc kubenswrapper[4764]: I0309 13:24:47.372938 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gx2jl\" (UniqueName: \"kubernetes.io/projected/7a967c79-e11e-4c58-b42e-652d1406ac88-kube-api-access-gx2jl\") pod \"redhat-marketplace-g7k9k\" (UID: \"7a967c79-e11e-4c58-b42e-652d1406ac88\") " pod="openshift-marketplace/redhat-marketplace-g7k9k" Mar 09 13:24:47 crc kubenswrapper[4764]: I0309 13:24:47.373037 4764 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7a967c79-e11e-4c58-b42e-652d1406ac88-catalog-content\") pod \"redhat-marketplace-g7k9k\" (UID: \"7a967c79-e11e-4c58-b42e-652d1406ac88\") " pod="openshift-marketplace/redhat-marketplace-g7k9k" Mar 09 13:24:47 crc kubenswrapper[4764]: E0309 13:24:47.373152 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 13:24:47.873139086 +0000 UTC m=+243.123310994 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-w49hn" (UID: "d3652fe0-4889-432f-af3f-787dd19c60d6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 13:24:47 crc kubenswrapper[4764]: I0309 13:24:47.374337 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7a967c79-e11e-4c58-b42e-652d1406ac88-utilities\") pod \"redhat-marketplace-g7k9k\" (UID: \"7a967c79-e11e-4c58-b42e-652d1406ac88\") " pod="openshift-marketplace/redhat-marketplace-g7k9k" Mar 09 13:24:47 crc kubenswrapper[4764]: I0309 13:24:47.374372 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7a967c79-e11e-4c58-b42e-652d1406ac88-catalog-content\") pod \"redhat-marketplace-g7k9k\" (UID: \"7a967c79-e11e-4c58-b42e-652d1406ac88\") " pod="openshift-marketplace/redhat-marketplace-g7k9k" Mar 09 13:24:47 crc kubenswrapper[4764]: I0309 13:24:47.417559 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-gx2jl\" (UniqueName: \"kubernetes.io/projected/7a967c79-e11e-4c58-b42e-652d1406ac88-kube-api-access-gx2jl\") pod \"redhat-marketplace-g7k9k\" (UID: \"7a967c79-e11e-4c58-b42e-652d1406ac88\") " pod="openshift-marketplace/redhat-marketplace-g7k9k" Mar 09 13:24:47 crc kubenswrapper[4764]: I0309 13:24:47.478956 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 13:24:47 crc kubenswrapper[4764]: E0309 13:24:47.479166 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 13:24:47.979131965 +0000 UTC m=+243.229303883 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 13:24:47 crc kubenswrapper[4764]: I0309 13:24:47.479266 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-w49hn\" (UID: \"d3652fe0-4889-432f-af3f-787dd19c60d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-w49hn" Mar 09 13:24:47 crc kubenswrapper[4764]: E0309 13:24:47.479607 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 13:24:47.979594698 +0000 UTC m=+243.229766596 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-w49hn" (UID: "d3652fe0-4889-432f-af3f-787dd19c60d6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 13:24:47 crc kubenswrapper[4764]: I0309 13:24:47.579901 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ad49b4d8-1218-4a34-8455-831d0f563cbf" path="/var/lib/kubelet/pods/ad49b4d8-1218-4a34-8455-831d0f563cbf/volumes" Mar 09 13:24:47 crc kubenswrapper[4764]: I0309 13:24:47.581078 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 13:24:47 crc kubenswrapper[4764]: E0309 13:24:47.581499 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 13:24:48.081477467 +0000 UTC m=+243.331649375 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 13:24:47 crc kubenswrapper[4764]: I0309 13:24:47.581581 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-w49hn\" (UID: \"d3652fe0-4889-432f-af3f-787dd19c60d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-w49hn" Mar 09 13:24:47 crc kubenswrapper[4764]: E0309 13:24:47.582015 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 13:24:48.082006941 +0000 UTC m=+243.332178849 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-w49hn" (UID: "d3652fe0-4889-432f-af3f-787dd19c60d6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 13:24:47 crc kubenswrapper[4764]: I0309 13:24:47.602794 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-g7k9k" Mar 09 13:24:47 crc kubenswrapper[4764]: I0309 13:24:47.683063 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 13:24:47 crc kubenswrapper[4764]: E0309 13:24:47.683907 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 13:24:48.183841319 +0000 UTC m=+243.434013227 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 13:24:47 crc kubenswrapper[4764]: I0309 13:24:47.757980 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-qhs57"] Mar 09 13:24:47 crc kubenswrapper[4764]: I0309 13:24:47.784763 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-w49hn\" (UID: \"d3652fe0-4889-432f-af3f-787dd19c60d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-w49hn" Mar 09 13:24:47 crc kubenswrapper[4764]: E0309 
13:24:47.785222 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 13:24:48.285209095 +0000 UTC m=+243.535381003 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-w49hn" (UID: "d3652fe0-4889-432f-af3f-787dd19c60d6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 13:24:47 crc kubenswrapper[4764]: I0309 13:24:47.791399 4764 patch_prober.go:28] interesting pod/router-default-5444994796-gnnbl container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 09 13:24:47 crc kubenswrapper[4764]: [-]has-synced failed: reason withheld Mar 09 13:24:47 crc kubenswrapper[4764]: [+]process-running ok Mar 09 13:24:47 crc kubenswrapper[4764]: healthz check failed Mar 09 13:24:47 crc kubenswrapper[4764]: I0309 13:24:47.791449 4764 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-gnnbl" podUID="b9c0d96b-ed96-4925-b890-8743879a8b38" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 09 13:24:47 crc kubenswrapper[4764]: I0309 13:24:47.831850 4764 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock" Mar 09 13:24:47 crc kubenswrapper[4764]: I0309 13:24:47.891649 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 13:24:47 crc kubenswrapper[4764]: E0309 13:24:47.892108 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 13:24:48.392078108 +0000 UTC m=+243.642250016 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 13:24:47 crc kubenswrapper[4764]: I0309 13:24:47.892550 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-w49hn\" (UID: \"d3652fe0-4889-432f-af3f-787dd19c60d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-w49hn" Mar 09 13:24:47 crc kubenswrapper[4764]: E0309 13:24:47.893020 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 13:24:48.393002133 +0000 UTC m=+243.643174041 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-w49hn" (UID: "d3652fe0-4889-432f-af3f-787dd19c60d6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 13:24:47 crc kubenswrapper[4764]: I0309 13:24:47.894098 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-tll5t"] Mar 09 13:24:47 crc kubenswrapper[4764]: I0309 13:24:47.895233 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-tll5t" Mar 09 13:24:47 crc kubenswrapper[4764]: I0309 13:24:47.900145 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Mar 09 13:24:47 crc kubenswrapper[4764]: I0309 13:24:47.942380 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-tll5t"] Mar 09 13:24:47 crc kubenswrapper[4764]: I0309 13:24:47.994154 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 13:24:47 crc kubenswrapper[4764]: E0309 13:24:47.994741 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 13:24:48.494715168 +0000 UTC m=+243.744887076 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 13:24:48 crc kubenswrapper[4764]: I0309 13:24:48.078388 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-866db9688c-qwkl9"] Mar 09 13:24:48 crc kubenswrapper[4764]: I0309 13:24:48.079122 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-866db9688c-qwkl9" Mar 09 13:24:48 crc kubenswrapper[4764]: I0309 13:24:48.092185 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 09 13:24:48 crc kubenswrapper[4764]: I0309 13:24:48.093010 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 09 13:24:48 crc kubenswrapper[4764]: I0309 13:24:48.093344 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 09 13:24:48 crc kubenswrapper[4764]: I0309 13:24:48.093732 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 09 13:24:48 crc kubenswrapper[4764]: I0309 13:24:48.094023 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 09 13:24:48 crc kubenswrapper[4764]: I0309 13:24:48.095990 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" 
(UniqueName: \"kubernetes.io/empty-dir/41a3ce40-a2f8-4bc3-8fbb-eccfb1ec4ce6-utilities\") pod \"redhat-operators-tll5t\" (UID: \"41a3ce40-a2f8-4bc3-8fbb-eccfb1ec4ce6\") " pod="openshift-marketplace/redhat-operators-tll5t" Mar 09 13:24:48 crc kubenswrapper[4764]: I0309 13:24:48.096026 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/41a3ce40-a2f8-4bc3-8fbb-eccfb1ec4ce6-catalog-content\") pod \"redhat-operators-tll5t\" (UID: \"41a3ce40-a2f8-4bc3-8fbb-eccfb1ec4ce6\") " pod="openshift-marketplace/redhat-operators-tll5t" Mar 09 13:24:48 crc kubenswrapper[4764]: I0309 13:24:48.096057 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v76nj\" (UniqueName: \"kubernetes.io/projected/41a3ce40-a2f8-4bc3-8fbb-eccfb1ec4ce6-kube-api-access-v76nj\") pod \"redhat-operators-tll5t\" (UID: \"41a3ce40-a2f8-4bc3-8fbb-eccfb1ec4ce6\") " pod="openshift-marketplace/redhat-operators-tll5t" Mar 09 13:24:48 crc kubenswrapper[4764]: I0309 13:24:48.096113 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-w49hn\" (UID: \"d3652fe0-4889-432f-af3f-787dd19c60d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-w49hn" Mar 09 13:24:48 crc kubenswrapper[4764]: E0309 13:24:48.096483 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 13:24:48.596468824 +0000 UTC m=+243.846640732 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-w49hn" (UID: "d3652fe0-4889-432f-af3f-787dd19c60d6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 13:24:48 crc kubenswrapper[4764]: I0309 13:24:48.127561 4764 generic.go:334] "Generic (PLEG): container finished" podID="a76121be-d090-4f2a-9e57-1a160a4bb4f2" containerID="851e5f42b5e692a4d7bb4a0fda84945ae1aa93c4dc2838b29856d0c9ed98624f" exitCode=0 Mar 09 13:24:48 crc kubenswrapper[4764]: I0309 13:24:48.128635 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jn8f5" event={"ID":"a76121be-d090-4f2a-9e57-1a160a4bb4f2","Type":"ContainerDied","Data":"851e5f42b5e692a4d7bb4a0fda84945ae1aa93c4dc2838b29856d0c9ed98624f"} Mar 09 13:24:48 crc kubenswrapper[4764]: I0309 13:24:48.143770 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 09 13:24:48 crc kubenswrapper[4764]: I0309 13:24:48.207294 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 13:24:48 crc kubenswrapper[4764]: I0309 13:24:48.207584 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/35b2abcd-af84-40fe-8b37-90139612d63e-serving-cert\") pod \"route-controller-manager-866db9688c-qwkl9\" (UID: \"35b2abcd-af84-40fe-8b37-90139612d63e\") " 
pod="openshift-route-controller-manager/route-controller-manager-866db9688c-qwkl9" Mar 09 13:24:48 crc kubenswrapper[4764]: I0309 13:24:48.207611 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/41a3ce40-a2f8-4bc3-8fbb-eccfb1ec4ce6-utilities\") pod \"redhat-operators-tll5t\" (UID: \"41a3ce40-a2f8-4bc3-8fbb-eccfb1ec4ce6\") " pod="openshift-marketplace/redhat-operators-tll5t" Mar 09 13:24:48 crc kubenswrapper[4764]: I0309 13:24:48.207632 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/35b2abcd-af84-40fe-8b37-90139612d63e-client-ca\") pod \"route-controller-manager-866db9688c-qwkl9\" (UID: \"35b2abcd-af84-40fe-8b37-90139612d63e\") " pod="openshift-route-controller-manager/route-controller-manager-866db9688c-qwkl9" Mar 09 13:24:48 crc kubenswrapper[4764]: I0309 13:24:48.207673 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/41a3ce40-a2f8-4bc3-8fbb-eccfb1ec4ce6-catalog-content\") pod \"redhat-operators-tll5t\" (UID: \"41a3ce40-a2f8-4bc3-8fbb-eccfb1ec4ce6\") " pod="openshift-marketplace/redhat-operators-tll5t" Mar 09 13:24:48 crc kubenswrapper[4764]: I0309 13:24:48.207720 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9zpjf\" (UniqueName: \"kubernetes.io/projected/35b2abcd-af84-40fe-8b37-90139612d63e-kube-api-access-9zpjf\") pod \"route-controller-manager-866db9688c-qwkl9\" (UID: \"35b2abcd-af84-40fe-8b37-90139612d63e\") " pod="openshift-route-controller-manager/route-controller-manager-866db9688c-qwkl9" Mar 09 13:24:48 crc kubenswrapper[4764]: I0309 13:24:48.207748 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v76nj\" (UniqueName: 
\"kubernetes.io/projected/41a3ce40-a2f8-4bc3-8fbb-eccfb1ec4ce6-kube-api-access-v76nj\") pod \"redhat-operators-tll5t\" (UID: \"41a3ce40-a2f8-4bc3-8fbb-eccfb1ec4ce6\") " pod="openshift-marketplace/redhat-operators-tll5t" Mar 09 13:24:48 crc kubenswrapper[4764]: I0309 13:24:48.207769 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/35b2abcd-af84-40fe-8b37-90139612d63e-config\") pod \"route-controller-manager-866db9688c-qwkl9\" (UID: \"35b2abcd-af84-40fe-8b37-90139612d63e\") " pod="openshift-route-controller-manager/route-controller-manager-866db9688c-qwkl9" Mar 09 13:24:48 crc kubenswrapper[4764]: E0309 13:24:48.207916 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 13:24:48.70789895 +0000 UTC m=+243.958070858 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 13:24:48 crc kubenswrapper[4764]: I0309 13:24:48.208270 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/41a3ce40-a2f8-4bc3-8fbb-eccfb1ec4ce6-utilities\") pod \"redhat-operators-tll5t\" (UID: \"41a3ce40-a2f8-4bc3-8fbb-eccfb1ec4ce6\") " pod="openshift-marketplace/redhat-operators-tll5t" Mar 09 13:24:48 crc kubenswrapper[4764]: I0309 13:24:48.208510 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/41a3ce40-a2f8-4bc3-8fbb-eccfb1ec4ce6-catalog-content\") pod \"redhat-operators-tll5t\" (UID: \"41a3ce40-a2f8-4bc3-8fbb-eccfb1ec4ce6\") " pod="openshift-marketplace/redhat-operators-tll5t" Mar 09 13:24:48 crc kubenswrapper[4764]: I0309 13:24:48.209543 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-866db9688c-qwkl9"] Mar 09 13:24:48 crc kubenswrapper[4764]: I0309 13:24:48.254915 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v76nj\" (UniqueName: \"kubernetes.io/projected/41a3ce40-a2f8-4bc3-8fbb-eccfb1ec4ce6-kube-api-access-v76nj\") pod \"redhat-operators-tll5t\" (UID: \"41a3ce40-a2f8-4bc3-8fbb-eccfb1ec4ce6\") " pod="openshift-marketplace/redhat-operators-tll5t" Mar 09 13:24:48 crc kubenswrapper[4764]: I0309 13:24:48.279894 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-mp9p7" Mar 
09 13:24:48 crc kubenswrapper[4764]: I0309 13:24:48.287807 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-mp9p7" Mar 09 13:24:48 crc kubenswrapper[4764]: I0309 13:24:48.309074 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9zpjf\" (UniqueName: \"kubernetes.io/projected/35b2abcd-af84-40fe-8b37-90139612d63e-kube-api-access-9zpjf\") pod \"route-controller-manager-866db9688c-qwkl9\" (UID: \"35b2abcd-af84-40fe-8b37-90139612d63e\") " pod="openshift-route-controller-manager/route-controller-manager-866db9688c-qwkl9" Mar 09 13:24:48 crc kubenswrapper[4764]: I0309 13:24:48.309123 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/35b2abcd-af84-40fe-8b37-90139612d63e-config\") pod \"route-controller-manager-866db9688c-qwkl9\" (UID: \"35b2abcd-af84-40fe-8b37-90139612d63e\") " pod="openshift-route-controller-manager/route-controller-manager-866db9688c-qwkl9" Mar 09 13:24:48 crc kubenswrapper[4764]: I0309 13:24:48.309192 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-w49hn\" (UID: \"d3652fe0-4889-432f-af3f-787dd19c60d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-w49hn" Mar 09 13:24:48 crc kubenswrapper[4764]: I0309 13:24:48.309266 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/35b2abcd-af84-40fe-8b37-90139612d63e-serving-cert\") pod \"route-controller-manager-866db9688c-qwkl9\" (UID: \"35b2abcd-af84-40fe-8b37-90139612d63e\") " pod="openshift-route-controller-manager/route-controller-manager-866db9688c-qwkl9" Mar 09 13:24:48 crc kubenswrapper[4764]: 
I0309 13:24:48.309296 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/35b2abcd-af84-40fe-8b37-90139612d63e-client-ca\") pod \"route-controller-manager-866db9688c-qwkl9\" (UID: \"35b2abcd-af84-40fe-8b37-90139612d63e\") " pod="openshift-route-controller-manager/route-controller-manager-866db9688c-qwkl9" Mar 09 13:24:48 crc kubenswrapper[4764]: I0309 13:24:48.310886 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/35b2abcd-af84-40fe-8b37-90139612d63e-client-ca\") pod \"route-controller-manager-866db9688c-qwkl9\" (UID: \"35b2abcd-af84-40fe-8b37-90139612d63e\") " pod="openshift-route-controller-manager/route-controller-manager-866db9688c-qwkl9" Mar 09 13:24:48 crc kubenswrapper[4764]: E0309 13:24:48.310991 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 13:24:48.810961531 +0000 UTC m=+244.061133589 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-w49hn" (UID: "d3652fe0-4889-432f-af3f-787dd19c60d6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 13:24:48 crc kubenswrapper[4764]: I0309 13:24:48.317253 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/35b2abcd-af84-40fe-8b37-90139612d63e-serving-cert\") pod \"route-controller-manager-866db9688c-qwkl9\" (UID: \"35b2abcd-af84-40fe-8b37-90139612d63e\") " pod="openshift-route-controller-manager/route-controller-manager-866db9688c-qwkl9" Mar 09 13:24:48 crc kubenswrapper[4764]: I0309 13:24:48.321120 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/35b2abcd-af84-40fe-8b37-90139612d63e-config\") pod \"route-controller-manager-866db9688c-qwkl9\" (UID: \"35b2abcd-af84-40fe-8b37-90139612d63e\") " pod="openshift-route-controller-manager/route-controller-manager-866db9688c-qwkl9" Mar 09 13:24:48 crc kubenswrapper[4764]: I0309 13:24:48.337779 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5d896c677f-8g9dz" event={"ID":"ef84d4f2-b722-415f-bc23-d472e00474b4","Type":"ContainerStarted","Data":"ac78e9706513103621b9ee866b0a26306b3671bd6884454e15e781cf3cf27583"} Mar 09 13:24:48 crc kubenswrapper[4764]: I0309 13:24:48.337820 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5d896c677f-8g9dz" event={"ID":"ef84d4f2-b722-415f-bc23-d472e00474b4","Type":"ContainerStarted","Data":"bd1be1047066fde143bce3e434912476e75a2c14646016c14c7e52ccd0c2869e"} Mar 09 13:24:48 crc 
kubenswrapper[4764]: I0309 13:24:48.337833 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-d9z59"] Mar 09 13:24:48 crc kubenswrapper[4764]: I0309 13:24:48.342977 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-5d896c677f-8g9dz" Mar 09 13:24:48 crc kubenswrapper[4764]: I0309 13:24:48.343097 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-d9z59" Mar 09 13:24:48 crc kubenswrapper[4764]: I0309 13:24:48.374740 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-d9z59"] Mar 09 13:24:48 crc kubenswrapper[4764]: I0309 13:24:48.380353 4764 generic.go:334] "Generic (PLEG): container finished" podID="20acdcb5-ea78-435e-b472-e102d5553c75" containerID="8e85cc77a9a235fb4dc23dc9451690c29cdcf977870a98d1bc077bac0fa992d1" exitCode=0 Mar 09 13:24:48 crc kubenswrapper[4764]: I0309 13:24:48.380542 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mbm5b" event={"ID":"20acdcb5-ea78-435e-b472-e102d5553c75","Type":"ContainerDied","Data":"8e85cc77a9a235fb4dc23dc9451690c29cdcf977870a98d1bc077bac0fa992d1"} Mar 09 13:24:48 crc kubenswrapper[4764]: I0309 13:24:48.405274 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-5d896c677f-8g9dz" Mar 09 13:24:48 crc kubenswrapper[4764]: I0309 13:24:48.408839 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9zpjf\" (UniqueName: \"kubernetes.io/projected/35b2abcd-af84-40fe-8b37-90139612d63e-kube-api-access-9zpjf\") pod \"route-controller-manager-866db9688c-qwkl9\" (UID: \"35b2abcd-af84-40fe-8b37-90139612d63e\") " pod="openshift-route-controller-manager/route-controller-manager-866db9688c-qwkl9" Mar 09 13:24:48 crc kubenswrapper[4764]: I0309 
13:24:48.410696 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 13:24:48 crc kubenswrapper[4764]: E0309 13:24:48.412834 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 13:24:48.912807749 +0000 UTC m=+244.162979657 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 13:24:48 crc kubenswrapper[4764]: I0309 13:24:48.417494 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-2bqx9" event={"ID":"e56f29e6-292c-48d2-a5d7-531cfa6689f5","Type":"ContainerStarted","Data":"8731eb91c9abf063f4e86c70dd77bf09c704c3d37e8f4078169fb9fff1053e33"} Mar 09 13:24:48 crc kubenswrapper[4764]: I0309 13:24:48.417540 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-2bqx9" event={"ID":"e56f29e6-292c-48d2-a5d7-531cfa6689f5","Type":"ContainerStarted","Data":"2f35cd90db2216d308f85082aea56417dc1c891ee59bb6d91d54ae1fc4548ab6"} Mar 09 13:24:48 crc kubenswrapper[4764]: I0309 13:24:48.440890 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-marketplace-qhs57" event={"ID":"691ffa6f-3ee6-47fa-bcef-9fdd74ac86df","Type":"ContainerStarted","Data":"05c2ccc0c11f9b7d9a5361cced5932ce9d54a4f6f29e12db9d17e75834fe8ead"} Mar 09 13:24:48 crc kubenswrapper[4764]: I0309 13:24:48.440941 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qhs57" event={"ID":"691ffa6f-3ee6-47fa-bcef-9fdd74ac86df","Type":"ContainerStarted","Data":"eabffbe2f3a51c427a01ad46e2c40728c19297f3e8e305f2763268cbfbeb6ba0"} Mar 09 13:24:48 crc kubenswrapper[4764]: I0309 13:24:48.462270 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-5d896c677f-8g9dz" podStartSLOduration=5.462255909 podStartE2EDuration="5.462255909s" podCreationTimestamp="2026-03-09 13:24:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:24:48.461389845 +0000 UTC m=+243.711561763" watchObservedRunningTime="2026-03-09 13:24:48.462255909 +0000 UTC m=+243.712427817" Mar 09 13:24:48 crc kubenswrapper[4764]: I0309 13:24:48.465612 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-866db9688c-qwkl9" Mar 09 13:24:48 crc kubenswrapper[4764]: I0309 13:24:48.512919 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c6dd4a8c-608d-44b4-a42c-1cc8b7f9fec2-catalog-content\") pod \"redhat-operators-d9z59\" (UID: \"c6dd4a8c-608d-44b4-a42c-1cc8b7f9fec2\") " pod="openshift-marketplace/redhat-operators-d9z59" Mar 09 13:24:48 crc kubenswrapper[4764]: I0309 13:24:48.513026 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c6dd4a8c-608d-44b4-a42c-1cc8b7f9fec2-utilities\") pod \"redhat-operators-d9z59\" (UID: \"c6dd4a8c-608d-44b4-a42c-1cc8b7f9fec2\") " pod="openshift-marketplace/redhat-operators-d9z59" Mar 09 13:24:48 crc kubenswrapper[4764]: I0309 13:24:48.513077 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-w49hn\" (UID: \"d3652fe0-4889-432f-af3f-787dd19c60d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-w49hn" Mar 09 13:24:48 crc kubenswrapper[4764]: I0309 13:24:48.513181 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-stkxn\" (UniqueName: \"kubernetes.io/projected/c6dd4a8c-608d-44b4-a42c-1cc8b7f9fec2-kube-api-access-stkxn\") pod \"redhat-operators-d9z59\" (UID: \"c6dd4a8c-608d-44b4-a42c-1cc8b7f9fec2\") " pod="openshift-marketplace/redhat-operators-d9z59" Mar 09 13:24:48 crc kubenswrapper[4764]: E0309 13:24:48.515491 4764 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 13:24:49.015476219 +0000 UTC m=+244.265648127 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-w49hn" (UID: "d3652fe0-4889-432f-af3f-787dd19c60d6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 13:24:48 crc kubenswrapper[4764]: I0309 13:24:48.552794 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-8g9lj" Mar 09 13:24:48 crc kubenswrapper[4764]: I0309 13:24:48.552861 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-8g9lj" Mar 09 13:24:48 crc kubenswrapper[4764]: I0309 13:24:48.552878 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-g7k9k"] Mar 09 13:24:48 crc kubenswrapper[4764]: I0309 13:24:48.556151 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-tll5t" Mar 09 13:24:48 crc kubenswrapper[4764]: I0309 13:24:48.612542 4764 patch_prober.go:28] interesting pod/console-f9d7485db-8g9lj container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.15:8443/health\": dial tcp 10.217.0.15:8443: connect: connection refused" start-of-body= Mar 09 13:24:48 crc kubenswrapper[4764]: I0309 13:24:48.612606 4764 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-8g9lj" podUID="a1b4dc0b-edea-4c0d-8d61-3e3d3133605d" containerName="console" probeResult="failure" output="Get \"https://10.217.0.15:8443/health\": dial tcp 10.217.0.15:8443: connect: connection refused" Mar 09 13:24:48 crc kubenswrapper[4764]: I0309 13:24:48.617182 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 13:24:48 crc kubenswrapper[4764]: I0309 13:24:48.619586 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c6dd4a8c-608d-44b4-a42c-1cc8b7f9fec2-catalog-content\") pod \"redhat-operators-d9z59\" (UID: \"c6dd4a8c-608d-44b4-a42c-1cc8b7f9fec2\") " pod="openshift-marketplace/redhat-operators-d9z59" Mar 09 13:24:48 crc kubenswrapper[4764]: I0309 13:24:48.620316 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c6dd4a8c-608d-44b4-a42c-1cc8b7f9fec2-utilities\") pod \"redhat-operators-d9z59\" (UID: \"c6dd4a8c-608d-44b4-a42c-1cc8b7f9fec2\") " pod="openshift-marketplace/redhat-operators-d9z59" Mar 09 13:24:48 crc kubenswrapper[4764]: I0309 13:24:48.620428 4764 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6597fc34-10ee-4984-9c69-f4b7c0d46e2a-metrics-certs\") pod \"network-metrics-daemon-wkwdz\" (UID: \"6597fc34-10ee-4984-9c69-f4b7c0d46e2a\") " pod="openshift-multus/network-metrics-daemon-wkwdz" Mar 09 13:24:48 crc kubenswrapper[4764]: I0309 13:24:48.620574 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-stkxn\" (UniqueName: \"kubernetes.io/projected/c6dd4a8c-608d-44b4-a42c-1cc8b7f9fec2-kube-api-access-stkxn\") pod \"redhat-operators-d9z59\" (UID: \"c6dd4a8c-608d-44b4-a42c-1cc8b7f9fec2\") " pod="openshift-marketplace/redhat-operators-d9z59" Mar 09 13:24:48 crc kubenswrapper[4764]: E0309 13:24:48.622815 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 13:24:49.122782715 +0000 UTC m=+244.372954623 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 13:24:48 crc kubenswrapper[4764]: I0309 13:24:48.623988 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c6dd4a8c-608d-44b4-a42c-1cc8b7f9fec2-catalog-content\") pod \"redhat-operators-d9z59\" (UID: \"c6dd4a8c-608d-44b4-a42c-1cc8b7f9fec2\") " pod="openshift-marketplace/redhat-operators-d9z59" Mar 09 13:24:48 crc kubenswrapper[4764]: I0309 13:24:48.624443 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c6dd4a8c-608d-44b4-a42c-1cc8b7f9fec2-utilities\") pod \"redhat-operators-d9z59\" (UID: \"c6dd4a8c-608d-44b4-a42c-1cc8b7f9fec2\") " pod="openshift-marketplace/redhat-operators-d9z59" Mar 09 13:24:48 crc kubenswrapper[4764]: I0309 13:24:48.628236 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Mar 09 13:24:48 crc kubenswrapper[4764]: I0309 13:24:48.647461 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-stkxn\" (UniqueName: \"kubernetes.io/projected/c6dd4a8c-608d-44b4-a42c-1cc8b7f9fec2-kube-api-access-stkxn\") pod \"redhat-operators-d9z59\" (UID: \"c6dd4a8c-608d-44b4-a42c-1cc8b7f9fec2\") " pod="openshift-marketplace/redhat-operators-d9z59" Mar 09 13:24:48 crc kubenswrapper[4764]: I0309 13:24:48.654337 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/6597fc34-10ee-4984-9c69-f4b7c0d46e2a-metrics-certs\") pod \"network-metrics-daemon-wkwdz\" (UID: \"6597fc34-10ee-4984-9c69-f4b7c0d46e2a\") " pod="openshift-multus/network-metrics-daemon-wkwdz" Mar 09 13:24:48 crc kubenswrapper[4764]: I0309 13:24:48.683030 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Mar 09 13:24:48 crc kubenswrapper[4764]: I0309 13:24:48.689465 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wkwdz" Mar 09 13:24:48 crc kubenswrapper[4764]: I0309 13:24:48.724044 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-w49hn\" (UID: \"d3652fe0-4889-432f-af3f-787dd19c60d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-w49hn" Mar 09 13:24:48 crc kubenswrapper[4764]: E0309 13:24:48.724471 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 13:24:49.224451518 +0000 UTC m=+244.474623426 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-w49hn" (UID: "d3652fe0-4889-432f-af3f-787dd19c60d6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 13:24:48 crc kubenswrapper[4764]: I0309 13:24:48.763861 4764 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2026-03-09T13:24:47.83187509Z","Handler":null,"Name":""} Mar 09 13:24:48 crc kubenswrapper[4764]: I0309 13:24:48.763998 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-d9z59" Mar 09 13:24:48 crc kubenswrapper[4764]: I0309 13:24:48.787837 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-gnnbl" Mar 09 13:24:48 crc kubenswrapper[4764]: I0309 13:24:48.793440 4764 patch_prober.go:28] interesting pod/router-default-5444994796-gnnbl container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 09 13:24:48 crc kubenswrapper[4764]: [-]has-synced failed: reason withheld Mar 09 13:24:48 crc kubenswrapper[4764]: [+]process-running ok Mar 09 13:24:48 crc kubenswrapper[4764]: healthz check failed Mar 09 13:24:48 crc kubenswrapper[4764]: I0309 13:24:48.793501 4764 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-gnnbl" podUID="b9c0d96b-ed96-4925-b890-8743879a8b38" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 09 13:24:48 crc 
kubenswrapper[4764]: I0309 13:24:48.826228 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 13:24:48 crc kubenswrapper[4764]: E0309 13:24:48.826348 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 13:24:49.326329798 +0000 UTC m=+244.576501706 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 13:24:48 crc kubenswrapper[4764]: I0309 13:24:48.826678 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-w49hn\" (UID: \"d3652fe0-4889-432f-af3f-787dd19c60d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-w49hn" Mar 09 13:24:48 crc kubenswrapper[4764]: E0309 13:24:48.827229 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-03-09 13:24:49.327219952 +0000 UTC m=+244.577391860 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-w49hn" (UID: "d3652fe0-4889-432f-af3f-787dd19c60d6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 13:24:48 crc kubenswrapper[4764]: I0309 13:24:48.830523 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-d4gwh" Mar 09 13:24:48 crc kubenswrapper[4764]: I0309 13:24:48.854420 4764 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0 Mar 09 13:24:48 crc kubenswrapper[4764]: I0309 13:24:48.854473 4764 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock Mar 09 13:24:48 crc kubenswrapper[4764]: I0309 13:24:48.928960 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 13:24:48 crc kubenswrapper[4764]: I0309 13:24:48.964063 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" 
(UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Mar 09 13:24:49 crc kubenswrapper[4764]: I0309 13:24:49.030197 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-w49hn\" (UID: \"d3652fe0-4889-432f-af3f-787dd19c60d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-w49hn" Mar 09 13:24:49 crc kubenswrapper[4764]: I0309 13:24:49.111118 4764 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Mar 09 13:24:49 crc kubenswrapper[4764]: I0309 13:24:49.111162 4764 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-w49hn\" (UID: \"d3652fe0-4889-432f-af3f-787dd19c60d6\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-w49hn" Mar 09 13:24:49 crc kubenswrapper[4764]: I0309 13:24:49.218448 4764 patch_prober.go:28] interesting pod/downloads-7954f5f757-x927s container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.29:8080/\": dial tcp 10.217.0.29:8080: connect: connection refused" start-of-body= Mar 09 13:24:49 crc kubenswrapper[4764]: I0309 13:24:49.218901 4764 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-x927s" podUID="d245b116-e47d-4b15-a40d-0a9fa34cf1df" 
containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.29:8080/\": dial tcp 10.217.0.29:8080: connect: connection refused" Mar 09 13:24:49 crc kubenswrapper[4764]: I0309 13:24:49.219248 4764 patch_prober.go:28] interesting pod/downloads-7954f5f757-x927s container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.29:8080/\": dial tcp 10.217.0.29:8080: connect: connection refused" start-of-body= Mar 09 13:24:49 crc kubenswrapper[4764]: I0309 13:24:49.219265 4764 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-x927s" podUID="d245b116-e47d-4b15-a40d-0a9fa34cf1df" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.29:8080/\": dial tcp 10.217.0.29:8080: connect: connection refused" Mar 09 13:24:49 crc kubenswrapper[4764]: I0309 13:24:49.250619 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-w49hn\" (UID: \"d3652fe0-4889-432f-af3f-787dd19c60d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-w49hn" Mar 09 13:24:49 crc kubenswrapper[4764]: I0309 13:24:49.446873 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Mar 09 13:24:49 crc kubenswrapper[4764]: I0309 13:24:49.455340 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-w49hn" Mar 09 13:24:49 crc kubenswrapper[4764]: I0309 13:24:49.509066 4764 generic.go:334] "Generic (PLEG): container finished" podID="691ffa6f-3ee6-47fa-bcef-9fdd74ac86df" containerID="05c2ccc0c11f9b7d9a5361cced5932ce9d54a4f6f29e12db9d17e75834fe8ead" exitCode=0 Mar 09 13:24:49 crc kubenswrapper[4764]: I0309 13:24:49.509137 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qhs57" event={"ID":"691ffa6f-3ee6-47fa-bcef-9fdd74ac86df","Type":"ContainerDied","Data":"05c2ccc0c11f9b7d9a5361cced5932ce9d54a4f6f29e12db9d17e75834fe8ead"} Mar 09 13:24:49 crc kubenswrapper[4764]: I0309 13:24:49.531064 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-2bqx9" event={"ID":"e56f29e6-292c-48d2-a5d7-531cfa6689f5","Type":"ContainerStarted","Data":"b83d080d807453dffd1053a90dd5bc9e2fdddf2a8f80094f255f56ecd12499fc"} Mar 09 13:24:49 crc kubenswrapper[4764]: I0309 13:24:49.539446 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-866db9688c-qwkl9"] Mar 09 13:24:49 crc kubenswrapper[4764]: I0309 13:24:49.543541 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-tll5t"] Mar 09 13:24:49 crc kubenswrapper[4764]: I0309 13:24:49.548455 4764 generic.go:334] "Generic (PLEG): container finished" podID="7a967c79-e11e-4c58-b42e-652d1406ac88" containerID="a7592b9bea39d66aa3632078215711f53e09d218ed98410b531203d64bd85eca" exitCode=0 Mar 09 13:24:49 crc kubenswrapper[4764]: I0309 13:24:49.548692 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-g7k9k" event={"ID":"7a967c79-e11e-4c58-b42e-652d1406ac88","Type":"ContainerDied","Data":"a7592b9bea39d66aa3632078215711f53e09d218ed98410b531203d64bd85eca"} Mar 09 13:24:49 crc kubenswrapper[4764]: I0309 13:24:49.548736 
4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-g7k9k" event={"ID":"7a967c79-e11e-4c58-b42e-652d1406ac88","Type":"ContainerStarted","Data":"2183e838c5144408fdc015b8deb0cb2c5e715404d51e8b64aa5f21859f0ebf3c"} Mar 09 13:24:49 crc kubenswrapper[4764]: I0309 13:24:49.580865 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-2bqx9" podStartSLOduration=13.580823842000001 podStartE2EDuration="13.580823842s" podCreationTimestamp="2026-03-09 13:24:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:24:49.559041306 +0000 UTC m=+244.809213224" watchObservedRunningTime="2026-03-09 13:24:49.580823842 +0000 UTC m=+244.830995770" Mar 09 13:24:49 crc kubenswrapper[4764]: I0309 13:24:49.617082 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes" Mar 09 13:24:49 crc kubenswrapper[4764]: I0309 13:24:49.796660 4764 patch_prober.go:28] interesting pod/router-default-5444994796-gnnbl container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 09 13:24:49 crc kubenswrapper[4764]: [-]has-synced failed: reason withheld Mar 09 13:24:49 crc kubenswrapper[4764]: [+]process-running ok Mar 09 13:24:49 crc kubenswrapper[4764]: healthz check failed Mar 09 13:24:49 crc kubenswrapper[4764]: I0309 13:24:49.796741 4764 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-gnnbl" podUID="b9c0d96b-ed96-4925-b890-8743879a8b38" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 09 13:24:49 crc kubenswrapper[4764]: I0309 13:24:49.855788 4764 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-d9z59"] Mar 09 13:24:49 crc kubenswrapper[4764]: I0309 13:24:49.954295 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-wkwdz"] Mar 09 13:24:49 crc kubenswrapper[4764]: I0309 13:24:49.971987 4764 ???:1] "http: TLS handshake error from 192.168.126.11:57610: no serving certificate available for the kubelet" Mar 09 13:24:50 crc kubenswrapper[4764]: I0309 13:24:50.126269 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-w49hn"] Mar 09 13:24:50 crc kubenswrapper[4764]: W0309 13:24:50.173115 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd3652fe0_4889_432f_af3f_787dd19c60d6.slice/crio-f63eb7186d86d6bf6062656b177ec8adc1e47100bcb490f0e226ebe524f4ea53 WatchSource:0}: Error finding container f63eb7186d86d6bf6062656b177ec8adc1e47100bcb490f0e226ebe524f4ea53: Status 404 returned error can't find the container with id f63eb7186d86d6bf6062656b177ec8adc1e47100bcb490f0e226ebe524f4ea53 Mar 09 13:24:50 crc kubenswrapper[4764]: I0309 13:24:50.603284 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Mar 09 13:24:50 crc kubenswrapper[4764]: I0309 13:24:50.604892 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 09 13:24:50 crc kubenswrapper[4764]: I0309 13:24:50.610386 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Mar 09 13:24:50 crc kubenswrapper[4764]: I0309 13:24:50.610859 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Mar 09 13:24:50 crc kubenswrapper[4764]: I0309 13:24:50.618776 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Mar 09 13:24:50 crc kubenswrapper[4764]: I0309 13:24:50.675152 4764 generic.go:334] "Generic (PLEG): container finished" podID="39da5087-79bc-4154-b340-22183d9e4417" containerID="45383c97959a17ffdeaa9f0ab6e8a1110b113c90d84de0fa490663169c04fa26" exitCode=0 Mar 09 13:24:50 crc kubenswrapper[4764]: I0309 13:24:50.675247 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29551035-5fpgn" event={"ID":"39da5087-79bc-4154-b340-22183d9e4417","Type":"ContainerDied","Data":"45383c97959a17ffdeaa9f0ab6e8a1110b113c90d84de0fa490663169c04fa26"} Mar 09 13:24:50 crc kubenswrapper[4764]: I0309 13:24:50.680432 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-w49hn" event={"ID":"d3652fe0-4889-432f-af3f-787dd19c60d6","Type":"ContainerStarted","Data":"d4f1dae0a1e050bab0d3f5cbb5a68f2b2bcb5dd4773d77210e4e118d9cdca6b4"} Mar 09 13:24:50 crc kubenswrapper[4764]: I0309 13:24:50.680479 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-w49hn" event={"ID":"d3652fe0-4889-432f-af3f-787dd19c60d6","Type":"ContainerStarted","Data":"f63eb7186d86d6bf6062656b177ec8adc1e47100bcb490f0e226ebe524f4ea53"} Mar 09 13:24:50 crc kubenswrapper[4764]: I0309 13:24:50.681314 4764 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-w49hn" Mar 09 13:24:50 crc kubenswrapper[4764]: I0309 13:24:50.683421 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-wkwdz" event={"ID":"6597fc34-10ee-4984-9c69-f4b7c0d46e2a","Type":"ContainerStarted","Data":"97f66587a61051c142b6607c2b2611b4d08ddb23635aab32a8f90392921c094a"} Mar 09 13:24:50 crc kubenswrapper[4764]: I0309 13:24:50.686127 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-866db9688c-qwkl9" event={"ID":"35b2abcd-af84-40fe-8b37-90139612d63e","Type":"ContainerStarted","Data":"3bce33547ff50496fc00d452193e1617ac80b7c148e9afbaf542b8ac329a6ecf"} Mar 09 13:24:50 crc kubenswrapper[4764]: I0309 13:24:50.686170 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-866db9688c-qwkl9" event={"ID":"35b2abcd-af84-40fe-8b37-90139612d63e","Type":"ContainerStarted","Data":"99e3df704f16abbd89cfa68b2dbe7a3e325602c3ff0a20c7684b7b091fc44203"} Mar 09 13:24:50 crc kubenswrapper[4764]: I0309 13:24:50.686442 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-866db9688c-qwkl9" Mar 09 13:24:50 crc kubenswrapper[4764]: I0309 13:24:50.705201 4764 generic.go:334] "Generic (PLEG): container finished" podID="c6dd4a8c-608d-44b4-a42c-1cc8b7f9fec2" containerID="d8b26ed27d9d828e2d55afc5c28aac89bdf5c58d97e8a4288af402a3789e0cd2" exitCode=0 Mar 09 13:24:50 crc kubenswrapper[4764]: I0309 13:24:50.705411 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-d9z59" event={"ID":"c6dd4a8c-608d-44b4-a42c-1cc8b7f9fec2","Type":"ContainerDied","Data":"d8b26ed27d9d828e2d55afc5c28aac89bdf5c58d97e8a4288af402a3789e0cd2"} Mar 09 13:24:50 crc kubenswrapper[4764]: I0309 
13:24:50.705469 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-d9z59" event={"ID":"c6dd4a8c-608d-44b4-a42c-1cc8b7f9fec2","Type":"ContainerStarted","Data":"63639f892c3d7cc35dde0976454fc20f0b0dcd4c9977b4b39ee9f80a34190631"} Mar 09 13:24:50 crc kubenswrapper[4764]: I0309 13:24:50.712347 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f756dee9-7011-49b3-8a60-b7e08f01972d-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"f756dee9-7011-49b3-8a60-b7e08f01972d\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 09 13:24:50 crc kubenswrapper[4764]: I0309 13:24:50.712434 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/f756dee9-7011-49b3-8a60-b7e08f01972d-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"f756dee9-7011-49b3-8a60-b7e08f01972d\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 09 13:24:50 crc kubenswrapper[4764]: I0309 13:24:50.732947 4764 generic.go:334] "Generic (PLEG): container finished" podID="41a3ce40-a2f8-4bc3-8fbb-eccfb1ec4ce6" containerID="ad5ce40518308f615c3da026f6c3fbf2d4af3ef9e2a050d739fa7a496bac2d88" exitCode=0 Mar 09 13:24:50 crc kubenswrapper[4764]: I0309 13:24:50.733047 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tll5t" event={"ID":"41a3ce40-a2f8-4bc3-8fbb-eccfb1ec4ce6","Type":"ContainerDied","Data":"ad5ce40518308f615c3da026f6c3fbf2d4af3ef9e2a050d739fa7a496bac2d88"} Mar 09 13:24:50 crc kubenswrapper[4764]: I0309 13:24:50.733112 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tll5t" 
event={"ID":"41a3ce40-a2f8-4bc3-8fbb-eccfb1ec4ce6","Type":"ContainerStarted","Data":"adf303b563119dba790a00aa6f3db90393d7970196ac5ffecf8fea14de83b469"} Mar 09 13:24:50 crc kubenswrapper[4764]: I0309 13:24:50.764131 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-w49hn" podStartSLOduration=180.764103366 podStartE2EDuration="3m0.764103366s" podCreationTimestamp="2026-03-09 13:21:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:24:50.755800823 +0000 UTC m=+246.005972731" watchObservedRunningTime="2026-03-09 13:24:50.764103366 +0000 UTC m=+246.014275274" Mar 09 13:24:50 crc kubenswrapper[4764]: I0309 13:24:50.803698 4764 patch_prober.go:28] interesting pod/router-default-5444994796-gnnbl container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 09 13:24:50 crc kubenswrapper[4764]: [-]has-synced failed: reason withheld Mar 09 13:24:50 crc kubenswrapper[4764]: [+]process-running ok Mar 09 13:24:50 crc kubenswrapper[4764]: healthz check failed Mar 09 13:24:50 crc kubenswrapper[4764]: I0309 13:24:50.803755 4764 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-gnnbl" podUID="b9c0d96b-ed96-4925-b890-8743879a8b38" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 09 13:24:50 crc kubenswrapper[4764]: I0309 13:24:50.813412 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/f756dee9-7011-49b3-8a60-b7e08f01972d-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"f756dee9-7011-49b3-8a60-b7e08f01972d\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 09 13:24:50 crc 
kubenswrapper[4764]: I0309 13:24:50.813643 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f756dee9-7011-49b3-8a60-b7e08f01972d-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"f756dee9-7011-49b3-8a60-b7e08f01972d\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 09 13:24:50 crc kubenswrapper[4764]: I0309 13:24:50.815267 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/f756dee9-7011-49b3-8a60-b7e08f01972d-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"f756dee9-7011-49b3-8a60-b7e08f01972d\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 09 13:24:50 crc kubenswrapper[4764]: I0309 13:24:50.844002 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-866db9688c-qwkl9" Mar 09 13:24:50 crc kubenswrapper[4764]: I0309 13:24:50.872337 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f756dee9-7011-49b3-8a60-b7e08f01972d-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"f756dee9-7011-49b3-8a60-b7e08f01972d\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 09 13:24:50 crc kubenswrapper[4764]: I0309 13:24:50.875379 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-866db9688c-qwkl9" podStartSLOduration=7.875366288 podStartE2EDuration="7.875366288s" podCreationTimestamp="2026-03-09 13:24:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:24:50.872756327 +0000 UTC m=+246.122928235" watchObservedRunningTime="2026-03-09 13:24:50.875366288 +0000 UTC m=+246.125538196" Mar 09 
13:24:50 crc kubenswrapper[4764]: I0309 13:24:50.944338 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 09 13:24:51 crc kubenswrapper[4764]: I0309 13:24:51.488985 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Mar 09 13:24:51 crc kubenswrapper[4764]: I0309 13:24:51.492686 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 09 13:24:51 crc kubenswrapper[4764]: I0309 13:24:51.496700 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Mar 09 13:24:51 crc kubenswrapper[4764]: I0309 13:24:51.496869 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Mar 09 13:24:51 crc kubenswrapper[4764]: I0309 13:24:51.497247 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Mar 09 13:24:51 crc kubenswrapper[4764]: I0309 13:24:51.529386 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/27bcfe8c-a29f-4f0b-9f73-3e075a201db7-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"27bcfe8c-a29f-4f0b-9f73-3e075a201db7\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 09 13:24:51 crc kubenswrapper[4764]: I0309 13:24:51.529446 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/27bcfe8c-a29f-4f0b-9f73-3e075a201db7-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"27bcfe8c-a29f-4f0b-9f73-3e075a201db7\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 09 13:24:51 crc kubenswrapper[4764]: I0309 13:24:51.615764 4764 kubelet.go:2428] 
"SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Mar 09 13:24:51 crc kubenswrapper[4764]: I0309 13:24:51.631153 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/27bcfe8c-a29f-4f0b-9f73-3e075a201db7-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"27bcfe8c-a29f-4f0b-9f73-3e075a201db7\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 09 13:24:51 crc kubenswrapper[4764]: I0309 13:24:51.631197 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/27bcfe8c-a29f-4f0b-9f73-3e075a201db7-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"27bcfe8c-a29f-4f0b-9f73-3e075a201db7\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 09 13:24:51 crc kubenswrapper[4764]: I0309 13:24:51.631289 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/27bcfe8c-a29f-4f0b-9f73-3e075a201db7-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"27bcfe8c-a29f-4f0b-9f73-3e075a201db7\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 09 13:24:51 crc kubenswrapper[4764]: W0309 13:24:51.636460 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podf756dee9_7011_49b3_8a60_b7e08f01972d.slice/crio-f47e9b814d8c689b9a403610e6c688065074a6f78e937c6390cbb84def8a7306 WatchSource:0}: Error finding container f47e9b814d8c689b9a403610e6c688065074a6f78e937c6390cbb84def8a7306: Status 404 returned error can't find the container with id f47e9b814d8c689b9a403610e6c688065074a6f78e937c6390cbb84def8a7306 Mar 09 13:24:51 crc kubenswrapper[4764]: I0309 13:24:51.656730 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/27bcfe8c-a29f-4f0b-9f73-3e075a201db7-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"27bcfe8c-a29f-4f0b-9f73-3e075a201db7\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 09 13:24:51 crc kubenswrapper[4764]: I0309 13:24:51.792181 4764 patch_prober.go:28] interesting pod/router-default-5444994796-gnnbl container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 09 13:24:51 crc kubenswrapper[4764]: [-]has-synced failed: reason withheld Mar 09 13:24:51 crc kubenswrapper[4764]: [+]process-running ok Mar 09 13:24:51 crc kubenswrapper[4764]: healthz check failed Mar 09 13:24:51 crc kubenswrapper[4764]: I0309 13:24:51.792253 4764 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-gnnbl" podUID="b9c0d96b-ed96-4925-b890-8743879a8b38" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 09 13:24:51 crc kubenswrapper[4764]: I0309 13:24:51.803310 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-wkwdz" event={"ID":"6597fc34-10ee-4984-9c69-f4b7c0d46e2a","Type":"ContainerStarted","Data":"75d6a85048a1e2eb11458ae8be437eeb0a03f7941f5d32a79052f616b848b320"} Mar 09 13:24:51 crc kubenswrapper[4764]: I0309 13:24:51.803352 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-wkwdz" event={"ID":"6597fc34-10ee-4984-9c69-f4b7c0d46e2a","Type":"ContainerStarted","Data":"11436f5431ce19fb1a9c2495b97d4e18b440349aa7e01e3feb8b315186104b47"} Mar 09 13:24:51 crc kubenswrapper[4764]: I0309 13:24:51.820192 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" 
event={"ID":"f756dee9-7011-49b3-8a60-b7e08f01972d","Type":"ContainerStarted","Data":"f47e9b814d8c689b9a403610e6c688065074a6f78e937c6390cbb84def8a7306"} Mar 09 13:24:51 crc kubenswrapper[4764]: I0309 13:24:51.824490 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-wkwdz" podStartSLOduration=181.824478676 podStartE2EDuration="3m1.824478676s" podCreationTimestamp="2026-03-09 13:21:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:24:51.823601143 +0000 UTC m=+247.073773071" watchObservedRunningTime="2026-03-09 13:24:51.824478676 +0000 UTC m=+247.074650584" Mar 09 13:24:51 crc kubenswrapper[4764]: I0309 13:24:51.836561 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 09 13:24:52 crc kubenswrapper[4764]: I0309 13:24:52.523477 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29551035-5fpgn" Mar 09 13:24:52 crc kubenswrapper[4764]: I0309 13:24:52.570526 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/39da5087-79bc-4154-b340-22183d9e4417-config-volume\") pod \"39da5087-79bc-4154-b340-22183d9e4417\" (UID: \"39da5087-79bc-4154-b340-22183d9e4417\") " Mar 09 13:24:52 crc kubenswrapper[4764]: I0309 13:24:52.570633 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ss45h\" (UniqueName: \"kubernetes.io/projected/39da5087-79bc-4154-b340-22183d9e4417-kube-api-access-ss45h\") pod \"39da5087-79bc-4154-b340-22183d9e4417\" (UID: \"39da5087-79bc-4154-b340-22183d9e4417\") " Mar 09 13:24:52 crc kubenswrapper[4764]: I0309 13:24:52.570774 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/39da5087-79bc-4154-b340-22183d9e4417-secret-volume\") pod \"39da5087-79bc-4154-b340-22183d9e4417\" (UID: \"39da5087-79bc-4154-b340-22183d9e4417\") " Mar 09 13:24:52 crc kubenswrapper[4764]: I0309 13:24:52.571445 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/39da5087-79bc-4154-b340-22183d9e4417-config-volume" (OuterVolumeSpecName: "config-volume") pod "39da5087-79bc-4154-b340-22183d9e4417" (UID: "39da5087-79bc-4154-b340-22183d9e4417"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:24:52 crc kubenswrapper[4764]: I0309 13:24:52.588830 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/39da5087-79bc-4154-b340-22183d9e4417-kube-api-access-ss45h" (OuterVolumeSpecName: "kube-api-access-ss45h") pod "39da5087-79bc-4154-b340-22183d9e4417" (UID: "39da5087-79bc-4154-b340-22183d9e4417"). 
InnerVolumeSpecName "kube-api-access-ss45h". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:24:52 crc kubenswrapper[4764]: I0309 13:24:52.593854 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/39da5087-79bc-4154-b340-22183d9e4417-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "39da5087-79bc-4154-b340-22183d9e4417" (UID: "39da5087-79bc-4154-b340-22183d9e4417"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:24:52 crc kubenswrapper[4764]: I0309 13:24:52.674574 4764 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/39da5087-79bc-4154-b340-22183d9e4417-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 09 13:24:52 crc kubenswrapper[4764]: I0309 13:24:52.674612 4764 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/39da5087-79bc-4154-b340-22183d9e4417-config-volume\") on node \"crc\" DevicePath \"\"" Mar 09 13:24:52 crc kubenswrapper[4764]: I0309 13:24:52.674624 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ss45h\" (UniqueName: \"kubernetes.io/projected/39da5087-79bc-4154-b340-22183d9e4417-kube-api-access-ss45h\") on node \"crc\" DevicePath \"\"" Mar 09 13:24:52 crc kubenswrapper[4764]: I0309 13:24:52.730087 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Mar 09 13:24:52 crc kubenswrapper[4764]: I0309 13:24:52.790631 4764 patch_prober.go:28] interesting pod/router-default-5444994796-gnnbl container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 09 13:24:52 crc kubenswrapper[4764]: [-]has-synced failed: reason withheld Mar 09 13:24:52 crc kubenswrapper[4764]: [+]process-running ok Mar 09 13:24:52 crc 
kubenswrapper[4764]: healthz check failed Mar 09 13:24:52 crc kubenswrapper[4764]: I0309 13:24:52.790790 4764 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-gnnbl" podUID="b9c0d96b-ed96-4925-b890-8743879a8b38" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 09 13:24:52 crc kubenswrapper[4764]: I0309 13:24:52.868192 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"f756dee9-7011-49b3-8a60-b7e08f01972d","Type":"ContainerStarted","Data":"26fbafd0831fd9da487038da2d1a7850d1c8bfa104a1f78dfc720ff7c56f07b6"} Mar 09 13:24:52 crc kubenswrapper[4764]: I0309 13:24:52.872876 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"27bcfe8c-a29f-4f0b-9f73-3e075a201db7","Type":"ContainerStarted","Data":"11da5fd652d318a343ea97d905dc9b8fda776729b232bdd361ea2f64c5a90799"} Mar 09 13:24:52 crc kubenswrapper[4764]: I0309 13:24:52.887183 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29551035-5fpgn" event={"ID":"39da5087-79bc-4154-b340-22183d9e4417","Type":"ContainerDied","Data":"c05318f0e67358bb015e1f76742485abff9f53469c740dc411704a1af2febb07"} Mar 09 13:24:52 crc kubenswrapper[4764]: I0309 13:24:52.887232 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c05318f0e67358bb015e1f76742485abff9f53469c740dc411704a1af2febb07" Mar 09 13:24:52 crc kubenswrapper[4764]: I0309 13:24:52.887326 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29551035-5fpgn" Mar 09 13:24:52 crc kubenswrapper[4764]: I0309 13:24:52.889621 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/revision-pruner-9-crc" podStartSLOduration=2.8895801629999998 podStartE2EDuration="2.889580163s" podCreationTimestamp="2026-03-09 13:24:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:24:52.886317966 +0000 UTC m=+248.136489894" watchObservedRunningTime="2026-03-09 13:24:52.889580163 +0000 UTC m=+248.139752081" Mar 09 13:24:53 crc kubenswrapper[4764]: I0309 13:24:53.311114 4764 ???:1] "http: TLS handshake error from 192.168.126.11:57624: no serving certificate available for the kubelet" Mar 09 13:24:53 crc kubenswrapper[4764]: I0309 13:24:53.791077 4764 patch_prober.go:28] interesting pod/router-default-5444994796-gnnbl container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 09 13:24:53 crc kubenswrapper[4764]: [-]has-synced failed: reason withheld Mar 09 13:24:53 crc kubenswrapper[4764]: [+]process-running ok Mar 09 13:24:53 crc kubenswrapper[4764]: healthz check failed Mar 09 13:24:53 crc kubenswrapper[4764]: I0309 13:24:53.791138 4764 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-gnnbl" podUID="b9c0d96b-ed96-4925-b890-8743879a8b38" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 09 13:24:53 crc kubenswrapper[4764]: I0309 13:24:53.897125 4764 generic.go:334] "Generic (PLEG): container finished" podID="f756dee9-7011-49b3-8a60-b7e08f01972d" containerID="26fbafd0831fd9da487038da2d1a7850d1c8bfa104a1f78dfc720ff7c56f07b6" exitCode=0 Mar 09 13:24:53 crc kubenswrapper[4764]: 
I0309 13:24:53.897187 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"f756dee9-7011-49b3-8a60-b7e08f01972d","Type":"ContainerDied","Data":"26fbafd0831fd9da487038da2d1a7850d1c8bfa104a1f78dfc720ff7c56f07b6"} Mar 09 13:24:54 crc kubenswrapper[4764]: I0309 13:24:54.495475 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-6nhpx" Mar 09 13:24:54 crc kubenswrapper[4764]: I0309 13:24:54.790445 4764 patch_prober.go:28] interesting pod/router-default-5444994796-gnnbl container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 09 13:24:54 crc kubenswrapper[4764]: [-]has-synced failed: reason withheld Mar 09 13:24:54 crc kubenswrapper[4764]: [+]process-running ok Mar 09 13:24:54 crc kubenswrapper[4764]: healthz check failed Mar 09 13:24:54 crc kubenswrapper[4764]: I0309 13:24:54.790502 4764 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-gnnbl" podUID="b9c0d96b-ed96-4925-b890-8743879a8b38" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 09 13:24:54 crc kubenswrapper[4764]: I0309 13:24:54.930361 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"27bcfe8c-a29f-4f0b-9f73-3e075a201db7","Type":"ContainerStarted","Data":"bd952e333a172908437fe8558e8fefd44c43923bd54e11143d085c1a5dd583ed"} Mar 09 13:24:54 crc kubenswrapper[4764]: I0309 13:24:54.948394 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-8-crc" podStartSLOduration=3.9483779759999997 podStartE2EDuration="3.948377976s" podCreationTimestamp="2026-03-09 13:24:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:24:54.943598808 +0000 UTC m=+250.193770716" watchObservedRunningTime="2026-03-09 13:24:54.948377976 +0000 UTC m=+250.198549884" Mar 09 13:24:55 crc kubenswrapper[4764]: I0309 13:24:55.138691 4764 ???:1] "http: TLS handshake error from 192.168.126.11:57640: no serving certificate available for the kubelet" Mar 09 13:24:55 crc kubenswrapper[4764]: I0309 13:24:55.479562 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 09 13:24:55 crc kubenswrapper[4764]: I0309 13:24:55.660503 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/f756dee9-7011-49b3-8a60-b7e08f01972d-kubelet-dir\") pod \"f756dee9-7011-49b3-8a60-b7e08f01972d\" (UID: \"f756dee9-7011-49b3-8a60-b7e08f01972d\") " Mar 09 13:24:55 crc kubenswrapper[4764]: I0309 13:24:55.660674 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f756dee9-7011-49b3-8a60-b7e08f01972d-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "f756dee9-7011-49b3-8a60-b7e08f01972d" (UID: "f756dee9-7011-49b3-8a60-b7e08f01972d"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 09 13:24:55 crc kubenswrapper[4764]: I0309 13:24:55.660718 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f756dee9-7011-49b3-8a60-b7e08f01972d-kube-api-access\") pod \"f756dee9-7011-49b3-8a60-b7e08f01972d\" (UID: \"f756dee9-7011-49b3-8a60-b7e08f01972d\") " Mar 09 13:24:55 crc kubenswrapper[4764]: I0309 13:24:55.661277 4764 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/f756dee9-7011-49b3-8a60-b7e08f01972d-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 09 13:24:55 crc kubenswrapper[4764]: I0309 13:24:55.670431 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f756dee9-7011-49b3-8a60-b7e08f01972d-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "f756dee9-7011-49b3-8a60-b7e08f01972d" (UID: "f756dee9-7011-49b3-8a60-b7e08f01972d"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 13:24:55 crc kubenswrapper[4764]: I0309 13:24:55.762133 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f756dee9-7011-49b3-8a60-b7e08f01972d-kube-api-access\") on node \"crc\" DevicePath \"\""
Mar 09 13:24:55 crc kubenswrapper[4764]: I0309 13:24:55.794862 4764 patch_prober.go:28] interesting pod/router-default-5444994796-gnnbl container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 09 13:24:55 crc kubenswrapper[4764]: [-]has-synced failed: reason withheld
Mar 09 13:24:55 crc kubenswrapper[4764]: [+]process-running ok
Mar 09 13:24:55 crc kubenswrapper[4764]: healthz check failed
Mar 09 13:24:55 crc kubenswrapper[4764]: I0309 13:24:55.794995 4764 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-gnnbl" podUID="b9c0d96b-ed96-4925-b890-8743879a8b38" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 09 13:24:56 crc kubenswrapper[4764]: I0309 13:24:56.002701 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"f756dee9-7011-49b3-8a60-b7e08f01972d","Type":"ContainerDied","Data":"f47e9b814d8c689b9a403610e6c688065074a6f78e937c6390cbb84def8a7306"}
Mar 09 13:24:56 crc kubenswrapper[4764]: I0309 13:24:56.004799 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f47e9b814d8c689b9a403610e6c688065074a6f78e937c6390cbb84def8a7306"
Mar 09 13:24:56 crc kubenswrapper[4764]: I0309 13:24:56.002719 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Mar 09 13:24:56 crc kubenswrapper[4764]: I0309 13:24:56.796004 4764 patch_prober.go:28] interesting pod/router-default-5444994796-gnnbl container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 09 13:24:56 crc kubenswrapper[4764]: [-]has-synced failed: reason withheld
Mar 09 13:24:56 crc kubenswrapper[4764]: [+]process-running ok
Mar 09 13:24:56 crc kubenswrapper[4764]: healthz check failed
Mar 09 13:24:56 crc kubenswrapper[4764]: I0309 13:24:56.796086 4764 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-gnnbl" podUID="b9c0d96b-ed96-4925-b890-8743879a8b38" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 09 13:24:57 crc kubenswrapper[4764]: I0309 13:24:57.042346 4764 generic.go:334] "Generic (PLEG): container finished" podID="27bcfe8c-a29f-4f0b-9f73-3e075a201db7" containerID="bd952e333a172908437fe8558e8fefd44c43923bd54e11143d085c1a5dd583ed" exitCode=0
Mar 09 13:24:57 crc kubenswrapper[4764]: I0309 13:24:57.042407 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"27bcfe8c-a29f-4f0b-9f73-3e075a201db7","Type":"ContainerDied","Data":"bd952e333a172908437fe8558e8fefd44c43923bd54e11143d085c1a5dd583ed"}
Mar 09 13:24:57 crc kubenswrapper[4764]: I0309 13:24:57.793035 4764 patch_prober.go:28] interesting pod/router-default-5444994796-gnnbl container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 09 13:24:57 crc kubenswrapper[4764]: [-]has-synced failed: reason withheld
Mar 09 13:24:57 crc kubenswrapper[4764]: [+]process-running ok
Mar 09 13:24:57 crc kubenswrapper[4764]: healthz check failed
Mar 09 13:24:57 crc kubenswrapper[4764]: I0309 13:24:57.793103 4764 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-gnnbl" podUID="b9c0d96b-ed96-4925-b890-8743879a8b38" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 09 13:24:58 crc kubenswrapper[4764]: I0309 13:24:58.371073 4764 patch_prober.go:28] interesting pod/machine-config-daemon-xxczl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 09 13:24:58 crc kubenswrapper[4764]: I0309 13:24:58.371350 4764 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 09 13:24:58 crc kubenswrapper[4764]: I0309 13:24:58.530688 4764 patch_prober.go:28] interesting pod/console-f9d7485db-8g9lj container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.15:8443/health\": dial tcp 10.217.0.15:8443: connect: connection refused" start-of-body=
Mar 09 13:24:58 crc kubenswrapper[4764]: I0309 13:24:58.531454 4764 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-8g9lj" podUID="a1b4dc0b-edea-4c0d-8d61-3e3d3133605d" containerName="console" probeResult="failure" output="Get \"https://10.217.0.15:8443/health\": dial tcp 10.217.0.15:8443: connect: connection refused"
Mar 09 13:24:58 crc kubenswrapper[4764]: I0309 13:24:58.790212 4764 patch_prober.go:28] interesting pod/router-default-5444994796-gnnbl container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 09 13:24:58 crc kubenswrapper[4764]: [+]has-synced ok
Mar 09 13:24:58 crc kubenswrapper[4764]: [+]process-running ok
Mar 09 13:24:58 crc kubenswrapper[4764]: healthz check failed
Mar 09 13:24:58 crc kubenswrapper[4764]: I0309 13:24:58.790287 4764 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-gnnbl" podUID="b9c0d96b-ed96-4925-b890-8743879a8b38" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 09 13:24:58 crc kubenswrapper[4764]: I0309 13:24:58.792094 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-7kggv"
Mar 09 13:24:59 crc kubenswrapper[4764]: I0309 13:24:59.217728 4764 patch_prober.go:28] interesting pod/downloads-7954f5f757-x927s container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.29:8080/\": dial tcp 10.217.0.29:8080: connect: connection refused" start-of-body=
Mar 09 13:24:59 crc kubenswrapper[4764]: I0309 13:24:59.217780 4764 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-x927s" podUID="d245b116-e47d-4b15-a40d-0a9fa34cf1df" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.29:8080/\": dial tcp 10.217.0.29:8080: connect: connection refused"
Mar 09 13:24:59 crc kubenswrapper[4764]: I0309 13:24:59.217840 4764 patch_prober.go:28] interesting pod/downloads-7954f5f757-x927s container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.29:8080/\": dial tcp 10.217.0.29:8080: connect: connection refused" start-of-body=
Mar 09 13:24:59 crc kubenswrapper[4764]: I0309 13:24:59.217893 4764 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-x927s" podUID="d245b116-e47d-4b15-a40d-0a9fa34cf1df" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.29:8080/\": dial tcp 10.217.0.29:8080: connect: connection refused"
Mar 09 13:24:59 crc kubenswrapper[4764]: I0309 13:24:59.792145 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-gnnbl"
Mar 09 13:24:59 crc kubenswrapper[4764]: I0309 13:24:59.796732 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-gnnbl"
Mar 09 13:25:01 crc kubenswrapper[4764]: I0309 13:25:01.824632 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-5d896c677f-8g9dz"]
Mar 09 13:25:01 crc kubenswrapper[4764]: I0309 13:25:01.824918 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-5d896c677f-8g9dz" podUID="ef84d4f2-b722-415f-bc23-d472e00474b4" containerName="controller-manager" containerID="cri-o://ac78e9706513103621b9ee866b0a26306b3671bd6884454e15e781cf3cf27583" gracePeriod=30
Mar 09 13:25:01 crc kubenswrapper[4764]: I0309 13:25:01.851704 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-866db9688c-qwkl9"]
Mar 09 13:25:01 crc kubenswrapper[4764]: I0309 13:25:01.851911 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-866db9688c-qwkl9" podUID="35b2abcd-af84-40fe-8b37-90139612d63e" containerName="route-controller-manager" containerID="cri-o://3bce33547ff50496fc00d452193e1617ac80b7c148e9afbaf542b8ac329a6ecf" gracePeriod=30
Mar 09 13:25:03 crc kubenswrapper[4764]: I0309 13:25:03.121640 4764 generic.go:334] "Generic (PLEG): container finished" podID="ef84d4f2-b722-415f-bc23-d472e00474b4" containerID="ac78e9706513103621b9ee866b0a26306b3671bd6884454e15e781cf3cf27583" exitCode=0
Mar 09 13:25:03 crc kubenswrapper[4764]: I0309 13:25:03.121689 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5d896c677f-8g9dz" event={"ID":"ef84d4f2-b722-415f-bc23-d472e00474b4","Type":"ContainerDied","Data":"ac78e9706513103621b9ee866b0a26306b3671bd6884454e15e781cf3cf27583"}
Mar 09 13:25:03 crc kubenswrapper[4764]: I0309 13:25:03.126125 4764 generic.go:334] "Generic (PLEG): container finished" podID="35b2abcd-af84-40fe-8b37-90139612d63e" containerID="3bce33547ff50496fc00d452193e1617ac80b7c148e9afbaf542b8ac329a6ecf" exitCode=0
Mar 09 13:25:03 crc kubenswrapper[4764]: I0309 13:25:03.126195 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-866db9688c-qwkl9" event={"ID":"35b2abcd-af84-40fe-8b37-90139612d63e","Type":"ContainerDied","Data":"3bce33547ff50496fc00d452193e1617ac80b7c148e9afbaf542b8ac329a6ecf"}
Mar 09 13:25:05 crc kubenswrapper[4764]: I0309 13:25:05.606530 4764 patch_prober.go:28] interesting pod/controller-manager-5d896c677f-8g9dz container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.49:8443/healthz\": dial tcp 10.217.0.49:8443: connect: connection refused" start-of-body=
Mar 09 13:25:05 crc kubenswrapper[4764]: I0309 13:25:05.606854 4764 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-5d896c677f-8g9dz" podUID="ef84d4f2-b722-415f-bc23-d472e00474b4" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.49:8443/healthz\": dial tcp 10.217.0.49:8443: connect: connection refused"
Mar 09 13:25:08 crc kubenswrapper[4764]: I0309 13:25:08.467106 4764 patch_prober.go:28] interesting pod/route-controller-manager-866db9688c-qwkl9 container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.53:8443/healthz\": dial tcp 10.217.0.53:8443: connect: connection refused" start-of-body=
Mar 09 13:25:08 crc kubenswrapper[4764]: I0309 13:25:08.467154 4764 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-866db9688c-qwkl9" podUID="35b2abcd-af84-40fe-8b37-90139612d63e" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.53:8443/healthz\": dial tcp 10.217.0.53:8443: connect: connection refused"
Mar 09 13:25:08 crc kubenswrapper[4764]: I0309 13:25:08.535537 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-8g9lj"
Mar 09 13:25:08 crc kubenswrapper[4764]: I0309 13:25:08.550277 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-8g9lj"
Mar 09 13:25:09 crc kubenswrapper[4764]: I0309 13:25:09.217226 4764 patch_prober.go:28] interesting pod/downloads-7954f5f757-x927s container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.29:8080/\": dial tcp 10.217.0.29:8080: connect: connection refused" start-of-body=
Mar 09 13:25:09 crc kubenswrapper[4764]: I0309 13:25:09.217519 4764 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-x927s" podUID="d245b116-e47d-4b15-a40d-0a9fa34cf1df" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.29:8080/\": dial tcp 10.217.0.29:8080: connect: connection refused"
Mar 09 13:25:09 crc kubenswrapper[4764]: I0309 13:25:09.217262 4764 patch_prober.go:28] interesting pod/downloads-7954f5f757-x927s container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.29:8080/\": dial tcp 10.217.0.29:8080: connect: connection refused" start-of-body=
Mar 09 13:25:09 crc kubenswrapper[4764]: I0309 13:25:09.217655 4764 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-x927s" podUID="d245b116-e47d-4b15-a40d-0a9fa34cf1df" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.29:8080/\": dial tcp 10.217.0.29:8080: connect: connection refused"
Mar 09 13:25:09 crc kubenswrapper[4764]: I0309 13:25:09.217728 4764 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console/downloads-7954f5f757-x927s"
Mar 09 13:25:09 crc kubenswrapper[4764]: I0309 13:25:09.218307 4764 patch_prober.go:28] interesting pod/downloads-7954f5f757-x927s container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.29:8080/\": dial tcp 10.217.0.29:8080: connect: connection refused" start-of-body=
Mar 09 13:25:09 crc kubenswrapper[4764]: I0309 13:25:09.218340 4764 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-x927s" podUID="d245b116-e47d-4b15-a40d-0a9fa34cf1df" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.29:8080/\": dial tcp 10.217.0.29:8080: connect: connection refused"
Mar 09 13:25:09 crc kubenswrapper[4764]: I0309 13:25:09.218931 4764 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="download-server" containerStatusID={"Type":"cri-o","ID":"4ed619baf68f65746ed98427321c72cd4adc6131be0c5ab50bca4ff635aa848c"} pod="openshift-console/downloads-7954f5f757-x927s" containerMessage="Container download-server failed liveness probe, will be restarted"
Mar 09 13:25:09 crc kubenswrapper[4764]: I0309 13:25:09.218994 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/downloads-7954f5f757-x927s" podUID="d245b116-e47d-4b15-a40d-0a9fa34cf1df" containerName="download-server" containerID="cri-o://4ed619baf68f65746ed98427321c72cd4adc6131be0c5ab50bca4ff635aa848c" gracePeriod=2
Mar 09 13:25:09 crc kubenswrapper[4764]: I0309 13:25:09.463311 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-w49hn"
Mar 09 13:25:10 crc kubenswrapper[4764]: I0309 13:25:10.178398 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc"
Mar 09 13:25:10 crc kubenswrapper[4764]: I0309 13:25:10.205206 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"27bcfe8c-a29f-4f0b-9f73-3e075a201db7","Type":"ContainerDied","Data":"11da5fd652d318a343ea97d905dc9b8fda776729b232bdd361ea2f64c5a90799"}
Mar 09 13:25:10 crc kubenswrapper[4764]: I0309 13:25:10.205248 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="11da5fd652d318a343ea97d905dc9b8fda776729b232bdd361ea2f64c5a90799"
Mar 09 13:25:10 crc kubenswrapper[4764]: I0309 13:25:10.205309 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc"
Mar 09 13:25:10 crc kubenswrapper[4764]: I0309 13:25:10.214246 4764 generic.go:334] "Generic (PLEG): container finished" podID="d245b116-e47d-4b15-a40d-0a9fa34cf1df" containerID="4ed619baf68f65746ed98427321c72cd4adc6131be0c5ab50bca4ff635aa848c" exitCode=0
Mar 09 13:25:10 crc kubenswrapper[4764]: I0309 13:25:10.214278 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-x927s" event={"ID":"d245b116-e47d-4b15-a40d-0a9fa34cf1df","Type":"ContainerDied","Data":"4ed619baf68f65746ed98427321c72cd4adc6131be0c5ab50bca4ff635aa848c"}
Mar 09 13:25:10 crc kubenswrapper[4764]: I0309 13:25:10.277801 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/27bcfe8c-a29f-4f0b-9f73-3e075a201db7-kube-api-access\") pod \"27bcfe8c-a29f-4f0b-9f73-3e075a201db7\" (UID: \"27bcfe8c-a29f-4f0b-9f73-3e075a201db7\") "
Mar 09 13:25:10 crc kubenswrapper[4764]: I0309 13:25:10.277965 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/27bcfe8c-a29f-4f0b-9f73-3e075a201db7-kubelet-dir\") pod \"27bcfe8c-a29f-4f0b-9f73-3e075a201db7\" (UID: \"27bcfe8c-a29f-4f0b-9f73-3e075a201db7\") "
Mar 09 13:25:10 crc kubenswrapper[4764]: I0309 13:25:10.278074 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/27bcfe8c-a29f-4f0b-9f73-3e075a201db7-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "27bcfe8c-a29f-4f0b-9f73-3e075a201db7" (UID: "27bcfe8c-a29f-4f0b-9f73-3e075a201db7"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 09 13:25:10 crc kubenswrapper[4764]: I0309 13:25:10.278438 4764 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/27bcfe8c-a29f-4f0b-9f73-3e075a201db7-kubelet-dir\") on node \"crc\" DevicePath \"\""
Mar 09 13:25:10 crc kubenswrapper[4764]: I0309 13:25:10.283045 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/27bcfe8c-a29f-4f0b-9f73-3e075a201db7-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "27bcfe8c-a29f-4f0b-9f73-3e075a201db7" (UID: "27bcfe8c-a29f-4f0b-9f73-3e075a201db7"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 13:25:10 crc kubenswrapper[4764]: I0309 13:25:10.380933 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/27bcfe8c-a29f-4f0b-9f73-3e075a201db7-kube-api-access\") on node \"crc\" DevicePath \"\""
Mar 09 13:25:15 crc kubenswrapper[4764]: I0309 13:25:15.646781 4764 ???:1] "http: TLS handshake error from 192.168.126.11:47958: no serving certificate available for the kubelet"
Mar 09 13:25:16 crc kubenswrapper[4764]: I0309 13:25:16.606238 4764 patch_prober.go:28] interesting pod/controller-manager-5d896c677f-8g9dz container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.49:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 09 13:25:16 crc kubenswrapper[4764]: I0309 13:25:16.606621 4764 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-5d896c677f-8g9dz" podUID="ef84d4f2-b722-415f-bc23-d472e00474b4" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.49:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 09 13:25:17 crc kubenswrapper[4764]: I0309 13:25:17.347289 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-866db9688c-qwkl9"
Mar 09 13:25:17 crc kubenswrapper[4764]: I0309 13:25:17.353400 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5d896c677f-8g9dz"
Mar 09 13:25:17 crc kubenswrapper[4764]: I0309 13:25:17.382965 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/35b2abcd-af84-40fe-8b37-90139612d63e-serving-cert\") pod \"35b2abcd-af84-40fe-8b37-90139612d63e\" (UID: \"35b2abcd-af84-40fe-8b37-90139612d63e\") "
Mar 09 13:25:17 crc kubenswrapper[4764]: I0309 13:25:17.383048 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/35b2abcd-af84-40fe-8b37-90139612d63e-client-ca\") pod \"35b2abcd-af84-40fe-8b37-90139612d63e\" (UID: \"35b2abcd-af84-40fe-8b37-90139612d63e\") "
Mar 09 13:25:17 crc kubenswrapper[4764]: I0309 13:25:17.383089 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/35b2abcd-af84-40fe-8b37-90139612d63e-config\") pod \"35b2abcd-af84-40fe-8b37-90139612d63e\" (UID: \"35b2abcd-af84-40fe-8b37-90139612d63e\") "
Mar 09 13:25:17 crc kubenswrapper[4764]: I0309 13:25:17.383147 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9zpjf\" (UniqueName: \"kubernetes.io/projected/35b2abcd-af84-40fe-8b37-90139612d63e-kube-api-access-9zpjf\") pod \"35b2abcd-af84-40fe-8b37-90139612d63e\" (UID: \"35b2abcd-af84-40fe-8b37-90139612d63e\") "
Mar 09 13:25:17 crc kubenswrapper[4764]: I0309 13:25:17.390681 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/35b2abcd-af84-40fe-8b37-90139612d63e-client-ca" (OuterVolumeSpecName: "client-ca") pod "35b2abcd-af84-40fe-8b37-90139612d63e" (UID: "35b2abcd-af84-40fe-8b37-90139612d63e"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 13:25:17 crc kubenswrapper[4764]: I0309 13:25:17.391074 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/35b2abcd-af84-40fe-8b37-90139612d63e-config" (OuterVolumeSpecName: "config") pod "35b2abcd-af84-40fe-8b37-90139612d63e" (UID: "35b2abcd-af84-40fe-8b37-90139612d63e"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 13:25:17 crc kubenswrapper[4764]: I0309 13:25:17.391210 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/35b2abcd-af84-40fe-8b37-90139612d63e-kube-api-access-9zpjf" (OuterVolumeSpecName: "kube-api-access-9zpjf") pod "35b2abcd-af84-40fe-8b37-90139612d63e" (UID: "35b2abcd-af84-40fe-8b37-90139612d63e"). InnerVolumeSpecName "kube-api-access-9zpjf". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 13:25:17 crc kubenswrapper[4764]: I0309 13:25:17.394188 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/35b2abcd-af84-40fe-8b37-90139612d63e-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "35b2abcd-af84-40fe-8b37-90139612d63e" (UID: "35b2abcd-af84-40fe-8b37-90139612d63e"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 13:25:17 crc kubenswrapper[4764]: I0309 13:25:17.396316 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-57b64b694-svn4j"]
Mar 09 13:25:17 crc kubenswrapper[4764]: E0309 13:25:17.396718 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ef84d4f2-b722-415f-bc23-d472e00474b4" containerName="controller-manager"
Mar 09 13:25:17 crc kubenswrapper[4764]: I0309 13:25:17.396739 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef84d4f2-b722-415f-bc23-d472e00474b4" containerName="controller-manager"
Mar 09 13:25:17 crc kubenswrapper[4764]: E0309 13:25:17.396756 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f756dee9-7011-49b3-8a60-b7e08f01972d" containerName="pruner"
Mar 09 13:25:17 crc kubenswrapper[4764]: I0309 13:25:17.396765 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="f756dee9-7011-49b3-8a60-b7e08f01972d" containerName="pruner"
Mar 09 13:25:17 crc kubenswrapper[4764]: E0309 13:25:17.396778 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="39da5087-79bc-4154-b340-22183d9e4417" containerName="collect-profiles"
Mar 09 13:25:17 crc kubenswrapper[4764]: I0309 13:25:17.396786 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="39da5087-79bc-4154-b340-22183d9e4417" containerName="collect-profiles"
Mar 09 13:25:17 crc kubenswrapper[4764]: E0309 13:25:17.396798 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="35b2abcd-af84-40fe-8b37-90139612d63e" containerName="route-controller-manager"
Mar 09 13:25:17 crc kubenswrapper[4764]: I0309 13:25:17.396806 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="35b2abcd-af84-40fe-8b37-90139612d63e" containerName="route-controller-manager"
Mar 09 13:25:17 crc kubenswrapper[4764]: E0309 13:25:17.396814 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="27bcfe8c-a29f-4f0b-9f73-3e075a201db7" containerName="pruner"
Mar 09 13:25:17 crc kubenswrapper[4764]: I0309 13:25:17.396822 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="27bcfe8c-a29f-4f0b-9f73-3e075a201db7" containerName="pruner"
Mar 09 13:25:17 crc kubenswrapper[4764]: I0309 13:25:17.396945 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="35b2abcd-af84-40fe-8b37-90139612d63e" containerName="route-controller-manager"
Mar 09 13:25:17 crc kubenswrapper[4764]: I0309 13:25:17.396964 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="39da5087-79bc-4154-b340-22183d9e4417" containerName="collect-profiles"
Mar 09 13:25:17 crc kubenswrapper[4764]: I0309 13:25:17.396976 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="f756dee9-7011-49b3-8a60-b7e08f01972d" containerName="pruner"
Mar 09 13:25:17 crc kubenswrapper[4764]: I0309 13:25:17.396988 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="ef84d4f2-b722-415f-bc23-d472e00474b4" containerName="controller-manager"
Mar 09 13:25:17 crc kubenswrapper[4764]: I0309 13:25:17.396998 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="27bcfe8c-a29f-4f0b-9f73-3e075a201db7" containerName="pruner"
Mar 09 13:25:17 crc kubenswrapper[4764]: I0309 13:25:17.397481 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-57b64b694-svn4j"
Mar 09 13:25:17 crc kubenswrapper[4764]: I0309 13:25:17.400218 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-57b64b694-svn4j"]
Mar 09 13:25:17 crc kubenswrapper[4764]: I0309 13:25:17.484166 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ef84d4f2-b722-415f-bc23-d472e00474b4-config\") pod \"ef84d4f2-b722-415f-bc23-d472e00474b4\" (UID: \"ef84d4f2-b722-415f-bc23-d472e00474b4\") "
Mar 09 13:25:17 crc kubenswrapper[4764]: I0309 13:25:17.484236 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ef84d4f2-b722-415f-bc23-d472e00474b4-serving-cert\") pod \"ef84d4f2-b722-415f-bc23-d472e00474b4\" (UID: \"ef84d4f2-b722-415f-bc23-d472e00474b4\") "
Mar 09 13:25:17 crc kubenswrapper[4764]: I0309 13:25:17.484265 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jp2v7\" (UniqueName: \"kubernetes.io/projected/ef84d4f2-b722-415f-bc23-d472e00474b4-kube-api-access-jp2v7\") pod \"ef84d4f2-b722-415f-bc23-d472e00474b4\" (UID: \"ef84d4f2-b722-415f-bc23-d472e00474b4\") "
Mar 09 13:25:17 crc kubenswrapper[4764]: I0309 13:25:17.484356 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ef84d4f2-b722-415f-bc23-d472e00474b4-client-ca\") pod \"ef84d4f2-b722-415f-bc23-d472e00474b4\" (UID: \"ef84d4f2-b722-415f-bc23-d472e00474b4\") "
Mar 09 13:25:17 crc kubenswrapper[4764]: I0309 13:25:17.484417 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ef84d4f2-b722-415f-bc23-d472e00474b4-proxy-ca-bundles\") pod \"ef84d4f2-b722-415f-bc23-d472e00474b4\" (UID: \"ef84d4f2-b722-415f-bc23-d472e00474b4\") "
Mar 09 13:25:17 crc kubenswrapper[4764]: I0309 13:25:17.484609 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m6d6p\" (UniqueName: \"kubernetes.io/projected/63fc6d8b-caee-491b-8434-f40958b590d5-kube-api-access-m6d6p\") pod \"route-controller-manager-57b64b694-svn4j\" (UID: \"63fc6d8b-caee-491b-8434-f40958b590d5\") " pod="openshift-route-controller-manager/route-controller-manager-57b64b694-svn4j"
Mar 09 13:25:17 crc kubenswrapper[4764]: I0309 13:25:17.484655 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/63fc6d8b-caee-491b-8434-f40958b590d5-client-ca\") pod \"route-controller-manager-57b64b694-svn4j\" (UID: \"63fc6d8b-caee-491b-8434-f40958b590d5\") " pod="openshift-route-controller-manager/route-controller-manager-57b64b694-svn4j"
Mar 09 13:25:17 crc kubenswrapper[4764]: I0309 13:25:17.484719 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/63fc6d8b-caee-491b-8434-f40958b590d5-config\") pod \"route-controller-manager-57b64b694-svn4j\" (UID: \"63fc6d8b-caee-491b-8434-f40958b590d5\") " pod="openshift-route-controller-manager/route-controller-manager-57b64b694-svn4j"
Mar 09 13:25:17 crc kubenswrapper[4764]: I0309 13:25:17.485131 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/63fc6d8b-caee-491b-8434-f40958b590d5-serving-cert\") pod \"route-controller-manager-57b64b694-svn4j\" (UID: \"63fc6d8b-caee-491b-8434-f40958b590d5\") " pod="openshift-route-controller-manager/route-controller-manager-57b64b694-svn4j"
Mar 09 13:25:17 crc kubenswrapper[4764]: I0309 13:25:17.485340 4764 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/35b2abcd-af84-40fe-8b37-90139612d63e-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 09 13:25:17 crc kubenswrapper[4764]: I0309 13:25:17.485355 4764 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/35b2abcd-af84-40fe-8b37-90139612d63e-client-ca\") on node \"crc\" DevicePath \"\""
Mar 09 13:25:17 crc kubenswrapper[4764]: I0309 13:25:17.485365 4764 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/35b2abcd-af84-40fe-8b37-90139612d63e-config\") on node \"crc\" DevicePath \"\""
Mar 09 13:25:17 crc kubenswrapper[4764]: I0309 13:25:17.485376 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9zpjf\" (UniqueName: \"kubernetes.io/projected/35b2abcd-af84-40fe-8b37-90139612d63e-kube-api-access-9zpjf\") on node \"crc\" DevicePath \"\""
Mar 09 13:25:17 crc kubenswrapper[4764]: I0309 13:25:17.485406 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ef84d4f2-b722-415f-bc23-d472e00474b4-client-ca" (OuterVolumeSpecName: "client-ca") pod "ef84d4f2-b722-415f-bc23-d472e00474b4" (UID: "ef84d4f2-b722-415f-bc23-d472e00474b4"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 13:25:17 crc kubenswrapper[4764]: I0309 13:25:17.485466 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ef84d4f2-b722-415f-bc23-d472e00474b4-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "ef84d4f2-b722-415f-bc23-d472e00474b4" (UID: "ef84d4f2-b722-415f-bc23-d472e00474b4"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 13:25:17 crc kubenswrapper[4764]: I0309 13:25:17.485875 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ef84d4f2-b722-415f-bc23-d472e00474b4-config" (OuterVolumeSpecName: "config") pod "ef84d4f2-b722-415f-bc23-d472e00474b4" (UID: "ef84d4f2-b722-415f-bc23-d472e00474b4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 13:25:17 crc kubenswrapper[4764]: I0309 13:25:17.487504 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ef84d4f2-b722-415f-bc23-d472e00474b4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "ef84d4f2-b722-415f-bc23-d472e00474b4" (UID: "ef84d4f2-b722-415f-bc23-d472e00474b4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 13:25:17 crc kubenswrapper[4764]: I0309 13:25:17.487935 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ef84d4f2-b722-415f-bc23-d472e00474b4-kube-api-access-jp2v7" (OuterVolumeSpecName: "kube-api-access-jp2v7") pod "ef84d4f2-b722-415f-bc23-d472e00474b4" (UID: "ef84d4f2-b722-415f-bc23-d472e00474b4"). InnerVolumeSpecName "kube-api-access-jp2v7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 13:25:17 crc kubenswrapper[4764]: I0309 13:25:17.586153 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/63fc6d8b-caee-491b-8434-f40958b590d5-serving-cert\") pod \"route-controller-manager-57b64b694-svn4j\" (UID: \"63fc6d8b-caee-491b-8434-f40958b590d5\") " pod="openshift-route-controller-manager/route-controller-manager-57b64b694-svn4j"
Mar 09 13:25:17 crc kubenswrapper[4764]: I0309 13:25:17.586214 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m6d6p\" (UniqueName: \"kubernetes.io/projected/63fc6d8b-caee-491b-8434-f40958b590d5-kube-api-access-m6d6p\") pod \"route-controller-manager-57b64b694-svn4j\" (UID: \"63fc6d8b-caee-491b-8434-f40958b590d5\") " pod="openshift-route-controller-manager/route-controller-manager-57b64b694-svn4j"
Mar 09 13:25:17 crc kubenswrapper[4764]: I0309 13:25:17.586237 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/63fc6d8b-caee-491b-8434-f40958b590d5-client-ca\") pod \"route-controller-manager-57b64b694-svn4j\" (UID: \"63fc6d8b-caee-491b-8434-f40958b590d5\") " pod="openshift-route-controller-manager/route-controller-manager-57b64b694-svn4j"
Mar 09 13:25:17 crc kubenswrapper[4764]: I0309 13:25:17.586284 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/63fc6d8b-caee-491b-8434-f40958b590d5-config\") pod \"route-controller-manager-57b64b694-svn4j\" (UID: \"63fc6d8b-caee-491b-8434-f40958b590d5\") " pod="openshift-route-controller-manager/route-controller-manager-57b64b694-svn4j"
Mar 09 13:25:17 crc kubenswrapper[4764]: I0309 13:25:17.586338 4764 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ef84d4f2-b722-415f-bc23-d472e00474b4-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 09 13:25:17 crc kubenswrapper[4764]: I0309 13:25:17.586349 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jp2v7\" (UniqueName: \"kubernetes.io/projected/ef84d4f2-b722-415f-bc23-d472e00474b4-kube-api-access-jp2v7\") on node \"crc\" DevicePath \"\""
Mar 09 13:25:17 crc kubenswrapper[4764]: I0309 13:25:17.586362 4764 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ef84d4f2-b722-415f-bc23-d472e00474b4-client-ca\") on node \"crc\" DevicePath \"\""
Mar 09 13:25:17 crc kubenswrapper[4764]: I0309 13:25:17.586370 4764 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ef84d4f2-b722-415f-bc23-d472e00474b4-proxy-ca-bundles\") on node \"crc\" DevicePath \"\""
Mar 09 13:25:17 crc kubenswrapper[4764]: I0309 13:25:17.586378 4764 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ef84d4f2-b722-415f-bc23-d472e00474b4-config\") on node \"crc\" DevicePath \"\""
Mar 09 13:25:17 crc kubenswrapper[4764]: I0309 13:25:17.587614 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/63fc6d8b-caee-491b-8434-f40958b590d5-config\") pod \"route-controller-manager-57b64b694-svn4j\" (UID: \"63fc6d8b-caee-491b-8434-f40958b590d5\") " pod="openshift-route-controller-manager/route-controller-manager-57b64b694-svn4j"
Mar 09 13:25:17 crc kubenswrapper[4764]: I0309 13:25:17.588494 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/63fc6d8b-caee-491b-8434-f40958b590d5-client-ca\") pod \"route-controller-manager-57b64b694-svn4j\" (UID: \"63fc6d8b-caee-491b-8434-f40958b590d5\") " pod="openshift-route-controller-manager/route-controller-manager-57b64b694-svn4j"
Mar 09 13:25:17 crc kubenswrapper[4764]: I0309 13:25:17.599833 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/63fc6d8b-caee-491b-8434-f40958b590d5-serving-cert\") pod \"route-controller-manager-57b64b694-svn4j\" (UID: \"63fc6d8b-caee-491b-8434-f40958b590d5\") " pod="openshift-route-controller-manager/route-controller-manager-57b64b694-svn4j"
Mar 09 13:25:17 crc kubenswrapper[4764]: I0309 13:25:17.607409 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m6d6p\" (UniqueName: \"kubernetes.io/projected/63fc6d8b-caee-491b-8434-f40958b590d5-kube-api-access-m6d6p\") pod \"route-controller-manager-57b64b694-svn4j\" (UID: \"63fc6d8b-caee-491b-8434-f40958b590d5\") " pod="openshift-route-controller-manager/route-controller-manager-57b64b694-svn4j"
Mar 09 13:25:17 crc kubenswrapper[4764]: I0309 13:25:17.740199 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-57b64b694-svn4j"
Mar 09 13:25:18 crc kubenswrapper[4764]: I0309 13:25:18.263741 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5d896c677f-8g9dz" event={"ID":"ef84d4f2-b722-415f-bc23-d472e00474b4","Type":"ContainerDied","Data":"bd1be1047066fde143bce3e434912476e75a2c14646016c14c7e52ccd0c2869e"}
Mar 09 13:25:18 crc kubenswrapper[4764]: I0309 13:25:18.263762 4764 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openshift-controller-manager/controller-manager-5d896c677f-8g9dz" Mar 09 13:25:18 crc kubenswrapper[4764]: I0309 13:25:18.264105 4764 scope.go:117] "RemoveContainer" containerID="ac78e9706513103621b9ee866b0a26306b3671bd6884454e15e781cf3cf27583" Mar 09 13:25:18 crc kubenswrapper[4764]: I0309 13:25:18.268752 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-866db9688c-qwkl9" event={"ID":"35b2abcd-af84-40fe-8b37-90139612d63e","Type":"ContainerDied","Data":"99e3df704f16abbd89cfa68b2dbe7a3e325602c3ff0a20c7684b7b091fc44203"} Mar 09 13:25:18 crc kubenswrapper[4764]: I0309 13:25:18.268884 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-866db9688c-qwkl9" Mar 09 13:25:18 crc kubenswrapper[4764]: I0309 13:25:18.283903 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-5d896c677f-8g9dz"] Mar 09 13:25:18 crc kubenswrapper[4764]: I0309 13:25:18.289252 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-5d896c677f-8g9dz"] Mar 09 13:25:18 crc kubenswrapper[4764]: I0309 13:25:18.296691 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-866db9688c-qwkl9"] Mar 09 13:25:18 crc kubenswrapper[4764]: I0309 13:25:18.299436 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-866db9688c-qwkl9"] Mar 09 13:25:18 crc kubenswrapper[4764]: I0309 13:25:18.696879 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-b2pz8" Mar 09 13:25:19 crc kubenswrapper[4764]: I0309 13:25:19.217010 4764 patch_prober.go:28] interesting pod/downloads-7954f5f757-x927s 
container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.29:8080/\": dial tcp 10.217.0.29:8080: connect: connection refused" start-of-body= Mar 09 13:25:19 crc kubenswrapper[4764]: I0309 13:25:19.217418 4764 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-x927s" podUID="d245b116-e47d-4b15-a40d-0a9fa34cf1df" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.29:8080/\": dial tcp 10.217.0.29:8080: connect: connection refused" Mar 09 13:25:19 crc kubenswrapper[4764]: I0309 13:25:19.572691 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="35b2abcd-af84-40fe-8b37-90139612d63e" path="/var/lib/kubelet/pods/35b2abcd-af84-40fe-8b37-90139612d63e/volumes" Mar 09 13:25:19 crc kubenswrapper[4764]: I0309 13:25:19.573414 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ef84d4f2-b722-415f-bc23-d472e00474b4" path="/var/lib/kubelet/pods/ef84d4f2-b722-415f-bc23-d472e00474b4/volumes" Mar 09 13:25:19 crc kubenswrapper[4764]: E0309 13:25:19.848112 4764 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Mar 09 13:25:19 crc kubenswrapper[4764]: E0309 13:25:19.848333 4764 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-ff292,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-nrc8s_openshift-marketplace(be22cbfb-d3e7-43c1-be38-f6fcadeb2c97): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 09 13:25:19 crc kubenswrapper[4764]: E0309 13:25:19.849535 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-nrc8s" podUID="be22cbfb-d3e7-43c1-be38-f6fcadeb2c97" Mar 09 13:25:21 crc 
kubenswrapper[4764]: I0309 13:25:21.825922 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-7f6c57f99f-67zw4"] Mar 09 13:25:21 crc kubenswrapper[4764]: I0309 13:25:21.826728 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7f6c57f99f-67zw4" Mar 09 13:25:21 crc kubenswrapper[4764]: I0309 13:25:21.829782 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 09 13:25:21 crc kubenswrapper[4764]: I0309 13:25:21.830021 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 09 13:25:21 crc kubenswrapper[4764]: I0309 13:25:21.830137 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 09 13:25:21 crc kubenswrapper[4764]: I0309 13:25:21.830248 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 09 13:25:21 crc kubenswrapper[4764]: I0309 13:25:21.830341 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 09 13:25:21 crc kubenswrapper[4764]: I0309 13:25:21.830485 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 09 13:25:21 crc kubenswrapper[4764]: I0309 13:25:21.837507 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 09 13:25:21 crc kubenswrapper[4764]: I0309 13:25:21.841029 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-7f6c57f99f-67zw4"] Mar 09 13:25:21 crc kubenswrapper[4764]: I0309 13:25:21.897312 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-route-controller-manager/route-controller-manager-57b64b694-svn4j"] Mar 09 13:25:21 crc kubenswrapper[4764]: I0309 13:25:21.950667 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e4ef7ddb-4b54-48f7-a879-d07cb3222339-config\") pod \"controller-manager-7f6c57f99f-67zw4\" (UID: \"e4ef7ddb-4b54-48f7-a879-d07cb3222339\") " pod="openshift-controller-manager/controller-manager-7f6c57f99f-67zw4" Mar 09 13:25:21 crc kubenswrapper[4764]: I0309 13:25:21.950783 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r96c8\" (UniqueName: \"kubernetes.io/projected/e4ef7ddb-4b54-48f7-a879-d07cb3222339-kube-api-access-r96c8\") pod \"controller-manager-7f6c57f99f-67zw4\" (UID: \"e4ef7ddb-4b54-48f7-a879-d07cb3222339\") " pod="openshift-controller-manager/controller-manager-7f6c57f99f-67zw4" Mar 09 13:25:21 crc kubenswrapper[4764]: I0309 13:25:21.950813 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e4ef7ddb-4b54-48f7-a879-d07cb3222339-client-ca\") pod \"controller-manager-7f6c57f99f-67zw4\" (UID: \"e4ef7ddb-4b54-48f7-a879-d07cb3222339\") " pod="openshift-controller-manager/controller-manager-7f6c57f99f-67zw4" Mar 09 13:25:21 crc kubenswrapper[4764]: I0309 13:25:21.950872 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e4ef7ddb-4b54-48f7-a879-d07cb3222339-serving-cert\") pod \"controller-manager-7f6c57f99f-67zw4\" (UID: \"e4ef7ddb-4b54-48f7-a879-d07cb3222339\") " pod="openshift-controller-manager/controller-manager-7f6c57f99f-67zw4" Mar 09 13:25:21 crc kubenswrapper[4764]: I0309 13:25:21.950932 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e4ef7ddb-4b54-48f7-a879-d07cb3222339-proxy-ca-bundles\") pod \"controller-manager-7f6c57f99f-67zw4\" (UID: \"e4ef7ddb-4b54-48f7-a879-d07cb3222339\") " pod="openshift-controller-manager/controller-manager-7f6c57f99f-67zw4" Mar 09 13:25:22 crc kubenswrapper[4764]: I0309 13:25:22.052384 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e4ef7ddb-4b54-48f7-a879-d07cb3222339-serving-cert\") pod \"controller-manager-7f6c57f99f-67zw4\" (UID: \"e4ef7ddb-4b54-48f7-a879-d07cb3222339\") " pod="openshift-controller-manager/controller-manager-7f6c57f99f-67zw4" Mar 09 13:25:22 crc kubenswrapper[4764]: I0309 13:25:22.052497 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e4ef7ddb-4b54-48f7-a879-d07cb3222339-proxy-ca-bundles\") pod \"controller-manager-7f6c57f99f-67zw4\" (UID: \"e4ef7ddb-4b54-48f7-a879-d07cb3222339\") " pod="openshift-controller-manager/controller-manager-7f6c57f99f-67zw4" Mar 09 13:25:22 crc kubenswrapper[4764]: I0309 13:25:22.052550 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e4ef7ddb-4b54-48f7-a879-d07cb3222339-config\") pod \"controller-manager-7f6c57f99f-67zw4\" (UID: \"e4ef7ddb-4b54-48f7-a879-d07cb3222339\") " pod="openshift-controller-manager/controller-manager-7f6c57f99f-67zw4" Mar 09 13:25:22 crc kubenswrapper[4764]: I0309 13:25:22.052587 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r96c8\" (UniqueName: \"kubernetes.io/projected/e4ef7ddb-4b54-48f7-a879-d07cb3222339-kube-api-access-r96c8\") pod \"controller-manager-7f6c57f99f-67zw4\" (UID: \"e4ef7ddb-4b54-48f7-a879-d07cb3222339\") " pod="openshift-controller-manager/controller-manager-7f6c57f99f-67zw4" Mar 09 13:25:22 crc 
kubenswrapper[4764]: I0309 13:25:22.052614 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e4ef7ddb-4b54-48f7-a879-d07cb3222339-client-ca\") pod \"controller-manager-7f6c57f99f-67zw4\" (UID: \"e4ef7ddb-4b54-48f7-a879-d07cb3222339\") " pod="openshift-controller-manager/controller-manager-7f6c57f99f-67zw4" Mar 09 13:25:22 crc kubenswrapper[4764]: I0309 13:25:22.054412 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e4ef7ddb-4b54-48f7-a879-d07cb3222339-config\") pod \"controller-manager-7f6c57f99f-67zw4\" (UID: \"e4ef7ddb-4b54-48f7-a879-d07cb3222339\") " pod="openshift-controller-manager/controller-manager-7f6c57f99f-67zw4" Mar 09 13:25:22 crc kubenswrapper[4764]: I0309 13:25:22.059619 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e4ef7ddb-4b54-48f7-a879-d07cb3222339-client-ca\") pod \"controller-manager-7f6c57f99f-67zw4\" (UID: \"e4ef7ddb-4b54-48f7-a879-d07cb3222339\") " pod="openshift-controller-manager/controller-manager-7f6c57f99f-67zw4" Mar 09 13:25:22 crc kubenswrapper[4764]: I0309 13:25:22.062500 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e4ef7ddb-4b54-48f7-a879-d07cb3222339-serving-cert\") pod \"controller-manager-7f6c57f99f-67zw4\" (UID: \"e4ef7ddb-4b54-48f7-a879-d07cb3222339\") " pod="openshift-controller-manager/controller-manager-7f6c57f99f-67zw4" Mar 09 13:25:22 crc kubenswrapper[4764]: I0309 13:25:22.063563 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e4ef7ddb-4b54-48f7-a879-d07cb3222339-proxy-ca-bundles\") pod \"controller-manager-7f6c57f99f-67zw4\" (UID: \"e4ef7ddb-4b54-48f7-a879-d07cb3222339\") " 
pod="openshift-controller-manager/controller-manager-7f6c57f99f-67zw4" Mar 09 13:25:22 crc kubenswrapper[4764]: I0309 13:25:22.077576 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r96c8\" (UniqueName: \"kubernetes.io/projected/e4ef7ddb-4b54-48f7-a879-d07cb3222339-kube-api-access-r96c8\") pod \"controller-manager-7f6c57f99f-67zw4\" (UID: \"e4ef7ddb-4b54-48f7-a879-d07cb3222339\") " pod="openshift-controller-manager/controller-manager-7f6c57f99f-67zw4" Mar 09 13:25:22 crc kubenswrapper[4764]: I0309 13:25:22.145563 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7f6c57f99f-67zw4" Mar 09 13:25:24 crc kubenswrapper[4764]: E0309 13:25:24.784361 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-nrc8s" podUID="be22cbfb-d3e7-43c1-be38-f6fcadeb2c97" Mar 09 13:25:24 crc kubenswrapper[4764]: E0309 13:25:24.877188 4764 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Mar 09 13:25:24 crc kubenswrapper[4764]: E0309 13:25:24.877357 4764 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-v76nj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-tll5t_openshift-marketplace(41a3ce40-a2f8-4bc3-8fbb-eccfb1ec4ce6): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 09 13:25:24 crc kubenswrapper[4764]: E0309 13:25:24.879417 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-tll5t" podUID="41a3ce40-a2f8-4bc3-8fbb-eccfb1ec4ce6" Mar 09 13:25:25 crc 
kubenswrapper[4764]: I0309 13:25:25.085997 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Mar 09 13:25:25 crc kubenswrapper[4764]: I0309 13:25:25.087563 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 09 13:25:25 crc kubenswrapper[4764]: I0309 13:25:25.094317 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Mar 09 13:25:25 crc kubenswrapper[4764]: I0309 13:25:25.094597 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Mar 09 13:25:25 crc kubenswrapper[4764]: I0309 13:25:25.097181 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Mar 09 13:25:25 crc kubenswrapper[4764]: I0309 13:25:25.200061 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/048cea49-847b-4232-9ece-3656fccc1909-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"048cea49-847b-4232-9ece-3656fccc1909\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 09 13:25:25 crc kubenswrapper[4764]: I0309 13:25:25.200137 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/048cea49-847b-4232-9ece-3656fccc1909-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"048cea49-847b-4232-9ece-3656fccc1909\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 09 13:25:25 crc kubenswrapper[4764]: I0309 13:25:25.301716 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/048cea49-847b-4232-9ece-3656fccc1909-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: 
\"048cea49-847b-4232-9ece-3656fccc1909\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 09 13:25:25 crc kubenswrapper[4764]: I0309 13:25:25.301805 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/048cea49-847b-4232-9ece-3656fccc1909-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"048cea49-847b-4232-9ece-3656fccc1909\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 09 13:25:25 crc kubenswrapper[4764]: I0309 13:25:25.301984 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/048cea49-847b-4232-9ece-3656fccc1909-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"048cea49-847b-4232-9ece-3656fccc1909\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 09 13:25:25 crc kubenswrapper[4764]: I0309 13:25:25.331497 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/048cea49-847b-4232-9ece-3656fccc1909-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"048cea49-847b-4232-9ece-3656fccc1909\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 09 13:25:25 crc kubenswrapper[4764]: I0309 13:25:25.426045 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 09 13:25:27 crc kubenswrapper[4764]: E0309 13:25:27.405913 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-tll5t" podUID="41a3ce40-a2f8-4bc3-8fbb-eccfb1ec4ce6" Mar 09 13:25:27 crc kubenswrapper[4764]: E0309 13:25:27.491321 4764 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Mar 09 13:25:27 crc kubenswrapper[4764]: E0309 13:25:27.491687 4764 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-5fsvt,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-8d627_openshift-marketplace(88ba6041-7f8f-48f0-840c-8ea2a9bdc87b): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 09 13:25:27 crc kubenswrapper[4764]: E0309 13:25:27.493007 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-8d627" podUID="88ba6041-7f8f-48f0-840c-8ea2a9bdc87b" Mar 09 13:25:27 crc 
kubenswrapper[4764]: E0309 13:25:27.504128 4764 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Mar 09 13:25:27 crc kubenswrapper[4764]: E0309 13:25:27.504299 4764 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-stkxn,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
redhat-operators-d9z59_openshift-marketplace(c6dd4a8c-608d-44b4-a42c-1cc8b7f9fec2): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 09 13:25:27 crc kubenswrapper[4764]: E0309 13:25:27.505839 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-d9z59" podUID="c6dd4a8c-608d-44b4-a42c-1cc8b7f9fec2" Mar 09 13:25:27 crc kubenswrapper[4764]: E0309 13:25:27.506634 4764 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Mar 09 13:25:27 crc kubenswrapper[4764]: E0309 13:25:27.506804 4764 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-dc5m2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-jn8f5_openshift-marketplace(a76121be-d090-4f2a-9e57-1a160a4bb4f2): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 09 13:25:27 crc kubenswrapper[4764]: E0309 13:25:27.508020 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-jn8f5" podUID="a76121be-d090-4f2a-9e57-1a160a4bb4f2" Mar 09 13:25:28 crc 
kubenswrapper[4764]: I0309 13:25:28.369953 4764 patch_prober.go:28] interesting pod/machine-config-daemon-xxczl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 09 13:25:28 crc kubenswrapper[4764]: I0309 13:25:28.370022 4764 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 09 13:25:28 crc kubenswrapper[4764]: E0309 13:25:28.886050 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-8d627" podUID="88ba6041-7f8f-48f0-840c-8ea2a9bdc87b" Mar 09 13:25:28 crc kubenswrapper[4764]: E0309 13:25:28.886335 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-jn8f5" podUID="a76121be-d090-4f2a-9e57-1a160a4bb4f2" Mar 09 13:25:28 crc kubenswrapper[4764]: E0309 13:25:28.886454 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-d9z59" podUID="c6dd4a8c-608d-44b4-a42c-1cc8b7f9fec2" Mar 09 13:25:28 crc kubenswrapper[4764]: E0309 13:25:28.956164 4764 log.go:32] "PullImage from 
image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Mar 09 13:25:28 crc kubenswrapper[4764]: E0309 13:25:28.956387 4764 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-gx2jl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-g7k9k_openshift-marketplace(7a967c79-e11e-4c58-b42e-652d1406ac88): ErrImagePull: rpc error: 
code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 09 13:25:28 crc kubenswrapper[4764]: E0309 13:25:28.957539 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-g7k9k" podUID="7a967c79-e11e-4c58-b42e-652d1406ac88" Mar 09 13:25:28 crc kubenswrapper[4764]: E0309 13:25:28.973006 4764 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Mar 09 13:25:28 crc kubenswrapper[4764]: E0309 13:25:28.973149 4764 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-j4z7x,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-qhs57_openshift-marketplace(691ffa6f-3ee6-47fa-bcef-9fdd74ac86df): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 09 13:25:28 crc kubenswrapper[4764]: E0309 13:25:28.974404 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-qhs57" podUID="691ffa6f-3ee6-47fa-bcef-9fdd74ac86df" Mar 09 13:25:28 crc 
kubenswrapper[4764]: E0309 13:25:28.985711 4764 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Mar 09 13:25:28 crc kubenswrapper[4764]: E0309 13:25:28.985913 4764 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-f4s9m,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
community-operators-mbm5b_openshift-marketplace(20acdcb5-ea78-435e-b472-e102d5553c75): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 09 13:25:28 crc kubenswrapper[4764]: E0309 13:25:28.987074 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-mbm5b" podUID="20acdcb5-ea78-435e-b472-e102d5553c75" Mar 09 13:25:29 crc kubenswrapper[4764]: I0309 13:25:29.218760 4764 patch_prober.go:28] interesting pod/downloads-7954f5f757-x927s container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.29:8080/\": dial tcp 10.217.0.29:8080: connect: connection refused" start-of-body= Mar 09 13:25:29 crc kubenswrapper[4764]: I0309 13:25:29.218834 4764 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-x927s" podUID="d245b116-e47d-4b15-a40d-0a9fa34cf1df" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.29:8080/\": dial tcp 10.217.0.29:8080: connect: connection refused" Mar 09 13:25:29 crc kubenswrapper[4764]: E0309 13:25:29.844085 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-g7k9k" podUID="7a967c79-e11e-4c58-b42e-652d1406ac88" Mar 09 13:25:29 crc kubenswrapper[4764]: E0309 13:25:29.844140 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image 
\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-mbm5b" podUID="20acdcb5-ea78-435e-b472-e102d5553c75" Mar 09 13:25:29 crc kubenswrapper[4764]: E0309 13:25:29.844225 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-qhs57" podUID="691ffa6f-3ee6-47fa-bcef-9fdd74ac86df" Mar 09 13:25:29 crc kubenswrapper[4764]: I0309 13:25:29.873000 4764 scope.go:117] "RemoveContainer" containerID="3bce33547ff50496fc00d452193e1617ac80b7c148e9afbaf542b8ac329a6ecf" Mar 09 13:25:29 crc kubenswrapper[4764]: E0309 13:25:29.876256 4764 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/openshift4/ose-cli:latest" Mar 09 13:25:29 crc kubenswrapper[4764]: E0309 13:25:29.876366 4764 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 09 13:25:29 crc kubenswrapper[4764]: container &Container{Name:oc,Image:registry.redhat.io/openshift4/ose-cli:latest,Command:[/bin/bash -c oc get csr -o go-template='{{range .items}}{{if not .status}}{{.metadata.name}}{{"\n"}}{{end}}{{end}}' | xargs --no-run-if-empty oc adm certificate approve Mar 09 13:25:29 crc kubenswrapper[4764]: 
],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-g47h8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod auto-csr-approver-29551044-p748f_openshift-infra(0a005f65-920a-4cdd-b4da-a270953113aa): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled Mar 09 13:25:29 crc kubenswrapper[4764]: > logger="UnhandledError" Mar 09 13:25:29 crc kubenswrapper[4764]: E0309 13:25:29.877473 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oc\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-infra/auto-csr-approver-29551044-p748f" podUID="0a005f65-920a-4cdd-b4da-a270953113aa" Mar 09 13:25:30 crc kubenswrapper[4764]: I0309 13:25:30.106127 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Mar 09 13:25:30 crc kubenswrapper[4764]: I0309 13:25:30.113309 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 09 13:25:30 crc kubenswrapper[4764]: I0309 13:25:30.132930 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Mar 09 13:25:30 crc kubenswrapper[4764]: I0309 13:25:30.182579 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6079e5ed-2acc-42f5-a62e-ea2a98b18abd-kubelet-dir\") pod \"installer-9-crc\" (UID: \"6079e5ed-2acc-42f5-a62e-ea2a98b18abd\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 09 13:25:30 crc kubenswrapper[4764]: I0309 13:25:30.182668 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6079e5ed-2acc-42f5-a62e-ea2a98b18abd-kube-api-access\") pod \"installer-9-crc\" (UID: \"6079e5ed-2acc-42f5-a62e-ea2a98b18abd\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 09 13:25:30 crc kubenswrapper[4764]: I0309 13:25:30.182725 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/6079e5ed-2acc-42f5-a62e-ea2a98b18abd-var-lock\") pod \"installer-9-crc\" (UID: \"6079e5ed-2acc-42f5-a62e-ea2a98b18abd\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 09 13:25:30 crc kubenswrapper[4764]: I0309 13:25:30.284515 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6079e5ed-2acc-42f5-a62e-ea2a98b18abd-kubelet-dir\") pod \"installer-9-crc\" (UID: \"6079e5ed-2acc-42f5-a62e-ea2a98b18abd\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 09 13:25:30 crc kubenswrapper[4764]: I0309 13:25:30.284575 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/6079e5ed-2acc-42f5-a62e-ea2a98b18abd-kube-api-access\") pod \"installer-9-crc\" (UID: \"6079e5ed-2acc-42f5-a62e-ea2a98b18abd\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 09 13:25:30 crc kubenswrapper[4764]: I0309 13:25:30.284620 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/6079e5ed-2acc-42f5-a62e-ea2a98b18abd-var-lock\") pod \"installer-9-crc\" (UID: \"6079e5ed-2acc-42f5-a62e-ea2a98b18abd\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 09 13:25:30 crc kubenswrapper[4764]: I0309 13:25:30.284698 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6079e5ed-2acc-42f5-a62e-ea2a98b18abd-kubelet-dir\") pod \"installer-9-crc\" (UID: \"6079e5ed-2acc-42f5-a62e-ea2a98b18abd\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 09 13:25:30 crc kubenswrapper[4764]: I0309 13:25:30.284710 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/6079e5ed-2acc-42f5-a62e-ea2a98b18abd-var-lock\") pod \"installer-9-crc\" (UID: \"6079e5ed-2acc-42f5-a62e-ea2a98b18abd\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 09 13:25:30 crc kubenswrapper[4764]: I0309 13:25:30.305615 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6079e5ed-2acc-42f5-a62e-ea2a98b18abd-kube-api-access\") pod \"installer-9-crc\" (UID: \"6079e5ed-2acc-42f5-a62e-ea2a98b18abd\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 09 13:25:30 crc kubenswrapper[4764]: I0309 13:25:30.336936 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-x927s" event={"ID":"d245b116-e47d-4b15-a40d-0a9fa34cf1df","Type":"ContainerStarted","Data":"599976ba410eaba88a00e5c0f730b5c1ba416c87b24810e39748b0b6bec77a15"} Mar 09 13:25:30 
crc kubenswrapper[4764]: I0309 13:25:30.337178 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-x927s" Mar 09 13:25:30 crc kubenswrapper[4764]: I0309 13:25:30.339432 4764 patch_prober.go:28] interesting pod/downloads-7954f5f757-x927s container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.29:8080/\": dial tcp 10.217.0.29:8080: connect: connection refused" start-of-body= Mar 09 13:25:30 crc kubenswrapper[4764]: I0309 13:25:30.339486 4764 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-x927s" podUID="d245b116-e47d-4b15-a40d-0a9fa34cf1df" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.29:8080/\": dial tcp 10.217.0.29:8080: connect: connection refused" Mar 09 13:25:30 crc kubenswrapper[4764]: E0309 13:25:30.342780 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oc\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/openshift4/ose-cli:latest\\\"\"" pod="openshift-infra/auto-csr-approver-29551044-p748f" podUID="0a005f65-920a-4cdd-b4da-a270953113aa" Mar 09 13:25:30 crc kubenswrapper[4764]: I0309 13:25:30.394127 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-7f6c57f99f-67zw4"] Mar 09 13:25:30 crc kubenswrapper[4764]: W0309 13:25:30.408868 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode4ef7ddb_4b54_48f7_a879_d07cb3222339.slice/crio-6dfe527522407abed5825087faeed3aa6722bd0b6c51839e5410a49d638c18ec WatchSource:0}: Error finding container 6dfe527522407abed5825087faeed3aa6722bd0b6c51839e5410a49d638c18ec: Status 404 returned error can't find the container with id 6dfe527522407abed5825087faeed3aa6722bd0b6c51839e5410a49d638c18ec Mar 09 13:25:30 crc 
kubenswrapper[4764]: I0309 13:25:30.442993 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 09 13:25:30 crc kubenswrapper[4764]: I0309 13:25:30.456193 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Mar 09 13:25:30 crc kubenswrapper[4764]: I0309 13:25:30.460394 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-57b64b694-svn4j"] Mar 09 13:25:30 crc kubenswrapper[4764]: W0309 13:25:30.470572 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod048cea49_847b_4232_9ece_3656fccc1909.slice/crio-22a8a78a0997f7beba52c53801155d1c792168145e66312da717d6dc41ad0310 WatchSource:0}: Error finding container 22a8a78a0997f7beba52c53801155d1c792168145e66312da717d6dc41ad0310: Status 404 returned error can't find the container with id 22a8a78a0997f7beba52c53801155d1c792168145e66312da717d6dc41ad0310 Mar 09 13:25:30 crc kubenswrapper[4764]: W0309 13:25:30.475579 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod63fc6d8b_caee_491b_8434_f40958b590d5.slice/crio-770029a71edac963fedf423b75b89193eb45638286f1adaa5427198f1f82796f WatchSource:0}: Error finding container 770029a71edac963fedf423b75b89193eb45638286f1adaa5427198f1f82796f: Status 404 returned error can't find the container with id 770029a71edac963fedf423b75b89193eb45638286f1adaa5427198f1f82796f Mar 09 13:25:30 crc kubenswrapper[4764]: I0309 13:25:30.929558 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Mar 09 13:25:30 crc kubenswrapper[4764]: W0309 13:25:30.947676 4764 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-pod6079e5ed_2acc_42f5_a62e_ea2a98b18abd.slice/crio-2b27f2baa5d5f2b2cb7d6fda09a6905faac786ec6165e8062e44b7064fdcbfbc WatchSource:0}: Error finding container 2b27f2baa5d5f2b2cb7d6fda09a6905faac786ec6165e8062e44b7064fdcbfbc: Status 404 returned error can't find the container with id 2b27f2baa5d5f2b2cb7d6fda09a6905faac786ec6165e8062e44b7064fdcbfbc Mar 09 13:25:31 crc kubenswrapper[4764]: I0309 13:25:31.348162 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7f6c57f99f-67zw4" event={"ID":"e4ef7ddb-4b54-48f7-a879-d07cb3222339","Type":"ContainerStarted","Data":"c13a9bf60f9efa3f676a6a2e63469b91bb787e084e0911f7a2d49d61a829e9cf"} Mar 09 13:25:31 crc kubenswrapper[4764]: I0309 13:25:31.348700 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7f6c57f99f-67zw4" event={"ID":"e4ef7ddb-4b54-48f7-a879-d07cb3222339","Type":"ContainerStarted","Data":"6dfe527522407abed5825087faeed3aa6722bd0b6c51839e5410a49d638c18ec"} Mar 09 13:25:31 crc kubenswrapper[4764]: I0309 13:25:31.348731 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-7f6c57f99f-67zw4" Mar 09 13:25:31 crc kubenswrapper[4764]: I0309 13:25:31.350173 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"048cea49-847b-4232-9ece-3656fccc1909","Type":"ContainerStarted","Data":"73c91c8f718fb65e017e620c7103bee44d1903f405abd54c0038fd8acbf6dc05"} Mar 09 13:25:31 crc kubenswrapper[4764]: I0309 13:25:31.350244 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"048cea49-847b-4232-9ece-3656fccc1909","Type":"ContainerStarted","Data":"22a8a78a0997f7beba52c53801155d1c792168145e66312da717d6dc41ad0310"} Mar 09 13:25:31 crc kubenswrapper[4764]: I0309 13:25:31.352381 4764 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-57b64b694-svn4j" event={"ID":"63fc6d8b-caee-491b-8434-f40958b590d5","Type":"ContainerStarted","Data":"0f2b957008f233aacbb248e307f389252c1b53d4330c32f97fcc03549e0a1da4"} Mar 09 13:25:31 crc kubenswrapper[4764]: I0309 13:25:31.352415 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-57b64b694-svn4j" event={"ID":"63fc6d8b-caee-491b-8434-f40958b590d5","Type":"ContainerStarted","Data":"770029a71edac963fedf423b75b89193eb45638286f1adaa5427198f1f82796f"} Mar 09 13:25:31 crc kubenswrapper[4764]: I0309 13:25:31.352524 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-57b64b694-svn4j" podUID="63fc6d8b-caee-491b-8434-f40958b590d5" containerName="route-controller-manager" containerID="cri-o://0f2b957008f233aacbb248e307f389252c1b53d4330c32f97fcc03549e0a1da4" gracePeriod=30 Mar 09 13:25:31 crc kubenswrapper[4764]: I0309 13:25:31.352638 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-57b64b694-svn4j" Mar 09 13:25:31 crc kubenswrapper[4764]: I0309 13:25:31.354031 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"6079e5ed-2acc-42f5-a62e-ea2a98b18abd","Type":"ContainerStarted","Data":"2b27f2baa5d5f2b2cb7d6fda09a6905faac786ec6165e8062e44b7064fdcbfbc"} Mar 09 13:25:31 crc kubenswrapper[4764]: I0309 13:25:31.354763 4764 patch_prober.go:28] interesting pod/downloads-7954f5f757-x927s container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.29:8080/\": dial tcp 10.217.0.29:8080: connect: connection refused" start-of-body= Mar 09 13:25:31 crc kubenswrapper[4764]: I0309 13:25:31.354833 4764 prober.go:107] "Probe failed" 
probeType="Readiness" pod="openshift-console/downloads-7954f5f757-x927s" podUID="d245b116-e47d-4b15-a40d-0a9fa34cf1df" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.29:8080/\": dial tcp 10.217.0.29:8080: connect: connection refused" Mar 09 13:25:31 crc kubenswrapper[4764]: I0309 13:25:31.360858 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-7f6c57f99f-67zw4" Mar 09 13:25:31 crc kubenswrapper[4764]: I0309 13:25:31.362522 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-57b64b694-svn4j" Mar 09 13:25:31 crc kubenswrapper[4764]: I0309 13:25:31.375034 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-7f6c57f99f-67zw4" podStartSLOduration=10.375005872 podStartE2EDuration="10.375005872s" podCreationTimestamp="2026-03-09 13:25:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:25:31.370806238 +0000 UTC m=+286.620978156" watchObservedRunningTime="2026-03-09 13:25:31.375005872 +0000 UTC m=+286.625177780" Mar 09 13:25:31 crc kubenswrapper[4764]: I0309 13:25:31.409493 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-9-crc" podStartSLOduration=6.409465675 podStartE2EDuration="6.409465675s" podCreationTimestamp="2026-03-09 13:25:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:25:31.4089227 +0000 UTC m=+286.659094618" watchObservedRunningTime="2026-03-09 13:25:31.409465675 +0000 UTC m=+286.659637603" Mar 09 13:25:31 crc kubenswrapper[4764]: I0309 13:25:31.457515 4764 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openshift-route-controller-manager/route-controller-manager-57b64b694-svn4j" podStartSLOduration=30.457480167 podStartE2EDuration="30.457480167s" podCreationTimestamp="2026-03-09 13:25:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:25:31.453106118 +0000 UTC m=+286.703278046" watchObservedRunningTime="2026-03-09 13:25:31.457480167 +0000 UTC m=+286.707652085" Mar 09 13:25:32 crc kubenswrapper[4764]: I0309 13:25:32.254845 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-57b64b694-svn4j" Mar 09 13:25:32 crc kubenswrapper[4764]: I0309 13:25:32.288981 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7745768cd9-mrqxj"] Mar 09 13:25:32 crc kubenswrapper[4764]: E0309 13:25:32.289291 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="63fc6d8b-caee-491b-8434-f40958b590d5" containerName="route-controller-manager" Mar 09 13:25:32 crc kubenswrapper[4764]: I0309 13:25:32.289307 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="63fc6d8b-caee-491b-8434-f40958b590d5" containerName="route-controller-manager" Mar 09 13:25:32 crc kubenswrapper[4764]: I0309 13:25:32.289443 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="63fc6d8b-caee-491b-8434-f40958b590d5" containerName="route-controller-manager" Mar 09 13:25:32 crc kubenswrapper[4764]: I0309 13:25:32.289962 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7745768cd9-mrqxj" Mar 09 13:25:32 crc kubenswrapper[4764]: I0309 13:25:32.299765 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7745768cd9-mrqxj"] Mar 09 13:25:32 crc kubenswrapper[4764]: I0309 13:25:32.318178 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/63fc6d8b-caee-491b-8434-f40958b590d5-serving-cert\") pod \"63fc6d8b-caee-491b-8434-f40958b590d5\" (UID: \"63fc6d8b-caee-491b-8434-f40958b590d5\") " Mar 09 13:25:32 crc kubenswrapper[4764]: I0309 13:25:32.318248 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m6d6p\" (UniqueName: \"kubernetes.io/projected/63fc6d8b-caee-491b-8434-f40958b590d5-kube-api-access-m6d6p\") pod \"63fc6d8b-caee-491b-8434-f40958b590d5\" (UID: \"63fc6d8b-caee-491b-8434-f40958b590d5\") " Mar 09 13:25:32 crc kubenswrapper[4764]: I0309 13:25:32.318270 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/63fc6d8b-caee-491b-8434-f40958b590d5-client-ca\") pod \"63fc6d8b-caee-491b-8434-f40958b590d5\" (UID: \"63fc6d8b-caee-491b-8434-f40958b590d5\") " Mar 09 13:25:32 crc kubenswrapper[4764]: I0309 13:25:32.318314 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/63fc6d8b-caee-491b-8434-f40958b590d5-config\") pod \"63fc6d8b-caee-491b-8434-f40958b590d5\" (UID: \"63fc6d8b-caee-491b-8434-f40958b590d5\") " Mar 09 13:25:32 crc kubenswrapper[4764]: I0309 13:25:32.318529 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/556e4318-98c6-4910-830a-2edcafa8c5a3-serving-cert\") pod 
\"route-controller-manager-7745768cd9-mrqxj\" (UID: \"556e4318-98c6-4910-830a-2edcafa8c5a3\") " pod="openshift-route-controller-manager/route-controller-manager-7745768cd9-mrqxj" Mar 09 13:25:32 crc kubenswrapper[4764]: I0309 13:25:32.318591 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/556e4318-98c6-4910-830a-2edcafa8c5a3-client-ca\") pod \"route-controller-manager-7745768cd9-mrqxj\" (UID: \"556e4318-98c6-4910-830a-2edcafa8c5a3\") " pod="openshift-route-controller-manager/route-controller-manager-7745768cd9-mrqxj" Mar 09 13:25:32 crc kubenswrapper[4764]: I0309 13:25:32.318695 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rqfnq\" (UniqueName: \"kubernetes.io/projected/556e4318-98c6-4910-830a-2edcafa8c5a3-kube-api-access-rqfnq\") pod \"route-controller-manager-7745768cd9-mrqxj\" (UID: \"556e4318-98c6-4910-830a-2edcafa8c5a3\") " pod="openshift-route-controller-manager/route-controller-manager-7745768cd9-mrqxj" Mar 09 13:25:32 crc kubenswrapper[4764]: I0309 13:25:32.318723 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/556e4318-98c6-4910-830a-2edcafa8c5a3-config\") pod \"route-controller-manager-7745768cd9-mrqxj\" (UID: \"556e4318-98c6-4910-830a-2edcafa8c5a3\") " pod="openshift-route-controller-manager/route-controller-manager-7745768cd9-mrqxj" Mar 09 13:25:32 crc kubenswrapper[4764]: I0309 13:25:32.319505 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/63fc6d8b-caee-491b-8434-f40958b590d5-client-ca" (OuterVolumeSpecName: "client-ca") pod "63fc6d8b-caee-491b-8434-f40958b590d5" (UID: "63fc6d8b-caee-491b-8434-f40958b590d5"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:25:32 crc kubenswrapper[4764]: I0309 13:25:32.319579 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/63fc6d8b-caee-491b-8434-f40958b590d5-config" (OuterVolumeSpecName: "config") pod "63fc6d8b-caee-491b-8434-f40958b590d5" (UID: "63fc6d8b-caee-491b-8434-f40958b590d5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:25:32 crc kubenswrapper[4764]: I0309 13:25:32.325373 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/63fc6d8b-caee-491b-8434-f40958b590d5-kube-api-access-m6d6p" (OuterVolumeSpecName: "kube-api-access-m6d6p") pod "63fc6d8b-caee-491b-8434-f40958b590d5" (UID: "63fc6d8b-caee-491b-8434-f40958b590d5"). InnerVolumeSpecName "kube-api-access-m6d6p". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:25:32 crc kubenswrapper[4764]: I0309 13:25:32.326384 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/63fc6d8b-caee-491b-8434-f40958b590d5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "63fc6d8b-caee-491b-8434-f40958b590d5" (UID: "63fc6d8b-caee-491b-8434-f40958b590d5"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:25:32 crc kubenswrapper[4764]: I0309 13:25:32.361410 4764 generic.go:334] "Generic (PLEG): container finished" podID="048cea49-847b-4232-9ece-3656fccc1909" containerID="73c91c8f718fb65e017e620c7103bee44d1903f405abd54c0038fd8acbf6dc05" exitCode=0 Mar 09 13:25:32 crc kubenswrapper[4764]: I0309 13:25:32.361489 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"048cea49-847b-4232-9ece-3656fccc1909","Type":"ContainerDied","Data":"73c91c8f718fb65e017e620c7103bee44d1903f405abd54c0038fd8acbf6dc05"} Mar 09 13:25:32 crc kubenswrapper[4764]: I0309 13:25:32.363227 4764 generic.go:334] "Generic (PLEG): container finished" podID="63fc6d8b-caee-491b-8434-f40958b590d5" containerID="0f2b957008f233aacbb248e307f389252c1b53d4330c32f97fcc03549e0a1da4" exitCode=0 Mar 09 13:25:32 crc kubenswrapper[4764]: I0309 13:25:32.363266 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-57b64b694-svn4j" event={"ID":"63fc6d8b-caee-491b-8434-f40958b590d5","Type":"ContainerDied","Data":"0f2b957008f233aacbb248e307f389252c1b53d4330c32f97fcc03549e0a1da4"} Mar 09 13:25:32 crc kubenswrapper[4764]: I0309 13:25:32.363292 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-57b64b694-svn4j" Mar 09 13:25:32 crc kubenswrapper[4764]: I0309 13:25:32.363324 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-57b64b694-svn4j" event={"ID":"63fc6d8b-caee-491b-8434-f40958b590d5","Type":"ContainerDied","Data":"770029a71edac963fedf423b75b89193eb45638286f1adaa5427198f1f82796f"} Mar 09 13:25:32 crc kubenswrapper[4764]: I0309 13:25:32.363341 4764 scope.go:117] "RemoveContainer" containerID="0f2b957008f233aacbb248e307f389252c1b53d4330c32f97fcc03549e0a1da4" Mar 09 13:25:32 crc kubenswrapper[4764]: I0309 13:25:32.364869 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"6079e5ed-2acc-42f5-a62e-ea2a98b18abd","Type":"ContainerStarted","Data":"a9112ccd1b21485b0fe1f8f0b1cfb2709637cb4e00bc4c414110531d66291dd4"} Mar 09 13:25:32 crc kubenswrapper[4764]: I0309 13:25:32.400146 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-9-crc" podStartSLOduration=2.400124528 podStartE2EDuration="2.400124528s" podCreationTimestamp="2026-03-09 13:25:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:25:32.396835028 +0000 UTC m=+287.647006956" watchObservedRunningTime="2026-03-09 13:25:32.400124528 +0000 UTC m=+287.650296446" Mar 09 13:25:32 crc kubenswrapper[4764]: I0309 13:25:32.404551 4764 scope.go:117] "RemoveContainer" containerID="0f2b957008f233aacbb248e307f389252c1b53d4330c32f97fcc03549e0a1da4" Mar 09 13:25:32 crc kubenswrapper[4764]: E0309 13:25:32.406235 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0f2b957008f233aacbb248e307f389252c1b53d4330c32f97fcc03549e0a1da4\": container with ID starting with 
0f2b957008f233aacbb248e307f389252c1b53d4330c32f97fcc03549e0a1da4 not found: ID does not exist" containerID="0f2b957008f233aacbb248e307f389252c1b53d4330c32f97fcc03549e0a1da4" Mar 09 13:25:32 crc kubenswrapper[4764]: I0309 13:25:32.406273 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0f2b957008f233aacbb248e307f389252c1b53d4330c32f97fcc03549e0a1da4"} err="failed to get container status \"0f2b957008f233aacbb248e307f389252c1b53d4330c32f97fcc03549e0a1da4\": rpc error: code = NotFound desc = could not find container \"0f2b957008f233aacbb248e307f389252c1b53d4330c32f97fcc03549e0a1da4\": container with ID starting with 0f2b957008f233aacbb248e307f389252c1b53d4330c32f97fcc03549e0a1da4 not found: ID does not exist" Mar 09 13:25:32 crc kubenswrapper[4764]: I0309 13:25:32.416116 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-57b64b694-svn4j"] Mar 09 13:25:32 crc kubenswrapper[4764]: I0309 13:25:32.420379 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/556e4318-98c6-4910-830a-2edcafa8c5a3-serving-cert\") pod \"route-controller-manager-7745768cd9-mrqxj\" (UID: \"556e4318-98c6-4910-830a-2edcafa8c5a3\") " pod="openshift-route-controller-manager/route-controller-manager-7745768cd9-mrqxj" Mar 09 13:25:32 crc kubenswrapper[4764]: I0309 13:25:32.420489 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/556e4318-98c6-4910-830a-2edcafa8c5a3-client-ca\") pod \"route-controller-manager-7745768cd9-mrqxj\" (UID: \"556e4318-98c6-4910-830a-2edcafa8c5a3\") " pod="openshift-route-controller-manager/route-controller-manager-7745768cd9-mrqxj" Mar 09 13:25:32 crc kubenswrapper[4764]: I0309 13:25:32.420566 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-rqfnq\" (UniqueName: \"kubernetes.io/projected/556e4318-98c6-4910-830a-2edcafa8c5a3-kube-api-access-rqfnq\") pod \"route-controller-manager-7745768cd9-mrqxj\" (UID: \"556e4318-98c6-4910-830a-2edcafa8c5a3\") " pod="openshift-route-controller-manager/route-controller-manager-7745768cd9-mrqxj" Mar 09 13:25:32 crc kubenswrapper[4764]: I0309 13:25:32.420594 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/556e4318-98c6-4910-830a-2edcafa8c5a3-config\") pod \"route-controller-manager-7745768cd9-mrqxj\" (UID: \"556e4318-98c6-4910-830a-2edcafa8c5a3\") " pod="openshift-route-controller-manager/route-controller-manager-7745768cd9-mrqxj" Mar 09 13:25:32 crc kubenswrapper[4764]: I0309 13:25:32.420671 4764 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/63fc6d8b-caee-491b-8434-f40958b590d5-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 09 13:25:32 crc kubenswrapper[4764]: I0309 13:25:32.420683 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m6d6p\" (UniqueName: \"kubernetes.io/projected/63fc6d8b-caee-491b-8434-f40958b590d5-kube-api-access-m6d6p\") on node \"crc\" DevicePath \"\"" Mar 09 13:25:32 crc kubenswrapper[4764]: I0309 13:25:32.420693 4764 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/63fc6d8b-caee-491b-8434-f40958b590d5-client-ca\") on node \"crc\" DevicePath \"\"" Mar 09 13:25:32 crc kubenswrapper[4764]: I0309 13:25:32.420702 4764 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/63fc6d8b-caee-491b-8434-f40958b590d5-config\") on node \"crc\" DevicePath \"\"" Mar 09 13:25:32 crc kubenswrapper[4764]: I0309 13:25:32.421590 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/556e4318-98c6-4910-830a-2edcafa8c5a3-client-ca\") pod \"route-controller-manager-7745768cd9-mrqxj\" (UID: \"556e4318-98c6-4910-830a-2edcafa8c5a3\") " pod="openshift-route-controller-manager/route-controller-manager-7745768cd9-mrqxj" Mar 09 13:25:32 crc kubenswrapper[4764]: I0309 13:25:32.421746 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/556e4318-98c6-4910-830a-2edcafa8c5a3-config\") pod \"route-controller-manager-7745768cd9-mrqxj\" (UID: \"556e4318-98c6-4910-830a-2edcafa8c5a3\") " pod="openshift-route-controller-manager/route-controller-manager-7745768cd9-mrqxj" Mar 09 13:25:32 crc kubenswrapper[4764]: I0309 13:25:32.424059 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/556e4318-98c6-4910-830a-2edcafa8c5a3-serving-cert\") pod \"route-controller-manager-7745768cd9-mrqxj\" (UID: \"556e4318-98c6-4910-830a-2edcafa8c5a3\") " pod="openshift-route-controller-manager/route-controller-manager-7745768cd9-mrqxj" Mar 09 13:25:32 crc kubenswrapper[4764]: I0309 13:25:32.431005 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-57b64b694-svn4j"] Mar 09 13:25:32 crc kubenswrapper[4764]: I0309 13:25:32.440853 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rqfnq\" (UniqueName: \"kubernetes.io/projected/556e4318-98c6-4910-830a-2edcafa8c5a3-kube-api-access-rqfnq\") pod \"route-controller-manager-7745768cd9-mrqxj\" (UID: \"556e4318-98c6-4910-830a-2edcafa8c5a3\") " pod="openshift-route-controller-manager/route-controller-manager-7745768cd9-mrqxj" Mar 09 13:25:32 crc kubenswrapper[4764]: I0309 13:25:32.621421 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7745768cd9-mrqxj" Mar 09 13:25:33 crc kubenswrapper[4764]: I0309 13:25:33.071369 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7745768cd9-mrqxj"] Mar 09 13:25:33 crc kubenswrapper[4764]: W0309 13:25:33.078143 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod556e4318_98c6_4910_830a_2edcafa8c5a3.slice/crio-335a472e98db78b509578c161edffb85e2cda67cc112f2084014fb62974f8645 WatchSource:0}: Error finding container 335a472e98db78b509578c161edffb85e2cda67cc112f2084014fb62974f8645: Status 404 returned error can't find the container with id 335a472e98db78b509578c161edffb85e2cda67cc112f2084014fb62974f8645 Mar 09 13:25:33 crc kubenswrapper[4764]: I0309 13:25:33.373694 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7745768cd9-mrqxj" event={"ID":"556e4318-98c6-4910-830a-2edcafa8c5a3","Type":"ContainerStarted","Data":"02fabc10f8ca5b240e9914fe91984e0b3ed4f53f2c9e25d78b0f654f639a110c"} Mar 09 13:25:33 crc kubenswrapper[4764]: I0309 13:25:33.373751 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7745768cd9-mrqxj" event={"ID":"556e4318-98c6-4910-830a-2edcafa8c5a3","Type":"ContainerStarted","Data":"335a472e98db78b509578c161edffb85e2cda67cc112f2084014fb62974f8645"} Mar 09 13:25:33 crc kubenswrapper[4764]: I0309 13:25:33.566666 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="63fc6d8b-caee-491b-8434-f40958b590d5" path="/var/lib/kubelet/pods/63fc6d8b-caee-491b-8434-f40958b590d5/volumes" Mar 09 13:25:33 crc kubenswrapper[4764]: I0309 13:25:33.654467 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 09 13:25:33 crc kubenswrapper[4764]: I0309 13:25:33.668670 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/048cea49-847b-4232-9ece-3656fccc1909-kubelet-dir\") pod \"048cea49-847b-4232-9ece-3656fccc1909\" (UID: \"048cea49-847b-4232-9ece-3656fccc1909\") " Mar 09 13:25:33 crc kubenswrapper[4764]: I0309 13:25:33.668721 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/048cea49-847b-4232-9ece-3656fccc1909-kube-api-access\") pod \"048cea49-847b-4232-9ece-3656fccc1909\" (UID: \"048cea49-847b-4232-9ece-3656fccc1909\") " Mar 09 13:25:33 crc kubenswrapper[4764]: I0309 13:25:33.670049 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/048cea49-847b-4232-9ece-3656fccc1909-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "048cea49-847b-4232-9ece-3656fccc1909" (UID: "048cea49-847b-4232-9ece-3656fccc1909"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 09 13:25:33 crc kubenswrapper[4764]: I0309 13:25:33.678100 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/048cea49-847b-4232-9ece-3656fccc1909-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "048cea49-847b-4232-9ece-3656fccc1909" (UID: "048cea49-847b-4232-9ece-3656fccc1909"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:25:33 crc kubenswrapper[4764]: I0309 13:25:33.770674 4764 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/048cea49-847b-4232-9ece-3656fccc1909-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 09 13:25:33 crc kubenswrapper[4764]: I0309 13:25:33.770707 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/048cea49-847b-4232-9ece-3656fccc1909-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 09 13:25:34 crc kubenswrapper[4764]: I0309 13:25:34.381956 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"048cea49-847b-4232-9ece-3656fccc1909","Type":"ContainerDied","Data":"22a8a78a0997f7beba52c53801155d1c792168145e66312da717d6dc41ad0310"} Mar 09 13:25:34 crc kubenswrapper[4764]: I0309 13:25:34.381999 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 09 13:25:34 crc kubenswrapper[4764]: I0309 13:25:34.382001 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="22a8a78a0997f7beba52c53801155d1c792168145e66312da717d6dc41ad0310" Mar 09 13:25:34 crc kubenswrapper[4764]: I0309 13:25:34.382739 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-7745768cd9-mrqxj" Mar 09 13:25:34 crc kubenswrapper[4764]: I0309 13:25:34.388872 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-7745768cd9-mrqxj" Mar 09 13:25:34 crc kubenswrapper[4764]: I0309 13:25:34.401712 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-7745768cd9-mrqxj" podStartSLOduration=13.401694441 podStartE2EDuration="13.401694441s" podCreationTimestamp="2026-03-09 13:25:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:25:34.399162122 +0000 UTC m=+289.649334040" watchObservedRunningTime="2026-03-09 13:25:34.401694441 +0000 UTC m=+289.651866369" Mar 09 13:25:39 crc kubenswrapper[4764]: I0309 13:25:39.216519 4764 patch_prober.go:28] interesting pod/downloads-7954f5f757-x927s container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.29:8080/\": dial tcp 10.217.0.29:8080: connect: connection refused" start-of-body= Mar 09 13:25:39 crc kubenswrapper[4764]: I0309 13:25:39.216530 4764 patch_prober.go:28] interesting pod/downloads-7954f5f757-x927s container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.29:8080/\": dial tcp 10.217.0.29:8080: connect: connection refused" start-of-body= 
Mar 09 13:25:39 crc kubenswrapper[4764]: I0309 13:25:39.217079 4764 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-x927s" podUID="d245b116-e47d-4b15-a40d-0a9fa34cf1df" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.29:8080/\": dial tcp 10.217.0.29:8080: connect: connection refused" Mar 09 13:25:39 crc kubenswrapper[4764]: I0309 13:25:39.217101 4764 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-x927s" podUID="d245b116-e47d-4b15-a40d-0a9fa34cf1df" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.29:8080/\": dial tcp 10.217.0.29:8080: connect: connection refused" Mar 09 13:25:39 crc kubenswrapper[4764]: I0309 13:25:39.427110 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nrc8s" event={"ID":"be22cbfb-d3e7-43c1-be38-f6fcadeb2c97","Type":"ContainerStarted","Data":"bf1dacbdf950a01b7bd5a3fe093c4331659602b77ce43e68786707bbc2b89ea8"} Mar 09 13:25:40 crc kubenswrapper[4764]: I0309 13:25:40.435396 4764 generic.go:334] "Generic (PLEG): container finished" podID="be22cbfb-d3e7-43c1-be38-f6fcadeb2c97" containerID="bf1dacbdf950a01b7bd5a3fe093c4331659602b77ce43e68786707bbc2b89ea8" exitCode=0 Mar 09 13:25:40 crc kubenswrapper[4764]: I0309 13:25:40.435437 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nrc8s" event={"ID":"be22cbfb-d3e7-43c1-be38-f6fcadeb2c97","Type":"ContainerDied","Data":"bf1dacbdf950a01b7bd5a3fe093c4331659602b77ce43e68786707bbc2b89ea8"} Mar 09 13:25:41 crc kubenswrapper[4764]: I0309 13:25:41.444373 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jn8f5" event={"ID":"a76121be-d090-4f2a-9e57-1a160a4bb4f2","Type":"ContainerStarted","Data":"71eda28b996e54f946f3b59d46c82f8b6fc0575bacbdbf501c3dc23dbb439d43"} Mar 09 13:25:41 crc 
kubenswrapper[4764]: I0309 13:25:41.448899 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nrc8s" event={"ID":"be22cbfb-d3e7-43c1-be38-f6fcadeb2c97","Type":"ContainerStarted","Data":"0dca8e186618d6e08ce56c027491c8ad89ea2784547d9fbdcf530de2e5a43711"} Mar 09 13:25:41 crc kubenswrapper[4764]: I0309 13:25:41.480273 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-nrc8s" podStartSLOduration=3.543738448 podStartE2EDuration="57.480255972s" podCreationTimestamp="2026-03-09 13:24:44 +0000 UTC" firstStartedPulling="2026-03-09 13:24:47.00582228 +0000 UTC m=+242.255994188" lastFinishedPulling="2026-03-09 13:25:40.942339804 +0000 UTC m=+296.192511712" observedRunningTime="2026-03-09 13:25:41.477054834 +0000 UTC m=+296.727226762" watchObservedRunningTime="2026-03-09 13:25:41.480255972 +0000 UTC m=+296.730427880" Mar 09 13:25:41 crc kubenswrapper[4764]: I0309 13:25:41.810599 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-7f6c57f99f-67zw4"] Mar 09 13:25:41 crc kubenswrapper[4764]: I0309 13:25:41.810828 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-7f6c57f99f-67zw4" podUID="e4ef7ddb-4b54-48f7-a879-d07cb3222339" containerName="controller-manager" containerID="cri-o://c13a9bf60f9efa3f676a6a2e63469b91bb787e084e0911f7a2d49d61a829e9cf" gracePeriod=30 Mar 09 13:25:41 crc kubenswrapper[4764]: I0309 13:25:41.824164 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7745768cd9-mrqxj"] Mar 09 13:25:41 crc kubenswrapper[4764]: I0309 13:25:41.824654 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-7745768cd9-mrqxj" podUID="556e4318-98c6-4910-830a-2edcafa8c5a3" 
containerName="route-controller-manager" containerID="cri-o://02fabc10f8ca5b240e9914fe91984e0b3ed4f53f2c9e25d78b0f654f639a110c" gracePeriod=30 Mar 09 13:25:42 crc kubenswrapper[4764]: I0309 13:25:42.476272 4764 generic.go:334] "Generic (PLEG): container finished" podID="556e4318-98c6-4910-830a-2edcafa8c5a3" containerID="02fabc10f8ca5b240e9914fe91984e0b3ed4f53f2c9e25d78b0f654f639a110c" exitCode=0 Mar 09 13:25:42 crc kubenswrapper[4764]: I0309 13:25:42.476375 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7745768cd9-mrqxj" event={"ID":"556e4318-98c6-4910-830a-2edcafa8c5a3","Type":"ContainerDied","Data":"02fabc10f8ca5b240e9914fe91984e0b3ed4f53f2c9e25d78b0f654f639a110c"} Mar 09 13:25:42 crc kubenswrapper[4764]: I0309 13:25:42.480510 4764 generic.go:334] "Generic (PLEG): container finished" podID="e4ef7ddb-4b54-48f7-a879-d07cb3222339" containerID="c13a9bf60f9efa3f676a6a2e63469b91bb787e084e0911f7a2d49d61a829e9cf" exitCode=0 Mar 09 13:25:42 crc kubenswrapper[4764]: I0309 13:25:42.480588 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7f6c57f99f-67zw4" event={"ID":"e4ef7ddb-4b54-48f7-a879-d07cb3222339","Type":"ContainerDied","Data":"c13a9bf60f9efa3f676a6a2e63469b91bb787e084e0911f7a2d49d61a829e9cf"} Mar 09 13:25:42 crc kubenswrapper[4764]: I0309 13:25:42.482903 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tll5t" event={"ID":"41a3ce40-a2f8-4bc3-8fbb-eccfb1ec4ce6","Type":"ContainerStarted","Data":"0ae328f899ade57662b7a57d61c5864e374b6610f4676d30b76a8c7048f7853c"} Mar 09 13:25:42 crc kubenswrapper[4764]: I0309 13:25:42.486956 4764 generic.go:334] "Generic (PLEG): container finished" podID="a76121be-d090-4f2a-9e57-1a160a4bb4f2" containerID="71eda28b996e54f946f3b59d46c82f8b6fc0575bacbdbf501c3dc23dbb439d43" exitCode=0 Mar 09 13:25:42 crc kubenswrapper[4764]: I0309 13:25:42.487012 4764 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jn8f5" event={"ID":"a76121be-d090-4f2a-9e57-1a160a4bb4f2","Type":"ContainerDied","Data":"71eda28b996e54f946f3b59d46c82f8b6fc0575bacbdbf501c3dc23dbb439d43"} Mar 09 13:25:42 crc kubenswrapper[4764]: I0309 13:25:42.492582 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8d627" event={"ID":"88ba6041-7f8f-48f0-840c-8ea2a9bdc87b","Type":"ContainerStarted","Data":"8cec696202563cb93ec1099b2921beba11271e2ac2498229b87b108935e01fc2"} Mar 09 13:25:42 crc kubenswrapper[4764]: I0309 13:25:42.757632 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7745768cd9-mrqxj" Mar 09 13:25:42 crc kubenswrapper[4764]: I0309 13:25:42.762626 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7f6c57f99f-67zw4" Mar 09 13:25:42 crc kubenswrapper[4764]: I0309 13:25:42.938292 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/556e4318-98c6-4910-830a-2edcafa8c5a3-config\") pod \"556e4318-98c6-4910-830a-2edcafa8c5a3\" (UID: \"556e4318-98c6-4910-830a-2edcafa8c5a3\") " Mar 09 13:25:42 crc kubenswrapper[4764]: I0309 13:25:42.938404 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rqfnq\" (UniqueName: \"kubernetes.io/projected/556e4318-98c6-4910-830a-2edcafa8c5a3-kube-api-access-rqfnq\") pod \"556e4318-98c6-4910-830a-2edcafa8c5a3\" (UID: \"556e4318-98c6-4910-830a-2edcafa8c5a3\") " Mar 09 13:25:42 crc kubenswrapper[4764]: I0309 13:25:42.938442 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/556e4318-98c6-4910-830a-2edcafa8c5a3-client-ca\") pod 
\"556e4318-98c6-4910-830a-2edcafa8c5a3\" (UID: \"556e4318-98c6-4910-830a-2edcafa8c5a3\") " Mar 09 13:25:42 crc kubenswrapper[4764]: I0309 13:25:42.938467 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/556e4318-98c6-4910-830a-2edcafa8c5a3-serving-cert\") pod \"556e4318-98c6-4910-830a-2edcafa8c5a3\" (UID: \"556e4318-98c6-4910-830a-2edcafa8c5a3\") " Mar 09 13:25:42 crc kubenswrapper[4764]: I0309 13:25:42.938500 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r96c8\" (UniqueName: \"kubernetes.io/projected/e4ef7ddb-4b54-48f7-a879-d07cb3222339-kube-api-access-r96c8\") pod \"e4ef7ddb-4b54-48f7-a879-d07cb3222339\" (UID: \"e4ef7ddb-4b54-48f7-a879-d07cb3222339\") " Mar 09 13:25:42 crc kubenswrapper[4764]: I0309 13:25:42.938552 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e4ef7ddb-4b54-48f7-a879-d07cb3222339-client-ca\") pod \"e4ef7ddb-4b54-48f7-a879-d07cb3222339\" (UID: \"e4ef7ddb-4b54-48f7-a879-d07cb3222339\") " Mar 09 13:25:42 crc kubenswrapper[4764]: I0309 13:25:42.938578 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e4ef7ddb-4b54-48f7-a879-d07cb3222339-proxy-ca-bundles\") pod \"e4ef7ddb-4b54-48f7-a879-d07cb3222339\" (UID: \"e4ef7ddb-4b54-48f7-a879-d07cb3222339\") " Mar 09 13:25:42 crc kubenswrapper[4764]: I0309 13:25:42.938636 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e4ef7ddb-4b54-48f7-a879-d07cb3222339-serving-cert\") pod \"e4ef7ddb-4b54-48f7-a879-d07cb3222339\" (UID: \"e4ef7ddb-4b54-48f7-a879-d07cb3222339\") " Mar 09 13:25:42 crc kubenswrapper[4764]: I0309 13:25:42.938695 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"config\" (UniqueName: \"kubernetes.io/configmap/e4ef7ddb-4b54-48f7-a879-d07cb3222339-config\") pod \"e4ef7ddb-4b54-48f7-a879-d07cb3222339\" (UID: \"e4ef7ddb-4b54-48f7-a879-d07cb3222339\") " Mar 09 13:25:42 crc kubenswrapper[4764]: I0309 13:25:42.939286 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/556e4318-98c6-4910-830a-2edcafa8c5a3-config" (OuterVolumeSpecName: "config") pod "556e4318-98c6-4910-830a-2edcafa8c5a3" (UID: "556e4318-98c6-4910-830a-2edcafa8c5a3"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:25:42 crc kubenswrapper[4764]: I0309 13:25:42.939539 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e4ef7ddb-4b54-48f7-a879-d07cb3222339-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "e4ef7ddb-4b54-48f7-a879-d07cb3222339" (UID: "e4ef7ddb-4b54-48f7-a879-d07cb3222339"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:25:42 crc kubenswrapper[4764]: I0309 13:25:42.939557 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e4ef7ddb-4b54-48f7-a879-d07cb3222339-client-ca" (OuterVolumeSpecName: "client-ca") pod "e4ef7ddb-4b54-48f7-a879-d07cb3222339" (UID: "e4ef7ddb-4b54-48f7-a879-d07cb3222339"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:25:42 crc kubenswrapper[4764]: I0309 13:25:42.939716 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e4ef7ddb-4b54-48f7-a879-d07cb3222339-config" (OuterVolumeSpecName: "config") pod "e4ef7ddb-4b54-48f7-a879-d07cb3222339" (UID: "e4ef7ddb-4b54-48f7-a879-d07cb3222339"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 13:25:42 crc kubenswrapper[4764]: I0309 13:25:42.940089 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/556e4318-98c6-4910-830a-2edcafa8c5a3-client-ca" (OuterVolumeSpecName: "client-ca") pod "556e4318-98c6-4910-830a-2edcafa8c5a3" (UID: "556e4318-98c6-4910-830a-2edcafa8c5a3"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 13:25:42 crc kubenswrapper[4764]: I0309 13:25:42.950722 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/556e4318-98c6-4910-830a-2edcafa8c5a3-kube-api-access-rqfnq" (OuterVolumeSpecName: "kube-api-access-rqfnq") pod "556e4318-98c6-4910-830a-2edcafa8c5a3" (UID: "556e4318-98c6-4910-830a-2edcafa8c5a3"). InnerVolumeSpecName "kube-api-access-rqfnq". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 13:25:42 crc kubenswrapper[4764]: I0309 13:25:42.950764 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/556e4318-98c6-4910-830a-2edcafa8c5a3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "556e4318-98c6-4910-830a-2edcafa8c5a3" (UID: "556e4318-98c6-4910-830a-2edcafa8c5a3"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 13:25:42 crc kubenswrapper[4764]: I0309 13:25:42.950779 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e4ef7ddb-4b54-48f7-a879-d07cb3222339-kube-api-access-r96c8" (OuterVolumeSpecName: "kube-api-access-r96c8") pod "e4ef7ddb-4b54-48f7-a879-d07cb3222339" (UID: "e4ef7ddb-4b54-48f7-a879-d07cb3222339"). InnerVolumeSpecName "kube-api-access-r96c8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 13:25:42 crc kubenswrapper[4764]: I0309 13:25:42.962860 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e4ef7ddb-4b54-48f7-a879-d07cb3222339-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e4ef7ddb-4b54-48f7-a879-d07cb3222339" (UID: "e4ef7ddb-4b54-48f7-a879-d07cb3222339"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 13:25:43 crc kubenswrapper[4764]: I0309 13:25:43.040465 4764 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/556e4318-98c6-4910-830a-2edcafa8c5a3-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 09 13:25:43 crc kubenswrapper[4764]: I0309 13:25:43.040516 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r96c8\" (UniqueName: \"kubernetes.io/projected/e4ef7ddb-4b54-48f7-a879-d07cb3222339-kube-api-access-r96c8\") on node \"crc\" DevicePath \"\""
Mar 09 13:25:43 crc kubenswrapper[4764]: I0309 13:25:43.040531 4764 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e4ef7ddb-4b54-48f7-a879-d07cb3222339-client-ca\") on node \"crc\" DevicePath \"\""
Mar 09 13:25:43 crc kubenswrapper[4764]: I0309 13:25:43.040543 4764 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e4ef7ddb-4b54-48f7-a879-d07cb3222339-proxy-ca-bundles\") on node \"crc\" DevicePath \"\""
Mar 09 13:25:43 crc kubenswrapper[4764]: I0309 13:25:43.040556 4764 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e4ef7ddb-4b54-48f7-a879-d07cb3222339-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 09 13:25:43 crc kubenswrapper[4764]: I0309 13:25:43.040567 4764 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e4ef7ddb-4b54-48f7-a879-d07cb3222339-config\") on node \"crc\" DevicePath \"\""
Mar 09 13:25:43 crc kubenswrapper[4764]: I0309 13:25:43.040579 4764 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/556e4318-98c6-4910-830a-2edcafa8c5a3-config\") on node \"crc\" DevicePath \"\""
Mar 09 13:25:43 crc kubenswrapper[4764]: I0309 13:25:43.040591 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rqfnq\" (UniqueName: \"kubernetes.io/projected/556e4318-98c6-4910-830a-2edcafa8c5a3-kube-api-access-rqfnq\") on node \"crc\" DevicePath \"\""
Mar 09 13:25:43 crc kubenswrapper[4764]: I0309 13:25:43.040602 4764 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/556e4318-98c6-4910-830a-2edcafa8c5a3-client-ca\") on node \"crc\" DevicePath \"\""
Mar 09 13:25:43 crc kubenswrapper[4764]: I0309 13:25:43.110999 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-fb48676f5-nvfnz"]
Mar 09 13:25:43 crc kubenswrapper[4764]: E0309 13:25:43.111315 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="048cea49-847b-4232-9ece-3656fccc1909" containerName="pruner"
Mar 09 13:25:43 crc kubenswrapper[4764]: I0309 13:25:43.111330 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="048cea49-847b-4232-9ece-3656fccc1909" containerName="pruner"
Mar 09 13:25:43 crc kubenswrapper[4764]: E0309 13:25:43.111359 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e4ef7ddb-4b54-48f7-a879-d07cb3222339" containerName="controller-manager"
Mar 09 13:25:43 crc kubenswrapper[4764]: I0309 13:25:43.111367 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="e4ef7ddb-4b54-48f7-a879-d07cb3222339" containerName="controller-manager"
Mar 09 13:25:43 crc kubenswrapper[4764]: E0309 13:25:43.111379 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="556e4318-98c6-4910-830a-2edcafa8c5a3" containerName="route-controller-manager"
Mar 09 13:25:43 crc kubenswrapper[4764]: I0309 13:25:43.111388 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="556e4318-98c6-4910-830a-2edcafa8c5a3" containerName="route-controller-manager"
Mar 09 13:25:43 crc kubenswrapper[4764]: I0309 13:25:43.111508 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="048cea49-847b-4232-9ece-3656fccc1909" containerName="pruner"
Mar 09 13:25:43 crc kubenswrapper[4764]: I0309 13:25:43.111525 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="e4ef7ddb-4b54-48f7-a879-d07cb3222339" containerName="controller-manager"
Mar 09 13:25:43 crc kubenswrapper[4764]: I0309 13:25:43.111539 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="556e4318-98c6-4910-830a-2edcafa8c5a3" containerName="route-controller-manager"
Mar 09 13:25:43 crc kubenswrapper[4764]: I0309 13:25:43.112685 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-fb48676f5-nvfnz"
Mar 09 13:25:43 crc kubenswrapper[4764]: I0309 13:25:43.116099 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-7c9d566569-xmnfz"]
Mar 09 13:25:43 crc kubenswrapper[4764]: I0309 13:25:43.116842 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7c9d566569-xmnfz"
Mar 09 13:25:43 crc kubenswrapper[4764]: I0309 13:25:43.131275 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-7c9d566569-xmnfz"]
Mar 09 13:25:43 crc kubenswrapper[4764]: I0309 13:25:43.135688 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-fb48676f5-nvfnz"]
Mar 09 13:25:43 crc kubenswrapper[4764]: I0309 13:25:43.147485 4764 patch_prober.go:28] interesting pod/controller-manager-7f6c57f99f-67zw4 container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.58:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 09 13:25:43 crc kubenswrapper[4764]: I0309 13:25:43.147545 4764 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-7f6c57f99f-67zw4" podUID="e4ef7ddb-4b54-48f7-a879-d07cb3222339" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.58:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 09 13:25:43 crc kubenswrapper[4764]: I0309 13:25:43.242603 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/32312778-9c44-4843-a588-5fba60384e05-serving-cert\") pod \"route-controller-manager-fb48676f5-nvfnz\" (UID: \"32312778-9c44-4843-a588-5fba60384e05\") " pod="openshift-route-controller-manager/route-controller-manager-fb48676f5-nvfnz"
Mar 09 13:25:43 crc kubenswrapper[4764]: I0309 13:25:43.242691 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8e222771-a709-459f-a36f-e44f4b87983e-serving-cert\") pod \"controller-manager-7c9d566569-xmnfz\" (UID: \"8e222771-a709-459f-a36f-e44f4b87983e\") " pod="openshift-controller-manager/controller-manager-7c9d566569-xmnfz"
Mar 09 13:25:43 crc kubenswrapper[4764]: I0309 13:25:43.242736 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4l6gj\" (UniqueName: \"kubernetes.io/projected/32312778-9c44-4843-a588-5fba60384e05-kube-api-access-4l6gj\") pod \"route-controller-manager-fb48676f5-nvfnz\" (UID: \"32312778-9c44-4843-a588-5fba60384e05\") " pod="openshift-route-controller-manager/route-controller-manager-fb48676f5-nvfnz"
Mar 09 13:25:43 crc kubenswrapper[4764]: I0309 13:25:43.242754 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8e222771-a709-459f-a36f-e44f4b87983e-client-ca\") pod \"controller-manager-7c9d566569-xmnfz\" (UID: \"8e222771-a709-459f-a36f-e44f4b87983e\") " pod="openshift-controller-manager/controller-manager-7c9d566569-xmnfz"
Mar 09 13:25:43 crc kubenswrapper[4764]: I0309 13:25:43.242785 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/8e222771-a709-459f-a36f-e44f4b87983e-proxy-ca-bundles\") pod \"controller-manager-7c9d566569-xmnfz\" (UID: \"8e222771-a709-459f-a36f-e44f4b87983e\") " pod="openshift-controller-manager/controller-manager-7c9d566569-xmnfz"
Mar 09 13:25:43 crc kubenswrapper[4764]: I0309 13:25:43.242804 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7bf7k\" (UniqueName: \"kubernetes.io/projected/8e222771-a709-459f-a36f-e44f4b87983e-kube-api-access-7bf7k\") pod \"controller-manager-7c9d566569-xmnfz\" (UID: \"8e222771-a709-459f-a36f-e44f4b87983e\") " pod="openshift-controller-manager/controller-manager-7c9d566569-xmnfz"
Mar 09 13:25:43 crc kubenswrapper[4764]: I0309 13:25:43.242902 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8e222771-a709-459f-a36f-e44f4b87983e-config\") pod \"controller-manager-7c9d566569-xmnfz\" (UID: \"8e222771-a709-459f-a36f-e44f4b87983e\") " pod="openshift-controller-manager/controller-manager-7c9d566569-xmnfz"
Mar 09 13:25:43 crc kubenswrapper[4764]: I0309 13:25:43.242941 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/32312778-9c44-4843-a588-5fba60384e05-client-ca\") pod \"route-controller-manager-fb48676f5-nvfnz\" (UID: \"32312778-9c44-4843-a588-5fba60384e05\") " pod="openshift-route-controller-manager/route-controller-manager-fb48676f5-nvfnz"
Mar 09 13:25:43 crc kubenswrapper[4764]: I0309 13:25:43.243059 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/32312778-9c44-4843-a588-5fba60384e05-config\") pod \"route-controller-manager-fb48676f5-nvfnz\" (UID: \"32312778-9c44-4843-a588-5fba60384e05\") " pod="openshift-route-controller-manager/route-controller-manager-fb48676f5-nvfnz"
Mar 09 13:25:43 crc kubenswrapper[4764]: I0309 13:25:43.344181 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4l6gj\" (UniqueName: \"kubernetes.io/projected/32312778-9c44-4843-a588-5fba60384e05-kube-api-access-4l6gj\") pod \"route-controller-manager-fb48676f5-nvfnz\" (UID: \"32312778-9c44-4843-a588-5fba60384e05\") " pod="openshift-route-controller-manager/route-controller-manager-fb48676f5-nvfnz"
Mar 09 13:25:43 crc kubenswrapper[4764]: I0309 13:25:43.344260 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8e222771-a709-459f-a36f-e44f4b87983e-client-ca\") pod \"controller-manager-7c9d566569-xmnfz\" (UID: \"8e222771-a709-459f-a36f-e44f4b87983e\") " pod="openshift-controller-manager/controller-manager-7c9d566569-xmnfz"
Mar 09 13:25:43 crc kubenswrapper[4764]: I0309 13:25:43.344345 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/8e222771-a709-459f-a36f-e44f4b87983e-proxy-ca-bundles\") pod \"controller-manager-7c9d566569-xmnfz\" (UID: \"8e222771-a709-459f-a36f-e44f4b87983e\") " pod="openshift-controller-manager/controller-manager-7c9d566569-xmnfz"
Mar 09 13:25:43 crc kubenswrapper[4764]: I0309 13:25:43.344396 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7bf7k\" (UniqueName: \"kubernetes.io/projected/8e222771-a709-459f-a36f-e44f4b87983e-kube-api-access-7bf7k\") pod \"controller-manager-7c9d566569-xmnfz\" (UID: \"8e222771-a709-459f-a36f-e44f4b87983e\") " pod="openshift-controller-manager/controller-manager-7c9d566569-xmnfz"
Mar 09 13:25:43 crc kubenswrapper[4764]: I0309 13:25:43.344460 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8e222771-a709-459f-a36f-e44f4b87983e-config\") pod \"controller-manager-7c9d566569-xmnfz\" (UID: \"8e222771-a709-459f-a36f-e44f4b87983e\") " pod="openshift-controller-manager/controller-manager-7c9d566569-xmnfz"
Mar 09 13:25:43 crc kubenswrapper[4764]: I0309 13:25:43.344501 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/32312778-9c44-4843-a588-5fba60384e05-client-ca\") pod \"route-controller-manager-fb48676f5-nvfnz\" (UID: \"32312778-9c44-4843-a588-5fba60384e05\") " pod="openshift-route-controller-manager/route-controller-manager-fb48676f5-nvfnz"
Mar 09 13:25:43 crc kubenswrapper[4764]: I0309 13:25:43.344615 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/32312778-9c44-4843-a588-5fba60384e05-config\") pod \"route-controller-manager-fb48676f5-nvfnz\" (UID: \"32312778-9c44-4843-a588-5fba60384e05\") " pod="openshift-route-controller-manager/route-controller-manager-fb48676f5-nvfnz"
Mar 09 13:25:43 crc kubenswrapper[4764]: I0309 13:25:43.344739 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/32312778-9c44-4843-a588-5fba60384e05-serving-cert\") pod \"route-controller-manager-fb48676f5-nvfnz\" (UID: \"32312778-9c44-4843-a588-5fba60384e05\") " pod="openshift-route-controller-manager/route-controller-manager-fb48676f5-nvfnz"
Mar 09 13:25:43 crc kubenswrapper[4764]: I0309 13:25:43.344842 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8e222771-a709-459f-a36f-e44f4b87983e-serving-cert\") pod \"controller-manager-7c9d566569-xmnfz\" (UID: \"8e222771-a709-459f-a36f-e44f4b87983e\") " pod="openshift-controller-manager/controller-manager-7c9d566569-xmnfz"
Mar 09 13:25:43 crc kubenswrapper[4764]: I0309 13:25:43.347238 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8e222771-a709-459f-a36f-e44f4b87983e-client-ca\") pod \"controller-manager-7c9d566569-xmnfz\" (UID: \"8e222771-a709-459f-a36f-e44f4b87983e\") " pod="openshift-controller-manager/controller-manager-7c9d566569-xmnfz"
Mar 09 13:25:43 crc kubenswrapper[4764]: I0309 13:25:43.348231 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8e222771-a709-459f-a36f-e44f4b87983e-config\") pod \"controller-manager-7c9d566569-xmnfz\" (UID: \"8e222771-a709-459f-a36f-e44f4b87983e\") " pod="openshift-controller-manager/controller-manager-7c9d566569-xmnfz"
Mar 09 13:25:43 crc kubenswrapper[4764]: I0309 13:25:43.348971 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/32312778-9c44-4843-a588-5fba60384e05-client-ca\") pod \"route-controller-manager-fb48676f5-nvfnz\" (UID: \"32312778-9c44-4843-a588-5fba60384e05\") " pod="openshift-route-controller-manager/route-controller-manager-fb48676f5-nvfnz"
Mar 09 13:25:43 crc kubenswrapper[4764]: I0309 13:25:43.349461 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/32312778-9c44-4843-a588-5fba60384e05-config\") pod \"route-controller-manager-fb48676f5-nvfnz\" (UID: \"32312778-9c44-4843-a588-5fba60384e05\") " pod="openshift-route-controller-manager/route-controller-manager-fb48676f5-nvfnz"
Mar 09 13:25:43 crc kubenswrapper[4764]: I0309 13:25:43.350851 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/8e222771-a709-459f-a36f-e44f4b87983e-proxy-ca-bundles\") pod \"controller-manager-7c9d566569-xmnfz\" (UID: \"8e222771-a709-459f-a36f-e44f4b87983e\") " pod="openshift-controller-manager/controller-manager-7c9d566569-xmnfz"
Mar 09 13:25:43 crc kubenswrapper[4764]: I0309 13:25:43.350892 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8e222771-a709-459f-a36f-e44f4b87983e-serving-cert\") pod \"controller-manager-7c9d566569-xmnfz\" (UID: \"8e222771-a709-459f-a36f-e44f4b87983e\") " pod="openshift-controller-manager/controller-manager-7c9d566569-xmnfz"
Mar 09 13:25:43 crc kubenswrapper[4764]: I0309 13:25:43.356812 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/32312778-9c44-4843-a588-5fba60384e05-serving-cert\") pod \"route-controller-manager-fb48676f5-nvfnz\" (UID: \"32312778-9c44-4843-a588-5fba60384e05\") " pod="openshift-route-controller-manager/route-controller-manager-fb48676f5-nvfnz"
Mar 09 13:25:43 crc kubenswrapper[4764]: I0309 13:25:43.368151 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7bf7k\" (UniqueName: \"kubernetes.io/projected/8e222771-a709-459f-a36f-e44f4b87983e-kube-api-access-7bf7k\") pod \"controller-manager-7c9d566569-xmnfz\" (UID: \"8e222771-a709-459f-a36f-e44f4b87983e\") " pod="openshift-controller-manager/controller-manager-7c9d566569-xmnfz"
Mar 09 13:25:43 crc kubenswrapper[4764]: I0309 13:25:43.397480 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4l6gj\" (UniqueName: \"kubernetes.io/projected/32312778-9c44-4843-a588-5fba60384e05-kube-api-access-4l6gj\") pod \"route-controller-manager-fb48676f5-nvfnz\" (UID: \"32312778-9c44-4843-a588-5fba60384e05\") " pod="openshift-route-controller-manager/route-controller-manager-fb48676f5-nvfnz"
Mar 09 13:25:43 crc kubenswrapper[4764]: I0309 13:25:43.432191 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-fb48676f5-nvfnz"
Mar 09 13:25:43 crc kubenswrapper[4764]: I0309 13:25:43.463601 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7c9d566569-xmnfz"
Mar 09 13:25:43 crc kubenswrapper[4764]: I0309 13:25:43.507740 4764 generic.go:334] "Generic (PLEG): container finished" podID="20acdcb5-ea78-435e-b472-e102d5553c75" containerID="67daa9bd8128811d30d5710b782098e52f919063831113dfb444335915817615" exitCode=0
Mar 09 13:25:43 crc kubenswrapper[4764]: I0309 13:25:43.507821 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mbm5b" event={"ID":"20acdcb5-ea78-435e-b472-e102d5553c75","Type":"ContainerDied","Data":"67daa9bd8128811d30d5710b782098e52f919063831113dfb444335915817615"}
Mar 09 13:25:43 crc kubenswrapper[4764]: I0309 13:25:43.514056 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7745768cd9-mrqxj"
Mar 09 13:25:43 crc kubenswrapper[4764]: I0309 13:25:43.514267 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7745768cd9-mrqxj" event={"ID":"556e4318-98c6-4910-830a-2edcafa8c5a3","Type":"ContainerDied","Data":"335a472e98db78b509578c161edffb85e2cda67cc112f2084014fb62974f8645"}
Mar 09 13:25:43 crc kubenswrapper[4764]: I0309 13:25:43.514341 4764 scope.go:117] "RemoveContainer" containerID="02fabc10f8ca5b240e9914fe91984e0b3ed4f53f2c9e25d78b0f654f639a110c"
Mar 09 13:25:43 crc kubenswrapper[4764]: I0309 13:25:43.519471 4764 generic.go:334] "Generic (PLEG): container finished" podID="88ba6041-7f8f-48f0-840c-8ea2a9bdc87b" containerID="8cec696202563cb93ec1099b2921beba11271e2ac2498229b87b108935e01fc2" exitCode=0
Mar 09 13:25:43 crc kubenswrapper[4764]: I0309 13:25:43.519570 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8d627" event={"ID":"88ba6041-7f8f-48f0-840c-8ea2a9bdc87b","Type":"ContainerDied","Data":"8cec696202563cb93ec1099b2921beba11271e2ac2498229b87b108935e01fc2"}
Mar 09 13:25:43 crc kubenswrapper[4764]: I0309 13:25:43.522736 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7f6c57f99f-67zw4" event={"ID":"e4ef7ddb-4b54-48f7-a879-d07cb3222339","Type":"ContainerDied","Data":"6dfe527522407abed5825087faeed3aa6722bd0b6c51839e5410a49d638c18ec"}
Mar 09 13:25:43 crc kubenswrapper[4764]: I0309 13:25:43.522825 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7f6c57f99f-67zw4"
Mar 09 13:25:43 crc kubenswrapper[4764]: I0309 13:25:43.567865 4764 scope.go:117] "RemoveContainer" containerID="c13a9bf60f9efa3f676a6a2e63469b91bb787e084e0911f7a2d49d61a829e9cf"
Mar 09 13:25:43 crc kubenswrapper[4764]: I0309 13:25:43.594846 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jn8f5" event={"ID":"a76121be-d090-4f2a-9e57-1a160a4bb4f2","Type":"ContainerStarted","Data":"1d40a7985c63e046f744271b4a5346b0a7b6d6cdf8fbc36bc9abda5ba599778a"}
Mar 09 13:25:43 crc kubenswrapper[4764]: I0309 13:25:43.626881 4764 patch_prober.go:28] interesting pod/route-controller-manager-7745768cd9-mrqxj container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.62:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 09 13:25:43 crc kubenswrapper[4764]: I0309 13:25:43.626944 4764 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-7745768cd9-mrqxj" podUID="556e4318-98c6-4910-830a-2edcafa8c5a3" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.62:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 09 13:25:43 crc kubenswrapper[4764]: I0309 13:25:43.655578 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-7f6c57f99f-67zw4"]
Mar 09 13:25:43 crc kubenswrapper[4764]: I0309 13:25:43.659010 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-7f6c57f99f-67zw4"]
Mar 09 13:25:43 crc kubenswrapper[4764]: I0309 13:25:43.666875 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7745768cd9-mrqxj"]
Mar 09 13:25:43 crc kubenswrapper[4764]: I0309 13:25:43.666931 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7745768cd9-mrqxj"]
Mar 09 13:25:43 crc kubenswrapper[4764]: I0309 13:25:43.720139 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-jn8f5" podStartSLOduration=3.850183257 podStartE2EDuration="58.720115779s" podCreationTimestamp="2026-03-09 13:24:45 +0000 UTC" firstStartedPulling="2026-03-09 13:24:48.145948354 +0000 UTC m=+243.396120262" lastFinishedPulling="2026-03-09 13:25:43.015880876 +0000 UTC m=+298.266052784" observedRunningTime="2026-03-09 13:25:43.712049688 +0000 UTC m=+298.962221596" watchObservedRunningTime="2026-03-09 13:25:43.720115779 +0000 UTC m=+298.970287687"
Mar 09 13:25:43 crc kubenswrapper[4764]: I0309 13:25:43.871205 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-fb48676f5-nvfnz"]
Mar 09 13:25:43 crc kubenswrapper[4764]: I0309 13:25:43.972070 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-7c9d566569-xmnfz"]
Mar 09 13:25:43 crc kubenswrapper[4764]: W0309 13:25:43.980031 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8e222771_a709_459f_a36f_e44f4b87983e.slice/crio-bc74e94f60e5ca1dd91e48b51c639a01e86dbc1382d15f18a1aac3c35e28f4ad WatchSource:0}: Error finding container bc74e94f60e5ca1dd91e48b51c639a01e86dbc1382d15f18a1aac3c35e28f4ad: Status 404 returned error can't find the container with id bc74e94f60e5ca1dd91e48b51c639a01e86dbc1382d15f18a1aac3c35e28f4ad
Mar 09 13:25:44 crc kubenswrapper[4764]: I0309 13:25:44.579910 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-d9z59" event={"ID":"c6dd4a8c-608d-44b4-a42c-1cc8b7f9fec2","Type":"ContainerStarted","Data":"b507b889d835011a58e3fb35bdc6611d2459c15d083a937632ed6f78fbcf35b2"}
Mar 09 13:25:44 crc kubenswrapper[4764]: I0309 13:25:44.584266 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7c9d566569-xmnfz" event={"ID":"8e222771-a709-459f-a36f-e44f4b87983e","Type":"ContainerStarted","Data":"2fc672eabcdfde188584b8ded74f5c4e2f917baeb322acb79784b8d6c120dfee"}
Mar 09 13:25:44 crc kubenswrapper[4764]: I0309 13:25:44.584398 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7c9d566569-xmnfz" event={"ID":"8e222771-a709-459f-a36f-e44f4b87983e","Type":"ContainerStarted","Data":"bc74e94f60e5ca1dd91e48b51c639a01e86dbc1382d15f18a1aac3c35e28f4ad"}
Mar 09 13:25:44 crc kubenswrapper[4764]: I0309 13:25:44.585143 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-7c9d566569-xmnfz"
Mar 09 13:25:44 crc kubenswrapper[4764]: I0309 13:25:44.586499 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-fb48676f5-nvfnz" event={"ID":"32312778-9c44-4843-a588-5fba60384e05","Type":"ContainerStarted","Data":"759cbe04f032d8df08a612b05e244889018e14ae649430d1d9c9f8c95485aa5d"}
Mar 09 13:25:44 crc kubenswrapper[4764]: I0309 13:25:44.586549 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-fb48676f5-nvfnz" event={"ID":"32312778-9c44-4843-a588-5fba60384e05","Type":"ContainerStarted","Data":"6264edd6e509ddf66049653028a6e5a99b8ff3fab370367781f6b4c2c4544a37"}
Mar 09 13:25:44 crc kubenswrapper[4764]: I0309 13:25:44.586770 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-fb48676f5-nvfnz"
Mar 09 13:25:44 crc kubenswrapper[4764]: I0309 13:25:44.629281 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-7c9d566569-xmnfz"
Mar 09 13:25:44 crc kubenswrapper[4764]: I0309 13:25:44.641225 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-fb48676f5-nvfnz" podStartSLOduration=3.641211142 podStartE2EDuration="3.641211142s" podCreationTimestamp="2026-03-09 13:25:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:25:44.639979578 +0000 UTC m=+299.890151486" watchObservedRunningTime="2026-03-09 13:25:44.641211142 +0000 UTC m=+299.891383050"
Mar 09 13:25:44 crc kubenswrapper[4764]: I0309 13:25:44.662760 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-7c9d566569-xmnfz" podStartSLOduration=3.662692349 podStartE2EDuration="3.662692349s" podCreationTimestamp="2026-03-09 13:25:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:25:44.659866302 +0000 UTC m=+299.910038210" watchObservedRunningTime="2026-03-09 13:25:44.662692349 +0000 UTC m=+299.912864257"
Mar 09 13:25:44 crc kubenswrapper[4764]: I0309 13:25:44.918619 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-nrc8s"
Mar 09 13:25:44 crc kubenswrapper[4764]: I0309 13:25:44.918989 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-nrc8s"
Mar 09 13:25:44 crc kubenswrapper[4764]: I0309 13:25:44.954938 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-fb48676f5-nvfnz"
Mar 09 13:25:45 crc kubenswrapper[4764]: I0309 13:25:45.496748 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-jn8f5"
Mar 09 13:25:45 crc kubenswrapper[4764]: I0309 13:25:45.496799 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-jn8f5"
Mar 09 13:25:45 crc kubenswrapper[4764]: I0309 13:25:45.570006 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="556e4318-98c6-4910-830a-2edcafa8c5a3" path="/var/lib/kubelet/pods/556e4318-98c6-4910-830a-2edcafa8c5a3/volumes"
Mar 09 13:25:45 crc kubenswrapper[4764]: I0309 13:25:45.570571 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e4ef7ddb-4b54-48f7-a879-d07cb3222339" path="/var/lib/kubelet/pods/e4ef7ddb-4b54-48f7-a879-d07cb3222339/volumes"
Mar 09 13:25:45 crc kubenswrapper[4764]: I0309 13:25:45.598981 4764 generic.go:334] "Generic (PLEG): container finished" podID="41a3ce40-a2f8-4bc3-8fbb-eccfb1ec4ce6" containerID="0ae328f899ade57662b7a57d61c5864e374b6610f4676d30b76a8c7048f7853c" exitCode=0
Mar 09 13:25:45 crc kubenswrapper[4764]: I0309 13:25:45.599819 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tll5t" event={"ID":"41a3ce40-a2f8-4bc3-8fbb-eccfb1ec4ce6","Type":"ContainerDied","Data":"0ae328f899ade57662b7a57d61c5864e374b6610f4676d30b76a8c7048f7853c"}
Mar 09 13:25:45 crc kubenswrapper[4764]: I0309 13:25:45.960543 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-nrc8s"
Mar 09 13:25:46 crc kubenswrapper[4764]: I0309 13:25:46.609838 4764 generic.go:334] "Generic (PLEG): container finished" podID="c6dd4a8c-608d-44b4-a42c-1cc8b7f9fec2" containerID="b507b889d835011a58e3fb35bdc6611d2459c15d083a937632ed6f78fbcf35b2" exitCode=0
Mar 09 13:25:46 crc kubenswrapper[4764]: I0309 13:25:46.609938 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-d9z59" event={"ID":"c6dd4a8c-608d-44b4-a42c-1cc8b7f9fec2","Type":"ContainerDied","Data":"b507b889d835011a58e3fb35bdc6611d2459c15d083a937632ed6f78fbcf35b2"}
Mar 09 13:25:46 crc kubenswrapper[4764]: I0309 13:25:46.833787 4764 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-jn8f5" podUID="a76121be-d090-4f2a-9e57-1a160a4bb4f2" containerName="registry-server" probeResult="failure" output=<
Mar 09 13:25:46 crc kubenswrapper[4764]: timeout: failed to connect service ":50051" within 1s
Mar 09 13:25:46 crc kubenswrapper[4764]: >
Mar 09 13:25:49 crc kubenswrapper[4764]: I0309 13:25:49.232386 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-x927s"
Mar 09 13:25:49 crc kubenswrapper[4764]: I0309 13:25:49.630527 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8d627" event={"ID":"88ba6041-7f8f-48f0-840c-8ea2a9bdc87b","Type":"ContainerStarted","Data":"5dafc18a60471b2aa074fa19f2a0f2019626c0dc18f4738cc76636c111e1e0e1"}
Mar 09 13:25:49 crc kubenswrapper[4764]: I0309 13:25:49.633505 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qhs57" event={"ID":"691ffa6f-3ee6-47fa-bcef-9fdd74ac86df","Type":"ContainerStarted","Data":"aca0e54cf991318058295c5dea65c4ff3d25ff689a99973580ba7e63019a2803"}
Mar 09 13:25:49 crc kubenswrapper[4764]: I0309 13:25:49.636883 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-d9z59" event={"ID":"c6dd4a8c-608d-44b4-a42c-1cc8b7f9fec2","Type":"ContainerStarted","Data":"ffc2a9491a6276ce817db9e39e528e75cd29f3eedc77f220c7dae09ad9a1c62b"}
Mar 09 13:25:49 crc kubenswrapper[4764]: I0309 13:25:49.639800 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tll5t" event={"ID":"41a3ce40-a2f8-4bc3-8fbb-eccfb1ec4ce6","Type":"ContainerStarted","Data":"4325cf9007df54fa3b2a5bed7103ccce2df4b9e7e47622fe04491c8fac8d1503"}
Mar 09 13:25:49 crc kubenswrapper[4764]: I0309 13:25:49.642970 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mbm5b" event={"ID":"20acdcb5-ea78-435e-b472-e102d5553c75","Type":"ContainerStarted","Data":"b884af634d32c12476d7f597eeeb93dfd054afb59803fed8ff12daa9129069bc"}
Mar 09 13:25:49 crc kubenswrapper[4764]: I0309 13:25:49.644959 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551044-p748f" event={"ID":"0a005f65-920a-4cdd-b4da-a270953113aa","Type":"ContainerStarted","Data":"3ce66a9ae238a55280ff899673763223d028dea40122d545386701984ba576ae"}
Mar 09 13:25:49 crc kubenswrapper[4764]: I0309 13:25:49.646985 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-g7k9k" event={"ID":"7a967c79-e11e-4c58-b42e-652d1406ac88","Type":"ContainerStarted","Data":"5ed7c7fd6e4e524d04d9ab3277bdad76d1f17e39f4ce65077628067a3a616861"}
Mar 09 13:25:49 crc kubenswrapper[4764]: I0309 13:25:49.656459 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-8d627" podStartSLOduration=3.61726369 podStartE2EDuration="1m5.656439049s" podCreationTimestamp="2026-03-09 13:24:44 +0000 UTC" firstStartedPulling="2026-03-09 13:24:47.022442687 +0000 UTC m=+242.272614595" lastFinishedPulling="2026-03-09 13:25:49.061618026 +0000 UTC m=+304.311789954" observedRunningTime="2026-03-09 13:25:49.654048233 +0000 UTC m=+304.904220131" watchObservedRunningTime="2026-03-09 13:25:49.656439049 +0000 UTC m=+304.906610957"
Mar 09 13:25:49 crc kubenswrapper[4764]: I0309 13:25:49.674847 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-d9z59" podStartSLOduration=3.335646016 podStartE2EDuration="1m1.674824261s" podCreationTimestamp="2026-03-09 13:24:48 +0000 UTC" firstStartedPulling="2026-03-09 13:24:50.842840003 +0000 UTC m=+246.093011911" lastFinishedPulling="2026-03-09 13:25:49.182018248 +0000 UTC m=+304.432190156" observedRunningTime="2026-03-09 13:25:49.673367121 +0000 UTC m=+304.923539039" watchObservedRunningTime="2026-03-09 13:25:49.674824261 +0000 UTC m=+304.924996169"
Mar 09 13:25:49 crc kubenswrapper[4764]: I0309 13:25:49.690018 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29551044-p748f" podStartSLOduration=40.290774396 podStartE2EDuration="1m49.690002196s" podCreationTimestamp="2026-03-09 13:24:00 +0000 UTC" firstStartedPulling="2026-03-09 13:24:39.65230508 +0000 UTC m=+234.902476988" lastFinishedPulling="2026-03-09 13:25:49.05153288 +0000 UTC m=+304.301704788" observedRunningTime="2026-03-09 13:25:49.686507101 +0000 UTC m=+304.936679009" watchObservedRunningTime="2026-03-09 13:25:49.690002196 +0000 UTC m=+304.940174104"
Mar 09 13:25:49 crc kubenswrapper[4764]: I0309 13:25:49.713742 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-mbm5b" podStartSLOduration=9.516713179 podStartE2EDuration="1m5.713722505s" podCreationTimestamp="2026-03-09 13:24:44 +0000 UTC" firstStartedPulling="2026-03-09 13:24:48.395613897 +0000 UTC m=+243.645785805" lastFinishedPulling="2026-03-09 13:25:44.592623223 +0000 UTC m=+299.842795131" observedRunningTime="2026-03-09 13:25:49.708566444 +0000 UTC m=+304.958738342" watchObservedRunningTime="2026-03-09 13:25:49.713722505 +0000 UTC m=+304.963894433"
Mar 09 13:25:49 crc kubenswrapper[4764]: I0309 13:25:49.741084 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-tll5t" podStartSLOduration=4.371014844 podStartE2EDuration="1m2.741057002s" podCreationTimestamp="2026-03-09 13:24:47 +0000 UTC" firstStartedPulling="2026-03-09 13:24:50.842313299 +0000 UTC m=+246.092485207" lastFinishedPulling="2026-03-09 13:25:49.212355457 +0000 UTC m=+304.462527365" observedRunningTime="2026-03-09 13:25:49.739178981 +0000 UTC m=+304.989350889" watchObservedRunningTime="2026-03-09 13:25:49.741057002 +0000 UTC m=+304.991228910"
Mar 09 13:25:50 crc kubenswrapper[4764]: I0309 13:25:50.226280 4764 csr.go:261] certificate signing request csr-dbwk4 is approved, waiting to be issued
Mar 09 13:25:50 crc kubenswrapper[4764]: I0309 13:25:50.234781 4764 csr.go:257] certificate signing request csr-dbwk4 is issued
Mar 09 13:25:50 crc kubenswrapper[4764]: I0309 13:25:50.655029 4764 generic.go:334] "Generic (PLEG): container finished" podID="0a005f65-920a-4cdd-b4da-a270953113aa" containerID="3ce66a9ae238a55280ff899673763223d028dea40122d545386701984ba576ae" exitCode=0
Mar 09 13:25:50 crc kubenswrapper[4764]: I0309 13:25:50.655117 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551044-p748f" event={"ID":"0a005f65-920a-4cdd-b4da-a270953113aa","Type":"ContainerDied","Data":"3ce66a9ae238a55280ff899673763223d028dea40122d545386701984ba576ae"}
Mar 09 13:25:50 crc kubenswrapper[4764]: I0309 13:25:50.657388 4764 generic.go:334] "Generic (PLEG):
container finished" podID="7a967c79-e11e-4c58-b42e-652d1406ac88" containerID="5ed7c7fd6e4e524d04d9ab3277bdad76d1f17e39f4ce65077628067a3a616861" exitCode=0 Mar 09 13:25:50 crc kubenswrapper[4764]: I0309 13:25:50.657441 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-g7k9k" event={"ID":"7a967c79-e11e-4c58-b42e-652d1406ac88","Type":"ContainerDied","Data":"5ed7c7fd6e4e524d04d9ab3277bdad76d1f17e39f4ce65077628067a3a616861"} Mar 09 13:25:50 crc kubenswrapper[4764]: I0309 13:25:50.659513 4764 generic.go:334] "Generic (PLEG): container finished" podID="691ffa6f-3ee6-47fa-bcef-9fdd74ac86df" containerID="aca0e54cf991318058295c5dea65c4ff3d25ff689a99973580ba7e63019a2803" exitCode=0 Mar 09 13:25:50 crc kubenswrapper[4764]: I0309 13:25:50.659550 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qhs57" event={"ID":"691ffa6f-3ee6-47fa-bcef-9fdd74ac86df","Type":"ContainerDied","Data":"aca0e54cf991318058295c5dea65c4ff3d25ff689a99973580ba7e63019a2803"} Mar 09 13:25:51 crc kubenswrapper[4764]: I0309 13:25:51.237461 4764 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2026-11-24 09:03:09.417637625 +0000 UTC Mar 09 13:25:51 crc kubenswrapper[4764]: I0309 13:25:51.237518 4764 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 6235h37m18.18012371s for next certificate rotation Mar 09 13:25:52 crc kubenswrapper[4764]: I0309 13:25:52.077027 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551044-p748f" Mar 09 13:25:52 crc kubenswrapper[4764]: I0309 13:25:52.200638 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g47h8\" (UniqueName: \"kubernetes.io/projected/0a005f65-920a-4cdd-b4da-a270953113aa-kube-api-access-g47h8\") pod \"0a005f65-920a-4cdd-b4da-a270953113aa\" (UID: \"0a005f65-920a-4cdd-b4da-a270953113aa\") " Mar 09 13:25:52 crc kubenswrapper[4764]: I0309 13:25:52.207914 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0a005f65-920a-4cdd-b4da-a270953113aa-kube-api-access-g47h8" (OuterVolumeSpecName: "kube-api-access-g47h8") pod "0a005f65-920a-4cdd-b4da-a270953113aa" (UID: "0a005f65-920a-4cdd-b4da-a270953113aa"). InnerVolumeSpecName "kube-api-access-g47h8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:25:52 crc kubenswrapper[4764]: I0309 13:25:52.238176 4764 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2027-01-18 06:20:17.031616075 +0000 UTC Mar 09 13:25:52 crc kubenswrapper[4764]: I0309 13:25:52.238225 4764 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 7552h54m24.793393641s for next certificate rotation Mar 09 13:25:52 crc kubenswrapper[4764]: I0309 13:25:52.302440 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g47h8\" (UniqueName: \"kubernetes.io/projected/0a005f65-920a-4cdd-b4da-a270953113aa-kube-api-access-g47h8\") on node \"crc\" DevicePath \"\"" Mar 09 13:25:52 crc kubenswrapper[4764]: I0309 13:25:52.674429 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551044-p748f" event={"ID":"0a005f65-920a-4cdd-b4da-a270953113aa","Type":"ContainerDied","Data":"0090a60e7aca5b3b5065eab933849dd34829a2b082555fb9f3ff31c4a933640e"} Mar 09 13:25:52 crc kubenswrapper[4764]: I0309 
13:25:52.674479 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0090a60e7aca5b3b5065eab933849dd34829a2b082555fb9f3ff31c4a933640e" Mar 09 13:25:52 crc kubenswrapper[4764]: I0309 13:25:52.674535 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551044-p748f" Mar 09 13:25:54 crc kubenswrapper[4764]: I0309 13:25:54.702149 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-g7k9k" event={"ID":"7a967c79-e11e-4c58-b42e-652d1406ac88","Type":"ContainerStarted","Data":"018f9058cd176b4bd85208014309fc2aaec24cd432697bbe4160e6f16524159a"} Mar 09 13:25:54 crc kubenswrapper[4764]: I0309 13:25:54.726600 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-g7k9k" podStartSLOduration=3.546473055 podStartE2EDuration="1m7.726584806s" podCreationTimestamp="2026-03-09 13:24:47 +0000 UTC" firstStartedPulling="2026-03-09 13:24:49.552451749 +0000 UTC m=+244.802623657" lastFinishedPulling="2026-03-09 13:25:53.7325635 +0000 UTC m=+308.982735408" observedRunningTime="2026-03-09 13:25:54.724565411 +0000 UTC m=+309.974737339" watchObservedRunningTime="2026-03-09 13:25:54.726584806 +0000 UTC m=+309.976756714" Mar 09 13:25:54 crc kubenswrapper[4764]: I0309 13:25:54.980665 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-nrc8s" Mar 09 13:25:55 crc kubenswrapper[4764]: I0309 13:25:55.047947 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-8d627" Mar 09 13:25:55 crc kubenswrapper[4764]: I0309 13:25:55.048174 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-8d627" Mar 09 13:25:55 crc kubenswrapper[4764]: I0309 13:25:55.112087 4764 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="started" pod="openshift-marketplace/certified-operators-8d627" Mar 09 13:25:55 crc kubenswrapper[4764]: I0309 13:25:55.276747 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-mbm5b" Mar 09 13:25:55 crc kubenswrapper[4764]: I0309 13:25:55.276857 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-mbm5b" Mar 09 13:25:55 crc kubenswrapper[4764]: I0309 13:25:55.331864 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-mbm5b" Mar 09 13:25:55 crc kubenswrapper[4764]: I0309 13:25:55.551792 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-jn8f5" Mar 09 13:25:55 crc kubenswrapper[4764]: I0309 13:25:55.596995 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-jn8f5" Mar 09 13:25:55 crc kubenswrapper[4764]: I0309 13:25:55.709037 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qhs57" event={"ID":"691ffa6f-3ee6-47fa-bcef-9fdd74ac86df","Type":"ContainerStarted","Data":"d02322e51b07c573018bc35bb85c4b069d5e04cb23ef61de23f558be13356214"} Mar 09 13:25:55 crc kubenswrapper[4764]: I0309 13:25:55.754337 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-mbm5b" Mar 09 13:25:55 crc kubenswrapper[4764]: I0309 13:25:55.761241 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-8d627" Mar 09 13:25:55 crc kubenswrapper[4764]: I0309 13:25:55.780355 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-qhs57" podStartSLOduration=3.800128888 
podStartE2EDuration="1m9.780338706s" podCreationTimestamp="2026-03-09 13:24:46 +0000 UTC" firstStartedPulling="2026-03-09 13:24:48.460555203 +0000 UTC m=+243.710727111" lastFinishedPulling="2026-03-09 13:25:54.440765021 +0000 UTC m=+309.690936929" observedRunningTime="2026-03-09 13:25:55.728870199 +0000 UTC m=+310.979042117" watchObservedRunningTime="2026-03-09 13:25:55.780338706 +0000 UTC m=+311.030510614" Mar 09 13:25:57 crc kubenswrapper[4764]: I0309 13:25:57.157908 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-qhs57" Mar 09 13:25:57 crc kubenswrapper[4764]: I0309 13:25:57.157987 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-qhs57" Mar 09 13:25:57 crc kubenswrapper[4764]: I0309 13:25:57.211825 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-qhs57" Mar 09 13:25:57 crc kubenswrapper[4764]: I0309 13:25:57.258853 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-mbm5b"] Mar 09 13:25:57 crc kubenswrapper[4764]: I0309 13:25:57.603373 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-g7k9k" Mar 09 13:25:57 crc kubenswrapper[4764]: I0309 13:25:57.603442 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-g7k9k" Mar 09 13:25:57 crc kubenswrapper[4764]: I0309 13:25:57.655215 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-g7k9k" Mar 09 13:25:57 crc kubenswrapper[4764]: I0309 13:25:57.849044 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-nj856"] Mar 09 13:25:58 crc kubenswrapper[4764]: I0309 13:25:58.258592 4764 kubelet.go:2437] "SyncLoop 
DELETE" source="api" pods=["openshift-marketplace/certified-operators-jn8f5"] Mar 09 13:25:58 crc kubenswrapper[4764]: I0309 13:25:58.259082 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-jn8f5" podUID="a76121be-d090-4f2a-9e57-1a160a4bb4f2" containerName="registry-server" containerID="cri-o://1d40a7985c63e046f744271b4a5346b0a7b6d6cdf8fbc36bc9abda5ba599778a" gracePeriod=2 Mar 09 13:25:58 crc kubenswrapper[4764]: I0309 13:25:58.369917 4764 patch_prober.go:28] interesting pod/machine-config-daemon-xxczl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 09 13:25:58 crc kubenswrapper[4764]: I0309 13:25:58.369984 4764 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 09 13:25:58 crc kubenswrapper[4764]: I0309 13:25:58.370039 4764 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" Mar 09 13:25:58 crc kubenswrapper[4764]: I0309 13:25:58.370674 4764 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"a541959fd435ba385dfa711ec03b7d78fb75528fed89a7bd3620aa50fbab26ad"} pod="openshift-machine-config-operator/machine-config-daemon-xxczl" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 09 13:25:58 crc kubenswrapper[4764]: I0309 13:25:58.370732 4764 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922" containerName="machine-config-daemon" containerID="cri-o://a541959fd435ba385dfa711ec03b7d78fb75528fed89a7bd3620aa50fbab26ad" gracePeriod=600 Mar 09 13:25:58 crc kubenswrapper[4764]: I0309 13:25:58.556559 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-tll5t" Mar 09 13:25:58 crc kubenswrapper[4764]: I0309 13:25:58.607088 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-tll5t" Mar 09 13:25:58 crc kubenswrapper[4764]: I0309 13:25:58.617992 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-tll5t" Mar 09 13:25:58 crc kubenswrapper[4764]: I0309 13:25:58.731901 4764 generic.go:334] "Generic (PLEG): container finished" podID="6bcdd179-43c2-427c-9fac-7155c122e922" containerID="a541959fd435ba385dfa711ec03b7d78fb75528fed89a7bd3620aa50fbab26ad" exitCode=0 Mar 09 13:25:58 crc kubenswrapper[4764]: I0309 13:25:58.731983 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" event={"ID":"6bcdd179-43c2-427c-9fac-7155c122e922","Type":"ContainerDied","Data":"a541959fd435ba385dfa711ec03b7d78fb75528fed89a7bd3620aa50fbab26ad"} Mar 09 13:25:58 crc kubenswrapper[4764]: I0309 13:25:58.732019 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" event={"ID":"6bcdd179-43c2-427c-9fac-7155c122e922","Type":"ContainerStarted","Data":"8bb39ae24112881ead01c36905f25d69cb36d1e4d3c6e8aa79f283e7dd0d444b"} Mar 09 13:25:58 crc kubenswrapper[4764]: I0309 13:25:58.739274 4764 generic.go:334] "Generic (PLEG): container finished" podID="a76121be-d090-4f2a-9e57-1a160a4bb4f2" containerID="1d40a7985c63e046f744271b4a5346b0a7b6d6cdf8fbc36bc9abda5ba599778a" 
exitCode=0 Mar 09 13:25:58 crc kubenswrapper[4764]: I0309 13:25:58.739301 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jn8f5" event={"ID":"a76121be-d090-4f2a-9e57-1a160a4bb4f2","Type":"ContainerDied","Data":"1d40a7985c63e046f744271b4a5346b0a7b6d6cdf8fbc36bc9abda5ba599778a"} Mar 09 13:25:58 crc kubenswrapper[4764]: I0309 13:25:58.739771 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-mbm5b" podUID="20acdcb5-ea78-435e-b472-e102d5553c75" containerName="registry-server" containerID="cri-o://b884af634d32c12476d7f597eeeb93dfd054afb59803fed8ff12daa9129069bc" gracePeriod=2 Mar 09 13:25:58 crc kubenswrapper[4764]: I0309 13:25:58.764225 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-d9z59" Mar 09 13:25:58 crc kubenswrapper[4764]: I0309 13:25:58.765247 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-d9z59" Mar 09 13:25:58 crc kubenswrapper[4764]: I0309 13:25:58.822374 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-d9z59" Mar 09 13:25:58 crc kubenswrapper[4764]: I0309 13:25:58.838189 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-tll5t" Mar 09 13:25:58 crc kubenswrapper[4764]: I0309 13:25:58.876113 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-jn8f5" Mar 09 13:25:58 crc kubenswrapper[4764]: I0309 13:25:58.889847 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dc5m2\" (UniqueName: \"kubernetes.io/projected/a76121be-d090-4f2a-9e57-1a160a4bb4f2-kube-api-access-dc5m2\") pod \"a76121be-d090-4f2a-9e57-1a160a4bb4f2\" (UID: \"a76121be-d090-4f2a-9e57-1a160a4bb4f2\") " Mar 09 13:25:58 crc kubenswrapper[4764]: I0309 13:25:58.889906 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a76121be-d090-4f2a-9e57-1a160a4bb4f2-utilities\") pod \"a76121be-d090-4f2a-9e57-1a160a4bb4f2\" (UID: \"a76121be-d090-4f2a-9e57-1a160a4bb4f2\") " Mar 09 13:25:58 crc kubenswrapper[4764]: I0309 13:25:58.889944 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a76121be-d090-4f2a-9e57-1a160a4bb4f2-catalog-content\") pod \"a76121be-d090-4f2a-9e57-1a160a4bb4f2\" (UID: \"a76121be-d090-4f2a-9e57-1a160a4bb4f2\") " Mar 09 13:25:58 crc kubenswrapper[4764]: I0309 13:25:58.897404 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a76121be-d090-4f2a-9e57-1a160a4bb4f2-utilities" (OuterVolumeSpecName: "utilities") pod "a76121be-d090-4f2a-9e57-1a160a4bb4f2" (UID: "a76121be-d090-4f2a-9e57-1a160a4bb4f2"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 13:25:58 crc kubenswrapper[4764]: I0309 13:25:58.900729 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a76121be-d090-4f2a-9e57-1a160a4bb4f2-kube-api-access-dc5m2" (OuterVolumeSpecName: "kube-api-access-dc5m2") pod "a76121be-d090-4f2a-9e57-1a160a4bb4f2" (UID: "a76121be-d090-4f2a-9e57-1a160a4bb4f2"). InnerVolumeSpecName "kube-api-access-dc5m2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:25:58 crc kubenswrapper[4764]: I0309 13:25:58.954303 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a76121be-d090-4f2a-9e57-1a160a4bb4f2-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a76121be-d090-4f2a-9e57-1a160a4bb4f2" (UID: "a76121be-d090-4f2a-9e57-1a160a4bb4f2"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 13:25:58 crc kubenswrapper[4764]: I0309 13:25:58.991469 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dc5m2\" (UniqueName: \"kubernetes.io/projected/a76121be-d090-4f2a-9e57-1a160a4bb4f2-kube-api-access-dc5m2\") on node \"crc\" DevicePath \"\"" Mar 09 13:25:58 crc kubenswrapper[4764]: I0309 13:25:58.991756 4764 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a76121be-d090-4f2a-9e57-1a160a4bb4f2-utilities\") on node \"crc\" DevicePath \"\"" Mar 09 13:25:58 crc kubenswrapper[4764]: I0309 13:25:58.991770 4764 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a76121be-d090-4f2a-9e57-1a160a4bb4f2-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 09 13:25:59 crc kubenswrapper[4764]: I0309 13:25:59.179234 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-mbm5b" Mar 09 13:25:59 crc kubenswrapper[4764]: I0309 13:25:59.193552 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/20acdcb5-ea78-435e-b472-e102d5553c75-utilities\") pod \"20acdcb5-ea78-435e-b472-e102d5553c75\" (UID: \"20acdcb5-ea78-435e-b472-e102d5553c75\") " Mar 09 13:25:59 crc kubenswrapper[4764]: I0309 13:25:59.193623 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f4s9m\" (UniqueName: \"kubernetes.io/projected/20acdcb5-ea78-435e-b472-e102d5553c75-kube-api-access-f4s9m\") pod \"20acdcb5-ea78-435e-b472-e102d5553c75\" (UID: \"20acdcb5-ea78-435e-b472-e102d5553c75\") " Mar 09 13:25:59 crc kubenswrapper[4764]: I0309 13:25:59.193658 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/20acdcb5-ea78-435e-b472-e102d5553c75-catalog-content\") pod \"20acdcb5-ea78-435e-b472-e102d5553c75\" (UID: \"20acdcb5-ea78-435e-b472-e102d5553c75\") " Mar 09 13:25:59 crc kubenswrapper[4764]: I0309 13:25:59.194376 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/20acdcb5-ea78-435e-b472-e102d5553c75-utilities" (OuterVolumeSpecName: "utilities") pod "20acdcb5-ea78-435e-b472-e102d5553c75" (UID: "20acdcb5-ea78-435e-b472-e102d5553c75"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 13:25:59 crc kubenswrapper[4764]: I0309 13:25:59.198882 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20acdcb5-ea78-435e-b472-e102d5553c75-kube-api-access-f4s9m" (OuterVolumeSpecName: "kube-api-access-f4s9m") pod "20acdcb5-ea78-435e-b472-e102d5553c75" (UID: "20acdcb5-ea78-435e-b472-e102d5553c75"). InnerVolumeSpecName "kube-api-access-f4s9m". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:25:59 crc kubenswrapper[4764]: I0309 13:25:59.262132 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/20acdcb5-ea78-435e-b472-e102d5553c75-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "20acdcb5-ea78-435e-b472-e102d5553c75" (UID: "20acdcb5-ea78-435e-b472-e102d5553c75"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 13:25:59 crc kubenswrapper[4764]: I0309 13:25:59.295340 4764 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/20acdcb5-ea78-435e-b472-e102d5553c75-utilities\") on node \"crc\" DevicePath \"\"" Mar 09 13:25:59 crc kubenswrapper[4764]: I0309 13:25:59.295384 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f4s9m\" (UniqueName: \"kubernetes.io/projected/20acdcb5-ea78-435e-b472-e102d5553c75-kube-api-access-f4s9m\") on node \"crc\" DevicePath \"\"" Mar 09 13:25:59 crc kubenswrapper[4764]: I0309 13:25:59.295395 4764 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/20acdcb5-ea78-435e-b472-e102d5553c75-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 09 13:25:59 crc kubenswrapper[4764]: I0309 13:25:59.749802 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-jn8f5" Mar 09 13:25:59 crc kubenswrapper[4764]: I0309 13:25:59.749832 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jn8f5" event={"ID":"a76121be-d090-4f2a-9e57-1a160a4bb4f2","Type":"ContainerDied","Data":"a07c170a29ea8bcf9be266201f1dd0580d7bdb690c3b989b62809138bb677d6e"} Mar 09 13:25:59 crc kubenswrapper[4764]: I0309 13:25:59.750258 4764 scope.go:117] "RemoveContainer" containerID="1d40a7985c63e046f744271b4a5346b0a7b6d6cdf8fbc36bc9abda5ba599778a" Mar 09 13:25:59 crc kubenswrapper[4764]: I0309 13:25:59.754541 4764 generic.go:334] "Generic (PLEG): container finished" podID="20acdcb5-ea78-435e-b472-e102d5553c75" containerID="b884af634d32c12476d7f597eeeb93dfd054afb59803fed8ff12daa9129069bc" exitCode=0 Mar 09 13:25:59 crc kubenswrapper[4764]: I0309 13:25:59.754626 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-mbm5b" Mar 09 13:25:59 crc kubenswrapper[4764]: I0309 13:25:59.754695 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mbm5b" event={"ID":"20acdcb5-ea78-435e-b472-e102d5553c75","Type":"ContainerDied","Data":"b884af634d32c12476d7f597eeeb93dfd054afb59803fed8ff12daa9129069bc"} Mar 09 13:25:59 crc kubenswrapper[4764]: I0309 13:25:59.754761 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mbm5b" event={"ID":"20acdcb5-ea78-435e-b472-e102d5553c75","Type":"ContainerDied","Data":"36b8a908fc96eec5fd19468146038ec7f847f96484b3a606a41defe1a23a894e"} Mar 09 13:25:59 crc kubenswrapper[4764]: I0309 13:25:59.773238 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-jn8f5"] Mar 09 13:25:59 crc kubenswrapper[4764]: I0309 13:25:59.773419 4764 scope.go:117] "RemoveContainer" 
containerID="71eda28b996e54f946f3b59d46c82f8b6fc0575bacbdbf501c3dc23dbb439d43" Mar 09 13:25:59 crc kubenswrapper[4764]: I0309 13:25:59.785431 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-jn8f5"] Mar 09 13:25:59 crc kubenswrapper[4764]: I0309 13:25:59.797629 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-mbm5b"] Mar 09 13:25:59 crc kubenswrapper[4764]: I0309 13:25:59.800356 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-mbm5b"] Mar 09 13:25:59 crc kubenswrapper[4764]: I0309 13:25:59.804002 4764 scope.go:117] "RemoveContainer" containerID="851e5f42b5e692a4d7bb4a0fda84945ae1aa93c4dc2838b29856d0c9ed98624f" Mar 09 13:25:59 crc kubenswrapper[4764]: I0309 13:25:59.819109 4764 scope.go:117] "RemoveContainer" containerID="b884af634d32c12476d7f597eeeb93dfd054afb59803fed8ff12daa9129069bc" Mar 09 13:25:59 crc kubenswrapper[4764]: I0309 13:25:59.823806 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-d9z59" Mar 09 13:25:59 crc kubenswrapper[4764]: I0309 13:25:59.836779 4764 scope.go:117] "RemoveContainer" containerID="67daa9bd8128811d30d5710b782098e52f919063831113dfb444335915817615" Mar 09 13:25:59 crc kubenswrapper[4764]: I0309 13:25:59.856124 4764 scope.go:117] "RemoveContainer" containerID="8e85cc77a9a235fb4dc23dc9451690c29cdcf977870a98d1bc077bac0fa992d1" Mar 09 13:25:59 crc kubenswrapper[4764]: I0309 13:25:59.869759 4764 scope.go:117] "RemoveContainer" containerID="b884af634d32c12476d7f597eeeb93dfd054afb59803fed8ff12daa9129069bc" Mar 09 13:25:59 crc kubenswrapper[4764]: E0309 13:25:59.870022 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b884af634d32c12476d7f597eeeb93dfd054afb59803fed8ff12daa9129069bc\": container with ID starting with 
b884af634d32c12476d7f597eeeb93dfd054afb59803fed8ff12daa9129069bc not found: ID does not exist" containerID="b884af634d32c12476d7f597eeeb93dfd054afb59803fed8ff12daa9129069bc" Mar 09 13:25:59 crc kubenswrapper[4764]: I0309 13:25:59.870056 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b884af634d32c12476d7f597eeeb93dfd054afb59803fed8ff12daa9129069bc"} err="failed to get container status \"b884af634d32c12476d7f597eeeb93dfd054afb59803fed8ff12daa9129069bc\": rpc error: code = NotFound desc = could not find container \"b884af634d32c12476d7f597eeeb93dfd054afb59803fed8ff12daa9129069bc\": container with ID starting with b884af634d32c12476d7f597eeeb93dfd054afb59803fed8ff12daa9129069bc not found: ID does not exist" Mar 09 13:25:59 crc kubenswrapper[4764]: I0309 13:25:59.870081 4764 scope.go:117] "RemoveContainer" containerID="67daa9bd8128811d30d5710b782098e52f919063831113dfb444335915817615" Mar 09 13:25:59 crc kubenswrapper[4764]: E0309 13:25:59.877155 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"67daa9bd8128811d30d5710b782098e52f919063831113dfb444335915817615\": container with ID starting with 67daa9bd8128811d30d5710b782098e52f919063831113dfb444335915817615 not found: ID does not exist" containerID="67daa9bd8128811d30d5710b782098e52f919063831113dfb444335915817615" Mar 09 13:25:59 crc kubenswrapper[4764]: I0309 13:25:59.877960 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"67daa9bd8128811d30d5710b782098e52f919063831113dfb444335915817615"} err="failed to get container status \"67daa9bd8128811d30d5710b782098e52f919063831113dfb444335915817615\": rpc error: code = NotFound desc = could not find container \"67daa9bd8128811d30d5710b782098e52f919063831113dfb444335915817615\": container with ID starting with 67daa9bd8128811d30d5710b782098e52f919063831113dfb444335915817615 not found: ID does not 
exist" Mar 09 13:25:59 crc kubenswrapper[4764]: I0309 13:25:59.878083 4764 scope.go:117] "RemoveContainer" containerID="8e85cc77a9a235fb4dc23dc9451690c29cdcf977870a98d1bc077bac0fa992d1" Mar 09 13:25:59 crc kubenswrapper[4764]: E0309 13:25:59.878622 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8e85cc77a9a235fb4dc23dc9451690c29cdcf977870a98d1bc077bac0fa992d1\": container with ID starting with 8e85cc77a9a235fb4dc23dc9451690c29cdcf977870a98d1bc077bac0fa992d1 not found: ID does not exist" containerID="8e85cc77a9a235fb4dc23dc9451690c29cdcf977870a98d1bc077bac0fa992d1" Mar 09 13:25:59 crc kubenswrapper[4764]: I0309 13:25:59.878668 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8e85cc77a9a235fb4dc23dc9451690c29cdcf977870a98d1bc077bac0fa992d1"} err="failed to get container status \"8e85cc77a9a235fb4dc23dc9451690c29cdcf977870a98d1bc077bac0fa992d1\": rpc error: code = NotFound desc = could not find container \"8e85cc77a9a235fb4dc23dc9451690c29cdcf977870a98d1bc077bac0fa992d1\": container with ID starting with 8e85cc77a9a235fb4dc23dc9451690c29cdcf977870a98d1bc077bac0fa992d1 not found: ID does not exist" Mar 09 13:26:00 crc kubenswrapper[4764]: I0309 13:26:00.139009 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29551046-ll87d"] Mar 09 13:26:00 crc kubenswrapper[4764]: E0309 13:26:00.139486 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="20acdcb5-ea78-435e-b472-e102d5553c75" containerName="extract-content" Mar 09 13:26:00 crc kubenswrapper[4764]: I0309 13:26:00.139563 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="20acdcb5-ea78-435e-b472-e102d5553c75" containerName="extract-content" Mar 09 13:26:00 crc kubenswrapper[4764]: E0309 13:26:00.139628 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a76121be-d090-4f2a-9e57-1a160a4bb4f2" 
containerName="extract-utilities" Mar 09 13:26:00 crc kubenswrapper[4764]: I0309 13:26:00.139719 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="a76121be-d090-4f2a-9e57-1a160a4bb4f2" containerName="extract-utilities" Mar 09 13:26:00 crc kubenswrapper[4764]: E0309 13:26:00.139791 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a76121be-d090-4f2a-9e57-1a160a4bb4f2" containerName="registry-server" Mar 09 13:26:00 crc kubenswrapper[4764]: I0309 13:26:00.139846 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="a76121be-d090-4f2a-9e57-1a160a4bb4f2" containerName="registry-server" Mar 09 13:26:00 crc kubenswrapper[4764]: E0309 13:26:00.139899 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="20acdcb5-ea78-435e-b472-e102d5553c75" containerName="registry-server" Mar 09 13:26:00 crc kubenswrapper[4764]: I0309 13:26:00.140038 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="20acdcb5-ea78-435e-b472-e102d5553c75" containerName="registry-server" Mar 09 13:26:00 crc kubenswrapper[4764]: E0309 13:26:00.140103 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="20acdcb5-ea78-435e-b472-e102d5553c75" containerName="extract-utilities" Mar 09 13:26:00 crc kubenswrapper[4764]: I0309 13:26:00.140155 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="20acdcb5-ea78-435e-b472-e102d5553c75" containerName="extract-utilities" Mar 09 13:26:00 crc kubenswrapper[4764]: E0309 13:26:00.140208 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a76121be-d090-4f2a-9e57-1a160a4bb4f2" containerName="extract-content" Mar 09 13:26:00 crc kubenswrapper[4764]: I0309 13:26:00.140258 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="a76121be-d090-4f2a-9e57-1a160a4bb4f2" containerName="extract-content" Mar 09 13:26:00 crc kubenswrapper[4764]: E0309 13:26:00.140314 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0a005f65-920a-4cdd-b4da-a270953113aa" 
containerName="oc" Mar 09 13:26:00 crc kubenswrapper[4764]: I0309 13:26:00.140375 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a005f65-920a-4cdd-b4da-a270953113aa" containerName="oc" Mar 09 13:26:00 crc kubenswrapper[4764]: I0309 13:26:00.140538 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="20acdcb5-ea78-435e-b472-e102d5553c75" containerName="registry-server" Mar 09 13:26:00 crc kubenswrapper[4764]: I0309 13:26:00.140597 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="0a005f65-920a-4cdd-b4da-a270953113aa" containerName="oc" Mar 09 13:26:00 crc kubenswrapper[4764]: I0309 13:26:00.140683 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="a76121be-d090-4f2a-9e57-1a160a4bb4f2" containerName="registry-server" Mar 09 13:26:00 crc kubenswrapper[4764]: I0309 13:26:00.141153 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551046-ll87d" Mar 09 13:26:00 crc kubenswrapper[4764]: I0309 13:26:00.143455 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-4s7mr" Mar 09 13:26:00 crc kubenswrapper[4764]: I0309 13:26:00.143695 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 09 13:26:00 crc kubenswrapper[4764]: I0309 13:26:00.144914 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 09 13:26:00 crc kubenswrapper[4764]: I0309 13:26:00.148197 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551046-ll87d"] Mar 09 13:26:00 crc kubenswrapper[4764]: I0309 13:26:00.208879 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-msj62\" (UniqueName: \"kubernetes.io/projected/c09230f9-b117-44a0-b3ed-ab6dc7ce0285-kube-api-access-msj62\") pod 
\"auto-csr-approver-29551046-ll87d\" (UID: \"c09230f9-b117-44a0-b3ed-ab6dc7ce0285\") " pod="openshift-infra/auto-csr-approver-29551046-ll87d" Mar 09 13:26:00 crc kubenswrapper[4764]: I0309 13:26:00.310525 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-msj62\" (UniqueName: \"kubernetes.io/projected/c09230f9-b117-44a0-b3ed-ab6dc7ce0285-kube-api-access-msj62\") pod \"auto-csr-approver-29551046-ll87d\" (UID: \"c09230f9-b117-44a0-b3ed-ab6dc7ce0285\") " pod="openshift-infra/auto-csr-approver-29551046-ll87d" Mar 09 13:26:00 crc kubenswrapper[4764]: I0309 13:26:00.345855 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-msj62\" (UniqueName: \"kubernetes.io/projected/c09230f9-b117-44a0-b3ed-ab6dc7ce0285-kube-api-access-msj62\") pod \"auto-csr-approver-29551046-ll87d\" (UID: \"c09230f9-b117-44a0-b3ed-ab6dc7ce0285\") " pod="openshift-infra/auto-csr-approver-29551046-ll87d" Mar 09 13:26:00 crc kubenswrapper[4764]: I0309 13:26:00.460189 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551046-ll87d" Mar 09 13:26:00 crc kubenswrapper[4764]: I0309 13:26:00.869341 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551046-ll87d"] Mar 09 13:26:01 crc kubenswrapper[4764]: I0309 13:26:01.569151 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20acdcb5-ea78-435e-b472-e102d5553c75" path="/var/lib/kubelet/pods/20acdcb5-ea78-435e-b472-e102d5553c75/volumes" Mar 09 13:26:01 crc kubenswrapper[4764]: I0309 13:26:01.569945 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a76121be-d090-4f2a-9e57-1a160a4bb4f2" path="/var/lib/kubelet/pods/a76121be-d090-4f2a-9e57-1a160a4bb4f2/volumes" Mar 09 13:26:01 crc kubenswrapper[4764]: I0309 13:26:01.769813 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551046-ll87d" event={"ID":"c09230f9-b117-44a0-b3ed-ab6dc7ce0285","Type":"ContainerStarted","Data":"f8624d20b131f960061486b5cf6a38954f017f1461bb66469047b83c408d5f71"} Mar 09 13:26:01 crc kubenswrapper[4764]: I0309 13:26:01.782946 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-7c9d566569-xmnfz"] Mar 09 13:26:01 crc kubenswrapper[4764]: I0309 13:26:01.783223 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-7c9d566569-xmnfz" podUID="8e222771-a709-459f-a36f-e44f4b87983e" containerName="controller-manager" containerID="cri-o://2fc672eabcdfde188584b8ded74f5c4e2f917baeb322acb79784b8d6c120dfee" gracePeriod=30 Mar 09 13:26:01 crc kubenswrapper[4764]: I0309 13:26:01.888219 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-fb48676f5-nvfnz"] Mar 09 13:26:01 crc kubenswrapper[4764]: I0309 13:26:01.888494 4764 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-route-controller-manager/route-controller-manager-fb48676f5-nvfnz" podUID="32312778-9c44-4843-a588-5fba60384e05" containerName="route-controller-manager" containerID="cri-o://759cbe04f032d8df08a612b05e244889018e14ae649430d1d9c9f8c95485aa5d" gracePeriod=30 Mar 09 13:26:02 crc kubenswrapper[4764]: I0309 13:26:02.397766 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7c9d566569-xmnfz" Mar 09 13:26:02 crc kubenswrapper[4764]: I0309 13:26:02.401773 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-fb48676f5-nvfnz" Mar 09 13:26:02 crc kubenswrapper[4764]: I0309 13:26:02.439084 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8e222771-a709-459f-a36f-e44f4b87983e-client-ca\") pod \"8e222771-a709-459f-a36f-e44f4b87983e\" (UID: \"8e222771-a709-459f-a36f-e44f4b87983e\") " Mar 09 13:26:02 crc kubenswrapper[4764]: I0309 13:26:02.439134 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/32312778-9c44-4843-a588-5fba60384e05-serving-cert\") pod \"32312778-9c44-4843-a588-5fba60384e05\" (UID: \"32312778-9c44-4843-a588-5fba60384e05\") " Mar 09 13:26:02 crc kubenswrapper[4764]: I0309 13:26:02.439189 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4l6gj\" (UniqueName: \"kubernetes.io/projected/32312778-9c44-4843-a588-5fba60384e05-kube-api-access-4l6gj\") pod \"32312778-9c44-4843-a588-5fba60384e05\" (UID: \"32312778-9c44-4843-a588-5fba60384e05\") " Mar 09 13:26:02 crc kubenswrapper[4764]: I0309 13:26:02.439219 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/32312778-9c44-4843-a588-5fba60384e05-config\") pod \"32312778-9c44-4843-a588-5fba60384e05\" (UID: \"32312778-9c44-4843-a588-5fba60384e05\") " Mar 09 13:26:02 crc kubenswrapper[4764]: I0309 13:26:02.439242 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/8e222771-a709-459f-a36f-e44f4b87983e-proxy-ca-bundles\") pod \"8e222771-a709-459f-a36f-e44f4b87983e\" (UID: \"8e222771-a709-459f-a36f-e44f4b87983e\") " Mar 09 13:26:02 crc kubenswrapper[4764]: I0309 13:26:02.439266 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8e222771-a709-459f-a36f-e44f4b87983e-serving-cert\") pod \"8e222771-a709-459f-a36f-e44f4b87983e\" (UID: \"8e222771-a709-459f-a36f-e44f4b87983e\") " Mar 09 13:26:02 crc kubenswrapper[4764]: I0309 13:26:02.439290 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/32312778-9c44-4843-a588-5fba60384e05-client-ca\") pod \"32312778-9c44-4843-a588-5fba60384e05\" (UID: \"32312778-9c44-4843-a588-5fba60384e05\") " Mar 09 13:26:02 crc kubenswrapper[4764]: I0309 13:26:02.439319 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7bf7k\" (UniqueName: \"kubernetes.io/projected/8e222771-a709-459f-a36f-e44f4b87983e-kube-api-access-7bf7k\") pod \"8e222771-a709-459f-a36f-e44f4b87983e\" (UID: \"8e222771-a709-459f-a36f-e44f4b87983e\") " Mar 09 13:26:02 crc kubenswrapper[4764]: I0309 13:26:02.439359 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8e222771-a709-459f-a36f-e44f4b87983e-config\") pod \"8e222771-a709-459f-a36f-e44f4b87983e\" (UID: \"8e222771-a709-459f-a36f-e44f4b87983e\") " Mar 09 13:26:02 crc kubenswrapper[4764]: I0309 13:26:02.440005 4764 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8e222771-a709-459f-a36f-e44f4b87983e-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "8e222771-a709-459f-a36f-e44f4b87983e" (UID: "8e222771-a709-459f-a36f-e44f4b87983e"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:26:02 crc kubenswrapper[4764]: I0309 13:26:02.440009 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/32312778-9c44-4843-a588-5fba60384e05-config" (OuterVolumeSpecName: "config") pod "32312778-9c44-4843-a588-5fba60384e05" (UID: "32312778-9c44-4843-a588-5fba60384e05"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:26:02 crc kubenswrapper[4764]: I0309 13:26:02.440173 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/32312778-9c44-4843-a588-5fba60384e05-client-ca" (OuterVolumeSpecName: "client-ca") pod "32312778-9c44-4843-a588-5fba60384e05" (UID: "32312778-9c44-4843-a588-5fba60384e05"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:26:02 crc kubenswrapper[4764]: I0309 13:26:02.440245 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8e222771-a709-459f-a36f-e44f4b87983e-client-ca" (OuterVolumeSpecName: "client-ca") pod "8e222771-a709-459f-a36f-e44f4b87983e" (UID: "8e222771-a709-459f-a36f-e44f4b87983e"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:26:02 crc kubenswrapper[4764]: I0309 13:26:02.441557 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8e222771-a709-459f-a36f-e44f4b87983e-config" (OuterVolumeSpecName: "config") pod "8e222771-a709-459f-a36f-e44f4b87983e" (UID: "8e222771-a709-459f-a36f-e44f4b87983e"). 
InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:26:02 crc kubenswrapper[4764]: I0309 13:26:02.448884 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8e222771-a709-459f-a36f-e44f4b87983e-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8e222771-a709-459f-a36f-e44f4b87983e" (UID: "8e222771-a709-459f-a36f-e44f4b87983e"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:26:02 crc kubenswrapper[4764]: I0309 13:26:02.449779 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/32312778-9c44-4843-a588-5fba60384e05-kube-api-access-4l6gj" (OuterVolumeSpecName: "kube-api-access-4l6gj") pod "32312778-9c44-4843-a588-5fba60384e05" (UID: "32312778-9c44-4843-a588-5fba60384e05"). InnerVolumeSpecName "kube-api-access-4l6gj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:26:02 crc kubenswrapper[4764]: I0309 13:26:02.451788 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/32312778-9c44-4843-a588-5fba60384e05-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "32312778-9c44-4843-a588-5fba60384e05" (UID: "32312778-9c44-4843-a588-5fba60384e05"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:26:02 crc kubenswrapper[4764]: I0309 13:26:02.451820 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8e222771-a709-459f-a36f-e44f4b87983e-kube-api-access-7bf7k" (OuterVolumeSpecName: "kube-api-access-7bf7k") pod "8e222771-a709-459f-a36f-e44f4b87983e" (UID: "8e222771-a709-459f-a36f-e44f4b87983e"). InnerVolumeSpecName "kube-api-access-7bf7k". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:26:02 crc kubenswrapper[4764]: I0309 13:26:02.540122 4764 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/32312778-9c44-4843-a588-5fba60384e05-client-ca\") on node \"crc\" DevicePath \"\"" Mar 09 13:26:02 crc kubenswrapper[4764]: I0309 13:26:02.540156 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7bf7k\" (UniqueName: \"kubernetes.io/projected/8e222771-a709-459f-a36f-e44f4b87983e-kube-api-access-7bf7k\") on node \"crc\" DevicePath \"\"" Mar 09 13:26:02 crc kubenswrapper[4764]: I0309 13:26:02.540168 4764 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8e222771-a709-459f-a36f-e44f4b87983e-config\") on node \"crc\" DevicePath \"\"" Mar 09 13:26:02 crc kubenswrapper[4764]: I0309 13:26:02.540179 4764 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8e222771-a709-459f-a36f-e44f4b87983e-client-ca\") on node \"crc\" DevicePath \"\"" Mar 09 13:26:02 crc kubenswrapper[4764]: I0309 13:26:02.540191 4764 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/32312778-9c44-4843-a588-5fba60384e05-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 09 13:26:02 crc kubenswrapper[4764]: I0309 13:26:02.540200 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4l6gj\" (UniqueName: \"kubernetes.io/projected/32312778-9c44-4843-a588-5fba60384e05-kube-api-access-4l6gj\") on node \"crc\" DevicePath \"\"" Mar 09 13:26:02 crc kubenswrapper[4764]: I0309 13:26:02.540210 4764 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/32312778-9c44-4843-a588-5fba60384e05-config\") on node \"crc\" DevicePath \"\"" Mar 09 13:26:02 crc kubenswrapper[4764]: I0309 13:26:02.540220 4764 reconciler_common.go:293] 
"Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/8e222771-a709-459f-a36f-e44f4b87983e-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 09 13:26:02 crc kubenswrapper[4764]: I0309 13:26:02.540230 4764 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8e222771-a709-459f-a36f-e44f4b87983e-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 09 13:26:02 crc kubenswrapper[4764]: I0309 13:26:02.660236 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-d9z59"] Mar 09 13:26:02 crc kubenswrapper[4764]: I0309 13:26:02.776973 4764 generic.go:334] "Generic (PLEG): container finished" podID="c09230f9-b117-44a0-b3ed-ab6dc7ce0285" containerID="bde3db10cbf6b95804c8c69e0b70c34f476ab92c998e4a7ae6079e0e0e9d7b98" exitCode=0 Mar 09 13:26:02 crc kubenswrapper[4764]: I0309 13:26:02.777089 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551046-ll87d" event={"ID":"c09230f9-b117-44a0-b3ed-ab6dc7ce0285","Type":"ContainerDied","Data":"bde3db10cbf6b95804c8c69e0b70c34f476ab92c998e4a7ae6079e0e0e9d7b98"} Mar 09 13:26:02 crc kubenswrapper[4764]: I0309 13:26:02.778999 4764 generic.go:334] "Generic (PLEG): container finished" podID="8e222771-a709-459f-a36f-e44f4b87983e" containerID="2fc672eabcdfde188584b8ded74f5c4e2f917baeb322acb79784b8d6c120dfee" exitCode=0 Mar 09 13:26:02 crc kubenswrapper[4764]: I0309 13:26:02.779052 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7c9d566569-xmnfz" event={"ID":"8e222771-a709-459f-a36f-e44f4b87983e","Type":"ContainerDied","Data":"2fc672eabcdfde188584b8ded74f5c4e2f917baeb322acb79784b8d6c120dfee"} Mar 09 13:26:02 crc kubenswrapper[4764]: I0309 13:26:02.779090 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7c9d566569-xmnfz" 
event={"ID":"8e222771-a709-459f-a36f-e44f4b87983e","Type":"ContainerDied","Data":"bc74e94f60e5ca1dd91e48b51c639a01e86dbc1382d15f18a1aac3c35e28f4ad"} Mar 09 13:26:02 crc kubenswrapper[4764]: I0309 13:26:02.779078 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7c9d566569-xmnfz" Mar 09 13:26:02 crc kubenswrapper[4764]: I0309 13:26:02.779117 4764 scope.go:117] "RemoveContainer" containerID="2fc672eabcdfde188584b8ded74f5c4e2f917baeb322acb79784b8d6c120dfee" Mar 09 13:26:02 crc kubenswrapper[4764]: I0309 13:26:02.780805 4764 generic.go:334] "Generic (PLEG): container finished" podID="32312778-9c44-4843-a588-5fba60384e05" containerID="759cbe04f032d8df08a612b05e244889018e14ae649430d1d9c9f8c95485aa5d" exitCode=0 Mar 09 13:26:02 crc kubenswrapper[4764]: I0309 13:26:02.781052 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-d9z59" podUID="c6dd4a8c-608d-44b4-a42c-1cc8b7f9fec2" containerName="registry-server" containerID="cri-o://ffc2a9491a6276ce817db9e39e528e75cd29f3eedc77f220c7dae09ad9a1c62b" gracePeriod=2 Mar 09 13:26:02 crc kubenswrapper[4764]: I0309 13:26:02.781435 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-fb48676f5-nvfnz" Mar 09 13:26:02 crc kubenswrapper[4764]: I0309 13:26:02.782204 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-fb48676f5-nvfnz" event={"ID":"32312778-9c44-4843-a588-5fba60384e05","Type":"ContainerDied","Data":"759cbe04f032d8df08a612b05e244889018e14ae649430d1d9c9f8c95485aa5d"} Mar 09 13:26:02 crc kubenswrapper[4764]: I0309 13:26:02.782264 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-fb48676f5-nvfnz" event={"ID":"32312778-9c44-4843-a588-5fba60384e05","Type":"ContainerDied","Data":"6264edd6e509ddf66049653028a6e5a99b8ff3fab370367781f6b4c2c4544a37"} Mar 09 13:26:02 crc kubenswrapper[4764]: I0309 13:26:02.800657 4764 scope.go:117] "RemoveContainer" containerID="2fc672eabcdfde188584b8ded74f5c4e2f917baeb322acb79784b8d6c120dfee" Mar 09 13:26:02 crc kubenswrapper[4764]: E0309 13:26:02.801016 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2fc672eabcdfde188584b8ded74f5c4e2f917baeb322acb79784b8d6c120dfee\": container with ID starting with 2fc672eabcdfde188584b8ded74f5c4e2f917baeb322acb79784b8d6c120dfee not found: ID does not exist" containerID="2fc672eabcdfde188584b8ded74f5c4e2f917baeb322acb79784b8d6c120dfee" Mar 09 13:26:02 crc kubenswrapper[4764]: I0309 13:26:02.801048 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2fc672eabcdfde188584b8ded74f5c4e2f917baeb322acb79784b8d6c120dfee"} err="failed to get container status \"2fc672eabcdfde188584b8ded74f5c4e2f917baeb322acb79784b8d6c120dfee\": rpc error: code = NotFound desc = could not find container \"2fc672eabcdfde188584b8ded74f5c4e2f917baeb322acb79784b8d6c120dfee\": container with ID starting with 
2fc672eabcdfde188584b8ded74f5c4e2f917baeb322acb79784b8d6c120dfee not found: ID does not exist" Mar 09 13:26:02 crc kubenswrapper[4764]: I0309 13:26:02.801077 4764 scope.go:117] "RemoveContainer" containerID="759cbe04f032d8df08a612b05e244889018e14ae649430d1d9c9f8c95485aa5d" Mar 09 13:26:02 crc kubenswrapper[4764]: I0309 13:26:02.823732 4764 scope.go:117] "RemoveContainer" containerID="759cbe04f032d8df08a612b05e244889018e14ae649430d1d9c9f8c95485aa5d" Mar 09 13:26:02 crc kubenswrapper[4764]: I0309 13:26:02.825751 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-fb48676f5-nvfnz"] Mar 09 13:26:02 crc kubenswrapper[4764]: E0309 13:26:02.825958 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"759cbe04f032d8df08a612b05e244889018e14ae649430d1d9c9f8c95485aa5d\": container with ID starting with 759cbe04f032d8df08a612b05e244889018e14ae649430d1d9c9f8c95485aa5d not found: ID does not exist" containerID="759cbe04f032d8df08a612b05e244889018e14ae649430d1d9c9f8c95485aa5d" Mar 09 13:26:02 crc kubenswrapper[4764]: I0309 13:26:02.826047 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"759cbe04f032d8df08a612b05e244889018e14ae649430d1d9c9f8c95485aa5d"} err="failed to get container status \"759cbe04f032d8df08a612b05e244889018e14ae649430d1d9c9f8c95485aa5d\": rpc error: code = NotFound desc = could not find container \"759cbe04f032d8df08a612b05e244889018e14ae649430d1d9c9f8c95485aa5d\": container with ID starting with 759cbe04f032d8df08a612b05e244889018e14ae649430d1d9c9f8c95485aa5d not found: ID does not exist" Mar 09 13:26:02 crc kubenswrapper[4764]: I0309 13:26:02.831266 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-fb48676f5-nvfnz"] Mar 09 13:26:02 crc kubenswrapper[4764]: I0309 13:26:02.835577 4764 kubelet.go:2437] 
"SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-7c9d566569-xmnfz"] Mar 09 13:26:02 crc kubenswrapper[4764]: I0309 13:26:02.839430 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-7c9d566569-xmnfz"] Mar 09 13:26:03 crc kubenswrapper[4764]: I0309 13:26:03.138741 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-549f4d59df-4zh55"] Mar 09 13:26:03 crc kubenswrapper[4764]: E0309 13:26:03.139010 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="32312778-9c44-4843-a588-5fba60384e05" containerName="route-controller-manager" Mar 09 13:26:03 crc kubenswrapper[4764]: I0309 13:26:03.139038 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="32312778-9c44-4843-a588-5fba60384e05" containerName="route-controller-manager" Mar 09 13:26:03 crc kubenswrapper[4764]: E0309 13:26:03.139064 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e222771-a709-459f-a36f-e44f4b87983e" containerName="controller-manager" Mar 09 13:26:03 crc kubenswrapper[4764]: I0309 13:26:03.139074 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e222771-a709-459f-a36f-e44f4b87983e" containerName="controller-manager" Mar 09 13:26:03 crc kubenswrapper[4764]: I0309 13:26:03.139198 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="8e222771-a709-459f-a36f-e44f4b87983e" containerName="controller-manager" Mar 09 13:26:03 crc kubenswrapper[4764]: I0309 13:26:03.139222 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="32312778-9c44-4843-a588-5fba60384e05" containerName="route-controller-manager" Mar 09 13:26:03 crc kubenswrapper[4764]: I0309 13:26:03.139688 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-59846f5c55-j5dz6"] Mar 09 13:26:03 crc kubenswrapper[4764]: I0309 13:26:03.139965 4764 util.go:30] 
"No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-549f4d59df-4zh55" Mar 09 13:26:03 crc kubenswrapper[4764]: I0309 13:26:03.140313 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-59846f5c55-j5dz6" Mar 09 13:26:03 crc kubenswrapper[4764]: I0309 13:26:03.142597 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 09 13:26:03 crc kubenswrapper[4764]: I0309 13:26:03.144004 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 09 13:26:03 crc kubenswrapper[4764]: I0309 13:26:03.144042 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 09 13:26:03 crc kubenswrapper[4764]: I0309 13:26:03.144112 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 09 13:26:03 crc kubenswrapper[4764]: I0309 13:26:03.144141 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 09 13:26:03 crc kubenswrapper[4764]: I0309 13:26:03.144208 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 09 13:26:03 crc kubenswrapper[4764]: I0309 13:26:03.144369 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 09 13:26:03 crc kubenswrapper[4764]: I0309 13:26:03.144583 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 09 13:26:03 crc kubenswrapper[4764]: I0309 13:26:03.144763 4764 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-route-controller-manager"/"client-ca" Mar 09 13:26:03 crc kubenswrapper[4764]: I0309 13:26:03.144905 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 09 13:26:03 crc kubenswrapper[4764]: I0309 13:26:03.145396 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 09 13:26:03 crc kubenswrapper[4764]: I0309 13:26:03.146105 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 09 13:26:03 crc kubenswrapper[4764]: I0309 13:26:03.147556 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/76db3a34-f290-4c40-892c-f22642bae846-config\") pod \"controller-manager-549f4d59df-4zh55\" (UID: \"76db3a34-f290-4c40-892c-f22642bae846\") " pod="openshift-controller-manager/controller-manager-549f4d59df-4zh55" Mar 09 13:26:03 crc kubenswrapper[4764]: I0309 13:26:03.147612 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/76db3a34-f290-4c40-892c-f22642bae846-proxy-ca-bundles\") pod \"controller-manager-549f4d59df-4zh55\" (UID: \"76db3a34-f290-4c40-892c-f22642bae846\") " pod="openshift-controller-manager/controller-manager-549f4d59df-4zh55" Mar 09 13:26:03 crc kubenswrapper[4764]: I0309 13:26:03.147674 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/76db3a34-f290-4c40-892c-f22642bae846-serving-cert\") pod \"controller-manager-549f4d59df-4zh55\" (UID: \"76db3a34-f290-4c40-892c-f22642bae846\") " pod="openshift-controller-manager/controller-manager-549f4d59df-4zh55" Mar 09 13:26:03 crc kubenswrapper[4764]: I0309 13:26:03.147725 4764 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pnsmm\" (UniqueName: \"kubernetes.io/projected/037afebc-2339-4b7b-ba28-45bd9d6e949e-kube-api-access-pnsmm\") pod \"route-controller-manager-59846f5c55-j5dz6\" (UID: \"037afebc-2339-4b7b-ba28-45bd9d6e949e\") " pod="openshift-route-controller-manager/route-controller-manager-59846f5c55-j5dz6" Mar 09 13:26:03 crc kubenswrapper[4764]: I0309 13:26:03.147765 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/037afebc-2339-4b7b-ba28-45bd9d6e949e-config\") pod \"route-controller-manager-59846f5c55-j5dz6\" (UID: \"037afebc-2339-4b7b-ba28-45bd9d6e949e\") " pod="openshift-route-controller-manager/route-controller-manager-59846f5c55-j5dz6" Mar 09 13:26:03 crc kubenswrapper[4764]: I0309 13:26:03.147788 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v7mw2\" (UniqueName: \"kubernetes.io/projected/76db3a34-f290-4c40-892c-f22642bae846-kube-api-access-v7mw2\") pod \"controller-manager-549f4d59df-4zh55\" (UID: \"76db3a34-f290-4c40-892c-f22642bae846\") " pod="openshift-controller-manager/controller-manager-549f4d59df-4zh55" Mar 09 13:26:03 crc kubenswrapper[4764]: I0309 13:26:03.147803 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/037afebc-2339-4b7b-ba28-45bd9d6e949e-serving-cert\") pod \"route-controller-manager-59846f5c55-j5dz6\" (UID: \"037afebc-2339-4b7b-ba28-45bd9d6e949e\") " pod="openshift-route-controller-manager/route-controller-manager-59846f5c55-j5dz6" Mar 09 13:26:03 crc kubenswrapper[4764]: I0309 13:26:03.147824 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/037afebc-2339-4b7b-ba28-45bd9d6e949e-client-ca\") pod \"route-controller-manager-59846f5c55-j5dz6\" (UID: \"037afebc-2339-4b7b-ba28-45bd9d6e949e\") " pod="openshift-route-controller-manager/route-controller-manager-59846f5c55-j5dz6" Mar 09 13:26:03 crc kubenswrapper[4764]: I0309 13:26:03.147843 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/76db3a34-f290-4c40-892c-f22642bae846-client-ca\") pod \"controller-manager-549f4d59df-4zh55\" (UID: \"76db3a34-f290-4c40-892c-f22642bae846\") " pod="openshift-controller-manager/controller-manager-549f4d59df-4zh55" Mar 09 13:26:03 crc kubenswrapper[4764]: I0309 13:26:03.152600 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 09 13:26:03 crc kubenswrapper[4764]: I0309 13:26:03.175074 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-59846f5c55-j5dz6"] Mar 09 13:26:03 crc kubenswrapper[4764]: I0309 13:26:03.181517 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-d9z59" Mar 09 13:26:03 crc kubenswrapper[4764]: I0309 13:26:03.197882 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-549f4d59df-4zh55"] Mar 09 13:26:03 crc kubenswrapper[4764]: I0309 13:26:03.248780 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c6dd4a8c-608d-44b4-a42c-1cc8b7f9fec2-catalog-content\") pod \"c6dd4a8c-608d-44b4-a42c-1cc8b7f9fec2\" (UID: \"c6dd4a8c-608d-44b4-a42c-1cc8b7f9fec2\") " Mar 09 13:26:03 crc kubenswrapper[4764]: I0309 13:26:03.248878 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c6dd4a8c-608d-44b4-a42c-1cc8b7f9fec2-utilities\") pod \"c6dd4a8c-608d-44b4-a42c-1cc8b7f9fec2\" (UID: \"c6dd4a8c-608d-44b4-a42c-1cc8b7f9fec2\") " Mar 09 13:26:03 crc kubenswrapper[4764]: I0309 13:26:03.248997 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/76db3a34-f290-4c40-892c-f22642bae846-config\") pod \"controller-manager-549f4d59df-4zh55\" (UID: \"76db3a34-f290-4c40-892c-f22642bae846\") " pod="openshift-controller-manager/controller-manager-549f4d59df-4zh55" Mar 09 13:26:03 crc kubenswrapper[4764]: I0309 13:26:03.249068 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/76db3a34-f290-4c40-892c-f22642bae846-proxy-ca-bundles\") pod \"controller-manager-549f4d59df-4zh55\" (UID: \"76db3a34-f290-4c40-892c-f22642bae846\") " pod="openshift-controller-manager/controller-manager-549f4d59df-4zh55" Mar 09 13:26:03 crc kubenswrapper[4764]: I0309 13:26:03.249132 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/76db3a34-f290-4c40-892c-f22642bae846-serving-cert\") pod \"controller-manager-549f4d59df-4zh55\" (UID: \"76db3a34-f290-4c40-892c-f22642bae846\") " pod="openshift-controller-manager/controller-manager-549f4d59df-4zh55" Mar 09 13:26:03 crc kubenswrapper[4764]: I0309 13:26:03.249183 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pnsmm\" (UniqueName: \"kubernetes.io/projected/037afebc-2339-4b7b-ba28-45bd9d6e949e-kube-api-access-pnsmm\") pod \"route-controller-manager-59846f5c55-j5dz6\" (UID: \"037afebc-2339-4b7b-ba28-45bd9d6e949e\") " pod="openshift-route-controller-manager/route-controller-manager-59846f5c55-j5dz6" Mar 09 13:26:03 crc kubenswrapper[4764]: I0309 13:26:03.249235 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/037afebc-2339-4b7b-ba28-45bd9d6e949e-config\") pod \"route-controller-manager-59846f5c55-j5dz6\" (UID: \"037afebc-2339-4b7b-ba28-45bd9d6e949e\") " pod="openshift-route-controller-manager/route-controller-manager-59846f5c55-j5dz6" Mar 09 13:26:03 crc kubenswrapper[4764]: I0309 13:26:03.249268 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v7mw2\" (UniqueName: \"kubernetes.io/projected/76db3a34-f290-4c40-892c-f22642bae846-kube-api-access-v7mw2\") pod \"controller-manager-549f4d59df-4zh55\" (UID: \"76db3a34-f290-4c40-892c-f22642bae846\") " pod="openshift-controller-manager/controller-manager-549f4d59df-4zh55" Mar 09 13:26:03 crc kubenswrapper[4764]: I0309 13:26:03.249298 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/037afebc-2339-4b7b-ba28-45bd9d6e949e-serving-cert\") pod \"route-controller-manager-59846f5c55-j5dz6\" (UID: \"037afebc-2339-4b7b-ba28-45bd9d6e949e\") " pod="openshift-route-controller-manager/route-controller-manager-59846f5c55-j5dz6" Mar 09 
13:26:03 crc kubenswrapper[4764]: I0309 13:26:03.249340 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/037afebc-2339-4b7b-ba28-45bd9d6e949e-client-ca\") pod \"route-controller-manager-59846f5c55-j5dz6\" (UID: \"037afebc-2339-4b7b-ba28-45bd9d6e949e\") " pod="openshift-route-controller-manager/route-controller-manager-59846f5c55-j5dz6" Mar 09 13:26:03 crc kubenswrapper[4764]: I0309 13:26:03.249373 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/76db3a34-f290-4c40-892c-f22642bae846-client-ca\") pod \"controller-manager-549f4d59df-4zh55\" (UID: \"76db3a34-f290-4c40-892c-f22642bae846\") " pod="openshift-controller-manager/controller-manager-549f4d59df-4zh55" Mar 09 13:26:03 crc kubenswrapper[4764]: I0309 13:26:03.251199 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/76db3a34-f290-4c40-892c-f22642bae846-proxy-ca-bundles\") pod \"controller-manager-549f4d59df-4zh55\" (UID: \"76db3a34-f290-4c40-892c-f22642bae846\") " pod="openshift-controller-manager/controller-manager-549f4d59df-4zh55" Mar 09 13:26:03 crc kubenswrapper[4764]: I0309 13:26:03.251418 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/037afebc-2339-4b7b-ba28-45bd9d6e949e-client-ca\") pod \"route-controller-manager-59846f5c55-j5dz6\" (UID: \"037afebc-2339-4b7b-ba28-45bd9d6e949e\") " pod="openshift-route-controller-manager/route-controller-manager-59846f5c55-j5dz6" Mar 09 13:26:03 crc kubenswrapper[4764]: I0309 13:26:03.251557 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/037afebc-2339-4b7b-ba28-45bd9d6e949e-config\") pod \"route-controller-manager-59846f5c55-j5dz6\" (UID: 
\"037afebc-2339-4b7b-ba28-45bd9d6e949e\") " pod="openshift-route-controller-manager/route-controller-manager-59846f5c55-j5dz6" Mar 09 13:26:03 crc kubenswrapper[4764]: I0309 13:26:03.251613 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/76db3a34-f290-4c40-892c-f22642bae846-config\") pod \"controller-manager-549f4d59df-4zh55\" (UID: \"76db3a34-f290-4c40-892c-f22642bae846\") " pod="openshift-controller-manager/controller-manager-549f4d59df-4zh55" Mar 09 13:26:03 crc kubenswrapper[4764]: I0309 13:26:03.251894 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/76db3a34-f290-4c40-892c-f22642bae846-client-ca\") pod \"controller-manager-549f4d59df-4zh55\" (UID: \"76db3a34-f290-4c40-892c-f22642bae846\") " pod="openshift-controller-manager/controller-manager-549f4d59df-4zh55" Mar 09 13:26:03 crc kubenswrapper[4764]: I0309 13:26:03.252782 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c6dd4a8c-608d-44b4-a42c-1cc8b7f9fec2-utilities" (OuterVolumeSpecName: "utilities") pod "c6dd4a8c-608d-44b4-a42c-1cc8b7f9fec2" (UID: "c6dd4a8c-608d-44b4-a42c-1cc8b7f9fec2"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 13:26:03 crc kubenswrapper[4764]: I0309 13:26:03.255532 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/037afebc-2339-4b7b-ba28-45bd9d6e949e-serving-cert\") pod \"route-controller-manager-59846f5c55-j5dz6\" (UID: \"037afebc-2339-4b7b-ba28-45bd9d6e949e\") " pod="openshift-route-controller-manager/route-controller-manager-59846f5c55-j5dz6" Mar 09 13:26:03 crc kubenswrapper[4764]: I0309 13:26:03.255826 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/76db3a34-f290-4c40-892c-f22642bae846-serving-cert\") pod \"controller-manager-549f4d59df-4zh55\" (UID: \"76db3a34-f290-4c40-892c-f22642bae846\") " pod="openshift-controller-manager/controller-manager-549f4d59df-4zh55" Mar 09 13:26:03 crc kubenswrapper[4764]: I0309 13:26:03.267013 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v7mw2\" (UniqueName: \"kubernetes.io/projected/76db3a34-f290-4c40-892c-f22642bae846-kube-api-access-v7mw2\") pod \"controller-manager-549f4d59df-4zh55\" (UID: \"76db3a34-f290-4c40-892c-f22642bae846\") " pod="openshift-controller-manager/controller-manager-549f4d59df-4zh55" Mar 09 13:26:03 crc kubenswrapper[4764]: I0309 13:26:03.270321 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pnsmm\" (UniqueName: \"kubernetes.io/projected/037afebc-2339-4b7b-ba28-45bd9d6e949e-kube-api-access-pnsmm\") pod \"route-controller-manager-59846f5c55-j5dz6\" (UID: \"037afebc-2339-4b7b-ba28-45bd9d6e949e\") " pod="openshift-route-controller-manager/route-controller-manager-59846f5c55-j5dz6" Mar 09 13:26:03 crc kubenswrapper[4764]: I0309 13:26:03.349874 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-stkxn\" (UniqueName: 
\"kubernetes.io/projected/c6dd4a8c-608d-44b4-a42c-1cc8b7f9fec2-kube-api-access-stkxn\") pod \"c6dd4a8c-608d-44b4-a42c-1cc8b7f9fec2\" (UID: \"c6dd4a8c-608d-44b4-a42c-1cc8b7f9fec2\") " Mar 09 13:26:03 crc kubenswrapper[4764]: I0309 13:26:03.350628 4764 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c6dd4a8c-608d-44b4-a42c-1cc8b7f9fec2-utilities\") on node \"crc\" DevicePath \"\"" Mar 09 13:26:03 crc kubenswrapper[4764]: I0309 13:26:03.353406 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c6dd4a8c-608d-44b4-a42c-1cc8b7f9fec2-kube-api-access-stkxn" (OuterVolumeSpecName: "kube-api-access-stkxn") pod "c6dd4a8c-608d-44b4-a42c-1cc8b7f9fec2" (UID: "c6dd4a8c-608d-44b4-a42c-1cc8b7f9fec2"). InnerVolumeSpecName "kube-api-access-stkxn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:26:03 crc kubenswrapper[4764]: I0309 13:26:03.374310 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c6dd4a8c-608d-44b4-a42c-1cc8b7f9fec2-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c6dd4a8c-608d-44b4-a42c-1cc8b7f9fec2" (UID: "c6dd4a8c-608d-44b4-a42c-1cc8b7f9fec2"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 13:26:03 crc kubenswrapper[4764]: I0309 13:26:03.451174 4764 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c6dd4a8c-608d-44b4-a42c-1cc8b7f9fec2-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 09 13:26:03 crc kubenswrapper[4764]: I0309 13:26:03.451439 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-stkxn\" (UniqueName: \"kubernetes.io/projected/c6dd4a8c-608d-44b4-a42c-1cc8b7f9fec2-kube-api-access-stkxn\") on node \"crc\" DevicePath \"\"" Mar 09 13:26:03 crc kubenswrapper[4764]: I0309 13:26:03.490465 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-549f4d59df-4zh55" Mar 09 13:26:03 crc kubenswrapper[4764]: I0309 13:26:03.498439 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-59846f5c55-j5dz6" Mar 09 13:26:03 crc kubenswrapper[4764]: I0309 13:26:03.585314 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="32312778-9c44-4843-a588-5fba60384e05" path="/var/lib/kubelet/pods/32312778-9c44-4843-a588-5fba60384e05/volumes" Mar 09 13:26:03 crc kubenswrapper[4764]: I0309 13:26:03.590384 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8e222771-a709-459f-a36f-e44f4b87983e" path="/var/lib/kubelet/pods/8e222771-a709-459f-a36f-e44f4b87983e/volumes" Mar 09 13:26:03 crc kubenswrapper[4764]: I0309 13:26:03.789396 4764 generic.go:334] "Generic (PLEG): container finished" podID="c6dd4a8c-608d-44b4-a42c-1cc8b7f9fec2" containerID="ffc2a9491a6276ce817db9e39e528e75cd29f3eedc77f220c7dae09ad9a1c62b" exitCode=0 Mar 09 13:26:03 crc kubenswrapper[4764]: I0309 13:26:03.789500 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-d9z59" 
event={"ID":"c6dd4a8c-608d-44b4-a42c-1cc8b7f9fec2","Type":"ContainerDied","Data":"ffc2a9491a6276ce817db9e39e528e75cd29f3eedc77f220c7dae09ad9a1c62b"} Mar 09 13:26:03 crc kubenswrapper[4764]: I0309 13:26:03.789552 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-d9z59" Mar 09 13:26:03 crc kubenswrapper[4764]: I0309 13:26:03.789593 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-d9z59" event={"ID":"c6dd4a8c-608d-44b4-a42c-1cc8b7f9fec2","Type":"ContainerDied","Data":"63639f892c3d7cc35dde0976454fc20f0b0dcd4c9977b4b39ee9f80a34190631"} Mar 09 13:26:03 crc kubenswrapper[4764]: I0309 13:26:03.789620 4764 scope.go:117] "RemoveContainer" containerID="ffc2a9491a6276ce817db9e39e528e75cd29f3eedc77f220c7dae09ad9a1c62b" Mar 09 13:26:03 crc kubenswrapper[4764]: I0309 13:26:03.809908 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-d9z59"] Mar 09 13:26:03 crc kubenswrapper[4764]: I0309 13:26:03.811710 4764 scope.go:117] "RemoveContainer" containerID="b507b889d835011a58e3fb35bdc6611d2459c15d083a937632ed6f78fbcf35b2" Mar 09 13:26:03 crc kubenswrapper[4764]: I0309 13:26:03.812851 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-d9z59"] Mar 09 13:26:03 crc kubenswrapper[4764]: I0309 13:26:03.845859 4764 scope.go:117] "RemoveContainer" containerID="d8b26ed27d9d828e2d55afc5c28aac89bdf5c58d97e8a4288af402a3789e0cd2" Mar 09 13:26:03 crc kubenswrapper[4764]: I0309 13:26:03.859878 4764 scope.go:117] "RemoveContainer" containerID="ffc2a9491a6276ce817db9e39e528e75cd29f3eedc77f220c7dae09ad9a1c62b" Mar 09 13:26:03 crc kubenswrapper[4764]: E0309 13:26:03.860295 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ffc2a9491a6276ce817db9e39e528e75cd29f3eedc77f220c7dae09ad9a1c62b\": container with ID 
starting with ffc2a9491a6276ce817db9e39e528e75cd29f3eedc77f220c7dae09ad9a1c62b not found: ID does not exist" containerID="ffc2a9491a6276ce817db9e39e528e75cd29f3eedc77f220c7dae09ad9a1c62b" Mar 09 13:26:03 crc kubenswrapper[4764]: I0309 13:26:03.860340 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ffc2a9491a6276ce817db9e39e528e75cd29f3eedc77f220c7dae09ad9a1c62b"} err="failed to get container status \"ffc2a9491a6276ce817db9e39e528e75cd29f3eedc77f220c7dae09ad9a1c62b\": rpc error: code = NotFound desc = could not find container \"ffc2a9491a6276ce817db9e39e528e75cd29f3eedc77f220c7dae09ad9a1c62b\": container with ID starting with ffc2a9491a6276ce817db9e39e528e75cd29f3eedc77f220c7dae09ad9a1c62b not found: ID does not exist" Mar 09 13:26:03 crc kubenswrapper[4764]: I0309 13:26:03.860370 4764 scope.go:117] "RemoveContainer" containerID="b507b889d835011a58e3fb35bdc6611d2459c15d083a937632ed6f78fbcf35b2" Mar 09 13:26:03 crc kubenswrapper[4764]: E0309 13:26:03.860608 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b507b889d835011a58e3fb35bdc6611d2459c15d083a937632ed6f78fbcf35b2\": container with ID starting with b507b889d835011a58e3fb35bdc6611d2459c15d083a937632ed6f78fbcf35b2 not found: ID does not exist" containerID="b507b889d835011a58e3fb35bdc6611d2459c15d083a937632ed6f78fbcf35b2" Mar 09 13:26:03 crc kubenswrapper[4764]: I0309 13:26:03.860633 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b507b889d835011a58e3fb35bdc6611d2459c15d083a937632ed6f78fbcf35b2"} err="failed to get container status \"b507b889d835011a58e3fb35bdc6611d2459c15d083a937632ed6f78fbcf35b2\": rpc error: code = NotFound desc = could not find container \"b507b889d835011a58e3fb35bdc6611d2459c15d083a937632ed6f78fbcf35b2\": container with ID starting with b507b889d835011a58e3fb35bdc6611d2459c15d083a937632ed6f78fbcf35b2 not found: 
ID does not exist" Mar 09 13:26:03 crc kubenswrapper[4764]: I0309 13:26:03.860662 4764 scope.go:117] "RemoveContainer" containerID="d8b26ed27d9d828e2d55afc5c28aac89bdf5c58d97e8a4288af402a3789e0cd2" Mar 09 13:26:03 crc kubenswrapper[4764]: E0309 13:26:03.861082 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d8b26ed27d9d828e2d55afc5c28aac89bdf5c58d97e8a4288af402a3789e0cd2\": container with ID starting with d8b26ed27d9d828e2d55afc5c28aac89bdf5c58d97e8a4288af402a3789e0cd2 not found: ID does not exist" containerID="d8b26ed27d9d828e2d55afc5c28aac89bdf5c58d97e8a4288af402a3789e0cd2" Mar 09 13:26:03 crc kubenswrapper[4764]: I0309 13:26:03.861111 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d8b26ed27d9d828e2d55afc5c28aac89bdf5c58d97e8a4288af402a3789e0cd2"} err="failed to get container status \"d8b26ed27d9d828e2d55afc5c28aac89bdf5c58d97e8a4288af402a3789e0cd2\": rpc error: code = NotFound desc = could not find container \"d8b26ed27d9d828e2d55afc5c28aac89bdf5c58d97e8a4288af402a3789e0cd2\": container with ID starting with d8b26ed27d9d828e2d55afc5c28aac89bdf5c58d97e8a4288af402a3789e0cd2 not found: ID does not exist" Mar 09 13:26:03 crc kubenswrapper[4764]: I0309 13:26:03.917993 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-549f4d59df-4zh55"] Mar 09 13:26:03 crc kubenswrapper[4764]: W0309 13:26:03.930590 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod76db3a34_f290_4c40_892c_f22642bae846.slice/crio-017b5cfb1ffda49d2bd410be18d85e5f0dcd32686c8057ab01f06d5a0893d59b WatchSource:0}: Error finding container 017b5cfb1ffda49d2bd410be18d85e5f0dcd32686c8057ab01f06d5a0893d59b: Status 404 returned error can't find the container with id 017b5cfb1ffda49d2bd410be18d85e5f0dcd32686c8057ab01f06d5a0893d59b Mar 09 
13:26:03 crc kubenswrapper[4764]: I0309 13:26:03.969037 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-59846f5c55-j5dz6"] Mar 09 13:26:03 crc kubenswrapper[4764]: W0309 13:26:03.989013 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod037afebc_2339_4b7b_ba28_45bd9d6e949e.slice/crio-76fd845119563bf1fa122b5fc75988041e296874da4420c76b26f371cfa3fd42 WatchSource:0}: Error finding container 76fd845119563bf1fa122b5fc75988041e296874da4420c76b26f371cfa3fd42: Status 404 returned error can't find the container with id 76fd845119563bf1fa122b5fc75988041e296874da4420c76b26f371cfa3fd42 Mar 09 13:26:04 crc kubenswrapper[4764]: I0309 13:26:04.078165 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551046-ll87d" Mar 09 13:26:04 crc kubenswrapper[4764]: I0309 13:26:04.263847 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-msj62\" (UniqueName: \"kubernetes.io/projected/c09230f9-b117-44a0-b3ed-ab6dc7ce0285-kube-api-access-msj62\") pod \"c09230f9-b117-44a0-b3ed-ab6dc7ce0285\" (UID: \"c09230f9-b117-44a0-b3ed-ab6dc7ce0285\") " Mar 09 13:26:04 crc kubenswrapper[4764]: I0309 13:26:04.269128 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c09230f9-b117-44a0-b3ed-ab6dc7ce0285-kube-api-access-msj62" (OuterVolumeSpecName: "kube-api-access-msj62") pod "c09230f9-b117-44a0-b3ed-ab6dc7ce0285" (UID: "c09230f9-b117-44a0-b3ed-ab6dc7ce0285"). InnerVolumeSpecName "kube-api-access-msj62". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:26:04 crc kubenswrapper[4764]: I0309 13:26:04.365817 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-msj62\" (UniqueName: \"kubernetes.io/projected/c09230f9-b117-44a0-b3ed-ab6dc7ce0285-kube-api-access-msj62\") on node \"crc\" DevicePath \"\"" Mar 09 13:26:04 crc kubenswrapper[4764]: I0309 13:26:04.799192 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551046-ll87d" event={"ID":"c09230f9-b117-44a0-b3ed-ab6dc7ce0285","Type":"ContainerDied","Data":"f8624d20b131f960061486b5cf6a38954f017f1461bb66469047b83c408d5f71"} Mar 09 13:26:04 crc kubenswrapper[4764]: I0309 13:26:04.799255 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f8624d20b131f960061486b5cf6a38954f017f1461bb66469047b83c408d5f71" Mar 09 13:26:04 crc kubenswrapper[4764]: I0309 13:26:04.800316 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551046-ll87d" Mar 09 13:26:04 crc kubenswrapper[4764]: I0309 13:26:04.800406 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-59846f5c55-j5dz6" event={"ID":"037afebc-2339-4b7b-ba28-45bd9d6e949e","Type":"ContainerStarted","Data":"3a95f0e2fee7cc7194c065f9e519774ef1f7d76dd3ee442ec0d780b7dfa666fe"} Mar 09 13:26:04 crc kubenswrapper[4764]: I0309 13:26:04.800572 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-59846f5c55-j5dz6" event={"ID":"037afebc-2339-4b7b-ba28-45bd9d6e949e","Type":"ContainerStarted","Data":"76fd845119563bf1fa122b5fc75988041e296874da4420c76b26f371cfa3fd42"} Mar 09 13:26:04 crc kubenswrapper[4764]: I0309 13:26:04.801851 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-549f4d59df-4zh55" 
event={"ID":"76db3a34-f290-4c40-892c-f22642bae846","Type":"ContainerStarted","Data":"a2a09a6554fbde267814aaed185f46872ec5b6ad8edd513166ed165681209bb8"} Mar 09 13:26:04 crc kubenswrapper[4764]: I0309 13:26:04.801885 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-549f4d59df-4zh55" event={"ID":"76db3a34-f290-4c40-892c-f22642bae846","Type":"ContainerStarted","Data":"017b5cfb1ffda49d2bd410be18d85e5f0dcd32686c8057ab01f06d5a0893d59b"} Mar 09 13:26:04 crc kubenswrapper[4764]: I0309 13:26:04.802286 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-549f4d59df-4zh55" Mar 09 13:26:04 crc kubenswrapper[4764]: I0309 13:26:04.806452 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-549f4d59df-4zh55" Mar 09 13:26:04 crc kubenswrapper[4764]: I0309 13:26:04.821882 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-59846f5c55-j5dz6" podStartSLOduration=3.821863241 podStartE2EDuration="3.821863241s" podCreationTimestamp="2026-03-09 13:26:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:26:04.818350195 +0000 UTC m=+320.068522103" watchObservedRunningTime="2026-03-09 13:26:04.821863241 +0000 UTC m=+320.072035149" Mar 09 13:26:05 crc kubenswrapper[4764]: I0309 13:26:05.108025 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-549f4d59df-4zh55" podStartSLOduration=4.108001334 podStartE2EDuration="4.108001334s" podCreationTimestamp="2026-03-09 13:26:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:26:04.84082437 +0000 
UTC m=+320.090996288" watchObservedRunningTime="2026-03-09 13:26:05.108001334 +0000 UTC m=+320.358173242" Mar 09 13:26:05 crc kubenswrapper[4764]: I0309 13:26:05.566908 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c6dd4a8c-608d-44b4-a42c-1cc8b7f9fec2" path="/var/lib/kubelet/pods/c6dd4a8c-608d-44b4-a42c-1cc8b7f9fec2/volumes" Mar 09 13:26:05 crc kubenswrapper[4764]: I0309 13:26:05.807498 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-59846f5c55-j5dz6" Mar 09 13:26:05 crc kubenswrapper[4764]: I0309 13:26:05.813312 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-59846f5c55-j5dz6" Mar 09 13:26:07 crc kubenswrapper[4764]: I0309 13:26:07.194997 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-qhs57" Mar 09 13:26:07 crc kubenswrapper[4764]: I0309 13:26:07.645487 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-g7k9k" Mar 09 13:26:09 crc kubenswrapper[4764]: I0309 13:26:09.199471 4764 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 09 13:26:09 crc kubenswrapper[4764]: E0309 13:26:09.200002 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c6dd4a8c-608d-44b4-a42c-1cc8b7f9fec2" containerName="registry-server" Mar 09 13:26:09 crc kubenswrapper[4764]: I0309 13:26:09.200018 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="c6dd4a8c-608d-44b4-a42c-1cc8b7f9fec2" containerName="registry-server" Mar 09 13:26:09 crc kubenswrapper[4764]: E0309 13:26:09.200036 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c6dd4a8c-608d-44b4-a42c-1cc8b7f9fec2" containerName="extract-utilities" Mar 09 13:26:09 crc kubenswrapper[4764]: 
I0309 13:26:09.200044 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="c6dd4a8c-608d-44b4-a42c-1cc8b7f9fec2" containerName="extract-utilities" Mar 09 13:26:09 crc kubenswrapper[4764]: E0309 13:26:09.200057 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c09230f9-b117-44a0-b3ed-ab6dc7ce0285" containerName="oc" Mar 09 13:26:09 crc kubenswrapper[4764]: I0309 13:26:09.200064 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="c09230f9-b117-44a0-b3ed-ab6dc7ce0285" containerName="oc" Mar 09 13:26:09 crc kubenswrapper[4764]: E0309 13:26:09.200081 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c6dd4a8c-608d-44b4-a42c-1cc8b7f9fec2" containerName="extract-content" Mar 09 13:26:09 crc kubenswrapper[4764]: I0309 13:26:09.200088 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="c6dd4a8c-608d-44b4-a42c-1cc8b7f9fec2" containerName="extract-content" Mar 09 13:26:09 crc kubenswrapper[4764]: I0309 13:26:09.200201 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="c09230f9-b117-44a0-b3ed-ab6dc7ce0285" containerName="oc" Mar 09 13:26:09 crc kubenswrapper[4764]: I0309 13:26:09.200221 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="c6dd4a8c-608d-44b4-a42c-1cc8b7f9fec2" containerName="registry-server" Mar 09 13:26:09 crc kubenswrapper[4764]: I0309 13:26:09.200605 4764 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 09 13:26:09 crc kubenswrapper[4764]: I0309 13:26:09.200789 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 09 13:26:09 crc kubenswrapper[4764]: I0309 13:26:09.200916 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" containerID="cri-o://482469a933d0aba722e6a341f6934669e0d84998bd2c99d6962a17548ab80ce7" gracePeriod=15 Mar 09 13:26:09 crc kubenswrapper[4764]: I0309 13:26:09.200954 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" containerID="cri-o://c3571aa0b654db21a79a11d6c58af9f858d2493283d9eff7a84f3404c34dcf0b" gracePeriod=15 Mar 09 13:26:09 crc kubenswrapper[4764]: I0309 13:26:09.200966 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" containerID="cri-o://4aae266a5256be6c65a7637dfaed552087689f317f7b61dcb52f4aa6d4d4d7e6" gracePeriod=15 Mar 09 13:26:09 crc kubenswrapper[4764]: I0309 13:26:09.200926 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" containerID="cri-o://e04ceb7d287648fc77742a0a1b4cf13a56f634301b703291340961730fc73811" gracePeriod=15 Mar 09 13:26:09 crc kubenswrapper[4764]: I0309 13:26:09.201091 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" containerID="cri-o://a3a34e47ede9b05cec931ca6337a98381bcceeb735352702d19128e8dfe8476f" gracePeriod=15 Mar 09 13:26:09 crc 
kubenswrapper[4764]: I0309 13:26:09.201349 4764 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"]
Mar 09 13:26:09 crc kubenswrapper[4764]: E0309 13:26:09.201472 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver"
Mar 09 13:26:09 crc kubenswrapper[4764]: I0309 13:26:09.201485 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver"
Mar 09 13:26:09 crc kubenswrapper[4764]: E0309 13:26:09.201492 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup"
Mar 09 13:26:09 crc kubenswrapper[4764]: I0309 13:26:09.201499 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup"
Mar 09 13:26:09 crc kubenswrapper[4764]: E0309 13:26:09.201509 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Mar 09 13:26:09 crc kubenswrapper[4764]: I0309 13:26:09.201516 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Mar 09 13:26:09 crc kubenswrapper[4764]: E0309 13:26:09.201522 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Mar 09 13:26:09 crc kubenswrapper[4764]: I0309 13:26:09.201529 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Mar 09 13:26:09 crc kubenswrapper[4764]: E0309 13:26:09.201539 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Mar 09 13:26:09 crc kubenswrapper[4764]: I0309 13:26:09.201546 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Mar 09 13:26:09 crc kubenswrapper[4764]: E0309 13:26:09.201556 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer"
Mar 09 13:26:09 crc kubenswrapper[4764]: I0309 13:26:09.201563 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer"
Mar 09 13:26:09 crc kubenswrapper[4764]: E0309 13:26:09.201575 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller"
Mar 09 13:26:09 crc kubenswrapper[4764]: I0309 13:26:09.201582 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller"
Mar 09 13:26:09 crc kubenswrapper[4764]: E0309 13:26:09.201591 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz"
Mar 09 13:26:09 crc kubenswrapper[4764]: I0309 13:26:09.201598 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz"
Mar 09 13:26:09 crc kubenswrapper[4764]: I0309 13:26:09.201711 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer"
Mar 09 13:26:09 crc kubenswrapper[4764]: I0309 13:26:09.201722 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Mar 09 13:26:09 crc kubenswrapper[4764]: I0309 13:26:09.201731 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver"
Mar 09 13:26:09 crc kubenswrapper[4764]: I0309 13:26:09.201737 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller"
Mar 09 13:26:09 crc kubenswrapper[4764]: I0309 13:26:09.201746 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Mar 09 13:26:09 crc kubenswrapper[4764]: I0309 13:26:09.201753 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz"
Mar 09 13:26:09 crc kubenswrapper[4764]: I0309 13:26:09.201760 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Mar 09 13:26:09 crc kubenswrapper[4764]: I0309 13:26:09.201769 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Mar 09 13:26:09 crc kubenswrapper[4764]: E0309 13:26:09.201851 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Mar 09 13:26:09 crc kubenswrapper[4764]: I0309 13:26:09.201857 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Mar 09 13:26:09 crc kubenswrapper[4764]: E0309 13:26:09.201872 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Mar 09 13:26:09 crc kubenswrapper[4764]: I0309 13:26:09.201879 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Mar 09 13:26:09 crc kubenswrapper[4764]: I0309 13:26:09.201956 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Mar 09 13:26:09 crc kubenswrapper[4764]: I0309 13:26:09.232810 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 09 13:26:09 crc kubenswrapper[4764]: I0309 13:26:09.232870 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Mar 09 13:26:09 crc kubenswrapper[4764]: I0309 13:26:09.232892 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 09 13:26:09 crc kubenswrapper[4764]: I0309 13:26:09.232911 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Mar 09 13:26:09 crc kubenswrapper[4764]: I0309 13:26:09.232973 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 09 13:26:09 crc kubenswrapper[4764]: I0309 13:26:09.233007 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Mar 09 13:26:09 crc kubenswrapper[4764]: I0309 13:26:09.233023 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Mar 09 13:26:09 crc kubenswrapper[4764]: I0309 13:26:09.233050 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Mar 09 13:26:09 crc kubenswrapper[4764]: I0309 13:26:09.259754 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"]
Mar 09 13:26:09 crc kubenswrapper[4764]: I0309 13:26:09.333906 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Mar 09 13:26:09 crc kubenswrapper[4764]: I0309 13:26:09.333962 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 09 13:26:09 crc kubenswrapper[4764]: I0309 13:26:09.333989 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Mar 09 13:26:09 crc kubenswrapper[4764]: I0309 13:26:09.334012 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Mar 09 13:26:09 crc kubenswrapper[4764]: I0309 13:26:09.334033 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Mar 09 13:26:09 crc kubenswrapper[4764]: I0309 13:26:09.334062 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 09 13:26:09 crc kubenswrapper[4764]: I0309 13:26:09.334319 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 09 13:26:09 crc kubenswrapper[4764]: I0309 13:26:09.334397 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Mar 09 13:26:09 crc kubenswrapper[4764]: I0309 13:26:09.334435 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Mar 09 13:26:09 crc kubenswrapper[4764]: I0309 13:26:09.334431 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 09 13:26:09 crc kubenswrapper[4764]: I0309 13:26:09.334457 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Mar 09 13:26:09 crc kubenswrapper[4764]: I0309 13:26:09.334484 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Mar 09 13:26:09 crc kubenswrapper[4764]: I0309 13:26:09.334514 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Mar 09 13:26:09 crc kubenswrapper[4764]: I0309 13:26:09.334487 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 09 13:26:09 crc kubenswrapper[4764]: I0309 13:26:09.334506 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 09 13:26:09 crc kubenswrapper[4764]: I0309 13:26:09.334562 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Mar 09 13:26:09 crc kubenswrapper[4764]: I0309 13:26:09.553790 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Mar 09 13:26:09 crc kubenswrapper[4764]: W0309 13:26:09.578467 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf85e55b1a89d02b0cb034b1ea31ed45a.slice/crio-2206d89c112cb493c431c0955e1d8f5a2852de0b3891643d63061de8ac6c7054 WatchSource:0}: Error finding container 2206d89c112cb493c431c0955e1d8f5a2852de0b3891643d63061de8ac6c7054: Status 404 returned error can't find the container with id 2206d89c112cb493c431c0955e1d8f5a2852de0b3891643d63061de8ac6c7054
Mar 09 13:26:09 crc kubenswrapper[4764]: E0309 13:26:09.581322 4764 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.129.56.52:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.189b2f31e152c0df openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:26:09.580622047 +0000 UTC m=+324.830793955,LastTimestamp:2026-03-09 13:26:09.580622047 +0000 UTC m=+324.830793955,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 09 13:26:09 crc kubenswrapper[4764]: I0309 13:26:09.830759 4764 generic.go:334] "Generic (PLEG): container finished" podID="6079e5ed-2acc-42f5-a62e-ea2a98b18abd" containerID="a9112ccd1b21485b0fe1f8f0b1cfb2709637cb4e00bc4c414110531d66291dd4" exitCode=0
Mar 09 13:26:09 crc kubenswrapper[4764]: I0309 13:26:09.830864 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"6079e5ed-2acc-42f5-a62e-ea2a98b18abd","Type":"ContainerDied","Data":"a9112ccd1b21485b0fe1f8f0b1cfb2709637cb4e00bc4c414110531d66291dd4"}
Mar 09 13:26:09 crc kubenswrapper[4764]: I0309 13:26:09.831809 4764 status_manager.go:851] "Failed to get status for pod" podUID="6079e5ed-2acc-42f5-a62e-ea2a98b18abd" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.129.56.52:6443: connect: connection refused"
Mar 09 13:26:09 crc kubenswrapper[4764]: I0309 13:26:09.832538 4764 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.129.56.52:6443: connect: connection refused"
Mar 09 13:26:09 crc kubenswrapper[4764]: I0309 13:26:09.836042 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log"
Mar 09 13:26:09 crc kubenswrapper[4764]: I0309 13:26:09.837934 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log"
Mar 09 13:26:09 crc kubenswrapper[4764]: I0309 13:26:09.838673 4764 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="e04ceb7d287648fc77742a0a1b4cf13a56f634301b703291340961730fc73811" exitCode=0
Mar 09 13:26:09 crc kubenswrapper[4764]: I0309 13:26:09.838699 4764 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="a3a34e47ede9b05cec931ca6337a98381bcceeb735352702d19128e8dfe8476f" exitCode=0
Mar 09 13:26:09 crc kubenswrapper[4764]: I0309 13:26:09.838709 4764 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="c3571aa0b654db21a79a11d6c58af9f858d2493283d9eff7a84f3404c34dcf0b" exitCode=0
Mar 09 13:26:09 crc kubenswrapper[4764]: I0309 13:26:09.838718 4764 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="4aae266a5256be6c65a7637dfaed552087689f317f7b61dcb52f4aa6d4d4d7e6" exitCode=2
Mar 09 13:26:09 crc kubenswrapper[4764]: I0309 13:26:09.838760 4764 scope.go:117] "RemoveContainer" containerID="033930947dc527026e2797f16a6dd9181c8c7d8ec88b09031a132f55b953c356"
Mar 09 13:26:09 crc kubenswrapper[4764]: I0309 13:26:09.842192 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"34d89c76e37e470d7c44361ef7ee1eb9e62495b1f676f3e88198ea7b09a66772"}
Mar 09 13:26:09 crc kubenswrapper[4764]: I0309 13:26:09.842229 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"2206d89c112cb493c431c0955e1d8f5a2852de0b3891643d63061de8ac6c7054"}
Mar 09 13:26:09 crc kubenswrapper[4764]: I0309 13:26:09.842936 4764 status_manager.go:851] "Failed to get status for pod" podUID="6079e5ed-2acc-42f5-a62e-ea2a98b18abd" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.129.56.52:6443: connect: connection refused"
Mar 09 13:26:09 crc kubenswrapper[4764]: I0309 13:26:09.843183 4764 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.129.56.52:6443: connect: connection refused"
Mar 09 13:26:10 crc kubenswrapper[4764]: E0309 13:26:10.651101 4764 desired_state_of_world_populator.go:312] "Error processing volume" err="error processing PVC openshift-image-registry/crc-image-registry-storage: failed to fetch PVC from API server: Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-image-registry/persistentvolumeclaims/crc-image-registry-storage\": dial tcp 38.129.56.52:6443: connect: connection refused" pod="openshift-image-registry/image-registry-697d97f7c8-w49hn" volumeName="registry-storage"
Mar 09 13:26:10 crc kubenswrapper[4764]: E0309 13:26:10.763239 4764 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.129.56.52:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.189b2f31e152c0df openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:26:09.580622047 +0000 UTC m=+324.830793955,LastTimestamp:2026-03-09 13:26:09.580622047 +0000 UTC m=+324.830793955,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 09 13:26:10 crc kubenswrapper[4764]: I0309 13:26:10.850787 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log"
Mar 09 13:26:11 crc kubenswrapper[4764]: I0309 13:26:11.210237 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc"
Mar 09 13:26:11 crc kubenswrapper[4764]: I0309 13:26:11.211132 4764 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.129.56.52:6443: connect: connection refused"
Mar 09 13:26:11 crc kubenswrapper[4764]: I0309 13:26:11.211492 4764 status_manager.go:851] "Failed to get status for pod" podUID="6079e5ed-2acc-42f5-a62e-ea2a98b18abd" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.129.56.52:6443: connect: connection refused"
Mar 09 13:26:11 crc kubenswrapper[4764]: I0309 13:26:11.370283 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6079e5ed-2acc-42f5-a62e-ea2a98b18abd-kube-api-access\") pod \"6079e5ed-2acc-42f5-a62e-ea2a98b18abd\" (UID: \"6079e5ed-2acc-42f5-a62e-ea2a98b18abd\") "
Mar 09 13:26:11 crc kubenswrapper[4764]: I0309 13:26:11.370397 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/6079e5ed-2acc-42f5-a62e-ea2a98b18abd-var-lock\") pod \"6079e5ed-2acc-42f5-a62e-ea2a98b18abd\" (UID: \"6079e5ed-2acc-42f5-a62e-ea2a98b18abd\") "
Mar 09 13:26:11 crc kubenswrapper[4764]: I0309 13:26:11.370508 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6079e5ed-2acc-42f5-a62e-ea2a98b18abd-kubelet-dir\") pod \"6079e5ed-2acc-42f5-a62e-ea2a98b18abd\" (UID: \"6079e5ed-2acc-42f5-a62e-ea2a98b18abd\") "
Mar 09 13:26:11 crc kubenswrapper[4764]: I0309 13:26:11.370692 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6079e5ed-2acc-42f5-a62e-ea2a98b18abd-var-lock" (OuterVolumeSpecName: "var-lock") pod "6079e5ed-2acc-42f5-a62e-ea2a98b18abd" (UID: "6079e5ed-2acc-42f5-a62e-ea2a98b18abd"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 09 13:26:11 crc kubenswrapper[4764]: I0309 13:26:11.370731 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6079e5ed-2acc-42f5-a62e-ea2a98b18abd-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "6079e5ed-2acc-42f5-a62e-ea2a98b18abd" (UID: "6079e5ed-2acc-42f5-a62e-ea2a98b18abd"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 09 13:26:11 crc kubenswrapper[4764]: I0309 13:26:11.370796 4764 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/6079e5ed-2acc-42f5-a62e-ea2a98b18abd-var-lock\") on node \"crc\" DevicePath \"\""
Mar 09 13:26:11 crc kubenswrapper[4764]: I0309 13:26:11.370813 4764 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6079e5ed-2acc-42f5-a62e-ea2a98b18abd-kubelet-dir\") on node \"crc\" DevicePath \"\""
Mar 09 13:26:11 crc kubenswrapper[4764]: I0309 13:26:11.377393 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6079e5ed-2acc-42f5-a62e-ea2a98b18abd-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "6079e5ed-2acc-42f5-a62e-ea2a98b18abd" (UID: "6079e5ed-2acc-42f5-a62e-ea2a98b18abd"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 13:26:11 crc kubenswrapper[4764]: I0309 13:26:11.471950 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6079e5ed-2acc-42f5-a62e-ea2a98b18abd-kube-api-access\") on node \"crc\" DevicePath \"\""
Mar 09 13:26:11 crc kubenswrapper[4764]: I0309 13:26:11.580125 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log"
Mar 09 13:26:11 crc kubenswrapper[4764]: I0309 13:26:11.581687 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 09 13:26:11 crc kubenswrapper[4764]: I0309 13:26:11.582471 4764 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.129.56.52:6443: connect: connection refused"
Mar 09 13:26:11 crc kubenswrapper[4764]: I0309 13:26:11.583026 4764 status_manager.go:851] "Failed to get status for pod" podUID="6079e5ed-2acc-42f5-a62e-ea2a98b18abd" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.129.56.52:6443: connect: connection refused"
Mar 09 13:26:11 crc kubenswrapper[4764]: I0309 13:26:11.583375 4764 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.129.56.52:6443: connect: connection refused"
Mar 09 13:26:11 crc kubenswrapper[4764]: I0309 13:26:11.776058 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") "
Mar 09 13:26:11 crc kubenswrapper[4764]: I0309 13:26:11.776222 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") "
Mar 09 13:26:11 crc kubenswrapper[4764]: I0309 13:26:11.776207 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "cert-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 09 13:26:11 crc kubenswrapper[4764]: I0309 13:26:11.776306 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 09 13:26:11 crc kubenswrapper[4764]: I0309 13:26:11.776320 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") "
Mar 09 13:26:11 crc kubenswrapper[4764]: I0309 13:26:11.776340 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 09 13:26:11 crc kubenswrapper[4764]: I0309 13:26:11.776621 4764 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") on node \"crc\" DevicePath \"\""
Mar 09 13:26:11 crc kubenswrapper[4764]: I0309 13:26:11.776632 4764 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") on node \"crc\" DevicePath \"\""
Mar 09 13:26:11 crc kubenswrapper[4764]: I0309 13:26:11.776662 4764 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") on node \"crc\" DevicePath \"\""
Mar 09 13:26:11 crc kubenswrapper[4764]: I0309 13:26:11.861870 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log"
Mar 09 13:26:11 crc kubenswrapper[4764]: I0309 13:26:11.862846 4764 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="482469a933d0aba722e6a341f6934669e0d84998bd2c99d6962a17548ab80ce7" exitCode=0
Mar 09 13:26:11 crc kubenswrapper[4764]: I0309 13:26:11.862971 4764 scope.go:117] "RemoveContainer" containerID="e04ceb7d287648fc77742a0a1b4cf13a56f634301b703291340961730fc73811"
Mar 09 13:26:11 crc kubenswrapper[4764]: I0309 13:26:11.862989 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 09 13:26:11 crc kubenswrapper[4764]: I0309 13:26:11.865857 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"6079e5ed-2acc-42f5-a62e-ea2a98b18abd","Type":"ContainerDied","Data":"2b27f2baa5d5f2b2cb7d6fda09a6905faac786ec6165e8062e44b7064fdcbfbc"}
Mar 09 13:26:11 crc kubenswrapper[4764]: I0309 13:26:11.865892 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2b27f2baa5d5f2b2cb7d6fda09a6905faac786ec6165e8062e44b7064fdcbfbc"
Mar 09 13:26:11 crc kubenswrapper[4764]: I0309 13:26:11.865914 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc"
Mar 09 13:26:11 crc kubenswrapper[4764]: I0309 13:26:11.870623 4764 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.129.56.52:6443: connect: connection refused"
Mar 09 13:26:11 crc kubenswrapper[4764]: I0309 13:26:11.871191 4764 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.129.56.52:6443: connect: connection refused"
Mar 09 13:26:11 crc kubenswrapper[4764]: I0309 13:26:11.871535 4764 status_manager.go:851] "Failed to get status for pod" podUID="6079e5ed-2acc-42f5-a62e-ea2a98b18abd" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.129.56.52:6443: connect: connection refused"
Mar 09 13:26:11 crc kubenswrapper[4764]: I0309 13:26:11.876692 4764 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.129.56.52:6443: connect: connection refused"
Mar 09 13:26:11 crc kubenswrapper[4764]: I0309 13:26:11.877271 4764 status_manager.go:851] "Failed to get status for pod" podUID="6079e5ed-2acc-42f5-a62e-ea2a98b18abd" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.129.56.52:6443: connect: connection refused"
Mar 09 13:26:11 crc kubenswrapper[4764]: I0309 13:26:11.877877 4764 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.129.56.52:6443: connect: connection refused"
Mar 09 13:26:11 crc kubenswrapper[4764]: I0309 13:26:11.878045 4764 scope.go:117] "RemoveContainer" containerID="a3a34e47ede9b05cec931ca6337a98381bcceeb735352702d19128e8dfe8476f"
Mar 09 13:26:11 crc kubenswrapper[4764]: I0309 13:26:11.892340 4764 scope.go:117] "RemoveContainer" containerID="c3571aa0b654db21a79a11d6c58af9f858d2493283d9eff7a84f3404c34dcf0b"
Mar 09 13:26:11 crc kubenswrapper[4764]: I0309 13:26:11.906735 4764 scope.go:117] "RemoveContainer" containerID="4aae266a5256be6c65a7637dfaed552087689f317f7b61dcb52f4aa6d4d4d7e6"
Mar 09 13:26:11 crc kubenswrapper[4764]: I0309 13:26:11.925435 4764 scope.go:117] "RemoveContainer" containerID="482469a933d0aba722e6a341f6934669e0d84998bd2c99d6962a17548ab80ce7"
Mar 09 13:26:11 crc kubenswrapper[4764]: I0309 13:26:11.943268 4764 scope.go:117] "RemoveContainer" containerID="a38de1b3854f33b5728435571b8f0df92aeaae7d8e8a235ed58044fa2a9e09c3"
Mar 09 13:26:11 crc kubenswrapper[4764]: I0309 13:26:11.962024 4764 scope.go:117] "RemoveContainer" containerID="e04ceb7d287648fc77742a0a1b4cf13a56f634301b703291340961730fc73811"
Mar 09 13:26:11 crc kubenswrapper[4764]: E0309 13:26:11.962516 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e04ceb7d287648fc77742a0a1b4cf13a56f634301b703291340961730fc73811\": container with ID starting with e04ceb7d287648fc77742a0a1b4cf13a56f634301b703291340961730fc73811 not found: ID does not exist" containerID="e04ceb7d287648fc77742a0a1b4cf13a56f634301b703291340961730fc73811"
Mar 09 13:26:11 crc kubenswrapper[4764]: I0309 13:26:11.962557 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e04ceb7d287648fc77742a0a1b4cf13a56f634301b703291340961730fc73811"} err="failed to get container status \"e04ceb7d287648fc77742a0a1b4cf13a56f634301b703291340961730fc73811\": rpc error: code = NotFound desc = could not find container \"e04ceb7d287648fc77742a0a1b4cf13a56f634301b703291340961730fc73811\": container with ID starting with e04ceb7d287648fc77742a0a1b4cf13a56f634301b703291340961730fc73811 not found: ID does not exist"
Mar 09 13:26:11 crc kubenswrapper[4764]: I0309 13:26:11.962583 4764 scope.go:117] "RemoveContainer" containerID="a3a34e47ede9b05cec931ca6337a98381bcceeb735352702d19128e8dfe8476f"
Mar 09 13:26:11 crc kubenswrapper[4764]: E0309 13:26:11.962858 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a3a34e47ede9b05cec931ca6337a98381bcceeb735352702d19128e8dfe8476f\": container with ID starting with a3a34e47ede9b05cec931ca6337a98381bcceeb735352702d19128e8dfe8476f not found: ID does not exist" containerID="a3a34e47ede9b05cec931ca6337a98381bcceeb735352702d19128e8dfe8476f"
Mar 09 13:26:11 crc kubenswrapper[4764]: I0309 13:26:11.962886 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a3a34e47ede9b05cec931ca6337a98381bcceeb735352702d19128e8dfe8476f"} err="failed to get container status \"a3a34e47ede9b05cec931ca6337a98381bcceeb735352702d19128e8dfe8476f\": rpc error: code = NotFound desc = could not find container \"a3a34e47ede9b05cec931ca6337a98381bcceeb735352702d19128e8dfe8476f\": container with ID starting with a3a34e47ede9b05cec931ca6337a98381bcceeb735352702d19128e8dfe8476f not found: ID does not exist"
Mar 09 13:26:11 crc kubenswrapper[4764]: I0309 13:26:11.962903 4764 scope.go:117] "RemoveContainer" containerID="c3571aa0b654db21a79a11d6c58af9f858d2493283d9eff7a84f3404c34dcf0b"
Mar 09 13:26:11 crc kubenswrapper[4764]: E0309 13:26:11.963140 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c3571aa0b654db21a79a11d6c58af9f858d2493283d9eff7a84f3404c34dcf0b\": container with ID starting with c3571aa0b654db21a79a11d6c58af9f858d2493283d9eff7a84f3404c34dcf0b not found: ID does not exist" containerID="c3571aa0b654db21a79a11d6c58af9f858d2493283d9eff7a84f3404c34dcf0b"
Mar 09 13:26:11 crc kubenswrapper[4764]: I0309 13:26:11.963164 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c3571aa0b654db21a79a11d6c58af9f858d2493283d9eff7a84f3404c34dcf0b"} err="failed to get container status \"c3571aa0b654db21a79a11d6c58af9f858d2493283d9eff7a84f3404c34dcf0b\": rpc error: code = NotFound desc = could not find container \"c3571aa0b654db21a79a11d6c58af9f858d2493283d9eff7a84f3404c34dcf0b\": container with ID starting with c3571aa0b654db21a79a11d6c58af9f858d2493283d9eff7a84f3404c34dcf0b not found: ID does not exist"
Mar 09 13:26:11 crc kubenswrapper[4764]: I0309 13:26:11.963180 4764 scope.go:117] "RemoveContainer" containerID="4aae266a5256be6c65a7637dfaed552087689f317f7b61dcb52f4aa6d4d4d7e6"
Mar 09
13:26:11 crc kubenswrapper[4764]: E0309 13:26:11.963422 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4aae266a5256be6c65a7637dfaed552087689f317f7b61dcb52f4aa6d4d4d7e6\": container with ID starting with 4aae266a5256be6c65a7637dfaed552087689f317f7b61dcb52f4aa6d4d4d7e6 not found: ID does not exist" containerID="4aae266a5256be6c65a7637dfaed552087689f317f7b61dcb52f4aa6d4d4d7e6" Mar 09 13:26:11 crc kubenswrapper[4764]: I0309 13:26:11.963444 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4aae266a5256be6c65a7637dfaed552087689f317f7b61dcb52f4aa6d4d4d7e6"} err="failed to get container status \"4aae266a5256be6c65a7637dfaed552087689f317f7b61dcb52f4aa6d4d4d7e6\": rpc error: code = NotFound desc = could not find container \"4aae266a5256be6c65a7637dfaed552087689f317f7b61dcb52f4aa6d4d4d7e6\": container with ID starting with 4aae266a5256be6c65a7637dfaed552087689f317f7b61dcb52f4aa6d4d4d7e6 not found: ID does not exist" Mar 09 13:26:11 crc kubenswrapper[4764]: I0309 13:26:11.963463 4764 scope.go:117] "RemoveContainer" containerID="482469a933d0aba722e6a341f6934669e0d84998bd2c99d6962a17548ab80ce7" Mar 09 13:26:11 crc kubenswrapper[4764]: E0309 13:26:11.963944 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"482469a933d0aba722e6a341f6934669e0d84998bd2c99d6962a17548ab80ce7\": container with ID starting with 482469a933d0aba722e6a341f6934669e0d84998bd2c99d6962a17548ab80ce7 not found: ID does not exist" containerID="482469a933d0aba722e6a341f6934669e0d84998bd2c99d6962a17548ab80ce7" Mar 09 13:26:11 crc kubenswrapper[4764]: I0309 13:26:11.963986 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"482469a933d0aba722e6a341f6934669e0d84998bd2c99d6962a17548ab80ce7"} err="failed to get container status 
\"482469a933d0aba722e6a341f6934669e0d84998bd2c99d6962a17548ab80ce7\": rpc error: code = NotFound desc = could not find container \"482469a933d0aba722e6a341f6934669e0d84998bd2c99d6962a17548ab80ce7\": container with ID starting with 482469a933d0aba722e6a341f6934669e0d84998bd2c99d6962a17548ab80ce7 not found: ID does not exist" Mar 09 13:26:11 crc kubenswrapper[4764]: I0309 13:26:11.964016 4764 scope.go:117] "RemoveContainer" containerID="a38de1b3854f33b5728435571b8f0df92aeaae7d8e8a235ed58044fa2a9e09c3" Mar 09 13:26:11 crc kubenswrapper[4764]: E0309 13:26:11.964312 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a38de1b3854f33b5728435571b8f0df92aeaae7d8e8a235ed58044fa2a9e09c3\": container with ID starting with a38de1b3854f33b5728435571b8f0df92aeaae7d8e8a235ed58044fa2a9e09c3 not found: ID does not exist" containerID="a38de1b3854f33b5728435571b8f0df92aeaae7d8e8a235ed58044fa2a9e09c3" Mar 09 13:26:11 crc kubenswrapper[4764]: I0309 13:26:11.964341 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a38de1b3854f33b5728435571b8f0df92aeaae7d8e8a235ed58044fa2a9e09c3"} err="failed to get container status \"a38de1b3854f33b5728435571b8f0df92aeaae7d8e8a235ed58044fa2a9e09c3\": rpc error: code = NotFound desc = could not find container \"a38de1b3854f33b5728435571b8f0df92aeaae7d8e8a235ed58044fa2a9e09c3\": container with ID starting with a38de1b3854f33b5728435571b8f0df92aeaae7d8e8a235ed58044fa2a9e09c3 not found: ID does not exist" Mar 09 13:26:13 crc kubenswrapper[4764]: I0309 13:26:13.567553 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4b27818a5e8e43d0dc095d08835c792" path="/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/volumes" Mar 09 13:26:15 crc kubenswrapper[4764]: I0309 13:26:15.561852 4764 status_manager.go:851] "Failed to get status for pod" podUID="6079e5ed-2acc-42f5-a62e-ea2a98b18abd" 
pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.129.56.52:6443: connect: connection refused" Mar 09 13:26:15 crc kubenswrapper[4764]: I0309 13:26:15.562429 4764 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.129.56.52:6443: connect: connection refused" Mar 09 13:26:17 crc kubenswrapper[4764]: E0309 13:26:17.381627 4764 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.52:6443: connect: connection refused" Mar 09 13:26:17 crc kubenswrapper[4764]: E0309 13:26:17.382751 4764 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.52:6443: connect: connection refused" Mar 09 13:26:17 crc kubenswrapper[4764]: E0309 13:26:17.383158 4764 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.52:6443: connect: connection refused" Mar 09 13:26:17 crc kubenswrapper[4764]: E0309 13:26:17.383568 4764 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.52:6443: connect: connection refused" Mar 09 13:26:17 crc kubenswrapper[4764]: E0309 13:26:17.384037 4764 controller.go:195] "Failed to update lease" err="Put 
\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.52:6443: connect: connection refused" Mar 09 13:26:17 crc kubenswrapper[4764]: I0309 13:26:17.384082 4764 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease" Mar 09 13:26:17 crc kubenswrapper[4764]: E0309 13:26:17.384446 4764 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.52:6443: connect: connection refused" interval="200ms" Mar 09 13:26:17 crc kubenswrapper[4764]: E0309 13:26:17.586037 4764 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.52:6443: connect: connection refused" interval="400ms" Mar 09 13:26:17 crc kubenswrapper[4764]: E0309 13:26:17.986471 4764 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.52:6443: connect: connection refused" interval="800ms" Mar 09 13:26:18 crc kubenswrapper[4764]: E0309 13:26:18.787188 4764 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.52:6443: connect: connection refused" interval="1.6s" Mar 09 13:26:20 crc kubenswrapper[4764]: E0309 13:26:20.388308 4764 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.52:6443: connect: connection 
refused" interval="3.2s" Mar 09 13:26:20 crc kubenswrapper[4764]: E0309 13:26:20.764449 4764 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.129.56.52:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.189b2f31e152c0df openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:26:09.580622047 +0000 UTC m=+324.830793955,LastTimestamp:2026-03-09 13:26:09.580622047 +0000 UTC m=+324.830793955,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 13:26:22 crc kubenswrapper[4764]: I0309 13:26:22.559067 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 09 13:26:22 crc kubenswrapper[4764]: I0309 13:26:22.560745 4764 status_manager.go:851] "Failed to get status for pod" podUID="6079e5ed-2acc-42f5-a62e-ea2a98b18abd" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.129.56.52:6443: connect: connection refused" Mar 09 13:26:22 crc kubenswrapper[4764]: I0309 13:26:22.561901 4764 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.129.56.52:6443: connect: connection refused" Mar 09 13:26:22 crc kubenswrapper[4764]: I0309 13:26:22.585769 4764 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="e7c21ab2-1820-47de-a61d-71d81928564a" Mar 09 13:26:22 crc kubenswrapper[4764]: I0309 13:26:22.585810 4764 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="e7c21ab2-1820-47de-a61d-71d81928564a" Mar 09 13:26:22 crc kubenswrapper[4764]: E0309 13:26:22.586457 4764 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.129.56.52:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 09 13:26:22 crc kubenswrapper[4764]: I0309 13:26:22.587375 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 09 13:26:22 crc kubenswrapper[4764]: W0309 13:26:22.622620 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod71bb4a3aecc4ba5b26c4b7318770ce13.slice/crio-05f248f283cd22d284be0ce2b65eee19c9c2cfbe94abfcc012118bce277e2802 WatchSource:0}: Error finding container 05f248f283cd22d284be0ce2b65eee19c9c2cfbe94abfcc012118bce277e2802: Status 404 returned error can't find the container with id 05f248f283cd22d284be0ce2b65eee19c9c2cfbe94abfcc012118bce277e2802 Mar 09 13:26:22 crc kubenswrapper[4764]: I0309 13:26:22.875135 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-nj856" podUID="f9b3244b-8df0-4330-9887-4092260d416a" containerName="oauth-openshift" containerID="cri-o://5e8fcdb2f8638bd16f9f163bfced3472de42f2c6547d60287a9a8804c7cc74e3" gracePeriod=15 Mar 09 13:26:22 crc kubenswrapper[4764]: I0309 13:26:22.927800 4764 generic.go:334] "Generic (PLEG): container finished" podID="71bb4a3aecc4ba5b26c4b7318770ce13" containerID="3f15ba85d13eeff185b108cb2355b170c5025452a10f0b464ee7561803e49a28" exitCode=0 Mar 09 13:26:22 crc kubenswrapper[4764]: I0309 13:26:22.927907 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerDied","Data":"3f15ba85d13eeff185b108cb2355b170c5025452a10f0b464ee7561803e49a28"} Mar 09 13:26:22 crc kubenswrapper[4764]: I0309 13:26:22.927985 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"05f248f283cd22d284be0ce2b65eee19c9c2cfbe94abfcc012118bce277e2802"} Mar 09 13:26:22 crc kubenswrapper[4764]: I0309 13:26:22.928370 4764 kubelet.go:1909] "Trying to delete pod" 
pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="e7c21ab2-1820-47de-a61d-71d81928564a" Mar 09 13:26:22 crc kubenswrapper[4764]: I0309 13:26:22.928395 4764 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="e7c21ab2-1820-47de-a61d-71d81928564a" Mar 09 13:26:22 crc kubenswrapper[4764]: I0309 13:26:22.928792 4764 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.129.56.52:6443: connect: connection refused" Mar 09 13:26:22 crc kubenswrapper[4764]: E0309 13:26:22.928866 4764 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.129.56.52:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 09 13:26:22 crc kubenswrapper[4764]: I0309 13:26:22.929164 4764 status_manager.go:851] "Failed to get status for pod" podUID="6079e5ed-2acc-42f5-a62e-ea2a98b18abd" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.129.56.52:6443: connect: connection refused" Mar 09 13:26:23 crc kubenswrapper[4764]: I0309 13:26:23.328125 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-nj856" Mar 09 13:26:23 crc kubenswrapper[4764]: I0309 13:26:23.528700 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/f9b3244b-8df0-4330-9887-4092260d416a-v4-0-config-user-template-error\") pod \"f9b3244b-8df0-4330-9887-4092260d416a\" (UID: \"f9b3244b-8df0-4330-9887-4092260d416a\") " Mar 09 13:26:23 crc kubenswrapper[4764]: I0309 13:26:23.528776 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/f9b3244b-8df0-4330-9887-4092260d416a-v4-0-config-system-service-ca\") pod \"f9b3244b-8df0-4330-9887-4092260d416a\" (UID: \"f9b3244b-8df0-4330-9887-4092260d416a\") " Mar 09 13:26:23 crc kubenswrapper[4764]: I0309 13:26:23.528805 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/f9b3244b-8df0-4330-9887-4092260d416a-v4-0-config-user-idp-0-file-data\") pod \"f9b3244b-8df0-4330-9887-4092260d416a\" (UID: \"f9b3244b-8df0-4330-9887-4092260d416a\") " Mar 09 13:26:23 crc kubenswrapper[4764]: I0309 13:26:23.528831 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/f9b3244b-8df0-4330-9887-4092260d416a-v4-0-config-system-cliconfig\") pod \"f9b3244b-8df0-4330-9887-4092260d416a\" (UID: \"f9b3244b-8df0-4330-9887-4092260d416a\") " Mar 09 13:26:23 crc kubenswrapper[4764]: I0309 13:26:23.528851 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/f9b3244b-8df0-4330-9887-4092260d416a-v4-0-config-system-ocp-branding-template\") pod \"f9b3244b-8df0-4330-9887-4092260d416a\" (UID: 
\"f9b3244b-8df0-4330-9887-4092260d416a\") " Mar 09 13:26:23 crc kubenswrapper[4764]: I0309 13:26:23.528873 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f9b3244b-8df0-4330-9887-4092260d416a-audit-dir\") pod \"f9b3244b-8df0-4330-9887-4092260d416a\" (UID: \"f9b3244b-8df0-4330-9887-4092260d416a\") " Mar 09 13:26:23 crc kubenswrapper[4764]: I0309 13:26:23.528892 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/f9b3244b-8df0-4330-9887-4092260d416a-v4-0-config-user-template-provider-selection\") pod \"f9b3244b-8df0-4330-9887-4092260d416a\" (UID: \"f9b3244b-8df0-4330-9887-4092260d416a\") " Mar 09 13:26:23 crc kubenswrapper[4764]: I0309 13:26:23.528919 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/f9b3244b-8df0-4330-9887-4092260d416a-v4-0-config-system-serving-cert\") pod \"f9b3244b-8df0-4330-9887-4092260d416a\" (UID: \"f9b3244b-8df0-4330-9887-4092260d416a\") " Mar 09 13:26:23 crc kubenswrapper[4764]: I0309 13:26:23.528937 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tdsds\" (UniqueName: \"kubernetes.io/projected/f9b3244b-8df0-4330-9887-4092260d416a-kube-api-access-tdsds\") pod \"f9b3244b-8df0-4330-9887-4092260d416a\" (UID: \"f9b3244b-8df0-4330-9887-4092260d416a\") " Mar 09 13:26:23 crc kubenswrapper[4764]: I0309 13:26:23.528952 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/f9b3244b-8df0-4330-9887-4092260d416a-v4-0-config-system-session\") pod \"f9b3244b-8df0-4330-9887-4092260d416a\" (UID: \"f9b3244b-8df0-4330-9887-4092260d416a\") " Mar 09 13:26:23 crc kubenswrapper[4764]: I0309 13:26:23.528971 4764 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/f9b3244b-8df0-4330-9887-4092260d416a-v4-0-config-user-template-login\") pod \"f9b3244b-8df0-4330-9887-4092260d416a\" (UID: \"f9b3244b-8df0-4330-9887-4092260d416a\") " Mar 09 13:26:23 crc kubenswrapper[4764]: I0309 13:26:23.529002 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/f9b3244b-8df0-4330-9887-4092260d416a-v4-0-config-system-router-certs\") pod \"f9b3244b-8df0-4330-9887-4092260d416a\" (UID: \"f9b3244b-8df0-4330-9887-4092260d416a\") " Mar 09 13:26:23 crc kubenswrapper[4764]: I0309 13:26:23.529037 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f9b3244b-8df0-4330-9887-4092260d416a-v4-0-config-system-trusted-ca-bundle\") pod \"f9b3244b-8df0-4330-9887-4092260d416a\" (UID: \"f9b3244b-8df0-4330-9887-4092260d416a\") " Mar 09 13:26:23 crc kubenswrapper[4764]: I0309 13:26:23.529095 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/f9b3244b-8df0-4330-9887-4092260d416a-audit-policies\") pod \"f9b3244b-8df0-4330-9887-4092260d416a\" (UID: \"f9b3244b-8df0-4330-9887-4092260d416a\") " Mar 09 13:26:23 crc kubenswrapper[4764]: I0309 13:26:23.530228 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f9b3244b-8df0-4330-9887-4092260d416a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "f9b3244b-8df0-4330-9887-4092260d416a" (UID: "f9b3244b-8df0-4330-9887-4092260d416a"). InnerVolumeSpecName "audit-policies". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:26:23 crc kubenswrapper[4764]: I0309 13:26:23.530264 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f9b3244b-8df0-4330-9887-4092260d416a-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "f9b3244b-8df0-4330-9887-4092260d416a" (UID: "f9b3244b-8df0-4330-9887-4092260d416a"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 09 13:26:23 crc kubenswrapper[4764]: I0309 13:26:23.534006 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f9b3244b-8df0-4330-9887-4092260d416a-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "f9b3244b-8df0-4330-9887-4092260d416a" (UID: "f9b3244b-8df0-4330-9887-4092260d416a"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:26:23 crc kubenswrapper[4764]: I0309 13:26:23.534655 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f9b3244b-8df0-4330-9887-4092260d416a-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "f9b3244b-8df0-4330-9887-4092260d416a" (UID: "f9b3244b-8df0-4330-9887-4092260d416a"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:26:23 crc kubenswrapper[4764]: I0309 13:26:23.534925 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f9b3244b-8df0-4330-9887-4092260d416a-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "f9b3244b-8df0-4330-9887-4092260d416a" (UID: "f9b3244b-8df0-4330-9887-4092260d416a"). InnerVolumeSpecName "v4-0-config-system-cliconfig". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:26:23 crc kubenswrapper[4764]: I0309 13:26:23.535317 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f9b3244b-8df0-4330-9887-4092260d416a-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "f9b3244b-8df0-4330-9887-4092260d416a" (UID: "f9b3244b-8df0-4330-9887-4092260d416a"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:26:23 crc kubenswrapper[4764]: I0309 13:26:23.535634 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f9b3244b-8df0-4330-9887-4092260d416a-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "f9b3244b-8df0-4330-9887-4092260d416a" (UID: "f9b3244b-8df0-4330-9887-4092260d416a"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:26:23 crc kubenswrapper[4764]: I0309 13:26:23.536879 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f9b3244b-8df0-4330-9887-4092260d416a-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "f9b3244b-8df0-4330-9887-4092260d416a" (UID: "f9b3244b-8df0-4330-9887-4092260d416a"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:26:23 crc kubenswrapper[4764]: I0309 13:26:23.537174 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f9b3244b-8df0-4330-9887-4092260d416a-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "f9b3244b-8df0-4330-9887-4092260d416a" (UID: "f9b3244b-8df0-4330-9887-4092260d416a"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". 
PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 13:26:23 crc kubenswrapper[4764]: I0309 13:26:23.537580 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f9b3244b-8df0-4330-9887-4092260d416a-kube-api-access-tdsds" (OuterVolumeSpecName: "kube-api-access-tdsds") pod "f9b3244b-8df0-4330-9887-4092260d416a" (UID: "f9b3244b-8df0-4330-9887-4092260d416a"). InnerVolumeSpecName "kube-api-access-tdsds". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 13:26:23 crc kubenswrapper[4764]: I0309 13:26:23.538893 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f9b3244b-8df0-4330-9887-4092260d416a-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "f9b3244b-8df0-4330-9887-4092260d416a" (UID: "f9b3244b-8df0-4330-9887-4092260d416a"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 13:26:23 crc kubenswrapper[4764]: I0309 13:26:23.539565 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f9b3244b-8df0-4330-9887-4092260d416a-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "f9b3244b-8df0-4330-9887-4092260d416a" (UID: "f9b3244b-8df0-4330-9887-4092260d416a"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 13:26:23 crc kubenswrapper[4764]: I0309 13:26:23.540117 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f9b3244b-8df0-4330-9887-4092260d416a-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "f9b3244b-8df0-4330-9887-4092260d416a" (UID: "f9b3244b-8df0-4330-9887-4092260d416a"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 13:26:23 crc kubenswrapper[4764]: I0309 13:26:23.540247 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f9b3244b-8df0-4330-9887-4092260d416a-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "f9b3244b-8df0-4330-9887-4092260d416a" (UID: "f9b3244b-8df0-4330-9887-4092260d416a"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 13:26:23 crc kubenswrapper[4764]: I0309 13:26:23.631071 4764 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/f9b3244b-8df0-4330-9887-4092260d416a-audit-policies\") on node \"crc\" DevicePath \"\""
Mar 09 13:26:23 crc kubenswrapper[4764]: I0309 13:26:23.631117 4764 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/f9b3244b-8df0-4330-9887-4092260d416a-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\""
Mar 09 13:26:23 crc kubenswrapper[4764]: I0309 13:26:23.631137 4764 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/f9b3244b-8df0-4330-9887-4092260d416a-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\""
Mar 09 13:26:23 crc kubenswrapper[4764]: I0309 13:26:23.631156 4764 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/f9b3244b-8df0-4330-9887-4092260d416a-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\""
Mar 09 13:26:23 crc kubenswrapper[4764]: I0309 13:26:23.631171 4764 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/f9b3244b-8df0-4330-9887-4092260d416a-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\""
Mar 09 13:26:23 crc kubenswrapper[4764]: I0309 13:26:23.631188 4764 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/f9b3244b-8df0-4330-9887-4092260d416a-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\""
Mar 09 13:26:23 crc kubenswrapper[4764]: I0309 13:26:23.631203 4764 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f9b3244b-8df0-4330-9887-4092260d416a-audit-dir\") on node \"crc\" DevicePath \"\""
Mar 09 13:26:23 crc kubenswrapper[4764]: I0309 13:26:23.631220 4764 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/f9b3244b-8df0-4330-9887-4092260d416a-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\""
Mar 09 13:26:23 crc kubenswrapper[4764]: I0309 13:26:23.631237 4764 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/f9b3244b-8df0-4330-9887-4092260d416a-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 09 13:26:23 crc kubenswrapper[4764]: I0309 13:26:23.631253 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tdsds\" (UniqueName: \"kubernetes.io/projected/f9b3244b-8df0-4330-9887-4092260d416a-kube-api-access-tdsds\") on node \"crc\" DevicePath \"\""
Mar 09 13:26:23 crc kubenswrapper[4764]: I0309 13:26:23.631269 4764 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/f9b3244b-8df0-4330-9887-4092260d416a-v4-0-config-system-session\") on node \"crc\" DevicePath \"\""
Mar 09 13:26:23 crc kubenswrapper[4764]: I0309 13:26:23.631286 4764 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/f9b3244b-8df0-4330-9887-4092260d416a-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\""
Mar 09 13:26:23 crc kubenswrapper[4764]: I0309 13:26:23.631302 4764 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/f9b3244b-8df0-4330-9887-4092260d416a-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\""
Mar 09 13:26:23 crc kubenswrapper[4764]: I0309 13:26:23.631317 4764 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f9b3244b-8df0-4330-9887-4092260d416a-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 09 13:26:23 crc kubenswrapper[4764]: I0309 13:26:23.934360 4764 generic.go:334] "Generic (PLEG): container finished" podID="f9b3244b-8df0-4330-9887-4092260d416a" containerID="5e8fcdb2f8638bd16f9f163bfced3472de42f2c6547d60287a9a8804c7cc74e3" exitCode=0
Mar 09 13:26:23 crc kubenswrapper[4764]: I0309 13:26:23.934406 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-nj856" event={"ID":"f9b3244b-8df0-4330-9887-4092260d416a","Type":"ContainerDied","Data":"5e8fcdb2f8638bd16f9f163bfced3472de42f2c6547d60287a9a8804c7cc74e3"}
Mar 09 13:26:23 crc kubenswrapper[4764]: I0309 13:26:23.934450 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-nj856" event={"ID":"f9b3244b-8df0-4330-9887-4092260d416a","Type":"ContainerDied","Data":"42c43e046cf7b3c37122d69fdadfdf37b294da8c4d73ad8e7c4ac09039e43f1c"}
Mar 09 13:26:23 crc kubenswrapper[4764]: I0309 13:26:23.934474 4764 scope.go:117] "RemoveContainer" containerID="5e8fcdb2f8638bd16f9f163bfced3472de42f2c6547d60287a9a8804c7cc74e3"
Mar 09 13:26:23 crc kubenswrapper[4764]: I0309 13:26:23.934488 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-nj856"
Mar 09 13:26:23 crc kubenswrapper[4764]: I0309 13:26:23.941661 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/1.log"
Mar 09 13:26:23 crc kubenswrapper[4764]: I0309 13:26:23.942892 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log"
Mar 09 13:26:23 crc kubenswrapper[4764]: I0309 13:26:23.942938 4764 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="c7857b72b1425eeefd8acf226dd6f0d6c783e14267fcfb79dd038118573a2b26" exitCode=1
Mar 09 13:26:23 crc kubenswrapper[4764]: I0309 13:26:23.943015 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"c7857b72b1425eeefd8acf226dd6f0d6c783e14267fcfb79dd038118573a2b26"}
Mar 09 13:26:23 crc kubenswrapper[4764]: I0309 13:26:23.943721 4764 scope.go:117] "RemoveContainer" containerID="c7857b72b1425eeefd8acf226dd6f0d6c783e14267fcfb79dd038118573a2b26"
Mar 09 13:26:23 crc kubenswrapper[4764]: I0309 13:26:23.951202 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"63b66a410c949ae6d3d6dc6b0c563f5b581fecd85b8265b0c00a4c83886e2ce0"}
Mar 09 13:26:23 crc kubenswrapper[4764]: I0309 13:26:23.951253 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"7e060219163b02e46f8db3481391682e57e0b6de76226ef075e51a64be0bec36"}
Mar 09 13:26:23 crc kubenswrapper[4764]: I0309 13:26:23.951266 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"df6c1ff2f65d47f0a3c78d6dc60d07c1b853d55ca9eb1473ea9b5a644648e0a1"}
Mar 09 13:26:23 crc kubenswrapper[4764]: I0309 13:26:23.951276 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"183304b472a2d0deb0b611dfab80a08e0d465eddb0d348720b4b779cab544b60"}
Mar 09 13:26:23 crc kubenswrapper[4764]: I0309 13:26:23.959134 4764 scope.go:117] "RemoveContainer" containerID="5e8fcdb2f8638bd16f9f163bfced3472de42f2c6547d60287a9a8804c7cc74e3"
Mar 09 13:26:23 crc kubenswrapper[4764]: E0309 13:26:23.959782 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5e8fcdb2f8638bd16f9f163bfced3472de42f2c6547d60287a9a8804c7cc74e3\": container with ID starting with 5e8fcdb2f8638bd16f9f163bfced3472de42f2c6547d60287a9a8804c7cc74e3 not found: ID does not exist" containerID="5e8fcdb2f8638bd16f9f163bfced3472de42f2c6547d60287a9a8804c7cc74e3"
Mar 09 13:26:23 crc kubenswrapper[4764]: I0309 13:26:23.959816 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5e8fcdb2f8638bd16f9f163bfced3472de42f2c6547d60287a9a8804c7cc74e3"} err="failed to get container status \"5e8fcdb2f8638bd16f9f163bfced3472de42f2c6547d60287a9a8804c7cc74e3\": rpc error: code = NotFound desc = could not find container \"5e8fcdb2f8638bd16f9f163bfced3472de42f2c6547d60287a9a8804c7cc74e3\": container with ID starting with 5e8fcdb2f8638bd16f9f163bfced3472de42f2c6547d60287a9a8804c7cc74e3 not found: ID does not exist"
Mar 09 13:26:24 crc kubenswrapper[4764]: I0309 13:26:24.643171 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 09 13:26:24 crc kubenswrapper[4764]: I0309 13:26:24.643428 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 09 13:26:24 crc kubenswrapper[4764]: I0309 13:26:24.643458 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 09 13:26:24 crc kubenswrapper[4764]: I0309 13:26:24.643488 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 09 13:26:24 crc kubenswrapper[4764]: I0309 13:26:24.963902 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/1.log"
Mar 09 13:26:24 crc kubenswrapper[4764]: I0309 13:26:24.965157 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log"
Mar 09 13:26:24 crc kubenswrapper[4764]: I0309 13:26:24.965243 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"fb20688a647451d7fe5639f828c352ed485f469ea775fff8f4e5ab64cd8e6498"}
Mar 09 13:26:24 crc kubenswrapper[4764]: I0309 13:26:24.969952 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"24bdf5877ca80ce2fd116bfc5e95f5b85a8ad7217195967412e57a418efae285"}
Mar 09 13:26:24 crc kubenswrapper[4764]: I0309 13:26:24.970320 4764 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="e7c21ab2-1820-47de-a61d-71d81928564a"
Mar 09 13:26:24 crc kubenswrapper[4764]: I0309 13:26:24.970347 4764 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="e7c21ab2-1820-47de-a61d-71d81928564a"
Mar 09 13:26:24 crc kubenswrapper[4764]: I0309 13:26:24.971736 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 09 13:26:25 crc kubenswrapper[4764]: E0309 13:26:25.644423 4764 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: failed to sync secret cache: timed out waiting for the condition
Mar 09 13:26:25 crc kubenswrapper[4764]: E0309 13:26:25.644493 4764 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: failed to sync configmap cache: timed out waiting for the condition
Mar 09 13:26:25 crc kubenswrapper[4764]: E0309 13:26:25.644514 4764 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: failed to sync configmap cache: timed out waiting for the condition
Mar 09 13:26:25 crc kubenswrapper[4764]: E0309 13:26:25.644549 4764 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: failed to sync configmap cache: timed out waiting for the condition
Mar 09 13:26:25 crc kubenswrapper[4764]: E0309 13:26:25.644520 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-09 13:28:27.644497082 +0000 UTC m=+462.894668990 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : failed to sync secret cache: timed out waiting for the condition
Mar 09 13:26:25 crc kubenswrapper[4764]: E0309 13:26:25.644628 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-09 13:28:27.644608985 +0000 UTC m=+462.894780893 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : failed to sync configmap cache: timed out waiting for the condition
Mar 09 13:26:26 crc kubenswrapper[4764]: E0309 13:26:26.644992 4764 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: failed to sync configmap cache: timed out waiting for the condition
Mar 09 13:26:26 crc kubenswrapper[4764]: E0309 13:26:26.645022 4764 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: failed to sync configmap cache: timed out waiting for the condition
Mar 09 13:26:26 crc kubenswrapper[4764]: E0309 13:26:26.645055 4764 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: failed to sync configmap cache: timed out waiting for the condition
Mar 09 13:26:26 crc kubenswrapper[4764]: E0309 13:26:26.645078 4764 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: failed to sync configmap cache: timed out waiting for the condition
Mar 09 13:26:26 crc kubenswrapper[4764]: E0309 13:26:26.645177 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-09 13:28:28.645141737 +0000 UTC m=+463.895313655 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : failed to sync configmap cache: timed out waiting for the condition
Mar 09 13:26:26 crc kubenswrapper[4764]: E0309 13:26:26.645205 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-09 13:28:28.645194838 +0000 UTC m=+463.895366756 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : failed to sync configmap cache: timed out waiting for the condition
Mar 09 13:26:27 crc kubenswrapper[4764]: I0309 13:26:27.588452 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 09 13:26:27 crc kubenswrapper[4764]: I0309 13:26:27.588890 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 09 13:26:27 crc kubenswrapper[4764]: I0309 13:26:27.596153 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 09 13:26:29 crc kubenswrapper[4764]: I0309 13:26:29.649759 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert"
Mar 09 13:26:29 crc kubenswrapper[4764]: I0309 13:26:29.649763 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin"
Mar 09 13:26:29 crc kubenswrapper[4764]: I0309 13:26:29.649971 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt"
Mar 09 13:26:29 crc kubenswrapper[4764]: I0309 13:26:29.980448 4764 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 09 13:26:30 crc kubenswrapper[4764]: I0309 13:26:30.568195 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 09 13:26:30 crc kubenswrapper[4764]: I0309 13:26:30.568386 4764 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body=
Mar 09 13:26:30 crc kubenswrapper[4764]: I0309 13:26:30.568432 4764 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused"
Mar 09 13:26:30 crc kubenswrapper[4764]: I0309 13:26:30.647874 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt"
Mar 09 13:26:31 crc kubenswrapper[4764]: I0309 13:26:31.007984 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_71bb4a3aecc4ba5b26c4b7318770ce13/kube-apiserver-check-endpoints/0.log"
Mar 09 13:26:31 crc kubenswrapper[4764]: I0309 13:26:31.010258 4764 generic.go:334] "Generic (PLEG): container finished" podID="71bb4a3aecc4ba5b26c4b7318770ce13" containerID="24bdf5877ca80ce2fd116bfc5e95f5b85a8ad7217195967412e57a418efae285" exitCode=255
Mar 09 13:26:31 crc kubenswrapper[4764]: I0309 13:26:31.010307 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerDied","Data":"24bdf5877ca80ce2fd116bfc5e95f5b85a8ad7217195967412e57a418efae285"}
Mar 09 13:26:31 crc kubenswrapper[4764]: I0309 13:26:31.010739 4764 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="e7c21ab2-1820-47de-a61d-71d81928564a"
Mar 09 13:26:31 crc kubenswrapper[4764]: I0309 13:26:31.010793 4764 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="e7c21ab2-1820-47de-a61d-71d81928564a"
Mar 09 13:26:31 crc kubenswrapper[4764]: I0309 13:26:31.014034 4764 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="9a603f3e-3222-4fb3-8ede-6d83ee2e90f5"
Mar 09 13:26:31 crc kubenswrapper[4764]: I0309 13:26:31.014497 4764 scope.go:117] "RemoveContainer" containerID="24bdf5877ca80ce2fd116bfc5e95f5b85a8ad7217195967412e57a418efae285"
Mar 09 13:26:31 crc kubenswrapper[4764]: I0309 13:26:31.015361 4764 status_manager.go:308] "Container readiness changed before pod has synced" pod="openshift-kube-apiserver/kube-apiserver-crc" containerID="cri-o://183304b472a2d0deb0b611dfab80a08e0d465eddb0d348720b4b779cab544b60"
Mar 09 13:26:31 crc kubenswrapper[4764]: I0309 13:26:31.015386 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 09 13:26:32 crc kubenswrapper[4764]: I0309 13:26:32.016269 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_71bb4a3aecc4ba5b26c4b7318770ce13/kube-apiserver-check-endpoints/0.log"
Mar 09 13:26:32 crc kubenswrapper[4764]: I0309 13:26:32.017900 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"cf28d9c9b484bede2e19ccb05028dda2bfdc5dce7bd6007e9619002f8a6be71f"}
Mar 09 13:26:32 crc kubenswrapper[4764]: I0309 13:26:32.018134 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 09 13:26:32 crc kubenswrapper[4764]: I0309 13:26:32.018215 4764 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="e7c21ab2-1820-47de-a61d-71d81928564a"
Mar 09 13:26:32 crc kubenswrapper[4764]: I0309 13:26:32.018339 4764 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="e7c21ab2-1820-47de-a61d-71d81928564a"
Mar 09 13:26:32 crc kubenswrapper[4764]: I0309 13:26:32.121636 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 09 13:26:33 crc kubenswrapper[4764]: I0309 13:26:33.022525 4764 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="e7c21ab2-1820-47de-a61d-71d81928564a"
Mar 09 13:26:33 crc kubenswrapper[4764]: I0309 13:26:33.022551 4764 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="e7c21ab2-1820-47de-a61d-71d81928564a"
Mar 09 13:26:35 crc kubenswrapper[4764]: E0309 13:26:35.572002 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[kube-api-access-cqllr], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 09 13:26:35 crc kubenswrapper[4764]: I0309 13:26:35.575403 4764 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="9a603f3e-3222-4fb3-8ede-6d83ee2e90f5"
Mar 09 13:26:35 crc kubenswrapper[4764]: E0309 13:26:35.579911 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[networking-console-plugin-cert nginx-conf], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 09 13:26:35 crc kubenswrapper[4764]: E0309 13:26:35.585489 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[kube-api-access-s2dwl], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 09 13:26:39 crc kubenswrapper[4764]: I0309 13:26:39.511604 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy"
Mar 09 13:26:40 crc kubenswrapper[4764]: I0309 13:26:40.569122 4764 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body=
Mar 09 13:26:40 crc kubenswrapper[4764]: I0309 13:26:40.569208 4764 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused"
Mar 09 13:26:41 crc kubenswrapper[4764]: I0309 13:26:41.016934 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd"
Mar 09 13:26:41 crc kubenswrapper[4764]: I0309 13:26:41.572446 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca"
Mar 09 13:26:41 crc kubenswrapper[4764]: I0309 13:26:41.583606 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle"
Mar 09 13:26:41 crc kubenswrapper[4764]: I0309 13:26:41.687352 4764 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160
Mar 09 13:26:41 crc kubenswrapper[4764]: I0309 13:26:41.812514 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls"
Mar 09 13:26:42 crc kubenswrapper[4764]: I0309 13:26:42.142236 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt"
Mar 09 13:26:42 crc kubenswrapper[4764]: I0309 13:26:42.416970 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script"
Mar 09 13:26:42 crc kubenswrapper[4764]: I0309 13:26:42.817593 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca"
Mar 09 13:26:42 crc kubenswrapper[4764]: I0309 13:26:42.820881 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr"
Mar 09 13:26:42 crc kubenswrapper[4764]: I0309 13:26:42.855792 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle"
Mar 09 13:26:42 crc kubenswrapper[4764]: I0309 13:26:42.925357 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf"
Mar 09 13:26:43 crc kubenswrapper[4764]: I0309 13:26:43.163289 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle"
Mar 09 13:26:43 crc kubenswrapper[4764]: I0309 13:26:43.334825 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images"
Mar 09 13:26:43 crc kubenswrapper[4764]: I0309 13:26:43.358602 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert"
Mar 09 13:26:43 crc kubenswrapper[4764]: I0309 13:26:43.569505 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default"
Mar 09 13:26:43 crc kubenswrapper[4764]: I0309 13:26:43.728710 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources"
Mar 09 13:26:43 crc kubenswrapper[4764]: I0309 13:26:43.831047 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config"
Mar 09 13:26:43 crc kubenswrapper[4764]: I0309 13:26:43.915688 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls"
Mar 09 13:26:43 crc kubenswrapper[4764]: I0309 13:26:43.924124 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert"
Mar 09 13:26:44 crc kubenswrapper[4764]: I0309 13:26:44.023754 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn"
Mar 09 13:26:44 crc kubenswrapper[4764]: I0309 13:26:44.174393 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca"
Mar 09 13:26:44 crc kubenswrapper[4764]: I0309 13:26:44.224741 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert"
Mar 09 13:26:44 crc kubenswrapper[4764]: I0309 13:26:44.225682 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt"
Mar 09 13:26:44 crc kubenswrapper[4764]: I0309 13:26:44.236845 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk"
Mar 09 13:26:44 crc kubenswrapper[4764]: I0309 13:26:44.308969 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt"
Mar 09 13:26:44 crc kubenswrapper[4764]: I0309 13:26:44.390316 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert"
Mar 09 13:26:44 crc kubenswrapper[4764]: I0309 13:26:44.432366 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca"
Mar 09 13:26:44 crc kubenswrapper[4764]: I0309 13:26:44.469210 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle"
Mar 09 13:26:44 crc kubenswrapper[4764]: I0309 13:26:44.490744 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx"
Mar 09 13:26:44 crc kubenswrapper[4764]: I0309 13:26:44.522574 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd"
Mar 09 13:26:44 crc kubenswrapper[4764]: I0309 13:26:44.558025 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt"
Mar 09 13:26:44 crc kubenswrapper[4764]: I0309 13:26:44.558592 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt"
Mar 09 13:26:44 crc kubenswrapper[4764]: I0309 13:26:44.647498 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config"
Mar 09 13:26:44 crc kubenswrapper[4764]: I0309 13:26:44.648241 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt"
Mar 09 13:26:44 crc kubenswrapper[4764]: I0309 13:26:44.723234 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls"
Mar 09 13:26:44 crc kubenswrapper[4764]: I0309 13:26:44.735940 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates"
Mar 09 13:26:44 crc kubenswrapper[4764]: I0309 13:26:44.780765 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt"
Mar 09 13:26:44 crc kubenswrapper[4764]: I0309 13:26:44.927459 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt"
Mar 09 13:26:44 crc kubenswrapper[4764]: I0309 13:26:44.982733 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c"
Mar 09 13:26:44 crc kubenswrapper[4764]: I0309 13:26:44.986845 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle"
Mar 09 13:26:45 crc kubenswrapper[4764]: I0309 13:26:45.023004 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z"
Mar 09 13:26:45 crc kubenswrapper[4764]: I0309 13:26:45.039993 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6"
Mar 09 13:26:45 crc kubenswrapper[4764]: I0309 13:26:45.048287 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt"
Mar 09 13:26:45 crc kubenswrapper[4764]: I0309 13:26:45.105022 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt"
Mar 09 13:26:45 crc kubenswrapper[4764]: I0309 13:26:45.122868 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt"
Mar 09 13:26:45 crc kubenswrapper[4764]: I0309 13:26:45.127437 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert"
Mar 09 13:26:45 crc kubenswrapper[4764]: I0309 13:26:45.174771 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca"
Mar 09 13:26:45 crc kubenswrapper[4764]: I0309 13:26:45.267070 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls"
Mar 09 13:26:45 crc kubenswrapper[4764]: I0309 13:26:45.298014 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls"
Mar 09 13:26:45 crc kubenswrapper[4764]: I0309 13:26:45.362999 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w"
Mar 09 13:26:45 crc kubenswrapper[4764]: I0309 13:26:45.486093 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt"
Mar 09 13:26:45 crc kubenswrapper[4764]: I0309 13:26:45.590059 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx"
Mar 09 13:26:45 crc kubenswrapper[4764]: I0309 13:26:45.620761 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca"
Mar 09 13:26:45 crc kubenswrapper[4764]: I0309 13:26:45.639573 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib"
Mar 09 13:26:45 crc kubenswrapper[4764]: I0309 13:26:45.645314 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca"
Mar 09 13:26:45 crc kubenswrapper[4764]: I0309 13:26:45.758833 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt"
Mar 09 13:26:45 crc kubenswrapper[4764]: I0309 13:26:45.802379 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv"
Mar 09 13:26:45 crc kubenswrapper[4764]: I0309 13:26:45.809578 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client"
Mar 09 13:26:45 crc kubenswrapper[4764]: I0309 13:26:45.814616 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca"
Mar 09 13:26:45 crc kubenswrapper[4764]: I0309 13:26:45.829574 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt"
Mar 09 13:26:46 crc kubenswrapper[4764]: I0309 13:26:46.184394 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config"
Mar 09 13:26:46 crc kubenswrapper[4764]: I0309 13:26:46.228943 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt"
Mar 09 13:26:46 crc kubenswrapper[4764]: I0309 13:26:46.289829 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff"
Mar 09 13:26:46 crc kubenswrapper[4764]: I0309 13:26:46.507470 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt"
Mar 09 13:26:46 crc kubenswrapper[4764]: I0309 13:26:46.559792 4764 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 13:26:46 crc kubenswrapper[4764]: I0309 13:26:46.613695 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Mar 09 13:26:46 crc kubenswrapper[4764]: I0309 13:26:46.665370 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Mar 09 13:26:46 crc kubenswrapper[4764]: I0309 13:26:46.665419 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Mar 09 13:26:46 crc kubenswrapper[4764]: I0309 13:26:46.756635 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Mar 09 13:26:46 crc kubenswrapper[4764]: I0309 13:26:46.771565 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Mar 09 13:26:46 crc kubenswrapper[4764]: I0309 13:26:46.829556 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Mar 09 13:26:46 crc kubenswrapper[4764]: I0309 13:26:46.892377 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Mar 09 13:26:46 crc kubenswrapper[4764]: I0309 13:26:46.900235 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Mar 09 13:26:47 crc kubenswrapper[4764]: I0309 13:26:47.037195 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Mar 09 13:26:47 crc kubenswrapper[4764]: I0309 13:26:47.152213 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Mar 09 
13:26:47 crc kubenswrapper[4764]: I0309 13:26:47.202735 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Mar 09 13:26:47 crc kubenswrapper[4764]: I0309 13:26:47.223973 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Mar 09 13:26:47 crc kubenswrapper[4764]: I0309 13:26:47.226478 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Mar 09 13:26:47 crc kubenswrapper[4764]: I0309 13:26:47.235999 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Mar 09 13:26:47 crc kubenswrapper[4764]: I0309 13:26:47.325310 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Mar 09 13:26:47 crc kubenswrapper[4764]: I0309 13:26:47.343104 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Mar 09 13:26:47 crc kubenswrapper[4764]: I0309 13:26:47.394857 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Mar 09 13:26:47 crc kubenswrapper[4764]: I0309 13:26:47.440335 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Mar 09 13:26:47 crc kubenswrapper[4764]: I0309 13:26:47.498201 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Mar 09 13:26:47 crc kubenswrapper[4764]: I0309 13:26:47.500567 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Mar 09 13:26:47 crc kubenswrapper[4764]: I0309 13:26:47.520524 4764 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Mar 09 13:26:47 crc kubenswrapper[4764]: I0309 13:26:47.651713 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 09 13:26:47 crc kubenswrapper[4764]: I0309 13:26:47.725721 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Mar 09 13:26:47 crc kubenswrapper[4764]: I0309 13:26:47.751694 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Mar 09 13:26:47 crc kubenswrapper[4764]: I0309 13:26:47.752281 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Mar 09 13:26:48 crc kubenswrapper[4764]: I0309 13:26:48.094748 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Mar 09 13:26:48 crc kubenswrapper[4764]: I0309 13:26:48.143158 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Mar 09 13:26:48 crc kubenswrapper[4764]: I0309 13:26:48.161153 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Mar 09 13:26:48 crc kubenswrapper[4764]: I0309 13:26:48.167018 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Mar 09 13:26:48 crc kubenswrapper[4764]: I0309 13:26:48.194145 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Mar 09 13:26:48 crc kubenswrapper[4764]: I0309 13:26:48.211696 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Mar 09 13:26:48 crc 
kubenswrapper[4764]: I0309 13:26:48.383108 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Mar 09 13:26:48 crc kubenswrapper[4764]: I0309 13:26:48.572208 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Mar 09 13:26:48 crc kubenswrapper[4764]: I0309 13:26:48.586147 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Mar 09 13:26:48 crc kubenswrapper[4764]: I0309 13:26:48.611052 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Mar 09 13:26:48 crc kubenswrapper[4764]: I0309 13:26:48.654963 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Mar 09 13:26:48 crc kubenswrapper[4764]: I0309 13:26:48.682769 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Mar 09 13:26:48 crc kubenswrapper[4764]: I0309 13:26:48.714887 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Mar 09 13:26:48 crc kubenswrapper[4764]: I0309 13:26:48.723293 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Mar 09 13:26:48 crc kubenswrapper[4764]: I0309 13:26:48.725179 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 09 13:26:48 crc kubenswrapper[4764]: I0309 13:26:48.746703 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Mar 09 13:26:48 crc kubenswrapper[4764]: I0309 13:26:48.773921 4764 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-image-registry"/"image-registry-tls" Mar 09 13:26:48 crc kubenswrapper[4764]: I0309 13:26:48.812323 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Mar 09 13:26:48 crc kubenswrapper[4764]: I0309 13:26:48.813535 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Mar 09 13:26:48 crc kubenswrapper[4764]: I0309 13:26:48.813943 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Mar 09 13:26:48 crc kubenswrapper[4764]: I0309 13:26:48.970790 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Mar 09 13:26:49 crc kubenswrapper[4764]: I0309 13:26:49.047164 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Mar 09 13:26:49 crc kubenswrapper[4764]: I0309 13:26:49.049434 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Mar 09 13:26:49 crc kubenswrapper[4764]: I0309 13:26:49.053243 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Mar 09 13:26:49 crc kubenswrapper[4764]: I0309 13:26:49.056141 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Mar 09 13:26:49 crc kubenswrapper[4764]: I0309 13:26:49.135775 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Mar 09 13:26:49 crc kubenswrapper[4764]: I0309 13:26:49.136057 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Mar 09 13:26:49 crc kubenswrapper[4764]: I0309 13:26:49.177245 4764 reflector.go:368] Caches populated 
for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Mar 09 13:26:49 crc kubenswrapper[4764]: I0309 13:26:49.200632 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Mar 09 13:26:49 crc kubenswrapper[4764]: I0309 13:26:49.260815 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Mar 09 13:26:49 crc kubenswrapper[4764]: I0309 13:26:49.516941 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Mar 09 13:26:49 crc kubenswrapper[4764]: I0309 13:26:49.531181 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Mar 09 13:26:49 crc kubenswrapper[4764]: I0309 13:26:49.556902 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Mar 09 13:26:49 crc kubenswrapper[4764]: I0309 13:26:49.643380 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Mar 09 13:26:49 crc kubenswrapper[4764]: I0309 13:26:49.703147 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Mar 09 13:26:49 crc kubenswrapper[4764]: I0309 13:26:49.763860 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Mar 09 13:26:49 crc kubenswrapper[4764]: I0309 13:26:49.784787 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Mar 09 13:26:49 crc kubenswrapper[4764]: I0309 13:26:49.932208 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Mar 
09 13:26:50 crc kubenswrapper[4764]: I0309 13:26:50.166823 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Mar 09 13:26:50 crc kubenswrapper[4764]: I0309 13:26:50.176305 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Mar 09 13:26:50 crc kubenswrapper[4764]: I0309 13:26:50.233754 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Mar 09 13:26:50 crc kubenswrapper[4764]: I0309 13:26:50.305474 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Mar 09 13:26:50 crc kubenswrapper[4764]: I0309 13:26:50.333829 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Mar 09 13:26:50 crc kubenswrapper[4764]: I0309 13:26:50.377834 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Mar 09 13:26:50 crc kubenswrapper[4764]: I0309 13:26:50.389582 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 09 13:26:50 crc kubenswrapper[4764]: I0309 13:26:50.419716 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Mar 09 13:26:50 crc kubenswrapper[4764]: I0309 13:26:50.425787 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Mar 09 13:26:50 crc kubenswrapper[4764]: I0309 13:26:50.446635 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Mar 09 13:26:50 crc kubenswrapper[4764]: I0309 13:26:50.530109 4764 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-oauth-apiserver"/"serving-cert" Mar 09 13:26:50 crc kubenswrapper[4764]: I0309 13:26:50.531953 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Mar 09 13:26:50 crc kubenswrapper[4764]: I0309 13:26:50.559190 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 13:26:50 crc kubenswrapper[4764]: I0309 13:26:50.559285 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 13:26:50 crc kubenswrapper[4764]: I0309 13:26:50.569035 4764 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body= Mar 09 13:26:50 crc kubenswrapper[4764]: I0309 13:26:50.569104 4764 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" Mar 09 13:26:50 crc kubenswrapper[4764]: I0309 13:26:50.569144 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 09 13:26:50 crc kubenswrapper[4764]: I0309 13:26:50.569519 4764 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="kube-controller-manager" containerStatusID={"Type":"cri-o","ID":"fb20688a647451d7fe5639f828c352ed485f469ea775fff8f4e5ab64cd8e6498"} 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" containerMessage="Container kube-controller-manager failed startup probe, will be restarted" Mar 09 13:26:50 crc kubenswrapper[4764]: I0309 13:26:50.569666 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" containerID="cri-o://fb20688a647451d7fe5639f828c352ed485f469ea775fff8f4e5ab64cd8e6498" gracePeriod=30 Mar 09 13:26:50 crc kubenswrapper[4764]: I0309 13:26:50.571454 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Mar 09 13:26:50 crc kubenswrapper[4764]: I0309 13:26:50.578776 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Mar 09 13:26:50 crc kubenswrapper[4764]: I0309 13:26:50.591779 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Mar 09 13:26:50 crc kubenswrapper[4764]: I0309 13:26:50.612148 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Mar 09 13:26:50 crc kubenswrapper[4764]: I0309 13:26:50.648357 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Mar 09 13:26:50 crc kubenswrapper[4764]: I0309 13:26:50.698484 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 09 13:26:50 crc kubenswrapper[4764]: I0309 13:26:50.778170 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 09 13:26:50 crc kubenswrapper[4764]: I0309 13:26:50.826615 4764 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Mar 09 13:26:50 crc kubenswrapper[4764]: I0309 13:26:50.878898 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 09 13:26:50 crc kubenswrapper[4764]: I0309 13:26:50.933714 4764 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Mar 09 13:26:50 crc kubenswrapper[4764]: I0309 13:26:50.966634 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Mar 09 13:26:51 crc kubenswrapper[4764]: I0309 13:26:51.204226 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Mar 09 13:26:51 crc kubenswrapper[4764]: I0309 13:26:51.262649 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Mar 09 13:26:51 crc kubenswrapper[4764]: I0309 13:26:51.275222 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Mar 09 13:26:51 crc kubenswrapper[4764]: I0309 13:26:51.308256 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 09 13:26:51 crc kubenswrapper[4764]: I0309 13:26:51.326383 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Mar 09 13:26:51 crc kubenswrapper[4764]: I0309 13:26:51.364981 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Mar 09 13:26:51 crc kubenswrapper[4764]: I0309 13:26:51.438691 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Mar 09 13:26:51 crc kubenswrapper[4764]: I0309 
13:26:51.484072 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Mar 09 13:26:51 crc kubenswrapper[4764]: I0309 13:26:51.493701 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Mar 09 13:26:51 crc kubenswrapper[4764]: I0309 13:26:51.500843 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Mar 09 13:26:51 crc kubenswrapper[4764]: I0309 13:26:51.519404 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Mar 09 13:26:51 crc kubenswrapper[4764]: I0309 13:26:51.536522 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Mar 09 13:26:51 crc kubenswrapper[4764]: I0309 13:26:51.559737 4764 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Mar 09 13:26:51 crc kubenswrapper[4764]: I0309 13:26:51.562145 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Mar 09 13:26:51 crc kubenswrapper[4764]: I0309 13:26:51.623610 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Mar 09 13:26:51 crc kubenswrapper[4764]: I0309 13:26:51.761398 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Mar 09 13:26:51 crc kubenswrapper[4764]: I0309 13:26:51.829105 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Mar 09 13:26:51 crc kubenswrapper[4764]: I0309 13:26:51.847288 4764 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Mar 09 13:26:52 crc kubenswrapper[4764]: I0309 13:26:52.107327 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Mar 09 13:26:52 crc kubenswrapper[4764]: I0309 13:26:52.159877 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Mar 09 13:26:52 crc kubenswrapper[4764]: I0309 13:26:52.218166 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Mar 09 13:26:52 crc kubenswrapper[4764]: I0309 13:26:52.218283 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Mar 09 13:26:52 crc kubenswrapper[4764]: I0309 13:26:52.368180 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Mar 09 13:26:52 crc kubenswrapper[4764]: I0309 13:26:52.463875 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Mar 09 13:26:52 crc kubenswrapper[4764]: I0309 13:26:52.530881 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Mar 09 13:26:52 crc kubenswrapper[4764]: I0309 13:26:52.538894 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Mar 09 13:26:52 crc kubenswrapper[4764]: I0309 13:26:52.663412 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Mar 09 13:26:52 crc kubenswrapper[4764]: I0309 13:26:52.694929 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Mar 09 13:26:52 crc kubenswrapper[4764]: I0309 
13:26:52.710053 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Mar 09 13:26:52 crc kubenswrapper[4764]: I0309 13:26:52.804024 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Mar 09 13:26:52 crc kubenswrapper[4764]: I0309 13:26:52.840945 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Mar 09 13:26:52 crc kubenswrapper[4764]: I0309 13:26:52.883442 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Mar 09 13:26:52 crc kubenswrapper[4764]: I0309 13:26:52.959587 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Mar 09 13:26:52 crc kubenswrapper[4764]: I0309 13:26:52.985851 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Mar 09 13:26:53 crc kubenswrapper[4764]: I0309 13:26:53.196168 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Mar 09 13:26:53 crc kubenswrapper[4764]: I0309 13:26:53.265381 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Mar 09 13:26:53 crc kubenswrapper[4764]: I0309 13:26:53.331227 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Mar 09 13:26:53 crc kubenswrapper[4764]: I0309 13:26:53.337073 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Mar 09 13:26:53 crc kubenswrapper[4764]: I0309 13:26:53.446908 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Mar 09 13:26:53 
crc kubenswrapper[4764]: I0309 13:26:53.487943 4764 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160
Mar 09 13:26:53 crc kubenswrapper[4764]: I0309 13:26:53.497482 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm"
Mar 09 13:26:53 crc kubenswrapper[4764]: I0309 13:26:53.512234 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls"
Mar 09 13:26:53 crc kubenswrapper[4764]: I0309 13:26:53.519470 4764 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66
Mar 09 13:26:53 crc kubenswrapper[4764]: I0309 13:26:53.520140 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podStartSLOduration=44.520122512 podStartE2EDuration="44.520122512s" podCreationTimestamp="2026-03-09 13:26:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:26:29.66090189 +0000 UTC m=+344.911073808" watchObservedRunningTime="2026-03-09 13:26:53.520122512 +0000 UTC m=+368.770294430"
Mar 09 13:26:53 crc kubenswrapper[4764]: I0309 13:26:53.525525 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-nj856","openshift-kube-apiserver/kube-apiserver-crc"]
Mar 09 13:26:53 crc kubenswrapper[4764]: I0309 13:26:53.525584 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-fb6b676c8-zsq8z","openshift-kube-apiserver/kube-apiserver-crc"]
Mar 09 13:26:53 crc kubenswrapper[4764]: E0309 13:26:53.525895 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9b3244b-8df0-4330-9887-4092260d416a" containerName="oauth-openshift"
Mar 09 13:26:53 crc kubenswrapper[4764]: I0309 13:26:53.525915 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9b3244b-8df0-4330-9887-4092260d416a" containerName="oauth-openshift"
Mar 09 13:26:53 crc kubenswrapper[4764]: E0309 13:26:53.525940 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6079e5ed-2acc-42f5-a62e-ea2a98b18abd" containerName="installer"
Mar 09 13:26:53 crc kubenswrapper[4764]: I0309 13:26:53.525948 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="6079e5ed-2acc-42f5-a62e-ea2a98b18abd" containerName="installer"
Mar 09 13:26:53 crc kubenswrapper[4764]: I0309 13:26:53.526055 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="f9b3244b-8df0-4330-9887-4092260d416a" containerName="oauth-openshift"
Mar 09 13:26:53 crc kubenswrapper[4764]: I0309 13:26:53.526071 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="6079e5ed-2acc-42f5-a62e-ea2a98b18abd" containerName="installer"
Mar 09 13:26:53 crc kubenswrapper[4764]: I0309 13:26:53.526083 4764 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="e7c21ab2-1820-47de-a61d-71d81928564a"
Mar 09 13:26:53 crc kubenswrapper[4764]: I0309 13:26:53.526108 4764 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="e7c21ab2-1820-47de-a61d-71d81928564a"
Mar 09 13:26:53 crc kubenswrapper[4764]: I0309 13:26:53.526453 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-g7k9k"]
Mar 09 13:26:53 crc kubenswrapper[4764]: I0309 13:26:53.526594 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-fb6b676c8-zsq8z"
Mar 09 13:26:53 crc kubenswrapper[4764]: I0309 13:26:53.526709 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-g7k9k" podUID="7a967c79-e11e-4c58-b42e-652d1406ac88" containerName="registry-server" containerID="cri-o://018f9058cd176b4bd85208014309fc2aaec24cd432697bbe4160e6f16524159a" gracePeriod=2
Mar 09 13:26:53 crc kubenswrapper[4764]: I0309 13:26:53.531916 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection"
Mar 09 13:26:53 crc kubenswrapper[4764]: I0309 13:26:53.532148 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error"
Mar 09 13:26:53 crc kubenswrapper[4764]: I0309 13:26:53.532295 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca"
Mar 09 13:26:53 crc kubenswrapper[4764]: I0309 13:26:53.532608 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert"
Mar 09 13:26:53 crc kubenswrapper[4764]: I0309 13:26:53.533251 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 09 13:26:53 crc kubenswrapper[4764]: I0309 13:26:53.533425 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs"
Mar 09 13:26:53 crc kubenswrapper[4764]: I0309 13:26:53.533747 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt"
Mar 09 13:26:53 crc kubenswrapper[4764]: I0309 13:26:53.534084 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit"
Mar 09 13:26:53 crc kubenswrapper[4764]: I0309 13:26:53.535662 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data"
Mar 09 13:26:53 crc kubenswrapper[4764]: I0309 13:26:53.537623 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session"
Mar 09 13:26:53 crc kubenswrapper[4764]: I0309 13:26:53.538026 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig"
Mar 09 13:26:53 crc kubenswrapper[4764]: I0309 13:26:53.538190 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc"
Mar 09 13:26:53 crc kubenswrapper[4764]: I0309 13:26:53.538568 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt"
Mar 09 13:26:53 crc kubenswrapper[4764]: I0309 13:26:53.540211 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login"
Mar 09 13:26:53 crc kubenswrapper[4764]: I0309 13:26:53.544605 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt"
Mar 09 13:26:53 crc kubenswrapper[4764]: I0309 13:26:53.545953 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt"
Mar 09 13:26:53 crc kubenswrapper[4764]: I0309 13:26:53.546732 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle"
Mar 09 13:26:53 crc kubenswrapper[4764]: I0309 13:26:53.557020 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key"
Mar 09 13:26:53 crc kubenswrapper[4764]: I0309 13:26:53.558100 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template"
Mar 09 13:26:53 crc kubenswrapper[4764]: I0309 13:26:53.567517 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=24.567492697 podStartE2EDuration="24.567492697s" podCreationTimestamp="2026-03-09 13:26:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:26:53.563555319 +0000 UTC m=+368.813727237" watchObservedRunningTime="2026-03-09 13:26:53.567492697 +0000 UTC m=+368.817664615"
Mar 09 13:26:53 crc kubenswrapper[4764]: I0309 13:26:53.568790 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f9b3244b-8df0-4330-9887-4092260d416a" path="/var/lib/kubelet/pods/f9b3244b-8df0-4330-9887-4092260d416a/volumes"
Mar 09 13:26:53 crc kubenswrapper[4764]: I0309 13:26:53.625578 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx"
Mar 09 13:26:53 crc kubenswrapper[4764]: I0309 13:26:53.648491 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/30bdf0f7-c597-42b1-80a1-20dd593c3333-v4-0-config-system-session\") pod \"oauth-openshift-fb6b676c8-zsq8z\" (UID: \"30bdf0f7-c597-42b1-80a1-20dd593c3333\") " pod="openshift-authentication/oauth-openshift-fb6b676c8-zsq8z"
Mar 09 13:26:53 crc kubenswrapper[4764]: I0309 13:26:53.649028 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/30bdf0f7-c597-42b1-80a1-20dd593c3333-v4-0-config-user-template-error\") pod \"oauth-openshift-fb6b676c8-zsq8z\" (UID: \"30bdf0f7-c597-42b1-80a1-20dd593c3333\") " pod="openshift-authentication/oauth-openshift-fb6b676c8-zsq8z"
Mar 09 13:26:53 crc kubenswrapper[4764]: I0309 13:26:53.649064 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/30bdf0f7-c597-42b1-80a1-20dd593c3333-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-fb6b676c8-zsq8z\" (UID: \"30bdf0f7-c597-42b1-80a1-20dd593c3333\") " pod="openshift-authentication/oauth-openshift-fb6b676c8-zsq8z"
Mar 09 13:26:53 crc kubenswrapper[4764]: I0309 13:26:53.649095 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/30bdf0f7-c597-42b1-80a1-20dd593c3333-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-fb6b676c8-zsq8z\" (UID: \"30bdf0f7-c597-42b1-80a1-20dd593c3333\") " pod="openshift-authentication/oauth-openshift-fb6b676c8-zsq8z"
Mar 09 13:26:53 crc kubenswrapper[4764]: I0309 13:26:53.649264 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/30bdf0f7-c597-42b1-80a1-20dd593c3333-audit-dir\") pod \"oauth-openshift-fb6b676c8-zsq8z\" (UID: \"30bdf0f7-c597-42b1-80a1-20dd593c3333\") " pod="openshift-authentication/oauth-openshift-fb6b676c8-zsq8z"
Mar 09 13:26:53 crc kubenswrapper[4764]: I0309 13:26:53.649469 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/30bdf0f7-c597-42b1-80a1-20dd593c3333-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-fb6b676c8-zsq8z\" (UID: \"30bdf0f7-c597-42b1-80a1-20dd593c3333\") " pod="openshift-authentication/oauth-openshift-fb6b676c8-zsq8z"
Mar 09 13:26:53 crc kubenswrapper[4764]: I0309 13:26:53.649550 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/30bdf0f7-c597-42b1-80a1-20dd593c3333-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-fb6b676c8-zsq8z\" (UID: \"30bdf0f7-c597-42b1-80a1-20dd593c3333\") " pod="openshift-authentication/oauth-openshift-fb6b676c8-zsq8z"
Mar 09 13:26:53 crc kubenswrapper[4764]: I0309 13:26:53.649603 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k489f\" (UniqueName: \"kubernetes.io/projected/30bdf0f7-c597-42b1-80a1-20dd593c3333-kube-api-access-k489f\") pod \"oauth-openshift-fb6b676c8-zsq8z\" (UID: \"30bdf0f7-c597-42b1-80a1-20dd593c3333\") " pod="openshift-authentication/oauth-openshift-fb6b676c8-zsq8z"
Mar 09 13:26:53 crc kubenswrapper[4764]: I0309 13:26:53.649633 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/30bdf0f7-c597-42b1-80a1-20dd593c3333-v4-0-config-system-router-certs\") pod \"oauth-openshift-fb6b676c8-zsq8z\" (UID: \"30bdf0f7-c597-42b1-80a1-20dd593c3333\") " pod="openshift-authentication/oauth-openshift-fb6b676c8-zsq8z"
Mar 09 13:26:53 crc kubenswrapper[4764]: I0309 13:26:53.649680 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/30bdf0f7-c597-42b1-80a1-20dd593c3333-v4-0-config-system-cliconfig\") pod \"oauth-openshift-fb6b676c8-zsq8z\" (UID: \"30bdf0f7-c597-42b1-80a1-20dd593c3333\") " pod="openshift-authentication/oauth-openshift-fb6b676c8-zsq8z"
Mar 09 13:26:53 crc kubenswrapper[4764]: I0309 13:26:53.649728 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/30bdf0f7-c597-42b1-80a1-20dd593c3333-audit-policies\") pod \"oauth-openshift-fb6b676c8-zsq8z\" (UID: \"30bdf0f7-c597-42b1-80a1-20dd593c3333\") " pod="openshift-authentication/oauth-openshift-fb6b676c8-zsq8z"
Mar 09 13:26:53 crc kubenswrapper[4764]: I0309 13:26:53.649900 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/30bdf0f7-c597-42b1-80a1-20dd593c3333-v4-0-config-system-serving-cert\") pod \"oauth-openshift-fb6b676c8-zsq8z\" (UID: \"30bdf0f7-c597-42b1-80a1-20dd593c3333\") " pod="openshift-authentication/oauth-openshift-fb6b676c8-zsq8z"
Mar 09 13:26:53 crc kubenswrapper[4764]: I0309 13:26:53.649936 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/30bdf0f7-c597-42b1-80a1-20dd593c3333-v4-0-config-user-template-login\") pod \"oauth-openshift-fb6b676c8-zsq8z\" (UID: \"30bdf0f7-c597-42b1-80a1-20dd593c3333\") " pod="openshift-authentication/oauth-openshift-fb6b676c8-zsq8z"
Mar 09 13:26:53 crc kubenswrapper[4764]: I0309 13:26:53.649975 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/30bdf0f7-c597-42b1-80a1-20dd593c3333-v4-0-config-system-service-ca\") pod \"oauth-openshift-fb6b676c8-zsq8z\" (UID: \"30bdf0f7-c597-42b1-80a1-20dd593c3333\") " pod="openshift-authentication/oauth-openshift-fb6b676c8-zsq8z"
Mar 09 13:26:53 crc kubenswrapper[4764]: I0309 13:26:53.705529 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets"
Mar 09 13:26:53 crc kubenswrapper[4764]: I0309 13:26:53.751131 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/30bdf0f7-c597-42b1-80a1-20dd593c3333-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-fb6b676c8-zsq8z\" (UID: \"30bdf0f7-c597-42b1-80a1-20dd593c3333\") " pod="openshift-authentication/oauth-openshift-fb6b676c8-zsq8z"
Mar 09 13:26:53 crc kubenswrapper[4764]: I0309 13:26:53.751193 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/30bdf0f7-c597-42b1-80a1-20dd593c3333-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-fb6b676c8-zsq8z\" (UID: \"30bdf0f7-c597-42b1-80a1-20dd593c3333\") " pod="openshift-authentication/oauth-openshift-fb6b676c8-zsq8z"
Mar 09 13:26:53 crc kubenswrapper[4764]: I0309 13:26:53.751217 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k489f\" (UniqueName: \"kubernetes.io/projected/30bdf0f7-c597-42b1-80a1-20dd593c3333-kube-api-access-k489f\") pod \"oauth-openshift-fb6b676c8-zsq8z\" (UID: \"30bdf0f7-c597-42b1-80a1-20dd593c3333\") " pod="openshift-authentication/oauth-openshift-fb6b676c8-zsq8z"
Mar 09 13:26:53 crc kubenswrapper[4764]: I0309 13:26:53.751234 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/30bdf0f7-c597-42b1-80a1-20dd593c3333-v4-0-config-system-cliconfig\") pod \"oauth-openshift-fb6b676c8-zsq8z\" (UID: \"30bdf0f7-c597-42b1-80a1-20dd593c3333\") " pod="openshift-authentication/oauth-openshift-fb6b676c8-zsq8z"
Mar 09 13:26:53 crc kubenswrapper[4764]: I0309 13:26:53.751255 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/30bdf0f7-c597-42b1-80a1-20dd593c3333-v4-0-config-system-router-certs\") pod \"oauth-openshift-fb6b676c8-zsq8z\" (UID: \"30bdf0f7-c597-42b1-80a1-20dd593c3333\") " pod="openshift-authentication/oauth-openshift-fb6b676c8-zsq8z"
Mar 09 13:26:53 crc kubenswrapper[4764]: I0309 13:26:53.751290 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/30bdf0f7-c597-42b1-80a1-20dd593c3333-audit-policies\") pod \"oauth-openshift-fb6b676c8-zsq8z\" (UID: \"30bdf0f7-c597-42b1-80a1-20dd593c3333\") " pod="openshift-authentication/oauth-openshift-fb6b676c8-zsq8z"
Mar 09 13:26:53 crc kubenswrapper[4764]: I0309 13:26:53.751330 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/30bdf0f7-c597-42b1-80a1-20dd593c3333-v4-0-config-system-serving-cert\") pod \"oauth-openshift-fb6b676c8-zsq8z\" (UID: \"30bdf0f7-c597-42b1-80a1-20dd593c3333\") " pod="openshift-authentication/oauth-openshift-fb6b676c8-zsq8z"
Mar 09 13:26:53 crc kubenswrapper[4764]: I0309 13:26:53.751350 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/30bdf0f7-c597-42b1-80a1-20dd593c3333-v4-0-config-user-template-login\") pod \"oauth-openshift-fb6b676c8-zsq8z\" (UID: \"30bdf0f7-c597-42b1-80a1-20dd593c3333\") " pod="openshift-authentication/oauth-openshift-fb6b676c8-zsq8z"
Mar 09 13:26:53 crc kubenswrapper[4764]: I0309 13:26:53.751371 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/30bdf0f7-c597-42b1-80a1-20dd593c3333-v4-0-config-system-service-ca\") pod \"oauth-openshift-fb6b676c8-zsq8z\" (UID: \"30bdf0f7-c597-42b1-80a1-20dd593c3333\") " pod="openshift-authentication/oauth-openshift-fb6b676c8-zsq8z"
Mar 09 13:26:53 crc kubenswrapper[4764]: I0309 13:26:53.751420 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/30bdf0f7-c597-42b1-80a1-20dd593c3333-v4-0-config-system-session\") pod \"oauth-openshift-fb6b676c8-zsq8z\" (UID: \"30bdf0f7-c597-42b1-80a1-20dd593c3333\") " pod="openshift-authentication/oauth-openshift-fb6b676c8-zsq8z"
Mar 09 13:26:53 crc kubenswrapper[4764]: I0309 13:26:53.751436 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/30bdf0f7-c597-42b1-80a1-20dd593c3333-v4-0-config-user-template-error\") pod \"oauth-openshift-fb6b676c8-zsq8z\" (UID: \"30bdf0f7-c597-42b1-80a1-20dd593c3333\") " pod="openshift-authentication/oauth-openshift-fb6b676c8-zsq8z"
Mar 09 13:26:53 crc kubenswrapper[4764]: I0309 13:26:53.751453 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/30bdf0f7-c597-42b1-80a1-20dd593c3333-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-fb6b676c8-zsq8z\" (UID: \"30bdf0f7-c597-42b1-80a1-20dd593c3333\") " pod="openshift-authentication/oauth-openshift-fb6b676c8-zsq8z"
Mar 09 13:26:53 crc kubenswrapper[4764]: I0309 13:26:53.751470 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/30bdf0f7-c597-42b1-80a1-20dd593c3333-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-fb6b676c8-zsq8z\" (UID: \"30bdf0f7-c597-42b1-80a1-20dd593c3333\") " pod="openshift-authentication/oauth-openshift-fb6b676c8-zsq8z"
Mar 09 13:26:53 crc kubenswrapper[4764]: I0309 13:26:53.751493 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/30bdf0f7-c597-42b1-80a1-20dd593c3333-audit-dir\") pod \"oauth-openshift-fb6b676c8-zsq8z\" (UID: \"30bdf0f7-c597-42b1-80a1-20dd593c3333\") " pod="openshift-authentication/oauth-openshift-fb6b676c8-zsq8z"
Mar 09 13:26:53 crc kubenswrapper[4764]: I0309 13:26:53.751578 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/30bdf0f7-c597-42b1-80a1-20dd593c3333-audit-dir\") pod \"oauth-openshift-fb6b676c8-zsq8z\" (UID: \"30bdf0f7-c597-42b1-80a1-20dd593c3333\") " pod="openshift-authentication/oauth-openshift-fb6b676c8-zsq8z"
Mar 09 13:26:53 crc kubenswrapper[4764]: I0309 13:26:53.752351 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/30bdf0f7-c597-42b1-80a1-20dd593c3333-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-fb6b676c8-zsq8z\" (UID: \"30bdf0f7-c597-42b1-80a1-20dd593c3333\") " pod="openshift-authentication/oauth-openshift-fb6b676c8-zsq8z"
Mar 09 13:26:53 crc kubenswrapper[4764]: I0309 13:26:53.752947 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/30bdf0f7-c597-42b1-80a1-20dd593c3333-v4-0-config-system-cliconfig\") pod \"oauth-openshift-fb6b676c8-zsq8z\" (UID: \"30bdf0f7-c597-42b1-80a1-20dd593c3333\") " pod="openshift-authentication/oauth-openshift-fb6b676c8-zsq8z"
Mar 09 13:26:53 crc kubenswrapper[4764]: I0309 13:26:53.752959 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/30bdf0f7-c597-42b1-80a1-20dd593c3333-v4-0-config-system-service-ca\") pod \"oauth-openshift-fb6b676c8-zsq8z\" (UID: \"30bdf0f7-c597-42b1-80a1-20dd593c3333\") " pod="openshift-authentication/oauth-openshift-fb6b676c8-zsq8z"
Mar 09 13:26:53 crc kubenswrapper[4764]: I0309 13:26:53.753577 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/30bdf0f7-c597-42b1-80a1-20dd593c3333-audit-policies\") pod \"oauth-openshift-fb6b676c8-zsq8z\" (UID: \"30bdf0f7-c597-42b1-80a1-20dd593c3333\") " pod="openshift-authentication/oauth-openshift-fb6b676c8-zsq8z"
Mar 09 13:26:53 crc kubenswrapper[4764]: I0309 13:26:53.756817 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/30bdf0f7-c597-42b1-80a1-20dd593c3333-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-fb6b676c8-zsq8z\" (UID: \"30bdf0f7-c597-42b1-80a1-20dd593c3333\") " pod="openshift-authentication/oauth-openshift-fb6b676c8-zsq8z"
Mar 09 13:26:53 crc kubenswrapper[4764]: I0309 13:26:53.756900 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/30bdf0f7-c597-42b1-80a1-20dd593c3333-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-fb6b676c8-zsq8z\" (UID: \"30bdf0f7-c597-42b1-80a1-20dd593c3333\") " pod="openshift-authentication/oauth-openshift-fb6b676c8-zsq8z"
Mar 09 13:26:53 crc kubenswrapper[4764]: I0309 13:26:53.757200 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/30bdf0f7-c597-42b1-80a1-20dd593c3333-v4-0-config-system-router-certs\") pod \"oauth-openshift-fb6b676c8-zsq8z\" (UID: \"30bdf0f7-c597-42b1-80a1-20dd593c3333\") " pod="openshift-authentication/oauth-openshift-fb6b676c8-zsq8z"
Mar 09 13:26:53 crc kubenswrapper[4764]: I0309 13:26:53.757264 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/30bdf0f7-c597-42b1-80a1-20dd593c3333-v4-0-config-system-session\") pod \"oauth-openshift-fb6b676c8-zsq8z\" (UID: \"30bdf0f7-c597-42b1-80a1-20dd593c3333\") " pod="openshift-authentication/oauth-openshift-fb6b676c8-zsq8z"
Mar 09 13:26:53 crc kubenswrapper[4764]: I0309 13:26:53.757610 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/30bdf0f7-c597-42b1-80a1-20dd593c3333-v4-0-config-system-serving-cert\") pod \"oauth-openshift-fb6b676c8-zsq8z\" (UID: \"30bdf0f7-c597-42b1-80a1-20dd593c3333\") " pod="openshift-authentication/oauth-openshift-fb6b676c8-zsq8z"
Mar 09 13:26:53 crc kubenswrapper[4764]: I0309 13:26:53.757875 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/30bdf0f7-c597-42b1-80a1-20dd593c3333-v4-0-config-user-template-login\") pod \"oauth-openshift-fb6b676c8-zsq8z\" (UID: \"30bdf0f7-c597-42b1-80a1-20dd593c3333\") " pod="openshift-authentication/oauth-openshift-fb6b676c8-zsq8z"
Mar 09 13:26:53 crc kubenswrapper[4764]: I0309 13:26:53.758632 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/30bdf0f7-c597-42b1-80a1-20dd593c3333-v4-0-config-user-template-error\") pod \"oauth-openshift-fb6b676c8-zsq8z\" (UID: \"30bdf0f7-c597-42b1-80a1-20dd593c3333\") " pod="openshift-authentication/oauth-openshift-fb6b676c8-zsq8z"
Mar 09 13:26:53 crc kubenswrapper[4764]: I0309 13:26:53.759877 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/30bdf0f7-c597-42b1-80a1-20dd593c3333-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-fb6b676c8-zsq8z\" (UID: \"30bdf0f7-c597-42b1-80a1-20dd593c3333\") " pod="openshift-authentication/oauth-openshift-fb6b676c8-zsq8z"
Mar 09 13:26:53 crc kubenswrapper[4764]: I0309 13:26:53.770969 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k489f\" (UniqueName: \"kubernetes.io/projected/30bdf0f7-c597-42b1-80a1-20dd593c3333-kube-api-access-k489f\") pod \"oauth-openshift-fb6b676c8-zsq8z\" (UID: \"30bdf0f7-c597-42b1-80a1-20dd593c3333\") " pod="openshift-authentication/oauth-openshift-fb6b676c8-zsq8z"
Mar 09 13:26:53 crc kubenswrapper[4764]: I0309 13:26:53.816168 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk"
Mar 09 13:26:53 crc kubenswrapper[4764]: I0309 13:26:53.845699 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt"
Mar 09 13:26:53 crc kubenswrapper[4764]: I0309 13:26:53.851971 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-fb6b676c8-zsq8z"
Mar 09 13:26:53 crc kubenswrapper[4764]: I0309 13:26:53.934907 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert"
Mar 09 13:26:53 crc kubenswrapper[4764]: I0309 13:26:53.958983 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-g7k9k"
Mar 09 13:26:53 crc kubenswrapper[4764]: I0309 13:26:53.982445 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt"
Mar 09 13:26:53 crc kubenswrapper[4764]: I0309 13:26:53.985775 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides"
Mar 09 13:26:54 crc kubenswrapper[4764]: I0309 13:26:54.061272 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls"
Mar 09 13:26:54 crc kubenswrapper[4764]: I0309 13:26:54.137326 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt"
Mar 09 13:26:54 crc kubenswrapper[4764]: I0309 13:26:54.156972 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7a967c79-e11e-4c58-b42e-652d1406ac88-utilities\") pod \"7a967c79-e11e-4c58-b42e-652d1406ac88\" (UID: \"7a967c79-e11e-4c58-b42e-652d1406ac88\") "
Mar 09 13:26:54 crc kubenswrapper[4764]: I0309 13:26:54.157162 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gx2jl\" (UniqueName: \"kubernetes.io/projected/7a967c79-e11e-4c58-b42e-652d1406ac88-kube-api-access-gx2jl\") pod \"7a967c79-e11e-4c58-b42e-652d1406ac88\" (UID: \"7a967c79-e11e-4c58-b42e-652d1406ac88\") "
Mar 09 13:26:54 crc kubenswrapper[4764]: I0309 13:26:54.157221 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7a967c79-e11e-4c58-b42e-652d1406ac88-catalog-content\") pod \"7a967c79-e11e-4c58-b42e-652d1406ac88\" (UID: \"7a967c79-e11e-4c58-b42e-652d1406ac88\") "
Mar 09 13:26:54 crc kubenswrapper[4764]: I0309 13:26:54.160472 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7a967c79-e11e-4c58-b42e-652d1406ac88-utilities" (OuterVolumeSpecName: "utilities") pod "7a967c79-e11e-4c58-b42e-652d1406ac88" (UID: "7a967c79-e11e-4c58-b42e-652d1406ac88"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 09 13:26:54 crc kubenswrapper[4764]: I0309 13:26:54.164020 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7a967c79-e11e-4c58-b42e-652d1406ac88-kube-api-access-gx2jl" (OuterVolumeSpecName: "kube-api-access-gx2jl") pod "7a967c79-e11e-4c58-b42e-652d1406ac88" (UID: "7a967c79-e11e-4c58-b42e-652d1406ac88"). InnerVolumeSpecName "kube-api-access-gx2jl". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 13:26:54 crc kubenswrapper[4764]: I0309 13:26:54.172729 4764 generic.go:334] "Generic (PLEG): container finished" podID="7a967c79-e11e-4c58-b42e-652d1406ac88" containerID="018f9058cd176b4bd85208014309fc2aaec24cd432697bbe4160e6f16524159a" exitCode=0
Mar 09 13:26:54 crc kubenswrapper[4764]: I0309 13:26:54.172835 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-g7k9k"
Mar 09 13:26:54 crc kubenswrapper[4764]: I0309 13:26:54.173474 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-g7k9k" event={"ID":"7a967c79-e11e-4c58-b42e-652d1406ac88","Type":"ContainerDied","Data":"018f9058cd176b4bd85208014309fc2aaec24cd432697bbe4160e6f16524159a"}
Mar 09 13:26:54 crc kubenswrapper[4764]: I0309 13:26:54.173512 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-g7k9k" event={"ID":"7a967c79-e11e-4c58-b42e-652d1406ac88","Type":"ContainerDied","Data":"2183e838c5144408fdc015b8deb0cb2c5e715404d51e8b64aa5f21859f0ebf3c"}
Mar 09 13:26:54 crc kubenswrapper[4764]: I0309 13:26:54.173532 4764 scope.go:117] "RemoveContainer" containerID="018f9058cd176b4bd85208014309fc2aaec24cd432697bbe4160e6f16524159a"
Mar 09 13:26:54 crc kubenswrapper[4764]: I0309 13:26:54.194259 4764 scope.go:117] "RemoveContainer" containerID="5ed7c7fd6e4e524d04d9ab3277bdad76d1f17e39f4ce65077628067a3a616861"
Mar 09 13:26:54 crc kubenswrapper[4764]: I0309 13:26:54.198923 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7a967c79-e11e-4c58-b42e-652d1406ac88-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7a967c79-e11e-4c58-b42e-652d1406ac88" (UID: "7a967c79-e11e-4c58-b42e-652d1406ac88"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 09 13:26:54 crc kubenswrapper[4764]: I0309 13:26:54.214769 4764 scope.go:117] "RemoveContainer" containerID="a7592b9bea39d66aa3632078215711f53e09d218ed98410b531203d64bd85eca"
Mar 09 13:26:54 crc kubenswrapper[4764]: I0309 13:26:54.230972 4764 scope.go:117] "RemoveContainer" containerID="018f9058cd176b4bd85208014309fc2aaec24cd432697bbe4160e6f16524159a"
Mar 09 13:26:54 crc kubenswrapper[4764]: E0309 13:26:54.231647 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"018f9058cd176b4bd85208014309fc2aaec24cd432697bbe4160e6f16524159a\": container with ID starting with 018f9058cd176b4bd85208014309fc2aaec24cd432697bbe4160e6f16524159a not found: ID does not exist" containerID="018f9058cd176b4bd85208014309fc2aaec24cd432697bbe4160e6f16524159a"
Mar 09 13:26:54 crc kubenswrapper[4764]: I0309 13:26:54.231699 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"018f9058cd176b4bd85208014309fc2aaec24cd432697bbe4160e6f16524159a"} err="failed to get container status \"018f9058cd176b4bd85208014309fc2aaec24cd432697bbe4160e6f16524159a\": rpc error: code = NotFound desc = could not find container \"018f9058cd176b4bd85208014309fc2aaec24cd432697bbe4160e6f16524159a\": container with ID starting with 018f9058cd176b4bd85208014309fc2aaec24cd432697bbe4160e6f16524159a not found: ID does not exist"
Mar 09 13:26:54 crc kubenswrapper[4764]: I0309 13:26:54.231722 4764 scope.go:117] "RemoveContainer" containerID="5ed7c7fd6e4e524d04d9ab3277bdad76d1f17e39f4ce65077628067a3a616861"
Mar 09 13:26:54 crc kubenswrapper[4764]: E0309 13:26:54.232134 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5ed7c7fd6e4e524d04d9ab3277bdad76d1f17e39f4ce65077628067a3a616861\": container with ID starting with 5ed7c7fd6e4e524d04d9ab3277bdad76d1f17e39f4ce65077628067a3a616861 not found: ID does not exist" containerID="5ed7c7fd6e4e524d04d9ab3277bdad76d1f17e39f4ce65077628067a3a616861"
Mar 09 13:26:54 crc kubenswrapper[4764]: I0309 13:26:54.232210 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5ed7c7fd6e4e524d04d9ab3277bdad76d1f17e39f4ce65077628067a3a616861"} err="failed to get container status \"5ed7c7fd6e4e524d04d9ab3277bdad76d1f17e39f4ce65077628067a3a616861\": rpc error: code = NotFound desc = could not find container \"5ed7c7fd6e4e524d04d9ab3277bdad76d1f17e39f4ce65077628067a3a616861\": container with ID starting with 5ed7c7fd6e4e524d04d9ab3277bdad76d1f17e39f4ce65077628067a3a616861 not found: ID does not exist"
Mar 09 13:26:54 crc kubenswrapper[4764]: I0309 13:26:54.232300 4764 scope.go:117] "RemoveContainer" containerID="a7592b9bea39d66aa3632078215711f53e09d218ed98410b531203d64bd85eca"
Mar 09 13:26:54 crc kubenswrapper[4764]: E0309 13:26:54.232983 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a7592b9bea39d66aa3632078215711f53e09d218ed98410b531203d64bd85eca\": container with ID starting with a7592b9bea39d66aa3632078215711f53e09d218ed98410b531203d64bd85eca not found: ID does not exist" containerID="a7592b9bea39d66aa3632078215711f53e09d218ed98410b531203d64bd85eca"
Mar 09 13:26:54 crc kubenswrapper[4764]: I0309 13:26:54.233038 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a7592b9bea39d66aa3632078215711f53e09d218ed98410b531203d64bd85eca"} err="failed to get container status \"a7592b9bea39d66aa3632078215711f53e09d218ed98410b531203d64bd85eca\": rpc error: code = NotFound desc = could not find container \"a7592b9bea39d66aa3632078215711f53e09d218ed98410b531203d64bd85eca\": container with ID starting with a7592b9bea39d66aa3632078215711f53e09d218ed98410b531203d64bd85eca not found: ID does not exist"
Mar 09 13:26:54 crc kubenswrapper[4764]: I0309 13:26:54.260003 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gx2jl\" (UniqueName: \"kubernetes.io/projected/7a967c79-e11e-4c58-b42e-652d1406ac88-kube-api-access-gx2jl\") on node \"crc\" DevicePath \"\""
Mar 09 13:26:54 crc kubenswrapper[4764]: I0309 13:26:54.260030 4764 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7a967c79-e11e-4c58-b42e-652d1406ac88-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 09 13:26:54 crc kubenswrapper[4764]: I0309 13:26:54.260038 4764 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7a967c79-e11e-4c58-b42e-652d1406ac88-utilities\") on node \"crc\" DevicePath \"\""
Mar 09 13:26:54 crc kubenswrapper[4764]: I0309 13:26:54.289737 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-fb6b676c8-zsq8z"]
Mar 09 13:26:54 crc kubenswrapper[4764]: W0309 13:26:54.292374 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod30bdf0f7_c597_42b1_80a1_20dd593c3333.slice/crio-e2458c779c30a86ce1ffd2d7c27698d8f52d1ee90a69670513e1fb9403db4910 WatchSource:0}: Error finding container e2458c779c30a86ce1ffd2d7c27698d8f52d1ee90a69670513e1fb9403db4910: Status 404 returned error can't find the container with id e2458c779c30a86ce1ffd2d7c27698d8f52d1ee90a69670513e1fb9403db4910
Mar 09 13:26:54 crc kubenswrapper[4764]: I0309 13:26:54.502411 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-g7k9k"]
Mar 09 13:26:54 crc kubenswrapper[4764]: I0309 13:26:54.508914 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-g7k9k"]
Mar 09 13:26:54 crc kubenswrapper[4764]: I0309 13:26:54.583503 4764 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160
Mar 09 13:26:54 crc kubenswrapper[4764]: I0309 13:26:54.663312 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq"
Mar 09 13:26:54 crc kubenswrapper[4764]: I0309 13:26:54.778787 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt"
Mar 09 13:26:54 crc kubenswrapper[4764]: I0309 13:26:54.810030 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist"
Mar 09 13:26:54 crc kubenswrapper[4764]: I0309 13:26:54.846390 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg"
Mar 09 13:26:55 crc kubenswrapper[4764]: I0309 13:26:55.088875 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt"
Mar 09 13:26:55 crc kubenswrapper[4764]: I0309 13:26:55.179591 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-fb6b676c8-zsq8z" event={"ID":"30bdf0f7-c597-42b1-80a1-20dd593c3333","Type":"ContainerStarted","Data":"fbf8ad1016fa6322c6426f8e0aa6c9b8ce7c5b7f4dd4a83bfa7292d4743055e4"}
Mar 09 13:26:55 crc kubenswrapper[4764]: I0309 13:26:55.179637 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-fb6b676c8-zsq8z" event={"ID":"30bdf0f7-c597-42b1-80a1-20dd593c3333","Type":"ContainerStarted","Data":"e2458c779c30a86ce1ffd2d7c27698d8f52d1ee90a69670513e1fb9403db4910"}
Mar 09 13:26:55 crc kubenswrapper[4764]: I0309 13:26:55.179968 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-fb6b676c8-zsq8z"
Mar 09 13:26:55 crc kubenswrapper[4764]: I0309 13:26:55.185582 4764 kubelet.go:2542] "SyncLoop (probe)"
probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-fb6b676c8-zsq8z" Mar 09 13:26:55 crc kubenswrapper[4764]: I0309 13:26:55.198415 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-fb6b676c8-zsq8z" podStartSLOduration=58.198400025 podStartE2EDuration="58.198400025s" podCreationTimestamp="2026-03-09 13:25:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:26:55.198182589 +0000 UTC m=+370.448354507" watchObservedRunningTime="2026-03-09 13:26:55.198400025 +0000 UTC m=+370.448571933" Mar 09 13:26:55 crc kubenswrapper[4764]: I0309 13:26:55.226452 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Mar 09 13:26:55 crc kubenswrapper[4764]: I0309 13:26:55.411833 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Mar 09 13:26:55 crc kubenswrapper[4764]: I0309 13:26:55.424490 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Mar 09 13:26:55 crc kubenswrapper[4764]: I0309 13:26:55.443096 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Mar 09 13:26:55 crc kubenswrapper[4764]: I0309 13:26:55.479278 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Mar 09 13:26:55 crc kubenswrapper[4764]: I0309 13:26:55.542499 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Mar 09 13:26:55 crc kubenswrapper[4764]: I0309 13:26:55.574535 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="7a967c79-e11e-4c58-b42e-652d1406ac88" path="/var/lib/kubelet/pods/7a967c79-e11e-4c58-b42e-652d1406ac88/volumes" Mar 09 13:26:55 crc kubenswrapper[4764]: I0309 13:26:55.620292 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Mar 09 13:26:55 crc kubenswrapper[4764]: I0309 13:26:55.835156 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Mar 09 13:26:55 crc kubenswrapper[4764]: I0309 13:26:55.835972 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Mar 09 13:26:55 crc kubenswrapper[4764]: I0309 13:26:55.919068 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Mar 09 13:26:55 crc kubenswrapper[4764]: I0309 13:26:55.919152 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Mar 09 13:26:56 crc kubenswrapper[4764]: I0309 13:26:56.167143 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Mar 09 13:26:56 crc kubenswrapper[4764]: I0309 13:26:56.203141 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Mar 09 13:26:56 crc kubenswrapper[4764]: I0309 13:26:56.463675 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Mar 09 13:26:56 crc kubenswrapper[4764]: I0309 13:26:56.634353 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Mar 09 13:26:56 crc kubenswrapper[4764]: I0309 13:26:56.655196 4764 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-ingress-operator"/"openshift-service-ca.crt" Mar 09 13:26:57 crc kubenswrapper[4764]: I0309 13:26:57.301948 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Mar 09 13:26:57 crc kubenswrapper[4764]: I0309 13:26:57.650688 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Mar 09 13:26:57 crc kubenswrapper[4764]: I0309 13:26:57.763438 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Mar 09 13:26:57 crc kubenswrapper[4764]: I0309 13:26:57.788131 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Mar 09 13:26:59 crc kubenswrapper[4764]: I0309 13:26:59.924739 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Mar 09 13:27:03 crc kubenswrapper[4764]: I0309 13:27:03.595521 4764 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 09 13:27:03 crc kubenswrapper[4764]: I0309 13:27:03.596036 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" containerID="cri-o://34d89c76e37e470d7c44361ef7ee1eb9e62495b1f676f3e88198ea7b09a66772" gracePeriod=5 Mar 09 13:27:09 crc kubenswrapper[4764]: I0309 13:27:09.173184 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Mar 09 13:27:09 crc kubenswrapper[4764]: I0309 13:27:09.174061 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 09 13:27:09 crc kubenswrapper[4764]: I0309 13:27:09.257611 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Mar 09 13:27:09 crc kubenswrapper[4764]: I0309 13:27:09.258087 4764 generic.go:334] "Generic (PLEG): container finished" podID="f85e55b1a89d02b0cb034b1ea31ed45a" containerID="34d89c76e37e470d7c44361ef7ee1eb9e62495b1f676f3e88198ea7b09a66772" exitCode=137 Mar 09 13:27:09 crc kubenswrapper[4764]: I0309 13:27:09.258168 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 09 13:27:09 crc kubenswrapper[4764]: I0309 13:27:09.258175 4764 scope.go:117] "RemoveContainer" containerID="34d89c76e37e470d7c44361ef7ee1eb9e62495b1f676f3e88198ea7b09a66772" Mar 09 13:27:09 crc kubenswrapper[4764]: I0309 13:27:09.275796 4764 scope.go:117] "RemoveContainer" containerID="34d89c76e37e470d7c44361ef7ee1eb9e62495b1f676f3e88198ea7b09a66772" Mar 09 13:27:09 crc kubenswrapper[4764]: E0309 13:27:09.276364 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"34d89c76e37e470d7c44361ef7ee1eb9e62495b1f676f3e88198ea7b09a66772\": container with ID starting with 34d89c76e37e470d7c44361ef7ee1eb9e62495b1f676f3e88198ea7b09a66772 not found: ID does not exist" containerID="34d89c76e37e470d7c44361ef7ee1eb9e62495b1f676f3e88198ea7b09a66772" Mar 09 13:27:09 crc kubenswrapper[4764]: I0309 13:27:09.276406 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"34d89c76e37e470d7c44361ef7ee1eb9e62495b1f676f3e88198ea7b09a66772"} err="failed to get container status \"34d89c76e37e470d7c44361ef7ee1eb9e62495b1f676f3e88198ea7b09a66772\": rpc error: code = NotFound desc = could 
not find container \"34d89c76e37e470d7c44361ef7ee1eb9e62495b1f676f3e88198ea7b09a66772\": container with ID starting with 34d89c76e37e470d7c44361ef7ee1eb9e62495b1f676f3e88198ea7b09a66772 not found: ID does not exist" Mar 09 13:27:09 crc kubenswrapper[4764]: I0309 13:27:09.366823 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 09 13:27:09 crc kubenswrapper[4764]: I0309 13:27:09.366892 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 09 13:27:09 crc kubenswrapper[4764]: I0309 13:27:09.366946 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 09 13:27:09 crc kubenswrapper[4764]: I0309 13:27:09.366980 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 09 13:27:09 crc kubenswrapper[4764]: I0309 13:27:09.367038 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 09 13:27:09 crc kubenswrapper[4764]: I0309 13:27:09.367488 4764 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests" (OuterVolumeSpecName: "manifests") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "manifests". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 09 13:27:09 crc kubenswrapper[4764]: I0309 13:27:09.367545 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 09 13:27:09 crc kubenswrapper[4764]: I0309 13:27:09.367625 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log" (OuterVolumeSpecName: "var-log") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 09 13:27:09 crc kubenswrapper[4764]: I0309 13:27:09.367958 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock" (OuterVolumeSpecName: "var-lock") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 09 13:27:09 crc kubenswrapper[4764]: I0309 13:27:09.376140 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "pod-resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 09 13:27:09 crc kubenswrapper[4764]: I0309 13:27:09.468587 4764 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") on node \"crc\" DevicePath \"\"" Mar 09 13:27:09 crc kubenswrapper[4764]: I0309 13:27:09.468673 4764 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") on node \"crc\" DevicePath \"\"" Mar 09 13:27:09 crc kubenswrapper[4764]: I0309 13:27:09.468686 4764 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") on node \"crc\" DevicePath \"\"" Mar 09 13:27:09 crc kubenswrapper[4764]: I0309 13:27:09.468695 4764 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") on node \"crc\" DevicePath \"\"" Mar 09 13:27:09 crc kubenswrapper[4764]: I0309 13:27:09.468707 4764 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") on node \"crc\" DevicePath \"\"" Mar 09 13:27:09 crc kubenswrapper[4764]: I0309 13:27:09.566885 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" path="/var/lib/kubelet/pods/f85e55b1a89d02b0cb034b1ea31ed45a/volumes" Mar 09 13:27:09 crc kubenswrapper[4764]: I0309 13:27:09.567168 4764 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="" Mar 09 13:27:09 crc kubenswrapper[4764]: I0309 13:27:09.582560 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 09 13:27:09 crc kubenswrapper[4764]: 
I0309 13:27:09.582626 4764 kubelet.go:2649] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" mirrorPodUID="9b9d0900-658e-4494-b290-05115386e626" Mar 09 13:27:09 crc kubenswrapper[4764]: I0309 13:27:09.586980 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 09 13:27:09 crc kubenswrapper[4764]: I0309 13:27:09.587014 4764 kubelet.go:2673] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" mirrorPodUID="9b9d0900-658e-4494-b290-05115386e626" Mar 09 13:27:21 crc kubenswrapper[4764]: I0309 13:27:21.351316 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/1.log" Mar 09 13:27:21 crc kubenswrapper[4764]: I0309 13:27:21.354206 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/1.log" Mar 09 13:27:21 crc kubenswrapper[4764]: I0309 13:27:21.355469 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Mar 09 13:27:21 crc kubenswrapper[4764]: I0309 13:27:21.355518 4764 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="fb20688a647451d7fe5639f828c352ed485f469ea775fff8f4e5ab64cd8e6498" exitCode=137 Mar 09 13:27:21 crc kubenswrapper[4764]: I0309 13:27:21.355550 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"fb20688a647451d7fe5639f828c352ed485f469ea775fff8f4e5ab64cd8e6498"} 
Mar 09 13:27:21 crc kubenswrapper[4764]: I0309 13:27:21.355590 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"2ab451f1911e48e54d9d1932bca99cb132f443fb9a8d6dc8e6229a6e072ef358"} Mar 09 13:27:21 crc kubenswrapper[4764]: I0309 13:27:21.355612 4764 scope.go:117] "RemoveContainer" containerID="c7857b72b1425eeefd8acf226dd6f0d6c783e14267fcfb79dd038118573a2b26" Mar 09 13:27:22 crc kubenswrapper[4764]: I0309 13:27:22.121269 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 09 13:27:22 crc kubenswrapper[4764]: I0309 13:27:22.366275 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/1.log" Mar 09 13:27:22 crc kubenswrapper[4764]: I0309 13:27:22.367952 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/1.log" Mar 09 13:27:30 crc kubenswrapper[4764]: I0309 13:27:30.568440 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 09 13:27:30 crc kubenswrapper[4764]: I0309 13:27:30.575342 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 09 13:27:31 crc kubenswrapper[4764]: I0309 13:27:31.418451 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 09 13:27:58 crc kubenswrapper[4764]: I0309 13:27:58.370731 4764 patch_prober.go:28] interesting pod/machine-config-daemon-xxczl 
container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 09 13:27:58 crc kubenswrapper[4764]: I0309 13:27:58.371381 4764 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 09 13:28:00 crc kubenswrapper[4764]: I0309 13:28:00.204961 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29551048-fgf8g"] Mar 09 13:28:00 crc kubenswrapper[4764]: E0309 13:28:00.205804 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a967c79-e11e-4c58-b42e-652d1406ac88" containerName="extract-utilities" Mar 09 13:28:00 crc kubenswrapper[4764]: I0309 13:28:00.205829 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a967c79-e11e-4c58-b42e-652d1406ac88" containerName="extract-utilities" Mar 09 13:28:00 crc kubenswrapper[4764]: E0309 13:28:00.205865 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a967c79-e11e-4c58-b42e-652d1406ac88" containerName="extract-content" Mar 09 13:28:00 crc kubenswrapper[4764]: I0309 13:28:00.205878 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a967c79-e11e-4c58-b42e-652d1406ac88" containerName="extract-content" Mar 09 13:28:00 crc kubenswrapper[4764]: E0309 13:28:00.205905 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a967c79-e11e-4c58-b42e-652d1406ac88" containerName="registry-server" Mar 09 13:28:00 crc kubenswrapper[4764]: I0309 13:28:00.205922 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a967c79-e11e-4c58-b42e-652d1406ac88" containerName="registry-server" Mar 09 13:28:00 
crc kubenswrapper[4764]: E0309 13:28:00.205946 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Mar 09 13:28:00 crc kubenswrapper[4764]: I0309 13:28:00.205958 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Mar 09 13:28:00 crc kubenswrapper[4764]: I0309 13:28:00.206380 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="7a967c79-e11e-4c58-b42e-652d1406ac88" containerName="registry-server" Mar 09 13:28:00 crc kubenswrapper[4764]: I0309 13:28:00.206444 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Mar 09 13:28:00 crc kubenswrapper[4764]: I0309 13:28:00.207334 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551048-fgf8g" Mar 09 13:28:00 crc kubenswrapper[4764]: I0309 13:28:00.209974 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-4s7mr" Mar 09 13:28:00 crc kubenswrapper[4764]: I0309 13:28:00.211563 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 09 13:28:00 crc kubenswrapper[4764]: I0309 13:28:00.212075 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 09 13:28:00 crc kubenswrapper[4764]: I0309 13:28:00.244285 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cd5xb\" (UniqueName: \"kubernetes.io/projected/0c3ec5bd-35e8-4d0a-8541-46eab64a0f8f-kube-api-access-cd5xb\") pod \"auto-csr-approver-29551048-fgf8g\" (UID: \"0c3ec5bd-35e8-4d0a-8541-46eab64a0f8f\") " pod="openshift-infra/auto-csr-approver-29551048-fgf8g" Mar 09 13:28:00 crc kubenswrapper[4764]: I0309 13:28:00.249582 
4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551048-fgf8g"] Mar 09 13:28:00 crc kubenswrapper[4764]: I0309 13:28:00.345971 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cd5xb\" (UniqueName: \"kubernetes.io/projected/0c3ec5bd-35e8-4d0a-8541-46eab64a0f8f-kube-api-access-cd5xb\") pod \"auto-csr-approver-29551048-fgf8g\" (UID: \"0c3ec5bd-35e8-4d0a-8541-46eab64a0f8f\") " pod="openshift-infra/auto-csr-approver-29551048-fgf8g" Mar 09 13:28:00 crc kubenswrapper[4764]: I0309 13:28:00.373943 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cd5xb\" (UniqueName: \"kubernetes.io/projected/0c3ec5bd-35e8-4d0a-8541-46eab64a0f8f-kube-api-access-cd5xb\") pod \"auto-csr-approver-29551048-fgf8g\" (UID: \"0c3ec5bd-35e8-4d0a-8541-46eab64a0f8f\") " pod="openshift-infra/auto-csr-approver-29551048-fgf8g" Mar 09 13:28:00 crc kubenswrapper[4764]: I0309 13:28:00.541111 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551048-fgf8g" Mar 09 13:28:01 crc kubenswrapper[4764]: I0309 13:28:01.006035 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551048-fgf8g"] Mar 09 13:28:01 crc kubenswrapper[4764]: I0309 13:28:01.616589 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551048-fgf8g" event={"ID":"0c3ec5bd-35e8-4d0a-8541-46eab64a0f8f","Type":"ContainerStarted","Data":"dff5be776046969c512855e99368dd5e3ffa23c7cd0821f06ec61907f21c8cae"} Mar 09 13:28:02 crc kubenswrapper[4764]: I0309 13:28:02.626031 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551048-fgf8g" event={"ID":"0c3ec5bd-35e8-4d0a-8541-46eab64a0f8f","Type":"ContainerStarted","Data":"f44561c47745677a2b6e923d3f449a7c01740fc8b6e465eeb90cec0a7d1ebe67"} Mar 09 13:28:02 crc kubenswrapper[4764]: I0309 13:28:02.641696 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29551048-fgf8g" podStartSLOduration=1.454571248 podStartE2EDuration="2.641677917s" podCreationTimestamp="2026-03-09 13:28:00 +0000 UTC" firstStartedPulling="2026-03-09 13:28:01.021756345 +0000 UTC m=+436.271928253" lastFinishedPulling="2026-03-09 13:28:02.208863014 +0000 UTC m=+437.459034922" observedRunningTime="2026-03-09 13:28:02.639469067 +0000 UTC m=+437.889640975" watchObservedRunningTime="2026-03-09 13:28:02.641677917 +0000 UTC m=+437.891849825" Mar 09 13:28:03 crc kubenswrapper[4764]: I0309 13:28:03.635552 4764 generic.go:334] "Generic (PLEG): container finished" podID="0c3ec5bd-35e8-4d0a-8541-46eab64a0f8f" containerID="f44561c47745677a2b6e923d3f449a7c01740fc8b6e465eeb90cec0a7d1ebe67" exitCode=0 Mar 09 13:28:03 crc kubenswrapper[4764]: I0309 13:28:03.635607 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551048-fgf8g" 
event={"ID":"0c3ec5bd-35e8-4d0a-8541-46eab64a0f8f","Type":"ContainerDied","Data":"f44561c47745677a2b6e923d3f449a7c01740fc8b6e465eeb90cec0a7d1ebe67"} Mar 09 13:28:04 crc kubenswrapper[4764]: I0309 13:28:04.973290 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551048-fgf8g" Mar 09 13:28:05 crc kubenswrapper[4764]: I0309 13:28:05.119568 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cd5xb\" (UniqueName: \"kubernetes.io/projected/0c3ec5bd-35e8-4d0a-8541-46eab64a0f8f-kube-api-access-cd5xb\") pod \"0c3ec5bd-35e8-4d0a-8541-46eab64a0f8f\" (UID: \"0c3ec5bd-35e8-4d0a-8541-46eab64a0f8f\") " Mar 09 13:28:05 crc kubenswrapper[4764]: I0309 13:28:05.126800 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0c3ec5bd-35e8-4d0a-8541-46eab64a0f8f-kube-api-access-cd5xb" (OuterVolumeSpecName: "kube-api-access-cd5xb") pod "0c3ec5bd-35e8-4d0a-8541-46eab64a0f8f" (UID: "0c3ec5bd-35e8-4d0a-8541-46eab64a0f8f"). InnerVolumeSpecName "kube-api-access-cd5xb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:28:05 crc kubenswrapper[4764]: I0309 13:28:05.222293 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cd5xb\" (UniqueName: \"kubernetes.io/projected/0c3ec5bd-35e8-4d0a-8541-46eab64a0f8f-kube-api-access-cd5xb\") on node \"crc\" DevicePath \"\"" Mar 09 13:28:05 crc kubenswrapper[4764]: I0309 13:28:05.658043 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551048-fgf8g" Mar 09 13:28:05 crc kubenswrapper[4764]: I0309 13:28:05.657895 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551048-fgf8g" event={"ID":"0c3ec5bd-35e8-4d0a-8541-46eab64a0f8f","Type":"ContainerDied","Data":"dff5be776046969c512855e99368dd5e3ffa23c7cd0821f06ec61907f21c8cae"} Mar 09 13:28:05 crc kubenswrapper[4764]: I0309 13:28:05.660327 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dff5be776046969c512855e99368dd5e3ffa23c7cd0821f06ec61907f21c8cae" Mar 09 13:28:13 crc kubenswrapper[4764]: I0309 13:28:13.162726 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-8d627"] Mar 09 13:28:13 crc kubenswrapper[4764]: I0309 13:28:13.163545 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-8d627" podUID="88ba6041-7f8f-48f0-840c-8ea2a9bdc87b" containerName="registry-server" containerID="cri-o://5dafc18a60471b2aa074fa19f2a0f2019626c0dc18f4738cc76636c111e1e0e1" gracePeriod=30 Mar 09 13:28:13 crc kubenswrapper[4764]: I0309 13:28:13.187914 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-nrc8s"] Mar 09 13:28:13 crc kubenswrapper[4764]: I0309 13:28:13.188191 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-nrc8s" podUID="be22cbfb-d3e7-43c1-be38-f6fcadeb2c97" containerName="registry-server" containerID="cri-o://0dca8e186618d6e08ce56c027491c8ad89ea2784547d9fbdcf530de2e5a43711" gracePeriod=30 Mar 09 13:28:13 crc kubenswrapper[4764]: I0309 13:28:13.192277 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-d4gwh"] Mar 09 13:28:13 crc kubenswrapper[4764]: I0309 13:28:13.194719 4764 kuberuntime_container.go:808] 
"Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-d4gwh" podUID="1ccc5b44-95ad-4f4c-8086-c176c41bbd19" containerName="marketplace-operator" containerID="cri-o://2b23f753224c479f8f3f33754ceac00343e6409f6988048cb245d41d456b1666" gracePeriod=30 Mar 09 13:28:13 crc kubenswrapper[4764]: I0309 13:28:13.218579 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-m2gc7"] Mar 09 13:28:13 crc kubenswrapper[4764]: E0309 13:28:13.218950 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0c3ec5bd-35e8-4d0a-8541-46eab64a0f8f" containerName="oc" Mar 09 13:28:13 crc kubenswrapper[4764]: I0309 13:28:13.218965 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c3ec5bd-35e8-4d0a-8541-46eab64a0f8f" containerName="oc" Mar 09 13:28:13 crc kubenswrapper[4764]: I0309 13:28:13.219069 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="0c3ec5bd-35e8-4d0a-8541-46eab64a0f8f" containerName="oc" Mar 09 13:28:13 crc kubenswrapper[4764]: I0309 13:28:13.219463 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-m2gc7" Mar 09 13:28:13 crc kubenswrapper[4764]: I0309 13:28:13.241503 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-qhs57"] Mar 09 13:28:13 crc kubenswrapper[4764]: I0309 13:28:13.241860 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-qhs57" podUID="691ffa6f-3ee6-47fa-bcef-9fdd74ac86df" containerName="registry-server" containerID="cri-o://d02322e51b07c573018bc35bb85c4b069d5e04cb23ef61de23f558be13356214" gracePeriod=30 Mar 09 13:28:13 crc kubenswrapper[4764]: I0309 13:28:13.253688 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-m2gc7"] Mar 09 13:28:13 crc kubenswrapper[4764]: I0309 13:28:13.264449 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-tll5t"] Mar 09 13:28:13 crc kubenswrapper[4764]: I0309 13:28:13.264780 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-tll5t" podUID="41a3ce40-a2f8-4bc3-8fbb-eccfb1ec4ce6" containerName="registry-server" containerID="cri-o://4325cf9007df54fa3b2a5bed7103ccce2df4b9e7e47622fe04491c8fac8d1503" gracePeriod=30 Mar 09 13:28:13 crc kubenswrapper[4764]: I0309 13:28:13.326854 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/4351c9fc-c207-4d15-b8a6-f51c0651fe83-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-m2gc7\" (UID: \"4351c9fc-c207-4d15-b8a6-f51c0651fe83\") " pod="openshift-marketplace/marketplace-operator-79b997595-m2gc7" Mar 09 13:28:13 crc kubenswrapper[4764]: I0309 13:28:13.326900 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-4hwt8\" (UniqueName: \"kubernetes.io/projected/4351c9fc-c207-4d15-b8a6-f51c0651fe83-kube-api-access-4hwt8\") pod \"marketplace-operator-79b997595-m2gc7\" (UID: \"4351c9fc-c207-4d15-b8a6-f51c0651fe83\") " pod="openshift-marketplace/marketplace-operator-79b997595-m2gc7" Mar 09 13:28:13 crc kubenswrapper[4764]: I0309 13:28:13.326928 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4351c9fc-c207-4d15-b8a6-f51c0651fe83-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-m2gc7\" (UID: \"4351c9fc-c207-4d15-b8a6-f51c0651fe83\") " pod="openshift-marketplace/marketplace-operator-79b997595-m2gc7" Mar 09 13:28:13 crc kubenswrapper[4764]: I0309 13:28:13.428631 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/4351c9fc-c207-4d15-b8a6-f51c0651fe83-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-m2gc7\" (UID: \"4351c9fc-c207-4d15-b8a6-f51c0651fe83\") " pod="openshift-marketplace/marketplace-operator-79b997595-m2gc7" Mar 09 13:28:13 crc kubenswrapper[4764]: I0309 13:28:13.428713 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4hwt8\" (UniqueName: \"kubernetes.io/projected/4351c9fc-c207-4d15-b8a6-f51c0651fe83-kube-api-access-4hwt8\") pod \"marketplace-operator-79b997595-m2gc7\" (UID: \"4351c9fc-c207-4d15-b8a6-f51c0651fe83\") " pod="openshift-marketplace/marketplace-operator-79b997595-m2gc7" Mar 09 13:28:13 crc kubenswrapper[4764]: I0309 13:28:13.428753 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4351c9fc-c207-4d15-b8a6-f51c0651fe83-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-m2gc7\" (UID: \"4351c9fc-c207-4d15-b8a6-f51c0651fe83\") " 
pod="openshift-marketplace/marketplace-operator-79b997595-m2gc7" Mar 09 13:28:13 crc kubenswrapper[4764]: I0309 13:28:13.430165 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4351c9fc-c207-4d15-b8a6-f51c0651fe83-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-m2gc7\" (UID: \"4351c9fc-c207-4d15-b8a6-f51c0651fe83\") " pod="openshift-marketplace/marketplace-operator-79b997595-m2gc7" Mar 09 13:28:13 crc kubenswrapper[4764]: I0309 13:28:13.436601 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/4351c9fc-c207-4d15-b8a6-f51c0651fe83-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-m2gc7\" (UID: \"4351c9fc-c207-4d15-b8a6-f51c0651fe83\") " pod="openshift-marketplace/marketplace-operator-79b997595-m2gc7" Mar 09 13:28:13 crc kubenswrapper[4764]: I0309 13:28:13.449937 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4hwt8\" (UniqueName: \"kubernetes.io/projected/4351c9fc-c207-4d15-b8a6-f51c0651fe83-kube-api-access-4hwt8\") pod \"marketplace-operator-79b997595-m2gc7\" (UID: \"4351c9fc-c207-4d15-b8a6-f51c0651fe83\") " pod="openshift-marketplace/marketplace-operator-79b997595-m2gc7" Mar 09 13:28:13 crc kubenswrapper[4764]: I0309 13:28:13.539203 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-m2gc7" Mar 09 13:28:13 crc kubenswrapper[4764]: I0309 13:28:13.670757 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-8d627" Mar 09 13:28:13 crc kubenswrapper[4764]: I0309 13:28:13.687093 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-nrc8s" Mar 09 13:28:13 crc kubenswrapper[4764]: I0309 13:28:13.698946 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-qhs57" Mar 09 13:28:13 crc kubenswrapper[4764]: I0309 13:28:13.721154 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-d4gwh" Mar 09 13:28:13 crc kubenswrapper[4764]: I0309 13:28:13.743567 4764 generic.go:334] "Generic (PLEG): container finished" podID="1ccc5b44-95ad-4f4c-8086-c176c41bbd19" containerID="2b23f753224c479f8f3f33754ceac00343e6409f6988048cb245d41d456b1666" exitCode=0 Mar 09 13:28:13 crc kubenswrapper[4764]: I0309 13:28:13.743681 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-d4gwh" Mar 09 13:28:13 crc kubenswrapper[4764]: I0309 13:28:13.744154 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-d4gwh" event={"ID":"1ccc5b44-95ad-4f4c-8086-c176c41bbd19","Type":"ContainerDied","Data":"2b23f753224c479f8f3f33754ceac00343e6409f6988048cb245d41d456b1666"} Mar 09 13:28:13 crc kubenswrapper[4764]: I0309 13:28:13.744205 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-d4gwh" event={"ID":"1ccc5b44-95ad-4f4c-8086-c176c41bbd19","Type":"ContainerDied","Data":"771cd63965fde5f5f03cba604e9f4e1989cf6a4881a27fbd710be5727898d90a"} Mar 09 13:28:13 crc kubenswrapper[4764]: I0309 13:28:13.744226 4764 scope.go:117] "RemoveContainer" containerID="2b23f753224c479f8f3f33754ceac00343e6409f6988048cb245d41d456b1666" Mar 09 13:28:13 crc kubenswrapper[4764]: I0309 13:28:13.756448 4764 generic.go:334] "Generic (PLEG): container finished" podID="be22cbfb-d3e7-43c1-be38-f6fcadeb2c97" 
containerID="0dca8e186618d6e08ce56c027491c8ad89ea2784547d9fbdcf530de2e5a43711" exitCode=0 Mar 09 13:28:13 crc kubenswrapper[4764]: I0309 13:28:13.756523 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nrc8s" event={"ID":"be22cbfb-d3e7-43c1-be38-f6fcadeb2c97","Type":"ContainerDied","Data":"0dca8e186618d6e08ce56c027491c8ad89ea2784547d9fbdcf530de2e5a43711"} Mar 09 13:28:13 crc kubenswrapper[4764]: I0309 13:28:13.756556 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nrc8s" event={"ID":"be22cbfb-d3e7-43c1-be38-f6fcadeb2c97","Type":"ContainerDied","Data":"32332cee515b03550931490beaabd836e1f122b91e9186c7afe19395bde21caa"} Mar 09 13:28:13 crc kubenswrapper[4764]: I0309 13:28:13.756629 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-nrc8s" Mar 09 13:28:13 crc kubenswrapper[4764]: I0309 13:28:13.760751 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-tll5t" Mar 09 13:28:13 crc kubenswrapper[4764]: I0309 13:28:13.764461 4764 generic.go:334] "Generic (PLEG): container finished" podID="88ba6041-7f8f-48f0-840c-8ea2a9bdc87b" containerID="5dafc18a60471b2aa074fa19f2a0f2019626c0dc18f4738cc76636c111e1e0e1" exitCode=0 Mar 09 13:28:13 crc kubenswrapper[4764]: I0309 13:28:13.764530 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8d627" event={"ID":"88ba6041-7f8f-48f0-840c-8ea2a9bdc87b","Type":"ContainerDied","Data":"5dafc18a60471b2aa074fa19f2a0f2019626c0dc18f4738cc76636c111e1e0e1"} Mar 09 13:28:13 crc kubenswrapper[4764]: I0309 13:28:13.764557 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-8d627" Mar 09 13:28:13 crc kubenswrapper[4764]: I0309 13:28:13.764565 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8d627" event={"ID":"88ba6041-7f8f-48f0-840c-8ea2a9bdc87b","Type":"ContainerDied","Data":"afb89024f1733e90994f05a617baca8bd2578c08f57794c20e8e0031fa2f63e4"} Mar 09 13:28:13 crc kubenswrapper[4764]: I0309 13:28:13.766258 4764 scope.go:117] "RemoveContainer" containerID="2b23f753224c479f8f3f33754ceac00343e6409f6988048cb245d41d456b1666" Mar 09 13:28:13 crc kubenswrapper[4764]: E0309 13:28:13.766603 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2b23f753224c479f8f3f33754ceac00343e6409f6988048cb245d41d456b1666\": container with ID starting with 2b23f753224c479f8f3f33754ceac00343e6409f6988048cb245d41d456b1666 not found: ID does not exist" containerID="2b23f753224c479f8f3f33754ceac00343e6409f6988048cb245d41d456b1666" Mar 09 13:28:13 crc kubenswrapper[4764]: I0309 13:28:13.766637 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2b23f753224c479f8f3f33754ceac00343e6409f6988048cb245d41d456b1666"} err="failed to get container status \"2b23f753224c479f8f3f33754ceac00343e6409f6988048cb245d41d456b1666\": rpc error: code = NotFound desc = could not find container \"2b23f753224c479f8f3f33754ceac00343e6409f6988048cb245d41d456b1666\": container with ID starting with 2b23f753224c479f8f3f33754ceac00343e6409f6988048cb245d41d456b1666 not found: ID does not exist" Mar 09 13:28:13 crc kubenswrapper[4764]: I0309 13:28:13.766681 4764 scope.go:117] "RemoveContainer" containerID="0dca8e186618d6e08ce56c027491c8ad89ea2784547d9fbdcf530de2e5a43711" Mar 09 13:28:13 crc kubenswrapper[4764]: I0309 13:28:13.772221 4764 generic.go:334] "Generic (PLEG): container finished" podID="41a3ce40-a2f8-4bc3-8fbb-eccfb1ec4ce6" 
containerID="4325cf9007df54fa3b2a5bed7103ccce2df4b9e7e47622fe04491c8fac8d1503" exitCode=0 Mar 09 13:28:13 crc kubenswrapper[4764]: I0309 13:28:13.772324 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tll5t" event={"ID":"41a3ce40-a2f8-4bc3-8fbb-eccfb1ec4ce6","Type":"ContainerDied","Data":"4325cf9007df54fa3b2a5bed7103ccce2df4b9e7e47622fe04491c8fac8d1503"} Mar 09 13:28:13 crc kubenswrapper[4764]: I0309 13:28:13.772742 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-tll5t" Mar 09 13:28:13 crc kubenswrapper[4764]: I0309 13:28:13.779068 4764 generic.go:334] "Generic (PLEG): container finished" podID="691ffa6f-3ee6-47fa-bcef-9fdd74ac86df" containerID="d02322e51b07c573018bc35bb85c4b069d5e04cb23ef61de23f558be13356214" exitCode=0 Mar 09 13:28:13 crc kubenswrapper[4764]: I0309 13:28:13.779121 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qhs57" event={"ID":"691ffa6f-3ee6-47fa-bcef-9fdd74ac86df","Type":"ContainerDied","Data":"d02322e51b07c573018bc35bb85c4b069d5e04cb23ef61de23f558be13356214"} Mar 09 13:28:13 crc kubenswrapper[4764]: I0309 13:28:13.779154 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qhs57" event={"ID":"691ffa6f-3ee6-47fa-bcef-9fdd74ac86df","Type":"ContainerDied","Data":"eabffbe2f3a51c427a01ad46e2c40728c19297f3e8e305f2763268cbfbeb6ba0"} Mar 09 13:28:13 crc kubenswrapper[4764]: I0309 13:28:13.779304 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-qhs57" Mar 09 13:28:13 crc kubenswrapper[4764]: I0309 13:28:13.796927 4764 scope.go:117] "RemoveContainer" containerID="bf1dacbdf950a01b7bd5a3fe093c4331659602b77ce43e68786707bbc2b89ea8" Mar 09 13:28:13 crc kubenswrapper[4764]: I0309 13:28:13.824036 4764 scope.go:117] "RemoveContainer" containerID="fbe32af2436a64ccf4bd2766ca3822d5d50061459cac93fe26813bb870fd850a" Mar 09 13:28:13 crc kubenswrapper[4764]: I0309 13:28:13.837442 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5fsvt\" (UniqueName: \"kubernetes.io/projected/88ba6041-7f8f-48f0-840c-8ea2a9bdc87b-kube-api-access-5fsvt\") pod \"88ba6041-7f8f-48f0-840c-8ea2a9bdc87b\" (UID: \"88ba6041-7f8f-48f0-840c-8ea2a9bdc87b\") " Mar 09 13:28:13 crc kubenswrapper[4764]: I0309 13:28:13.837564 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/88ba6041-7f8f-48f0-840c-8ea2a9bdc87b-utilities\") pod \"88ba6041-7f8f-48f0-840c-8ea2a9bdc87b\" (UID: \"88ba6041-7f8f-48f0-840c-8ea2a9bdc87b\") " Mar 09 13:28:13 crc kubenswrapper[4764]: I0309 13:28:13.837633 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/1ccc5b44-95ad-4f4c-8086-c176c41bbd19-marketplace-operator-metrics\") pod \"1ccc5b44-95ad-4f4c-8086-c176c41bbd19\" (UID: \"1ccc5b44-95ad-4f4c-8086-c176c41bbd19\") " Mar 09 13:28:13 crc kubenswrapper[4764]: I0309 13:28:13.837691 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/be22cbfb-d3e7-43c1-be38-f6fcadeb2c97-utilities\") pod \"be22cbfb-d3e7-43c1-be38-f6fcadeb2c97\" (UID: \"be22cbfb-d3e7-43c1-be38-f6fcadeb2c97\") " Mar 09 13:28:13 crc kubenswrapper[4764]: I0309 13:28:13.837753 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1ccc5b44-95ad-4f4c-8086-c176c41bbd19-marketplace-trusted-ca\") pod \"1ccc5b44-95ad-4f4c-8086-c176c41bbd19\" (UID: \"1ccc5b44-95ad-4f4c-8086-c176c41bbd19\") " Mar 09 13:28:13 crc kubenswrapper[4764]: I0309 13:28:13.837782 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/be22cbfb-d3e7-43c1-be38-f6fcadeb2c97-catalog-content\") pod \"be22cbfb-d3e7-43c1-be38-f6fcadeb2c97\" (UID: \"be22cbfb-d3e7-43c1-be38-f6fcadeb2c97\") " Mar 09 13:28:13 crc kubenswrapper[4764]: I0309 13:28:13.837805 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ff292\" (UniqueName: \"kubernetes.io/projected/be22cbfb-d3e7-43c1-be38-f6fcadeb2c97-kube-api-access-ff292\") pod \"be22cbfb-d3e7-43c1-be38-f6fcadeb2c97\" (UID: \"be22cbfb-d3e7-43c1-be38-f6fcadeb2c97\") " Mar 09 13:28:13 crc kubenswrapper[4764]: I0309 13:28:13.837828 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j4z7x\" (UniqueName: \"kubernetes.io/projected/691ffa6f-3ee6-47fa-bcef-9fdd74ac86df-kube-api-access-j4z7x\") pod \"691ffa6f-3ee6-47fa-bcef-9fdd74ac86df\" (UID: \"691ffa6f-3ee6-47fa-bcef-9fdd74ac86df\") " Mar 09 13:28:13 crc kubenswrapper[4764]: I0309 13:28:13.837866 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/691ffa6f-3ee6-47fa-bcef-9fdd74ac86df-utilities\") pod \"691ffa6f-3ee6-47fa-bcef-9fdd74ac86df\" (UID: \"691ffa6f-3ee6-47fa-bcef-9fdd74ac86df\") " Mar 09 13:28:13 crc kubenswrapper[4764]: I0309 13:28:13.837893 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/88ba6041-7f8f-48f0-840c-8ea2a9bdc87b-catalog-content\") pod \"88ba6041-7f8f-48f0-840c-8ea2a9bdc87b\" (UID: 
\"88ba6041-7f8f-48f0-840c-8ea2a9bdc87b\") " Mar 09 13:28:13 crc kubenswrapper[4764]: I0309 13:28:13.837948 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s85j9\" (UniqueName: \"kubernetes.io/projected/1ccc5b44-95ad-4f4c-8086-c176c41bbd19-kube-api-access-s85j9\") pod \"1ccc5b44-95ad-4f4c-8086-c176c41bbd19\" (UID: \"1ccc5b44-95ad-4f4c-8086-c176c41bbd19\") " Mar 09 13:28:13 crc kubenswrapper[4764]: I0309 13:28:13.837969 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/691ffa6f-3ee6-47fa-bcef-9fdd74ac86df-catalog-content\") pod \"691ffa6f-3ee6-47fa-bcef-9fdd74ac86df\" (UID: \"691ffa6f-3ee6-47fa-bcef-9fdd74ac86df\") " Mar 09 13:28:13 crc kubenswrapper[4764]: I0309 13:28:13.838615 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/88ba6041-7f8f-48f0-840c-8ea2a9bdc87b-utilities" (OuterVolumeSpecName: "utilities") pod "88ba6041-7f8f-48f0-840c-8ea2a9bdc87b" (UID: "88ba6041-7f8f-48f0-840c-8ea2a9bdc87b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 13:28:13 crc kubenswrapper[4764]: I0309 13:28:13.839087 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1ccc5b44-95ad-4f4c-8086-c176c41bbd19-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "1ccc5b44-95ad-4f4c-8086-c176c41bbd19" (UID: "1ccc5b44-95ad-4f4c-8086-c176c41bbd19"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:28:13 crc kubenswrapper[4764]: I0309 13:28:13.839173 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/be22cbfb-d3e7-43c1-be38-f6fcadeb2c97-utilities" (OuterVolumeSpecName: "utilities") pod "be22cbfb-d3e7-43c1-be38-f6fcadeb2c97" (UID: "be22cbfb-d3e7-43c1-be38-f6fcadeb2c97"). 
InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 13:28:13 crc kubenswrapper[4764]: I0309 13:28:13.840290 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/691ffa6f-3ee6-47fa-bcef-9fdd74ac86df-utilities" (OuterVolumeSpecName: "utilities") pod "691ffa6f-3ee6-47fa-bcef-9fdd74ac86df" (UID: "691ffa6f-3ee6-47fa-bcef-9fdd74ac86df"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 13:28:13 crc kubenswrapper[4764]: I0309 13:28:13.842339 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1ccc5b44-95ad-4f4c-8086-c176c41bbd19-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "1ccc5b44-95ad-4f4c-8086-c176c41bbd19" (UID: "1ccc5b44-95ad-4f4c-8086-c176c41bbd19"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:28:13 crc kubenswrapper[4764]: I0309 13:28:13.842695 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/be22cbfb-d3e7-43c1-be38-f6fcadeb2c97-kube-api-access-ff292" (OuterVolumeSpecName: "kube-api-access-ff292") pod "be22cbfb-d3e7-43c1-be38-f6fcadeb2c97" (UID: "be22cbfb-d3e7-43c1-be38-f6fcadeb2c97"). InnerVolumeSpecName "kube-api-access-ff292". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:28:13 crc kubenswrapper[4764]: I0309 13:28:13.843484 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/691ffa6f-3ee6-47fa-bcef-9fdd74ac86df-kube-api-access-j4z7x" (OuterVolumeSpecName: "kube-api-access-j4z7x") pod "691ffa6f-3ee6-47fa-bcef-9fdd74ac86df" (UID: "691ffa6f-3ee6-47fa-bcef-9fdd74ac86df"). InnerVolumeSpecName "kube-api-access-j4z7x". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:28:13 crc kubenswrapper[4764]: I0309 13:28:13.843537 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1ccc5b44-95ad-4f4c-8086-c176c41bbd19-kube-api-access-s85j9" (OuterVolumeSpecName: "kube-api-access-s85j9") pod "1ccc5b44-95ad-4f4c-8086-c176c41bbd19" (UID: "1ccc5b44-95ad-4f4c-8086-c176c41bbd19"). InnerVolumeSpecName "kube-api-access-s85j9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:28:13 crc kubenswrapper[4764]: I0309 13:28:13.848821 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/88ba6041-7f8f-48f0-840c-8ea2a9bdc87b-kube-api-access-5fsvt" (OuterVolumeSpecName: "kube-api-access-5fsvt") pod "88ba6041-7f8f-48f0-840c-8ea2a9bdc87b" (UID: "88ba6041-7f8f-48f0-840c-8ea2a9bdc87b"). InnerVolumeSpecName "kube-api-access-5fsvt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:28:13 crc kubenswrapper[4764]: I0309 13:28:13.857179 4764 scope.go:117] "RemoveContainer" containerID="0dca8e186618d6e08ce56c027491c8ad89ea2784547d9fbdcf530de2e5a43711" Mar 09 13:28:13 crc kubenswrapper[4764]: E0309 13:28:13.857675 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0dca8e186618d6e08ce56c027491c8ad89ea2784547d9fbdcf530de2e5a43711\": container with ID starting with 0dca8e186618d6e08ce56c027491c8ad89ea2784547d9fbdcf530de2e5a43711 not found: ID does not exist" containerID="0dca8e186618d6e08ce56c027491c8ad89ea2784547d9fbdcf530de2e5a43711" Mar 09 13:28:13 crc kubenswrapper[4764]: I0309 13:28:13.857719 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0dca8e186618d6e08ce56c027491c8ad89ea2784547d9fbdcf530de2e5a43711"} err="failed to get container status \"0dca8e186618d6e08ce56c027491c8ad89ea2784547d9fbdcf530de2e5a43711\": rpc error: code = NotFound desc = 
could not find container \"0dca8e186618d6e08ce56c027491c8ad89ea2784547d9fbdcf530de2e5a43711\": container with ID starting with 0dca8e186618d6e08ce56c027491c8ad89ea2784547d9fbdcf530de2e5a43711 not found: ID does not exist" Mar 09 13:28:13 crc kubenswrapper[4764]: I0309 13:28:13.857746 4764 scope.go:117] "RemoveContainer" containerID="bf1dacbdf950a01b7bd5a3fe093c4331659602b77ce43e68786707bbc2b89ea8" Mar 09 13:28:13 crc kubenswrapper[4764]: E0309 13:28:13.858122 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bf1dacbdf950a01b7bd5a3fe093c4331659602b77ce43e68786707bbc2b89ea8\": container with ID starting with bf1dacbdf950a01b7bd5a3fe093c4331659602b77ce43e68786707bbc2b89ea8 not found: ID does not exist" containerID="bf1dacbdf950a01b7bd5a3fe093c4331659602b77ce43e68786707bbc2b89ea8" Mar 09 13:28:13 crc kubenswrapper[4764]: I0309 13:28:13.858168 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bf1dacbdf950a01b7bd5a3fe093c4331659602b77ce43e68786707bbc2b89ea8"} err="failed to get container status \"bf1dacbdf950a01b7bd5a3fe093c4331659602b77ce43e68786707bbc2b89ea8\": rpc error: code = NotFound desc = could not find container \"bf1dacbdf950a01b7bd5a3fe093c4331659602b77ce43e68786707bbc2b89ea8\": container with ID starting with bf1dacbdf950a01b7bd5a3fe093c4331659602b77ce43e68786707bbc2b89ea8 not found: ID does not exist" Mar 09 13:28:13 crc kubenswrapper[4764]: I0309 13:28:13.858194 4764 scope.go:117] "RemoveContainer" containerID="fbe32af2436a64ccf4bd2766ca3822d5d50061459cac93fe26813bb870fd850a" Mar 09 13:28:13 crc kubenswrapper[4764]: E0309 13:28:13.858484 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fbe32af2436a64ccf4bd2766ca3822d5d50061459cac93fe26813bb870fd850a\": container with ID starting with fbe32af2436a64ccf4bd2766ca3822d5d50061459cac93fe26813bb870fd850a not 
found: ID does not exist" containerID="fbe32af2436a64ccf4bd2766ca3822d5d50061459cac93fe26813bb870fd850a" Mar 09 13:28:13 crc kubenswrapper[4764]: I0309 13:28:13.858515 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fbe32af2436a64ccf4bd2766ca3822d5d50061459cac93fe26813bb870fd850a"} err="failed to get container status \"fbe32af2436a64ccf4bd2766ca3822d5d50061459cac93fe26813bb870fd850a\": rpc error: code = NotFound desc = could not find container \"fbe32af2436a64ccf4bd2766ca3822d5d50061459cac93fe26813bb870fd850a\": container with ID starting with fbe32af2436a64ccf4bd2766ca3822d5d50061459cac93fe26813bb870fd850a not found: ID does not exist" Mar 09 13:28:13 crc kubenswrapper[4764]: I0309 13:28:13.858536 4764 scope.go:117] "RemoveContainer" containerID="5dafc18a60471b2aa074fa19f2a0f2019626c0dc18f4738cc76636c111e1e0e1" Mar 09 13:28:13 crc kubenswrapper[4764]: I0309 13:28:13.873944 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/691ffa6f-3ee6-47fa-bcef-9fdd74ac86df-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "691ffa6f-3ee6-47fa-bcef-9fdd74ac86df" (UID: "691ffa6f-3ee6-47fa-bcef-9fdd74ac86df"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 13:28:13 crc kubenswrapper[4764]: I0309 13:28:13.882171 4764 scope.go:117] "RemoveContainer" containerID="8cec696202563cb93ec1099b2921beba11271e2ac2498229b87b108935e01fc2" Mar 09 13:28:13 crc kubenswrapper[4764]: I0309 13:28:13.907479 4764 scope.go:117] "RemoveContainer" containerID="1ac1423999a9eaea944b592473f076b1d7f0bf8296be44b2cb8bd7cc41e23c1f" Mar 09 13:28:13 crc kubenswrapper[4764]: I0309 13:28:13.909327 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/88ba6041-7f8f-48f0-840c-8ea2a9bdc87b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "88ba6041-7f8f-48f0-840c-8ea2a9bdc87b" (UID: "88ba6041-7f8f-48f0-840c-8ea2a9bdc87b"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 13:28:13 crc kubenswrapper[4764]: I0309 13:28:13.923078 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/be22cbfb-d3e7-43c1-be38-f6fcadeb2c97-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "be22cbfb-d3e7-43c1-be38-f6fcadeb2c97" (UID: "be22cbfb-d3e7-43c1-be38-f6fcadeb2c97"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 13:28:13 crc kubenswrapper[4764]: I0309 13:28:13.939388 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v76nj\" (UniqueName: \"kubernetes.io/projected/41a3ce40-a2f8-4bc3-8fbb-eccfb1ec4ce6-kube-api-access-v76nj\") pod \"41a3ce40-a2f8-4bc3-8fbb-eccfb1ec4ce6\" (UID: \"41a3ce40-a2f8-4bc3-8fbb-eccfb1ec4ce6\") " Mar 09 13:28:13 crc kubenswrapper[4764]: I0309 13:28:13.939565 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/41a3ce40-a2f8-4bc3-8fbb-eccfb1ec4ce6-utilities\") pod \"41a3ce40-a2f8-4bc3-8fbb-eccfb1ec4ce6\" (UID: \"41a3ce40-a2f8-4bc3-8fbb-eccfb1ec4ce6\") " Mar 09 13:28:13 crc kubenswrapper[4764]: I0309 13:28:13.939702 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/41a3ce40-a2f8-4bc3-8fbb-eccfb1ec4ce6-catalog-content\") pod \"41a3ce40-a2f8-4bc3-8fbb-eccfb1ec4ce6\" (UID: \"41a3ce40-a2f8-4bc3-8fbb-eccfb1ec4ce6\") " Mar 09 13:28:13 crc kubenswrapper[4764]: I0309 13:28:13.939751 4764 scope.go:117] "RemoveContainer" containerID="5dafc18a60471b2aa074fa19f2a0f2019626c0dc18f4738cc76636c111e1e0e1" Mar 09 13:28:13 crc kubenswrapper[4764]: I0309 13:28:13.940097 4764 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/1ccc5b44-95ad-4f4c-8086-c176c41bbd19-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Mar 09 13:28:13 crc kubenswrapper[4764]: I0309 13:28:13.940122 4764 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/be22cbfb-d3e7-43c1-be38-f6fcadeb2c97-utilities\") on node \"crc\" DevicePath \"\"" Mar 09 13:28:13 crc kubenswrapper[4764]: I0309 13:28:13.940153 4764 reconciler_common.go:293] "Volume detached for volume 
\"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1ccc5b44-95ad-4f4c-8086-c176c41bbd19-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 09 13:28:13 crc kubenswrapper[4764]: I0309 13:28:13.940165 4764 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/be22cbfb-d3e7-43c1-be38-f6fcadeb2c97-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 09 13:28:13 crc kubenswrapper[4764]: I0309 13:28:13.940177 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ff292\" (UniqueName: \"kubernetes.io/projected/be22cbfb-d3e7-43c1-be38-f6fcadeb2c97-kube-api-access-ff292\") on node \"crc\" DevicePath \"\"" Mar 09 13:28:13 crc kubenswrapper[4764]: I0309 13:28:13.940188 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j4z7x\" (UniqueName: \"kubernetes.io/projected/691ffa6f-3ee6-47fa-bcef-9fdd74ac86df-kube-api-access-j4z7x\") on node \"crc\" DevicePath \"\"" Mar 09 13:28:13 crc kubenswrapper[4764]: I0309 13:28:13.940197 4764 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/691ffa6f-3ee6-47fa-bcef-9fdd74ac86df-utilities\") on node \"crc\" DevicePath \"\"" Mar 09 13:28:13 crc kubenswrapper[4764]: I0309 13:28:13.940207 4764 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/88ba6041-7f8f-48f0-840c-8ea2a9bdc87b-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 09 13:28:13 crc kubenswrapper[4764]: I0309 13:28:13.940313 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s85j9\" (UniqueName: \"kubernetes.io/projected/1ccc5b44-95ad-4f4c-8086-c176c41bbd19-kube-api-access-s85j9\") on node \"crc\" DevicePath \"\"" Mar 09 13:28:13 crc kubenswrapper[4764]: I0309 13:28:13.940326 4764 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/691ffa6f-3ee6-47fa-bcef-9fdd74ac86df-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 09 13:28:13 crc kubenswrapper[4764]: I0309 13:28:13.940336 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5fsvt\" (UniqueName: \"kubernetes.io/projected/88ba6041-7f8f-48f0-840c-8ea2a9bdc87b-kube-api-access-5fsvt\") on node \"crc\" DevicePath \"\"" Mar 09 13:28:13 crc kubenswrapper[4764]: I0309 13:28:13.940345 4764 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/88ba6041-7f8f-48f0-840c-8ea2a9bdc87b-utilities\") on node \"crc\" DevicePath \"\"" Mar 09 13:28:13 crc kubenswrapper[4764]: E0309 13:28:13.940323 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5dafc18a60471b2aa074fa19f2a0f2019626c0dc18f4738cc76636c111e1e0e1\": container with ID starting with 5dafc18a60471b2aa074fa19f2a0f2019626c0dc18f4738cc76636c111e1e0e1 not found: ID does not exist" containerID="5dafc18a60471b2aa074fa19f2a0f2019626c0dc18f4738cc76636c111e1e0e1" Mar 09 13:28:13 crc kubenswrapper[4764]: I0309 13:28:13.940422 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5dafc18a60471b2aa074fa19f2a0f2019626c0dc18f4738cc76636c111e1e0e1"} err="failed to get container status \"5dafc18a60471b2aa074fa19f2a0f2019626c0dc18f4738cc76636c111e1e0e1\": rpc error: code = NotFound desc = could not find container \"5dafc18a60471b2aa074fa19f2a0f2019626c0dc18f4738cc76636c111e1e0e1\": container with ID starting with 5dafc18a60471b2aa074fa19f2a0f2019626c0dc18f4738cc76636c111e1e0e1 not found: ID does not exist" Mar 09 13:28:13 crc kubenswrapper[4764]: I0309 13:28:13.940448 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/41a3ce40-a2f8-4bc3-8fbb-eccfb1ec4ce6-utilities" (OuterVolumeSpecName: "utilities") pod 
"41a3ce40-a2f8-4bc3-8fbb-eccfb1ec4ce6" (UID: "41a3ce40-a2f8-4bc3-8fbb-eccfb1ec4ce6"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 13:28:13 crc kubenswrapper[4764]: I0309 13:28:13.940472 4764 scope.go:117] "RemoveContainer" containerID="8cec696202563cb93ec1099b2921beba11271e2ac2498229b87b108935e01fc2" Mar 09 13:28:13 crc kubenswrapper[4764]: E0309 13:28:13.941275 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8cec696202563cb93ec1099b2921beba11271e2ac2498229b87b108935e01fc2\": container with ID starting with 8cec696202563cb93ec1099b2921beba11271e2ac2498229b87b108935e01fc2 not found: ID does not exist" containerID="8cec696202563cb93ec1099b2921beba11271e2ac2498229b87b108935e01fc2" Mar 09 13:28:13 crc kubenswrapper[4764]: I0309 13:28:13.941331 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8cec696202563cb93ec1099b2921beba11271e2ac2498229b87b108935e01fc2"} err="failed to get container status \"8cec696202563cb93ec1099b2921beba11271e2ac2498229b87b108935e01fc2\": rpc error: code = NotFound desc = could not find container \"8cec696202563cb93ec1099b2921beba11271e2ac2498229b87b108935e01fc2\": container with ID starting with 8cec696202563cb93ec1099b2921beba11271e2ac2498229b87b108935e01fc2 not found: ID does not exist" Mar 09 13:28:13 crc kubenswrapper[4764]: I0309 13:28:13.941384 4764 scope.go:117] "RemoveContainer" containerID="1ac1423999a9eaea944b592473f076b1d7f0bf8296be44b2cb8bd7cc41e23c1f" Mar 09 13:28:13 crc kubenswrapper[4764]: E0309 13:28:13.941826 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1ac1423999a9eaea944b592473f076b1d7f0bf8296be44b2cb8bd7cc41e23c1f\": container with ID starting with 1ac1423999a9eaea944b592473f076b1d7f0bf8296be44b2cb8bd7cc41e23c1f not found: ID does not exist" 
containerID="1ac1423999a9eaea944b592473f076b1d7f0bf8296be44b2cb8bd7cc41e23c1f" Mar 09 13:28:13 crc kubenswrapper[4764]: I0309 13:28:13.941848 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1ac1423999a9eaea944b592473f076b1d7f0bf8296be44b2cb8bd7cc41e23c1f"} err="failed to get container status \"1ac1423999a9eaea944b592473f076b1d7f0bf8296be44b2cb8bd7cc41e23c1f\": rpc error: code = NotFound desc = could not find container \"1ac1423999a9eaea944b592473f076b1d7f0bf8296be44b2cb8bd7cc41e23c1f\": container with ID starting with 1ac1423999a9eaea944b592473f076b1d7f0bf8296be44b2cb8bd7cc41e23c1f not found: ID does not exist" Mar 09 13:28:13 crc kubenswrapper[4764]: I0309 13:28:13.941861 4764 scope.go:117] "RemoveContainer" containerID="4325cf9007df54fa3b2a5bed7103ccce2df4b9e7e47622fe04491c8fac8d1503" Mar 09 13:28:13 crc kubenswrapper[4764]: I0309 13:28:13.942676 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/41a3ce40-a2f8-4bc3-8fbb-eccfb1ec4ce6-kube-api-access-v76nj" (OuterVolumeSpecName: "kube-api-access-v76nj") pod "41a3ce40-a2f8-4bc3-8fbb-eccfb1ec4ce6" (UID: "41a3ce40-a2f8-4bc3-8fbb-eccfb1ec4ce6"). InnerVolumeSpecName "kube-api-access-v76nj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:28:13 crc kubenswrapper[4764]: I0309 13:28:13.970353 4764 scope.go:117] "RemoveContainer" containerID="0ae328f899ade57662b7a57d61c5864e374b6610f4676d30b76a8c7048f7853c" Mar 09 13:28:13 crc kubenswrapper[4764]: I0309 13:28:13.992717 4764 scope.go:117] "RemoveContainer" containerID="ad5ce40518308f615c3da026f6c3fbf2d4af3ef9e2a050d739fa7a496bac2d88" Mar 09 13:28:14 crc kubenswrapper[4764]: I0309 13:28:14.011967 4764 scope.go:117] "RemoveContainer" containerID="d02322e51b07c573018bc35bb85c4b069d5e04cb23ef61de23f558be13356214" Mar 09 13:28:14 crc kubenswrapper[4764]: I0309 13:28:14.025061 4764 scope.go:117] "RemoveContainer" containerID="aca0e54cf991318058295c5dea65c4ff3d25ff689a99973580ba7e63019a2803" Mar 09 13:28:14 crc kubenswrapper[4764]: I0309 13:28:14.038746 4764 scope.go:117] "RemoveContainer" containerID="05c2ccc0c11f9b7d9a5361cced5932ce9d54a4f6f29e12db9d17e75834fe8ead" Mar 09 13:28:14 crc kubenswrapper[4764]: I0309 13:28:14.042167 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v76nj\" (UniqueName: \"kubernetes.io/projected/41a3ce40-a2f8-4bc3-8fbb-eccfb1ec4ce6-kube-api-access-v76nj\") on node \"crc\" DevicePath \"\"" Mar 09 13:28:14 crc kubenswrapper[4764]: I0309 13:28:14.042191 4764 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/41a3ce40-a2f8-4bc3-8fbb-eccfb1ec4ce6-utilities\") on node \"crc\" DevicePath \"\"" Mar 09 13:28:14 crc kubenswrapper[4764]: I0309 13:28:14.053284 4764 scope.go:117] "RemoveContainer" containerID="d02322e51b07c573018bc35bb85c4b069d5e04cb23ef61de23f558be13356214" Mar 09 13:28:14 crc kubenswrapper[4764]: E0309 13:28:14.053698 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d02322e51b07c573018bc35bb85c4b069d5e04cb23ef61de23f558be13356214\": container with ID starting with 
d02322e51b07c573018bc35bb85c4b069d5e04cb23ef61de23f558be13356214 not found: ID does not exist" containerID="d02322e51b07c573018bc35bb85c4b069d5e04cb23ef61de23f558be13356214" Mar 09 13:28:14 crc kubenswrapper[4764]: I0309 13:28:14.053736 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d02322e51b07c573018bc35bb85c4b069d5e04cb23ef61de23f558be13356214"} err="failed to get container status \"d02322e51b07c573018bc35bb85c4b069d5e04cb23ef61de23f558be13356214\": rpc error: code = NotFound desc = could not find container \"d02322e51b07c573018bc35bb85c4b069d5e04cb23ef61de23f558be13356214\": container with ID starting with d02322e51b07c573018bc35bb85c4b069d5e04cb23ef61de23f558be13356214 not found: ID does not exist" Mar 09 13:28:14 crc kubenswrapper[4764]: I0309 13:28:14.053760 4764 scope.go:117] "RemoveContainer" containerID="aca0e54cf991318058295c5dea65c4ff3d25ff689a99973580ba7e63019a2803" Mar 09 13:28:14 crc kubenswrapper[4764]: E0309 13:28:14.054092 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aca0e54cf991318058295c5dea65c4ff3d25ff689a99973580ba7e63019a2803\": container with ID starting with aca0e54cf991318058295c5dea65c4ff3d25ff689a99973580ba7e63019a2803 not found: ID does not exist" containerID="aca0e54cf991318058295c5dea65c4ff3d25ff689a99973580ba7e63019a2803" Mar 09 13:28:14 crc kubenswrapper[4764]: I0309 13:28:14.054128 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aca0e54cf991318058295c5dea65c4ff3d25ff689a99973580ba7e63019a2803"} err="failed to get container status \"aca0e54cf991318058295c5dea65c4ff3d25ff689a99973580ba7e63019a2803\": rpc error: code = NotFound desc = could not find container \"aca0e54cf991318058295c5dea65c4ff3d25ff689a99973580ba7e63019a2803\": container with ID starting with aca0e54cf991318058295c5dea65c4ff3d25ff689a99973580ba7e63019a2803 not found: ID does not 
exist" Mar 09 13:28:14 crc kubenswrapper[4764]: I0309 13:28:14.054157 4764 scope.go:117] "RemoveContainer" containerID="05c2ccc0c11f9b7d9a5361cced5932ce9d54a4f6f29e12db9d17e75834fe8ead" Mar 09 13:28:14 crc kubenswrapper[4764]: E0309 13:28:14.054432 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"05c2ccc0c11f9b7d9a5361cced5932ce9d54a4f6f29e12db9d17e75834fe8ead\": container with ID starting with 05c2ccc0c11f9b7d9a5361cced5932ce9d54a4f6f29e12db9d17e75834fe8ead not found: ID does not exist" containerID="05c2ccc0c11f9b7d9a5361cced5932ce9d54a4f6f29e12db9d17e75834fe8ead" Mar 09 13:28:14 crc kubenswrapper[4764]: I0309 13:28:14.054461 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"05c2ccc0c11f9b7d9a5361cced5932ce9d54a4f6f29e12db9d17e75834fe8ead"} err="failed to get container status \"05c2ccc0c11f9b7d9a5361cced5932ce9d54a4f6f29e12db9d17e75834fe8ead\": rpc error: code = NotFound desc = could not find container \"05c2ccc0c11f9b7d9a5361cced5932ce9d54a4f6f29e12db9d17e75834fe8ead\": container with ID starting with 05c2ccc0c11f9b7d9a5361cced5932ce9d54a4f6f29e12db9d17e75834fe8ead not found: ID does not exist" Mar 09 13:28:14 crc kubenswrapper[4764]: I0309 13:28:14.068181 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-m2gc7"] Mar 09 13:28:14 crc kubenswrapper[4764]: I0309 13:28:14.085056 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-d4gwh"] Mar 09 13:28:14 crc kubenswrapper[4764]: I0309 13:28:14.089850 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-d4gwh"] Mar 09 13:28:14 crc kubenswrapper[4764]: I0309 13:28:14.094279 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-nrc8s"] Mar 09 13:28:14 crc 
kubenswrapper[4764]: I0309 13:28:14.098380 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-nrc8s"] Mar 09 13:28:14 crc kubenswrapper[4764]: I0309 13:28:14.117534 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-8d627"] Mar 09 13:28:14 crc kubenswrapper[4764]: I0309 13:28:14.122728 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-8d627"] Mar 09 13:28:14 crc kubenswrapper[4764]: I0309 13:28:14.136952 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/41a3ce40-a2f8-4bc3-8fbb-eccfb1ec4ce6-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "41a3ce40-a2f8-4bc3-8fbb-eccfb1ec4ce6" (UID: "41a3ce40-a2f8-4bc3-8fbb-eccfb1ec4ce6"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 13:28:14 crc kubenswrapper[4764]: I0309 13:28:14.142368 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-qhs57"] Mar 09 13:28:14 crc kubenswrapper[4764]: I0309 13:28:14.143351 4764 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/41a3ce40-a2f8-4bc3-8fbb-eccfb1ec4ce6-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 09 13:28:14 crc kubenswrapper[4764]: I0309 13:28:14.155900 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-qhs57"] Mar 09 13:28:14 crc kubenswrapper[4764]: I0309 13:28:14.400270 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-tll5t"] Mar 09 13:28:14 crc kubenswrapper[4764]: I0309 13:28:14.406798 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-tll5t"] Mar 09 13:28:14 crc kubenswrapper[4764]: I0309 13:28:14.788310 4764 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-m2gc7" event={"ID":"4351c9fc-c207-4d15-b8a6-f51c0651fe83","Type":"ContainerStarted","Data":"7f19bb86d875ffaf829b3a885986ef08c68ca2f33cd253f50495fb450b0f2897"} Mar 09 13:28:14 crc kubenswrapper[4764]: I0309 13:28:14.788783 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-m2gc7" event={"ID":"4351c9fc-c207-4d15-b8a6-f51c0651fe83","Type":"ContainerStarted","Data":"62146f286e07e7282d321a4b34ab0944d09b1aac7e95a0014dba4e4dc1525adc"} Mar 09 13:28:14 crc kubenswrapper[4764]: I0309 13:28:14.788805 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-m2gc7" Mar 09 13:28:14 crc kubenswrapper[4764]: I0309 13:28:14.791897 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-m2gc7" Mar 09 13:28:14 crc kubenswrapper[4764]: I0309 13:28:14.809671 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-m2gc7" podStartSLOduration=1.809633409 podStartE2EDuration="1.809633409s" podCreationTimestamp="2026-03-09 13:28:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:28:14.808093117 +0000 UTC m=+450.058265045" watchObservedRunningTime="2026-03-09 13:28:14.809633409 +0000 UTC m=+450.059805317" Mar 09 13:28:15 crc kubenswrapper[4764]: I0309 13:28:15.211489 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-bbwx5"] Mar 09 13:28:15 crc kubenswrapper[4764]: E0309 13:28:15.212125 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1ccc5b44-95ad-4f4c-8086-c176c41bbd19" containerName="marketplace-operator" Mar 09 13:28:15 crc kubenswrapper[4764]: I0309 
13:28:15.212155 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ccc5b44-95ad-4f4c-8086-c176c41bbd19" containerName="marketplace-operator" Mar 09 13:28:15 crc kubenswrapper[4764]: E0309 13:28:15.212188 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="41a3ce40-a2f8-4bc3-8fbb-eccfb1ec4ce6" containerName="extract-content" Mar 09 13:28:15 crc kubenswrapper[4764]: I0309 13:28:15.212197 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="41a3ce40-a2f8-4bc3-8fbb-eccfb1ec4ce6" containerName="extract-content" Mar 09 13:28:15 crc kubenswrapper[4764]: E0309 13:28:15.212206 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="88ba6041-7f8f-48f0-840c-8ea2a9bdc87b" containerName="extract-utilities" Mar 09 13:28:15 crc kubenswrapper[4764]: I0309 13:28:15.212215 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="88ba6041-7f8f-48f0-840c-8ea2a9bdc87b" containerName="extract-utilities" Mar 09 13:28:15 crc kubenswrapper[4764]: E0309 13:28:15.212230 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="691ffa6f-3ee6-47fa-bcef-9fdd74ac86df" containerName="registry-server" Mar 09 13:28:15 crc kubenswrapper[4764]: I0309 13:28:15.212237 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="691ffa6f-3ee6-47fa-bcef-9fdd74ac86df" containerName="registry-server" Mar 09 13:28:15 crc kubenswrapper[4764]: E0309 13:28:15.212253 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="691ffa6f-3ee6-47fa-bcef-9fdd74ac86df" containerName="extract-utilities" Mar 09 13:28:15 crc kubenswrapper[4764]: I0309 13:28:15.212260 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="691ffa6f-3ee6-47fa-bcef-9fdd74ac86df" containerName="extract-utilities" Mar 09 13:28:15 crc kubenswrapper[4764]: E0309 13:28:15.212268 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="41a3ce40-a2f8-4bc3-8fbb-eccfb1ec4ce6" containerName="registry-server" Mar 09 13:28:15 crc kubenswrapper[4764]: I0309 
13:28:15.212278 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="41a3ce40-a2f8-4bc3-8fbb-eccfb1ec4ce6" containerName="registry-server" Mar 09 13:28:15 crc kubenswrapper[4764]: E0309 13:28:15.212295 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="be22cbfb-d3e7-43c1-be38-f6fcadeb2c97" containerName="registry-server" Mar 09 13:28:15 crc kubenswrapper[4764]: I0309 13:28:15.212303 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="be22cbfb-d3e7-43c1-be38-f6fcadeb2c97" containerName="registry-server" Mar 09 13:28:15 crc kubenswrapper[4764]: E0309 13:28:15.212317 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="691ffa6f-3ee6-47fa-bcef-9fdd74ac86df" containerName="extract-content" Mar 09 13:28:15 crc kubenswrapper[4764]: I0309 13:28:15.212429 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="691ffa6f-3ee6-47fa-bcef-9fdd74ac86df" containerName="extract-content" Mar 09 13:28:15 crc kubenswrapper[4764]: E0309 13:28:15.212445 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="88ba6041-7f8f-48f0-840c-8ea2a9bdc87b" containerName="extract-content" Mar 09 13:28:15 crc kubenswrapper[4764]: I0309 13:28:15.212452 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="88ba6041-7f8f-48f0-840c-8ea2a9bdc87b" containerName="extract-content" Mar 09 13:28:15 crc kubenswrapper[4764]: E0309 13:28:15.212469 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="41a3ce40-a2f8-4bc3-8fbb-eccfb1ec4ce6" containerName="extract-utilities" Mar 09 13:28:15 crc kubenswrapper[4764]: I0309 13:28:15.212476 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="41a3ce40-a2f8-4bc3-8fbb-eccfb1ec4ce6" containerName="extract-utilities" Mar 09 13:28:15 crc kubenswrapper[4764]: E0309 13:28:15.212491 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="be22cbfb-d3e7-43c1-be38-f6fcadeb2c97" containerName="extract-utilities" Mar 09 13:28:15 crc kubenswrapper[4764]: I0309 
13:28:15.212498 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="be22cbfb-d3e7-43c1-be38-f6fcadeb2c97" containerName="extract-utilities" Mar 09 13:28:15 crc kubenswrapper[4764]: E0309 13:28:15.212511 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="88ba6041-7f8f-48f0-840c-8ea2a9bdc87b" containerName="registry-server" Mar 09 13:28:15 crc kubenswrapper[4764]: I0309 13:28:15.212517 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="88ba6041-7f8f-48f0-840c-8ea2a9bdc87b" containerName="registry-server" Mar 09 13:28:15 crc kubenswrapper[4764]: E0309 13:28:15.212529 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="be22cbfb-d3e7-43c1-be38-f6fcadeb2c97" containerName="extract-content" Mar 09 13:28:15 crc kubenswrapper[4764]: I0309 13:28:15.212535 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="be22cbfb-d3e7-43c1-be38-f6fcadeb2c97" containerName="extract-content" Mar 09 13:28:15 crc kubenswrapper[4764]: I0309 13:28:15.212793 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="88ba6041-7f8f-48f0-840c-8ea2a9bdc87b" containerName="registry-server" Mar 09 13:28:15 crc kubenswrapper[4764]: I0309 13:28:15.212806 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="1ccc5b44-95ad-4f4c-8086-c176c41bbd19" containerName="marketplace-operator" Mar 09 13:28:15 crc kubenswrapper[4764]: I0309 13:28:15.212819 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="41a3ce40-a2f8-4bc3-8fbb-eccfb1ec4ce6" containerName="registry-server" Mar 09 13:28:15 crc kubenswrapper[4764]: I0309 13:28:15.212858 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="be22cbfb-d3e7-43c1-be38-f6fcadeb2c97" containerName="registry-server" Mar 09 13:28:15 crc kubenswrapper[4764]: I0309 13:28:15.212873 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="691ffa6f-3ee6-47fa-bcef-9fdd74ac86df" containerName="registry-server" Mar 09 13:28:15 crc 
kubenswrapper[4764]: I0309 13:28:15.214516 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-bbwx5"] Mar 09 13:28:15 crc kubenswrapper[4764]: I0309 13:28:15.214658 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-bbwx5" Mar 09 13:28:15 crc kubenswrapper[4764]: I0309 13:28:15.220143 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Mar 09 13:28:15 crc kubenswrapper[4764]: I0309 13:28:15.360456 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tldxq\" (UniqueName: \"kubernetes.io/projected/a6d68e16-c0d2-4f98-9b3f-d1d392bf67fa-kube-api-access-tldxq\") pod \"certified-operators-bbwx5\" (UID: \"a6d68e16-c0d2-4f98-9b3f-d1d392bf67fa\") " pod="openshift-marketplace/certified-operators-bbwx5" Mar 09 13:28:15 crc kubenswrapper[4764]: I0309 13:28:15.360543 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a6d68e16-c0d2-4f98-9b3f-d1d392bf67fa-utilities\") pod \"certified-operators-bbwx5\" (UID: \"a6d68e16-c0d2-4f98-9b3f-d1d392bf67fa\") " pod="openshift-marketplace/certified-operators-bbwx5" Mar 09 13:28:15 crc kubenswrapper[4764]: I0309 13:28:15.360589 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a6d68e16-c0d2-4f98-9b3f-d1d392bf67fa-catalog-content\") pod \"certified-operators-bbwx5\" (UID: \"a6d68e16-c0d2-4f98-9b3f-d1d392bf67fa\") " pod="openshift-marketplace/certified-operators-bbwx5" Mar 09 13:28:15 crc kubenswrapper[4764]: I0309 13:28:15.461748 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tldxq\" (UniqueName: 
\"kubernetes.io/projected/a6d68e16-c0d2-4f98-9b3f-d1d392bf67fa-kube-api-access-tldxq\") pod \"certified-operators-bbwx5\" (UID: \"a6d68e16-c0d2-4f98-9b3f-d1d392bf67fa\") " pod="openshift-marketplace/certified-operators-bbwx5" Mar 09 13:28:15 crc kubenswrapper[4764]: I0309 13:28:15.463227 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a6d68e16-c0d2-4f98-9b3f-d1d392bf67fa-utilities\") pod \"certified-operators-bbwx5\" (UID: \"a6d68e16-c0d2-4f98-9b3f-d1d392bf67fa\") " pod="openshift-marketplace/certified-operators-bbwx5" Mar 09 13:28:15 crc kubenswrapper[4764]: I0309 13:28:15.463347 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a6d68e16-c0d2-4f98-9b3f-d1d392bf67fa-catalog-content\") pod \"certified-operators-bbwx5\" (UID: \"a6d68e16-c0d2-4f98-9b3f-d1d392bf67fa\") " pod="openshift-marketplace/certified-operators-bbwx5" Mar 09 13:28:15 crc kubenswrapper[4764]: I0309 13:28:15.463909 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a6d68e16-c0d2-4f98-9b3f-d1d392bf67fa-catalog-content\") pod \"certified-operators-bbwx5\" (UID: \"a6d68e16-c0d2-4f98-9b3f-d1d392bf67fa\") " pod="openshift-marketplace/certified-operators-bbwx5" Mar 09 13:28:15 crc kubenswrapper[4764]: I0309 13:28:15.464284 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a6d68e16-c0d2-4f98-9b3f-d1d392bf67fa-utilities\") pod \"certified-operators-bbwx5\" (UID: \"a6d68e16-c0d2-4f98-9b3f-d1d392bf67fa\") " pod="openshift-marketplace/certified-operators-bbwx5" Mar 09 13:28:15 crc kubenswrapper[4764]: I0309 13:28:15.489145 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tldxq\" (UniqueName: 
\"kubernetes.io/projected/a6d68e16-c0d2-4f98-9b3f-d1d392bf67fa-kube-api-access-tldxq\") pod \"certified-operators-bbwx5\" (UID: \"a6d68e16-c0d2-4f98-9b3f-d1d392bf67fa\") " pod="openshift-marketplace/certified-operators-bbwx5" Mar 09 13:28:15 crc kubenswrapper[4764]: I0309 13:28:15.537462 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-bbwx5" Mar 09 13:28:15 crc kubenswrapper[4764]: I0309 13:28:15.569795 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1ccc5b44-95ad-4f4c-8086-c176c41bbd19" path="/var/lib/kubelet/pods/1ccc5b44-95ad-4f4c-8086-c176c41bbd19/volumes" Mar 09 13:28:15 crc kubenswrapper[4764]: I0309 13:28:15.570877 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="41a3ce40-a2f8-4bc3-8fbb-eccfb1ec4ce6" path="/var/lib/kubelet/pods/41a3ce40-a2f8-4bc3-8fbb-eccfb1ec4ce6/volumes" Mar 09 13:28:15 crc kubenswrapper[4764]: I0309 13:28:15.572067 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="691ffa6f-3ee6-47fa-bcef-9fdd74ac86df" path="/var/lib/kubelet/pods/691ffa6f-3ee6-47fa-bcef-9fdd74ac86df/volumes" Mar 09 13:28:15 crc kubenswrapper[4764]: I0309 13:28:15.574118 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="88ba6041-7f8f-48f0-840c-8ea2a9bdc87b" path="/var/lib/kubelet/pods/88ba6041-7f8f-48f0-840c-8ea2a9bdc87b/volumes" Mar 09 13:28:15 crc kubenswrapper[4764]: I0309 13:28:15.575251 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="be22cbfb-d3e7-43c1-be38-f6fcadeb2c97" path="/var/lib/kubelet/pods/be22cbfb-d3e7-43c1-be38-f6fcadeb2c97/volumes" Mar 09 13:28:15 crc kubenswrapper[4764]: W0309 13:28:15.756198 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda6d68e16_c0d2_4f98_9b3f_d1d392bf67fa.slice/crio-52ed88d09225c736fe314110b647e24374da840281277d81c107d4789f623960 WatchSource:0}: Error 
finding container 52ed88d09225c736fe314110b647e24374da840281277d81c107d4789f623960: Status 404 returned error can't find the container with id 52ed88d09225c736fe314110b647e24374da840281277d81c107d4789f623960 Mar 09 13:28:15 crc kubenswrapper[4764]: I0309 13:28:15.756897 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-bbwx5"] Mar 09 13:28:15 crc kubenswrapper[4764]: I0309 13:28:15.782420 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-whs64"] Mar 09 13:28:15 crc kubenswrapper[4764]: I0309 13:28:15.785829 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-whs64" Mar 09 13:28:15 crc kubenswrapper[4764]: I0309 13:28:15.788458 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Mar 09 13:28:15 crc kubenswrapper[4764]: I0309 13:28:15.790101 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-whs64"] Mar 09 13:28:15 crc kubenswrapper[4764]: I0309 13:28:15.802787 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bbwx5" event={"ID":"a6d68e16-c0d2-4f98-9b3f-d1d392bf67fa","Type":"ContainerStarted","Data":"52ed88d09225c736fe314110b647e24374da840281277d81c107d4789f623960"} Mar 09 13:28:15 crc kubenswrapper[4764]: I0309 13:28:15.972433 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-btsbb\" (UniqueName: \"kubernetes.io/projected/26dd13d2-9d2e-4f59-97a6-e31b76ccf74c-kube-api-access-btsbb\") pod \"redhat-marketplace-whs64\" (UID: \"26dd13d2-9d2e-4f59-97a6-e31b76ccf74c\") " pod="openshift-marketplace/redhat-marketplace-whs64" Mar 09 13:28:15 crc kubenswrapper[4764]: I0309 13:28:15.972507 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/26dd13d2-9d2e-4f59-97a6-e31b76ccf74c-utilities\") pod \"redhat-marketplace-whs64\" (UID: \"26dd13d2-9d2e-4f59-97a6-e31b76ccf74c\") " pod="openshift-marketplace/redhat-marketplace-whs64" Mar 09 13:28:15 crc kubenswrapper[4764]: I0309 13:28:15.972535 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/26dd13d2-9d2e-4f59-97a6-e31b76ccf74c-catalog-content\") pod \"redhat-marketplace-whs64\" (UID: \"26dd13d2-9d2e-4f59-97a6-e31b76ccf74c\") " pod="openshift-marketplace/redhat-marketplace-whs64" Mar 09 13:28:16 crc kubenswrapper[4764]: I0309 13:28:16.074299 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-btsbb\" (UniqueName: \"kubernetes.io/projected/26dd13d2-9d2e-4f59-97a6-e31b76ccf74c-kube-api-access-btsbb\") pod \"redhat-marketplace-whs64\" (UID: \"26dd13d2-9d2e-4f59-97a6-e31b76ccf74c\") " pod="openshift-marketplace/redhat-marketplace-whs64" Mar 09 13:28:16 crc kubenswrapper[4764]: I0309 13:28:16.074362 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/26dd13d2-9d2e-4f59-97a6-e31b76ccf74c-utilities\") pod \"redhat-marketplace-whs64\" (UID: \"26dd13d2-9d2e-4f59-97a6-e31b76ccf74c\") " pod="openshift-marketplace/redhat-marketplace-whs64" Mar 09 13:28:16 crc kubenswrapper[4764]: I0309 13:28:16.074390 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/26dd13d2-9d2e-4f59-97a6-e31b76ccf74c-catalog-content\") pod \"redhat-marketplace-whs64\" (UID: \"26dd13d2-9d2e-4f59-97a6-e31b76ccf74c\") " pod="openshift-marketplace/redhat-marketplace-whs64" Mar 09 13:28:16 crc kubenswrapper[4764]: I0309 13:28:16.074871 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/26dd13d2-9d2e-4f59-97a6-e31b76ccf74c-catalog-content\") pod \"redhat-marketplace-whs64\" (UID: \"26dd13d2-9d2e-4f59-97a6-e31b76ccf74c\") " pod="openshift-marketplace/redhat-marketplace-whs64" Mar 09 13:28:16 crc kubenswrapper[4764]: I0309 13:28:16.075048 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/26dd13d2-9d2e-4f59-97a6-e31b76ccf74c-utilities\") pod \"redhat-marketplace-whs64\" (UID: \"26dd13d2-9d2e-4f59-97a6-e31b76ccf74c\") " pod="openshift-marketplace/redhat-marketplace-whs64" Mar 09 13:28:16 crc kubenswrapper[4764]: I0309 13:28:16.092930 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-btsbb\" (UniqueName: \"kubernetes.io/projected/26dd13d2-9d2e-4f59-97a6-e31b76ccf74c-kube-api-access-btsbb\") pod \"redhat-marketplace-whs64\" (UID: \"26dd13d2-9d2e-4f59-97a6-e31b76ccf74c\") " pod="openshift-marketplace/redhat-marketplace-whs64" Mar 09 13:28:16 crc kubenswrapper[4764]: I0309 13:28:16.143717 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-whs64" Mar 09 13:28:16 crc kubenswrapper[4764]: I0309 13:28:16.317496 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-whs64"] Mar 09 13:28:16 crc kubenswrapper[4764]: W0309 13:28:16.329046 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod26dd13d2_9d2e_4f59_97a6_e31b76ccf74c.slice/crio-6ab160557ac97ec94ac2560399e27ade1bfc2a79da5943ef4efa93130b97aae0 WatchSource:0}: Error finding container 6ab160557ac97ec94ac2560399e27ade1bfc2a79da5943ef4efa93130b97aae0: Status 404 returned error can't find the container with id 6ab160557ac97ec94ac2560399e27ade1bfc2a79da5943ef4efa93130b97aae0 Mar 09 13:28:16 crc kubenswrapper[4764]: I0309 13:28:16.805502 4764 generic.go:334] "Generic (PLEG): container finished" podID="26dd13d2-9d2e-4f59-97a6-e31b76ccf74c" containerID="350b7d2111c2f01954e1ea25694c1ad7063253e51b97f678cae864034348ca9d" exitCode=0 Mar 09 13:28:16 crc kubenswrapper[4764]: I0309 13:28:16.806686 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-whs64" event={"ID":"26dd13d2-9d2e-4f59-97a6-e31b76ccf74c","Type":"ContainerDied","Data":"350b7d2111c2f01954e1ea25694c1ad7063253e51b97f678cae864034348ca9d"} Mar 09 13:28:16 crc kubenswrapper[4764]: I0309 13:28:16.806714 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-whs64" event={"ID":"26dd13d2-9d2e-4f59-97a6-e31b76ccf74c","Type":"ContainerStarted","Data":"6ab160557ac97ec94ac2560399e27ade1bfc2a79da5943ef4efa93130b97aae0"} Mar 09 13:28:16 crc kubenswrapper[4764]: I0309 13:28:16.809951 4764 generic.go:334] "Generic (PLEG): container finished" podID="a6d68e16-c0d2-4f98-9b3f-d1d392bf67fa" containerID="c086db9ee77f8fe876bbaf1f0cf47f42e173bf32d2853cc8ca88a473223d9396" exitCode=0 Mar 09 13:28:16 crc kubenswrapper[4764]: I0309 
13:28:16.810827 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bbwx5" event={"ID":"a6d68e16-c0d2-4f98-9b3f-d1d392bf67fa","Type":"ContainerDied","Data":"c086db9ee77f8fe876bbaf1f0cf47f42e173bf32d2853cc8ca88a473223d9396"} Mar 09 13:28:17 crc kubenswrapper[4764]: I0309 13:28:17.582665 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-xn8sz"] Mar 09 13:28:17 crc kubenswrapper[4764]: I0309 13:28:17.585228 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-xn8sz" Mar 09 13:28:17 crc kubenswrapper[4764]: I0309 13:28:17.587960 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Mar 09 13:28:17 crc kubenswrapper[4764]: I0309 13:28:17.590692 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-xn8sz"] Mar 09 13:28:17 crc kubenswrapper[4764]: I0309 13:28:17.698351 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/621cdc4e-d896-4775-b654-2d6606097cb9-catalog-content\") pod \"redhat-operators-xn8sz\" (UID: \"621cdc4e-d896-4775-b654-2d6606097cb9\") " pod="openshift-marketplace/redhat-operators-xn8sz" Mar 09 13:28:17 crc kubenswrapper[4764]: I0309 13:28:17.698989 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jtpvd\" (UniqueName: \"kubernetes.io/projected/621cdc4e-d896-4775-b654-2d6606097cb9-kube-api-access-jtpvd\") pod \"redhat-operators-xn8sz\" (UID: \"621cdc4e-d896-4775-b654-2d6606097cb9\") " pod="openshift-marketplace/redhat-operators-xn8sz" Mar 09 13:28:17 crc kubenswrapper[4764]: I0309 13:28:17.699488 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" 
(UniqueName: \"kubernetes.io/empty-dir/621cdc4e-d896-4775-b654-2d6606097cb9-utilities\") pod \"redhat-operators-xn8sz\" (UID: \"621cdc4e-d896-4775-b654-2d6606097cb9\") " pod="openshift-marketplace/redhat-operators-xn8sz" Mar 09 13:28:17 crc kubenswrapper[4764]: I0309 13:28:17.801096 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jtpvd\" (UniqueName: \"kubernetes.io/projected/621cdc4e-d896-4775-b654-2d6606097cb9-kube-api-access-jtpvd\") pod \"redhat-operators-xn8sz\" (UID: \"621cdc4e-d896-4775-b654-2d6606097cb9\") " pod="openshift-marketplace/redhat-operators-xn8sz" Mar 09 13:28:17 crc kubenswrapper[4764]: I0309 13:28:17.801165 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/621cdc4e-d896-4775-b654-2d6606097cb9-utilities\") pod \"redhat-operators-xn8sz\" (UID: \"621cdc4e-d896-4775-b654-2d6606097cb9\") " pod="openshift-marketplace/redhat-operators-xn8sz" Mar 09 13:28:17 crc kubenswrapper[4764]: I0309 13:28:17.801206 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/621cdc4e-d896-4775-b654-2d6606097cb9-catalog-content\") pod \"redhat-operators-xn8sz\" (UID: \"621cdc4e-d896-4775-b654-2d6606097cb9\") " pod="openshift-marketplace/redhat-operators-xn8sz" Mar 09 13:28:17 crc kubenswrapper[4764]: I0309 13:28:17.801664 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/621cdc4e-d896-4775-b654-2d6606097cb9-utilities\") pod \"redhat-operators-xn8sz\" (UID: \"621cdc4e-d896-4775-b654-2d6606097cb9\") " pod="openshift-marketplace/redhat-operators-xn8sz" Mar 09 13:28:17 crc kubenswrapper[4764]: I0309 13:28:17.801742 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/621cdc4e-d896-4775-b654-2d6606097cb9-catalog-content\") pod \"redhat-operators-xn8sz\" (UID: \"621cdc4e-d896-4775-b654-2d6606097cb9\") " pod="openshift-marketplace/redhat-operators-xn8sz" Mar 09 13:28:17 crc kubenswrapper[4764]: I0309 13:28:17.831953 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jtpvd\" (UniqueName: \"kubernetes.io/projected/621cdc4e-d896-4775-b654-2d6606097cb9-kube-api-access-jtpvd\") pod \"redhat-operators-xn8sz\" (UID: \"621cdc4e-d896-4775-b654-2d6606097cb9\") " pod="openshift-marketplace/redhat-operators-xn8sz" Mar 09 13:28:17 crc kubenswrapper[4764]: I0309 13:28:17.837345 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bbwx5" event={"ID":"a6d68e16-c0d2-4f98-9b3f-d1d392bf67fa","Type":"ContainerStarted","Data":"91f21d1192fbf7233d0ba5261d6dd79a24c5e670d2ce3ce56ee836e0d4154e5b"} Mar 09 13:28:17 crc kubenswrapper[4764]: I0309 13:28:17.858134 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-whs64" event={"ID":"26dd13d2-9d2e-4f59-97a6-e31b76ccf74c","Type":"ContainerStarted","Data":"b5083cf386271d338b4c15c21672fafdc012420c0f721902a67a9fc096355808"} Mar 09 13:28:17 crc kubenswrapper[4764]: I0309 13:28:17.909562 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-xn8sz" Mar 09 13:28:18 crc kubenswrapper[4764]: I0309 13:28:18.072057 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-xn8sz"] Mar 09 13:28:18 crc kubenswrapper[4764]: I0309 13:28:18.181220 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-4sxc8"] Mar 09 13:28:18 crc kubenswrapper[4764]: I0309 13:28:18.182464 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-4sxc8" Mar 09 13:28:18 crc kubenswrapper[4764]: I0309 13:28:18.185667 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Mar 09 13:28:18 crc kubenswrapper[4764]: I0309 13:28:18.199000 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-4sxc8"] Mar 09 13:28:18 crc kubenswrapper[4764]: I0309 13:28:18.307704 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fa6ff5f6-9328-419b-a996-05bcf478b446-utilities\") pod \"community-operators-4sxc8\" (UID: \"fa6ff5f6-9328-419b-a996-05bcf478b446\") " pod="openshift-marketplace/community-operators-4sxc8" Mar 09 13:28:18 crc kubenswrapper[4764]: I0309 13:28:18.307767 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fa6ff5f6-9328-419b-a996-05bcf478b446-catalog-content\") pod \"community-operators-4sxc8\" (UID: \"fa6ff5f6-9328-419b-a996-05bcf478b446\") " pod="openshift-marketplace/community-operators-4sxc8" Mar 09 13:28:18 crc kubenswrapper[4764]: I0309 13:28:18.307908 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p26kd\" (UniqueName: \"kubernetes.io/projected/fa6ff5f6-9328-419b-a996-05bcf478b446-kube-api-access-p26kd\") pod \"community-operators-4sxc8\" (UID: \"fa6ff5f6-9328-419b-a996-05bcf478b446\") " pod="openshift-marketplace/community-operators-4sxc8" Mar 09 13:28:18 crc kubenswrapper[4764]: I0309 13:28:18.408895 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fa6ff5f6-9328-419b-a996-05bcf478b446-catalog-content\") pod \"community-operators-4sxc8\" (UID: 
\"fa6ff5f6-9328-419b-a996-05bcf478b446\") " pod="openshift-marketplace/community-operators-4sxc8" Mar 09 13:28:18 crc kubenswrapper[4764]: I0309 13:28:18.409042 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p26kd\" (UniqueName: \"kubernetes.io/projected/fa6ff5f6-9328-419b-a996-05bcf478b446-kube-api-access-p26kd\") pod \"community-operators-4sxc8\" (UID: \"fa6ff5f6-9328-419b-a996-05bcf478b446\") " pod="openshift-marketplace/community-operators-4sxc8" Mar 09 13:28:18 crc kubenswrapper[4764]: I0309 13:28:18.409112 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fa6ff5f6-9328-419b-a996-05bcf478b446-utilities\") pod \"community-operators-4sxc8\" (UID: \"fa6ff5f6-9328-419b-a996-05bcf478b446\") " pod="openshift-marketplace/community-operators-4sxc8" Mar 09 13:28:18 crc kubenswrapper[4764]: I0309 13:28:18.409471 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fa6ff5f6-9328-419b-a996-05bcf478b446-catalog-content\") pod \"community-operators-4sxc8\" (UID: \"fa6ff5f6-9328-419b-a996-05bcf478b446\") " pod="openshift-marketplace/community-operators-4sxc8" Mar 09 13:28:18 crc kubenswrapper[4764]: I0309 13:28:18.409490 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fa6ff5f6-9328-419b-a996-05bcf478b446-utilities\") pod \"community-operators-4sxc8\" (UID: \"fa6ff5f6-9328-419b-a996-05bcf478b446\") " pod="openshift-marketplace/community-operators-4sxc8" Mar 09 13:28:18 crc kubenswrapper[4764]: I0309 13:28:18.429021 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p26kd\" (UniqueName: \"kubernetes.io/projected/fa6ff5f6-9328-419b-a996-05bcf478b446-kube-api-access-p26kd\") pod \"community-operators-4sxc8\" (UID: 
\"fa6ff5f6-9328-419b-a996-05bcf478b446\") " pod="openshift-marketplace/community-operators-4sxc8" Mar 09 13:28:18 crc kubenswrapper[4764]: I0309 13:28:18.583420 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-4sxc8" Mar 09 13:28:18 crc kubenswrapper[4764]: I0309 13:28:18.782081 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-4sxc8"] Mar 09 13:28:18 crc kubenswrapper[4764]: W0309 13:28:18.788529 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfa6ff5f6_9328_419b_a996_05bcf478b446.slice/crio-fa2faca2b6f361aab53dbf1e7c047f8fa427577595f01228a4ee0b2b2c128cfa WatchSource:0}: Error finding container fa2faca2b6f361aab53dbf1e7c047f8fa427577595f01228a4ee0b2b2c128cfa: Status 404 returned error can't find the container with id fa2faca2b6f361aab53dbf1e7c047f8fa427577595f01228a4ee0b2b2c128cfa Mar 09 13:28:18 crc kubenswrapper[4764]: I0309 13:28:18.865690 4764 generic.go:334] "Generic (PLEG): container finished" podID="621cdc4e-d896-4775-b654-2d6606097cb9" containerID="21b7752d01ed5157b1358caf6432ea4435920cfb27ea11e67bee506887d1aece" exitCode=0 Mar 09 13:28:18 crc kubenswrapper[4764]: I0309 13:28:18.865849 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xn8sz" event={"ID":"621cdc4e-d896-4775-b654-2d6606097cb9","Type":"ContainerDied","Data":"21b7752d01ed5157b1358caf6432ea4435920cfb27ea11e67bee506887d1aece"} Mar 09 13:28:18 crc kubenswrapper[4764]: I0309 13:28:18.866236 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xn8sz" event={"ID":"621cdc4e-d896-4775-b654-2d6606097cb9","Type":"ContainerStarted","Data":"2302eb7ccaa0400069ae2934df34cee77b390ca877fa548b26f505702d7c6bcc"} Mar 09 13:28:18 crc kubenswrapper[4764]: I0309 13:28:18.870529 4764 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/certified-operators-bbwx5" event={"ID":"a6d68e16-c0d2-4f98-9b3f-d1d392bf67fa","Type":"ContainerDied","Data":"91f21d1192fbf7233d0ba5261d6dd79a24c5e670d2ce3ce56ee836e0d4154e5b"} Mar 09 13:28:18 crc kubenswrapper[4764]: I0309 13:28:18.870466 4764 generic.go:334] "Generic (PLEG): container finished" podID="a6d68e16-c0d2-4f98-9b3f-d1d392bf67fa" containerID="91f21d1192fbf7233d0ba5261d6dd79a24c5e670d2ce3ce56ee836e0d4154e5b" exitCode=0 Mar 09 13:28:18 crc kubenswrapper[4764]: I0309 13:28:18.873739 4764 generic.go:334] "Generic (PLEG): container finished" podID="26dd13d2-9d2e-4f59-97a6-e31b76ccf74c" containerID="b5083cf386271d338b4c15c21672fafdc012420c0f721902a67a9fc096355808" exitCode=0 Mar 09 13:28:18 crc kubenswrapper[4764]: I0309 13:28:18.873804 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-whs64" event={"ID":"26dd13d2-9d2e-4f59-97a6-e31b76ccf74c","Type":"ContainerDied","Data":"b5083cf386271d338b4c15c21672fafdc012420c0f721902a67a9fc096355808"} Mar 09 13:28:18 crc kubenswrapper[4764]: I0309 13:28:18.875477 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4sxc8" event={"ID":"fa6ff5f6-9328-419b-a996-05bcf478b446","Type":"ContainerStarted","Data":"fa2faca2b6f361aab53dbf1e7c047f8fa427577595f01228a4ee0b2b2c128cfa"} Mar 09 13:28:19 crc kubenswrapper[4764]: I0309 13:28:19.883790 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-whs64" event={"ID":"26dd13d2-9d2e-4f59-97a6-e31b76ccf74c","Type":"ContainerStarted","Data":"f1d420ae1949fed35e711d47be2d28ad84d0eb2096c4440052f1371c212a3b0b"} Mar 09 13:28:19 crc kubenswrapper[4764]: I0309 13:28:19.887596 4764 generic.go:334] "Generic (PLEG): container finished" podID="fa6ff5f6-9328-419b-a996-05bcf478b446" containerID="dab29cc25a32a534e61bf039add40e1f06d5ae10ca8ef07579cb0bf91b124a9d" exitCode=0 Mar 09 13:28:19 crc 
kubenswrapper[4764]: I0309 13:28:19.887630 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4sxc8" event={"ID":"fa6ff5f6-9328-419b-a996-05bcf478b446","Type":"ContainerDied","Data":"dab29cc25a32a534e61bf039add40e1f06d5ae10ca8ef07579cb0bf91b124a9d"} Mar 09 13:28:19 crc kubenswrapper[4764]: I0309 13:28:19.920333 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-whs64" podStartSLOduration=2.347893043 podStartE2EDuration="4.920298903s" podCreationTimestamp="2026-03-09 13:28:15 +0000 UTC" firstStartedPulling="2026-03-09 13:28:16.808125938 +0000 UTC m=+452.058297846" lastFinishedPulling="2026-03-09 13:28:19.380531788 +0000 UTC m=+454.630703706" observedRunningTime="2026-03-09 13:28:19.913557058 +0000 UTC m=+455.163728976" watchObservedRunningTime="2026-03-09 13:28:19.920298903 +0000 UTC m=+455.170470811" Mar 09 13:28:19 crc kubenswrapper[4764]: I0309 13:28:19.923072 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bbwx5" event={"ID":"a6d68e16-c0d2-4f98-9b3f-d1d392bf67fa","Type":"ContainerStarted","Data":"3e36f3856d0e3975472304508e0beefe03cae8bac38bfcdfeed65a7529c78eed"} Mar 09 13:28:19 crc kubenswrapper[4764]: I0309 13:28:19.981709 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-bbwx5" podStartSLOduration=2.486247766 podStartE2EDuration="4.981692226s" podCreationTimestamp="2026-03-09 13:28:15 +0000 UTC" firstStartedPulling="2026-03-09 13:28:16.811352317 +0000 UTC m=+452.061524215" lastFinishedPulling="2026-03-09 13:28:19.306796777 +0000 UTC m=+454.556968675" observedRunningTime="2026-03-09 13:28:19.979400183 +0000 UTC m=+455.229572101" watchObservedRunningTime="2026-03-09 13:28:19.981692226 +0000 UTC m=+455.231864134" Mar 09 13:28:20 crc kubenswrapper[4764]: I0309 13:28:20.931259 4764 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/community-operators-4sxc8" event={"ID":"fa6ff5f6-9328-419b-a996-05bcf478b446","Type":"ContainerStarted","Data":"81f2426c617949ba2e886b6a6713e69e1e2306d5a032ede7615f1cde56b297b4"} Mar 09 13:28:20 crc kubenswrapper[4764]: I0309 13:28:20.933046 4764 generic.go:334] "Generic (PLEG): container finished" podID="621cdc4e-d896-4775-b654-2d6606097cb9" containerID="fb1be1a72f3234dac4f59f874d4e626091a8abe81a68404c7028805d3e2518ea" exitCode=0 Mar 09 13:28:20 crc kubenswrapper[4764]: I0309 13:28:20.933201 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xn8sz" event={"ID":"621cdc4e-d896-4775-b654-2d6606097cb9","Type":"ContainerDied","Data":"fb1be1a72f3234dac4f59f874d4e626091a8abe81a68404c7028805d3e2518ea"} Mar 09 13:28:21 crc kubenswrapper[4764]: I0309 13:28:21.943917 4764 generic.go:334] "Generic (PLEG): container finished" podID="fa6ff5f6-9328-419b-a996-05bcf478b446" containerID="81f2426c617949ba2e886b6a6713e69e1e2306d5a032ede7615f1cde56b297b4" exitCode=0 Mar 09 13:28:21 crc kubenswrapper[4764]: I0309 13:28:21.943993 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4sxc8" event={"ID":"fa6ff5f6-9328-419b-a996-05bcf478b446","Type":"ContainerDied","Data":"81f2426c617949ba2e886b6a6713e69e1e2306d5a032ede7615f1cde56b297b4"} Mar 09 13:28:21 crc kubenswrapper[4764]: I0309 13:28:21.949046 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xn8sz" event={"ID":"621cdc4e-d896-4775-b654-2d6606097cb9","Type":"ContainerStarted","Data":"678fe7f391aa999be0810d250bf35094d5e83712c163ecc09372d0f1c2a64457"} Mar 09 13:28:21 crc kubenswrapper[4764]: I0309 13:28:21.999716 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-xn8sz" podStartSLOduration=2.538070135 podStartE2EDuration="4.999633884s" podCreationTimestamp="2026-03-09 13:28:17 +0000 UTC" 
firstStartedPulling="2026-03-09 13:28:18.868705289 +0000 UTC m=+454.118877197" lastFinishedPulling="2026-03-09 13:28:21.330269038 +0000 UTC m=+456.580440946" observedRunningTime="2026-03-09 13:28:21.997091207 +0000 UTC m=+457.247263125" watchObservedRunningTime="2026-03-09 13:28:21.999633884 +0000 UTC m=+457.249805802" Mar 09 13:28:22 crc kubenswrapper[4764]: I0309 13:28:22.961595 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4sxc8" event={"ID":"fa6ff5f6-9328-419b-a996-05bcf478b446","Type":"ContainerStarted","Data":"cbaaee31f94d5b978452176039ab59d2520f7f3d0d136e8695a1528c41a1e96e"} Mar 09 13:28:22 crc kubenswrapper[4764]: I0309 13:28:22.984958 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-4sxc8" podStartSLOduration=2.5611369440000002 podStartE2EDuration="4.984937949s" podCreationTimestamp="2026-03-09 13:28:18 +0000 UTC" firstStartedPulling="2026-03-09 13:28:19.889443397 +0000 UTC m=+455.139615305" lastFinishedPulling="2026-03-09 13:28:22.313244352 +0000 UTC m=+457.563416310" observedRunningTime="2026-03-09 13:28:22.984748374 +0000 UTC m=+458.234920292" watchObservedRunningTime="2026-03-09 13:28:22.984937949 +0000 UTC m=+458.235109857" Mar 09 13:28:25 crc kubenswrapper[4764]: I0309 13:28:25.538606 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-bbwx5" Mar 09 13:28:25 crc kubenswrapper[4764]: I0309 13:28:25.539117 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-bbwx5" Mar 09 13:28:25 crc kubenswrapper[4764]: I0309 13:28:25.600540 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-bbwx5" Mar 09 13:28:26 crc kubenswrapper[4764]: I0309 13:28:26.025274 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-marketplace/certified-operators-bbwx5" Mar 09 13:28:26 crc kubenswrapper[4764]: I0309 13:28:26.144236 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-whs64" Mar 09 13:28:26 crc kubenswrapper[4764]: I0309 13:28:26.144680 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-whs64" Mar 09 13:28:26 crc kubenswrapper[4764]: I0309 13:28:26.181124 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-whs64" Mar 09 13:28:27 crc kubenswrapper[4764]: I0309 13:28:27.036397 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-whs64" Mar 09 13:28:27 crc kubenswrapper[4764]: I0309 13:28:27.668738 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 13:28:27 crc kubenswrapper[4764]: I0309 13:28:27.669186 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 13:28:27 crc kubenswrapper[4764]: I0309 13:28:27.670316 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: 
\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 13:28:27 crc kubenswrapper[4764]: I0309 13:28:27.678025 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 13:28:27 crc kubenswrapper[4764]: I0309 13:28:27.760332 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 13:28:27 crc kubenswrapper[4764]: I0309 13:28:27.910166 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-xn8sz" Mar 09 13:28:27 crc kubenswrapper[4764]: I0309 13:28:27.910257 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-xn8sz" Mar 09 13:28:27 crc kubenswrapper[4764]: I0309 13:28:27.957581 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-xn8sz" Mar 09 13:28:28 crc kubenswrapper[4764]: W0309 13:28:28.004554 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5fe485a1_e14f_4c09_b5b9_f252bc42b7e8.slice/crio-c587b0b9f778265ef0d6ecfa8659f9de6da5b53a8719e98811d74683bc3fdf0c WatchSource:0}: Error finding container c587b0b9f778265ef0d6ecfa8659f9de6da5b53a8719e98811d74683bc3fdf0c: Status 404 returned error can't find the container with id c587b0b9f778265ef0d6ecfa8659f9de6da5b53a8719e98811d74683bc3fdf0c Mar 09 13:28:28 crc kubenswrapper[4764]: I0309 13:28:28.041896 4764 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-xn8sz" Mar 09 13:28:28 crc kubenswrapper[4764]: I0309 13:28:28.370368 4764 patch_prober.go:28] interesting pod/machine-config-daemon-xxczl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 09 13:28:28 crc kubenswrapper[4764]: I0309 13:28:28.370468 4764 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 09 13:28:28 crc kubenswrapper[4764]: I0309 13:28:28.583523 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-4sxc8" Mar 09 13:28:28 crc kubenswrapper[4764]: I0309 13:28:28.583572 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-4sxc8" Mar 09 13:28:28 crc kubenswrapper[4764]: I0309 13:28:28.629476 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-4sxc8" Mar 09 13:28:28 crc kubenswrapper[4764]: I0309 13:28:28.684091 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 13:28:28 crc kubenswrapper[4764]: I0309 13:28:28.684192 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" 
(UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 13:28:28 crc kubenswrapper[4764]: I0309 13:28:28.691120 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 13:28:28 crc kubenswrapper[4764]: I0309 13:28:28.691134 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 13:28:28 crc kubenswrapper[4764]: I0309 13:28:28.860858 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 13:28:28 crc kubenswrapper[4764]: I0309 13:28:28.960284 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 13:28:29 crc kubenswrapper[4764]: I0309 13:28:29.015101 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"a84715ddddcc93a14a8cb4cf3872420ec468facb9923c9290e93bd3c37b10b31"} Mar 09 13:28:29 crc kubenswrapper[4764]: I0309 13:28:29.015173 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"c587b0b9f778265ef0d6ecfa8659f9de6da5b53a8719e98811d74683bc3fdf0c"} Mar 09 13:28:29 crc kubenswrapper[4764]: I0309 13:28:29.095552 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-4sxc8" Mar 09 13:28:29 crc kubenswrapper[4764]: W0309 13:28:29.330466 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9d751cbb_f2e2_430d_9754_c882a5e924a5.slice/crio-6c8947a1f915848c48c6c0419f75f88ef3a66ca240e783f2db1dc846f908de65 WatchSource:0}: Error finding container 6c8947a1f915848c48c6c0419f75f88ef3a66ca240e783f2db1dc846f908de65: Status 404 returned error can't find the container with id 6c8947a1f915848c48c6c0419f75f88ef3a66ca240e783f2db1dc846f908de65 Mar 09 13:28:30 crc kubenswrapper[4764]: I0309 13:28:30.022276 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"0fd4770d347eabdfc31bf15a33ee5028197544bb396e1ae5fb89adaef69c34d5"} Mar 09 13:28:30 crc kubenswrapper[4764]: I0309 13:28:30.022766 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"6c8947a1f915848c48c6c0419f75f88ef3a66ca240e783f2db1dc846f908de65"} Mar 09 13:28:30 crc kubenswrapper[4764]: I0309 13:28:30.025690 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"ec57b55e16d15942e8a8d592497f69bea44f72b5da852d3582dba57da9a2033c"} Mar 09 13:28:30 crc kubenswrapper[4764]: I0309 13:28:30.025721 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"096db084c373c866c7fb651144d795fa04cc36153c4327aba08cace8e3c46039"} Mar 09 13:28:30 crc kubenswrapper[4764]: I0309 13:28:30.026057 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 13:28:58 crc kubenswrapper[4764]: I0309 13:28:58.370805 4764 patch_prober.go:28] interesting pod/machine-config-daemon-xxczl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 09 13:28:58 crc kubenswrapper[4764]: I0309 13:28:58.371455 4764 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 09 13:28:58 crc kubenswrapper[4764]: I0309 13:28:58.371525 4764 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" 
pod="openshift-machine-config-operator/machine-config-daemon-xxczl" Mar 09 13:28:58 crc kubenswrapper[4764]: I0309 13:28:58.372834 4764 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"8bb39ae24112881ead01c36905f25d69cb36d1e4d3c6e8aa79f283e7dd0d444b"} pod="openshift-machine-config-operator/machine-config-daemon-xxczl" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 09 13:28:58 crc kubenswrapper[4764]: I0309 13:28:58.372919 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922" containerName="machine-config-daemon" containerID="cri-o://8bb39ae24112881ead01c36905f25d69cb36d1e4d3c6e8aa79f283e7dd0d444b" gracePeriod=600 Mar 09 13:28:59 crc kubenswrapper[4764]: I0309 13:28:59.212370 4764 generic.go:334] "Generic (PLEG): container finished" podID="6bcdd179-43c2-427c-9fac-7155c122e922" containerID="8bb39ae24112881ead01c36905f25d69cb36d1e4d3c6e8aa79f283e7dd0d444b" exitCode=0 Mar 09 13:28:59 crc kubenswrapper[4764]: I0309 13:28:59.212439 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" event={"ID":"6bcdd179-43c2-427c-9fac-7155c122e922","Type":"ContainerDied","Data":"8bb39ae24112881ead01c36905f25d69cb36d1e4d3c6e8aa79f283e7dd0d444b"} Mar 09 13:28:59 crc kubenswrapper[4764]: I0309 13:28:59.213320 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" event={"ID":"6bcdd179-43c2-427c-9fac-7155c122e922","Type":"ContainerStarted","Data":"6be6253041773ccdd98bbdfbbb5e4447fbae62fa99ca2320328889b4671f6c14"} Mar 09 13:28:59 crc kubenswrapper[4764]: I0309 13:28:59.213351 4764 scope.go:117] "RemoveContainer" 
containerID="a541959fd435ba385dfa711ec03b7d78fb75528fed89a7bd3620aa50fbab26ad" Mar 09 13:29:08 crc kubenswrapper[4764]: I0309 13:29:08.870566 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 13:30:00 crc kubenswrapper[4764]: I0309 13:30:00.142586 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29551050-zmlhn"] Mar 09 13:30:00 crc kubenswrapper[4764]: I0309 13:30:00.143895 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551050-zmlhn" Mar 09 13:30:00 crc kubenswrapper[4764]: I0309 13:30:00.147597 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-4s7mr" Mar 09 13:30:00 crc kubenswrapper[4764]: I0309 13:30:00.147737 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29551050-9wvsp"] Mar 09 13:30:00 crc kubenswrapper[4764]: I0309 13:30:00.148001 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 09 13:30:00 crc kubenswrapper[4764]: I0309 13:30:00.148808 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29551050-9wvsp" Mar 09 13:30:00 crc kubenswrapper[4764]: I0309 13:30:00.149310 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551050-zmlhn"] Mar 09 13:30:00 crc kubenswrapper[4764]: I0309 13:30:00.149604 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 09 13:30:00 crc kubenswrapper[4764]: I0309 13:30:00.150676 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 09 13:30:00 crc kubenswrapper[4764]: I0309 13:30:00.151047 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 09 13:30:00 crc kubenswrapper[4764]: I0309 13:30:00.153379 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29551050-9wvsp"] Mar 09 13:30:00 crc kubenswrapper[4764]: I0309 13:30:00.272350 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wtshq\" (UniqueName: \"kubernetes.io/projected/7f815cd5-462f-4994-bab1-beef4157b06e-kube-api-access-wtshq\") pod \"auto-csr-approver-29551050-zmlhn\" (UID: \"7f815cd5-462f-4994-bab1-beef4157b06e\") " pod="openshift-infra/auto-csr-approver-29551050-zmlhn" Mar 09 13:30:00 crc kubenswrapper[4764]: I0309 13:30:00.272446 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1a7784d6-384a-426a-8c7f-17738461c327-config-volume\") pod \"collect-profiles-29551050-9wvsp\" (UID: \"1a7784d6-384a-426a-8c7f-17738461c327\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551050-9wvsp" Mar 09 13:30:00 crc kubenswrapper[4764]: I0309 13:30:00.272477 4764 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1a7784d6-384a-426a-8c7f-17738461c327-secret-volume\") pod \"collect-profiles-29551050-9wvsp\" (UID: \"1a7784d6-384a-426a-8c7f-17738461c327\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551050-9wvsp" Mar 09 13:30:00 crc kubenswrapper[4764]: I0309 13:30:00.272499 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kwwtz\" (UniqueName: \"kubernetes.io/projected/1a7784d6-384a-426a-8c7f-17738461c327-kube-api-access-kwwtz\") pod \"collect-profiles-29551050-9wvsp\" (UID: \"1a7784d6-384a-426a-8c7f-17738461c327\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551050-9wvsp" Mar 09 13:30:00 crc kubenswrapper[4764]: I0309 13:30:00.373588 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1a7784d6-384a-426a-8c7f-17738461c327-config-volume\") pod \"collect-profiles-29551050-9wvsp\" (UID: \"1a7784d6-384a-426a-8c7f-17738461c327\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551050-9wvsp" Mar 09 13:30:00 crc kubenswrapper[4764]: I0309 13:30:00.373678 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1a7784d6-384a-426a-8c7f-17738461c327-secret-volume\") pod \"collect-profiles-29551050-9wvsp\" (UID: \"1a7784d6-384a-426a-8c7f-17738461c327\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551050-9wvsp" Mar 09 13:30:00 crc kubenswrapper[4764]: I0309 13:30:00.373712 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kwwtz\" (UniqueName: \"kubernetes.io/projected/1a7784d6-384a-426a-8c7f-17738461c327-kube-api-access-kwwtz\") pod \"collect-profiles-29551050-9wvsp\" (UID: 
\"1a7784d6-384a-426a-8c7f-17738461c327\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551050-9wvsp" Mar 09 13:30:00 crc kubenswrapper[4764]: I0309 13:30:00.373739 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wtshq\" (UniqueName: \"kubernetes.io/projected/7f815cd5-462f-4994-bab1-beef4157b06e-kube-api-access-wtshq\") pod \"auto-csr-approver-29551050-zmlhn\" (UID: \"7f815cd5-462f-4994-bab1-beef4157b06e\") " pod="openshift-infra/auto-csr-approver-29551050-zmlhn" Mar 09 13:30:00 crc kubenswrapper[4764]: I0309 13:30:00.374826 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1a7784d6-384a-426a-8c7f-17738461c327-config-volume\") pod \"collect-profiles-29551050-9wvsp\" (UID: \"1a7784d6-384a-426a-8c7f-17738461c327\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551050-9wvsp" Mar 09 13:30:00 crc kubenswrapper[4764]: I0309 13:30:00.382960 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1a7784d6-384a-426a-8c7f-17738461c327-secret-volume\") pod \"collect-profiles-29551050-9wvsp\" (UID: \"1a7784d6-384a-426a-8c7f-17738461c327\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551050-9wvsp" Mar 09 13:30:00 crc kubenswrapper[4764]: I0309 13:30:00.392432 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wtshq\" (UniqueName: \"kubernetes.io/projected/7f815cd5-462f-4994-bab1-beef4157b06e-kube-api-access-wtshq\") pod \"auto-csr-approver-29551050-zmlhn\" (UID: \"7f815cd5-462f-4994-bab1-beef4157b06e\") " pod="openshift-infra/auto-csr-approver-29551050-zmlhn" Mar 09 13:30:00 crc kubenswrapper[4764]: I0309 13:30:00.392739 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kwwtz\" (UniqueName: 
\"kubernetes.io/projected/1a7784d6-384a-426a-8c7f-17738461c327-kube-api-access-kwwtz\") pod \"collect-profiles-29551050-9wvsp\" (UID: \"1a7784d6-384a-426a-8c7f-17738461c327\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551050-9wvsp" Mar 09 13:30:00 crc kubenswrapper[4764]: I0309 13:30:00.477862 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551050-zmlhn" Mar 09 13:30:00 crc kubenswrapper[4764]: I0309 13:30:00.496328 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29551050-9wvsp" Mar 09 13:30:00 crc kubenswrapper[4764]: I0309 13:30:00.773575 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551050-zmlhn"] Mar 09 13:30:00 crc kubenswrapper[4764]: I0309 13:30:00.788124 4764 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 09 13:30:00 crc kubenswrapper[4764]: I0309 13:30:00.933212 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29551050-9wvsp"] Mar 09 13:30:00 crc kubenswrapper[4764]: W0309 13:30:00.939442 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1a7784d6_384a_426a_8c7f_17738461c327.slice/crio-bd22d72b8374d245cd7927a10057d1dc10b1552a1c9306e29d1f5b840279060d WatchSource:0}: Error finding container bd22d72b8374d245cd7927a10057d1dc10b1552a1c9306e29d1f5b840279060d: Status 404 returned error can't find the container with id bd22d72b8374d245cd7927a10057d1dc10b1552a1c9306e29d1f5b840279060d Mar 09 13:30:01 crc kubenswrapper[4764]: I0309 13:30:01.743512 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551050-zmlhn" 
event={"ID":"7f815cd5-462f-4994-bab1-beef4157b06e","Type":"ContainerStarted","Data":"fee757faf62e596d9a24630a4cb2fe2b56b3f58f6cb88a77fee32c0a8f6e491c"} Mar 09 13:30:01 crc kubenswrapper[4764]: I0309 13:30:01.744359 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29551050-9wvsp" event={"ID":"1a7784d6-384a-426a-8c7f-17738461c327","Type":"ContainerStarted","Data":"bd22d72b8374d245cd7927a10057d1dc10b1552a1c9306e29d1f5b840279060d"} Mar 09 13:30:02 crc kubenswrapper[4764]: I0309 13:30:02.752018 4764 generic.go:334] "Generic (PLEG): container finished" podID="7f815cd5-462f-4994-bab1-beef4157b06e" containerID="a7ac41644a3901488ef405782554d6dd08becac720d51b607ff6a4cba78e912f" exitCode=0 Mar 09 13:30:02 crc kubenswrapper[4764]: I0309 13:30:02.752094 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551050-zmlhn" event={"ID":"7f815cd5-462f-4994-bab1-beef4157b06e","Type":"ContainerDied","Data":"a7ac41644a3901488ef405782554d6dd08becac720d51b607ff6a4cba78e912f"} Mar 09 13:30:02 crc kubenswrapper[4764]: I0309 13:30:02.754082 4764 generic.go:334] "Generic (PLEG): container finished" podID="1a7784d6-384a-426a-8c7f-17738461c327" containerID="aa846508ebc812b1aaea2ee2e48b6017bd527c31a06deb61dd1001786fb1b811" exitCode=0 Mar 09 13:30:02 crc kubenswrapper[4764]: I0309 13:30:02.754128 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29551050-9wvsp" event={"ID":"1a7784d6-384a-426a-8c7f-17738461c327","Type":"ContainerDied","Data":"aa846508ebc812b1aaea2ee2e48b6017bd527c31a06deb61dd1001786fb1b811"} Mar 09 13:30:04 crc kubenswrapper[4764]: I0309 13:30:04.083160 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29551050-9wvsp" Mar 09 13:30:04 crc kubenswrapper[4764]: I0309 13:30:04.088226 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551050-zmlhn" Mar 09 13:30:04 crc kubenswrapper[4764]: I0309 13:30:04.226565 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1a7784d6-384a-426a-8c7f-17738461c327-config-volume\") pod \"1a7784d6-384a-426a-8c7f-17738461c327\" (UID: \"1a7784d6-384a-426a-8c7f-17738461c327\") " Mar 09 13:30:04 crc kubenswrapper[4764]: I0309 13:30:04.226674 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wtshq\" (UniqueName: \"kubernetes.io/projected/7f815cd5-462f-4994-bab1-beef4157b06e-kube-api-access-wtshq\") pod \"7f815cd5-462f-4994-bab1-beef4157b06e\" (UID: \"7f815cd5-462f-4994-bab1-beef4157b06e\") " Mar 09 13:30:04 crc kubenswrapper[4764]: I0309 13:30:04.226740 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kwwtz\" (UniqueName: \"kubernetes.io/projected/1a7784d6-384a-426a-8c7f-17738461c327-kube-api-access-kwwtz\") pod \"1a7784d6-384a-426a-8c7f-17738461c327\" (UID: \"1a7784d6-384a-426a-8c7f-17738461c327\") " Mar 09 13:30:04 crc kubenswrapper[4764]: I0309 13:30:04.226902 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1a7784d6-384a-426a-8c7f-17738461c327-secret-volume\") pod \"1a7784d6-384a-426a-8c7f-17738461c327\" (UID: \"1a7784d6-384a-426a-8c7f-17738461c327\") " Mar 09 13:30:04 crc kubenswrapper[4764]: I0309 13:30:04.227612 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1a7784d6-384a-426a-8c7f-17738461c327-config-volume" (OuterVolumeSpecName: "config-volume") pod 
"1a7784d6-384a-426a-8c7f-17738461c327" (UID: "1a7784d6-384a-426a-8c7f-17738461c327"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:30:04 crc kubenswrapper[4764]: I0309 13:30:04.233528 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1a7784d6-384a-426a-8c7f-17738461c327-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "1a7784d6-384a-426a-8c7f-17738461c327" (UID: "1a7784d6-384a-426a-8c7f-17738461c327"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:30:04 crc kubenswrapper[4764]: I0309 13:30:04.233967 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1a7784d6-384a-426a-8c7f-17738461c327-kube-api-access-kwwtz" (OuterVolumeSpecName: "kube-api-access-kwwtz") pod "1a7784d6-384a-426a-8c7f-17738461c327" (UID: "1a7784d6-384a-426a-8c7f-17738461c327"). InnerVolumeSpecName "kube-api-access-kwwtz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:30:04 crc kubenswrapper[4764]: I0309 13:30:04.240858 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7f815cd5-462f-4994-bab1-beef4157b06e-kube-api-access-wtshq" (OuterVolumeSpecName: "kube-api-access-wtshq") pod "7f815cd5-462f-4994-bab1-beef4157b06e" (UID: "7f815cd5-462f-4994-bab1-beef4157b06e"). InnerVolumeSpecName "kube-api-access-wtshq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:30:04 crc kubenswrapper[4764]: I0309 13:30:04.327965 4764 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1a7784d6-384a-426a-8c7f-17738461c327-config-volume\") on node \"crc\" DevicePath \"\"" Mar 09 13:30:04 crc kubenswrapper[4764]: I0309 13:30:04.328004 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wtshq\" (UniqueName: \"kubernetes.io/projected/7f815cd5-462f-4994-bab1-beef4157b06e-kube-api-access-wtshq\") on node \"crc\" DevicePath \"\"" Mar 09 13:30:04 crc kubenswrapper[4764]: I0309 13:30:04.328015 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kwwtz\" (UniqueName: \"kubernetes.io/projected/1a7784d6-384a-426a-8c7f-17738461c327-kube-api-access-kwwtz\") on node \"crc\" DevicePath \"\"" Mar 09 13:30:04 crc kubenswrapper[4764]: I0309 13:30:04.328024 4764 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1a7784d6-384a-426a-8c7f-17738461c327-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 09 13:30:04 crc kubenswrapper[4764]: I0309 13:30:04.771187 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551050-zmlhn" event={"ID":"7f815cd5-462f-4994-bab1-beef4157b06e","Type":"ContainerDied","Data":"fee757faf62e596d9a24630a4cb2fe2b56b3f58f6cb88a77fee32c0a8f6e491c"} Mar 09 13:30:04 crc kubenswrapper[4764]: I0309 13:30:04.771237 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fee757faf62e596d9a24630a4cb2fe2b56b3f58f6cb88a77fee32c0a8f6e491c" Mar 09 13:30:04 crc kubenswrapper[4764]: I0309 13:30:04.771347 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551050-zmlhn" Mar 09 13:30:04 crc kubenswrapper[4764]: I0309 13:30:04.773546 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29551050-9wvsp" event={"ID":"1a7784d6-384a-426a-8c7f-17738461c327","Type":"ContainerDied","Data":"bd22d72b8374d245cd7927a10057d1dc10b1552a1c9306e29d1f5b840279060d"} Mar 09 13:30:04 crc kubenswrapper[4764]: I0309 13:30:04.773611 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29551050-9wvsp" Mar 09 13:30:04 crc kubenswrapper[4764]: I0309 13:30:04.773640 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bd22d72b8374d245cd7927a10057d1dc10b1552a1c9306e29d1f5b840279060d" Mar 09 13:30:05 crc kubenswrapper[4764]: I0309 13:30:05.177769 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29551044-p748f"] Mar 09 13:30:05 crc kubenswrapper[4764]: I0309 13:30:05.183325 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29551044-p748f"] Mar 09 13:30:05 crc kubenswrapper[4764]: I0309 13:30:05.572349 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0a005f65-920a-4cdd-b4da-a270953113aa" path="/var/lib/kubelet/pods/0a005f65-920a-4cdd-b4da-a270953113aa/volumes" Mar 09 13:30:58 crc kubenswrapper[4764]: I0309 13:30:58.370008 4764 patch_prober.go:28] interesting pod/machine-config-daemon-xxczl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 09 13:30:58 crc kubenswrapper[4764]: I0309 13:30:58.370419 4764 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" 
podUID="6bcdd179-43c2-427c-9fac-7155c122e922" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 09 13:31:28 crc kubenswrapper[4764]: I0309 13:31:28.370636 4764 patch_prober.go:28] interesting pod/machine-config-daemon-xxczl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 09 13:31:28 crc kubenswrapper[4764]: I0309 13:31:28.371756 4764 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 09 13:31:49 crc kubenswrapper[4764]: I0309 13:31:49.333395 4764 scope.go:117] "RemoveContainer" containerID="3ce66a9ae238a55280ff899673763223d028dea40122d545386701984ba576ae" Mar 09 13:31:58 crc kubenswrapper[4764]: I0309 13:31:58.370572 4764 patch_prober.go:28] interesting pod/machine-config-daemon-xxczl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 09 13:31:58 crc kubenswrapper[4764]: I0309 13:31:58.371638 4764 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 09 13:31:58 crc kubenswrapper[4764]: I0309 13:31:58.371767 4764 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" 
status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" Mar 09 13:31:58 crc kubenswrapper[4764]: I0309 13:31:58.372758 4764 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"6be6253041773ccdd98bbdfbbb5e4447fbae62fa99ca2320328889b4671f6c14"} pod="openshift-machine-config-operator/machine-config-daemon-xxczl" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 09 13:31:58 crc kubenswrapper[4764]: I0309 13:31:58.372891 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922" containerName="machine-config-daemon" containerID="cri-o://6be6253041773ccdd98bbdfbbb5e4447fbae62fa99ca2320328889b4671f6c14" gracePeriod=600 Mar 09 13:31:58 crc kubenswrapper[4764]: I0309 13:31:58.585954 4764 generic.go:334] "Generic (PLEG): container finished" podID="6bcdd179-43c2-427c-9fac-7155c122e922" containerID="6be6253041773ccdd98bbdfbbb5e4447fbae62fa99ca2320328889b4671f6c14" exitCode=0 Mar 09 13:31:58 crc kubenswrapper[4764]: I0309 13:31:58.586022 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" event={"ID":"6bcdd179-43c2-427c-9fac-7155c122e922","Type":"ContainerDied","Data":"6be6253041773ccdd98bbdfbbb5e4447fbae62fa99ca2320328889b4671f6c14"} Mar 09 13:31:58 crc kubenswrapper[4764]: I0309 13:31:58.586156 4764 scope.go:117] "RemoveContainer" containerID="8bb39ae24112881ead01c36905f25d69cb36d1e4d3c6e8aa79f283e7dd0d444b" Mar 09 13:31:59 crc kubenswrapper[4764]: I0309 13:31:59.596831 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" 
event={"ID":"6bcdd179-43c2-427c-9fac-7155c122e922","Type":"ContainerStarted","Data":"3e004d46e664a823c675c177e537cdd2b21dcfc6ff9afcb1b390da1939340817"} Mar 09 13:32:00 crc kubenswrapper[4764]: I0309 13:32:00.147724 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29551052-t652n"] Mar 09 13:32:00 crc kubenswrapper[4764]: E0309 13:32:00.148028 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1a7784d6-384a-426a-8c7f-17738461c327" containerName="collect-profiles" Mar 09 13:32:00 crc kubenswrapper[4764]: I0309 13:32:00.148047 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="1a7784d6-384a-426a-8c7f-17738461c327" containerName="collect-profiles" Mar 09 13:32:00 crc kubenswrapper[4764]: E0309 13:32:00.148067 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7f815cd5-462f-4994-bab1-beef4157b06e" containerName="oc" Mar 09 13:32:00 crc kubenswrapper[4764]: I0309 13:32:00.148075 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="7f815cd5-462f-4994-bab1-beef4157b06e" containerName="oc" Mar 09 13:32:00 crc kubenswrapper[4764]: I0309 13:32:00.148219 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="1a7784d6-384a-426a-8c7f-17738461c327" containerName="collect-profiles" Mar 09 13:32:00 crc kubenswrapper[4764]: I0309 13:32:00.148235 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="7f815cd5-462f-4994-bab1-beef4157b06e" containerName="oc" Mar 09 13:32:00 crc kubenswrapper[4764]: I0309 13:32:00.148772 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551052-t652n" Mar 09 13:32:00 crc kubenswrapper[4764]: I0309 13:32:00.151934 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 09 13:32:00 crc kubenswrapper[4764]: I0309 13:32:00.151989 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 09 13:32:00 crc kubenswrapper[4764]: I0309 13:32:00.152086 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-4s7mr" Mar 09 13:32:00 crc kubenswrapper[4764]: I0309 13:32:00.186898 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551052-t652n"] Mar 09 13:32:00 crc kubenswrapper[4764]: I0309 13:32:00.194229 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b66fj\" (UniqueName: \"kubernetes.io/projected/ee50d407-01a6-43e7-833e-b803dbb4792f-kube-api-access-b66fj\") pod \"auto-csr-approver-29551052-t652n\" (UID: \"ee50d407-01a6-43e7-833e-b803dbb4792f\") " pod="openshift-infra/auto-csr-approver-29551052-t652n" Mar 09 13:32:00 crc kubenswrapper[4764]: I0309 13:32:00.295366 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b66fj\" (UniqueName: \"kubernetes.io/projected/ee50d407-01a6-43e7-833e-b803dbb4792f-kube-api-access-b66fj\") pod \"auto-csr-approver-29551052-t652n\" (UID: \"ee50d407-01a6-43e7-833e-b803dbb4792f\") " pod="openshift-infra/auto-csr-approver-29551052-t652n" Mar 09 13:32:00 crc kubenswrapper[4764]: I0309 13:32:00.317070 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b66fj\" (UniqueName: \"kubernetes.io/projected/ee50d407-01a6-43e7-833e-b803dbb4792f-kube-api-access-b66fj\") pod \"auto-csr-approver-29551052-t652n\" (UID: \"ee50d407-01a6-43e7-833e-b803dbb4792f\") " 
pod="openshift-infra/auto-csr-approver-29551052-t652n" Mar 09 13:32:00 crc kubenswrapper[4764]: I0309 13:32:00.490017 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551052-t652n" Mar 09 13:32:00 crc kubenswrapper[4764]: I0309 13:32:00.738677 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551052-t652n"] Mar 09 13:32:01 crc kubenswrapper[4764]: I0309 13:32:01.621488 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551052-t652n" event={"ID":"ee50d407-01a6-43e7-833e-b803dbb4792f","Type":"ContainerStarted","Data":"11bea6b1c69c2fe5f00de917bcf3bd6c5542470358427980ac9d3ff037bf4fb1"} Mar 09 13:32:02 crc kubenswrapper[4764]: I0309 13:32:02.633754 4764 generic.go:334] "Generic (PLEG): container finished" podID="ee50d407-01a6-43e7-833e-b803dbb4792f" containerID="492bc9f6bc85937ff3b8aee6d3f29e3e150f1bc426852bf9cdf6e5878286e321" exitCode=0 Mar 09 13:32:02 crc kubenswrapper[4764]: I0309 13:32:02.633973 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551052-t652n" event={"ID":"ee50d407-01a6-43e7-833e-b803dbb4792f","Type":"ContainerDied","Data":"492bc9f6bc85937ff3b8aee6d3f29e3e150f1bc426852bf9cdf6e5878286e321"} Mar 09 13:32:03 crc kubenswrapper[4764]: I0309 13:32:03.875115 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551052-t652n" Mar 09 13:32:03 crc kubenswrapper[4764]: I0309 13:32:03.946633 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b66fj\" (UniqueName: \"kubernetes.io/projected/ee50d407-01a6-43e7-833e-b803dbb4792f-kube-api-access-b66fj\") pod \"ee50d407-01a6-43e7-833e-b803dbb4792f\" (UID: \"ee50d407-01a6-43e7-833e-b803dbb4792f\") " Mar 09 13:32:03 crc kubenswrapper[4764]: I0309 13:32:03.954162 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ee50d407-01a6-43e7-833e-b803dbb4792f-kube-api-access-b66fj" (OuterVolumeSpecName: "kube-api-access-b66fj") pod "ee50d407-01a6-43e7-833e-b803dbb4792f" (UID: "ee50d407-01a6-43e7-833e-b803dbb4792f"). InnerVolumeSpecName "kube-api-access-b66fj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:32:04 crc kubenswrapper[4764]: I0309 13:32:04.047600 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b66fj\" (UniqueName: \"kubernetes.io/projected/ee50d407-01a6-43e7-833e-b803dbb4792f-kube-api-access-b66fj\") on node \"crc\" DevicePath \"\"" Mar 09 13:32:04 crc kubenswrapper[4764]: I0309 13:32:04.658501 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551052-t652n" event={"ID":"ee50d407-01a6-43e7-833e-b803dbb4792f","Type":"ContainerDied","Data":"11bea6b1c69c2fe5f00de917bcf3bd6c5542470358427980ac9d3ff037bf4fb1"} Mar 09 13:32:04 crc kubenswrapper[4764]: I0309 13:32:04.658869 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="11bea6b1c69c2fe5f00de917bcf3bd6c5542470358427980ac9d3ff037bf4fb1" Mar 09 13:32:04 crc kubenswrapper[4764]: I0309 13:32:04.659273 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551052-t652n" Mar 09 13:32:04 crc kubenswrapper[4764]: I0309 13:32:04.954209 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29551046-ll87d"] Mar 09 13:32:04 crc kubenswrapper[4764]: I0309 13:32:04.958951 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29551046-ll87d"] Mar 09 13:32:05 crc kubenswrapper[4764]: I0309 13:32:05.571135 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c09230f9-b117-44a0-b3ed-ab6dc7ce0285" path="/var/lib/kubelet/pods/c09230f9-b117-44a0-b3ed-ab6dc7ce0285/volumes" Mar 09 13:32:12 crc kubenswrapper[4764]: I0309 13:32:12.163236 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-mfcng"] Mar 09 13:32:12 crc kubenswrapper[4764]: E0309 13:32:12.164467 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee50d407-01a6-43e7-833e-b803dbb4792f" containerName="oc" Mar 09 13:32:12 crc kubenswrapper[4764]: I0309 13:32:12.164484 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee50d407-01a6-43e7-833e-b803dbb4792f" containerName="oc" Mar 09 13:32:12 crc kubenswrapper[4764]: I0309 13:32:12.164615 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="ee50d407-01a6-43e7-833e-b803dbb4792f" containerName="oc" Mar 09 13:32:12 crc kubenswrapper[4764]: I0309 13:32:12.165212 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-mfcng" Mar 09 13:32:12 crc kubenswrapper[4764]: I0309 13:32:12.186850 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-mfcng"] Mar 09 13:32:12 crc kubenswrapper[4764]: I0309 13:32:12.272547 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/d889004a-dd34-46e9-ad61-d5bfb627ca16-registry-tls\") pod \"image-registry-66df7c8f76-mfcng\" (UID: \"d889004a-dd34-46e9-ad61-d5bfb627ca16\") " pod="openshift-image-registry/image-registry-66df7c8f76-mfcng" Mar 09 13:32:12 crc kubenswrapper[4764]: I0309 13:32:12.272620 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-mfcng\" (UID: \"d889004a-dd34-46e9-ad61-d5bfb627ca16\") " pod="openshift-image-registry/image-registry-66df7c8f76-mfcng" Mar 09 13:32:12 crc kubenswrapper[4764]: I0309 13:32:12.272706 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/d889004a-dd34-46e9-ad61-d5bfb627ca16-ca-trust-extracted\") pod \"image-registry-66df7c8f76-mfcng\" (UID: \"d889004a-dd34-46e9-ad61-d5bfb627ca16\") " pod="openshift-image-registry/image-registry-66df7c8f76-mfcng" Mar 09 13:32:12 crc kubenswrapper[4764]: I0309 13:32:12.272738 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pms6q\" (UniqueName: \"kubernetes.io/projected/d889004a-dd34-46e9-ad61-d5bfb627ca16-kube-api-access-pms6q\") pod \"image-registry-66df7c8f76-mfcng\" (UID: \"d889004a-dd34-46e9-ad61-d5bfb627ca16\") " 
pod="openshift-image-registry/image-registry-66df7c8f76-mfcng" Mar 09 13:32:12 crc kubenswrapper[4764]: I0309 13:32:12.272766 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/d889004a-dd34-46e9-ad61-d5bfb627ca16-installation-pull-secrets\") pod \"image-registry-66df7c8f76-mfcng\" (UID: \"d889004a-dd34-46e9-ad61-d5bfb627ca16\") " pod="openshift-image-registry/image-registry-66df7c8f76-mfcng" Mar 09 13:32:12 crc kubenswrapper[4764]: I0309 13:32:12.272790 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d889004a-dd34-46e9-ad61-d5bfb627ca16-trusted-ca\") pod \"image-registry-66df7c8f76-mfcng\" (UID: \"d889004a-dd34-46e9-ad61-d5bfb627ca16\") " pod="openshift-image-registry/image-registry-66df7c8f76-mfcng" Mar 09 13:32:12 crc kubenswrapper[4764]: I0309 13:32:12.272824 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/d889004a-dd34-46e9-ad61-d5bfb627ca16-registry-certificates\") pod \"image-registry-66df7c8f76-mfcng\" (UID: \"d889004a-dd34-46e9-ad61-d5bfb627ca16\") " pod="openshift-image-registry/image-registry-66df7c8f76-mfcng" Mar 09 13:32:12 crc kubenswrapper[4764]: I0309 13:32:12.272930 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d889004a-dd34-46e9-ad61-d5bfb627ca16-bound-sa-token\") pod \"image-registry-66df7c8f76-mfcng\" (UID: \"d889004a-dd34-46e9-ad61-d5bfb627ca16\") " pod="openshift-image-registry/image-registry-66df7c8f76-mfcng" Mar 09 13:32:12 crc kubenswrapper[4764]: I0309 13:32:12.298763 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-mfcng\" (UID: \"d889004a-dd34-46e9-ad61-d5bfb627ca16\") " pod="openshift-image-registry/image-registry-66df7c8f76-mfcng" Mar 09 13:32:12 crc kubenswrapper[4764]: I0309 13:32:12.374114 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/d889004a-dd34-46e9-ad61-d5bfb627ca16-installation-pull-secrets\") pod \"image-registry-66df7c8f76-mfcng\" (UID: \"d889004a-dd34-46e9-ad61-d5bfb627ca16\") " pod="openshift-image-registry/image-registry-66df7c8f76-mfcng" Mar 09 13:32:12 crc kubenswrapper[4764]: I0309 13:32:12.374180 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d889004a-dd34-46e9-ad61-d5bfb627ca16-trusted-ca\") pod \"image-registry-66df7c8f76-mfcng\" (UID: \"d889004a-dd34-46e9-ad61-d5bfb627ca16\") " pod="openshift-image-registry/image-registry-66df7c8f76-mfcng" Mar 09 13:32:12 crc kubenswrapper[4764]: I0309 13:32:12.374222 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/d889004a-dd34-46e9-ad61-d5bfb627ca16-registry-certificates\") pod \"image-registry-66df7c8f76-mfcng\" (UID: \"d889004a-dd34-46e9-ad61-d5bfb627ca16\") " pod="openshift-image-registry/image-registry-66df7c8f76-mfcng" Mar 09 13:32:12 crc kubenswrapper[4764]: I0309 13:32:12.374250 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d889004a-dd34-46e9-ad61-d5bfb627ca16-bound-sa-token\") pod \"image-registry-66df7c8f76-mfcng\" (UID: \"d889004a-dd34-46e9-ad61-d5bfb627ca16\") " pod="openshift-image-registry/image-registry-66df7c8f76-mfcng" Mar 09 13:32:12 crc kubenswrapper[4764]: I0309 13:32:12.374293 4764 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/d889004a-dd34-46e9-ad61-d5bfb627ca16-registry-tls\") pod \"image-registry-66df7c8f76-mfcng\" (UID: \"d889004a-dd34-46e9-ad61-d5bfb627ca16\") " pod="openshift-image-registry/image-registry-66df7c8f76-mfcng" Mar 09 13:32:12 crc kubenswrapper[4764]: I0309 13:32:12.374343 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/d889004a-dd34-46e9-ad61-d5bfb627ca16-ca-trust-extracted\") pod \"image-registry-66df7c8f76-mfcng\" (UID: \"d889004a-dd34-46e9-ad61-d5bfb627ca16\") " pod="openshift-image-registry/image-registry-66df7c8f76-mfcng" Mar 09 13:32:12 crc kubenswrapper[4764]: I0309 13:32:12.374379 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pms6q\" (UniqueName: \"kubernetes.io/projected/d889004a-dd34-46e9-ad61-d5bfb627ca16-kube-api-access-pms6q\") pod \"image-registry-66df7c8f76-mfcng\" (UID: \"d889004a-dd34-46e9-ad61-d5bfb627ca16\") " pod="openshift-image-registry/image-registry-66df7c8f76-mfcng" Mar 09 13:32:12 crc kubenswrapper[4764]: I0309 13:32:12.375736 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/d889004a-dd34-46e9-ad61-d5bfb627ca16-ca-trust-extracted\") pod \"image-registry-66df7c8f76-mfcng\" (UID: \"d889004a-dd34-46e9-ad61-d5bfb627ca16\") " pod="openshift-image-registry/image-registry-66df7c8f76-mfcng" Mar 09 13:32:12 crc kubenswrapper[4764]: I0309 13:32:12.376243 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/d889004a-dd34-46e9-ad61-d5bfb627ca16-registry-certificates\") pod \"image-registry-66df7c8f76-mfcng\" (UID: \"d889004a-dd34-46e9-ad61-d5bfb627ca16\") " 
pod="openshift-image-registry/image-registry-66df7c8f76-mfcng" Mar 09 13:32:12 crc kubenswrapper[4764]: I0309 13:32:12.376460 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d889004a-dd34-46e9-ad61-d5bfb627ca16-trusted-ca\") pod \"image-registry-66df7c8f76-mfcng\" (UID: \"d889004a-dd34-46e9-ad61-d5bfb627ca16\") " pod="openshift-image-registry/image-registry-66df7c8f76-mfcng" Mar 09 13:32:12 crc kubenswrapper[4764]: I0309 13:32:12.380980 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/d889004a-dd34-46e9-ad61-d5bfb627ca16-installation-pull-secrets\") pod \"image-registry-66df7c8f76-mfcng\" (UID: \"d889004a-dd34-46e9-ad61-d5bfb627ca16\") " pod="openshift-image-registry/image-registry-66df7c8f76-mfcng" Mar 09 13:32:12 crc kubenswrapper[4764]: I0309 13:32:12.384741 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/d889004a-dd34-46e9-ad61-d5bfb627ca16-registry-tls\") pod \"image-registry-66df7c8f76-mfcng\" (UID: \"d889004a-dd34-46e9-ad61-d5bfb627ca16\") " pod="openshift-image-registry/image-registry-66df7c8f76-mfcng" Mar 09 13:32:12 crc kubenswrapper[4764]: I0309 13:32:12.391986 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d889004a-dd34-46e9-ad61-d5bfb627ca16-bound-sa-token\") pod \"image-registry-66df7c8f76-mfcng\" (UID: \"d889004a-dd34-46e9-ad61-d5bfb627ca16\") " pod="openshift-image-registry/image-registry-66df7c8f76-mfcng" Mar 09 13:32:12 crc kubenswrapper[4764]: I0309 13:32:12.392279 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pms6q\" (UniqueName: \"kubernetes.io/projected/d889004a-dd34-46e9-ad61-d5bfb627ca16-kube-api-access-pms6q\") pod \"image-registry-66df7c8f76-mfcng\" (UID: 
\"d889004a-dd34-46e9-ad61-d5bfb627ca16\") " pod="openshift-image-registry/image-registry-66df7c8f76-mfcng" Mar 09 13:32:12 crc kubenswrapper[4764]: I0309 13:32:12.482663 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-mfcng" Mar 09 13:32:12 crc kubenswrapper[4764]: I0309 13:32:12.702398 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-mfcng"] Mar 09 13:32:12 crc kubenswrapper[4764]: W0309 13:32:12.707287 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd889004a_dd34_46e9_ad61_d5bfb627ca16.slice/crio-7985ecb6085e5782c562962cad926d710f5b5e1293068580f4a4447c44f61bce WatchSource:0}: Error finding container 7985ecb6085e5782c562962cad926d710f5b5e1293068580f4a4447c44f61bce: Status 404 returned error can't find the container with id 7985ecb6085e5782c562962cad926d710f5b5e1293068580f4a4447c44f61bce Mar 09 13:32:12 crc kubenswrapper[4764]: I0309 13:32:12.718185 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-mfcng" event={"ID":"d889004a-dd34-46e9-ad61-d5bfb627ca16","Type":"ContainerStarted","Data":"7985ecb6085e5782c562962cad926d710f5b5e1293068580f4a4447c44f61bce"} Mar 09 13:32:13 crc kubenswrapper[4764]: I0309 13:32:13.727862 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-mfcng" event={"ID":"d889004a-dd34-46e9-ad61-d5bfb627ca16","Type":"ContainerStarted","Data":"724b8184d1dc96d6b624e07c336ea0f919ff1107762c45217d5416879d550b7d"} Mar 09 13:32:13 crc kubenswrapper[4764]: I0309 13:32:13.728446 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-mfcng" Mar 09 13:32:13 crc kubenswrapper[4764]: I0309 13:32:13.759500 4764 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-mfcng" podStartSLOduration=1.759465625 podStartE2EDuration="1.759465625s" podCreationTimestamp="2026-03-09 13:32:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:32:13.755816597 +0000 UTC m=+689.005988585" watchObservedRunningTime="2026-03-09 13:32:13.759465625 +0000 UTC m=+689.009637583" Mar 09 13:32:32 crc kubenswrapper[4764]: I0309 13:32:32.487868 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-mfcng" Mar 09 13:32:32 crc kubenswrapper[4764]: I0309 13:32:32.562898 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-w49hn"] Mar 09 13:32:49 crc kubenswrapper[4764]: I0309 13:32:49.409103 4764 scope.go:117] "RemoveContainer" containerID="bde3db10cbf6b95804c8c69e0b70c34f476ab92c998e4a7ae6079e0e0e9d7b98" Mar 09 13:32:57 crc kubenswrapper[4764]: I0309 13:32:57.627671 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-w49hn" podUID="d3652fe0-4889-432f-af3f-787dd19c60d6" containerName="registry" containerID="cri-o://d4f1dae0a1e050bab0d3f5cbb5a68f2b2bcb5dd4773d77210e4e118d9cdca6b4" gracePeriod=30 Mar 09 13:32:57 crc kubenswrapper[4764]: I0309 13:32:57.981714 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-w49hn" Mar 09 13:32:58 crc kubenswrapper[4764]: I0309 13:32:58.049416 4764 generic.go:334] "Generic (PLEG): container finished" podID="d3652fe0-4889-432f-af3f-787dd19c60d6" containerID="d4f1dae0a1e050bab0d3f5cbb5a68f2b2bcb5dd4773d77210e4e118d9cdca6b4" exitCode=0 Mar 09 13:32:58 crc kubenswrapper[4764]: I0309 13:32:58.049452 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-w49hn" event={"ID":"d3652fe0-4889-432f-af3f-787dd19c60d6","Type":"ContainerDied","Data":"d4f1dae0a1e050bab0d3f5cbb5a68f2b2bcb5dd4773d77210e4e118d9cdca6b4"} Mar 09 13:32:58 crc kubenswrapper[4764]: I0309 13:32:58.049481 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-w49hn" Mar 09 13:32:58 crc kubenswrapper[4764]: I0309 13:32:58.049499 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-w49hn" event={"ID":"d3652fe0-4889-432f-af3f-787dd19c60d6","Type":"ContainerDied","Data":"f63eb7186d86d6bf6062656b177ec8adc1e47100bcb490f0e226ebe524f4ea53"} Mar 09 13:32:58 crc kubenswrapper[4764]: I0309 13:32:58.049514 4764 scope.go:117] "RemoveContainer" containerID="d4f1dae0a1e050bab0d3f5cbb5a68f2b2bcb5dd4773d77210e4e118d9cdca6b4" Mar 09 13:32:58 crc kubenswrapper[4764]: I0309 13:32:58.066917 4764 scope.go:117] "RemoveContainer" containerID="d4f1dae0a1e050bab0d3f5cbb5a68f2b2bcb5dd4773d77210e4e118d9cdca6b4" Mar 09 13:32:58 crc kubenswrapper[4764]: E0309 13:32:58.067268 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d4f1dae0a1e050bab0d3f5cbb5a68f2b2bcb5dd4773d77210e4e118d9cdca6b4\": container with ID starting with d4f1dae0a1e050bab0d3f5cbb5a68f2b2bcb5dd4773d77210e4e118d9cdca6b4 not found: ID does not exist" 
containerID="d4f1dae0a1e050bab0d3f5cbb5a68f2b2bcb5dd4773d77210e4e118d9cdca6b4" Mar 09 13:32:58 crc kubenswrapper[4764]: I0309 13:32:58.067311 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d4f1dae0a1e050bab0d3f5cbb5a68f2b2bcb5dd4773d77210e4e118d9cdca6b4"} err="failed to get container status \"d4f1dae0a1e050bab0d3f5cbb5a68f2b2bcb5dd4773d77210e4e118d9cdca6b4\": rpc error: code = NotFound desc = could not find container \"d4f1dae0a1e050bab0d3f5cbb5a68f2b2bcb5dd4773d77210e4e118d9cdca6b4\": container with ID starting with d4f1dae0a1e050bab0d3f5cbb5a68f2b2bcb5dd4773d77210e4e118d9cdca6b4 not found: ID does not exist" Mar 09 13:32:58 crc kubenswrapper[4764]: I0309 13:32:58.137485 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d3652fe0-4889-432f-af3f-787dd19c60d6-bound-sa-token\") pod \"d3652fe0-4889-432f-af3f-787dd19c60d6\" (UID: \"d3652fe0-4889-432f-af3f-787dd19c60d6\") " Mar 09 13:32:58 crc kubenswrapper[4764]: I0309 13:32:58.137889 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/d3652fe0-4889-432f-af3f-787dd19c60d6-ca-trust-extracted\") pod \"d3652fe0-4889-432f-af3f-787dd19c60d6\" (UID: \"d3652fe0-4889-432f-af3f-787dd19c60d6\") " Mar 09 13:32:58 crc kubenswrapper[4764]: I0309 13:32:58.138053 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"d3652fe0-4889-432f-af3f-787dd19c60d6\" (UID: \"d3652fe0-4889-432f-af3f-787dd19c60d6\") " Mar 09 13:32:58 crc kubenswrapper[4764]: I0309 13:32:58.138081 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: 
\"kubernetes.io/configmap/d3652fe0-4889-432f-af3f-787dd19c60d6-registry-certificates\") pod \"d3652fe0-4889-432f-af3f-787dd19c60d6\" (UID: \"d3652fe0-4889-432f-af3f-787dd19c60d6\") " Mar 09 13:32:58 crc kubenswrapper[4764]: I0309 13:32:58.138185 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d3652fe0-4889-432f-af3f-787dd19c60d6-trusted-ca\") pod \"d3652fe0-4889-432f-af3f-787dd19c60d6\" (UID: \"d3652fe0-4889-432f-af3f-787dd19c60d6\") " Mar 09 13:32:58 crc kubenswrapper[4764]: I0309 13:32:58.138215 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/d3652fe0-4889-432f-af3f-787dd19c60d6-installation-pull-secrets\") pod \"d3652fe0-4889-432f-af3f-787dd19c60d6\" (UID: \"d3652fe0-4889-432f-af3f-787dd19c60d6\") " Mar 09 13:32:58 crc kubenswrapper[4764]: I0309 13:32:58.138234 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/d3652fe0-4889-432f-af3f-787dd19c60d6-registry-tls\") pod \"d3652fe0-4889-432f-af3f-787dd19c60d6\" (UID: \"d3652fe0-4889-432f-af3f-787dd19c60d6\") " Mar 09 13:32:58 crc kubenswrapper[4764]: I0309 13:32:58.138280 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xsz44\" (UniqueName: \"kubernetes.io/projected/d3652fe0-4889-432f-af3f-787dd19c60d6-kube-api-access-xsz44\") pod \"d3652fe0-4889-432f-af3f-787dd19c60d6\" (UID: \"d3652fe0-4889-432f-af3f-787dd19c60d6\") " Mar 09 13:32:58 crc kubenswrapper[4764]: I0309 13:32:58.138864 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d3652fe0-4889-432f-af3f-787dd19c60d6-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "d3652fe0-4889-432f-af3f-787dd19c60d6" (UID: "d3652fe0-4889-432f-af3f-787dd19c60d6"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:32:58 crc kubenswrapper[4764]: I0309 13:32:58.139183 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d3652fe0-4889-432f-af3f-787dd19c60d6-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "d3652fe0-4889-432f-af3f-787dd19c60d6" (UID: "d3652fe0-4889-432f-af3f-787dd19c60d6"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:32:58 crc kubenswrapper[4764]: I0309 13:32:58.144487 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d3652fe0-4889-432f-af3f-787dd19c60d6-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "d3652fe0-4889-432f-af3f-787dd19c60d6" (UID: "d3652fe0-4889-432f-af3f-787dd19c60d6"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:32:58 crc kubenswrapper[4764]: I0309 13:32:58.144513 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d3652fe0-4889-432f-af3f-787dd19c60d6-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "d3652fe0-4889-432f-af3f-787dd19c60d6" (UID: "d3652fe0-4889-432f-af3f-787dd19c60d6"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:32:58 crc kubenswrapper[4764]: I0309 13:32:58.144570 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d3652fe0-4889-432f-af3f-787dd19c60d6-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "d3652fe0-4889-432f-af3f-787dd19c60d6" (UID: "d3652fe0-4889-432f-af3f-787dd19c60d6"). InnerVolumeSpecName "installation-pull-secrets". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:32:58 crc kubenswrapper[4764]: I0309 13:32:58.144854 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d3652fe0-4889-432f-af3f-787dd19c60d6-kube-api-access-xsz44" (OuterVolumeSpecName: "kube-api-access-xsz44") pod "d3652fe0-4889-432f-af3f-787dd19c60d6" (UID: "d3652fe0-4889-432f-af3f-787dd19c60d6"). InnerVolumeSpecName "kube-api-access-xsz44". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:32:58 crc kubenswrapper[4764]: I0309 13:32:58.151631 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "d3652fe0-4889-432f-af3f-787dd19c60d6" (UID: "d3652fe0-4889-432f-af3f-787dd19c60d6"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Mar 09 13:32:58 crc kubenswrapper[4764]: I0309 13:32:58.155599 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d3652fe0-4889-432f-af3f-787dd19c60d6-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "d3652fe0-4889-432f-af3f-787dd19c60d6" (UID: "d3652fe0-4889-432f-af3f-787dd19c60d6"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 13:32:58 crc kubenswrapper[4764]: I0309 13:32:58.239446 4764 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/d3652fe0-4889-432f-af3f-787dd19c60d6-registry-tls\") on node \"crc\" DevicePath \"\"" Mar 09 13:32:58 crc kubenswrapper[4764]: I0309 13:32:58.239485 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xsz44\" (UniqueName: \"kubernetes.io/projected/d3652fe0-4889-432f-af3f-787dd19c60d6-kube-api-access-xsz44\") on node \"crc\" DevicePath \"\"" Mar 09 13:32:58 crc kubenswrapper[4764]: I0309 13:32:58.239497 4764 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d3652fe0-4889-432f-af3f-787dd19c60d6-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 09 13:32:58 crc kubenswrapper[4764]: I0309 13:32:58.239506 4764 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/d3652fe0-4889-432f-af3f-787dd19c60d6-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Mar 09 13:32:58 crc kubenswrapper[4764]: I0309 13:32:58.239515 4764 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/d3652fe0-4889-432f-af3f-787dd19c60d6-registry-certificates\") on node \"crc\" DevicePath \"\"" Mar 09 13:32:58 crc kubenswrapper[4764]: I0309 13:32:58.239523 4764 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d3652fe0-4889-432f-af3f-787dd19c60d6-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 09 13:32:58 crc kubenswrapper[4764]: I0309 13:32:58.239531 4764 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/d3652fe0-4889-432f-af3f-787dd19c60d6-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Mar 09 13:32:58 crc 
kubenswrapper[4764]: I0309 13:32:58.378438 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-w49hn"] Mar 09 13:32:58 crc kubenswrapper[4764]: I0309 13:32:58.382296 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-w49hn"] Mar 09 13:32:59 crc kubenswrapper[4764]: I0309 13:32:59.568177 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d3652fe0-4889-432f-af3f-787dd19c60d6" path="/var/lib/kubelet/pods/d3652fe0-4889-432f-af3f-787dd19c60d6/volumes" Mar 09 13:33:34 crc kubenswrapper[4764]: I0309 13:33:34.150771 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-qr7fv"] Mar 09 13:33:34 crc kubenswrapper[4764]: E0309 13:33:34.151631 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d3652fe0-4889-432f-af3f-787dd19c60d6" containerName="registry" Mar 09 13:33:34 crc kubenswrapper[4764]: I0309 13:33:34.151660 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3652fe0-4889-432f-af3f-787dd19c60d6" containerName="registry" Mar 09 13:33:34 crc kubenswrapper[4764]: I0309 13:33:34.151753 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="d3652fe0-4889-432f-af3f-787dd19c60d6" containerName="registry" Mar 09 13:33:34 crc kubenswrapper[4764]: I0309 13:33:34.152170 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-cainjector-cf98fcc89-qr7fv" Mar 09 13:33:34 crc kubenswrapper[4764]: I0309 13:33:34.154413 4764 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-24l57" Mar 09 13:33:34 crc kubenswrapper[4764]: I0309 13:33:34.154417 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt" Mar 09 13:33:34 crc kubenswrapper[4764]: I0309 13:33:34.158500 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-qr7fv"] Mar 09 13:33:34 crc kubenswrapper[4764]: I0309 13:33:34.158636 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt" Mar 09 13:33:34 crc kubenswrapper[4764]: I0309 13:33:34.170706 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-j8rlp"] Mar 09 13:33:34 crc kubenswrapper[4764]: I0309 13:33:34.171471 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-687f57d79b-j8rlp" Mar 09 13:33:34 crc kubenswrapper[4764]: I0309 13:33:34.173301 4764 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-4tlmc" Mar 09 13:33:34 crc kubenswrapper[4764]: I0309 13:33:34.177004 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-858654f9db-lhpfw"] Mar 09 13:33:34 crc kubenswrapper[4764]: I0309 13:33:34.177887 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-858654f9db-lhpfw" Mar 09 13:33:34 crc kubenswrapper[4764]: I0309 13:33:34.179953 4764 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-q2chc" Mar 09 13:33:34 crc kubenswrapper[4764]: I0309 13:33:34.189467 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-858654f9db-lhpfw"] Mar 09 13:33:34 crc kubenswrapper[4764]: I0309 13:33:34.198754 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-j8rlp"] Mar 09 13:33:34 crc kubenswrapper[4764]: I0309 13:33:34.349274 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fxtwr\" (UniqueName: \"kubernetes.io/projected/09aeffa2-590d-4062-95ff-40dbdda54df7-kube-api-access-fxtwr\") pod \"cert-manager-webhook-687f57d79b-j8rlp\" (UID: \"09aeffa2-590d-4062-95ff-40dbdda54df7\") " pod="cert-manager/cert-manager-webhook-687f57d79b-j8rlp" Mar 09 13:33:34 crc kubenswrapper[4764]: I0309 13:33:34.349423 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jcdfn\" (UniqueName: \"kubernetes.io/projected/0a35f012-3965-4680-aa01-9fa97f956c68-kube-api-access-jcdfn\") pod \"cert-manager-cainjector-cf98fcc89-qr7fv\" (UID: \"0a35f012-3965-4680-aa01-9fa97f956c68\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-qr7fv" Mar 09 13:33:34 crc kubenswrapper[4764]: I0309 13:33:34.349735 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qfg2w\" (UniqueName: \"kubernetes.io/projected/2eef62f2-5973-47e2-b921-9e1a05b9f8fb-kube-api-access-qfg2w\") pod \"cert-manager-858654f9db-lhpfw\" (UID: \"2eef62f2-5973-47e2-b921-9e1a05b9f8fb\") " pod="cert-manager/cert-manager-858654f9db-lhpfw" Mar 09 13:33:34 crc kubenswrapper[4764]: I0309 13:33:34.450746 4764 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qfg2w\" (UniqueName: \"kubernetes.io/projected/2eef62f2-5973-47e2-b921-9e1a05b9f8fb-kube-api-access-qfg2w\") pod \"cert-manager-858654f9db-lhpfw\" (UID: \"2eef62f2-5973-47e2-b921-9e1a05b9f8fb\") " pod="cert-manager/cert-manager-858654f9db-lhpfw" Mar 09 13:33:34 crc kubenswrapper[4764]: I0309 13:33:34.450825 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fxtwr\" (UniqueName: \"kubernetes.io/projected/09aeffa2-590d-4062-95ff-40dbdda54df7-kube-api-access-fxtwr\") pod \"cert-manager-webhook-687f57d79b-j8rlp\" (UID: \"09aeffa2-590d-4062-95ff-40dbdda54df7\") " pod="cert-manager/cert-manager-webhook-687f57d79b-j8rlp" Mar 09 13:33:34 crc kubenswrapper[4764]: I0309 13:33:34.450895 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jcdfn\" (UniqueName: \"kubernetes.io/projected/0a35f012-3965-4680-aa01-9fa97f956c68-kube-api-access-jcdfn\") pod \"cert-manager-cainjector-cf98fcc89-qr7fv\" (UID: \"0a35f012-3965-4680-aa01-9fa97f956c68\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-qr7fv" Mar 09 13:33:34 crc kubenswrapper[4764]: I0309 13:33:34.472601 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qfg2w\" (UniqueName: \"kubernetes.io/projected/2eef62f2-5973-47e2-b921-9e1a05b9f8fb-kube-api-access-qfg2w\") pod \"cert-manager-858654f9db-lhpfw\" (UID: \"2eef62f2-5973-47e2-b921-9e1a05b9f8fb\") " pod="cert-manager/cert-manager-858654f9db-lhpfw" Mar 09 13:33:34 crc kubenswrapper[4764]: I0309 13:33:34.474022 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jcdfn\" (UniqueName: \"kubernetes.io/projected/0a35f012-3965-4680-aa01-9fa97f956c68-kube-api-access-jcdfn\") pod \"cert-manager-cainjector-cf98fcc89-qr7fv\" (UID: \"0a35f012-3965-4680-aa01-9fa97f956c68\") " 
pod="cert-manager/cert-manager-cainjector-cf98fcc89-qr7fv" Mar 09 13:33:34 crc kubenswrapper[4764]: I0309 13:33:34.474833 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fxtwr\" (UniqueName: \"kubernetes.io/projected/09aeffa2-590d-4062-95ff-40dbdda54df7-kube-api-access-fxtwr\") pod \"cert-manager-webhook-687f57d79b-j8rlp\" (UID: \"09aeffa2-590d-4062-95ff-40dbdda54df7\") " pod="cert-manager/cert-manager-webhook-687f57d79b-j8rlp" Mar 09 13:33:34 crc kubenswrapper[4764]: I0309 13:33:34.489513 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-687f57d79b-j8rlp" Mar 09 13:33:34 crc kubenswrapper[4764]: I0309 13:33:34.499635 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-858654f9db-lhpfw" Mar 09 13:33:34 crc kubenswrapper[4764]: I0309 13:33:34.770025 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-cf98fcc89-qr7fv" Mar 09 13:33:34 crc kubenswrapper[4764]: I0309 13:33:34.914219 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-j8rlp"] Mar 09 13:33:34 crc kubenswrapper[4764]: I0309 13:33:34.950772 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-858654f9db-lhpfw"] Mar 09 13:33:35 crc kubenswrapper[4764]: I0309 13:33:35.008767 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-qr7fv"] Mar 09 13:33:35 crc kubenswrapper[4764]: W0309 13:33:35.009525 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0a35f012_3965_4680_aa01_9fa97f956c68.slice/crio-d1cb5ec4000d3723379906e3955229a2d55d6879e00779e674670085fb728cc2 WatchSource:0}: Error finding container d1cb5ec4000d3723379906e3955229a2d55d6879e00779e674670085fb728cc2: 
Status 404 returned error can't find the container with id d1cb5ec4000d3723379906e3955229a2d55d6879e00779e674670085fb728cc2 Mar 09 13:33:35 crc kubenswrapper[4764]: I0309 13:33:35.266593 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-687f57d79b-j8rlp" event={"ID":"09aeffa2-590d-4062-95ff-40dbdda54df7","Type":"ContainerStarted","Data":"d4978b2c656585075e63b58f95bc18a19903727c4830d4aebbc64b22f9312a79"} Mar 09 13:33:35 crc kubenswrapper[4764]: I0309 13:33:35.267728 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-858654f9db-lhpfw" event={"ID":"2eef62f2-5973-47e2-b921-9e1a05b9f8fb","Type":"ContainerStarted","Data":"5011261d07ae7d19f0c1eb436f07682a98b94db716205069e7f407639312cbfa"} Mar 09 13:33:35 crc kubenswrapper[4764]: I0309 13:33:35.269345 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-cf98fcc89-qr7fv" event={"ID":"0a35f012-3965-4680-aa01-9fa97f956c68","Type":"ContainerStarted","Data":"d1cb5ec4000d3723379906e3955229a2d55d6879e00779e674670085fb728cc2"} Mar 09 13:33:39 crc kubenswrapper[4764]: I0309 13:33:39.294071 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-cf98fcc89-qr7fv" event={"ID":"0a35f012-3965-4680-aa01-9fa97f956c68","Type":"ContainerStarted","Data":"afc8b1247bf6e0d5f3094f3398496746d4f571421819ba50153ad48035c09c70"} Mar 09 13:33:39 crc kubenswrapper[4764]: I0309 13:33:39.295922 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-687f57d79b-j8rlp" event={"ID":"09aeffa2-590d-4062-95ff-40dbdda54df7","Type":"ContainerStarted","Data":"612f15629d8e9cf120f39849624aa663b8348f7e24b5bc70f3853c80c8ccad71"} Mar 09 13:33:39 crc kubenswrapper[4764]: I0309 13:33:39.296077 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-687f57d79b-j8rlp" Mar 09 13:33:39 crc kubenswrapper[4764]: I0309 
13:33:39.297217 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-858654f9db-lhpfw" event={"ID":"2eef62f2-5973-47e2-b921-9e1a05b9f8fb","Type":"ContainerStarted","Data":"960fe3c14eef15d17901031ba54697393727791b60a66308cbbf53854e95f0df"} Mar 09 13:33:39 crc kubenswrapper[4764]: I0309 13:33:39.314122 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-cf98fcc89-qr7fv" podStartSLOduration=1.5016224729999998 podStartE2EDuration="5.314100875s" podCreationTimestamp="2026-03-09 13:33:34 +0000 UTC" firstStartedPulling="2026-03-09 13:33:35.012464646 +0000 UTC m=+770.262636554" lastFinishedPulling="2026-03-09 13:33:38.824943048 +0000 UTC m=+774.075114956" observedRunningTime="2026-03-09 13:33:39.311572397 +0000 UTC m=+774.561744315" watchObservedRunningTime="2026-03-09 13:33:39.314100875 +0000 UTC m=+774.564272783" Mar 09 13:33:39 crc kubenswrapper[4764]: I0309 13:33:39.327857 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-687f57d79b-j8rlp" podStartSLOduration=1.433348928 podStartE2EDuration="5.327832904s" podCreationTimestamp="2026-03-09 13:33:34 +0000 UTC" firstStartedPulling="2026-03-09 13:33:34.927483332 +0000 UTC m=+770.177655240" lastFinishedPulling="2026-03-09 13:33:38.821967308 +0000 UTC m=+774.072139216" observedRunningTime="2026-03-09 13:33:39.327519655 +0000 UTC m=+774.577691553" watchObservedRunningTime="2026-03-09 13:33:39.327832904 +0000 UTC m=+774.578004822" Mar 09 13:33:39 crc kubenswrapper[4764]: I0309 13:33:39.350307 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-858654f9db-lhpfw" podStartSLOduration=1.423264047 podStartE2EDuration="5.350287667s" podCreationTimestamp="2026-03-09 13:33:34 +0000 UTC" firstStartedPulling="2026-03-09 13:33:34.961290791 +0000 UTC m=+770.211462699" lastFinishedPulling="2026-03-09 13:33:38.888314411 +0000 UTC 
m=+774.138486319" observedRunningTime="2026-03-09 13:33:39.349224819 +0000 UTC m=+774.599396727" watchObservedRunningTime="2026-03-09 13:33:39.350287667 +0000 UTC m=+774.600459575" Mar 09 13:33:44 crc kubenswrapper[4764]: I0309 13:33:44.492193 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-687f57d79b-j8rlp" Mar 09 13:33:58 crc kubenswrapper[4764]: I0309 13:33:58.370363 4764 patch_prober.go:28] interesting pod/machine-config-daemon-xxczl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 09 13:33:58 crc kubenswrapper[4764]: I0309 13:33:58.371186 4764 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 09 13:33:58 crc kubenswrapper[4764]: I0309 13:33:58.492091 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-7kggv"] Mar 09 13:33:58 crc kubenswrapper[4764]: I0309 13:33:58.492632 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-7kggv" podUID="b8ccb4f5-550a-41b2-b39d-201cdd5d902a" containerName="northd" containerID="cri-o://20f0f31508039b62436cf5b317419ffc5ca18d03050bfa830060f40e12684519" gracePeriod=30 Mar 09 13:33:58 crc kubenswrapper[4764]: I0309 13:33:58.492638 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-7kggv" podUID="b8ccb4f5-550a-41b2-b39d-201cdd5d902a" containerName="sbdb" containerID="cri-o://df799033cab3d457eef490d885e4440024ba2c7daa430d489ee9907d908d1017" 
gracePeriod=30 Mar 09 13:33:58 crc kubenswrapper[4764]: I0309 13:33:58.492577 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-7kggv" podUID="b8ccb4f5-550a-41b2-b39d-201cdd5d902a" containerName="nbdb" containerID="cri-o://6859bb282db6608def0d7983e715cf3f5f5f454bf896bb21dca124a075e5701d" gracePeriod=30 Mar 09 13:33:58 crc kubenswrapper[4764]: I0309 13:33:58.492818 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-7kggv" podUID="b8ccb4f5-550a-41b2-b39d-201cdd5d902a" containerName="kube-rbac-proxy-node" containerID="cri-o://9713dea51462b7a0ce8cb68a64d8707b76dfca5f8c770cac3b54339d538dec01" gracePeriod=30 Mar 09 13:33:58 crc kubenswrapper[4764]: I0309 13:33:58.492687 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-7kggv" podUID="b8ccb4f5-550a-41b2-b39d-201cdd5d902a" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://21feffb4bef29e65989455ec8c5a040a53de90f01ede1a9f1848a67f64a35d29" gracePeriod=30 Mar 09 13:33:58 crc kubenswrapper[4764]: I0309 13:33:58.492736 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-7kggv" podUID="b8ccb4f5-550a-41b2-b39d-201cdd5d902a" containerName="ovn-controller" containerID="cri-o://9ad4c0df0c830a9ef102a630db33f055925c0b9b1734a792be013f8e261523c4" gracePeriod=30 Mar 09 13:33:58 crc kubenswrapper[4764]: I0309 13:33:58.492674 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-7kggv" podUID="b8ccb4f5-550a-41b2-b39d-201cdd5d902a" containerName="ovn-acl-logging" containerID="cri-o://8ceb1c19d0ff9c03c8790b54d5dc48bafd77f6c289fc6225a88d6bfb8e04a8eb" gracePeriod=30 Mar 09 13:33:58 crc kubenswrapper[4764]: I0309 13:33:58.536406 4764 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-ovn-kubernetes/ovnkube-node-7kggv" podUID="b8ccb4f5-550a-41b2-b39d-201cdd5d902a" containerName="ovnkube-controller" containerID="cri-o://00007c279b62a4251162cd96ec8fdba9360c1525e9c8d4622efd50b129e51d29" gracePeriod=30 Mar 09 13:33:58 crc kubenswrapper[4764]: E0309 13:33:58.722673 4764 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 6859bb282db6608def0d7983e715cf3f5f5f454bf896bb21dca124a075e5701d is running failed: container process not found" containerID="6859bb282db6608def0d7983e715cf3f5f5f454bf896bb21dca124a075e5701d" cmd=["/bin/bash","-c","set -xeo pipefail\n. /ovnkube-lib/ovnkube-lib.sh || exit 1\novndb-readiness-probe \"nb\"\n"] Mar 09 13:33:58 crc kubenswrapper[4764]: E0309 13:33:58.722745 4764 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 00007c279b62a4251162cd96ec8fdba9360c1525e9c8d4622efd50b129e51d29 is running failed: container process not found" containerID="00007c279b62a4251162cd96ec8fdba9360c1525e9c8d4622efd50b129e51d29" cmd=["/bin/bash","-c","#!/bin/bash\ntest -f /etc/cni/net.d/10-ovn-kubernetes.conf\n"] Mar 09 13:33:58 crc kubenswrapper[4764]: E0309 13:33:58.722814 4764 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of df799033cab3d457eef490d885e4440024ba2c7daa430d489ee9907d908d1017 is running failed: container process not found" containerID="df799033cab3d457eef490d885e4440024ba2c7daa430d489ee9907d908d1017" cmd=["/bin/bash","-c","set -xeo pipefail\n. 
/ovnkube-lib/ovnkube-lib.sh || exit 1\novndb-readiness-probe \"sb\"\n"] Mar 09 13:33:58 crc kubenswrapper[4764]: E0309 13:33:58.723532 4764 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 00007c279b62a4251162cd96ec8fdba9360c1525e9c8d4622efd50b129e51d29 is running failed: container process not found" containerID="00007c279b62a4251162cd96ec8fdba9360c1525e9c8d4622efd50b129e51d29" cmd=["/bin/bash","-c","#!/bin/bash\ntest -f /etc/cni/net.d/10-ovn-kubernetes.conf\n"] Mar 09 13:33:58 crc kubenswrapper[4764]: E0309 13:33:58.723594 4764 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of df799033cab3d457eef490d885e4440024ba2c7daa430d489ee9907d908d1017 is running failed: container process not found" containerID="df799033cab3d457eef490d885e4440024ba2c7daa430d489ee9907d908d1017" cmd=["/bin/bash","-c","set -xeo pipefail\n. /ovnkube-lib/ovnkube-lib.sh || exit 1\novndb-readiness-probe \"sb\"\n"] Mar 09 13:33:58 crc kubenswrapper[4764]: E0309 13:33:58.723665 4764 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 6859bb282db6608def0d7983e715cf3f5f5f454bf896bb21dca124a075e5701d is running failed: container process not found" containerID="6859bb282db6608def0d7983e715cf3f5f5f454bf896bb21dca124a075e5701d" cmd=["/bin/bash","-c","set -xeo pipefail\n. 
/ovnkube-lib/ovnkube-lib.sh || exit 1\novndb-readiness-probe \"nb\"\n"] Mar 09 13:33:58 crc kubenswrapper[4764]: E0309 13:33:58.727150 4764 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of df799033cab3d457eef490d885e4440024ba2c7daa430d489ee9907d908d1017 is running failed: container process not found" containerID="df799033cab3d457eef490d885e4440024ba2c7daa430d489ee9907d908d1017" cmd=["/bin/bash","-c","set -xeo pipefail\n. /ovnkube-lib/ovnkube-lib.sh || exit 1\novndb-readiness-probe \"sb\"\n"] Mar 09 13:33:58 crc kubenswrapper[4764]: E0309 13:33:58.727173 4764 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 6859bb282db6608def0d7983e715cf3f5f5f454bf896bb21dca124a075e5701d is running failed: container process not found" containerID="6859bb282db6608def0d7983e715cf3f5f5f454bf896bb21dca124a075e5701d" cmd=["/bin/bash","-c","set -xeo pipefail\n. 
/ovnkube-lib/ovnkube-lib.sh || exit 1\novndb-readiness-probe \"nb\"\n"] Mar 09 13:33:58 crc kubenswrapper[4764]: E0309 13:33:58.727199 4764 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of df799033cab3d457eef490d885e4440024ba2c7daa430d489ee9907d908d1017 is running failed: container process not found" probeType="Readiness" pod="openshift-ovn-kubernetes/ovnkube-node-7kggv" podUID="b8ccb4f5-550a-41b2-b39d-201cdd5d902a" containerName="sbdb" Mar 09 13:33:58 crc kubenswrapper[4764]: E0309 13:33:58.727231 4764 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 6859bb282db6608def0d7983e715cf3f5f5f454bf896bb21dca124a075e5701d is running failed: container process not found" probeType="Readiness" pod="openshift-ovn-kubernetes/ovnkube-node-7kggv" podUID="b8ccb4f5-550a-41b2-b39d-201cdd5d902a" containerName="nbdb" Mar 09 13:33:58 crc kubenswrapper[4764]: E0309 13:33:58.727286 4764 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 00007c279b62a4251162cd96ec8fdba9360c1525e9c8d4622efd50b129e51d29 is running failed: container process not found" containerID="00007c279b62a4251162cd96ec8fdba9360c1525e9c8d4622efd50b129e51d29" cmd=["/bin/bash","-c","#!/bin/bash\ntest -f /etc/cni/net.d/10-ovn-kubernetes.conf\n"] Mar 09 13:33:58 crc kubenswrapper[4764]: E0309 13:33:58.727313 4764 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 00007c279b62a4251162cd96ec8fdba9360c1525e9c8d4622efd50b129e51d29 is running failed: container process not found" probeType="Readiness" pod="openshift-ovn-kubernetes/ovnkube-node-7kggv" podUID="b8ccb4f5-550a-41b2-b39d-201cdd5d902a" containerName="ovnkube-controller" Mar 09 13:33:58 crc kubenswrapper[4764]: I0309 13:33:58.844009 4764 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7kggv_b8ccb4f5-550a-41b2-b39d-201cdd5d902a/ovnkube-controller/3.log" Mar 09 13:33:58 crc kubenswrapper[4764]: I0309 13:33:58.846079 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7kggv_b8ccb4f5-550a-41b2-b39d-201cdd5d902a/ovn-acl-logging/0.log" Mar 09 13:33:58 crc kubenswrapper[4764]: I0309 13:33:58.846654 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7kggv_b8ccb4f5-550a-41b2-b39d-201cdd5d902a/ovn-controller/0.log" Mar 09 13:33:58 crc kubenswrapper[4764]: I0309 13:33:58.847101 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-7kggv" Mar 09 13:33:58 crc kubenswrapper[4764]: I0309 13:33:58.905054 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-65sdb"] Mar 09 13:33:58 crc kubenswrapper[4764]: E0309 13:33:58.905314 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b8ccb4f5-550a-41b2-b39d-201cdd5d902a" containerName="ovnkube-controller" Mar 09 13:33:58 crc kubenswrapper[4764]: I0309 13:33:58.905326 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="b8ccb4f5-550a-41b2-b39d-201cdd5d902a" containerName="ovnkube-controller" Mar 09 13:33:58 crc kubenswrapper[4764]: E0309 13:33:58.905337 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b8ccb4f5-550a-41b2-b39d-201cdd5d902a" containerName="kube-rbac-proxy-ovn-metrics" Mar 09 13:33:58 crc kubenswrapper[4764]: I0309 13:33:58.905346 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="b8ccb4f5-550a-41b2-b39d-201cdd5d902a" containerName="kube-rbac-proxy-ovn-metrics" Mar 09 13:33:58 crc kubenswrapper[4764]: E0309 13:33:58.905354 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b8ccb4f5-550a-41b2-b39d-201cdd5d902a" 
containerName="kube-rbac-proxy-node" Mar 09 13:33:58 crc kubenswrapper[4764]: I0309 13:33:58.905362 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="b8ccb4f5-550a-41b2-b39d-201cdd5d902a" containerName="kube-rbac-proxy-node" Mar 09 13:33:58 crc kubenswrapper[4764]: E0309 13:33:58.905379 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b8ccb4f5-550a-41b2-b39d-201cdd5d902a" containerName="kubecfg-setup" Mar 09 13:33:58 crc kubenswrapper[4764]: I0309 13:33:58.905386 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="b8ccb4f5-550a-41b2-b39d-201cdd5d902a" containerName="kubecfg-setup" Mar 09 13:33:58 crc kubenswrapper[4764]: E0309 13:33:58.905396 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b8ccb4f5-550a-41b2-b39d-201cdd5d902a" containerName="ovn-acl-logging" Mar 09 13:33:58 crc kubenswrapper[4764]: I0309 13:33:58.905403 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="b8ccb4f5-550a-41b2-b39d-201cdd5d902a" containerName="ovn-acl-logging" Mar 09 13:33:58 crc kubenswrapper[4764]: E0309 13:33:58.905416 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b8ccb4f5-550a-41b2-b39d-201cdd5d902a" containerName="ovnkube-controller" Mar 09 13:33:58 crc kubenswrapper[4764]: I0309 13:33:58.905423 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="b8ccb4f5-550a-41b2-b39d-201cdd5d902a" containerName="ovnkube-controller" Mar 09 13:33:58 crc kubenswrapper[4764]: E0309 13:33:58.905430 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b8ccb4f5-550a-41b2-b39d-201cdd5d902a" containerName="northd" Mar 09 13:33:58 crc kubenswrapper[4764]: I0309 13:33:58.905437 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="b8ccb4f5-550a-41b2-b39d-201cdd5d902a" containerName="northd" Mar 09 13:33:58 crc kubenswrapper[4764]: E0309 13:33:58.905445 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b8ccb4f5-550a-41b2-b39d-201cdd5d902a" containerName="sbdb" 
Mar 09 13:33:58 crc kubenswrapper[4764]: I0309 13:33:58.905454 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="b8ccb4f5-550a-41b2-b39d-201cdd5d902a" containerName="sbdb" Mar 09 13:33:58 crc kubenswrapper[4764]: E0309 13:33:58.905464 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b8ccb4f5-550a-41b2-b39d-201cdd5d902a" containerName="ovnkube-controller" Mar 09 13:33:58 crc kubenswrapper[4764]: I0309 13:33:58.905470 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="b8ccb4f5-550a-41b2-b39d-201cdd5d902a" containerName="ovnkube-controller" Mar 09 13:33:58 crc kubenswrapper[4764]: E0309 13:33:58.905478 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b8ccb4f5-550a-41b2-b39d-201cdd5d902a" containerName="ovn-controller" Mar 09 13:33:58 crc kubenswrapper[4764]: I0309 13:33:58.905487 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="b8ccb4f5-550a-41b2-b39d-201cdd5d902a" containerName="ovn-controller" Mar 09 13:33:58 crc kubenswrapper[4764]: E0309 13:33:58.905494 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b8ccb4f5-550a-41b2-b39d-201cdd5d902a" containerName="nbdb" Mar 09 13:33:58 crc kubenswrapper[4764]: I0309 13:33:58.905500 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="b8ccb4f5-550a-41b2-b39d-201cdd5d902a" containerName="nbdb" Mar 09 13:33:58 crc kubenswrapper[4764]: E0309 13:33:58.905512 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b8ccb4f5-550a-41b2-b39d-201cdd5d902a" containerName="ovnkube-controller" Mar 09 13:33:58 crc kubenswrapper[4764]: I0309 13:33:58.905519 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="b8ccb4f5-550a-41b2-b39d-201cdd5d902a" containerName="ovnkube-controller" Mar 09 13:33:58 crc kubenswrapper[4764]: I0309 13:33:58.905621 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="b8ccb4f5-550a-41b2-b39d-201cdd5d902a" containerName="ovnkube-controller" Mar 09 13:33:58 crc 
kubenswrapper[4764]: I0309 13:33:58.905633 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="b8ccb4f5-550a-41b2-b39d-201cdd5d902a" containerName="ovnkube-controller" Mar 09 13:33:58 crc kubenswrapper[4764]: I0309 13:33:58.905659 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="b8ccb4f5-550a-41b2-b39d-201cdd5d902a" containerName="ovnkube-controller" Mar 09 13:33:58 crc kubenswrapper[4764]: I0309 13:33:58.905667 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="b8ccb4f5-550a-41b2-b39d-201cdd5d902a" containerName="ovnkube-controller" Mar 09 13:33:58 crc kubenswrapper[4764]: I0309 13:33:58.905678 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="b8ccb4f5-550a-41b2-b39d-201cdd5d902a" containerName="sbdb" Mar 09 13:33:58 crc kubenswrapper[4764]: I0309 13:33:58.905685 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="b8ccb4f5-550a-41b2-b39d-201cdd5d902a" containerName="northd" Mar 09 13:33:58 crc kubenswrapper[4764]: I0309 13:33:58.905695 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="b8ccb4f5-550a-41b2-b39d-201cdd5d902a" containerName="kube-rbac-proxy-node" Mar 09 13:33:58 crc kubenswrapper[4764]: I0309 13:33:58.905703 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="b8ccb4f5-550a-41b2-b39d-201cdd5d902a" containerName="kube-rbac-proxy-ovn-metrics" Mar 09 13:33:58 crc kubenswrapper[4764]: I0309 13:33:58.905711 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="b8ccb4f5-550a-41b2-b39d-201cdd5d902a" containerName="ovn-controller" Mar 09 13:33:58 crc kubenswrapper[4764]: I0309 13:33:58.905720 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="b8ccb4f5-550a-41b2-b39d-201cdd5d902a" containerName="nbdb" Mar 09 13:33:58 crc kubenswrapper[4764]: I0309 13:33:58.905729 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="b8ccb4f5-550a-41b2-b39d-201cdd5d902a" 
containerName="ovn-acl-logging" Mar 09 13:33:58 crc kubenswrapper[4764]: E0309 13:33:58.905864 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b8ccb4f5-550a-41b2-b39d-201cdd5d902a" containerName="ovnkube-controller" Mar 09 13:33:58 crc kubenswrapper[4764]: I0309 13:33:58.905875 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="b8ccb4f5-550a-41b2-b39d-201cdd5d902a" containerName="ovnkube-controller" Mar 09 13:33:58 crc kubenswrapper[4764]: I0309 13:33:58.905995 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="b8ccb4f5-550a-41b2-b39d-201cdd5d902a" containerName="ovnkube-controller" Mar 09 13:33:58 crc kubenswrapper[4764]: I0309 13:33:58.908251 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-65sdb" Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.004199 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b8ccb4f5-550a-41b2-b39d-201cdd5d902a-host-var-lib-cni-networks-ovn-kubernetes\") pod \"b8ccb4f5-550a-41b2-b39d-201cdd5d902a\" (UID: \"b8ccb4f5-550a-41b2-b39d-201cdd5d902a\") " Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.004274 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b8ccb4f5-550a-41b2-b39d-201cdd5d902a-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "b8ccb4f5-550a-41b2-b39d-201cdd5d902a" (UID: "b8ccb4f5-550a-41b2-b39d-201cdd5d902a"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.004278 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/b8ccb4f5-550a-41b2-b39d-201cdd5d902a-host-run-netns\") pod \"b8ccb4f5-550a-41b2-b39d-201cdd5d902a\" (UID: \"b8ccb4f5-550a-41b2-b39d-201cdd5d902a\") " Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.004335 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b8ccb4f5-550a-41b2-b39d-201cdd5d902a-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "b8ccb4f5-550a-41b2-b39d-201cdd5d902a" (UID: "b8ccb4f5-550a-41b2-b39d-201cdd5d902a"). InnerVolumeSpecName "host-run-netns". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.004714 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/b8ccb4f5-550a-41b2-b39d-201cdd5d902a-ovnkube-config\") pod \"b8ccb4f5-550a-41b2-b39d-201cdd5d902a\" (UID: \"b8ccb4f5-550a-41b2-b39d-201cdd5d902a\") " Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.004784 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/b8ccb4f5-550a-41b2-b39d-201cdd5d902a-log-socket\") pod \"b8ccb4f5-550a-41b2-b39d-201cdd5d902a\" (UID: \"b8ccb4f5-550a-41b2-b39d-201cdd5d902a\") " Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.004802 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b8ccb4f5-550a-41b2-b39d-201cdd5d902a-run-openvswitch\") pod \"b8ccb4f5-550a-41b2-b39d-201cdd5d902a\" (UID: \"b8ccb4f5-550a-41b2-b39d-201cdd5d902a\") " Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.004829 4764 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/b8ccb4f5-550a-41b2-b39d-201cdd5d902a-ovn-node-metrics-cert\") pod \"b8ccb4f5-550a-41b2-b39d-201cdd5d902a\" (UID: \"b8ccb4f5-550a-41b2-b39d-201cdd5d902a\") " Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.004860 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/b8ccb4f5-550a-41b2-b39d-201cdd5d902a-env-overrides\") pod \"b8ccb4f5-550a-41b2-b39d-201cdd5d902a\" (UID: \"b8ccb4f5-550a-41b2-b39d-201cdd5d902a\") " Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.004887 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/b8ccb4f5-550a-41b2-b39d-201cdd5d902a-ovnkube-script-lib\") pod \"b8ccb4f5-550a-41b2-b39d-201cdd5d902a\" (UID: \"b8ccb4f5-550a-41b2-b39d-201cdd5d902a\") " Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.004908 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b8ccb4f5-550a-41b2-b39d-201cdd5d902a-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "b8ccb4f5-550a-41b2-b39d-201cdd5d902a" (UID: "b8ccb4f5-550a-41b2-b39d-201cdd5d902a"). InnerVolumeSpecName "run-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.004940 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b8ccb4f5-550a-41b2-b39d-201cdd5d902a-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "b8ccb4f5-550a-41b2-b39d-201cdd5d902a" (UID: "b8ccb4f5-550a-41b2-b39d-201cdd5d902a"). InnerVolumeSpecName "var-lib-openvswitch". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.004908 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b8ccb4f5-550a-41b2-b39d-201cdd5d902a-log-socket" (OuterVolumeSpecName: "log-socket") pod "b8ccb4f5-550a-41b2-b39d-201cdd5d902a" (UID: "b8ccb4f5-550a-41b2-b39d-201cdd5d902a"). InnerVolumeSpecName "log-socket". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.004919 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b8ccb4f5-550a-41b2-b39d-201cdd5d902a-var-lib-openvswitch\") pod \"b8ccb4f5-550a-41b2-b39d-201cdd5d902a\" (UID: \"b8ccb4f5-550a-41b2-b39d-201cdd5d902a\") " Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.004979 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b8ccb4f5-550a-41b2-b39d-201cdd5d902a-host-run-ovn-kubernetes\") pod \"b8ccb4f5-550a-41b2-b39d-201cdd5d902a\" (UID: \"b8ccb4f5-550a-41b2-b39d-201cdd5d902a\") " Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.005002 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/b8ccb4f5-550a-41b2-b39d-201cdd5d902a-host-slash\") pod \"b8ccb4f5-550a-41b2-b39d-201cdd5d902a\" (UID: \"b8ccb4f5-550a-41b2-b39d-201cdd5d902a\") " Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.005036 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/b8ccb4f5-550a-41b2-b39d-201cdd5d902a-host-cni-bin\") pod \"b8ccb4f5-550a-41b2-b39d-201cdd5d902a\" (UID: \"b8ccb4f5-550a-41b2-b39d-201cdd5d902a\") " Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.005065 4764 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r5xrv\" (UniqueName: \"kubernetes.io/projected/b8ccb4f5-550a-41b2-b39d-201cdd5d902a-kube-api-access-r5xrv\") pod \"b8ccb4f5-550a-41b2-b39d-201cdd5d902a\" (UID: \"b8ccb4f5-550a-41b2-b39d-201cdd5d902a\") " Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.005082 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/b8ccb4f5-550a-41b2-b39d-201cdd5d902a-run-systemd\") pod \"b8ccb4f5-550a-41b2-b39d-201cdd5d902a\" (UID: \"b8ccb4f5-550a-41b2-b39d-201cdd5d902a\") " Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.005098 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/b8ccb4f5-550a-41b2-b39d-201cdd5d902a-run-ovn\") pod \"b8ccb4f5-550a-41b2-b39d-201cdd5d902a\" (UID: \"b8ccb4f5-550a-41b2-b39d-201cdd5d902a\") " Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.005121 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b8ccb4f5-550a-41b2-b39d-201cdd5d902a-etc-openvswitch\") pod \"b8ccb4f5-550a-41b2-b39d-201cdd5d902a\" (UID: \"b8ccb4f5-550a-41b2-b39d-201cdd5d902a\") " Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.005166 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/b8ccb4f5-550a-41b2-b39d-201cdd5d902a-node-log\") pod \"b8ccb4f5-550a-41b2-b39d-201cdd5d902a\" (UID: \"b8ccb4f5-550a-41b2-b39d-201cdd5d902a\") " Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.005204 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/b8ccb4f5-550a-41b2-b39d-201cdd5d902a-systemd-units\") pod \"b8ccb4f5-550a-41b2-b39d-201cdd5d902a\" (UID: 
\"b8ccb4f5-550a-41b2-b39d-201cdd5d902a\") " Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.005221 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/b8ccb4f5-550a-41b2-b39d-201cdd5d902a-host-kubelet\") pod \"b8ccb4f5-550a-41b2-b39d-201cdd5d902a\" (UID: \"b8ccb4f5-550a-41b2-b39d-201cdd5d902a\") " Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.005259 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/b8ccb4f5-550a-41b2-b39d-201cdd5d902a-host-cni-netd\") pod \"b8ccb4f5-550a-41b2-b39d-201cdd5d902a\" (UID: \"b8ccb4f5-550a-41b2-b39d-201cdd5d902a\") " Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.005326 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b8ccb4f5-550a-41b2-b39d-201cdd5d902a-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "b8ccb4f5-550a-41b2-b39d-201cdd5d902a" (UID: "b8ccb4f5-550a-41b2-b39d-201cdd5d902a"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.005364 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b8ccb4f5-550a-41b2-b39d-201cdd5d902a-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "b8ccb4f5-550a-41b2-b39d-201cdd5d902a" (UID: "b8ccb4f5-550a-41b2-b39d-201cdd5d902a"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.005411 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b8ccb4f5-550a-41b2-b39d-201cdd5d902a-node-log" (OuterVolumeSpecName: "node-log") pod "b8ccb4f5-550a-41b2-b39d-201cdd5d902a" (UID: "b8ccb4f5-550a-41b2-b39d-201cdd5d902a"). InnerVolumeSpecName "node-log". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.005408 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b8ccb4f5-550a-41b2-b39d-201cdd5d902a-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "b8ccb4f5-550a-41b2-b39d-201cdd5d902a" (UID: "b8ccb4f5-550a-41b2-b39d-201cdd5d902a"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.005438 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b8ccb4f5-550a-41b2-b39d-201cdd5d902a-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "b8ccb4f5-550a-41b2-b39d-201cdd5d902a" (UID: "b8ccb4f5-550a-41b2-b39d-201cdd5d902a"). InnerVolumeSpecName "host-cni-netd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.005449 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b8ccb4f5-550a-41b2-b39d-201cdd5d902a-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "b8ccb4f5-550a-41b2-b39d-201cdd5d902a" (UID: "b8ccb4f5-550a-41b2-b39d-201cdd5d902a"). InnerVolumeSpecName "run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.005458 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b8ccb4f5-550a-41b2-b39d-201cdd5d902a-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "b8ccb4f5-550a-41b2-b39d-201cdd5d902a" (UID: "b8ccb4f5-550a-41b2-b39d-201cdd5d902a"). InnerVolumeSpecName "systemd-units". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.005480 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b8ccb4f5-550a-41b2-b39d-201cdd5d902a-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "b8ccb4f5-550a-41b2-b39d-201cdd5d902a" (UID: "b8ccb4f5-550a-41b2-b39d-201cdd5d902a"). InnerVolumeSpecName "host-kubelet". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.005483 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b8ccb4f5-550a-41b2-b39d-201cdd5d902a-host-slash" (OuterVolumeSpecName: "host-slash") pod "b8ccb4f5-550a-41b2-b39d-201cdd5d902a" (UID: "b8ccb4f5-550a-41b2-b39d-201cdd5d902a"). InnerVolumeSpecName "host-slash". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.005491 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/9ce85b1d-79b3-4669-a169-bfcd058c8931-run-systemd\") pod \"ovnkube-node-65sdb\" (UID: \"9ce85b1d-79b3-4669-a169-bfcd058c8931\") " pod="openshift-ovn-kubernetes/ovnkube-node-65sdb" Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.005498 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b8ccb4f5-550a-41b2-b39d-201cdd5d902a-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "b8ccb4f5-550a-41b2-b39d-201cdd5d902a" (UID: "b8ccb4f5-550a-41b2-b39d-201cdd5d902a"). InnerVolumeSpecName "host-run-ovn-kubernetes". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.005531 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/9ce85b1d-79b3-4669-a169-bfcd058c8931-host-kubelet\") pod \"ovnkube-node-65sdb\" (UID: \"9ce85b1d-79b3-4669-a169-bfcd058c8931\") " pod="openshift-ovn-kubernetes/ovnkube-node-65sdb" Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.005511 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b8ccb4f5-550a-41b2-b39d-201cdd5d902a-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "b8ccb4f5-550a-41b2-b39d-201cdd5d902a" (UID: "b8ccb4f5-550a-41b2-b39d-201cdd5d902a"). InnerVolumeSpecName "host-cni-bin". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.005512 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b8ccb4f5-550a-41b2-b39d-201cdd5d902a-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "b8ccb4f5-550a-41b2-b39d-201cdd5d902a" (UID: "b8ccb4f5-550a-41b2-b39d-201cdd5d902a"). InnerVolumeSpecName "etc-openvswitch". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.005609 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/9ce85b1d-79b3-4669-a169-bfcd058c8931-host-run-netns\") pod \"ovnkube-node-65sdb\" (UID: \"9ce85b1d-79b3-4669-a169-bfcd058c8931\") " pod="openshift-ovn-kubernetes/ovnkube-node-65sdb" Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.005693 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/9ce85b1d-79b3-4669-a169-bfcd058c8931-ovnkube-script-lib\") pod \"ovnkube-node-65sdb\" (UID: \"9ce85b1d-79b3-4669-a169-bfcd058c8931\") " pod="openshift-ovn-kubernetes/ovnkube-node-65sdb" Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.005774 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/9ce85b1d-79b3-4669-a169-bfcd058c8931-node-log\") pod \"ovnkube-node-65sdb\" (UID: \"9ce85b1d-79b3-4669-a169-bfcd058c8931\") " pod="openshift-ovn-kubernetes/ovnkube-node-65sdb" Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.005829 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/9ce85b1d-79b3-4669-a169-bfcd058c8931-host-slash\") pod \"ovnkube-node-65sdb\" (UID: \"9ce85b1d-79b3-4669-a169-bfcd058c8931\") " pod="openshift-ovn-kubernetes/ovnkube-node-65sdb" Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.005876 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/9ce85b1d-79b3-4669-a169-bfcd058c8931-host-cni-netd\") pod \"ovnkube-node-65sdb\" (UID: 
\"9ce85b1d-79b3-4669-a169-bfcd058c8931\") " pod="openshift-ovn-kubernetes/ovnkube-node-65sdb" Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.005959 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/9ce85b1d-79b3-4669-a169-bfcd058c8931-host-cni-bin\") pod \"ovnkube-node-65sdb\" (UID: \"9ce85b1d-79b3-4669-a169-bfcd058c8931\") " pod="openshift-ovn-kubernetes/ovnkube-node-65sdb" Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.005996 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/9ce85b1d-79b3-4669-a169-bfcd058c8931-ovnkube-config\") pod \"ovnkube-node-65sdb\" (UID: \"9ce85b1d-79b3-4669-a169-bfcd058c8931\") " pod="openshift-ovn-kubernetes/ovnkube-node-65sdb" Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.006024 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/9ce85b1d-79b3-4669-a169-bfcd058c8931-systemd-units\") pod \"ovnkube-node-65sdb\" (UID: \"9ce85b1d-79b3-4669-a169-bfcd058c8931\") " pod="openshift-ovn-kubernetes/ovnkube-node-65sdb" Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.006075 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/9ce85b1d-79b3-4669-a169-bfcd058c8931-env-overrides\") pod \"ovnkube-node-65sdb\" (UID: \"9ce85b1d-79b3-4669-a169-bfcd058c8931\") " pod="openshift-ovn-kubernetes/ovnkube-node-65sdb" Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.006104 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9ce85b1d-79b3-4669-a169-bfcd058c8931-run-openvswitch\") pod 
\"ovnkube-node-65sdb\" (UID: \"9ce85b1d-79b3-4669-a169-bfcd058c8931\") " pod="openshift-ovn-kubernetes/ovnkube-node-65sdb" Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.006180 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9ce85b1d-79b3-4669-a169-bfcd058c8931-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-65sdb\" (UID: \"9ce85b1d-79b3-4669-a169-bfcd058c8931\") " pod="openshift-ovn-kubernetes/ovnkube-node-65sdb" Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.006209 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kcw8n\" (UniqueName: \"kubernetes.io/projected/9ce85b1d-79b3-4669-a169-bfcd058c8931-kube-api-access-kcw8n\") pod \"ovnkube-node-65sdb\" (UID: \"9ce85b1d-79b3-4669-a169-bfcd058c8931\") " pod="openshift-ovn-kubernetes/ovnkube-node-65sdb" Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.006229 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9ce85b1d-79b3-4669-a169-bfcd058c8931-etc-openvswitch\") pod \"ovnkube-node-65sdb\" (UID: \"9ce85b1d-79b3-4669-a169-bfcd058c8931\") " pod="openshift-ovn-kubernetes/ovnkube-node-65sdb" Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.006303 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/9ce85b1d-79b3-4669-a169-bfcd058c8931-log-socket\") pod \"ovnkube-node-65sdb\" (UID: \"9ce85b1d-79b3-4669-a169-bfcd058c8931\") " pod="openshift-ovn-kubernetes/ovnkube-node-65sdb" Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.006335 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: 
\"kubernetes.io/secret/9ce85b1d-79b3-4669-a169-bfcd058c8931-ovn-node-metrics-cert\") pod \"ovnkube-node-65sdb\" (UID: \"9ce85b1d-79b3-4669-a169-bfcd058c8931\") " pod="openshift-ovn-kubernetes/ovnkube-node-65sdb" Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.006378 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9ce85b1d-79b3-4669-a169-bfcd058c8931-host-run-ovn-kubernetes\") pod \"ovnkube-node-65sdb\" (UID: \"9ce85b1d-79b3-4669-a169-bfcd058c8931\") " pod="openshift-ovn-kubernetes/ovnkube-node-65sdb" Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.006401 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/9ce85b1d-79b3-4669-a169-bfcd058c8931-run-ovn\") pod \"ovnkube-node-65sdb\" (UID: \"9ce85b1d-79b3-4669-a169-bfcd058c8931\") " pod="openshift-ovn-kubernetes/ovnkube-node-65sdb" Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.006419 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9ce85b1d-79b3-4669-a169-bfcd058c8931-var-lib-openvswitch\") pod \"ovnkube-node-65sdb\" (UID: \"9ce85b1d-79b3-4669-a169-bfcd058c8931\") " pod="openshift-ovn-kubernetes/ovnkube-node-65sdb" Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.006510 4764 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/b8ccb4f5-550a-41b2-b39d-201cdd5d902a-env-overrides\") on node \"crc\" DevicePath \"\"" Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.006531 4764 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/b8ccb4f5-550a-41b2-b39d-201cdd5d902a-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Mar 09 13:33:59 crc 
kubenswrapper[4764]: I0309 13:33:59.006547 4764 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b8ccb4f5-550a-41b2-b39d-201cdd5d902a-var-lib-openvswitch\") on node \"crc\" DevicePath \"\"" Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.006560 4764 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b8ccb4f5-550a-41b2-b39d-201cdd5d902a-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.006573 4764 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/b8ccb4f5-550a-41b2-b39d-201cdd5d902a-host-slash\") on node \"crc\" DevicePath \"\"" Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.006584 4764 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/b8ccb4f5-550a-41b2-b39d-201cdd5d902a-host-cni-bin\") on node \"crc\" DevicePath \"\"" Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.006598 4764 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/b8ccb4f5-550a-41b2-b39d-201cdd5d902a-run-ovn\") on node \"crc\" DevicePath \"\"" Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.006612 4764 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b8ccb4f5-550a-41b2-b39d-201cdd5d902a-etc-openvswitch\") on node \"crc\" DevicePath \"\"" Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.006627 4764 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/b8ccb4f5-550a-41b2-b39d-201cdd5d902a-node-log\") on node \"crc\" DevicePath \"\"" Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.006661 4764 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" 
(UniqueName: \"kubernetes.io/host-path/b8ccb4f5-550a-41b2-b39d-201cdd5d902a-systemd-units\") on node \"crc\" DevicePath \"\"" Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.006677 4764 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/b8ccb4f5-550a-41b2-b39d-201cdd5d902a-host-kubelet\") on node \"crc\" DevicePath \"\"" Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.006692 4764 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/b8ccb4f5-550a-41b2-b39d-201cdd5d902a-host-cni-netd\") on node \"crc\" DevicePath \"\"" Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.006707 4764 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b8ccb4f5-550a-41b2-b39d-201cdd5d902a-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.006719 4764 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/b8ccb4f5-550a-41b2-b39d-201cdd5d902a-host-run-netns\") on node \"crc\" DevicePath \"\"" Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.006731 4764 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/b8ccb4f5-550a-41b2-b39d-201cdd5d902a-ovnkube-config\") on node \"crc\" DevicePath \"\"" Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.006745 4764 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/b8ccb4f5-550a-41b2-b39d-201cdd5d902a-log-socket\") on node \"crc\" DevicePath \"\"" Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.006755 4764 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/b8ccb4f5-550a-41b2-b39d-201cdd5d902a-run-openvswitch\") on node \"crc\" DevicePath \"\"" Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.010778 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b8ccb4f5-550a-41b2-b39d-201cdd5d902a-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "b8ccb4f5-550a-41b2-b39d-201cdd5d902a" (UID: "b8ccb4f5-550a-41b2-b39d-201cdd5d902a"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.011242 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b8ccb4f5-550a-41b2-b39d-201cdd5d902a-kube-api-access-r5xrv" (OuterVolumeSpecName: "kube-api-access-r5xrv") pod "b8ccb4f5-550a-41b2-b39d-201cdd5d902a" (UID: "b8ccb4f5-550a-41b2-b39d-201cdd5d902a"). InnerVolumeSpecName "kube-api-access-r5xrv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.018382 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b8ccb4f5-550a-41b2-b39d-201cdd5d902a-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "b8ccb4f5-550a-41b2-b39d-201cdd5d902a" (UID: "b8ccb4f5-550a-41b2-b39d-201cdd5d902a"). InnerVolumeSpecName "run-systemd". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.107151 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/9ce85b1d-79b3-4669-a169-bfcd058c8931-env-overrides\") pod \"ovnkube-node-65sdb\" (UID: \"9ce85b1d-79b3-4669-a169-bfcd058c8931\") " pod="openshift-ovn-kubernetes/ovnkube-node-65sdb" Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.107209 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9ce85b1d-79b3-4669-a169-bfcd058c8931-run-openvswitch\") pod \"ovnkube-node-65sdb\" (UID: \"9ce85b1d-79b3-4669-a169-bfcd058c8931\") " pod="openshift-ovn-kubernetes/ovnkube-node-65sdb" Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.107246 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9ce85b1d-79b3-4669-a169-bfcd058c8931-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-65sdb\" (UID: \"9ce85b1d-79b3-4669-a169-bfcd058c8931\") " pod="openshift-ovn-kubernetes/ovnkube-node-65sdb" Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.107268 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kcw8n\" (UniqueName: \"kubernetes.io/projected/9ce85b1d-79b3-4669-a169-bfcd058c8931-kube-api-access-kcw8n\") pod \"ovnkube-node-65sdb\" (UID: \"9ce85b1d-79b3-4669-a169-bfcd058c8931\") " pod="openshift-ovn-kubernetes/ovnkube-node-65sdb" Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.107294 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9ce85b1d-79b3-4669-a169-bfcd058c8931-etc-openvswitch\") pod \"ovnkube-node-65sdb\" (UID: \"9ce85b1d-79b3-4669-a169-bfcd058c8931\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-65sdb" Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.107299 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9ce85b1d-79b3-4669-a169-bfcd058c8931-run-openvswitch\") pod \"ovnkube-node-65sdb\" (UID: \"9ce85b1d-79b3-4669-a169-bfcd058c8931\") " pod="openshift-ovn-kubernetes/ovnkube-node-65sdb" Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.107315 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/9ce85b1d-79b3-4669-a169-bfcd058c8931-ovn-node-metrics-cert\") pod \"ovnkube-node-65sdb\" (UID: \"9ce85b1d-79b3-4669-a169-bfcd058c8931\") " pod="openshift-ovn-kubernetes/ovnkube-node-65sdb" Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.107337 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/9ce85b1d-79b3-4669-a169-bfcd058c8931-log-socket\") pod \"ovnkube-node-65sdb\" (UID: \"9ce85b1d-79b3-4669-a169-bfcd058c8931\") " pod="openshift-ovn-kubernetes/ovnkube-node-65sdb" Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.107351 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9ce85b1d-79b3-4669-a169-bfcd058c8931-etc-openvswitch\") pod \"ovnkube-node-65sdb\" (UID: \"9ce85b1d-79b3-4669-a169-bfcd058c8931\") " pod="openshift-ovn-kubernetes/ovnkube-node-65sdb" Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.107361 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9ce85b1d-79b3-4669-a169-bfcd058c8931-host-run-ovn-kubernetes\") pod \"ovnkube-node-65sdb\" (UID: \"9ce85b1d-79b3-4669-a169-bfcd058c8931\") " pod="openshift-ovn-kubernetes/ovnkube-node-65sdb" Mar 09 13:33:59 crc 
kubenswrapper[4764]: I0309 13:33:59.107386 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/9ce85b1d-79b3-4669-a169-bfcd058c8931-run-ovn\") pod \"ovnkube-node-65sdb\" (UID: \"9ce85b1d-79b3-4669-a169-bfcd058c8931\") " pod="openshift-ovn-kubernetes/ovnkube-node-65sdb" Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.107408 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9ce85b1d-79b3-4669-a169-bfcd058c8931-var-lib-openvswitch\") pod \"ovnkube-node-65sdb\" (UID: \"9ce85b1d-79b3-4669-a169-bfcd058c8931\") " pod="openshift-ovn-kubernetes/ovnkube-node-65sdb" Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.107416 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9ce85b1d-79b3-4669-a169-bfcd058c8931-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-65sdb\" (UID: \"9ce85b1d-79b3-4669-a169-bfcd058c8931\") " pod="openshift-ovn-kubernetes/ovnkube-node-65sdb" Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.107460 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/9ce85b1d-79b3-4669-a169-bfcd058c8931-run-systemd\") pod \"ovnkube-node-65sdb\" (UID: \"9ce85b1d-79b3-4669-a169-bfcd058c8931\") " pod="openshift-ovn-kubernetes/ovnkube-node-65sdb" Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.107430 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/9ce85b1d-79b3-4669-a169-bfcd058c8931-run-systemd\") pod \"ovnkube-node-65sdb\" (UID: \"9ce85b1d-79b3-4669-a169-bfcd058c8931\") " pod="openshift-ovn-kubernetes/ovnkube-node-65sdb" Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.107504 4764 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9ce85b1d-79b3-4669-a169-bfcd058c8931-host-run-ovn-kubernetes\") pod \"ovnkube-node-65sdb\" (UID: \"9ce85b1d-79b3-4669-a169-bfcd058c8931\") " pod="openshift-ovn-kubernetes/ovnkube-node-65sdb" Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.107513 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/9ce85b1d-79b3-4669-a169-bfcd058c8931-log-socket\") pod \"ovnkube-node-65sdb\" (UID: \"9ce85b1d-79b3-4669-a169-bfcd058c8931\") " pod="openshift-ovn-kubernetes/ovnkube-node-65sdb" Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.107560 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/9ce85b1d-79b3-4669-a169-bfcd058c8931-host-kubelet\") pod \"ovnkube-node-65sdb\" (UID: \"9ce85b1d-79b3-4669-a169-bfcd058c8931\") " pod="openshift-ovn-kubernetes/ovnkube-node-65sdb" Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.107541 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/9ce85b1d-79b3-4669-a169-bfcd058c8931-run-ovn\") pod \"ovnkube-node-65sdb\" (UID: \"9ce85b1d-79b3-4669-a169-bfcd058c8931\") " pod="openshift-ovn-kubernetes/ovnkube-node-65sdb" Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.107529 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/9ce85b1d-79b3-4669-a169-bfcd058c8931-host-kubelet\") pod \"ovnkube-node-65sdb\" (UID: \"9ce85b1d-79b3-4669-a169-bfcd058c8931\") " pod="openshift-ovn-kubernetes/ovnkube-node-65sdb" Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.107565 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/9ce85b1d-79b3-4669-a169-bfcd058c8931-var-lib-openvswitch\") pod \"ovnkube-node-65sdb\" (UID: \"9ce85b1d-79b3-4669-a169-bfcd058c8931\") " pod="openshift-ovn-kubernetes/ovnkube-node-65sdb" Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.107735 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/9ce85b1d-79b3-4669-a169-bfcd058c8931-host-run-netns\") pod \"ovnkube-node-65sdb\" (UID: \"9ce85b1d-79b3-4669-a169-bfcd058c8931\") " pod="openshift-ovn-kubernetes/ovnkube-node-65sdb" Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.107780 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/9ce85b1d-79b3-4669-a169-bfcd058c8931-ovnkube-script-lib\") pod \"ovnkube-node-65sdb\" (UID: \"9ce85b1d-79b3-4669-a169-bfcd058c8931\") " pod="openshift-ovn-kubernetes/ovnkube-node-65sdb" Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.107820 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/9ce85b1d-79b3-4669-a169-bfcd058c8931-node-log\") pod \"ovnkube-node-65sdb\" (UID: \"9ce85b1d-79b3-4669-a169-bfcd058c8931\") " pod="openshift-ovn-kubernetes/ovnkube-node-65sdb" Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.107852 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/9ce85b1d-79b3-4669-a169-bfcd058c8931-host-run-netns\") pod \"ovnkube-node-65sdb\" (UID: \"9ce85b1d-79b3-4669-a169-bfcd058c8931\") " pod="openshift-ovn-kubernetes/ovnkube-node-65sdb" Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.107885 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/9ce85b1d-79b3-4669-a169-bfcd058c8931-host-slash\") pod 
\"ovnkube-node-65sdb\" (UID: \"9ce85b1d-79b3-4669-a169-bfcd058c8931\") " pod="openshift-ovn-kubernetes/ovnkube-node-65sdb" Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.107911 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/9ce85b1d-79b3-4669-a169-bfcd058c8931-node-log\") pod \"ovnkube-node-65sdb\" (UID: \"9ce85b1d-79b3-4669-a169-bfcd058c8931\") " pod="openshift-ovn-kubernetes/ovnkube-node-65sdb" Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.107917 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/9ce85b1d-79b3-4669-a169-bfcd058c8931-host-cni-netd\") pod \"ovnkube-node-65sdb\" (UID: \"9ce85b1d-79b3-4669-a169-bfcd058c8931\") " pod="openshift-ovn-kubernetes/ovnkube-node-65sdb" Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.107927 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/9ce85b1d-79b3-4669-a169-bfcd058c8931-host-slash\") pod \"ovnkube-node-65sdb\" (UID: \"9ce85b1d-79b3-4669-a169-bfcd058c8931\") " pod="openshift-ovn-kubernetes/ovnkube-node-65sdb" Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.107979 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/9ce85b1d-79b3-4669-a169-bfcd058c8931-host-cni-netd\") pod \"ovnkube-node-65sdb\" (UID: \"9ce85b1d-79b3-4669-a169-bfcd058c8931\") " pod="openshift-ovn-kubernetes/ovnkube-node-65sdb" Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.108005 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/9ce85b1d-79b3-4669-a169-bfcd058c8931-host-cni-bin\") pod \"ovnkube-node-65sdb\" (UID: \"9ce85b1d-79b3-4669-a169-bfcd058c8931\") " pod="openshift-ovn-kubernetes/ovnkube-node-65sdb" Mar 09 
13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.108038 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/9ce85b1d-79b3-4669-a169-bfcd058c8931-ovnkube-config\") pod \"ovnkube-node-65sdb\" (UID: \"9ce85b1d-79b3-4669-a169-bfcd058c8931\") " pod="openshift-ovn-kubernetes/ovnkube-node-65sdb" Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.108077 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/9ce85b1d-79b3-4669-a169-bfcd058c8931-systemd-units\") pod \"ovnkube-node-65sdb\" (UID: \"9ce85b1d-79b3-4669-a169-bfcd058c8931\") " pod="openshift-ovn-kubernetes/ovnkube-node-65sdb" Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.108108 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/9ce85b1d-79b3-4669-a169-bfcd058c8931-host-cni-bin\") pod \"ovnkube-node-65sdb\" (UID: \"9ce85b1d-79b3-4669-a169-bfcd058c8931\") " pod="openshift-ovn-kubernetes/ovnkube-node-65sdb" Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.108201 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/9ce85b1d-79b3-4669-a169-bfcd058c8931-systemd-units\") pod \"ovnkube-node-65sdb\" (UID: \"9ce85b1d-79b3-4669-a169-bfcd058c8931\") " pod="openshift-ovn-kubernetes/ovnkube-node-65sdb" Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.108229 4764 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/b8ccb4f5-550a-41b2-b39d-201cdd5d902a-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.108293 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r5xrv\" (UniqueName: 
\"kubernetes.io/projected/b8ccb4f5-550a-41b2-b39d-201cdd5d902a-kube-api-access-r5xrv\") on node \"crc\" DevicePath \"\"" Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.108312 4764 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/b8ccb4f5-550a-41b2-b39d-201cdd5d902a-run-systemd\") on node \"crc\" DevicePath \"\"" Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.108488 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/9ce85b1d-79b3-4669-a169-bfcd058c8931-ovnkube-script-lib\") pod \"ovnkube-node-65sdb\" (UID: \"9ce85b1d-79b3-4669-a169-bfcd058c8931\") " pod="openshift-ovn-kubernetes/ovnkube-node-65sdb" Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.109308 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/9ce85b1d-79b3-4669-a169-bfcd058c8931-env-overrides\") pod \"ovnkube-node-65sdb\" (UID: \"9ce85b1d-79b3-4669-a169-bfcd058c8931\") " pod="openshift-ovn-kubernetes/ovnkube-node-65sdb" Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.109666 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/9ce85b1d-79b3-4669-a169-bfcd058c8931-ovnkube-config\") pod \"ovnkube-node-65sdb\" (UID: \"9ce85b1d-79b3-4669-a169-bfcd058c8931\") " pod="openshift-ovn-kubernetes/ovnkube-node-65sdb" Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.110289 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/9ce85b1d-79b3-4669-a169-bfcd058c8931-ovn-node-metrics-cert\") pod \"ovnkube-node-65sdb\" (UID: \"9ce85b1d-79b3-4669-a169-bfcd058c8931\") " pod="openshift-ovn-kubernetes/ovnkube-node-65sdb" Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.126848 4764 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-kcw8n\" (UniqueName: \"kubernetes.io/projected/9ce85b1d-79b3-4669-a169-bfcd058c8931-kube-api-access-kcw8n\") pod \"ovnkube-node-65sdb\" (UID: \"9ce85b1d-79b3-4669-a169-bfcd058c8931\") " pod="openshift-ovn-kubernetes/ovnkube-node-65sdb" Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.223068 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-65sdb" Mar 09 13:33:59 crc kubenswrapper[4764]: W0309 13:33:59.241968 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9ce85b1d_79b3_4669_a169_bfcd058c8931.slice/crio-3d8a2a2f3b83361d950fc0816e501c19b17d60a681dbf475fb25e36cffe9c59e WatchSource:0}: Error finding container 3d8a2a2f3b83361d950fc0816e501c19b17d60a681dbf475fb25e36cffe9c59e: Status 404 returned error can't find the container with id 3d8a2a2f3b83361d950fc0816e501c19b17d60a681dbf475fb25e36cffe9c59e Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.420519 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7kggv_b8ccb4f5-550a-41b2-b39d-201cdd5d902a/ovnkube-controller/3.log" Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.424107 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7kggv_b8ccb4f5-550a-41b2-b39d-201cdd5d902a/ovn-acl-logging/0.log" Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.424860 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7kggv_b8ccb4f5-550a-41b2-b39d-201cdd5d902a/ovn-controller/0.log" Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.425360 4764 generic.go:334] "Generic (PLEG): container finished" podID="b8ccb4f5-550a-41b2-b39d-201cdd5d902a" containerID="00007c279b62a4251162cd96ec8fdba9360c1525e9c8d4622efd50b129e51d29" exitCode=0 Mar 09 13:33:59 crc 
kubenswrapper[4764]: I0309 13:33:59.425400 4764 generic.go:334] "Generic (PLEG): container finished" podID="b8ccb4f5-550a-41b2-b39d-201cdd5d902a" containerID="df799033cab3d457eef490d885e4440024ba2c7daa430d489ee9907d908d1017" exitCode=0 Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.425421 4764 generic.go:334] "Generic (PLEG): container finished" podID="b8ccb4f5-550a-41b2-b39d-201cdd5d902a" containerID="6859bb282db6608def0d7983e715cf3f5f5f454bf896bb21dca124a075e5701d" exitCode=0 Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.425434 4764 generic.go:334] "Generic (PLEG): container finished" podID="b8ccb4f5-550a-41b2-b39d-201cdd5d902a" containerID="20f0f31508039b62436cf5b317419ffc5ca18d03050bfa830060f40e12684519" exitCode=0 Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.425447 4764 generic.go:334] "Generic (PLEG): container finished" podID="b8ccb4f5-550a-41b2-b39d-201cdd5d902a" containerID="21feffb4bef29e65989455ec8c5a040a53de90f01ede1a9f1848a67f64a35d29" exitCode=0 Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.425462 4764 generic.go:334] "Generic (PLEG): container finished" podID="b8ccb4f5-550a-41b2-b39d-201cdd5d902a" containerID="9713dea51462b7a0ce8cb68a64d8707b76dfca5f8c770cac3b54339d538dec01" exitCode=0 Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.425469 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-7kggv" Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.425492 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7kggv" event={"ID":"b8ccb4f5-550a-41b2-b39d-201cdd5d902a","Type":"ContainerDied","Data":"00007c279b62a4251162cd96ec8fdba9360c1525e9c8d4622efd50b129e51d29"} Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.425529 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7kggv" event={"ID":"b8ccb4f5-550a-41b2-b39d-201cdd5d902a","Type":"ContainerDied","Data":"df799033cab3d457eef490d885e4440024ba2c7daa430d489ee9907d908d1017"} Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.425547 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7kggv" event={"ID":"b8ccb4f5-550a-41b2-b39d-201cdd5d902a","Type":"ContainerDied","Data":"6859bb282db6608def0d7983e715cf3f5f5f454bf896bb21dca124a075e5701d"} Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.425565 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7kggv" event={"ID":"b8ccb4f5-550a-41b2-b39d-201cdd5d902a","Type":"ContainerDied","Data":"20f0f31508039b62436cf5b317419ffc5ca18d03050bfa830060f40e12684519"} Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.425584 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7kggv" event={"ID":"b8ccb4f5-550a-41b2-b39d-201cdd5d902a","Type":"ContainerDied","Data":"21feffb4bef29e65989455ec8c5a040a53de90f01ede1a9f1848a67f64a35d29"} Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.425602 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7kggv" event={"ID":"b8ccb4f5-550a-41b2-b39d-201cdd5d902a","Type":"ContainerDied","Data":"9713dea51462b7a0ce8cb68a64d8707b76dfca5f8c770cac3b54339d538dec01"} Mar 09 13:33:59 crc 
kubenswrapper[4764]: I0309 13:33:59.425621 4764 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9162f20dc3577683eb7f764a353d0245d4748a1f3cdc3d3886dbea7e03e9714f"} Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.425637 4764 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"df799033cab3d457eef490d885e4440024ba2c7daa430d489ee9907d908d1017"} Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.425679 4764 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"6859bb282db6608def0d7983e715cf3f5f5f454bf896bb21dca124a075e5701d"} Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.425690 4764 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"20f0f31508039b62436cf5b317419ffc5ca18d03050bfa830060f40e12684519"} Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.425701 4764 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"21feffb4bef29e65989455ec8c5a040a53de90f01ede1a9f1848a67f64a35d29"} Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.425710 4764 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9713dea51462b7a0ce8cb68a64d8707b76dfca5f8c770cac3b54339d538dec01"} Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.425721 4764 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"8ceb1c19d0ff9c03c8790b54d5dc48bafd77f6c289fc6225a88d6bfb8e04a8eb"} Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.425732 4764 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9ad4c0df0c830a9ef102a630db33f055925c0b9b1734a792be013f8e261523c4"} Mar 09 13:33:59 crc 
kubenswrapper[4764]: I0309 13:33:59.425742 4764 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"cd65db46dfb71ce3080adad69890c363fb149c5bf12c903e66a212a28888ae66"} Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.425757 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7kggv" event={"ID":"b8ccb4f5-550a-41b2-b39d-201cdd5d902a","Type":"ContainerDied","Data":"8ceb1c19d0ff9c03c8790b54d5dc48bafd77f6c289fc6225a88d6bfb8e04a8eb"} Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.425774 4764 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"00007c279b62a4251162cd96ec8fdba9360c1525e9c8d4622efd50b129e51d29"} Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.425786 4764 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9162f20dc3577683eb7f764a353d0245d4748a1f3cdc3d3886dbea7e03e9714f"} Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.425475 4764 generic.go:334] "Generic (PLEG): container finished" podID="b8ccb4f5-550a-41b2-b39d-201cdd5d902a" containerID="8ceb1c19d0ff9c03c8790b54d5dc48bafd77f6c289fc6225a88d6bfb8e04a8eb" exitCode=143 Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.425812 4764 scope.go:117] "RemoveContainer" containerID="00007c279b62a4251162cd96ec8fdba9360c1525e9c8d4622efd50b129e51d29" Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.425818 4764 generic.go:334] "Generic (PLEG): container finished" podID="b8ccb4f5-550a-41b2-b39d-201cdd5d902a" containerID="9ad4c0df0c830a9ef102a630db33f055925c0b9b1734a792be013f8e261523c4" exitCode=143 Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.425798 4764 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"df799033cab3d457eef490d885e4440024ba2c7daa430d489ee9907d908d1017"} Mar 09 
13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.425864 4764 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"6859bb282db6608def0d7983e715cf3f5f5f454bf896bb21dca124a075e5701d"} Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.425871 4764 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"20f0f31508039b62436cf5b317419ffc5ca18d03050bfa830060f40e12684519"} Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.425878 4764 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"21feffb4bef29e65989455ec8c5a040a53de90f01ede1a9f1848a67f64a35d29"} Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.425883 4764 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9713dea51462b7a0ce8cb68a64d8707b76dfca5f8c770cac3b54339d538dec01"} Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.425888 4764 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"8ceb1c19d0ff9c03c8790b54d5dc48bafd77f6c289fc6225a88d6bfb8e04a8eb"} Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.425894 4764 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9ad4c0df0c830a9ef102a630db33f055925c0b9b1734a792be013f8e261523c4"} Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.425899 4764 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"cd65db46dfb71ce3080adad69890c363fb149c5bf12c903e66a212a28888ae66"} Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.425908 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7kggv" 
event={"ID":"b8ccb4f5-550a-41b2-b39d-201cdd5d902a","Type":"ContainerDied","Data":"9ad4c0df0c830a9ef102a630db33f055925c0b9b1734a792be013f8e261523c4"} Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.425920 4764 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"00007c279b62a4251162cd96ec8fdba9360c1525e9c8d4622efd50b129e51d29"} Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.425927 4764 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9162f20dc3577683eb7f764a353d0245d4748a1f3cdc3d3886dbea7e03e9714f"} Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.425933 4764 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"df799033cab3d457eef490d885e4440024ba2c7daa430d489ee9907d908d1017"} Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.425939 4764 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"6859bb282db6608def0d7983e715cf3f5f5f454bf896bb21dca124a075e5701d"} Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.425945 4764 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"20f0f31508039b62436cf5b317419ffc5ca18d03050bfa830060f40e12684519"} Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.425951 4764 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"21feffb4bef29e65989455ec8c5a040a53de90f01ede1a9f1848a67f64a35d29"} Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.425957 4764 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9713dea51462b7a0ce8cb68a64d8707b76dfca5f8c770cac3b54339d538dec01"} Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.425963 4764 pod_container_deletor.go:114] "Failed 
to issue the request to remove container" containerID={"Type":"cri-o","ID":"8ceb1c19d0ff9c03c8790b54d5dc48bafd77f6c289fc6225a88d6bfb8e04a8eb"} Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.425969 4764 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9ad4c0df0c830a9ef102a630db33f055925c0b9b1734a792be013f8e261523c4"} Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.425982 4764 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"cd65db46dfb71ce3080adad69890c363fb149c5bf12c903e66a212a28888ae66"} Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.425991 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7kggv" event={"ID":"b8ccb4f5-550a-41b2-b39d-201cdd5d902a","Type":"ContainerDied","Data":"65054061a375abf79424d7095dced33c492d7e93ee2f22f668abd5a0c5ced956"} Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.426000 4764 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"00007c279b62a4251162cd96ec8fdba9360c1525e9c8d4622efd50b129e51d29"} Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.426007 4764 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9162f20dc3577683eb7f764a353d0245d4748a1f3cdc3d3886dbea7e03e9714f"} Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.426012 4764 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"df799033cab3d457eef490d885e4440024ba2c7daa430d489ee9907d908d1017"} Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.426018 4764 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"6859bb282db6608def0d7983e715cf3f5f5f454bf896bb21dca124a075e5701d"} Mar 09 13:33:59 crc kubenswrapper[4764]: 
I0309 13:33:59.426023 4764 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"20f0f31508039b62436cf5b317419ffc5ca18d03050bfa830060f40e12684519"} Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.426028 4764 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"21feffb4bef29e65989455ec8c5a040a53de90f01ede1a9f1848a67f64a35d29"} Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.426035 4764 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9713dea51462b7a0ce8cb68a64d8707b76dfca5f8c770cac3b54339d538dec01"} Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.426041 4764 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"8ceb1c19d0ff9c03c8790b54d5dc48bafd77f6c289fc6225a88d6bfb8e04a8eb"} Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.426046 4764 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9ad4c0df0c830a9ef102a630db33f055925c0b9b1734a792be013f8e261523c4"} Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.426052 4764 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"cd65db46dfb71ce3080adad69890c363fb149c5bf12c903e66a212a28888ae66"} Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.429561 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-65sdb" event={"ID":"9ce85b1d-79b3-4669-a169-bfcd058c8931","Type":"ContainerDied","Data":"14662e1e519ca2bde52ecac49d404da06daa59e83344f6bc1a62155769c808e5"} Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.445607 4764 scope.go:117] "RemoveContainer" containerID="9162f20dc3577683eb7f764a353d0245d4748a1f3cdc3d3886dbea7e03e9714f" Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 
13:33:59.429527 4764 generic.go:334] "Generic (PLEG): container finished" podID="9ce85b1d-79b3-4669-a169-bfcd058c8931" containerID="14662e1e519ca2bde52ecac49d404da06daa59e83344f6bc1a62155769c808e5" exitCode=0 Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.445936 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-65sdb" event={"ID":"9ce85b1d-79b3-4669-a169-bfcd058c8931","Type":"ContainerStarted","Data":"3d8a2a2f3b83361d950fc0816e501c19b17d60a681dbf475fb25e36cffe9c59e"} Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.448741 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-zmzm7_202a1f58-ce83-4374-ac48-dc806f7b9d6b/kube-multus/2.log" Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.449615 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-zmzm7_202a1f58-ce83-4374-ac48-dc806f7b9d6b/kube-multus/1.log" Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.449734 4764 generic.go:334] "Generic (PLEG): container finished" podID="202a1f58-ce83-4374-ac48-dc806f7b9d6b" containerID="63894bb3203376238f9013fa9abe0d5e50f67b5675ce93e7ce0b6cdd92b326b3" exitCode=2 Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.449765 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-zmzm7" event={"ID":"202a1f58-ce83-4374-ac48-dc806f7b9d6b","Type":"ContainerDied","Data":"63894bb3203376238f9013fa9abe0d5e50f67b5675ce93e7ce0b6cdd92b326b3"} Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.450369 4764 scope.go:117] "RemoveContainer" containerID="63894bb3203376238f9013fa9abe0d5e50f67b5675ce93e7ce0b6cdd92b326b3" Mar 09 13:33:59 crc kubenswrapper[4764]: E0309 13:33:59.451314 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus 
pod=multus-zmzm7_openshift-multus(202a1f58-ce83-4374-ac48-dc806f7b9d6b)\"" pod="openshift-multus/multus-zmzm7" podUID="202a1f58-ce83-4374-ac48-dc806f7b9d6b" Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.492709 4764 scope.go:117] "RemoveContainer" containerID="df799033cab3d457eef490d885e4440024ba2c7daa430d489ee9907d908d1017" Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.514048 4764 scope.go:117] "RemoveContainer" containerID="6859bb282db6608def0d7983e715cf3f5f5f454bf896bb21dca124a075e5701d" Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.526006 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-7kggv"] Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.532271 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-7kggv"] Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.535834 4764 scope.go:117] "RemoveContainer" containerID="20f0f31508039b62436cf5b317419ffc5ca18d03050bfa830060f40e12684519" Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.558715 4764 scope.go:117] "RemoveContainer" containerID="21feffb4bef29e65989455ec8c5a040a53de90f01ede1a9f1848a67f64a35d29" Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.566851 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b8ccb4f5-550a-41b2-b39d-201cdd5d902a" path="/var/lib/kubelet/pods/b8ccb4f5-550a-41b2-b39d-201cdd5d902a/volumes" Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.575534 4764 scope.go:117] "RemoveContainer" containerID="9713dea51462b7a0ce8cb68a64d8707b76dfca5f8c770cac3b54339d538dec01" Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.590093 4764 scope.go:117] "RemoveContainer" containerID="8ceb1c19d0ff9c03c8790b54d5dc48bafd77f6c289fc6225a88d6bfb8e04a8eb" Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.606021 4764 scope.go:117] "RemoveContainer" containerID="9ad4c0df0c830a9ef102a630db33f055925c0b9b1734a792be013f8e261523c4" 
Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.633468 4764 scope.go:117] "RemoveContainer" containerID="cd65db46dfb71ce3080adad69890c363fb149c5bf12c903e66a212a28888ae66" Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.675802 4764 scope.go:117] "RemoveContainer" containerID="00007c279b62a4251162cd96ec8fdba9360c1525e9c8d4622efd50b129e51d29" Mar 09 13:33:59 crc kubenswrapper[4764]: E0309 13:33:59.676302 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"00007c279b62a4251162cd96ec8fdba9360c1525e9c8d4622efd50b129e51d29\": container with ID starting with 00007c279b62a4251162cd96ec8fdba9360c1525e9c8d4622efd50b129e51d29 not found: ID does not exist" containerID="00007c279b62a4251162cd96ec8fdba9360c1525e9c8d4622efd50b129e51d29" Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.676333 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"00007c279b62a4251162cd96ec8fdba9360c1525e9c8d4622efd50b129e51d29"} err="failed to get container status \"00007c279b62a4251162cd96ec8fdba9360c1525e9c8d4622efd50b129e51d29\": rpc error: code = NotFound desc = could not find container \"00007c279b62a4251162cd96ec8fdba9360c1525e9c8d4622efd50b129e51d29\": container with ID starting with 00007c279b62a4251162cd96ec8fdba9360c1525e9c8d4622efd50b129e51d29 not found: ID does not exist" Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.676355 4764 scope.go:117] "RemoveContainer" containerID="9162f20dc3577683eb7f764a353d0245d4748a1f3cdc3d3886dbea7e03e9714f" Mar 09 13:33:59 crc kubenswrapper[4764]: E0309 13:33:59.676760 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9162f20dc3577683eb7f764a353d0245d4748a1f3cdc3d3886dbea7e03e9714f\": container with ID starting with 9162f20dc3577683eb7f764a353d0245d4748a1f3cdc3d3886dbea7e03e9714f not found: ID does not exist" 
containerID="9162f20dc3577683eb7f764a353d0245d4748a1f3cdc3d3886dbea7e03e9714f" Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.676781 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9162f20dc3577683eb7f764a353d0245d4748a1f3cdc3d3886dbea7e03e9714f"} err="failed to get container status \"9162f20dc3577683eb7f764a353d0245d4748a1f3cdc3d3886dbea7e03e9714f\": rpc error: code = NotFound desc = could not find container \"9162f20dc3577683eb7f764a353d0245d4748a1f3cdc3d3886dbea7e03e9714f\": container with ID starting with 9162f20dc3577683eb7f764a353d0245d4748a1f3cdc3d3886dbea7e03e9714f not found: ID does not exist" Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.676794 4764 scope.go:117] "RemoveContainer" containerID="df799033cab3d457eef490d885e4440024ba2c7daa430d489ee9907d908d1017" Mar 09 13:33:59 crc kubenswrapper[4764]: E0309 13:33:59.677061 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"df799033cab3d457eef490d885e4440024ba2c7daa430d489ee9907d908d1017\": container with ID starting with df799033cab3d457eef490d885e4440024ba2c7daa430d489ee9907d908d1017 not found: ID does not exist" containerID="df799033cab3d457eef490d885e4440024ba2c7daa430d489ee9907d908d1017" Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.677082 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"df799033cab3d457eef490d885e4440024ba2c7daa430d489ee9907d908d1017"} err="failed to get container status \"df799033cab3d457eef490d885e4440024ba2c7daa430d489ee9907d908d1017\": rpc error: code = NotFound desc = could not find container \"df799033cab3d457eef490d885e4440024ba2c7daa430d489ee9907d908d1017\": container with ID starting with df799033cab3d457eef490d885e4440024ba2c7daa430d489ee9907d908d1017 not found: ID does not exist" Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.677096 4764 scope.go:117] 
"RemoveContainer" containerID="6859bb282db6608def0d7983e715cf3f5f5f454bf896bb21dca124a075e5701d" Mar 09 13:33:59 crc kubenswrapper[4764]: E0309 13:33:59.677450 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6859bb282db6608def0d7983e715cf3f5f5f454bf896bb21dca124a075e5701d\": container with ID starting with 6859bb282db6608def0d7983e715cf3f5f5f454bf896bb21dca124a075e5701d not found: ID does not exist" containerID="6859bb282db6608def0d7983e715cf3f5f5f454bf896bb21dca124a075e5701d" Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.677468 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6859bb282db6608def0d7983e715cf3f5f5f454bf896bb21dca124a075e5701d"} err="failed to get container status \"6859bb282db6608def0d7983e715cf3f5f5f454bf896bb21dca124a075e5701d\": rpc error: code = NotFound desc = could not find container \"6859bb282db6608def0d7983e715cf3f5f5f454bf896bb21dca124a075e5701d\": container with ID starting with 6859bb282db6608def0d7983e715cf3f5f5f454bf896bb21dca124a075e5701d not found: ID does not exist" Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.677479 4764 scope.go:117] "RemoveContainer" containerID="20f0f31508039b62436cf5b317419ffc5ca18d03050bfa830060f40e12684519" Mar 09 13:33:59 crc kubenswrapper[4764]: E0309 13:33:59.677835 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"20f0f31508039b62436cf5b317419ffc5ca18d03050bfa830060f40e12684519\": container with ID starting with 20f0f31508039b62436cf5b317419ffc5ca18d03050bfa830060f40e12684519 not found: ID does not exist" containerID="20f0f31508039b62436cf5b317419ffc5ca18d03050bfa830060f40e12684519" Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.677855 4764 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"20f0f31508039b62436cf5b317419ffc5ca18d03050bfa830060f40e12684519"} err="failed to get container status \"20f0f31508039b62436cf5b317419ffc5ca18d03050bfa830060f40e12684519\": rpc error: code = NotFound desc = could not find container \"20f0f31508039b62436cf5b317419ffc5ca18d03050bfa830060f40e12684519\": container with ID starting with 20f0f31508039b62436cf5b317419ffc5ca18d03050bfa830060f40e12684519 not found: ID does not exist" Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.677868 4764 scope.go:117] "RemoveContainer" containerID="21feffb4bef29e65989455ec8c5a040a53de90f01ede1a9f1848a67f64a35d29" Mar 09 13:33:59 crc kubenswrapper[4764]: E0309 13:33:59.678104 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"21feffb4bef29e65989455ec8c5a040a53de90f01ede1a9f1848a67f64a35d29\": container with ID starting with 21feffb4bef29e65989455ec8c5a040a53de90f01ede1a9f1848a67f64a35d29 not found: ID does not exist" containerID="21feffb4bef29e65989455ec8c5a040a53de90f01ede1a9f1848a67f64a35d29" Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.678120 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"21feffb4bef29e65989455ec8c5a040a53de90f01ede1a9f1848a67f64a35d29"} err="failed to get container status \"21feffb4bef29e65989455ec8c5a040a53de90f01ede1a9f1848a67f64a35d29\": rpc error: code = NotFound desc = could not find container \"21feffb4bef29e65989455ec8c5a040a53de90f01ede1a9f1848a67f64a35d29\": container with ID starting with 21feffb4bef29e65989455ec8c5a040a53de90f01ede1a9f1848a67f64a35d29 not found: ID does not exist" Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.678133 4764 scope.go:117] "RemoveContainer" containerID="9713dea51462b7a0ce8cb68a64d8707b76dfca5f8c770cac3b54339d538dec01" Mar 09 13:33:59 crc kubenswrapper[4764]: E0309 13:33:59.678403 4764 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"9713dea51462b7a0ce8cb68a64d8707b76dfca5f8c770cac3b54339d538dec01\": container with ID starting with 9713dea51462b7a0ce8cb68a64d8707b76dfca5f8c770cac3b54339d538dec01 not found: ID does not exist" containerID="9713dea51462b7a0ce8cb68a64d8707b76dfca5f8c770cac3b54339d538dec01" Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.678429 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9713dea51462b7a0ce8cb68a64d8707b76dfca5f8c770cac3b54339d538dec01"} err="failed to get container status \"9713dea51462b7a0ce8cb68a64d8707b76dfca5f8c770cac3b54339d538dec01\": rpc error: code = NotFound desc = could not find container \"9713dea51462b7a0ce8cb68a64d8707b76dfca5f8c770cac3b54339d538dec01\": container with ID starting with 9713dea51462b7a0ce8cb68a64d8707b76dfca5f8c770cac3b54339d538dec01 not found: ID does not exist" Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.678443 4764 scope.go:117] "RemoveContainer" containerID="8ceb1c19d0ff9c03c8790b54d5dc48bafd77f6c289fc6225a88d6bfb8e04a8eb" Mar 09 13:33:59 crc kubenswrapper[4764]: E0309 13:33:59.678677 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8ceb1c19d0ff9c03c8790b54d5dc48bafd77f6c289fc6225a88d6bfb8e04a8eb\": container with ID starting with 8ceb1c19d0ff9c03c8790b54d5dc48bafd77f6c289fc6225a88d6bfb8e04a8eb not found: ID does not exist" containerID="8ceb1c19d0ff9c03c8790b54d5dc48bafd77f6c289fc6225a88d6bfb8e04a8eb" Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.678697 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8ceb1c19d0ff9c03c8790b54d5dc48bafd77f6c289fc6225a88d6bfb8e04a8eb"} err="failed to get container status \"8ceb1c19d0ff9c03c8790b54d5dc48bafd77f6c289fc6225a88d6bfb8e04a8eb\": rpc error: code = NotFound desc = could not find container 
\"8ceb1c19d0ff9c03c8790b54d5dc48bafd77f6c289fc6225a88d6bfb8e04a8eb\": container with ID starting with 8ceb1c19d0ff9c03c8790b54d5dc48bafd77f6c289fc6225a88d6bfb8e04a8eb not found: ID does not exist"
Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.678711 4764 scope.go:117] "RemoveContainer" containerID="9ad4c0df0c830a9ef102a630db33f055925c0b9b1734a792be013f8e261523c4"
Mar 09 13:33:59 crc kubenswrapper[4764]: E0309 13:33:59.678927 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9ad4c0df0c830a9ef102a630db33f055925c0b9b1734a792be013f8e261523c4\": container with ID starting with 9ad4c0df0c830a9ef102a630db33f055925c0b9b1734a792be013f8e261523c4 not found: ID does not exist" containerID="9ad4c0df0c830a9ef102a630db33f055925c0b9b1734a792be013f8e261523c4"
Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.678946 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9ad4c0df0c830a9ef102a630db33f055925c0b9b1734a792be013f8e261523c4"} err="failed to get container status \"9ad4c0df0c830a9ef102a630db33f055925c0b9b1734a792be013f8e261523c4\": rpc error: code = NotFound desc = could not find container \"9ad4c0df0c830a9ef102a630db33f055925c0b9b1734a792be013f8e261523c4\": container with ID starting with 9ad4c0df0c830a9ef102a630db33f055925c0b9b1734a792be013f8e261523c4 not found: ID does not exist"
Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.678959 4764 scope.go:117] "RemoveContainer" containerID="cd65db46dfb71ce3080adad69890c363fb149c5bf12c903e66a212a28888ae66"
Mar 09 13:33:59 crc kubenswrapper[4764]: E0309 13:33:59.679251 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cd65db46dfb71ce3080adad69890c363fb149c5bf12c903e66a212a28888ae66\": container with ID starting with cd65db46dfb71ce3080adad69890c363fb149c5bf12c903e66a212a28888ae66 not found: ID does not exist" containerID="cd65db46dfb71ce3080adad69890c363fb149c5bf12c903e66a212a28888ae66"
Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.679269 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cd65db46dfb71ce3080adad69890c363fb149c5bf12c903e66a212a28888ae66"} err="failed to get container status \"cd65db46dfb71ce3080adad69890c363fb149c5bf12c903e66a212a28888ae66\": rpc error: code = NotFound desc = could not find container \"cd65db46dfb71ce3080adad69890c363fb149c5bf12c903e66a212a28888ae66\": container with ID starting with cd65db46dfb71ce3080adad69890c363fb149c5bf12c903e66a212a28888ae66 not found: ID does not exist"
Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.679281 4764 scope.go:117] "RemoveContainer" containerID="00007c279b62a4251162cd96ec8fdba9360c1525e9c8d4622efd50b129e51d29"
Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.679479 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"00007c279b62a4251162cd96ec8fdba9360c1525e9c8d4622efd50b129e51d29"} err="failed to get container status \"00007c279b62a4251162cd96ec8fdba9360c1525e9c8d4622efd50b129e51d29\": rpc error: code = NotFound desc = could not find container \"00007c279b62a4251162cd96ec8fdba9360c1525e9c8d4622efd50b129e51d29\": container with ID starting with 00007c279b62a4251162cd96ec8fdba9360c1525e9c8d4622efd50b129e51d29 not found: ID does not exist"
Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.679495 4764 scope.go:117] "RemoveContainer" containerID="9162f20dc3577683eb7f764a353d0245d4748a1f3cdc3d3886dbea7e03e9714f"
Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.679742 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9162f20dc3577683eb7f764a353d0245d4748a1f3cdc3d3886dbea7e03e9714f"} err="failed to get container status \"9162f20dc3577683eb7f764a353d0245d4748a1f3cdc3d3886dbea7e03e9714f\": rpc error: code = NotFound desc = could not find container \"9162f20dc3577683eb7f764a353d0245d4748a1f3cdc3d3886dbea7e03e9714f\": container with ID starting with 9162f20dc3577683eb7f764a353d0245d4748a1f3cdc3d3886dbea7e03e9714f not found: ID does not exist"
Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.679759 4764 scope.go:117] "RemoveContainer" containerID="df799033cab3d457eef490d885e4440024ba2c7daa430d489ee9907d908d1017"
Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.679966 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"df799033cab3d457eef490d885e4440024ba2c7daa430d489ee9907d908d1017"} err="failed to get container status \"df799033cab3d457eef490d885e4440024ba2c7daa430d489ee9907d908d1017\": rpc error: code = NotFound desc = could not find container \"df799033cab3d457eef490d885e4440024ba2c7daa430d489ee9907d908d1017\": container with ID starting with df799033cab3d457eef490d885e4440024ba2c7daa430d489ee9907d908d1017 not found: ID does not exist"
Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.679983 4764 scope.go:117] "RemoveContainer" containerID="6859bb282db6608def0d7983e715cf3f5f5f454bf896bb21dca124a075e5701d"
Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.680234 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6859bb282db6608def0d7983e715cf3f5f5f454bf896bb21dca124a075e5701d"} err="failed to get container status \"6859bb282db6608def0d7983e715cf3f5f5f454bf896bb21dca124a075e5701d\": rpc error: code = NotFound desc = could not find container \"6859bb282db6608def0d7983e715cf3f5f5f454bf896bb21dca124a075e5701d\": container with ID starting with 6859bb282db6608def0d7983e715cf3f5f5f454bf896bb21dca124a075e5701d not found: ID does not exist"
Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.680253 4764 scope.go:117] "RemoveContainer" containerID="20f0f31508039b62436cf5b317419ffc5ca18d03050bfa830060f40e12684519"
Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.681165 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"20f0f31508039b62436cf5b317419ffc5ca18d03050bfa830060f40e12684519"} err="failed to get container status \"20f0f31508039b62436cf5b317419ffc5ca18d03050bfa830060f40e12684519\": rpc error: code = NotFound desc = could not find container \"20f0f31508039b62436cf5b317419ffc5ca18d03050bfa830060f40e12684519\": container with ID starting with 20f0f31508039b62436cf5b317419ffc5ca18d03050bfa830060f40e12684519 not found: ID does not exist"
Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.681190 4764 scope.go:117] "RemoveContainer" containerID="21feffb4bef29e65989455ec8c5a040a53de90f01ede1a9f1848a67f64a35d29"
Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.681482 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"21feffb4bef29e65989455ec8c5a040a53de90f01ede1a9f1848a67f64a35d29"} err="failed to get container status \"21feffb4bef29e65989455ec8c5a040a53de90f01ede1a9f1848a67f64a35d29\": rpc error: code = NotFound desc = could not find container \"21feffb4bef29e65989455ec8c5a040a53de90f01ede1a9f1848a67f64a35d29\": container with ID starting with 21feffb4bef29e65989455ec8c5a040a53de90f01ede1a9f1848a67f64a35d29 not found: ID does not exist"
Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.681501 4764 scope.go:117] "RemoveContainer" containerID="9713dea51462b7a0ce8cb68a64d8707b76dfca5f8c770cac3b54339d538dec01"
Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.681854 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9713dea51462b7a0ce8cb68a64d8707b76dfca5f8c770cac3b54339d538dec01"} err="failed to get container status \"9713dea51462b7a0ce8cb68a64d8707b76dfca5f8c770cac3b54339d538dec01\": rpc error: code = NotFound desc = could not find container \"9713dea51462b7a0ce8cb68a64d8707b76dfca5f8c770cac3b54339d538dec01\": container with ID starting with 9713dea51462b7a0ce8cb68a64d8707b76dfca5f8c770cac3b54339d538dec01 not found: ID does not exist"
Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.681874 4764 scope.go:117] "RemoveContainer" containerID="8ceb1c19d0ff9c03c8790b54d5dc48bafd77f6c289fc6225a88d6bfb8e04a8eb"
Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.682440 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8ceb1c19d0ff9c03c8790b54d5dc48bafd77f6c289fc6225a88d6bfb8e04a8eb"} err="failed to get container status \"8ceb1c19d0ff9c03c8790b54d5dc48bafd77f6c289fc6225a88d6bfb8e04a8eb\": rpc error: code = NotFound desc = could not find container \"8ceb1c19d0ff9c03c8790b54d5dc48bafd77f6c289fc6225a88d6bfb8e04a8eb\": container with ID starting with 8ceb1c19d0ff9c03c8790b54d5dc48bafd77f6c289fc6225a88d6bfb8e04a8eb not found: ID does not exist"
Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.682458 4764 scope.go:117] "RemoveContainer" containerID="9ad4c0df0c830a9ef102a630db33f055925c0b9b1734a792be013f8e261523c4"
Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.682690 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9ad4c0df0c830a9ef102a630db33f055925c0b9b1734a792be013f8e261523c4"} err="failed to get container status \"9ad4c0df0c830a9ef102a630db33f055925c0b9b1734a792be013f8e261523c4\": rpc error: code = NotFound desc = could not find container \"9ad4c0df0c830a9ef102a630db33f055925c0b9b1734a792be013f8e261523c4\": container with ID starting with 9ad4c0df0c830a9ef102a630db33f055925c0b9b1734a792be013f8e261523c4 not found: ID does not exist"
Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.682712 4764 scope.go:117] "RemoveContainer" containerID="cd65db46dfb71ce3080adad69890c363fb149c5bf12c903e66a212a28888ae66"
Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.683280 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cd65db46dfb71ce3080adad69890c363fb149c5bf12c903e66a212a28888ae66"} err="failed to get container status \"cd65db46dfb71ce3080adad69890c363fb149c5bf12c903e66a212a28888ae66\": rpc error: code = NotFound desc = could not find container \"cd65db46dfb71ce3080adad69890c363fb149c5bf12c903e66a212a28888ae66\": container with ID starting with cd65db46dfb71ce3080adad69890c363fb149c5bf12c903e66a212a28888ae66 not found: ID does not exist"
Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.683325 4764 scope.go:117] "RemoveContainer" containerID="00007c279b62a4251162cd96ec8fdba9360c1525e9c8d4622efd50b129e51d29"
Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.683704 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"00007c279b62a4251162cd96ec8fdba9360c1525e9c8d4622efd50b129e51d29"} err="failed to get container status \"00007c279b62a4251162cd96ec8fdba9360c1525e9c8d4622efd50b129e51d29\": rpc error: code = NotFound desc = could not find container \"00007c279b62a4251162cd96ec8fdba9360c1525e9c8d4622efd50b129e51d29\": container with ID starting with 00007c279b62a4251162cd96ec8fdba9360c1525e9c8d4622efd50b129e51d29 not found: ID does not exist"
Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.683723 4764 scope.go:117] "RemoveContainer" containerID="9162f20dc3577683eb7f764a353d0245d4748a1f3cdc3d3886dbea7e03e9714f"
Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.683922 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9162f20dc3577683eb7f764a353d0245d4748a1f3cdc3d3886dbea7e03e9714f"} err="failed to get container status \"9162f20dc3577683eb7f764a353d0245d4748a1f3cdc3d3886dbea7e03e9714f\": rpc error: code = NotFound desc = could not find container \"9162f20dc3577683eb7f764a353d0245d4748a1f3cdc3d3886dbea7e03e9714f\": container with ID starting with 9162f20dc3577683eb7f764a353d0245d4748a1f3cdc3d3886dbea7e03e9714f not found: ID does not exist"
Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.683936 4764 scope.go:117] "RemoveContainer" containerID="df799033cab3d457eef490d885e4440024ba2c7daa430d489ee9907d908d1017"
Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.684140 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"df799033cab3d457eef490d885e4440024ba2c7daa430d489ee9907d908d1017"} err="failed to get container status \"df799033cab3d457eef490d885e4440024ba2c7daa430d489ee9907d908d1017\": rpc error: code = NotFound desc = could not find container \"df799033cab3d457eef490d885e4440024ba2c7daa430d489ee9907d908d1017\": container with ID starting with df799033cab3d457eef490d885e4440024ba2c7daa430d489ee9907d908d1017 not found: ID does not exist"
Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.684151 4764 scope.go:117] "RemoveContainer" containerID="6859bb282db6608def0d7983e715cf3f5f5f454bf896bb21dca124a075e5701d"
Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.684497 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6859bb282db6608def0d7983e715cf3f5f5f454bf896bb21dca124a075e5701d"} err="failed to get container status \"6859bb282db6608def0d7983e715cf3f5f5f454bf896bb21dca124a075e5701d\": rpc error: code = NotFound desc = could not find container \"6859bb282db6608def0d7983e715cf3f5f5f454bf896bb21dca124a075e5701d\": container with ID starting with 6859bb282db6608def0d7983e715cf3f5f5f454bf896bb21dca124a075e5701d not found: ID does not exist"
Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.684512 4764 scope.go:117] "RemoveContainer" containerID="20f0f31508039b62436cf5b317419ffc5ca18d03050bfa830060f40e12684519"
Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.684809 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"20f0f31508039b62436cf5b317419ffc5ca18d03050bfa830060f40e12684519"} err="failed to get container status \"20f0f31508039b62436cf5b317419ffc5ca18d03050bfa830060f40e12684519\": rpc error: code = NotFound desc = could not find container \"20f0f31508039b62436cf5b317419ffc5ca18d03050bfa830060f40e12684519\": container with ID starting with 20f0f31508039b62436cf5b317419ffc5ca18d03050bfa830060f40e12684519 not found: ID does not exist"
Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.684826 4764 scope.go:117] "RemoveContainer" containerID="21feffb4bef29e65989455ec8c5a040a53de90f01ede1a9f1848a67f64a35d29"
Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.685346 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"21feffb4bef29e65989455ec8c5a040a53de90f01ede1a9f1848a67f64a35d29"} err="failed to get container status \"21feffb4bef29e65989455ec8c5a040a53de90f01ede1a9f1848a67f64a35d29\": rpc error: code = NotFound desc = could not find container \"21feffb4bef29e65989455ec8c5a040a53de90f01ede1a9f1848a67f64a35d29\": container with ID starting with 21feffb4bef29e65989455ec8c5a040a53de90f01ede1a9f1848a67f64a35d29 not found: ID does not exist"
Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.685373 4764 scope.go:117] "RemoveContainer" containerID="9713dea51462b7a0ce8cb68a64d8707b76dfca5f8c770cac3b54339d538dec01"
Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.685689 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9713dea51462b7a0ce8cb68a64d8707b76dfca5f8c770cac3b54339d538dec01"} err="failed to get container status \"9713dea51462b7a0ce8cb68a64d8707b76dfca5f8c770cac3b54339d538dec01\": rpc error: code = NotFound desc = could not find container \"9713dea51462b7a0ce8cb68a64d8707b76dfca5f8c770cac3b54339d538dec01\": container with ID starting with 9713dea51462b7a0ce8cb68a64d8707b76dfca5f8c770cac3b54339d538dec01 not found: ID does not exist"
Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.685714 4764 scope.go:117] "RemoveContainer" containerID="8ceb1c19d0ff9c03c8790b54d5dc48bafd77f6c289fc6225a88d6bfb8e04a8eb"
Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.686005 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8ceb1c19d0ff9c03c8790b54d5dc48bafd77f6c289fc6225a88d6bfb8e04a8eb"} err="failed to get container status \"8ceb1c19d0ff9c03c8790b54d5dc48bafd77f6c289fc6225a88d6bfb8e04a8eb\": rpc error: code = NotFound desc = could not find container \"8ceb1c19d0ff9c03c8790b54d5dc48bafd77f6c289fc6225a88d6bfb8e04a8eb\": container with ID starting with 8ceb1c19d0ff9c03c8790b54d5dc48bafd77f6c289fc6225a88d6bfb8e04a8eb not found: ID does not exist"
Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.686030 4764 scope.go:117] "RemoveContainer" containerID="9ad4c0df0c830a9ef102a630db33f055925c0b9b1734a792be013f8e261523c4"
Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.686219 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9ad4c0df0c830a9ef102a630db33f055925c0b9b1734a792be013f8e261523c4"} err="failed to get container status \"9ad4c0df0c830a9ef102a630db33f055925c0b9b1734a792be013f8e261523c4\": rpc error: code = NotFound desc = could not find container \"9ad4c0df0c830a9ef102a630db33f055925c0b9b1734a792be013f8e261523c4\": container with ID starting with 9ad4c0df0c830a9ef102a630db33f055925c0b9b1734a792be013f8e261523c4 not found: ID does not exist"
Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.686238 4764 scope.go:117] "RemoveContainer" containerID="cd65db46dfb71ce3080adad69890c363fb149c5bf12c903e66a212a28888ae66"
Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.686417 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cd65db46dfb71ce3080adad69890c363fb149c5bf12c903e66a212a28888ae66"} err="failed to get container status \"cd65db46dfb71ce3080adad69890c363fb149c5bf12c903e66a212a28888ae66\": rpc error: code = NotFound desc = could not find container \"cd65db46dfb71ce3080adad69890c363fb149c5bf12c903e66a212a28888ae66\": container with ID starting with cd65db46dfb71ce3080adad69890c363fb149c5bf12c903e66a212a28888ae66 not found: ID does not exist"
Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.686436 4764 scope.go:117] "RemoveContainer" containerID="00007c279b62a4251162cd96ec8fdba9360c1525e9c8d4622efd50b129e51d29"
Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.686693 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"00007c279b62a4251162cd96ec8fdba9360c1525e9c8d4622efd50b129e51d29"} err="failed to get container status \"00007c279b62a4251162cd96ec8fdba9360c1525e9c8d4622efd50b129e51d29\": rpc error: code = NotFound desc = could not find container \"00007c279b62a4251162cd96ec8fdba9360c1525e9c8d4622efd50b129e51d29\": container with ID starting with 00007c279b62a4251162cd96ec8fdba9360c1525e9c8d4622efd50b129e51d29 not found: ID does not exist"
Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.686715 4764 scope.go:117] "RemoveContainer" containerID="9162f20dc3577683eb7f764a353d0245d4748a1f3cdc3d3886dbea7e03e9714f"
Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.687034 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9162f20dc3577683eb7f764a353d0245d4748a1f3cdc3d3886dbea7e03e9714f"} err="failed to get container status \"9162f20dc3577683eb7f764a353d0245d4748a1f3cdc3d3886dbea7e03e9714f\": rpc error: code = NotFound desc = could not find container \"9162f20dc3577683eb7f764a353d0245d4748a1f3cdc3d3886dbea7e03e9714f\": container with ID starting with 9162f20dc3577683eb7f764a353d0245d4748a1f3cdc3d3886dbea7e03e9714f not found: ID does not exist"
Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.687054 4764 scope.go:117] "RemoveContainer" containerID="df799033cab3d457eef490d885e4440024ba2c7daa430d489ee9907d908d1017"
Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.687340 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"df799033cab3d457eef490d885e4440024ba2c7daa430d489ee9907d908d1017"} err="failed to get container status \"df799033cab3d457eef490d885e4440024ba2c7daa430d489ee9907d908d1017\": rpc error: code = NotFound desc = could not find container \"df799033cab3d457eef490d885e4440024ba2c7daa430d489ee9907d908d1017\": container with ID starting with df799033cab3d457eef490d885e4440024ba2c7daa430d489ee9907d908d1017 not found: ID does not exist"
Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.687360 4764 scope.go:117] "RemoveContainer" containerID="6859bb282db6608def0d7983e715cf3f5f5f454bf896bb21dca124a075e5701d"
Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.687525 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6859bb282db6608def0d7983e715cf3f5f5f454bf896bb21dca124a075e5701d"} err="failed to get container status \"6859bb282db6608def0d7983e715cf3f5f5f454bf896bb21dca124a075e5701d\": rpc error: code = NotFound desc = could not find container \"6859bb282db6608def0d7983e715cf3f5f5f454bf896bb21dca124a075e5701d\": container with ID starting with 6859bb282db6608def0d7983e715cf3f5f5f454bf896bb21dca124a075e5701d not found: ID does not exist"
Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.687547 4764 scope.go:117] "RemoveContainer" containerID="20f0f31508039b62436cf5b317419ffc5ca18d03050bfa830060f40e12684519"
Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.687851 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"20f0f31508039b62436cf5b317419ffc5ca18d03050bfa830060f40e12684519"} err="failed to get container status \"20f0f31508039b62436cf5b317419ffc5ca18d03050bfa830060f40e12684519\": rpc error: code = NotFound desc = could not find container \"20f0f31508039b62436cf5b317419ffc5ca18d03050bfa830060f40e12684519\": container with ID starting with 20f0f31508039b62436cf5b317419ffc5ca18d03050bfa830060f40e12684519 not found: ID does not exist"
Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.687879 4764 scope.go:117] "RemoveContainer" containerID="21feffb4bef29e65989455ec8c5a040a53de90f01ede1a9f1848a67f64a35d29"
Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.688158 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"21feffb4bef29e65989455ec8c5a040a53de90f01ede1a9f1848a67f64a35d29"} err="failed to get container status \"21feffb4bef29e65989455ec8c5a040a53de90f01ede1a9f1848a67f64a35d29\": rpc error: code = NotFound desc = could not find container \"21feffb4bef29e65989455ec8c5a040a53de90f01ede1a9f1848a67f64a35d29\": container with ID starting with 21feffb4bef29e65989455ec8c5a040a53de90f01ede1a9f1848a67f64a35d29 not found: ID does not exist"
Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.688199 4764 scope.go:117] "RemoveContainer" containerID="9713dea51462b7a0ce8cb68a64d8707b76dfca5f8c770cac3b54339d538dec01"
Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.688590 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9713dea51462b7a0ce8cb68a64d8707b76dfca5f8c770cac3b54339d538dec01"} err="failed to get container status \"9713dea51462b7a0ce8cb68a64d8707b76dfca5f8c770cac3b54339d538dec01\": rpc error: code = NotFound desc = could not find container \"9713dea51462b7a0ce8cb68a64d8707b76dfca5f8c770cac3b54339d538dec01\": container with ID starting with 9713dea51462b7a0ce8cb68a64d8707b76dfca5f8c770cac3b54339d538dec01 not found: ID does not exist"
Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.688620 4764 scope.go:117] "RemoveContainer" containerID="8ceb1c19d0ff9c03c8790b54d5dc48bafd77f6c289fc6225a88d6bfb8e04a8eb"
Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.688896 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8ceb1c19d0ff9c03c8790b54d5dc48bafd77f6c289fc6225a88d6bfb8e04a8eb"} err="failed to get container status \"8ceb1c19d0ff9c03c8790b54d5dc48bafd77f6c289fc6225a88d6bfb8e04a8eb\": rpc error: code = NotFound desc = could not find container \"8ceb1c19d0ff9c03c8790b54d5dc48bafd77f6c289fc6225a88d6bfb8e04a8eb\": container with ID starting with 8ceb1c19d0ff9c03c8790b54d5dc48bafd77f6c289fc6225a88d6bfb8e04a8eb not found: ID does not exist"
Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.688923 4764 scope.go:117] "RemoveContainer" containerID="9ad4c0df0c830a9ef102a630db33f055925c0b9b1734a792be013f8e261523c4"
Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.689173 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9ad4c0df0c830a9ef102a630db33f055925c0b9b1734a792be013f8e261523c4"} err="failed to get container status \"9ad4c0df0c830a9ef102a630db33f055925c0b9b1734a792be013f8e261523c4\": rpc error: code = NotFound desc = could not find container \"9ad4c0df0c830a9ef102a630db33f055925c0b9b1734a792be013f8e261523c4\": container with ID starting with 9ad4c0df0c830a9ef102a630db33f055925c0b9b1734a792be013f8e261523c4 not found: ID does not exist"
Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.689196 4764 scope.go:117] "RemoveContainer" containerID="cd65db46dfb71ce3080adad69890c363fb149c5bf12c903e66a212a28888ae66"
Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.689363 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cd65db46dfb71ce3080adad69890c363fb149c5bf12c903e66a212a28888ae66"} err="failed to get container status \"cd65db46dfb71ce3080adad69890c363fb149c5bf12c903e66a212a28888ae66\": rpc error: code = NotFound desc = could not find container \"cd65db46dfb71ce3080adad69890c363fb149c5bf12c903e66a212a28888ae66\": container with ID starting with cd65db46dfb71ce3080adad69890c363fb149c5bf12c903e66a212a28888ae66 not found: ID does not exist"
Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.689384 4764 scope.go:117] "RemoveContainer" containerID="00007c279b62a4251162cd96ec8fdba9360c1525e9c8d4622efd50b129e51d29"
Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.689563 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"00007c279b62a4251162cd96ec8fdba9360c1525e9c8d4622efd50b129e51d29"} err="failed to get container status \"00007c279b62a4251162cd96ec8fdba9360c1525e9c8d4622efd50b129e51d29\": rpc error: code = NotFound desc = could not find container \"00007c279b62a4251162cd96ec8fdba9360c1525e9c8d4622efd50b129e51d29\": container with ID starting with 00007c279b62a4251162cd96ec8fdba9360c1525e9c8d4622efd50b129e51d29 not found: ID does not exist"
Mar 09 13:33:59 crc kubenswrapper[4764]: I0309 13:33:59.689580 4764 scope.go:117] "RemoveContainer" containerID="ae16a6074b17ec1df33139f5a8f220d15bea5658fc20ad653e2af7b2427dec6a"
Mar 09 13:34:00 crc kubenswrapper[4764]: I0309 13:34:00.143677 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29551054-n54vp"]
Mar 09 13:34:00 crc kubenswrapper[4764]: I0309 13:34:00.145442 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551054-n54vp"
Mar 09 13:34:00 crc kubenswrapper[4764]: I0309 13:34:00.147143 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-4s7mr"
Mar 09 13:34:00 crc kubenswrapper[4764]: I0309 13:34:00.147766 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 09 13:34:00 crc kubenswrapper[4764]: I0309 13:34:00.147868 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 09 13:34:00 crc kubenswrapper[4764]: I0309 13:34:00.227726 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c5wg8\" (UniqueName: \"kubernetes.io/projected/034371f5-4d6d-4a44-9678-9093ffaf3f9d-kube-api-access-c5wg8\") pod \"auto-csr-approver-29551054-n54vp\" (UID: \"034371f5-4d6d-4a44-9678-9093ffaf3f9d\") " pod="openshift-infra/auto-csr-approver-29551054-n54vp"
Mar 09 13:34:00 crc kubenswrapper[4764]: I0309 13:34:00.328873 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c5wg8\" (UniqueName: \"kubernetes.io/projected/034371f5-4d6d-4a44-9678-9093ffaf3f9d-kube-api-access-c5wg8\") pod \"auto-csr-approver-29551054-n54vp\" (UID: \"034371f5-4d6d-4a44-9678-9093ffaf3f9d\") " pod="openshift-infra/auto-csr-approver-29551054-n54vp"
Mar 09 13:34:00 crc kubenswrapper[4764]: I0309 13:34:00.348863 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c5wg8\" (UniqueName: \"kubernetes.io/projected/034371f5-4d6d-4a44-9678-9093ffaf3f9d-kube-api-access-c5wg8\") pod \"auto-csr-approver-29551054-n54vp\" (UID: \"034371f5-4d6d-4a44-9678-9093ffaf3f9d\") " pod="openshift-infra/auto-csr-approver-29551054-n54vp"
Mar 09 13:34:00 crc kubenswrapper[4764]: I0309 13:34:00.465467 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-65sdb" event={"ID":"9ce85b1d-79b3-4669-a169-bfcd058c8931","Type":"ContainerStarted","Data":"3a848f9e9d70d363afd145136252d0d21b1181fb7c9635668eaf736178838a28"}
Mar 09 13:34:00 crc kubenswrapper[4764]: I0309 13:34:00.465511 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-65sdb" event={"ID":"9ce85b1d-79b3-4669-a169-bfcd058c8931","Type":"ContainerStarted","Data":"48ef6cb01cdd4dbf12a234f8bbe08162d6b5f52d24cd627d574666c158014d27"}
Mar 09 13:34:00 crc kubenswrapper[4764]: I0309 13:34:00.465529 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-65sdb" event={"ID":"9ce85b1d-79b3-4669-a169-bfcd058c8931","Type":"ContainerStarted","Data":"f69f8673226b23344a41861642361b428c6c3e1d43f01287b2dd7b6e001dfe23"}
Mar 09 13:34:00 crc kubenswrapper[4764]: I0309 13:34:00.465546 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-65sdb" event={"ID":"9ce85b1d-79b3-4669-a169-bfcd058c8931","Type":"ContainerStarted","Data":"0d63bb6c85375e48eceb743e37a5a244484ac0d049204cd33b8129a9bd67940d"}
Mar 09 13:34:00 crc kubenswrapper[4764]: I0309 13:34:00.465562 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-65sdb" event={"ID":"9ce85b1d-79b3-4669-a169-bfcd058c8931","Type":"ContainerStarted","Data":"cfdb7889f60234e9433e89a0e499d8835f26ab539859bcf0cf95231ed76a4806"}
Mar 09 13:34:00 crc kubenswrapper[4764]: I0309 13:34:00.465582 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-65sdb" event={"ID":"9ce85b1d-79b3-4669-a169-bfcd058c8931","Type":"ContainerStarted","Data":"5c89cfde8a656ee956f7f0ec25cad509f0fe6c8e2d7060684cadebb71cfa2ac0"}
Mar 09 13:34:00 crc kubenswrapper[4764]: I0309 13:34:00.467378 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-zmzm7_202a1f58-ce83-4374-ac48-dc806f7b9d6b/kube-multus/2.log"
Mar 09 13:34:00 crc kubenswrapper[4764]: I0309 13:34:00.467523 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551054-n54vp"
Mar 09 13:34:00 crc kubenswrapper[4764]: E0309 13:34:00.518483 4764 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_auto-csr-approver-29551054-n54vp_openshift-infra_034371f5-4d6d-4a44-9678-9093ffaf3f9d_0(9371105bcadd648276b8c0628bf26340a2e88cbd213aba3b9c529f1027adc240): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Mar 09 13:34:00 crc kubenswrapper[4764]: E0309 13:34:00.518570 4764 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_auto-csr-approver-29551054-n54vp_openshift-infra_034371f5-4d6d-4a44-9678-9093ffaf3f9d_0(9371105bcadd648276b8c0628bf26340a2e88cbd213aba3b9c529f1027adc240): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-infra/auto-csr-approver-29551054-n54vp"
Mar 09 13:34:00 crc kubenswrapper[4764]: E0309 13:34:00.518599 4764 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_auto-csr-approver-29551054-n54vp_openshift-infra_034371f5-4d6d-4a44-9678-9093ffaf3f9d_0(9371105bcadd648276b8c0628bf26340a2e88cbd213aba3b9c529f1027adc240): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-infra/auto-csr-approver-29551054-n54vp"
Mar 09 13:34:00 crc kubenswrapper[4764]: E0309 13:34:00.518680 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"auto-csr-approver-29551054-n54vp_openshift-infra(034371f5-4d6d-4a44-9678-9093ffaf3f9d)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"auto-csr-approver-29551054-n54vp_openshift-infra(034371f5-4d6d-4a44-9678-9093ffaf3f9d)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_auto-csr-approver-29551054-n54vp_openshift-infra_034371f5-4d6d-4a44-9678-9093ffaf3f9d_0(9371105bcadd648276b8c0628bf26340a2e88cbd213aba3b9c529f1027adc240): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-infra/auto-csr-approver-29551054-n54vp" podUID="034371f5-4d6d-4a44-9678-9093ffaf3f9d"
Mar 09 13:34:02 crc kubenswrapper[4764]: I0309 13:34:02.485263 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-65sdb" event={"ID":"9ce85b1d-79b3-4669-a169-bfcd058c8931","Type":"ContainerStarted","Data":"d6c45db45d998d410c60e72adb261563bb996c6a11ee9e2df9458db6ea6f17de"}
Mar 09 13:34:05 crc kubenswrapper[4764]: I0309 13:34:05.391519 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551054-n54vp"]
Mar 09 13:34:05 crc kubenswrapper[4764]: I0309 13:34:05.392254 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551054-n54vp"
Mar 09 13:34:05 crc kubenswrapper[4764]: I0309 13:34:05.392707 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551054-n54vp"
Mar 09 13:34:05 crc kubenswrapper[4764]: E0309 13:34:05.420729 4764 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_auto-csr-approver-29551054-n54vp_openshift-infra_034371f5-4d6d-4a44-9678-9093ffaf3f9d_0(490a791bba571dc39fa37f985dc6a1cdbff4000817835154f83fb9398601bb34): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Mar 09 13:34:05 crc kubenswrapper[4764]: E0309 13:34:05.420831 4764 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_auto-csr-approver-29551054-n54vp_openshift-infra_034371f5-4d6d-4a44-9678-9093ffaf3f9d_0(490a791bba571dc39fa37f985dc6a1cdbff4000817835154f83fb9398601bb34): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-infra/auto-csr-approver-29551054-n54vp"
Mar 09 13:34:05 crc kubenswrapper[4764]: E0309 13:34:05.420858 4764 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_auto-csr-approver-29551054-n54vp_openshift-infra_034371f5-4d6d-4a44-9678-9093ffaf3f9d_0(490a791bba571dc39fa37f985dc6a1cdbff4000817835154f83fb9398601bb34): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-infra/auto-csr-approver-29551054-n54vp"
Mar 09 13:34:05 crc kubenswrapper[4764]: E0309 13:34:05.420935 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"auto-csr-approver-29551054-n54vp_openshift-infra(034371f5-4d6d-4a44-9678-9093ffaf3f9d)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"auto-csr-approver-29551054-n54vp_openshift-infra(034371f5-4d6d-4a44-9678-9093ffaf3f9d)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_auto-csr-approver-29551054-n54vp_openshift-infra_034371f5-4d6d-4a44-9678-9093ffaf3f9d_0(490a791bba571dc39fa37f985dc6a1cdbff4000817835154f83fb9398601bb34): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-infra/auto-csr-approver-29551054-n54vp" podUID="034371f5-4d6d-4a44-9678-9093ffaf3f9d"
Mar 09 13:34:05 crc kubenswrapper[4764]: I0309 13:34:05.508253 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-65sdb" event={"ID":"9ce85b1d-79b3-4669-a169-bfcd058c8931","Type":"ContainerStarted","Data":"0b696165846fc1f0bdec336ce0be47fd6364eb835170524ef7e828f6e1debe3b"}
Mar 09 13:34:05 crc kubenswrapper[4764]: I0309 13:34:05.508872 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-65sdb"
Mar 09 13:34:05 crc kubenswrapper[4764]: I0309 13:34:05.508966 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-65sdb"
Mar 09 13:34:05 crc kubenswrapper[4764]: I0309 13:34:05.540369 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-65sdb" podStartSLOduration=7.540341433 podStartE2EDuration="7.540341433s" podCreationTimestamp="2026-03-09 13:33:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC"
observedRunningTime="2026-03-09 13:34:05.536562522 +0000 UTC m=+800.786734420" watchObservedRunningTime="2026-03-09 13:34:05.540341433 +0000 UTC m=+800.790513351" Mar 09 13:34:05 crc kubenswrapper[4764]: I0309 13:34:05.547164 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-65sdb" Mar 09 13:34:06 crc kubenswrapper[4764]: I0309 13:34:06.514768 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-65sdb" Mar 09 13:34:06 crc kubenswrapper[4764]: I0309 13:34:06.550699 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-65sdb" Mar 09 13:34:14 crc kubenswrapper[4764]: I0309 13:34:14.559294 4764 scope.go:117] "RemoveContainer" containerID="63894bb3203376238f9013fa9abe0d5e50f67b5675ce93e7ce0b6cdd92b326b3" Mar 09 13:34:14 crc kubenswrapper[4764]: E0309 13:34:14.560154 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-zmzm7_openshift-multus(202a1f58-ce83-4374-ac48-dc806f7b9d6b)\"" pod="openshift-multus/multus-zmzm7" podUID="202a1f58-ce83-4374-ac48-dc806f7b9d6b" Mar 09 13:34:19 crc kubenswrapper[4764]: I0309 13:34:19.940915 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a825xk5v"] Mar 09 13:34:19 crc kubenswrapper[4764]: I0309 13:34:19.942817 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a825xk5v" Mar 09 13:34:19 crc kubenswrapper[4764]: I0309 13:34:19.946038 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Mar 09 13:34:19 crc kubenswrapper[4764]: I0309 13:34:19.953180 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a825xk5v"] Mar 09 13:34:20 crc kubenswrapper[4764]: I0309 13:34:20.049331 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/704ddae7-42eb-4609-b4a3-64d5078c2126-util\") pod \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a825xk5v\" (UID: \"704ddae7-42eb-4609-b4a3-64d5078c2126\") " pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a825xk5v" Mar 09 13:34:20 crc kubenswrapper[4764]: I0309 13:34:20.049407 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/704ddae7-42eb-4609-b4a3-64d5078c2126-bundle\") pod \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a825xk5v\" (UID: \"704ddae7-42eb-4609-b4a3-64d5078c2126\") " pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a825xk5v" Mar 09 13:34:20 crc kubenswrapper[4764]: I0309 13:34:20.049433 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-775rf\" (UniqueName: \"kubernetes.io/projected/704ddae7-42eb-4609-b4a3-64d5078c2126-kube-api-access-775rf\") pod \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a825xk5v\" (UID: \"704ddae7-42eb-4609-b4a3-64d5078c2126\") " pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a825xk5v" Mar 09 13:34:20 crc kubenswrapper[4764]: 
I0309 13:34:20.150719 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/704ddae7-42eb-4609-b4a3-64d5078c2126-bundle\") pod \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a825xk5v\" (UID: \"704ddae7-42eb-4609-b4a3-64d5078c2126\") " pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a825xk5v" Mar 09 13:34:20 crc kubenswrapper[4764]: I0309 13:34:20.150766 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-775rf\" (UniqueName: \"kubernetes.io/projected/704ddae7-42eb-4609-b4a3-64d5078c2126-kube-api-access-775rf\") pod \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a825xk5v\" (UID: \"704ddae7-42eb-4609-b4a3-64d5078c2126\") " pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a825xk5v" Mar 09 13:34:20 crc kubenswrapper[4764]: I0309 13:34:20.150821 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/704ddae7-42eb-4609-b4a3-64d5078c2126-util\") pod \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a825xk5v\" (UID: \"704ddae7-42eb-4609-b4a3-64d5078c2126\") " pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a825xk5v" Mar 09 13:34:20 crc kubenswrapper[4764]: I0309 13:34:20.151243 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/704ddae7-42eb-4609-b4a3-64d5078c2126-util\") pod \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a825xk5v\" (UID: \"704ddae7-42eb-4609-b4a3-64d5078c2126\") " pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a825xk5v" Mar 09 13:34:20 crc kubenswrapper[4764]: I0309 13:34:20.151291 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/704ddae7-42eb-4609-b4a3-64d5078c2126-bundle\") pod \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a825xk5v\" (UID: \"704ddae7-42eb-4609-b4a3-64d5078c2126\") " pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a825xk5v" Mar 09 13:34:20 crc kubenswrapper[4764]: I0309 13:34:20.170470 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-775rf\" (UniqueName: \"kubernetes.io/projected/704ddae7-42eb-4609-b4a3-64d5078c2126-kube-api-access-775rf\") pod \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a825xk5v\" (UID: \"704ddae7-42eb-4609-b4a3-64d5078c2126\") " pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a825xk5v" Mar 09 13:34:20 crc kubenswrapper[4764]: I0309 13:34:20.259529 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a825xk5v" Mar 09 13:34:20 crc kubenswrapper[4764]: E0309 13:34:20.283297 4764 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a825xk5v_openshift-marketplace_704ddae7-42eb-4609-b4a3-64d5078c2126_0(6ea19c3eacc0d52e3b0d6826607088e7cd227b4b59271aefdeacb66597509e63): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 09 13:34:20 crc kubenswrapper[4764]: E0309 13:34:20.283360 4764 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a825xk5v_openshift-marketplace_704ddae7-42eb-4609-b4a3-64d5078c2126_0(6ea19c3eacc0d52e3b0d6826607088e7cd227b4b59271aefdeacb66597509e63): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a825xk5v" Mar 09 13:34:20 crc kubenswrapper[4764]: E0309 13:34:20.283384 4764 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a825xk5v_openshift-marketplace_704ddae7-42eb-4609-b4a3-64d5078c2126_0(6ea19c3eacc0d52e3b0d6826607088e7cd227b4b59271aefdeacb66597509e63): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a825xk5v" Mar 09 13:34:20 crc kubenswrapper[4764]: E0309 13:34:20.283437 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a825xk5v_openshift-marketplace(704ddae7-42eb-4609-b4a3-64d5078c2126)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a825xk5v_openshift-marketplace(704ddae7-42eb-4609-b4a3-64d5078c2126)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a825xk5v_openshift-marketplace_704ddae7-42eb-4609-b4a3-64d5078c2126_0(6ea19c3eacc0d52e3b0d6826607088e7cd227b4b59271aefdeacb66597509e63): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a825xk5v" podUID="704ddae7-42eb-4609-b4a3-64d5078c2126" Mar 09 13:34:20 crc kubenswrapper[4764]: I0309 13:34:20.559058 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551054-n54vp" Mar 09 13:34:20 crc kubenswrapper[4764]: I0309 13:34:20.560111 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551054-n54vp" Mar 09 13:34:20 crc kubenswrapper[4764]: E0309 13:34:20.582450 4764 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_auto-csr-approver-29551054-n54vp_openshift-infra_034371f5-4d6d-4a44-9678-9093ffaf3f9d_0(aed54c1a64f25b2e95214eed47e79e32b233372ce821e9e07390830d4703c06c): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 09 13:34:20 crc kubenswrapper[4764]: E0309 13:34:20.582545 4764 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_auto-csr-approver-29551054-n54vp_openshift-infra_034371f5-4d6d-4a44-9678-9093ffaf3f9d_0(aed54c1a64f25b2e95214eed47e79e32b233372ce821e9e07390830d4703c06c): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-infra/auto-csr-approver-29551054-n54vp" Mar 09 13:34:20 crc kubenswrapper[4764]: E0309 13:34:20.582578 4764 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_auto-csr-approver-29551054-n54vp_openshift-infra_034371f5-4d6d-4a44-9678-9093ffaf3f9d_0(aed54c1a64f25b2e95214eed47e79e32b233372ce821e9e07390830d4703c06c): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-infra/auto-csr-approver-29551054-n54vp" Mar 09 13:34:20 crc kubenswrapper[4764]: E0309 13:34:20.582662 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"auto-csr-approver-29551054-n54vp_openshift-infra(034371f5-4d6d-4a44-9678-9093ffaf3f9d)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"auto-csr-approver-29551054-n54vp_openshift-infra(034371f5-4d6d-4a44-9678-9093ffaf3f9d)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_auto-csr-approver-29551054-n54vp_openshift-infra_034371f5-4d6d-4a44-9678-9093ffaf3f9d_0(aed54c1a64f25b2e95214eed47e79e32b233372ce821e9e07390830d4703c06c): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-infra/auto-csr-approver-29551054-n54vp" podUID="034371f5-4d6d-4a44-9678-9093ffaf3f9d" Mar 09 13:34:20 crc kubenswrapper[4764]: I0309 13:34:20.608821 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a825xk5v" Mar 09 13:34:20 crc kubenswrapper[4764]: I0309 13:34:20.609398 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a825xk5v" Mar 09 13:34:20 crc kubenswrapper[4764]: E0309 13:34:20.630565 4764 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a825xk5v_openshift-marketplace_704ddae7-42eb-4609-b4a3-64d5078c2126_0(9cc09cde10f092620ca58e98e418d3793b0b63299d57f13e1dba229626cd06c2): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 09 13:34:20 crc kubenswrapper[4764]: E0309 13:34:20.630634 4764 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a825xk5v_openshift-marketplace_704ddae7-42eb-4609-b4a3-64d5078c2126_0(9cc09cde10f092620ca58e98e418d3793b0b63299d57f13e1dba229626cd06c2): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a825xk5v" Mar 09 13:34:20 crc kubenswrapper[4764]: E0309 13:34:20.630677 4764 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a825xk5v_openshift-marketplace_704ddae7-42eb-4609-b4a3-64d5078c2126_0(9cc09cde10f092620ca58e98e418d3793b0b63299d57f13e1dba229626cd06c2): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a825xk5v" Mar 09 13:34:20 crc kubenswrapper[4764]: E0309 13:34:20.630727 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a825xk5v_openshift-marketplace(704ddae7-42eb-4609-b4a3-64d5078c2126)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a825xk5v_openshift-marketplace(704ddae7-42eb-4609-b4a3-64d5078c2126)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a825xk5v_openshift-marketplace_704ddae7-42eb-4609-b4a3-64d5078c2126_0(9cc09cde10f092620ca58e98e418d3793b0b63299d57f13e1dba229626cd06c2): no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\"" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a825xk5v" podUID="704ddae7-42eb-4609-b4a3-64d5078c2126" Mar 09 13:34:28 crc kubenswrapper[4764]: I0309 13:34:28.370512 4764 patch_prober.go:28] interesting pod/machine-config-daemon-xxczl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 09 13:34:28 crc kubenswrapper[4764]: I0309 13:34:28.371954 4764 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 09 13:34:28 crc kubenswrapper[4764]: I0309 13:34:28.559301 4764 scope.go:117] "RemoveContainer" containerID="63894bb3203376238f9013fa9abe0d5e50f67b5675ce93e7ce0b6cdd92b326b3" Mar 09 13:34:29 crc kubenswrapper[4764]: I0309 13:34:29.248178 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-65sdb" Mar 09 13:34:29 crc kubenswrapper[4764]: I0309 13:34:29.676968 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-zmzm7_202a1f58-ce83-4374-ac48-dc806f7b9d6b/kube-multus/2.log" Mar 09 13:34:29 crc kubenswrapper[4764]: I0309 13:34:29.677976 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-zmzm7" event={"ID":"202a1f58-ce83-4374-ac48-dc806f7b9d6b","Type":"ContainerStarted","Data":"77ff1aa6c5c0eeb845444895ddf031c60c4d096f2fb0c65b0a27e0b43cd150c0"} Mar 09 13:34:34 crc kubenswrapper[4764]: I0309 13:34:34.559768 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551054-n54vp" Mar 09 13:34:34 crc kubenswrapper[4764]: I0309 13:34:34.559823 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a825xk5v" Mar 09 13:34:34 crc kubenswrapper[4764]: I0309 13:34:34.560498 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551054-n54vp" Mar 09 13:34:34 crc kubenswrapper[4764]: I0309 13:34:34.560772 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a825xk5v" Mar 09 13:34:34 crc kubenswrapper[4764]: I0309 13:34:34.775201 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551054-n54vp"] Mar 09 13:34:34 crc kubenswrapper[4764]: I0309 13:34:34.822375 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a825xk5v"] Mar 09 13:34:34 crc kubenswrapper[4764]: W0309 13:34:34.827434 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod704ddae7_42eb_4609_b4a3_64d5078c2126.slice/crio-1aae20624cf4f8dfb3e9bc0fa86cea95ff0b6ca3747b8493fe018a0d37430bcf WatchSource:0}: Error finding container 1aae20624cf4f8dfb3e9bc0fa86cea95ff0b6ca3747b8493fe018a0d37430bcf: Status 404 returned error can't find the container with id 1aae20624cf4f8dfb3e9bc0fa86cea95ff0b6ca3747b8493fe018a0d37430bcf Mar 09 13:34:34 crc kubenswrapper[4764]: I0309 13:34:34.855080 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a825xk5v" 
event={"ID":"704ddae7-42eb-4609-b4a3-64d5078c2126","Type":"ContainerStarted","Data":"1aae20624cf4f8dfb3e9bc0fa86cea95ff0b6ca3747b8493fe018a0d37430bcf"} Mar 09 13:34:34 crc kubenswrapper[4764]: I0309 13:34:34.856539 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551054-n54vp" event={"ID":"034371f5-4d6d-4a44-9678-9093ffaf3f9d","Type":"ContainerStarted","Data":"d80da71a220cdeff4fcb886a07bef2e0df9a89242a263d781a6886d65252f40a"} Mar 09 13:34:35 crc kubenswrapper[4764]: I0309 13:34:35.866771 4764 generic.go:334] "Generic (PLEG): container finished" podID="704ddae7-42eb-4609-b4a3-64d5078c2126" containerID="7b5418c090975c0574a7d218bf987e53c3ec2e85e4ff9a68edfbfb8766c1af5a" exitCode=0 Mar 09 13:34:35 crc kubenswrapper[4764]: I0309 13:34:35.866845 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a825xk5v" event={"ID":"704ddae7-42eb-4609-b4a3-64d5078c2126","Type":"ContainerDied","Data":"7b5418c090975c0574a7d218bf987e53c3ec2e85e4ff9a68edfbfb8766c1af5a"} Mar 09 13:34:36 crc kubenswrapper[4764]: I0309 13:34:36.878162 4764 generic.go:334] "Generic (PLEG): container finished" podID="034371f5-4d6d-4a44-9678-9093ffaf3f9d" containerID="33a445a6adcd631bbd08283597c203355f22e04e83cf13fd98ba9c1f71c00bd6" exitCode=0 Mar 09 13:34:36 crc kubenswrapper[4764]: I0309 13:34:36.878302 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551054-n54vp" event={"ID":"034371f5-4d6d-4a44-9678-9093ffaf3f9d","Type":"ContainerDied","Data":"33a445a6adcd631bbd08283597c203355f22e04e83cf13fd98ba9c1f71c00bd6"} Mar 09 13:34:38 crc kubenswrapper[4764]: I0309 13:34:38.116550 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551054-n54vp" Mar 09 13:34:38 crc kubenswrapper[4764]: I0309 13:34:38.260331 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c5wg8\" (UniqueName: \"kubernetes.io/projected/034371f5-4d6d-4a44-9678-9093ffaf3f9d-kube-api-access-c5wg8\") pod \"034371f5-4d6d-4a44-9678-9093ffaf3f9d\" (UID: \"034371f5-4d6d-4a44-9678-9093ffaf3f9d\") " Mar 09 13:34:38 crc kubenswrapper[4764]: I0309 13:34:38.268099 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/034371f5-4d6d-4a44-9678-9093ffaf3f9d-kube-api-access-c5wg8" (OuterVolumeSpecName: "kube-api-access-c5wg8") pod "034371f5-4d6d-4a44-9678-9093ffaf3f9d" (UID: "034371f5-4d6d-4a44-9678-9093ffaf3f9d"). InnerVolumeSpecName "kube-api-access-c5wg8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:34:38 crc kubenswrapper[4764]: I0309 13:34:38.363691 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c5wg8\" (UniqueName: \"kubernetes.io/projected/034371f5-4d6d-4a44-9678-9093ffaf3f9d-kube-api-access-c5wg8\") on node \"crc\" DevicePath \"\"" Mar 09 13:34:38 crc kubenswrapper[4764]: I0309 13:34:38.895346 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551054-n54vp" event={"ID":"034371f5-4d6d-4a44-9678-9093ffaf3f9d","Type":"ContainerDied","Data":"d80da71a220cdeff4fcb886a07bef2e0df9a89242a263d781a6886d65252f40a"} Mar 09 13:34:38 crc kubenswrapper[4764]: I0309 13:34:38.895394 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d80da71a220cdeff4fcb886a07bef2e0df9a89242a263d781a6886d65252f40a" Mar 09 13:34:38 crc kubenswrapper[4764]: I0309 13:34:38.895423 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551054-n54vp" Mar 09 13:34:39 crc kubenswrapper[4764]: I0309 13:34:39.181078 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29551048-fgf8g"] Mar 09 13:34:39 crc kubenswrapper[4764]: I0309 13:34:39.184134 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29551048-fgf8g"] Mar 09 13:34:39 crc kubenswrapper[4764]: I0309 13:34:39.577580 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0c3ec5bd-35e8-4d0a-8541-46eab64a0f8f" path="/var/lib/kubelet/pods/0c3ec5bd-35e8-4d0a-8541-46eab64a0f8f/volumes" Mar 09 13:34:41 crc kubenswrapper[4764]: I0309 13:34:41.920482 4764 generic.go:334] "Generic (PLEG): container finished" podID="704ddae7-42eb-4609-b4a3-64d5078c2126" containerID="4d4fb12cd75aa4f4f3ec85ed29ddf07e07feed48b0270424b85035c7f01f3e24" exitCode=0 Mar 09 13:34:41 crc kubenswrapper[4764]: I0309 13:34:41.921083 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a825xk5v" event={"ID":"704ddae7-42eb-4609-b4a3-64d5078c2126","Type":"ContainerDied","Data":"4d4fb12cd75aa4f4f3ec85ed29ddf07e07feed48b0270424b85035c7f01f3e24"} Mar 09 13:34:42 crc kubenswrapper[4764]: I0309 13:34:42.931296 4764 generic.go:334] "Generic (PLEG): container finished" podID="704ddae7-42eb-4609-b4a3-64d5078c2126" containerID="fa6fe455ac1be47605b80c072783a99991320268046d70c8097db7878a15cacb" exitCode=0 Mar 09 13:34:42 crc kubenswrapper[4764]: I0309 13:34:42.931352 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a825xk5v" event={"ID":"704ddae7-42eb-4609-b4a3-64d5078c2126","Type":"ContainerDied","Data":"fa6fe455ac1be47605b80c072783a99991320268046d70c8097db7878a15cacb"} Mar 09 13:34:44 crc kubenswrapper[4764]: I0309 13:34:44.239245 4764 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a825xk5v" Mar 09 13:34:44 crc kubenswrapper[4764]: I0309 13:34:44.264322 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-775rf\" (UniqueName: \"kubernetes.io/projected/704ddae7-42eb-4609-b4a3-64d5078c2126-kube-api-access-775rf\") pod \"704ddae7-42eb-4609-b4a3-64d5078c2126\" (UID: \"704ddae7-42eb-4609-b4a3-64d5078c2126\") " Mar 09 13:34:44 crc kubenswrapper[4764]: I0309 13:34:44.264384 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/704ddae7-42eb-4609-b4a3-64d5078c2126-bundle\") pod \"704ddae7-42eb-4609-b4a3-64d5078c2126\" (UID: \"704ddae7-42eb-4609-b4a3-64d5078c2126\") " Mar 09 13:34:44 crc kubenswrapper[4764]: I0309 13:34:44.264564 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/704ddae7-42eb-4609-b4a3-64d5078c2126-util\") pod \"704ddae7-42eb-4609-b4a3-64d5078c2126\" (UID: \"704ddae7-42eb-4609-b4a3-64d5078c2126\") " Mar 09 13:34:44 crc kubenswrapper[4764]: I0309 13:34:44.265385 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/704ddae7-42eb-4609-b4a3-64d5078c2126-bundle" (OuterVolumeSpecName: "bundle") pod "704ddae7-42eb-4609-b4a3-64d5078c2126" (UID: "704ddae7-42eb-4609-b4a3-64d5078c2126"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 13:34:44 crc kubenswrapper[4764]: I0309 13:34:44.272873 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/704ddae7-42eb-4609-b4a3-64d5078c2126-kube-api-access-775rf" (OuterVolumeSpecName: "kube-api-access-775rf") pod "704ddae7-42eb-4609-b4a3-64d5078c2126" (UID: "704ddae7-42eb-4609-b4a3-64d5078c2126"). 
InnerVolumeSpecName "kube-api-access-775rf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:34:44 crc kubenswrapper[4764]: I0309 13:34:44.276106 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/704ddae7-42eb-4609-b4a3-64d5078c2126-util" (OuterVolumeSpecName: "util") pod "704ddae7-42eb-4609-b4a3-64d5078c2126" (UID: "704ddae7-42eb-4609-b4a3-64d5078c2126"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 13:34:44 crc kubenswrapper[4764]: I0309 13:34:44.366634 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-775rf\" (UniqueName: \"kubernetes.io/projected/704ddae7-42eb-4609-b4a3-64d5078c2126-kube-api-access-775rf\") on node \"crc\" DevicePath \"\"" Mar 09 13:34:44 crc kubenswrapper[4764]: I0309 13:34:44.367121 4764 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/704ddae7-42eb-4609-b4a3-64d5078c2126-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 13:34:44 crc kubenswrapper[4764]: I0309 13:34:44.367137 4764 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/704ddae7-42eb-4609-b4a3-64d5078c2126-util\") on node \"crc\" DevicePath \"\"" Mar 09 13:34:44 crc kubenswrapper[4764]: I0309 13:34:44.950598 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a825xk5v" event={"ID":"704ddae7-42eb-4609-b4a3-64d5078c2126","Type":"ContainerDied","Data":"1aae20624cf4f8dfb3e9bc0fa86cea95ff0b6ca3747b8493fe018a0d37430bcf"} Mar 09 13:34:44 crc kubenswrapper[4764]: I0309 13:34:44.950684 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a825xk5v" Mar 09 13:34:44 crc kubenswrapper[4764]: I0309 13:34:44.950707 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1aae20624cf4f8dfb3e9bc0fa86cea95ff0b6ca3747b8493fe018a0d37430bcf" Mar 09 13:34:49 crc kubenswrapper[4764]: I0309 13:34:49.503538 4764 scope.go:117] "RemoveContainer" containerID="f44561c47745677a2b6e923d3f449a7c01740fc8b6e465eeb90cec0a7d1ebe67" Mar 09 13:34:51 crc kubenswrapper[4764]: I0309 13:34:51.200288 4764 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Mar 09 13:34:51 crc kubenswrapper[4764]: I0309 13:34:51.726608 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-75c5dccd6c-q5kvf"] Mar 09 13:34:51 crc kubenswrapper[4764]: E0309 13:34:51.726974 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="704ddae7-42eb-4609-b4a3-64d5078c2126" containerName="util" Mar 09 13:34:51 crc kubenswrapper[4764]: I0309 13:34:51.727000 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="704ddae7-42eb-4609-b4a3-64d5078c2126" containerName="util" Mar 09 13:34:51 crc kubenswrapper[4764]: E0309 13:34:51.727032 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="704ddae7-42eb-4609-b4a3-64d5078c2126" containerName="extract" Mar 09 13:34:51 crc kubenswrapper[4764]: I0309 13:34:51.727040 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="704ddae7-42eb-4609-b4a3-64d5078c2126" containerName="extract" Mar 09 13:34:51 crc kubenswrapper[4764]: E0309 13:34:51.727055 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="034371f5-4d6d-4a44-9678-9093ffaf3f9d" containerName="oc" Mar 09 13:34:51 crc kubenswrapper[4764]: I0309 13:34:51.727063 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="034371f5-4d6d-4a44-9678-9093ffaf3f9d" 
containerName="oc" Mar 09 13:34:51 crc kubenswrapper[4764]: E0309 13:34:51.727078 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="704ddae7-42eb-4609-b4a3-64d5078c2126" containerName="pull" Mar 09 13:34:51 crc kubenswrapper[4764]: I0309 13:34:51.727089 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="704ddae7-42eb-4609-b4a3-64d5078c2126" containerName="pull" Mar 09 13:34:51 crc kubenswrapper[4764]: I0309 13:34:51.727224 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="034371f5-4d6d-4a44-9678-9093ffaf3f9d" containerName="oc" Mar 09 13:34:51 crc kubenswrapper[4764]: I0309 13:34:51.727236 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="704ddae7-42eb-4609-b4a3-64d5078c2126" containerName="extract" Mar 09 13:34:51 crc kubenswrapper[4764]: I0309 13:34:51.727826 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-75c5dccd6c-q5kvf" Mar 09 13:34:51 crc kubenswrapper[4764]: I0309 13:34:51.729797 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-operator-dockercfg-hmssh" Mar 09 13:34:51 crc kubenswrapper[4764]: I0309 13:34:51.731114 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt" Mar 09 13:34:51 crc kubenswrapper[4764]: I0309 13:34:51.731338 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt" Mar 09 13:34:51 crc kubenswrapper[4764]: I0309 13:34:51.737928 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-75c5dccd6c-q5kvf"] Mar 09 13:34:51 crc kubenswrapper[4764]: I0309 13:34:51.783172 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nhmc4\" (UniqueName: \"kubernetes.io/projected/33e9b814-6368-46c6-aae2-5a3df1839d29-kube-api-access-nhmc4\") pod 
\"nmstate-operator-75c5dccd6c-q5kvf\" (UID: \"33e9b814-6368-46c6-aae2-5a3df1839d29\") " pod="openshift-nmstate/nmstate-operator-75c5dccd6c-q5kvf" Mar 09 13:34:51 crc kubenswrapper[4764]: I0309 13:34:51.885340 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nhmc4\" (UniqueName: \"kubernetes.io/projected/33e9b814-6368-46c6-aae2-5a3df1839d29-kube-api-access-nhmc4\") pod \"nmstate-operator-75c5dccd6c-q5kvf\" (UID: \"33e9b814-6368-46c6-aae2-5a3df1839d29\") " pod="openshift-nmstate/nmstate-operator-75c5dccd6c-q5kvf" Mar 09 13:34:51 crc kubenswrapper[4764]: I0309 13:34:51.906563 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nhmc4\" (UniqueName: \"kubernetes.io/projected/33e9b814-6368-46c6-aae2-5a3df1839d29-kube-api-access-nhmc4\") pod \"nmstate-operator-75c5dccd6c-q5kvf\" (UID: \"33e9b814-6368-46c6-aae2-5a3df1839d29\") " pod="openshift-nmstate/nmstate-operator-75c5dccd6c-q5kvf" Mar 09 13:34:52 crc kubenswrapper[4764]: I0309 13:34:52.046487 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-operator-75c5dccd6c-q5kvf" Mar 09 13:34:52 crc kubenswrapper[4764]: I0309 13:34:52.275034 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-75c5dccd6c-q5kvf"] Mar 09 13:34:53 crc kubenswrapper[4764]: I0309 13:34:53.008207 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-75c5dccd6c-q5kvf" event={"ID":"33e9b814-6368-46c6-aae2-5a3df1839d29","Type":"ContainerStarted","Data":"a9e4a4aa7dba2e5c4d2a5d2a874c3b217272816a0a21b27f82cdc46a0c24caa7"} Mar 09 13:34:55 crc kubenswrapper[4764]: I0309 13:34:55.026667 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-75c5dccd6c-q5kvf" event={"ID":"33e9b814-6368-46c6-aae2-5a3df1839d29","Type":"ContainerStarted","Data":"ba9af486bc1be26b8631581b44cec1c0e3ba996aec7be5a13baa05e7eb699625"} Mar 09 13:34:55 crc kubenswrapper[4764]: I0309 13:34:55.051097 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-operator-75c5dccd6c-q5kvf" podStartSLOduration=1.788189456 podStartE2EDuration="4.05107395s" podCreationTimestamp="2026-03-09 13:34:51 +0000 UTC" firstStartedPulling="2026-03-09 13:34:52.293604296 +0000 UTC m=+847.543776204" lastFinishedPulling="2026-03-09 13:34:54.55648879 +0000 UTC m=+849.806660698" observedRunningTime="2026-03-09 13:34:55.044157164 +0000 UTC m=+850.294329092" watchObservedRunningTime="2026-03-09 13:34:55.05107395 +0000 UTC m=+850.301245858" Mar 09 13:34:58 crc kubenswrapper[4764]: I0309 13:34:58.370791 4764 patch_prober.go:28] interesting pod/machine-config-daemon-xxczl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 09 13:34:58 crc kubenswrapper[4764]: I0309 13:34:58.371966 4764 prober.go:107] "Probe failed" 
probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 09 13:34:58 crc kubenswrapper[4764]: I0309 13:34:58.372062 4764 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" Mar 09 13:34:58 crc kubenswrapper[4764]: I0309 13:34:58.372956 4764 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"3e004d46e664a823c675c177e537cdd2b21dcfc6ff9afcb1b390da1939340817"} pod="openshift-machine-config-operator/machine-config-daemon-xxczl" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 09 13:34:58 crc kubenswrapper[4764]: I0309 13:34:58.373022 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922" containerName="machine-config-daemon" containerID="cri-o://3e004d46e664a823c675c177e537cdd2b21dcfc6ff9afcb1b390da1939340817" gracePeriod=600 Mar 09 13:34:59 crc kubenswrapper[4764]: I0309 13:34:59.053140 4764 generic.go:334] "Generic (PLEG): container finished" podID="6bcdd179-43c2-427c-9fac-7155c122e922" containerID="3e004d46e664a823c675c177e537cdd2b21dcfc6ff9afcb1b390da1939340817" exitCode=0 Mar 09 13:34:59 crc kubenswrapper[4764]: I0309 13:34:59.053609 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" event={"ID":"6bcdd179-43c2-427c-9fac-7155c122e922","Type":"ContainerDied","Data":"3e004d46e664a823c675c177e537cdd2b21dcfc6ff9afcb1b390da1939340817"} Mar 09 13:34:59 crc kubenswrapper[4764]: I0309 13:34:59.053638 4764 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" event={"ID":"6bcdd179-43c2-427c-9fac-7155c122e922","Type":"ContainerStarted","Data":"bd218b04a62035d950083134e237631295493daecba2baa1eef096560f891819"} Mar 09 13:34:59 crc kubenswrapper[4764]: I0309 13:34:59.053678 4764 scope.go:117] "RemoveContainer" containerID="6be6253041773ccdd98bbdfbbb5e4447fbae62fa99ca2320328889b4671f6c14" Mar 09 13:35:00 crc kubenswrapper[4764]: I0309 13:35:00.130400 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-69594cc75-jschs"] Mar 09 13:35:00 crc kubenswrapper[4764]: I0309 13:35:00.132250 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-69594cc75-jschs" Mar 09 13:35:00 crc kubenswrapper[4764]: I0309 13:35:00.133849 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-handler-dockercfg-wbdq9" Mar 09 13:35:00 crc kubenswrapper[4764]: I0309 13:35:00.135494 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-786f45cff4-wv755"] Mar 09 13:35:00 crc kubenswrapper[4764]: I0309 13:35:00.136496 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-786f45cff4-wv755" Mar 09 13:35:00 crc kubenswrapper[4764]: I0309 13:35:00.138104 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook" Mar 09 13:35:00 crc kubenswrapper[4764]: I0309 13:35:00.150480 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-786f45cff4-wv755"] Mar 09 13:35:00 crc kubenswrapper[4764]: I0309 13:35:00.167800 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-sl5hn"] Mar 09 13:35:00 crc kubenswrapper[4764]: I0309 13:35:00.168807 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-handler-sl5hn" Mar 09 13:35:00 crc kubenswrapper[4764]: I0309 13:35:00.191272 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-69594cc75-jschs"] Mar 09 13:35:00 crc kubenswrapper[4764]: I0309 13:35:00.214413 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/6dc5759c-db8c-4025-bc16-a07e4dc6278a-dbus-socket\") pod \"nmstate-handler-sl5hn\" (UID: \"6dc5759c-db8c-4025-bc16-a07e4dc6278a\") " pod="openshift-nmstate/nmstate-handler-sl5hn" Mar 09 13:35:00 crc kubenswrapper[4764]: I0309 13:35:00.214466 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/6dc5759c-db8c-4025-bc16-a07e4dc6278a-nmstate-lock\") pod \"nmstate-handler-sl5hn\" (UID: \"6dc5759c-db8c-4025-bc16-a07e4dc6278a\") " pod="openshift-nmstate/nmstate-handler-sl5hn" Mar 09 13:35:00 crc kubenswrapper[4764]: I0309 13:35:00.214511 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x6t2l\" (UniqueName: \"kubernetes.io/projected/f339e495-f347-45b8-b9da-2cd832ac4300-kube-api-access-x6t2l\") pod \"nmstate-webhook-786f45cff4-wv755\" (UID: \"f339e495-f347-45b8-b9da-2cd832ac4300\") " pod="openshift-nmstate/nmstate-webhook-786f45cff4-wv755" Mar 09 13:35:00 crc kubenswrapper[4764]: I0309 13:35:00.214560 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/6dc5759c-db8c-4025-bc16-a07e4dc6278a-ovs-socket\") pod \"nmstate-handler-sl5hn\" (UID: \"6dc5759c-db8c-4025-bc16-a07e4dc6278a\") " pod="openshift-nmstate/nmstate-handler-sl5hn" Mar 09 13:35:00 crc kubenswrapper[4764]: I0309 13:35:00.214579 4764 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/f339e495-f347-45b8-b9da-2cd832ac4300-tls-key-pair\") pod \"nmstate-webhook-786f45cff4-wv755\" (UID: \"f339e495-f347-45b8-b9da-2cd832ac4300\") " pod="openshift-nmstate/nmstate-webhook-786f45cff4-wv755" Mar 09 13:35:00 crc kubenswrapper[4764]: I0309 13:35:00.214600 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tf2p2\" (UniqueName: \"kubernetes.io/projected/6dc5759c-db8c-4025-bc16-a07e4dc6278a-kube-api-access-tf2p2\") pod \"nmstate-handler-sl5hn\" (UID: \"6dc5759c-db8c-4025-bc16-a07e4dc6278a\") " pod="openshift-nmstate/nmstate-handler-sl5hn" Mar 09 13:35:00 crc kubenswrapper[4764]: I0309 13:35:00.214624 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n8lfj\" (UniqueName: \"kubernetes.io/projected/abca721f-d47f-4e38-ab9e-0832de2c70e6-kube-api-access-n8lfj\") pod \"nmstate-metrics-69594cc75-jschs\" (UID: \"abca721f-d47f-4e38-ab9e-0832de2c70e6\") " pod="openshift-nmstate/nmstate-metrics-69594cc75-jschs" Mar 09 13:35:00 crc kubenswrapper[4764]: I0309 13:35:00.271857 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-qpjz4"] Mar 09 13:35:00 crc kubenswrapper[4764]: I0309 13:35:00.272614 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-qpjz4" Mar 09 13:35:00 crc kubenswrapper[4764]: I0309 13:35:00.275562 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert" Mar 09 13:35:00 crc kubenswrapper[4764]: I0309 13:35:00.275810 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf" Mar 09 13:35:00 crc kubenswrapper[4764]: I0309 13:35:00.275929 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"default-dockercfg-94p9n" Mar 09 13:35:00 crc kubenswrapper[4764]: I0309 13:35:00.291437 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-qpjz4"] Mar 09 13:35:00 crc kubenswrapper[4764]: I0309 13:35:00.315884 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tf2p2\" (UniqueName: \"kubernetes.io/projected/6dc5759c-db8c-4025-bc16-a07e4dc6278a-kube-api-access-tf2p2\") pod \"nmstate-handler-sl5hn\" (UID: \"6dc5759c-db8c-4025-bc16-a07e4dc6278a\") " pod="openshift-nmstate/nmstate-handler-sl5hn" Mar 09 13:35:00 crc kubenswrapper[4764]: I0309 13:35:00.315950 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n8lfj\" (UniqueName: \"kubernetes.io/projected/abca721f-d47f-4e38-ab9e-0832de2c70e6-kube-api-access-n8lfj\") pod \"nmstate-metrics-69594cc75-jschs\" (UID: \"abca721f-d47f-4e38-ab9e-0832de2c70e6\") " pod="openshift-nmstate/nmstate-metrics-69594cc75-jschs" Mar 09 13:35:00 crc kubenswrapper[4764]: I0309 13:35:00.315978 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r6rmd\" (UniqueName: \"kubernetes.io/projected/fc521772-06d5-47ec-85d0-6162bb98af30-kube-api-access-r6rmd\") pod \"nmstate-console-plugin-5dcbbd79cf-qpjz4\" (UID: \"fc521772-06d5-47ec-85d0-6162bb98af30\") " 
pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-qpjz4" Mar 09 13:35:00 crc kubenswrapper[4764]: I0309 13:35:00.316014 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/fc521772-06d5-47ec-85d0-6162bb98af30-plugin-serving-cert\") pod \"nmstate-console-plugin-5dcbbd79cf-qpjz4\" (UID: \"fc521772-06d5-47ec-85d0-6162bb98af30\") " pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-qpjz4" Mar 09 13:35:00 crc kubenswrapper[4764]: I0309 13:35:00.316044 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/fc521772-06d5-47ec-85d0-6162bb98af30-nginx-conf\") pod \"nmstate-console-plugin-5dcbbd79cf-qpjz4\" (UID: \"fc521772-06d5-47ec-85d0-6162bb98af30\") " pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-qpjz4" Mar 09 13:35:00 crc kubenswrapper[4764]: I0309 13:35:00.316560 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/6dc5759c-db8c-4025-bc16-a07e4dc6278a-dbus-socket\") pod \"nmstate-handler-sl5hn\" (UID: \"6dc5759c-db8c-4025-bc16-a07e4dc6278a\") " pod="openshift-nmstate/nmstate-handler-sl5hn" Mar 09 13:35:00 crc kubenswrapper[4764]: I0309 13:35:00.316591 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/6dc5759c-db8c-4025-bc16-a07e4dc6278a-nmstate-lock\") pod \"nmstate-handler-sl5hn\" (UID: \"6dc5759c-db8c-4025-bc16-a07e4dc6278a\") " pod="openshift-nmstate/nmstate-handler-sl5hn" Mar 09 13:35:00 crc kubenswrapper[4764]: I0309 13:35:00.317036 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x6t2l\" (UniqueName: \"kubernetes.io/projected/f339e495-f347-45b8-b9da-2cd832ac4300-kube-api-access-x6t2l\") pod 
\"nmstate-webhook-786f45cff4-wv755\" (UID: \"f339e495-f347-45b8-b9da-2cd832ac4300\") " pod="openshift-nmstate/nmstate-webhook-786f45cff4-wv755" Mar 09 13:35:00 crc kubenswrapper[4764]: I0309 13:35:00.316776 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/6dc5759c-db8c-4025-bc16-a07e4dc6278a-nmstate-lock\") pod \"nmstate-handler-sl5hn\" (UID: \"6dc5759c-db8c-4025-bc16-a07e4dc6278a\") " pod="openshift-nmstate/nmstate-handler-sl5hn" Mar 09 13:35:00 crc kubenswrapper[4764]: I0309 13:35:00.316977 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/6dc5759c-db8c-4025-bc16-a07e4dc6278a-dbus-socket\") pod \"nmstate-handler-sl5hn\" (UID: \"6dc5759c-db8c-4025-bc16-a07e4dc6278a\") " pod="openshift-nmstate/nmstate-handler-sl5hn" Mar 09 13:35:00 crc kubenswrapper[4764]: I0309 13:35:00.317382 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/6dc5759c-db8c-4025-bc16-a07e4dc6278a-ovs-socket\") pod \"nmstate-handler-sl5hn\" (UID: \"6dc5759c-db8c-4025-bc16-a07e4dc6278a\") " pod="openshift-nmstate/nmstate-handler-sl5hn" Mar 09 13:35:00 crc kubenswrapper[4764]: I0309 13:35:00.317448 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/6dc5759c-db8c-4025-bc16-a07e4dc6278a-ovs-socket\") pod \"nmstate-handler-sl5hn\" (UID: \"6dc5759c-db8c-4025-bc16-a07e4dc6278a\") " pod="openshift-nmstate/nmstate-handler-sl5hn" Mar 09 13:35:00 crc kubenswrapper[4764]: I0309 13:35:00.317415 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/f339e495-f347-45b8-b9da-2cd832ac4300-tls-key-pair\") pod \"nmstate-webhook-786f45cff4-wv755\" (UID: \"f339e495-f347-45b8-b9da-2cd832ac4300\") " 
pod="openshift-nmstate/nmstate-webhook-786f45cff4-wv755" Mar 09 13:35:00 crc kubenswrapper[4764]: E0309 13:35:00.317590 4764 secret.go:188] Couldn't get secret openshift-nmstate/openshift-nmstate-webhook: secret "openshift-nmstate-webhook" not found Mar 09 13:35:00 crc kubenswrapper[4764]: E0309 13:35:00.317944 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f339e495-f347-45b8-b9da-2cd832ac4300-tls-key-pair podName:f339e495-f347-45b8-b9da-2cd832ac4300 nodeName:}" failed. No retries permitted until 2026-03-09 13:35:00.817635187 +0000 UTC m=+856.067807095 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "tls-key-pair" (UniqueName: "kubernetes.io/secret/f339e495-f347-45b8-b9da-2cd832ac4300-tls-key-pair") pod "nmstate-webhook-786f45cff4-wv755" (UID: "f339e495-f347-45b8-b9da-2cd832ac4300") : secret "openshift-nmstate-webhook" not found Mar 09 13:35:00 crc kubenswrapper[4764]: I0309 13:35:00.338310 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tf2p2\" (UniqueName: \"kubernetes.io/projected/6dc5759c-db8c-4025-bc16-a07e4dc6278a-kube-api-access-tf2p2\") pod \"nmstate-handler-sl5hn\" (UID: \"6dc5759c-db8c-4025-bc16-a07e4dc6278a\") " pod="openshift-nmstate/nmstate-handler-sl5hn" Mar 09 13:35:00 crc kubenswrapper[4764]: I0309 13:35:00.339032 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x6t2l\" (UniqueName: \"kubernetes.io/projected/f339e495-f347-45b8-b9da-2cd832ac4300-kube-api-access-x6t2l\") pod \"nmstate-webhook-786f45cff4-wv755\" (UID: \"f339e495-f347-45b8-b9da-2cd832ac4300\") " pod="openshift-nmstate/nmstate-webhook-786f45cff4-wv755" Mar 09 13:35:00 crc kubenswrapper[4764]: I0309 13:35:00.339716 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n8lfj\" (UniqueName: \"kubernetes.io/projected/abca721f-d47f-4e38-ab9e-0832de2c70e6-kube-api-access-n8lfj\") pod 
\"nmstate-metrics-69594cc75-jschs\" (UID: \"abca721f-d47f-4e38-ab9e-0832de2c70e6\") " pod="openshift-nmstate/nmstate-metrics-69594cc75-jschs" Mar 09 13:35:00 crc kubenswrapper[4764]: I0309 13:35:00.419865 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r6rmd\" (UniqueName: \"kubernetes.io/projected/fc521772-06d5-47ec-85d0-6162bb98af30-kube-api-access-r6rmd\") pod \"nmstate-console-plugin-5dcbbd79cf-qpjz4\" (UID: \"fc521772-06d5-47ec-85d0-6162bb98af30\") " pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-qpjz4" Mar 09 13:35:00 crc kubenswrapper[4764]: I0309 13:35:00.420303 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/fc521772-06d5-47ec-85d0-6162bb98af30-plugin-serving-cert\") pod \"nmstate-console-plugin-5dcbbd79cf-qpjz4\" (UID: \"fc521772-06d5-47ec-85d0-6162bb98af30\") " pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-qpjz4" Mar 09 13:35:00 crc kubenswrapper[4764]: I0309 13:35:00.420339 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/fc521772-06d5-47ec-85d0-6162bb98af30-nginx-conf\") pod \"nmstate-console-plugin-5dcbbd79cf-qpjz4\" (UID: \"fc521772-06d5-47ec-85d0-6162bb98af30\") " pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-qpjz4" Mar 09 13:35:00 crc kubenswrapper[4764]: E0309 13:35:00.420455 4764 secret.go:188] Couldn't get secret openshift-nmstate/plugin-serving-cert: secret "plugin-serving-cert" not found Mar 09 13:35:00 crc kubenswrapper[4764]: E0309 13:35:00.420522 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fc521772-06d5-47ec-85d0-6162bb98af30-plugin-serving-cert podName:fc521772-06d5-47ec-85d0-6162bb98af30 nodeName:}" failed. No retries permitted until 2026-03-09 13:35:00.920498141 +0000 UTC m=+856.170670049 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "plugin-serving-cert" (UniqueName: "kubernetes.io/secret/fc521772-06d5-47ec-85d0-6162bb98af30-plugin-serving-cert") pod "nmstate-console-plugin-5dcbbd79cf-qpjz4" (UID: "fc521772-06d5-47ec-85d0-6162bb98af30") : secret "plugin-serving-cert" not found Mar 09 13:35:00 crc kubenswrapper[4764]: I0309 13:35:00.421577 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/fc521772-06d5-47ec-85d0-6162bb98af30-nginx-conf\") pod \"nmstate-console-plugin-5dcbbd79cf-qpjz4\" (UID: \"fc521772-06d5-47ec-85d0-6162bb98af30\") " pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-qpjz4" Mar 09 13:35:00 crc kubenswrapper[4764]: I0309 13:35:00.441587 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r6rmd\" (UniqueName: \"kubernetes.io/projected/fc521772-06d5-47ec-85d0-6162bb98af30-kube-api-access-r6rmd\") pod \"nmstate-console-plugin-5dcbbd79cf-qpjz4\" (UID: \"fc521772-06d5-47ec-85d0-6162bb98af30\") " pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-qpjz4" Mar 09 13:35:00 crc kubenswrapper[4764]: I0309 13:35:00.479239 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-5d959df5d8-nhrp9"] Mar 09 13:35:00 crc kubenswrapper[4764]: I0309 13:35:00.480398 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5d959df5d8-nhrp9" Mar 09 13:35:00 crc kubenswrapper[4764]: I0309 13:35:00.504566 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5d959df5d8-nhrp9"] Mar 09 13:35:00 crc kubenswrapper[4764]: I0309 13:35:00.515212 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-metrics-69594cc75-jschs" Mar 09 13:35:00 crc kubenswrapper[4764]: I0309 13:35:00.522340 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/ff3abc33-b00a-400d-b1fb-c22dc5faf810-console-config\") pod \"console-5d959df5d8-nhrp9\" (UID: \"ff3abc33-b00a-400d-b1fb-c22dc5faf810\") " pod="openshift-console/console-5d959df5d8-nhrp9" Mar 09 13:35:00 crc kubenswrapper[4764]: I0309 13:35:00.522420 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w9t7k\" (UniqueName: \"kubernetes.io/projected/ff3abc33-b00a-400d-b1fb-c22dc5faf810-kube-api-access-w9t7k\") pod \"console-5d959df5d8-nhrp9\" (UID: \"ff3abc33-b00a-400d-b1fb-c22dc5faf810\") " pod="openshift-console/console-5d959df5d8-nhrp9" Mar 09 13:35:00 crc kubenswrapper[4764]: I0309 13:35:00.522509 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/ff3abc33-b00a-400d-b1fb-c22dc5faf810-console-serving-cert\") pod \"console-5d959df5d8-nhrp9\" (UID: \"ff3abc33-b00a-400d-b1fb-c22dc5faf810\") " pod="openshift-console/console-5d959df5d8-nhrp9" Mar 09 13:35:00 crc kubenswrapper[4764]: I0309 13:35:00.522581 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ff3abc33-b00a-400d-b1fb-c22dc5faf810-trusted-ca-bundle\") pod \"console-5d959df5d8-nhrp9\" (UID: \"ff3abc33-b00a-400d-b1fb-c22dc5faf810\") " pod="openshift-console/console-5d959df5d8-nhrp9" Mar 09 13:35:00 crc kubenswrapper[4764]: I0309 13:35:00.522688 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: 
\"kubernetes.io/configmap/ff3abc33-b00a-400d-b1fb-c22dc5faf810-oauth-serving-cert\") pod \"console-5d959df5d8-nhrp9\" (UID: \"ff3abc33-b00a-400d-b1fb-c22dc5faf810\") " pod="openshift-console/console-5d959df5d8-nhrp9" Mar 09 13:35:00 crc kubenswrapper[4764]: I0309 13:35:00.522737 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ff3abc33-b00a-400d-b1fb-c22dc5faf810-service-ca\") pod \"console-5d959df5d8-nhrp9\" (UID: \"ff3abc33-b00a-400d-b1fb-c22dc5faf810\") " pod="openshift-console/console-5d959df5d8-nhrp9" Mar 09 13:35:00 crc kubenswrapper[4764]: I0309 13:35:00.522790 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/ff3abc33-b00a-400d-b1fb-c22dc5faf810-console-oauth-config\") pod \"console-5d959df5d8-nhrp9\" (UID: \"ff3abc33-b00a-400d-b1fb-c22dc5faf810\") " pod="openshift-console/console-5d959df5d8-nhrp9" Mar 09 13:35:00 crc kubenswrapper[4764]: I0309 13:35:00.534410 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-handler-sl5hn" Mar 09 13:35:00 crc kubenswrapper[4764]: I0309 13:35:00.623715 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/ff3abc33-b00a-400d-b1fb-c22dc5faf810-console-config\") pod \"console-5d959df5d8-nhrp9\" (UID: \"ff3abc33-b00a-400d-b1fb-c22dc5faf810\") " pod="openshift-console/console-5d959df5d8-nhrp9" Mar 09 13:35:00 crc kubenswrapper[4764]: I0309 13:35:00.623763 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w9t7k\" (UniqueName: \"kubernetes.io/projected/ff3abc33-b00a-400d-b1fb-c22dc5faf810-kube-api-access-w9t7k\") pod \"console-5d959df5d8-nhrp9\" (UID: \"ff3abc33-b00a-400d-b1fb-c22dc5faf810\") " pod="openshift-console/console-5d959df5d8-nhrp9" Mar 09 13:35:00 crc kubenswrapper[4764]: I0309 13:35:00.623811 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/ff3abc33-b00a-400d-b1fb-c22dc5faf810-console-serving-cert\") pod \"console-5d959df5d8-nhrp9\" (UID: \"ff3abc33-b00a-400d-b1fb-c22dc5faf810\") " pod="openshift-console/console-5d959df5d8-nhrp9" Mar 09 13:35:00 crc kubenswrapper[4764]: I0309 13:35:00.623886 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ff3abc33-b00a-400d-b1fb-c22dc5faf810-trusted-ca-bundle\") pod \"console-5d959df5d8-nhrp9\" (UID: \"ff3abc33-b00a-400d-b1fb-c22dc5faf810\") " pod="openshift-console/console-5d959df5d8-nhrp9" Mar 09 13:35:00 crc kubenswrapper[4764]: I0309 13:35:00.623907 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/ff3abc33-b00a-400d-b1fb-c22dc5faf810-oauth-serving-cert\") pod \"console-5d959df5d8-nhrp9\" (UID: 
\"ff3abc33-b00a-400d-b1fb-c22dc5faf810\") " pod="openshift-console/console-5d959df5d8-nhrp9"
Mar 09 13:35:00 crc kubenswrapper[4764]: I0309 13:35:00.623935 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ff3abc33-b00a-400d-b1fb-c22dc5faf810-service-ca\") pod \"console-5d959df5d8-nhrp9\" (UID: \"ff3abc33-b00a-400d-b1fb-c22dc5faf810\") " pod="openshift-console/console-5d959df5d8-nhrp9"
Mar 09 13:35:00 crc kubenswrapper[4764]: I0309 13:35:00.623965 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/ff3abc33-b00a-400d-b1fb-c22dc5faf810-console-oauth-config\") pod \"console-5d959df5d8-nhrp9\" (UID: \"ff3abc33-b00a-400d-b1fb-c22dc5faf810\") " pod="openshift-console/console-5d959df5d8-nhrp9"
Mar 09 13:35:00 crc kubenswrapper[4764]: I0309 13:35:00.625350 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/ff3abc33-b00a-400d-b1fb-c22dc5faf810-console-config\") pod \"console-5d959df5d8-nhrp9\" (UID: \"ff3abc33-b00a-400d-b1fb-c22dc5faf810\") " pod="openshift-console/console-5d959df5d8-nhrp9"
Mar 09 13:35:00 crc kubenswrapper[4764]: I0309 13:35:00.630072 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/ff3abc33-b00a-400d-b1fb-c22dc5faf810-console-serving-cert\") pod \"console-5d959df5d8-nhrp9\" (UID: \"ff3abc33-b00a-400d-b1fb-c22dc5faf810\") " pod="openshift-console/console-5d959df5d8-nhrp9"
Mar 09 13:35:00 crc kubenswrapper[4764]: I0309 13:35:00.631465 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ff3abc33-b00a-400d-b1fb-c22dc5faf810-service-ca\") pod \"console-5d959df5d8-nhrp9\" (UID: \"ff3abc33-b00a-400d-b1fb-c22dc5faf810\") " pod="openshift-console/console-5d959df5d8-nhrp9"
Mar 09 13:35:00 crc kubenswrapper[4764]: I0309 13:35:00.631522 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/ff3abc33-b00a-400d-b1fb-c22dc5faf810-oauth-serving-cert\") pod \"console-5d959df5d8-nhrp9\" (UID: \"ff3abc33-b00a-400d-b1fb-c22dc5faf810\") " pod="openshift-console/console-5d959df5d8-nhrp9"
Mar 09 13:35:00 crc kubenswrapper[4764]: I0309 13:35:00.631475 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ff3abc33-b00a-400d-b1fb-c22dc5faf810-trusted-ca-bundle\") pod \"console-5d959df5d8-nhrp9\" (UID: \"ff3abc33-b00a-400d-b1fb-c22dc5faf810\") " pod="openshift-console/console-5d959df5d8-nhrp9"
Mar 09 13:35:00 crc kubenswrapper[4764]: I0309 13:35:00.631618 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/ff3abc33-b00a-400d-b1fb-c22dc5faf810-console-oauth-config\") pod \"console-5d959df5d8-nhrp9\" (UID: \"ff3abc33-b00a-400d-b1fb-c22dc5faf810\") " pod="openshift-console/console-5d959df5d8-nhrp9"
Mar 09 13:35:00 crc kubenswrapper[4764]: I0309 13:35:00.647981 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w9t7k\" (UniqueName: \"kubernetes.io/projected/ff3abc33-b00a-400d-b1fb-c22dc5faf810-kube-api-access-w9t7k\") pod \"console-5d959df5d8-nhrp9\" (UID: \"ff3abc33-b00a-400d-b1fb-c22dc5faf810\") " pod="openshift-console/console-5d959df5d8-nhrp9"
Mar 09 13:35:00 crc kubenswrapper[4764]: I0309 13:35:00.801210 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5d959df5d8-nhrp9"
Mar 09 13:35:00 crc kubenswrapper[4764]: I0309 13:35:00.828187 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/f339e495-f347-45b8-b9da-2cd832ac4300-tls-key-pair\") pod \"nmstate-webhook-786f45cff4-wv755\" (UID: \"f339e495-f347-45b8-b9da-2cd832ac4300\") " pod="openshift-nmstate/nmstate-webhook-786f45cff4-wv755"
Mar 09 13:35:00 crc kubenswrapper[4764]: I0309 13:35:00.840081 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-69594cc75-jschs"]
Mar 09 13:35:00 crc kubenswrapper[4764]: I0309 13:35:00.840957 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/f339e495-f347-45b8-b9da-2cd832ac4300-tls-key-pair\") pod \"nmstate-webhook-786f45cff4-wv755\" (UID: \"f339e495-f347-45b8-b9da-2cd832ac4300\") " pod="openshift-nmstate/nmstate-webhook-786f45cff4-wv755"
Mar 09 13:35:00 crc kubenswrapper[4764]: I0309 13:35:00.859463 4764 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Mar 09 13:35:00 crc kubenswrapper[4764]: I0309 13:35:00.929581 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/fc521772-06d5-47ec-85d0-6162bb98af30-plugin-serving-cert\") pod \"nmstate-console-plugin-5dcbbd79cf-qpjz4\" (UID: \"fc521772-06d5-47ec-85d0-6162bb98af30\") " pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-qpjz4"
Mar 09 13:35:00 crc kubenswrapper[4764]: I0309 13:35:00.933831 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/fc521772-06d5-47ec-85d0-6162bb98af30-plugin-serving-cert\") pod \"nmstate-console-plugin-5dcbbd79cf-qpjz4\" (UID: \"fc521772-06d5-47ec-85d0-6162bb98af30\") " pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-qpjz4"
Mar 09 13:35:01 crc kubenswrapper[4764]: I0309 13:35:01.005227 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5d959df5d8-nhrp9"]
Mar 09 13:35:01 crc kubenswrapper[4764]: W0309 13:35:01.013521 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podff3abc33_b00a_400d_b1fb_c22dc5faf810.slice/crio-3de88e0490d67bb2012521e7663e9aaa9bc48e47450b3d17f45e0af89f24ff97 WatchSource:0}: Error finding container 3de88e0490d67bb2012521e7663e9aaa9bc48e47450b3d17f45e0af89f24ff97: Status 404 returned error can't find the container with id 3de88e0490d67bb2012521e7663e9aaa9bc48e47450b3d17f45e0af89f24ff97
Mar 09 13:35:01 crc kubenswrapper[4764]: I0309 13:35:01.074364 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-sl5hn" event={"ID":"6dc5759c-db8c-4025-bc16-a07e4dc6278a","Type":"ContainerStarted","Data":"87c879ff16aebfb3cdb302cf6fd8844416a459a934a74b6cd69bccff7b2c00b2"}
Mar 09 13:35:01 crc kubenswrapper[4764]: I0309 13:35:01.075270 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-69594cc75-jschs" event={"ID":"abca721f-d47f-4e38-ab9e-0832de2c70e6","Type":"ContainerStarted","Data":"8d86efa7254330940104bfd148bae8dd455ff209b9ba97e02d72d3ff20ba27b3"}
Mar 09 13:35:01 crc kubenswrapper[4764]: I0309 13:35:01.076759 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5d959df5d8-nhrp9" event={"ID":"ff3abc33-b00a-400d-b1fb-c22dc5faf810","Type":"ContainerStarted","Data":"3de88e0490d67bb2012521e7663e9aaa9bc48e47450b3d17f45e0af89f24ff97"}
Mar 09 13:35:01 crc kubenswrapper[4764]: I0309 13:35:01.125305 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-786f45cff4-wv755"
Mar 09 13:35:01 crc kubenswrapper[4764]: I0309 13:35:01.189277 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-qpjz4"
Mar 09 13:35:01 crc kubenswrapper[4764]: I0309 13:35:01.340764 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-786f45cff4-wv755"]
Mar 09 13:35:01 crc kubenswrapper[4764]: W0309 13:35:01.354125 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf339e495_f347_45b8_b9da_2cd832ac4300.slice/crio-9b42941dc94448878e40daa831a6ee288803393ccf2709213365cb94122d1321 WatchSource:0}: Error finding container 9b42941dc94448878e40daa831a6ee288803393ccf2709213365cb94122d1321: Status 404 returned error can't find the container with id 9b42941dc94448878e40daa831a6ee288803393ccf2709213365cb94122d1321
Mar 09 13:35:01 crc kubenswrapper[4764]: I0309 13:35:01.460285 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-qpjz4"]
Mar 09 13:35:01 crc kubenswrapper[4764]: W0309 13:35:01.465331 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfc521772_06d5_47ec_85d0_6162bb98af30.slice/crio-410a51d1864d9bb6349cb99f0e102fe5fd1af4ad08cc90a661f9e6cbb1a1b5f7 WatchSource:0}: Error finding container 410a51d1864d9bb6349cb99f0e102fe5fd1af4ad08cc90a661f9e6cbb1a1b5f7: Status 404 returned error can't find the container with id 410a51d1864d9bb6349cb99f0e102fe5fd1af4ad08cc90a661f9e6cbb1a1b5f7
Mar 09 13:35:02 crc kubenswrapper[4764]: I0309 13:35:02.085812 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5d959df5d8-nhrp9" event={"ID":"ff3abc33-b00a-400d-b1fb-c22dc5faf810","Type":"ContainerStarted","Data":"a3c715fa70c37d03b3eb50dc6f1fd611ac22eb383a18eec7881d70c171d35559"}
Mar 09 13:35:02 crc kubenswrapper[4764]: I0309 13:35:02.087166 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-786f45cff4-wv755" event={"ID":"f339e495-f347-45b8-b9da-2cd832ac4300","Type":"ContainerStarted","Data":"9b42941dc94448878e40daa831a6ee288803393ccf2709213365cb94122d1321"}
Mar 09 13:35:02 crc kubenswrapper[4764]: I0309 13:35:02.088303 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-qpjz4" event={"ID":"fc521772-06d5-47ec-85d0-6162bb98af30","Type":"ContainerStarted","Data":"410a51d1864d9bb6349cb99f0e102fe5fd1af4ad08cc90a661f9e6cbb1a1b5f7"}
Mar 09 13:35:02 crc kubenswrapper[4764]: I0309 13:35:02.120341 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-5d959df5d8-nhrp9" podStartSLOduration=2.120316716 podStartE2EDuration="2.120316716s" podCreationTimestamp="2026-03-09 13:35:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:35:02.11039783 +0000 UTC m=+857.360569748" watchObservedRunningTime="2026-03-09 13:35:02.120316716 +0000 UTC m=+857.370488634"
Mar 09 13:35:05 crc kubenswrapper[4764]: I0309 13:35:05.122191 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-786f45cff4-wv755" event={"ID":"f339e495-f347-45b8-b9da-2cd832ac4300","Type":"ContainerStarted","Data":"08254217e02f70211175f7112a5ef7ce9b4439544efc3e7084840e3bd68ef37a"}
Mar 09 13:35:05 crc kubenswrapper[4764]: I0309 13:35:05.126325 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-sl5hn" event={"ID":"6dc5759c-db8c-4025-bc16-a07e4dc6278a","Type":"ContainerStarted","Data":"81432e70fc72636dc4294964eb5847dd4fbe6ec8ddf9dbc0d515b0c84d2c789f"}
Mar 09 13:35:05 crc kubenswrapper[4764]: I0309 13:35:05.126533 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-sl5hn"
Mar 09 13:35:05 crc kubenswrapper[4764]: I0309 13:35:05.129115 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-69594cc75-jschs" event={"ID":"abca721f-d47f-4e38-ab9e-0832de2c70e6","Type":"ContainerStarted","Data":"67bd949603aca51a6bcd4d16a552df568393a8b48f3ccb5ec5991adf6aaf86b8"}
Mar 09 13:35:05 crc kubenswrapper[4764]: I0309 13:35:05.131406 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-qpjz4" event={"ID":"fc521772-06d5-47ec-85d0-6162bb98af30","Type":"ContainerStarted","Data":"2cb071964e579b6015b8222b504fe82fd5087ddb44f756db99776aa34099944d"}
Mar 09 13:35:05 crc kubenswrapper[4764]: I0309 13:35:05.179344 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-webhook-786f45cff4-wv755" podStartSLOduration=2.007996908 podStartE2EDuration="5.179319824s" podCreationTimestamp="2026-03-09 13:35:00 +0000 UTC" firstStartedPulling="2026-03-09 13:35:01.360424028 +0000 UTC m=+856.610595936" lastFinishedPulling="2026-03-09 13:35:04.531746924 +0000 UTC m=+859.781918852" observedRunningTime="2026-03-09 13:35:05.157072036 +0000 UTC m=+860.407243964" watchObservedRunningTime="2026-03-09 13:35:05.179319824 +0000 UTC m=+860.429491732"
Mar 09 13:35:05 crc kubenswrapper[4764]: I0309 13:35:05.181772 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-handler-sl5hn" podStartSLOduration=1.253065872 podStartE2EDuration="5.181764869s" podCreationTimestamp="2026-03-09 13:35:00 +0000 UTC" firstStartedPulling="2026-03-09 13:35:00.631055619 +0000 UTC m=+855.881227527" lastFinishedPulling="2026-03-09 13:35:04.559754616 +0000 UTC m=+859.809926524" observedRunningTime="2026-03-09 13:35:05.174797712 +0000 UTC m=+860.424969620" watchObservedRunningTime="2026-03-09 13:35:05.181764869 +0000 UTC m=+860.431936777"
Mar 09 13:35:05 crc kubenswrapper[4764]: I0309 13:35:05.209966 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-qpjz4" podStartSLOduration=2.156406564 podStartE2EDuration="5.209938826s" podCreationTimestamp="2026-03-09 13:35:00 +0000 UTC" firstStartedPulling="2026-03-09 13:35:01.467675379 +0000 UTC m=+856.717847287" lastFinishedPulling="2026-03-09 13:35:04.521207621 +0000 UTC m=+859.771379549" observedRunningTime="2026-03-09 13:35:05.192985861 +0000 UTC m=+860.443157789" watchObservedRunningTime="2026-03-09 13:35:05.209938826 +0000 UTC m=+860.460110734"
Mar 09 13:35:06 crc kubenswrapper[4764]: I0309 13:35:06.137636 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-webhook-786f45cff4-wv755"
Mar 09 13:35:07 crc kubenswrapper[4764]: I0309 13:35:07.145991 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-69594cc75-jschs" event={"ID":"abca721f-d47f-4e38-ab9e-0832de2c70e6","Type":"ContainerStarted","Data":"89b16c77fb82d30b0368a6d05c390d89970c7f7ed24264d1bcd3aed56be68a9e"}
Mar 09 13:35:07 crc kubenswrapper[4764]: I0309 13:35:07.165340 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-metrics-69594cc75-jschs" podStartSLOduration=1.167404181 podStartE2EDuration="7.165312679s" podCreationTimestamp="2026-03-09 13:35:00 +0000 UTC" firstStartedPulling="2026-03-09 13:35:00.859194859 +0000 UTC m=+856.109366767" lastFinishedPulling="2026-03-09 13:35:06.857103357 +0000 UTC m=+862.107275265" observedRunningTime="2026-03-09 13:35:07.160690215 +0000 UTC m=+862.410862133" watchObservedRunningTime="2026-03-09 13:35:07.165312679 +0000 UTC m=+862.415484587"
Mar 09 13:35:10 crc kubenswrapper[4764]: I0309 13:35:10.556758 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-handler-sl5hn"
Mar 09 13:35:10 crc kubenswrapper[4764]: I0309 13:35:10.802961 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-5d959df5d8-nhrp9"
Mar 09 13:35:10 crc kubenswrapper[4764]: I0309 13:35:10.803029 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-5d959df5d8-nhrp9"
Mar 09 13:35:10 crc kubenswrapper[4764]: I0309 13:35:10.808111 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-5d959df5d8-nhrp9"
Mar 09 13:35:11 crc kubenswrapper[4764]: I0309 13:35:11.174862 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-5d959df5d8-nhrp9"
Mar 09 13:35:11 crc kubenswrapper[4764]: I0309 13:35:11.232492 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-8g9lj"]
Mar 09 13:35:21 crc kubenswrapper[4764]: I0309 13:35:21.132530 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-786f45cff4-wv755"
Mar 09 13:35:34 crc kubenswrapper[4764]: I0309 13:35:34.988552 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4gl4pg"]
Mar 09 13:35:34 crc kubenswrapper[4764]: I0309 13:35:34.991497 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4gl4pg"
Mar 09 13:35:34 crc kubenswrapper[4764]: I0309 13:35:34.993711 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc"
Mar 09 13:35:34 crc kubenswrapper[4764]: I0309 13:35:34.994867 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4gl4pg"]
Mar 09 13:35:35 crc kubenswrapper[4764]: I0309 13:35:35.019158 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/413f45cc-5916-45bb-a2a2-7b33029445af-bundle\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4gl4pg\" (UID: \"413f45cc-5916-45bb-a2a2-7b33029445af\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4gl4pg"
Mar 09 13:35:35 crc kubenswrapper[4764]: I0309 13:35:35.019253 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q9cl5\" (UniqueName: \"kubernetes.io/projected/413f45cc-5916-45bb-a2a2-7b33029445af-kube-api-access-q9cl5\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4gl4pg\" (UID: \"413f45cc-5916-45bb-a2a2-7b33029445af\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4gl4pg"
Mar 09 13:35:35 crc kubenswrapper[4764]: I0309 13:35:35.019380 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/413f45cc-5916-45bb-a2a2-7b33029445af-util\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4gl4pg\" (UID: \"413f45cc-5916-45bb-a2a2-7b33029445af\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4gl4pg"
Mar 09 13:35:35 crc kubenswrapper[4764]: I0309 13:35:35.121120 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/413f45cc-5916-45bb-a2a2-7b33029445af-bundle\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4gl4pg\" (UID: \"413f45cc-5916-45bb-a2a2-7b33029445af\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4gl4pg"
Mar 09 13:35:35 crc kubenswrapper[4764]: I0309 13:35:35.121209 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q9cl5\" (UniqueName: \"kubernetes.io/projected/413f45cc-5916-45bb-a2a2-7b33029445af-kube-api-access-q9cl5\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4gl4pg\" (UID: \"413f45cc-5916-45bb-a2a2-7b33029445af\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4gl4pg"
Mar 09 13:35:35 crc kubenswrapper[4764]: I0309 13:35:35.121294 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/413f45cc-5916-45bb-a2a2-7b33029445af-util\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4gl4pg\" (UID: \"413f45cc-5916-45bb-a2a2-7b33029445af\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4gl4pg"
Mar 09 13:35:35 crc kubenswrapper[4764]: I0309 13:35:35.122131 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/413f45cc-5916-45bb-a2a2-7b33029445af-bundle\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4gl4pg\" (UID: \"413f45cc-5916-45bb-a2a2-7b33029445af\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4gl4pg"
Mar 09 13:35:35 crc kubenswrapper[4764]: I0309 13:35:35.122263 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/413f45cc-5916-45bb-a2a2-7b33029445af-util\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4gl4pg\" (UID: \"413f45cc-5916-45bb-a2a2-7b33029445af\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4gl4pg"
Mar 09 13:35:35 crc kubenswrapper[4764]: I0309 13:35:35.157260 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q9cl5\" (UniqueName: \"kubernetes.io/projected/413f45cc-5916-45bb-a2a2-7b33029445af-kube-api-access-q9cl5\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4gl4pg\" (UID: \"413f45cc-5916-45bb-a2a2-7b33029445af\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4gl4pg"
Mar 09 13:35:35 crc kubenswrapper[4764]: I0309 13:35:35.316726 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4gl4pg"
Mar 09 13:35:35 crc kubenswrapper[4764]: I0309 13:35:35.757609 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4gl4pg"]
Mar 09 13:35:36 crc kubenswrapper[4764]: I0309 13:35:36.281100 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-f9d7485db-8g9lj" podUID="a1b4dc0b-edea-4c0d-8d61-3e3d3133605d" containerName="console" containerID="cri-o://472485e262204415197653c119165e15c1dde77e4c46e1bf7f2b023ba959af99" gracePeriod=15
Mar 09 13:35:36 crc kubenswrapper[4764]: I0309 13:35:36.330374 4764 generic.go:334] "Generic (PLEG): container finished" podID="413f45cc-5916-45bb-a2a2-7b33029445af" containerID="b1914f15bf2404060d84165094e65d28117ea53289297f26ad60197b1fda3e40" exitCode=0
Mar 09 13:35:36 crc kubenswrapper[4764]: I0309 13:35:36.330447 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4gl4pg" event={"ID":"413f45cc-5916-45bb-a2a2-7b33029445af","Type":"ContainerDied","Data":"b1914f15bf2404060d84165094e65d28117ea53289297f26ad60197b1fda3e40"}
Mar 09 13:35:36 crc kubenswrapper[4764]: I0309 13:35:36.330509 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4gl4pg" event={"ID":"413f45cc-5916-45bb-a2a2-7b33029445af","Type":"ContainerStarted","Data":"72224442d8862bb7b64a86203fad18ff6f25d0b361354db122490f417650fe17"}
Mar 09 13:35:36 crc kubenswrapper[4764]: I0309 13:35:36.631216 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-8g9lj_a1b4dc0b-edea-4c0d-8d61-3e3d3133605d/console/0.log"
Mar 09 13:35:36 crc kubenswrapper[4764]: I0309 13:35:36.631282 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-8g9lj"
Mar 09 13:35:36 crc kubenswrapper[4764]: I0309 13:35:36.741243 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/a1b4dc0b-edea-4c0d-8d61-3e3d3133605d-console-config\") pod \"a1b4dc0b-edea-4c0d-8d61-3e3d3133605d\" (UID: \"a1b4dc0b-edea-4c0d-8d61-3e3d3133605d\") "
Mar 09 13:35:36 crc kubenswrapper[4764]: I0309 13:35:36.741312 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9f9zl\" (UniqueName: \"kubernetes.io/projected/a1b4dc0b-edea-4c0d-8d61-3e3d3133605d-kube-api-access-9f9zl\") pod \"a1b4dc0b-edea-4c0d-8d61-3e3d3133605d\" (UID: \"a1b4dc0b-edea-4c0d-8d61-3e3d3133605d\") "
Mar 09 13:35:36 crc kubenswrapper[4764]: I0309 13:35:36.741349 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a1b4dc0b-edea-4c0d-8d61-3e3d3133605d-service-ca\") pod \"a1b4dc0b-edea-4c0d-8d61-3e3d3133605d\" (UID: \"a1b4dc0b-edea-4c0d-8d61-3e3d3133605d\") "
Mar 09 13:35:36 crc kubenswrapper[4764]: I0309 13:35:36.741382 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/a1b4dc0b-edea-4c0d-8d61-3e3d3133605d-console-serving-cert\") pod \"a1b4dc0b-edea-4c0d-8d61-3e3d3133605d\" (UID: \"a1b4dc0b-edea-4c0d-8d61-3e3d3133605d\") "
Mar 09 13:35:36 crc kubenswrapper[4764]: I0309 13:35:36.741413 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/a1b4dc0b-edea-4c0d-8d61-3e3d3133605d-console-oauth-config\") pod \"a1b4dc0b-edea-4c0d-8d61-3e3d3133605d\" (UID: \"a1b4dc0b-edea-4c0d-8d61-3e3d3133605d\") "
Mar 09 13:35:36 crc kubenswrapper[4764]: I0309 13:35:36.741429 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a1b4dc0b-edea-4c0d-8d61-3e3d3133605d-trusted-ca-bundle\") pod \"a1b4dc0b-edea-4c0d-8d61-3e3d3133605d\" (UID: \"a1b4dc0b-edea-4c0d-8d61-3e3d3133605d\") "
Mar 09 13:35:36 crc kubenswrapper[4764]: I0309 13:35:36.741467 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/a1b4dc0b-edea-4c0d-8d61-3e3d3133605d-oauth-serving-cert\") pod \"a1b4dc0b-edea-4c0d-8d61-3e3d3133605d\" (UID: \"a1b4dc0b-edea-4c0d-8d61-3e3d3133605d\") "
Mar 09 13:35:36 crc kubenswrapper[4764]: I0309 13:35:36.742048 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a1b4dc0b-edea-4c0d-8d61-3e3d3133605d-console-config" (OuterVolumeSpecName: "console-config") pod "a1b4dc0b-edea-4c0d-8d61-3e3d3133605d" (UID: "a1b4dc0b-edea-4c0d-8d61-3e3d3133605d"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 13:35:36 crc kubenswrapper[4764]: I0309 13:35:36.742061 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a1b4dc0b-edea-4c0d-8d61-3e3d3133605d-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "a1b4dc0b-edea-4c0d-8d61-3e3d3133605d" (UID: "a1b4dc0b-edea-4c0d-8d61-3e3d3133605d"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 13:35:36 crc kubenswrapper[4764]: I0309 13:35:36.742374 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a1b4dc0b-edea-4c0d-8d61-3e3d3133605d-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "a1b4dc0b-edea-4c0d-8d61-3e3d3133605d" (UID: "a1b4dc0b-edea-4c0d-8d61-3e3d3133605d"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 13:35:36 crc kubenswrapper[4764]: I0309 13:35:36.742369 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a1b4dc0b-edea-4c0d-8d61-3e3d3133605d-service-ca" (OuterVolumeSpecName: "service-ca") pod "a1b4dc0b-edea-4c0d-8d61-3e3d3133605d" (UID: "a1b4dc0b-edea-4c0d-8d61-3e3d3133605d"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 13:35:36 crc kubenswrapper[4764]: I0309 13:35:36.746792 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a1b4dc0b-edea-4c0d-8d61-3e3d3133605d-kube-api-access-9f9zl" (OuterVolumeSpecName: "kube-api-access-9f9zl") pod "a1b4dc0b-edea-4c0d-8d61-3e3d3133605d" (UID: "a1b4dc0b-edea-4c0d-8d61-3e3d3133605d"). InnerVolumeSpecName "kube-api-access-9f9zl". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 13:35:36 crc kubenswrapper[4764]: I0309 13:35:36.747029 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a1b4dc0b-edea-4c0d-8d61-3e3d3133605d-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "a1b4dc0b-edea-4c0d-8d61-3e3d3133605d" (UID: "a1b4dc0b-edea-4c0d-8d61-3e3d3133605d"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 13:35:36 crc kubenswrapper[4764]: I0309 13:35:36.747199 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a1b4dc0b-edea-4c0d-8d61-3e3d3133605d-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "a1b4dc0b-edea-4c0d-8d61-3e3d3133605d" (UID: "a1b4dc0b-edea-4c0d-8d61-3e3d3133605d"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 13:35:36 crc kubenswrapper[4764]: I0309 13:35:36.842688 4764 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/a1b4dc0b-edea-4c0d-8d61-3e3d3133605d-console-config\") on node \"crc\" DevicePath \"\""
Mar 09 13:35:36 crc kubenswrapper[4764]: I0309 13:35:36.842731 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9f9zl\" (UniqueName: \"kubernetes.io/projected/a1b4dc0b-edea-4c0d-8d61-3e3d3133605d-kube-api-access-9f9zl\") on node \"crc\" DevicePath \"\""
Mar 09 13:35:36 crc kubenswrapper[4764]: I0309 13:35:36.842742 4764 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a1b4dc0b-edea-4c0d-8d61-3e3d3133605d-service-ca\") on node \"crc\" DevicePath \"\""
Mar 09 13:35:36 crc kubenswrapper[4764]: I0309 13:35:36.842751 4764 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/a1b4dc0b-edea-4c0d-8d61-3e3d3133605d-console-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 09 13:35:36 crc kubenswrapper[4764]: I0309 13:35:36.842762 4764 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/a1b4dc0b-edea-4c0d-8d61-3e3d3133605d-console-oauth-config\") on node \"crc\" DevicePath \"\""
Mar 09 13:35:36 crc kubenswrapper[4764]: I0309 13:35:36.842770 4764 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a1b4dc0b-edea-4c0d-8d61-3e3d3133605d-trusted-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 09 13:35:36 crc kubenswrapper[4764]: I0309 13:35:36.842779 4764 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/a1b4dc0b-edea-4c0d-8d61-3e3d3133605d-oauth-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 09 13:35:37 crc kubenswrapper[4764]: I0309 13:35:37.338411 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-8g9lj_a1b4dc0b-edea-4c0d-8d61-3e3d3133605d/console/0.log"
Mar 09 13:35:37 crc kubenswrapper[4764]: I0309 13:35:37.338478 4764 generic.go:334] "Generic (PLEG): container finished" podID="a1b4dc0b-edea-4c0d-8d61-3e3d3133605d" containerID="472485e262204415197653c119165e15c1dde77e4c46e1bf7f2b023ba959af99" exitCode=2
Mar 09 13:35:37 crc kubenswrapper[4764]: I0309 13:35:37.338512 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-8g9lj" event={"ID":"a1b4dc0b-edea-4c0d-8d61-3e3d3133605d","Type":"ContainerDied","Data":"472485e262204415197653c119165e15c1dde77e4c46e1bf7f2b023ba959af99"}
Mar 09 13:35:37 crc kubenswrapper[4764]: I0309 13:35:37.338544 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-8g9lj" event={"ID":"a1b4dc0b-edea-4c0d-8d61-3e3d3133605d","Type":"ContainerDied","Data":"74b142593d2c61fe165941847cdc507401c02d77a28ca82096827ce65a85b3e5"}
Mar 09 13:35:37 crc kubenswrapper[4764]: I0309 13:35:37.338563 4764 scope.go:117] "RemoveContainer" containerID="472485e262204415197653c119165e15c1dde77e4c46e1bf7f2b023ba959af99"
Mar 09 13:35:37 crc kubenswrapper[4764]: I0309 13:35:37.338722 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-8g9lj"
Mar 09 13:35:37 crc kubenswrapper[4764]: I0309 13:35:37.385739 4764 scope.go:117] "RemoveContainer" containerID="472485e262204415197653c119165e15c1dde77e4c46e1bf7f2b023ba959af99"
Mar 09 13:35:37 crc kubenswrapper[4764]: E0309 13:35:37.386149 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"472485e262204415197653c119165e15c1dde77e4c46e1bf7f2b023ba959af99\": container with ID starting with 472485e262204415197653c119165e15c1dde77e4c46e1bf7f2b023ba959af99 not found: ID does not exist" containerID="472485e262204415197653c119165e15c1dde77e4c46e1bf7f2b023ba959af99"
Mar 09 13:35:37 crc kubenswrapper[4764]: I0309 13:35:37.386193 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"472485e262204415197653c119165e15c1dde77e4c46e1bf7f2b023ba959af99"} err="failed to get container status \"472485e262204415197653c119165e15c1dde77e4c46e1bf7f2b023ba959af99\": rpc error: code = NotFound desc = could not find container \"472485e262204415197653c119165e15c1dde77e4c46e1bf7f2b023ba959af99\": container with ID starting with 472485e262204415197653c119165e15c1dde77e4c46e1bf7f2b023ba959af99 not found: ID does not exist"
Mar 09 13:35:37 crc kubenswrapper[4764]: I0309 13:35:37.387446 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-8g9lj"]
Mar 09 13:35:37 crc kubenswrapper[4764]: I0309 13:35:37.394605 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-f9d7485db-8g9lj"]
Mar 09 13:35:37 crc kubenswrapper[4764]: I0309 13:35:37.568070 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a1b4dc0b-edea-4c0d-8d61-3e3d3133605d" path="/var/lib/kubelet/pods/a1b4dc0b-edea-4c0d-8d61-3e3d3133605d/volumes"
Mar 09 13:35:38 crc kubenswrapper[4764]: I0309 13:35:38.330304 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-vtshp"]
Mar 09 13:35:38 crc kubenswrapper[4764]: E0309 13:35:38.330608 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a1b4dc0b-edea-4c0d-8d61-3e3d3133605d" containerName="console"
Mar 09 13:35:38 crc kubenswrapper[4764]: I0309 13:35:38.330630 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="a1b4dc0b-edea-4c0d-8d61-3e3d3133605d" containerName="console"
Mar 09 13:35:38 crc kubenswrapper[4764]: I0309 13:35:38.330770 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="a1b4dc0b-edea-4c0d-8d61-3e3d3133605d" containerName="console"
Mar 09 13:35:38 crc kubenswrapper[4764]: I0309 13:35:38.332130 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-vtshp"
Mar 09 13:35:38 crc kubenswrapper[4764]: I0309 13:35:38.343616 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-vtshp"]
Mar 09 13:35:38 crc kubenswrapper[4764]: I0309 13:35:38.368722 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0ebb4d3c-cf42-4cc4-9856-8bdbfffbd53a-utilities\") pod \"redhat-operators-vtshp\" (UID: \"0ebb4d3c-cf42-4cc4-9856-8bdbfffbd53a\") " pod="openshift-marketplace/redhat-operators-vtshp"
Mar 09 13:35:38 crc kubenswrapper[4764]: I0309 13:35:38.368802 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0ebb4d3c-cf42-4cc4-9856-8bdbfffbd53a-catalog-content\") pod \"redhat-operators-vtshp\" (UID: \"0ebb4d3c-cf42-4cc4-9856-8bdbfffbd53a\") " pod="openshift-marketplace/redhat-operators-vtshp"
Mar 09 13:35:38 crc kubenswrapper[4764]: I0309 13:35:38.368955 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-55cnb\" (UniqueName: \"kubernetes.io/projected/0ebb4d3c-cf42-4cc4-9856-8bdbfffbd53a-kube-api-access-55cnb\") pod \"redhat-operators-vtshp\" (UID: \"0ebb4d3c-cf42-4cc4-9856-8bdbfffbd53a\") " pod="openshift-marketplace/redhat-operators-vtshp"
Mar 09 13:35:38 crc kubenswrapper[4764]: I0309 13:35:38.471214 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-55cnb\" (UniqueName: \"kubernetes.io/projected/0ebb4d3c-cf42-4cc4-9856-8bdbfffbd53a-kube-api-access-55cnb\") pod \"redhat-operators-vtshp\" (UID: \"0ebb4d3c-cf42-4cc4-9856-8bdbfffbd53a\") " pod="openshift-marketplace/redhat-operators-vtshp"
Mar 09 13:35:38 crc kubenswrapper[4764]: I0309 13:35:38.471310 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0ebb4d3c-cf42-4cc4-9856-8bdbfffbd53a-utilities\") pod \"redhat-operators-vtshp\" (UID: \"0ebb4d3c-cf42-4cc4-9856-8bdbfffbd53a\") " pod="openshift-marketplace/redhat-operators-vtshp"
Mar 09 13:35:38 crc kubenswrapper[4764]: I0309 13:35:38.471333 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0ebb4d3c-cf42-4cc4-9856-8bdbfffbd53a-catalog-content\") pod \"redhat-operators-vtshp\" (UID: \"0ebb4d3c-cf42-4cc4-9856-8bdbfffbd53a\") " pod="openshift-marketplace/redhat-operators-vtshp"
Mar 09 13:35:38 crc kubenswrapper[4764]: I0309 13:35:38.472112 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0ebb4d3c-cf42-4cc4-9856-8bdbfffbd53a-catalog-content\") pod \"redhat-operators-vtshp\" (UID: \"0ebb4d3c-cf42-4cc4-9856-8bdbfffbd53a\") " pod="openshift-marketplace/redhat-operators-vtshp"
Mar 09 13:35:38 crc kubenswrapper[4764]: I0309 13:35:38.472792 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0ebb4d3c-cf42-4cc4-9856-8bdbfffbd53a-utilities\") pod \"redhat-operators-vtshp\" (UID: \"0ebb4d3c-cf42-4cc4-9856-8bdbfffbd53a\") " pod="openshift-marketplace/redhat-operators-vtshp"
Mar 09 13:35:38 crc kubenswrapper[4764]: I0309 13:35:38.496898 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-55cnb\" (UniqueName: \"kubernetes.io/projected/0ebb4d3c-cf42-4cc4-9856-8bdbfffbd53a-kube-api-access-55cnb\") pod \"redhat-operators-vtshp\" (UID: \"0ebb4d3c-cf42-4cc4-9856-8bdbfffbd53a\") " pod="openshift-marketplace/redhat-operators-vtshp"
Mar 09 13:35:38 crc kubenswrapper[4764]: I0309 13:35:38.662431 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-vtshp"
Mar 09 13:35:38 crc kubenswrapper[4764]: I0309 13:35:38.932373 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-vtshp"]
Mar 09 13:35:38 crc kubenswrapper[4764]: W0309 13:35:38.977585 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0ebb4d3c_cf42_4cc4_9856_8bdbfffbd53a.slice/crio-b1aabe863f1f8bacecabae737d4453f4b2305ab4eb3778cf0d6e8b21552025a9 WatchSource:0}: Error finding container b1aabe863f1f8bacecabae737d4453f4b2305ab4eb3778cf0d6e8b21552025a9: Status 404 returned error can't find the container with id b1aabe863f1f8bacecabae737d4453f4b2305ab4eb3778cf0d6e8b21552025a9
Mar 09 13:35:39 crc kubenswrapper[4764]: I0309 13:35:39.365187 4764 generic.go:334] "Generic (PLEG): container finished" podID="413f45cc-5916-45bb-a2a2-7b33029445af" containerID="12e5be17dde6662ac78b5ccfb4597655e2de81cd358d4b05d3b1385bc9e64d2a" exitCode=0
Mar 09 13:35:39 crc kubenswrapper[4764]: I0309 13:35:39.365257 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4gl4pg" event={"ID":"413f45cc-5916-45bb-a2a2-7b33029445af","Type":"ContainerDied","Data":"12e5be17dde6662ac78b5ccfb4597655e2de81cd358d4b05d3b1385bc9e64d2a"}
Mar 09 13:35:39 crc kubenswrapper[4764]: I0309 13:35:39.366915 4764 generic.go:334] "Generic (PLEG): container finished" podID="0ebb4d3c-cf42-4cc4-9856-8bdbfffbd53a" containerID="cd30553e3203601e67a4a200045a0ff445fed5ef9355544a81ecb96bd9dfbcc0" exitCode=0
Mar 09 13:35:39 crc kubenswrapper[4764]: I0309 13:35:39.366979 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vtshp" event={"ID":"0ebb4d3c-cf42-4cc4-9856-8bdbfffbd53a","Type":"ContainerDied","Data":"cd30553e3203601e67a4a200045a0ff445fed5ef9355544a81ecb96bd9dfbcc0"}
Mar 09 13:35:39 crc
kubenswrapper[4764]: I0309 13:35:39.367016 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vtshp" event={"ID":"0ebb4d3c-cf42-4cc4-9856-8bdbfffbd53a","Type":"ContainerStarted","Data":"b1aabe863f1f8bacecabae737d4453f4b2305ab4eb3778cf0d6e8b21552025a9"} Mar 09 13:35:40 crc kubenswrapper[4764]: I0309 13:35:40.376159 4764 generic.go:334] "Generic (PLEG): container finished" podID="413f45cc-5916-45bb-a2a2-7b33029445af" containerID="7be3f3fd424c604b44633af0868d3f5a689458b5777c17c82d3868752bae1dd5" exitCode=0 Mar 09 13:35:40 crc kubenswrapper[4764]: I0309 13:35:40.376263 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4gl4pg" event={"ID":"413f45cc-5916-45bb-a2a2-7b33029445af","Type":"ContainerDied","Data":"7be3f3fd424c604b44633af0868d3f5a689458b5777c17c82d3868752bae1dd5"} Mar 09 13:35:41 crc kubenswrapper[4764]: I0309 13:35:41.643448 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4gl4pg" Mar 09 13:35:41 crc kubenswrapper[4764]: I0309 13:35:41.717230 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q9cl5\" (UniqueName: \"kubernetes.io/projected/413f45cc-5916-45bb-a2a2-7b33029445af-kube-api-access-q9cl5\") pod \"413f45cc-5916-45bb-a2a2-7b33029445af\" (UID: \"413f45cc-5916-45bb-a2a2-7b33029445af\") " Mar 09 13:35:41 crc kubenswrapper[4764]: I0309 13:35:41.717471 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/413f45cc-5916-45bb-a2a2-7b33029445af-bundle\") pod \"413f45cc-5916-45bb-a2a2-7b33029445af\" (UID: \"413f45cc-5916-45bb-a2a2-7b33029445af\") " Mar 09 13:35:41 crc kubenswrapper[4764]: I0309 13:35:41.717503 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/413f45cc-5916-45bb-a2a2-7b33029445af-util\") pod \"413f45cc-5916-45bb-a2a2-7b33029445af\" (UID: \"413f45cc-5916-45bb-a2a2-7b33029445af\") " Mar 09 13:35:41 crc kubenswrapper[4764]: I0309 13:35:41.718728 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/413f45cc-5916-45bb-a2a2-7b33029445af-bundle" (OuterVolumeSpecName: "bundle") pod "413f45cc-5916-45bb-a2a2-7b33029445af" (UID: "413f45cc-5916-45bb-a2a2-7b33029445af"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 13:35:41 crc kubenswrapper[4764]: I0309 13:35:41.725385 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/413f45cc-5916-45bb-a2a2-7b33029445af-kube-api-access-q9cl5" (OuterVolumeSpecName: "kube-api-access-q9cl5") pod "413f45cc-5916-45bb-a2a2-7b33029445af" (UID: "413f45cc-5916-45bb-a2a2-7b33029445af"). InnerVolumeSpecName "kube-api-access-q9cl5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:35:41 crc kubenswrapper[4764]: I0309 13:35:41.727717 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/413f45cc-5916-45bb-a2a2-7b33029445af-util" (OuterVolumeSpecName: "util") pod "413f45cc-5916-45bb-a2a2-7b33029445af" (UID: "413f45cc-5916-45bb-a2a2-7b33029445af"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 13:35:41 crc kubenswrapper[4764]: I0309 13:35:41.819603 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q9cl5\" (UniqueName: \"kubernetes.io/projected/413f45cc-5916-45bb-a2a2-7b33029445af-kube-api-access-q9cl5\") on node \"crc\" DevicePath \"\"" Mar 09 13:35:41 crc kubenswrapper[4764]: I0309 13:35:41.819659 4764 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/413f45cc-5916-45bb-a2a2-7b33029445af-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 13:35:41 crc kubenswrapper[4764]: I0309 13:35:41.819669 4764 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/413f45cc-5916-45bb-a2a2-7b33029445af-util\") on node \"crc\" DevicePath \"\"" Mar 09 13:35:42 crc kubenswrapper[4764]: I0309 13:35:42.394224 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4gl4pg" event={"ID":"413f45cc-5916-45bb-a2a2-7b33029445af","Type":"ContainerDied","Data":"72224442d8862bb7b64a86203fad18ff6f25d0b361354db122490f417650fe17"} Mar 09 13:35:42 crc kubenswrapper[4764]: I0309 13:35:42.394630 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="72224442d8862bb7b64a86203fad18ff6f25d0b361354db122490f417650fe17" Mar 09 13:35:42 crc kubenswrapper[4764]: I0309 13:35:42.394301 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4gl4pg" Mar 09 13:35:45 crc kubenswrapper[4764]: I0309 13:35:45.412231 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vtshp" event={"ID":"0ebb4d3c-cf42-4cc4-9856-8bdbfffbd53a","Type":"ContainerStarted","Data":"865c7b22e00481478eb7d002c1f40157c39f86e5b45add0664ca71524b2698e9"} Mar 09 13:35:45 crc kubenswrapper[4764]: E0309 13:35:45.769747 4764 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0ebb4d3c_cf42_4cc4_9856_8bdbfffbd53a.slice/crio-conmon-865c7b22e00481478eb7d002c1f40157c39f86e5b45add0664ca71524b2698e9.scope\": RecentStats: unable to find data in memory cache]" Mar 09 13:35:46 crc kubenswrapper[4764]: I0309 13:35:46.418919 4764 generic.go:334] "Generic (PLEG): container finished" podID="0ebb4d3c-cf42-4cc4-9856-8bdbfffbd53a" containerID="865c7b22e00481478eb7d002c1f40157c39f86e5b45add0664ca71524b2698e9" exitCode=0 Mar 09 13:35:46 crc kubenswrapper[4764]: I0309 13:35:46.418982 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vtshp" event={"ID":"0ebb4d3c-cf42-4cc4-9856-8bdbfffbd53a","Type":"ContainerDied","Data":"865c7b22e00481478eb7d002c1f40157c39f86e5b45add0664ca71524b2698e9"} Mar 09 13:35:47 crc kubenswrapper[4764]: I0309 13:35:47.430723 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vtshp" event={"ID":"0ebb4d3c-cf42-4cc4-9856-8bdbfffbd53a","Type":"ContainerStarted","Data":"3a09b89cfa6f58872adaec4a35a3d3cd1dd0dc256900a0efe55e07d8f9199cdc"} Mar 09 13:35:47 crc kubenswrapper[4764]: I0309 13:35:47.464127 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-vtshp" podStartSLOduration=1.89836003 
podStartE2EDuration="9.464102366s" podCreationTimestamp="2026-03-09 13:35:38 +0000 UTC" firstStartedPulling="2026-03-09 13:35:39.368468971 +0000 UTC m=+894.618640889" lastFinishedPulling="2026-03-09 13:35:46.934211317 +0000 UTC m=+902.184383225" observedRunningTime="2026-03-09 13:35:47.458610379 +0000 UTC m=+902.708782297" watchObservedRunningTime="2026-03-09 13:35:47.464102366 +0000 UTC m=+902.714274274" Mar 09 13:35:48 crc kubenswrapper[4764]: I0309 13:35:48.662847 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-vtshp" Mar 09 13:35:48 crc kubenswrapper[4764]: I0309 13:35:48.662991 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-vtshp" Mar 09 13:35:49 crc kubenswrapper[4764]: I0309 13:35:49.702898 4764 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-vtshp" podUID="0ebb4d3c-cf42-4cc4-9856-8bdbfffbd53a" containerName="registry-server" probeResult="failure" output=< Mar 09 13:35:49 crc kubenswrapper[4764]: timeout: failed to connect service ":50051" within 1s Mar 09 13:35:49 crc kubenswrapper[4764]: > Mar 09 13:35:50 crc kubenswrapper[4764]: I0309 13:35:50.931368 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-659b995f59-8s255"] Mar 09 13:35:50 crc kubenswrapper[4764]: E0309 13:35:50.931788 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="413f45cc-5916-45bb-a2a2-7b33029445af" containerName="util" Mar 09 13:35:50 crc kubenswrapper[4764]: I0309 13:35:50.931807 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="413f45cc-5916-45bb-a2a2-7b33029445af" containerName="util" Mar 09 13:35:50 crc kubenswrapper[4764]: E0309 13:35:50.931824 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="413f45cc-5916-45bb-a2a2-7b33029445af" containerName="extract" Mar 09 13:35:50 crc 
kubenswrapper[4764]: I0309 13:35:50.931832 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="413f45cc-5916-45bb-a2a2-7b33029445af" containerName="extract" Mar 09 13:35:50 crc kubenswrapper[4764]: E0309 13:35:50.931861 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="413f45cc-5916-45bb-a2a2-7b33029445af" containerName="pull" Mar 09 13:35:50 crc kubenswrapper[4764]: I0309 13:35:50.931869 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="413f45cc-5916-45bb-a2a2-7b33029445af" containerName="pull" Mar 09 13:35:50 crc kubenswrapper[4764]: I0309 13:35:50.932021 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="413f45cc-5916-45bb-a2a2-7b33029445af" containerName="extract" Mar 09 13:35:50 crc kubenswrapper[4764]: I0309 13:35:50.932630 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-659b995f59-8s255" Mar 09 13:35:50 crc kubenswrapper[4764]: I0309 13:35:50.937468 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt" Mar 09 13:35:50 crc kubenswrapper[4764]: I0309 13:35:50.937780 4764 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert" Mar 09 13:35:50 crc kubenswrapper[4764]: I0309 13:35:50.938616 4764 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-ldn24" Mar 09 13:35:50 crc kubenswrapper[4764]: I0309 13:35:50.938695 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt" Mar 09 13:35:50 crc kubenswrapper[4764]: I0309 13:35:50.939876 4764 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert" Mar 09 13:35:50 crc kubenswrapper[4764]: I0309 13:35:50.946992 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["metallb-system/metallb-operator-controller-manager-659b995f59-8s255"] Mar 09 13:35:51 crc kubenswrapper[4764]: I0309 13:35:51.075854 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/b89770ec-e502-4b3a-8233-8c9aa76d55de-apiservice-cert\") pod \"metallb-operator-controller-manager-659b995f59-8s255\" (UID: \"b89770ec-e502-4b3a-8233-8c9aa76d55de\") " pod="metallb-system/metallb-operator-controller-manager-659b995f59-8s255" Mar 09 13:35:51 crc kubenswrapper[4764]: I0309 13:35:51.075919 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/b89770ec-e502-4b3a-8233-8c9aa76d55de-webhook-cert\") pod \"metallb-operator-controller-manager-659b995f59-8s255\" (UID: \"b89770ec-e502-4b3a-8233-8c9aa76d55de\") " pod="metallb-system/metallb-operator-controller-manager-659b995f59-8s255" Mar 09 13:35:51 crc kubenswrapper[4764]: I0309 13:35:51.075975 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jvmg2\" (UniqueName: \"kubernetes.io/projected/b89770ec-e502-4b3a-8233-8c9aa76d55de-kube-api-access-jvmg2\") pod \"metallb-operator-controller-manager-659b995f59-8s255\" (UID: \"b89770ec-e502-4b3a-8233-8c9aa76d55de\") " pod="metallb-system/metallb-operator-controller-manager-659b995f59-8s255" Mar 09 13:35:51 crc kubenswrapper[4764]: I0309 13:35:51.169277 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-7bccb4c96-wqdrv"] Mar 09 13:35:51 crc kubenswrapper[4764]: I0309 13:35:51.170039 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-7bccb4c96-wqdrv" Mar 09 13:35:51 crc kubenswrapper[4764]: I0309 13:35:51.172218 4764 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-jtx95" Mar 09 13:35:51 crc kubenswrapper[4764]: I0309 13:35:51.172272 4764 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert" Mar 09 13:35:51 crc kubenswrapper[4764]: I0309 13:35:51.174694 4764 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Mar 09 13:35:51 crc kubenswrapper[4764]: I0309 13:35:51.177531 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/b89770ec-e502-4b3a-8233-8c9aa76d55de-apiservice-cert\") pod \"metallb-operator-controller-manager-659b995f59-8s255\" (UID: \"b89770ec-e502-4b3a-8233-8c9aa76d55de\") " pod="metallb-system/metallb-operator-controller-manager-659b995f59-8s255" Mar 09 13:35:51 crc kubenswrapper[4764]: I0309 13:35:51.177585 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/b89770ec-e502-4b3a-8233-8c9aa76d55de-webhook-cert\") pod \"metallb-operator-controller-manager-659b995f59-8s255\" (UID: \"b89770ec-e502-4b3a-8233-8c9aa76d55de\") " pod="metallb-system/metallb-operator-controller-manager-659b995f59-8s255" Mar 09 13:35:51 crc kubenswrapper[4764]: I0309 13:35:51.177635 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jvmg2\" (UniqueName: \"kubernetes.io/projected/b89770ec-e502-4b3a-8233-8c9aa76d55de-kube-api-access-jvmg2\") pod \"metallb-operator-controller-manager-659b995f59-8s255\" (UID: \"b89770ec-e502-4b3a-8233-8c9aa76d55de\") " pod="metallb-system/metallb-operator-controller-manager-659b995f59-8s255" Mar 09 13:35:51 crc 
kubenswrapper[4764]: I0309 13:35:51.185338 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/b89770ec-e502-4b3a-8233-8c9aa76d55de-webhook-cert\") pod \"metallb-operator-controller-manager-659b995f59-8s255\" (UID: \"b89770ec-e502-4b3a-8233-8c9aa76d55de\") " pod="metallb-system/metallb-operator-controller-manager-659b995f59-8s255" Mar 09 13:35:51 crc kubenswrapper[4764]: I0309 13:35:51.186990 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/b89770ec-e502-4b3a-8233-8c9aa76d55de-apiservice-cert\") pod \"metallb-operator-controller-manager-659b995f59-8s255\" (UID: \"b89770ec-e502-4b3a-8233-8c9aa76d55de\") " pod="metallb-system/metallb-operator-controller-manager-659b995f59-8s255" Mar 09 13:35:51 crc kubenswrapper[4764]: I0309 13:35:51.189620 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-7bccb4c96-wqdrv"] Mar 09 13:35:51 crc kubenswrapper[4764]: I0309 13:35:51.209420 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jvmg2\" (UniqueName: \"kubernetes.io/projected/b89770ec-e502-4b3a-8233-8c9aa76d55de-kube-api-access-jvmg2\") pod \"metallb-operator-controller-manager-659b995f59-8s255\" (UID: \"b89770ec-e502-4b3a-8233-8c9aa76d55de\") " pod="metallb-system/metallb-operator-controller-manager-659b995f59-8s255" Mar 09 13:35:51 crc kubenswrapper[4764]: I0309 13:35:51.254578 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-659b995f59-8s255" Mar 09 13:35:51 crc kubenswrapper[4764]: I0309 13:35:51.278591 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cn56m\" (UniqueName: \"kubernetes.io/projected/ed37a5d1-5d4b-41fb-8476-189def32c909-kube-api-access-cn56m\") pod \"metallb-operator-webhook-server-7bccb4c96-wqdrv\" (UID: \"ed37a5d1-5d4b-41fb-8476-189def32c909\") " pod="metallb-system/metallb-operator-webhook-server-7bccb4c96-wqdrv" Mar 09 13:35:51 crc kubenswrapper[4764]: I0309 13:35:51.278681 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ed37a5d1-5d4b-41fb-8476-189def32c909-webhook-cert\") pod \"metallb-operator-webhook-server-7bccb4c96-wqdrv\" (UID: \"ed37a5d1-5d4b-41fb-8476-189def32c909\") " pod="metallb-system/metallb-operator-webhook-server-7bccb4c96-wqdrv" Mar 09 13:35:51 crc kubenswrapper[4764]: I0309 13:35:51.278720 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/ed37a5d1-5d4b-41fb-8476-189def32c909-apiservice-cert\") pod \"metallb-operator-webhook-server-7bccb4c96-wqdrv\" (UID: \"ed37a5d1-5d4b-41fb-8476-189def32c909\") " pod="metallb-system/metallb-operator-webhook-server-7bccb4c96-wqdrv" Mar 09 13:35:51 crc kubenswrapper[4764]: I0309 13:35:51.381144 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/ed37a5d1-5d4b-41fb-8476-189def32c909-apiservice-cert\") pod \"metallb-operator-webhook-server-7bccb4c96-wqdrv\" (UID: \"ed37a5d1-5d4b-41fb-8476-189def32c909\") " pod="metallb-system/metallb-operator-webhook-server-7bccb4c96-wqdrv" Mar 09 13:35:51 crc kubenswrapper[4764]: I0309 13:35:51.381865 4764 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-cn56m\" (UniqueName: \"kubernetes.io/projected/ed37a5d1-5d4b-41fb-8476-189def32c909-kube-api-access-cn56m\") pod \"metallb-operator-webhook-server-7bccb4c96-wqdrv\" (UID: \"ed37a5d1-5d4b-41fb-8476-189def32c909\") " pod="metallb-system/metallb-operator-webhook-server-7bccb4c96-wqdrv" Mar 09 13:35:51 crc kubenswrapper[4764]: I0309 13:35:51.381928 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ed37a5d1-5d4b-41fb-8476-189def32c909-webhook-cert\") pod \"metallb-operator-webhook-server-7bccb4c96-wqdrv\" (UID: \"ed37a5d1-5d4b-41fb-8476-189def32c909\") " pod="metallb-system/metallb-operator-webhook-server-7bccb4c96-wqdrv" Mar 09 13:35:51 crc kubenswrapper[4764]: I0309 13:35:51.386060 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/ed37a5d1-5d4b-41fb-8476-189def32c909-apiservice-cert\") pod \"metallb-operator-webhook-server-7bccb4c96-wqdrv\" (UID: \"ed37a5d1-5d4b-41fb-8476-189def32c909\") " pod="metallb-system/metallb-operator-webhook-server-7bccb4c96-wqdrv" Mar 09 13:35:51 crc kubenswrapper[4764]: I0309 13:35:51.388636 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ed37a5d1-5d4b-41fb-8476-189def32c909-webhook-cert\") pod \"metallb-operator-webhook-server-7bccb4c96-wqdrv\" (UID: \"ed37a5d1-5d4b-41fb-8476-189def32c909\") " pod="metallb-system/metallb-operator-webhook-server-7bccb4c96-wqdrv" Mar 09 13:35:51 crc kubenswrapper[4764]: I0309 13:35:51.406896 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cn56m\" (UniqueName: \"kubernetes.io/projected/ed37a5d1-5d4b-41fb-8476-189def32c909-kube-api-access-cn56m\") pod \"metallb-operator-webhook-server-7bccb4c96-wqdrv\" (UID: \"ed37a5d1-5d4b-41fb-8476-189def32c909\") " 
pod="metallb-system/metallb-operator-webhook-server-7bccb4c96-wqdrv" Mar 09 13:35:51 crc kubenswrapper[4764]: I0309 13:35:51.542127 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-7bccb4c96-wqdrv" Mar 09 13:35:51 crc kubenswrapper[4764]: I0309 13:35:51.628171 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-659b995f59-8s255"] Mar 09 13:35:51 crc kubenswrapper[4764]: W0309 13:35:51.650632 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb89770ec_e502_4b3a_8233_8c9aa76d55de.slice/crio-1c3cc6a46e1308cf3647cb168debebee0e6505ea65ce44bf3c0ba4be43dc8f42 WatchSource:0}: Error finding container 1c3cc6a46e1308cf3647cb168debebee0e6505ea65ce44bf3c0ba4be43dc8f42: Status 404 returned error can't find the container with id 1c3cc6a46e1308cf3647cb168debebee0e6505ea65ce44bf3c0ba4be43dc8f42 Mar 09 13:35:52 crc kubenswrapper[4764]: I0309 13:35:52.213540 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-7bccb4c96-wqdrv"] Mar 09 13:35:52 crc kubenswrapper[4764]: W0309 13:35:52.215610 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poded37a5d1_5d4b_41fb_8476_189def32c909.slice/crio-4fed1d217b5044ce3588ab59fefb0593c45f3823512ce3923b0ff444eb3f2846 WatchSource:0}: Error finding container 4fed1d217b5044ce3588ab59fefb0593c45f3823512ce3923b0ff444eb3f2846: Status 404 returned error can't find the container with id 4fed1d217b5044ce3588ab59fefb0593c45f3823512ce3923b0ff444eb3f2846 Mar 09 13:35:52 crc kubenswrapper[4764]: I0309 13:35:52.468020 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-7bccb4c96-wqdrv" 
event={"ID":"ed37a5d1-5d4b-41fb-8476-189def32c909","Type":"ContainerStarted","Data":"4fed1d217b5044ce3588ab59fefb0593c45f3823512ce3923b0ff444eb3f2846"} Mar 09 13:35:52 crc kubenswrapper[4764]: I0309 13:35:52.470099 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-659b995f59-8s255" event={"ID":"b89770ec-e502-4b3a-8233-8c9aa76d55de","Type":"ContainerStarted","Data":"1c3cc6a46e1308cf3647cb168debebee0e6505ea65ce44bf3c0ba4be43dc8f42"} Mar 09 13:35:58 crc kubenswrapper[4764]: I0309 13:35:58.525588 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-7bccb4c96-wqdrv" event={"ID":"ed37a5d1-5d4b-41fb-8476-189def32c909","Type":"ContainerStarted","Data":"dc1706c51269524f0b79241f7df5be4ff4134518ccda55d641057a2484396f4c"} Mar 09 13:35:58 crc kubenswrapper[4764]: I0309 13:35:58.526291 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-7bccb4c96-wqdrv" Mar 09 13:35:58 crc kubenswrapper[4764]: I0309 13:35:58.527270 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-659b995f59-8s255" event={"ID":"b89770ec-e502-4b3a-8233-8c9aa76d55de","Type":"ContainerStarted","Data":"1d3a57f678830400b5b899f6366dc418082f694e07f45a58c39ad8a42955a633"} Mar 09 13:35:58 crc kubenswrapper[4764]: I0309 13:35:58.527723 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-659b995f59-8s255" Mar 09 13:35:58 crc kubenswrapper[4764]: I0309 13:35:58.551726 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-7bccb4c96-wqdrv" podStartSLOduration=1.9586393439999998 podStartE2EDuration="7.551707375s" podCreationTimestamp="2026-03-09 13:35:51 +0000 UTC" firstStartedPulling="2026-03-09 13:35:52.219178747 +0000 UTC m=+907.469350655" 
lastFinishedPulling="2026-03-09 13:35:57.812246778 +0000 UTC m=+913.062418686" observedRunningTime="2026-03-09 13:35:58.547028849 +0000 UTC m=+913.797200777" watchObservedRunningTime="2026-03-09 13:35:58.551707375 +0000 UTC m=+913.801879283" Mar 09 13:35:58 crc kubenswrapper[4764]: I0309 13:35:58.577415 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-659b995f59-8s255" podStartSLOduration=2.439775926 podStartE2EDuration="8.577391851s" podCreationTimestamp="2026-03-09 13:35:50 +0000 UTC" firstStartedPulling="2026-03-09 13:35:51.65583871 +0000 UTC m=+906.906010618" lastFinishedPulling="2026-03-09 13:35:57.793454635 +0000 UTC m=+913.043626543" observedRunningTime="2026-03-09 13:35:58.576263961 +0000 UTC m=+913.826435869" watchObservedRunningTime="2026-03-09 13:35:58.577391851 +0000 UTC m=+913.827563759" Mar 09 13:35:58 crc kubenswrapper[4764]: I0309 13:35:58.704191 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-vtshp" Mar 09 13:35:58 crc kubenswrapper[4764]: I0309 13:35:58.773138 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-vtshp" Mar 09 13:35:59 crc kubenswrapper[4764]: I0309 13:35:59.323330 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-vtshp"] Mar 09 13:36:00 crc kubenswrapper[4764]: I0309 13:36:00.140480 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29551056-chr2q"] Mar 09 13:36:00 crc kubenswrapper[4764]: I0309 13:36:00.141609 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551056-chr2q"
Mar 09 13:36:00 crc kubenswrapper[4764]: I0309 13:36:00.145192 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-4s7mr"
Mar 09 13:36:00 crc kubenswrapper[4764]: I0309 13:36:00.145207 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 09 13:36:00 crc kubenswrapper[4764]: I0309 13:36:00.145736 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 09 13:36:00 crc kubenswrapper[4764]: I0309 13:36:00.153834 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551056-chr2q"]
Mar 09 13:36:00 crc kubenswrapper[4764]: I0309 13:36:00.224073 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h6dl4\" (UniqueName: \"kubernetes.io/projected/b5ceebdd-e9ad-472a-8806-f5b441ced89a-kube-api-access-h6dl4\") pod \"auto-csr-approver-29551056-chr2q\" (UID: \"b5ceebdd-e9ad-472a-8806-f5b441ced89a\") " pod="openshift-infra/auto-csr-approver-29551056-chr2q"
Mar 09 13:36:00 crc kubenswrapper[4764]: I0309 13:36:00.325339 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h6dl4\" (UniqueName: \"kubernetes.io/projected/b5ceebdd-e9ad-472a-8806-f5b441ced89a-kube-api-access-h6dl4\") pod \"auto-csr-approver-29551056-chr2q\" (UID: \"b5ceebdd-e9ad-472a-8806-f5b441ced89a\") " pod="openshift-infra/auto-csr-approver-29551056-chr2q"
Mar 09 13:36:00 crc kubenswrapper[4764]: I0309 13:36:00.346452 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h6dl4\" (UniqueName: \"kubernetes.io/projected/b5ceebdd-e9ad-472a-8806-f5b441ced89a-kube-api-access-h6dl4\") pod \"auto-csr-approver-29551056-chr2q\" (UID: \"b5ceebdd-e9ad-472a-8806-f5b441ced89a\") " pod="openshift-infra/auto-csr-approver-29551056-chr2q"
Mar 09 13:36:00 crc kubenswrapper[4764]: I0309 13:36:00.498520 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551056-chr2q"
Mar 09 13:36:00 crc kubenswrapper[4764]: I0309 13:36:00.542225 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-vtshp" podUID="0ebb4d3c-cf42-4cc4-9856-8bdbfffbd53a" containerName="registry-server" containerID="cri-o://3a09b89cfa6f58872adaec4a35a3d3cd1dd0dc256900a0efe55e07d8f9199cdc" gracePeriod=2
Mar 09 13:36:00 crc kubenswrapper[4764]: I0309 13:36:00.789363 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551056-chr2q"]
Mar 09 13:36:00 crc kubenswrapper[4764]: I0309 13:36:00.911882 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-vtshp"
Mar 09 13:36:01 crc kubenswrapper[4764]: I0309 13:36:01.042830 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0ebb4d3c-cf42-4cc4-9856-8bdbfffbd53a-utilities\") pod \"0ebb4d3c-cf42-4cc4-9856-8bdbfffbd53a\" (UID: \"0ebb4d3c-cf42-4cc4-9856-8bdbfffbd53a\") "
Mar 09 13:36:01 crc kubenswrapper[4764]: I0309 13:36:01.042907 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0ebb4d3c-cf42-4cc4-9856-8bdbfffbd53a-catalog-content\") pod \"0ebb4d3c-cf42-4cc4-9856-8bdbfffbd53a\" (UID: \"0ebb4d3c-cf42-4cc4-9856-8bdbfffbd53a\") "
Mar 09 13:36:01 crc kubenswrapper[4764]: I0309 13:36:01.042933 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-55cnb\" (UniqueName: \"kubernetes.io/projected/0ebb4d3c-cf42-4cc4-9856-8bdbfffbd53a-kube-api-access-55cnb\") pod \"0ebb4d3c-cf42-4cc4-9856-8bdbfffbd53a\" (UID: \"0ebb4d3c-cf42-4cc4-9856-8bdbfffbd53a\") "
Mar 09 13:36:01 crc kubenswrapper[4764]: I0309 13:36:01.044278 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0ebb4d3c-cf42-4cc4-9856-8bdbfffbd53a-utilities" (OuterVolumeSpecName: "utilities") pod "0ebb4d3c-cf42-4cc4-9856-8bdbfffbd53a" (UID: "0ebb4d3c-cf42-4cc4-9856-8bdbfffbd53a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 09 13:36:01 crc kubenswrapper[4764]: I0309 13:36:01.048413 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0ebb4d3c-cf42-4cc4-9856-8bdbfffbd53a-kube-api-access-55cnb" (OuterVolumeSpecName: "kube-api-access-55cnb") pod "0ebb4d3c-cf42-4cc4-9856-8bdbfffbd53a" (UID: "0ebb4d3c-cf42-4cc4-9856-8bdbfffbd53a"). InnerVolumeSpecName "kube-api-access-55cnb". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 13:36:01 crc kubenswrapper[4764]: I0309 13:36:01.144478 4764 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0ebb4d3c-cf42-4cc4-9856-8bdbfffbd53a-utilities\") on node \"crc\" DevicePath \"\""
Mar 09 13:36:01 crc kubenswrapper[4764]: I0309 13:36:01.144517 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-55cnb\" (UniqueName: \"kubernetes.io/projected/0ebb4d3c-cf42-4cc4-9856-8bdbfffbd53a-kube-api-access-55cnb\") on node \"crc\" DevicePath \"\""
Mar 09 13:36:01 crc kubenswrapper[4764]: I0309 13:36:01.164414 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0ebb4d3c-cf42-4cc4-9856-8bdbfffbd53a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0ebb4d3c-cf42-4cc4-9856-8bdbfffbd53a" (UID: "0ebb4d3c-cf42-4cc4-9856-8bdbfffbd53a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 09 13:36:01 crc kubenswrapper[4764]: I0309 13:36:01.246513 4764 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0ebb4d3c-cf42-4cc4-9856-8bdbfffbd53a-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 09 13:36:01 crc kubenswrapper[4764]: I0309 13:36:01.549768 4764 generic.go:334] "Generic (PLEG): container finished" podID="0ebb4d3c-cf42-4cc4-9856-8bdbfffbd53a" containerID="3a09b89cfa6f58872adaec4a35a3d3cd1dd0dc256900a0efe55e07d8f9199cdc" exitCode=0
Mar 09 13:36:01 crc kubenswrapper[4764]: I0309 13:36:01.549823 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vtshp" event={"ID":"0ebb4d3c-cf42-4cc4-9856-8bdbfffbd53a","Type":"ContainerDied","Data":"3a09b89cfa6f58872adaec4a35a3d3cd1dd0dc256900a0efe55e07d8f9199cdc"}
Mar 09 13:36:01 crc kubenswrapper[4764]: I0309 13:36:01.549849 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vtshp" event={"ID":"0ebb4d3c-cf42-4cc4-9856-8bdbfffbd53a","Type":"ContainerDied","Data":"b1aabe863f1f8bacecabae737d4453f4b2305ab4eb3778cf0d6e8b21552025a9"}
Mar 09 13:36:01 crc kubenswrapper[4764]: I0309 13:36:01.549865 4764 scope.go:117] "RemoveContainer" containerID="3a09b89cfa6f58872adaec4a35a3d3cd1dd0dc256900a0efe55e07d8f9199cdc"
Mar 09 13:36:01 crc kubenswrapper[4764]: I0309 13:36:01.549982 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-vtshp"
Mar 09 13:36:01 crc kubenswrapper[4764]: I0309 13:36:01.552398 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551056-chr2q" event={"ID":"b5ceebdd-e9ad-472a-8806-f5b441ced89a","Type":"ContainerStarted","Data":"25f6a3f57e2ed3f49a95fdf9ce89a920cef1531b81bc2f3426cf66871e5a53ad"}
Mar 09 13:36:01 crc kubenswrapper[4764]: I0309 13:36:01.580941 4764 scope.go:117] "RemoveContainer" containerID="865c7b22e00481478eb7d002c1f40157c39f86e5b45add0664ca71524b2698e9"
Mar 09 13:36:01 crc kubenswrapper[4764]: I0309 13:36:01.581734 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-vtshp"]
Mar 09 13:36:01 crc kubenswrapper[4764]: I0309 13:36:01.585474 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-vtshp"]
Mar 09 13:36:01 crc kubenswrapper[4764]: I0309 13:36:01.599595 4764 scope.go:117] "RemoveContainer" containerID="cd30553e3203601e67a4a200045a0ff445fed5ef9355544a81ecb96bd9dfbcc0"
Mar 09 13:36:01 crc kubenswrapper[4764]: I0309 13:36:01.623762 4764 scope.go:117] "RemoveContainer" containerID="3a09b89cfa6f58872adaec4a35a3d3cd1dd0dc256900a0efe55e07d8f9199cdc"
Mar 09 13:36:01 crc kubenswrapper[4764]: E0309 13:36:01.624447 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3a09b89cfa6f58872adaec4a35a3d3cd1dd0dc256900a0efe55e07d8f9199cdc\": container with ID starting with 3a09b89cfa6f58872adaec4a35a3d3cd1dd0dc256900a0efe55e07d8f9199cdc not found: ID does not exist" containerID="3a09b89cfa6f58872adaec4a35a3d3cd1dd0dc256900a0efe55e07d8f9199cdc"
Mar 09 13:36:01 crc kubenswrapper[4764]: I0309 13:36:01.624690 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3a09b89cfa6f58872adaec4a35a3d3cd1dd0dc256900a0efe55e07d8f9199cdc"} err="failed to get container status \"3a09b89cfa6f58872adaec4a35a3d3cd1dd0dc256900a0efe55e07d8f9199cdc\": rpc error: code = NotFound desc = could not find container \"3a09b89cfa6f58872adaec4a35a3d3cd1dd0dc256900a0efe55e07d8f9199cdc\": container with ID starting with 3a09b89cfa6f58872adaec4a35a3d3cd1dd0dc256900a0efe55e07d8f9199cdc not found: ID does not exist"
Mar 09 13:36:01 crc kubenswrapper[4764]: I0309 13:36:01.624815 4764 scope.go:117] "RemoveContainer" containerID="865c7b22e00481478eb7d002c1f40157c39f86e5b45add0664ca71524b2698e9"
Mar 09 13:36:01 crc kubenswrapper[4764]: E0309 13:36:01.625477 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"865c7b22e00481478eb7d002c1f40157c39f86e5b45add0664ca71524b2698e9\": container with ID starting with 865c7b22e00481478eb7d002c1f40157c39f86e5b45add0664ca71524b2698e9 not found: ID does not exist" containerID="865c7b22e00481478eb7d002c1f40157c39f86e5b45add0664ca71524b2698e9"
Mar 09 13:36:01 crc kubenswrapper[4764]: I0309 13:36:01.625514 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"865c7b22e00481478eb7d002c1f40157c39f86e5b45add0664ca71524b2698e9"} err="failed to get container status \"865c7b22e00481478eb7d002c1f40157c39f86e5b45add0664ca71524b2698e9\": rpc error: code = NotFound desc = could not find container \"865c7b22e00481478eb7d002c1f40157c39f86e5b45add0664ca71524b2698e9\": container with ID starting with 865c7b22e00481478eb7d002c1f40157c39f86e5b45add0664ca71524b2698e9 not found: ID does not exist"
Mar 09 13:36:01 crc kubenswrapper[4764]: I0309 13:36:01.625536 4764 scope.go:117] "RemoveContainer" containerID="cd30553e3203601e67a4a200045a0ff445fed5ef9355544a81ecb96bd9dfbcc0"
Mar 09 13:36:01 crc kubenswrapper[4764]: E0309 13:36:01.625855 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cd30553e3203601e67a4a200045a0ff445fed5ef9355544a81ecb96bd9dfbcc0\": container with ID starting with cd30553e3203601e67a4a200045a0ff445fed5ef9355544a81ecb96bd9dfbcc0 not found: ID does not exist" containerID="cd30553e3203601e67a4a200045a0ff445fed5ef9355544a81ecb96bd9dfbcc0"
Mar 09 13:36:01 crc kubenswrapper[4764]: I0309 13:36:01.625878 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cd30553e3203601e67a4a200045a0ff445fed5ef9355544a81ecb96bd9dfbcc0"} err="failed to get container status \"cd30553e3203601e67a4a200045a0ff445fed5ef9355544a81ecb96bd9dfbcc0\": rpc error: code = NotFound desc = could not find container \"cd30553e3203601e67a4a200045a0ff445fed5ef9355544a81ecb96bd9dfbcc0\": container with ID starting with cd30553e3203601e67a4a200045a0ff445fed5ef9355544a81ecb96bd9dfbcc0 not found: ID does not exist"
Mar 09 13:36:02 crc kubenswrapper[4764]: I0309 13:36:02.561226 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551056-chr2q" event={"ID":"b5ceebdd-e9ad-472a-8806-f5b441ced89a","Type":"ContainerStarted","Data":"b34325cd4e2f6811cadb8554b8a6fb24248546f3c4182376f60b7c3268c9f6c7"}
Mar 09 13:36:02 crc kubenswrapper[4764]: I0309 13:36:02.578858 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29551056-chr2q" podStartSLOduration=1.129425532 podStartE2EDuration="2.578837346s" podCreationTimestamp="2026-03-09 13:36:00 +0000 UTC" firstStartedPulling="2026-03-09 13:36:00.811742487 +0000 UTC m=+916.061914395" lastFinishedPulling="2026-03-09 13:36:02.261154301 +0000 UTC m=+917.511326209" observedRunningTime="2026-03-09 13:36:02.575423014 +0000 UTC m=+917.825594922" watchObservedRunningTime="2026-03-09 13:36:02.578837346 +0000 UTC m=+917.829009254"
Mar 09 13:36:03 crc kubenswrapper[4764]: I0309 13:36:03.567607 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0ebb4d3c-cf42-4cc4-9856-8bdbfffbd53a" path="/var/lib/kubelet/pods/0ebb4d3c-cf42-4cc4-9856-8bdbfffbd53a/volumes"
Mar 09 13:36:03 crc kubenswrapper[4764]: I0309 13:36:03.569376 4764 generic.go:334] "Generic (PLEG): container finished" podID="b5ceebdd-e9ad-472a-8806-f5b441ced89a" containerID="b34325cd4e2f6811cadb8554b8a6fb24248546f3c4182376f60b7c3268c9f6c7" exitCode=0
Mar 09 13:36:03 crc kubenswrapper[4764]: I0309 13:36:03.569408 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551056-chr2q" event={"ID":"b5ceebdd-e9ad-472a-8806-f5b441ced89a","Type":"ContainerDied","Data":"b34325cd4e2f6811cadb8554b8a6fb24248546f3c4182376f60b7c3268c9f6c7"}
Mar 09 13:36:04 crc kubenswrapper[4764]: I0309 13:36:04.896096 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551056-chr2q"
Mar 09 13:36:05 crc kubenswrapper[4764]: I0309 13:36:05.005743 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h6dl4\" (UniqueName: \"kubernetes.io/projected/b5ceebdd-e9ad-472a-8806-f5b441ced89a-kube-api-access-h6dl4\") pod \"b5ceebdd-e9ad-472a-8806-f5b441ced89a\" (UID: \"b5ceebdd-e9ad-472a-8806-f5b441ced89a\") "
Mar 09 13:36:05 crc kubenswrapper[4764]: I0309 13:36:05.014845 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b5ceebdd-e9ad-472a-8806-f5b441ced89a-kube-api-access-h6dl4" (OuterVolumeSpecName: "kube-api-access-h6dl4") pod "b5ceebdd-e9ad-472a-8806-f5b441ced89a" (UID: "b5ceebdd-e9ad-472a-8806-f5b441ced89a"). InnerVolumeSpecName "kube-api-access-h6dl4". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 13:36:05 crc kubenswrapper[4764]: I0309 13:36:05.107927 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h6dl4\" (UniqueName: \"kubernetes.io/projected/b5ceebdd-e9ad-472a-8806-f5b441ced89a-kube-api-access-h6dl4\") on node \"crc\" DevicePath \"\""
Mar 09 13:36:05 crc kubenswrapper[4764]: I0309 13:36:05.592228 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551056-chr2q" event={"ID":"b5ceebdd-e9ad-472a-8806-f5b441ced89a","Type":"ContainerDied","Data":"25f6a3f57e2ed3f49a95fdf9ce89a920cef1531b81bc2f3426cf66871e5a53ad"}
Mar 09 13:36:05 crc kubenswrapper[4764]: I0309 13:36:05.592293 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="25f6a3f57e2ed3f49a95fdf9ce89a920cef1531b81bc2f3426cf66871e5a53ad"
Mar 09 13:36:05 crc kubenswrapper[4764]: I0309 13:36:05.592303 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551056-chr2q"
Mar 09 13:36:05 crc kubenswrapper[4764]: I0309 13:36:05.641282 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29551050-zmlhn"]
Mar 09 13:36:05 crc kubenswrapper[4764]: I0309 13:36:05.645498 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29551050-zmlhn"]
Mar 09 13:36:07 crc kubenswrapper[4764]: I0309 13:36:07.570846 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7f815cd5-462f-4994-bab1-beef4157b06e" path="/var/lib/kubelet/pods/7f815cd5-462f-4994-bab1-beef4157b06e/volumes"
Mar 09 13:36:11 crc kubenswrapper[4764]: I0309 13:36:11.551436 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-7bccb4c96-wqdrv"
Mar 09 13:36:31 crc kubenswrapper[4764]: I0309 13:36:31.256911 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-659b995f59-8s255"
Mar 09 13:36:31 crc kubenswrapper[4764]: I0309 13:36:31.911801 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-kl47c"]
Mar 09 13:36:31 crc kubenswrapper[4764]: E0309 13:36:31.912528 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0ebb4d3c-cf42-4cc4-9856-8bdbfffbd53a" containerName="extract-utilities"
Mar 09 13:36:31 crc kubenswrapper[4764]: I0309 13:36:31.912554 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ebb4d3c-cf42-4cc4-9856-8bdbfffbd53a" containerName="extract-utilities"
Mar 09 13:36:31 crc kubenswrapper[4764]: E0309 13:36:31.912569 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0ebb4d3c-cf42-4cc4-9856-8bdbfffbd53a" containerName="registry-server"
Mar 09 13:36:31 crc kubenswrapper[4764]: I0309 13:36:31.912576 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ebb4d3c-cf42-4cc4-9856-8bdbfffbd53a" containerName="registry-server"
Mar 09 13:36:31 crc kubenswrapper[4764]: E0309 13:36:31.912583 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b5ceebdd-e9ad-472a-8806-f5b441ced89a" containerName="oc"
Mar 09 13:36:31 crc kubenswrapper[4764]: I0309 13:36:31.912592 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="b5ceebdd-e9ad-472a-8806-f5b441ced89a" containerName="oc"
Mar 09 13:36:31 crc kubenswrapper[4764]: E0309 13:36:31.912612 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0ebb4d3c-cf42-4cc4-9856-8bdbfffbd53a" containerName="extract-content"
Mar 09 13:36:31 crc kubenswrapper[4764]: I0309 13:36:31.912620 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ebb4d3c-cf42-4cc4-9856-8bdbfffbd53a" containerName="extract-content"
Mar 09 13:36:31 crc kubenswrapper[4764]: I0309 13:36:31.912749 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="b5ceebdd-e9ad-472a-8806-f5b441ced89a" containerName="oc"
Mar 09 13:36:31 crc kubenswrapper[4764]: I0309 13:36:31.912765 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="0ebb4d3c-cf42-4cc4-9856-8bdbfffbd53a" containerName="registry-server"
Mar 09 13:36:31 crc kubenswrapper[4764]: I0309 13:36:31.915154 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-kl47c"
Mar 09 13:36:31 crc kubenswrapper[4764]: I0309 13:36:31.918994 4764 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret"
Mar 09 13:36:31 crc kubenswrapper[4764]: I0309 13:36:31.919316 4764 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-nw2wr"
Mar 09 13:36:31 crc kubenswrapper[4764]: I0309 13:36:31.919331 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-7f989f654f-wqd8z"]
Mar 09 13:36:31 crc kubenswrapper[4764]: I0309 13:36:31.920077 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup"
Mar 09 13:36:31 crc kubenswrapper[4764]: I0309 13:36:31.920533 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-7f989f654f-wqd8z"
Mar 09 13:36:31 crc kubenswrapper[4764]: I0309 13:36:31.923149 4764 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert"
Mar 09 13:36:31 crc kubenswrapper[4764]: I0309 13:36:31.943061 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-7f989f654f-wqd8z"]
Mar 09 13:36:32 crc kubenswrapper[4764]: I0309 13:36:32.020138 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bfznq\" (UniqueName: \"kubernetes.io/projected/72efa175-2568-4c62-a97e-35893887fe82-kube-api-access-bfznq\") pod \"frr-k8s-webhook-server-7f989f654f-wqd8z\" (UID: \"72efa175-2568-4c62-a97e-35893887fe82\") " pod="metallb-system/frr-k8s-webhook-server-7f989f654f-wqd8z"
Mar 09 13:36:32 crc kubenswrapper[4764]: I0309 13:36:32.020227 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/9333a95c-85e4-4e7d-a142-ae2dd06b4146-frr-conf\") pod \"frr-k8s-kl47c\" (UID: \"9333a95c-85e4-4e7d-a142-ae2dd06b4146\") " pod="metallb-system/frr-k8s-kl47c"
Mar 09 13:36:32 crc kubenswrapper[4764]: I0309 13:36:32.020254 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9333a95c-85e4-4e7d-a142-ae2dd06b4146-metrics-certs\") pod \"frr-k8s-kl47c\" (UID: \"9333a95c-85e4-4e7d-a142-ae2dd06b4146\") " pod="metallb-system/frr-k8s-kl47c"
Mar 09 13:36:32 crc kubenswrapper[4764]: I0309 13:36:32.020271 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/9333a95c-85e4-4e7d-a142-ae2dd06b4146-frr-sockets\") pod \"frr-k8s-kl47c\" (UID: \"9333a95c-85e4-4e7d-a142-ae2dd06b4146\") " pod="metallb-system/frr-k8s-kl47c"
Mar 09 13:36:32 crc kubenswrapper[4764]: I0309 13:36:32.020292 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/9333a95c-85e4-4e7d-a142-ae2dd06b4146-metrics\") pod \"frr-k8s-kl47c\" (UID: \"9333a95c-85e4-4e7d-a142-ae2dd06b4146\") " pod="metallb-system/frr-k8s-kl47c"
Mar 09 13:36:32 crc kubenswrapper[4764]: I0309 13:36:32.020315 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/72efa175-2568-4c62-a97e-35893887fe82-cert\") pod \"frr-k8s-webhook-server-7f989f654f-wqd8z\" (UID: \"72efa175-2568-4c62-a97e-35893887fe82\") " pod="metallb-system/frr-k8s-webhook-server-7f989f654f-wqd8z"
Mar 09 13:36:32 crc kubenswrapper[4764]: I0309 13:36:32.020339 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/9333a95c-85e4-4e7d-a142-ae2dd06b4146-reloader\") pod \"frr-k8s-kl47c\" (UID: \"9333a95c-85e4-4e7d-a142-ae2dd06b4146\") " pod="metallb-system/frr-k8s-kl47c"
Mar 09 13:36:32 crc kubenswrapper[4764]: I0309 13:36:32.020383 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/9333a95c-85e4-4e7d-a142-ae2dd06b4146-frr-startup\") pod \"frr-k8s-kl47c\" (UID: \"9333a95c-85e4-4e7d-a142-ae2dd06b4146\") " pod="metallb-system/frr-k8s-kl47c"
Mar 09 13:36:32 crc kubenswrapper[4764]: I0309 13:36:32.020404 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-df4c8\" (UniqueName: \"kubernetes.io/projected/9333a95c-85e4-4e7d-a142-ae2dd06b4146-kube-api-access-df4c8\") pod \"frr-k8s-kl47c\" (UID: \"9333a95c-85e4-4e7d-a142-ae2dd06b4146\") " pod="metallb-system/frr-k8s-kl47c"
Mar 09 13:36:32 crc kubenswrapper[4764]: I0309 13:36:32.036929 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-2z5wp"]
Mar 09 13:36:32 crc kubenswrapper[4764]: I0309 13:36:32.038356 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-2z5wp"
Mar 09 13:36:32 crc kubenswrapper[4764]: I0309 13:36:32.041494 4764 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist"
Mar 09 13:36:32 crc kubenswrapper[4764]: I0309 13:36:32.041876 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2"
Mar 09 13:36:32 crc kubenswrapper[4764]: I0309 13:36:32.042018 4764 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret"
Mar 09 13:36:32 crc kubenswrapper[4764]: I0309 13:36:32.042303 4764 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-9bjvr"
Mar 09 13:36:32 crc kubenswrapper[4764]: I0309 13:36:32.069186 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-86ddb6bd46-lgrkv"]
Mar 09 13:36:32 crc kubenswrapper[4764]: I0309 13:36:32.070312 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/controller-86ddb6bd46-lgrkv"
Mar 09 13:36:32 crc kubenswrapper[4764]: I0309 13:36:32.073143 4764 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret"
Mar 09 13:36:32 crc kubenswrapper[4764]: I0309 13:36:32.081956 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-86ddb6bd46-lgrkv"]
Mar 09 13:36:32 crc kubenswrapper[4764]: I0309 13:36:32.121760 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bfznq\" (UniqueName: \"kubernetes.io/projected/72efa175-2568-4c62-a97e-35893887fe82-kube-api-access-bfznq\") pod \"frr-k8s-webhook-server-7f989f654f-wqd8z\" (UID: \"72efa175-2568-4c62-a97e-35893887fe82\") " pod="metallb-system/frr-k8s-webhook-server-7f989f654f-wqd8z"
Mar 09 13:36:32 crc kubenswrapper[4764]: I0309 13:36:32.121832 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/bfd899d4-a0df-47e3-aa36-1cf690235c45-memberlist\") pod \"speaker-2z5wp\" (UID: \"bfd899d4-a0df-47e3-aa36-1cf690235c45\") " pod="metallb-system/speaker-2z5wp"
Mar 09 13:36:32 crc kubenswrapper[4764]: I0309 13:36:32.121894 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/9333a95c-85e4-4e7d-a142-ae2dd06b4146-frr-conf\") pod \"frr-k8s-kl47c\" (UID: \"9333a95c-85e4-4e7d-a142-ae2dd06b4146\") " pod="metallb-system/frr-k8s-kl47c"
Mar 09 13:36:32 crc kubenswrapper[4764]: I0309 13:36:32.121912 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9333a95c-85e4-4e7d-a142-ae2dd06b4146-metrics-certs\") pod \"frr-k8s-kl47c\" (UID: \"9333a95c-85e4-4e7d-a142-ae2dd06b4146\") " pod="metallb-system/frr-k8s-kl47c"
Mar 09 13:36:32 crc kubenswrapper[4764]: I0309 13:36:32.121935 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/9333a95c-85e4-4e7d-a142-ae2dd06b4146-frr-sockets\") pod \"frr-k8s-kl47c\" (UID: \"9333a95c-85e4-4e7d-a142-ae2dd06b4146\") " pod="metallb-system/frr-k8s-kl47c"
Mar 09 13:36:32 crc kubenswrapper[4764]: I0309 13:36:32.121953 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/9333a95c-85e4-4e7d-a142-ae2dd06b4146-metrics\") pod \"frr-k8s-kl47c\" (UID: \"9333a95c-85e4-4e7d-a142-ae2dd06b4146\") " pod="metallb-system/frr-k8s-kl47c"
Mar 09 13:36:32 crc kubenswrapper[4764]: I0309 13:36:32.121971 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/72efa175-2568-4c62-a97e-35893887fe82-cert\") pod \"frr-k8s-webhook-server-7f989f654f-wqd8z\" (UID: \"72efa175-2568-4c62-a97e-35893887fe82\") " pod="metallb-system/frr-k8s-webhook-server-7f989f654f-wqd8z"
Mar 09 13:36:32 crc kubenswrapper[4764]: I0309 13:36:32.121995 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/9333a95c-85e4-4e7d-a142-ae2dd06b4146-reloader\") pod \"frr-k8s-kl47c\" (UID: \"9333a95c-85e4-4e7d-a142-ae2dd06b4146\") " pod="metallb-system/frr-k8s-kl47c"
Mar 09 13:36:32 crc kubenswrapper[4764]: I0309 13:36:32.122031 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rcrdv\" (UniqueName: \"kubernetes.io/projected/bfd899d4-a0df-47e3-aa36-1cf690235c45-kube-api-access-rcrdv\") pod \"speaker-2z5wp\" (UID: \"bfd899d4-a0df-47e3-aa36-1cf690235c45\") " pod="metallb-system/speaker-2z5wp"
Mar 09 13:36:32 crc kubenswrapper[4764]: E0309 13:36:32.122050 4764 secret.go:188] Couldn't get secret metallb-system/frr-k8s-certs-secret: secret "frr-k8s-certs-secret" not found
Mar 09 13:36:32 crc kubenswrapper[4764]: E0309 13:36:32.122129 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9333a95c-85e4-4e7d-a142-ae2dd06b4146-metrics-certs podName:9333a95c-85e4-4e7d-a142-ae2dd06b4146 nodeName:}" failed. No retries permitted until 2026-03-09 13:36:32.62210324 +0000 UTC m=+947.872275148 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9333a95c-85e4-4e7d-a142-ae2dd06b4146-metrics-certs") pod "frr-k8s-kl47c" (UID: "9333a95c-85e4-4e7d-a142-ae2dd06b4146") : secret "frr-k8s-certs-secret" not found
Mar 09 13:36:32 crc kubenswrapper[4764]: I0309 13:36:32.122054 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/bfd899d4-a0df-47e3-aa36-1cf690235c45-metallb-excludel2\") pod \"speaker-2z5wp\" (UID: \"bfd899d4-a0df-47e3-aa36-1cf690235c45\") " pod="metallb-system/speaker-2z5wp"
Mar 09 13:36:32 crc kubenswrapper[4764]: I0309 13:36:32.122568 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/9333a95c-85e4-4e7d-a142-ae2dd06b4146-frr-startup\") pod \"frr-k8s-kl47c\" (UID: \"9333a95c-85e4-4e7d-a142-ae2dd06b4146\") " pod="metallb-system/frr-k8s-kl47c"
Mar 09 13:36:32 crc kubenswrapper[4764]: I0309 13:36:32.122688 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-df4c8\" (UniqueName: \"kubernetes.io/projected/9333a95c-85e4-4e7d-a142-ae2dd06b4146-kube-api-access-df4c8\") pod \"frr-k8s-kl47c\" (UID: \"9333a95c-85e4-4e7d-a142-ae2dd06b4146\") " pod="metallb-system/frr-k8s-kl47c"
Mar 09 13:36:32 crc kubenswrapper[4764]: I0309 13:36:32.122811 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bfd899d4-a0df-47e3-aa36-1cf690235c45-metrics-certs\") pod \"speaker-2z5wp\" (UID: \"bfd899d4-a0df-47e3-aa36-1cf690235c45\") " pod="metallb-system/speaker-2z5wp"
Mar 09 13:36:32 crc kubenswrapper[4764]: I0309 13:36:32.122812 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/9333a95c-85e4-4e7d-a142-ae2dd06b4146-frr-conf\") pod \"frr-k8s-kl47c\" (UID: \"9333a95c-85e4-4e7d-a142-ae2dd06b4146\") " pod="metallb-system/frr-k8s-kl47c"
Mar 09 13:36:32 crc kubenswrapper[4764]: I0309 13:36:32.123109 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/9333a95c-85e4-4e7d-a142-ae2dd06b4146-frr-sockets\") pod \"frr-k8s-kl47c\" (UID: \"9333a95c-85e4-4e7d-a142-ae2dd06b4146\") " pod="metallb-system/frr-k8s-kl47c"
Mar 09 13:36:32 crc kubenswrapper[4764]: I0309 13:36:32.123151 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/9333a95c-85e4-4e7d-a142-ae2dd06b4146-reloader\") pod \"frr-k8s-kl47c\" (UID: \"9333a95c-85e4-4e7d-a142-ae2dd06b4146\") " pod="metallb-system/frr-k8s-kl47c"
Mar 09 13:36:32 crc kubenswrapper[4764]: I0309 13:36:32.123276 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/9333a95c-85e4-4e7d-a142-ae2dd06b4146-metrics\") pod \"frr-k8s-kl47c\" (UID: \"9333a95c-85e4-4e7d-a142-ae2dd06b4146\") " pod="metallb-system/frr-k8s-kl47c"
Mar 09 13:36:32 crc kubenswrapper[4764]: I0309 13:36:32.123810 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/9333a95c-85e4-4e7d-a142-ae2dd06b4146-frr-startup\") pod \"frr-k8s-kl47c\" (UID: \"9333a95c-85e4-4e7d-a142-ae2dd06b4146\") " pod="metallb-system/frr-k8s-kl47c"
Mar 09 13:36:32 crc kubenswrapper[4764]: I0309 13:36:32.130037 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/72efa175-2568-4c62-a97e-35893887fe82-cert\") pod \"frr-k8s-webhook-server-7f989f654f-wqd8z\" (UID: \"72efa175-2568-4c62-a97e-35893887fe82\") " pod="metallb-system/frr-k8s-webhook-server-7f989f654f-wqd8z"
Mar 09 13:36:32 crc kubenswrapper[4764]: I0309 13:36:32.142967 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-df4c8\" (UniqueName: \"kubernetes.io/projected/9333a95c-85e4-4e7d-a142-ae2dd06b4146-kube-api-access-df4c8\") pod \"frr-k8s-kl47c\" (UID: \"9333a95c-85e4-4e7d-a142-ae2dd06b4146\") " pod="metallb-system/frr-k8s-kl47c"
Mar 09 13:36:32 crc kubenswrapper[4764]: I0309 13:36:32.156037 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bfznq\" (UniqueName: \"kubernetes.io/projected/72efa175-2568-4c62-a97e-35893887fe82-kube-api-access-bfznq\") pod \"frr-k8s-webhook-server-7f989f654f-wqd8z\" (UID: \"72efa175-2568-4c62-a97e-35893887fe82\") " pod="metallb-system/frr-k8s-webhook-server-7f989f654f-wqd8z"
Mar 09 13:36:32 crc kubenswrapper[4764]: I0309 13:36:32.224508 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rcrdv\" (UniqueName: \"kubernetes.io/projected/bfd899d4-a0df-47e3-aa36-1cf690235c45-kube-api-access-rcrdv\") pod \"speaker-2z5wp\" (UID: \"bfd899d4-a0df-47e3-aa36-1cf690235c45\") " pod="metallb-system/speaker-2z5wp"
Mar 09 13:36:32 crc kubenswrapper[4764]: I0309 13:36:32.224595 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/709e786e-5c7d-45d3-ac38-78351dfbec81-metrics-certs\") pod \"controller-86ddb6bd46-lgrkv\" (UID: \"709e786e-5c7d-45d3-ac38-78351dfbec81\") " pod="metallb-system/controller-86ddb6bd46-lgrkv"
Mar 09 13:36:32 crc kubenswrapper[4764]: I0309 13:36:32.224683 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/bfd899d4-a0df-47e3-aa36-1cf690235c45-metallb-excludel2\") pod \"speaker-2z5wp\" (UID: \"bfd899d4-a0df-47e3-aa36-1cf690235c45\") " pod="metallb-system/speaker-2z5wp"
Mar 09 13:36:32 crc kubenswrapper[4764]: I0309 13:36:32.224740 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2ft42\" (UniqueName: \"kubernetes.io/projected/709e786e-5c7d-45d3-ac38-78351dfbec81-kube-api-access-2ft42\") pod \"controller-86ddb6bd46-lgrkv\" (UID: \"709e786e-5c7d-45d3-ac38-78351dfbec81\") " pod="metallb-system/controller-86ddb6bd46-lgrkv"
Mar 09 13:36:32 crc kubenswrapper[4764]: I0309 13:36:32.224773 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bfd899d4-a0df-47e3-aa36-1cf690235c45-metrics-certs\") pod \"speaker-2z5wp\" (UID: \"bfd899d4-a0df-47e3-aa36-1cf690235c45\") " pod="metallb-system/speaker-2z5wp"
Mar 09 13:36:32 crc kubenswrapper[4764]: I0309 13:36:32.224818 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/bfd899d4-a0df-47e3-aa36-1cf690235c45-memberlist\") pod \"speaker-2z5wp\" (UID: \"bfd899d4-a0df-47e3-aa36-1cf690235c45\") " pod="metallb-system/speaker-2z5wp"
Mar 09 13:36:32 crc kubenswrapper[4764]: I0309 13:36:32.224852 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/709e786e-5c7d-45d3-ac38-78351dfbec81-cert\") pod \"controller-86ddb6bd46-lgrkv\" (UID: \"709e786e-5c7d-45d3-ac38-78351dfbec81\") " pod="metallb-system/controller-86ddb6bd46-lgrkv"
Mar 09 13:36:32 crc kubenswrapper[4764]: E0309 13:36:32.225036 4764 secret.go:188] Couldn't get secret metallb-system/speaker-certs-secret: secret "speaker-certs-secret" not found
Mar 09 13:36:32 crc kubenswrapper[4764]: E0309 13:36:32.225097 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bfd899d4-a0df-47e3-aa36-1cf690235c45-metrics-certs podName:bfd899d4-a0df-47e3-aa36-1cf690235c45 nodeName:}" failed. No retries permitted until 2026-03-09 13:36:32.725076354 +0000 UTC m=+947.975248272 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/bfd899d4-a0df-47e3-aa36-1cf690235c45-metrics-certs") pod "speaker-2z5wp" (UID: "bfd899d4-a0df-47e3-aa36-1cf690235c45") : secret "speaker-certs-secret" not found
Mar 09 13:36:32 crc kubenswrapper[4764]: E0309 13:36:32.225407 4764 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found
Mar 09 13:36:32 crc kubenswrapper[4764]: E0309 13:36:32.225455 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bfd899d4-a0df-47e3-aa36-1cf690235c45-memberlist podName:bfd899d4-a0df-47e3-aa36-1cf690235c45 nodeName:}" failed. No retries permitted until 2026-03-09 13:36:32.725445213 +0000 UTC m=+947.975617141 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/bfd899d4-a0df-47e3-aa36-1cf690235c45-memberlist") pod "speaker-2z5wp" (UID: "bfd899d4-a0df-47e3-aa36-1cf690235c45") : secret "metallb-memberlist" not found
Mar 09 13:36:32 crc kubenswrapper[4764]: I0309 13:36:32.225594 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/bfd899d4-a0df-47e3-aa36-1cf690235c45-metallb-excludel2\") pod \"speaker-2z5wp\" (UID: \"bfd899d4-a0df-47e3-aa36-1cf690235c45\") " pod="metallb-system/speaker-2z5wp"
Mar 09 13:36:32 crc kubenswrapper[4764]: I0309 13:36:32.242812 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-7f989f654f-wqd8z"
Mar 09 13:36:32 crc kubenswrapper[4764]: I0309 13:36:32.247251 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rcrdv\" (UniqueName: \"kubernetes.io/projected/bfd899d4-a0df-47e3-aa36-1cf690235c45-kube-api-access-rcrdv\") pod \"speaker-2z5wp\" (UID: \"bfd899d4-a0df-47e3-aa36-1cf690235c45\") " pod="metallb-system/speaker-2z5wp"
Mar 09 13:36:32 crc kubenswrapper[4764]: I0309 13:36:32.326541 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/709e786e-5c7d-45d3-ac38-78351dfbec81-metrics-certs\") pod \"controller-86ddb6bd46-lgrkv\" (UID: \"709e786e-5c7d-45d3-ac38-78351dfbec81\") " pod="metallb-system/controller-86ddb6bd46-lgrkv"
Mar 09 13:36:32 crc kubenswrapper[4764]: I0309 13:36:32.326628 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2ft42\" (UniqueName: \"kubernetes.io/projected/709e786e-5c7d-45d3-ac38-78351dfbec81-kube-api-access-2ft42\") pod \"controller-86ddb6bd46-lgrkv\" (UID: \"709e786e-5c7d-45d3-ac38-78351dfbec81\") " pod="metallb-system/controller-86ddb6bd46-lgrkv"
Mar 09 13:36:32 crc kubenswrapper[4764]: I0309 13:36:32.326734 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/709e786e-5c7d-45d3-ac38-78351dfbec81-cert\") pod \"controller-86ddb6bd46-lgrkv\" (UID: \"709e786e-5c7d-45d3-ac38-78351dfbec81\") " pod="metallb-system/controller-86ddb6bd46-lgrkv"
Mar 09 13:36:32 crc kubenswrapper[4764]: I0309 13:36:32.333908 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/709e786e-5c7d-45d3-ac38-78351dfbec81-metrics-certs\") pod \"controller-86ddb6bd46-lgrkv\" (UID: \"709e786e-5c7d-45d3-ac38-78351dfbec81\") " pod="metallb-system/controller-86ddb6bd46-lgrkv"
Mar 09 13:36:32 crc kubenswrapper[4764]: I0309 13:36:32.334716 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/709e786e-5c7d-45d3-ac38-78351dfbec81-cert\") pod \"controller-86ddb6bd46-lgrkv\" (UID: \"709e786e-5c7d-45d3-ac38-78351dfbec81\") " pod="metallb-system/controller-86ddb6bd46-lgrkv" Mar 09 13:36:32 crc kubenswrapper[4764]: I0309 13:36:32.352222 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2ft42\" (UniqueName: \"kubernetes.io/projected/709e786e-5c7d-45d3-ac38-78351dfbec81-kube-api-access-2ft42\") pod \"controller-86ddb6bd46-lgrkv\" (UID: \"709e786e-5c7d-45d3-ac38-78351dfbec81\") " pod="metallb-system/controller-86ddb6bd46-lgrkv" Mar 09 13:36:32 crc kubenswrapper[4764]: I0309 13:36:32.385950 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/controller-86ddb6bd46-lgrkv" Mar 09 13:36:32 crc kubenswrapper[4764]: I0309 13:36:32.488714 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-7f989f654f-wqd8z"] Mar 09 13:36:32 crc kubenswrapper[4764]: I0309 13:36:32.604417 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-86ddb6bd46-lgrkv"] Mar 09 13:36:32 crc kubenswrapper[4764]: W0309 13:36:32.609410 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod709e786e_5c7d_45d3_ac38_78351dfbec81.slice/crio-ccc82aaefed7d9537262982fd7a996604ba097e5041bb95579a4d031f9d2e60c WatchSource:0}: Error finding container ccc82aaefed7d9537262982fd7a996604ba097e5041bb95579a4d031f9d2e60c: Status 404 returned error can't find the container with id ccc82aaefed7d9537262982fd7a996604ba097e5041bb95579a4d031f9d2e60c Mar 09 13:36:32 crc kubenswrapper[4764]: I0309 13:36:32.631879 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" 
(UniqueName: \"kubernetes.io/secret/9333a95c-85e4-4e7d-a142-ae2dd06b4146-metrics-certs\") pod \"frr-k8s-kl47c\" (UID: \"9333a95c-85e4-4e7d-a142-ae2dd06b4146\") " pod="metallb-system/frr-k8s-kl47c" Mar 09 13:36:32 crc kubenswrapper[4764]: I0309 13:36:32.635024 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9333a95c-85e4-4e7d-a142-ae2dd06b4146-metrics-certs\") pod \"frr-k8s-kl47c\" (UID: \"9333a95c-85e4-4e7d-a142-ae2dd06b4146\") " pod="metallb-system/frr-k8s-kl47c" Mar 09 13:36:32 crc kubenswrapper[4764]: I0309 13:36:32.733146 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bfd899d4-a0df-47e3-aa36-1cf690235c45-metrics-certs\") pod \"speaker-2z5wp\" (UID: \"bfd899d4-a0df-47e3-aa36-1cf690235c45\") " pod="metallb-system/speaker-2z5wp" Mar 09 13:36:32 crc kubenswrapper[4764]: I0309 13:36:32.733830 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/bfd899d4-a0df-47e3-aa36-1cf690235c45-memberlist\") pod \"speaker-2z5wp\" (UID: \"bfd899d4-a0df-47e3-aa36-1cf690235c45\") " pod="metallb-system/speaker-2z5wp" Mar 09 13:36:32 crc kubenswrapper[4764]: E0309 13:36:32.734033 4764 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Mar 09 13:36:32 crc kubenswrapper[4764]: E0309 13:36:32.734103 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bfd899d4-a0df-47e3-aa36-1cf690235c45-memberlist podName:bfd899d4-a0df-47e3-aa36-1cf690235c45 nodeName:}" failed. No retries permitted until 2026-03-09 13:36:33.734078397 +0000 UTC m=+948.984250315 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/bfd899d4-a0df-47e3-aa36-1cf690235c45-memberlist") pod "speaker-2z5wp" (UID: "bfd899d4-a0df-47e3-aa36-1cf690235c45") : secret "metallb-memberlist" not found Mar 09 13:36:32 crc kubenswrapper[4764]: I0309 13:36:32.738931 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bfd899d4-a0df-47e3-aa36-1cf690235c45-metrics-certs\") pod \"speaker-2z5wp\" (UID: \"bfd899d4-a0df-47e3-aa36-1cf690235c45\") " pod="metallb-system/speaker-2z5wp" Mar 09 13:36:32 crc kubenswrapper[4764]: I0309 13:36:32.790801 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-86ddb6bd46-lgrkv" event={"ID":"709e786e-5c7d-45d3-ac38-78351dfbec81","Type":"ContainerStarted","Data":"c645dbcc54ac7389adcf8472029f28a17bb3ab18d62f651f44a296f7c941a029"} Mar 09 13:36:32 crc kubenswrapper[4764]: I0309 13:36:32.790863 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-86ddb6bd46-lgrkv" event={"ID":"709e786e-5c7d-45d3-ac38-78351dfbec81","Type":"ContainerStarted","Data":"ccc82aaefed7d9537262982fd7a996604ba097e5041bb95579a4d031f9d2e60c"} Mar 09 13:36:32 crc kubenswrapper[4764]: I0309 13:36:32.792624 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-7f989f654f-wqd8z" event={"ID":"72efa175-2568-4c62-a97e-35893887fe82","Type":"ContainerStarted","Data":"8fbfb9c4ea8c03f5d5d51c68d861b19b4b94f6d21190cf8346cf7708bfaff7c5"} Mar 09 13:36:32 crc kubenswrapper[4764]: I0309 13:36:32.835445 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-kl47c" Mar 09 13:36:33 crc kubenswrapper[4764]: I0309 13:36:33.749576 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/bfd899d4-a0df-47e3-aa36-1cf690235c45-memberlist\") pod \"speaker-2z5wp\" (UID: \"bfd899d4-a0df-47e3-aa36-1cf690235c45\") " pod="metallb-system/speaker-2z5wp" Mar 09 13:36:33 crc kubenswrapper[4764]: I0309 13:36:33.760156 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/bfd899d4-a0df-47e3-aa36-1cf690235c45-memberlist\") pod \"speaker-2z5wp\" (UID: \"bfd899d4-a0df-47e3-aa36-1cf690235c45\") " pod="metallb-system/speaker-2z5wp" Mar 09 13:36:33 crc kubenswrapper[4764]: I0309 13:36:33.810378 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-86ddb6bd46-lgrkv" event={"ID":"709e786e-5c7d-45d3-ac38-78351dfbec81","Type":"ContainerStarted","Data":"44e0d7fccc074ab18f2a9f368ba3458681486755c84a6061fa7b346b373675d1"} Mar 09 13:36:33 crc kubenswrapper[4764]: I0309 13:36:33.810552 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-86ddb6bd46-lgrkv" Mar 09 13:36:33 crc kubenswrapper[4764]: I0309 13:36:33.812663 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-kl47c" event={"ID":"9333a95c-85e4-4e7d-a142-ae2dd06b4146","Type":"ContainerStarted","Data":"a9b94eb5591c50ab81ba457deee7f7ff9c242cb15790807a3ea843d1c7e9fe45"} Mar 09 13:36:33 crc kubenswrapper[4764]: I0309 13:36:33.831013 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-86ddb6bd46-lgrkv" podStartSLOduration=1.8309633619999999 podStartE2EDuration="1.830963362s" podCreationTimestamp="2026-03-09 13:36:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-03-09 13:36:33.826660327 +0000 UTC m=+949.076832255" watchObservedRunningTime="2026-03-09 13:36:33.830963362 +0000 UTC m=+949.081135290" Mar 09 13:36:33 crc kubenswrapper[4764]: I0309 13:36:33.853041 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-2z5wp" Mar 09 13:36:33 crc kubenswrapper[4764]: W0309 13:36:33.875755 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbfd899d4_a0df_47e3_aa36_1cf690235c45.slice/crio-9b795bda24de764cb038a0f7e99fc5d97e3d1a330e3d8bf387e47f7d332912eb WatchSource:0}: Error finding container 9b795bda24de764cb038a0f7e99fc5d97e3d1a330e3d8bf387e47f7d332912eb: Status 404 returned error can't find the container with id 9b795bda24de764cb038a0f7e99fc5d97e3d1a330e3d8bf387e47f7d332912eb Mar 09 13:36:34 crc kubenswrapper[4764]: I0309 13:36:34.829781 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-2z5wp" event={"ID":"bfd899d4-a0df-47e3-aa36-1cf690235c45","Type":"ContainerStarted","Data":"52a4d032a3a07513b9b75a90a480c9aa3bd91bfb06613d9f966c934e9181de0c"} Mar 09 13:36:34 crc kubenswrapper[4764]: I0309 13:36:34.830257 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-2z5wp" event={"ID":"bfd899d4-a0df-47e3-aa36-1cf690235c45","Type":"ContainerStarted","Data":"6683eb40b5aec2a052fbe30366a057a1a2a74e3121f4a15d3bd3de9cec82bd1d"} Mar 09 13:36:34 crc kubenswrapper[4764]: I0309 13:36:34.830275 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-2z5wp" event={"ID":"bfd899d4-a0df-47e3-aa36-1cf690235c45","Type":"ContainerStarted","Data":"9b795bda24de764cb038a0f7e99fc5d97e3d1a330e3d8bf387e47f7d332912eb"} Mar 09 13:36:34 crc kubenswrapper[4764]: I0309 13:36:34.830512 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-2z5wp" Mar 09 13:36:34 crc kubenswrapper[4764]: 
I0309 13:36:34.866875 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-2z5wp" podStartSLOduration=2.866849395 podStartE2EDuration="2.866849395s" podCreationTimestamp="2026-03-09 13:36:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:36:34.865717935 +0000 UTC m=+950.115889863" watchObservedRunningTime="2026-03-09 13:36:34.866849395 +0000 UTC m=+950.117021303" Mar 09 13:36:40 crc kubenswrapper[4764]: I0309 13:36:40.871471 4764 generic.go:334] "Generic (PLEG): container finished" podID="9333a95c-85e4-4e7d-a142-ae2dd06b4146" containerID="470655eecd68eac5ce56d927a5337c5a83b695a3960176753ea38b8da26f138a" exitCode=0 Mar 09 13:36:40 crc kubenswrapper[4764]: I0309 13:36:40.871543 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-kl47c" event={"ID":"9333a95c-85e4-4e7d-a142-ae2dd06b4146","Type":"ContainerDied","Data":"470655eecd68eac5ce56d927a5337c5a83b695a3960176753ea38b8da26f138a"} Mar 09 13:36:40 crc kubenswrapper[4764]: I0309 13:36:40.873433 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-7f989f654f-wqd8z" event={"ID":"72efa175-2568-4c62-a97e-35893887fe82","Type":"ContainerStarted","Data":"7a46c71e01e80dee907d49b4b2478287a8a1bc104aef22126e89bb77e6c8bd91"} Mar 09 13:36:40 crc kubenswrapper[4764]: I0309 13:36:40.873610 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-7f989f654f-wqd8z" Mar 09 13:36:40 crc kubenswrapper[4764]: I0309 13:36:40.914271 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-7f989f654f-wqd8z" podStartSLOduration=2.206412149 podStartE2EDuration="9.914248209s" podCreationTimestamp="2026-03-09 13:36:31 +0000 UTC" firstStartedPulling="2026-03-09 13:36:32.503258543 +0000 UTC m=+947.753430451" 
lastFinishedPulling="2026-03-09 13:36:40.211094603 +0000 UTC m=+955.461266511" observedRunningTime="2026-03-09 13:36:40.911160286 +0000 UTC m=+956.161332194" watchObservedRunningTime="2026-03-09 13:36:40.914248209 +0000 UTC m=+956.164420117" Mar 09 13:36:41 crc kubenswrapper[4764]: I0309 13:36:41.881382 4764 generic.go:334] "Generic (PLEG): container finished" podID="9333a95c-85e4-4e7d-a142-ae2dd06b4146" containerID="1f7f45e7224027f135e01339375692e4ab6c75e79fc75dad448b13ac4973a932" exitCode=0 Mar 09 13:36:41 crc kubenswrapper[4764]: I0309 13:36:41.881495 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-kl47c" event={"ID":"9333a95c-85e4-4e7d-a142-ae2dd06b4146","Type":"ContainerDied","Data":"1f7f45e7224027f135e01339375692e4ab6c75e79fc75dad448b13ac4973a932"} Mar 09 13:36:42 crc kubenswrapper[4764]: I0309 13:36:42.390276 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-86ddb6bd46-lgrkv" Mar 09 13:36:42 crc kubenswrapper[4764]: I0309 13:36:42.890705 4764 generic.go:334] "Generic (PLEG): container finished" podID="9333a95c-85e4-4e7d-a142-ae2dd06b4146" containerID="b307d1fe6f83787aea879f9ff45eed41f0d807cb5616664dc7891dea7e3ed6a1" exitCode=0 Mar 09 13:36:42 crc kubenswrapper[4764]: I0309 13:36:42.890950 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-kl47c" event={"ID":"9333a95c-85e4-4e7d-a142-ae2dd06b4146","Type":"ContainerDied","Data":"b307d1fe6f83787aea879f9ff45eed41f0d807cb5616664dc7891dea7e3ed6a1"} Mar 09 13:36:43 crc kubenswrapper[4764]: I0309 13:36:43.902344 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-kl47c" event={"ID":"9333a95c-85e4-4e7d-a142-ae2dd06b4146","Type":"ContainerStarted","Data":"c0cc25911be6ab5a281fa081a719db07070f1612173f89156a7722670d4f38ce"} Mar 09 13:36:43 crc kubenswrapper[4764]: I0309 13:36:43.903879 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="metallb-system/frr-k8s-kl47c" Mar 09 13:36:43 crc kubenswrapper[4764]: I0309 13:36:43.903899 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-kl47c" event={"ID":"9333a95c-85e4-4e7d-a142-ae2dd06b4146","Type":"ContainerStarted","Data":"b875b550b788ce5f1a7ea11015e8c25e08dd7c9085f74715b0514b52c8fee9cf"} Mar 09 13:36:43 crc kubenswrapper[4764]: I0309 13:36:43.903913 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-kl47c" event={"ID":"9333a95c-85e4-4e7d-a142-ae2dd06b4146","Type":"ContainerStarted","Data":"1f113a3f9f07f101fcae5bf2331396d152db4190e0d459527807589797ae1746"} Mar 09 13:36:43 crc kubenswrapper[4764]: I0309 13:36:43.903924 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-kl47c" event={"ID":"9333a95c-85e4-4e7d-a142-ae2dd06b4146","Type":"ContainerStarted","Data":"2c65403ce820a236bd04146940e6cbd59f0b07a8a0e0eb9baa6635134e2b1c11"} Mar 09 13:36:43 crc kubenswrapper[4764]: I0309 13:36:43.903936 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-kl47c" event={"ID":"9333a95c-85e4-4e7d-a142-ae2dd06b4146","Type":"ContainerStarted","Data":"c126faa9bee79fddc335b737e5b6ffe374c670cbc4f2a2fd41d798cbdef516b5"} Mar 09 13:36:43 crc kubenswrapper[4764]: I0309 13:36:43.903946 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-kl47c" event={"ID":"9333a95c-85e4-4e7d-a142-ae2dd06b4146","Type":"ContainerStarted","Data":"42cff07c48e09d3c676271f26a1cdb8a4f5bfa450eeae8871a4ecb0db96dc764"} Mar 09 13:36:43 crc kubenswrapper[4764]: I0309 13:36:43.924576 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-kl47c" podStartSLOduration=5.712443285 podStartE2EDuration="12.924557917s" podCreationTimestamp="2026-03-09 13:36:31 +0000 UTC" firstStartedPulling="2026-03-09 13:36:33.021974486 +0000 UTC m=+948.272146394" lastFinishedPulling="2026-03-09 13:36:40.234089118 +0000 UTC 
m=+955.484261026" observedRunningTime="2026-03-09 13:36:43.922023059 +0000 UTC m=+959.172194967" watchObservedRunningTime="2026-03-09 13:36:43.924557917 +0000 UTC m=+959.174729825" Mar 09 13:36:47 crc kubenswrapper[4764]: I0309 13:36:47.835814 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-kl47c" Mar 09 13:36:47 crc kubenswrapper[4764]: I0309 13:36:47.875277 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-kl47c" Mar 09 13:36:49 crc kubenswrapper[4764]: I0309 13:36:49.620885 4764 scope.go:117] "RemoveContainer" containerID="a7ac41644a3901488ef405782554d6dd08becac720d51b607ff6a4cba78e912f" Mar 09 13:36:52 crc kubenswrapper[4764]: I0309 13:36:52.248205 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-7f989f654f-wqd8z" Mar 09 13:36:53 crc kubenswrapper[4764]: I0309 13:36:53.857352 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-2z5wp" Mar 09 13:36:56 crc kubenswrapper[4764]: I0309 13:36:56.420598 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-lnxvv"] Mar 09 13:36:56 crc kubenswrapper[4764]: I0309 13:36:56.422375 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-lnxvv" Mar 09 13:36:56 crc kubenswrapper[4764]: I0309 13:36:56.440709 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt" Mar 09 13:36:56 crc kubenswrapper[4764]: I0309 13:36:56.440718 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt" Mar 09 13:36:56 crc kubenswrapper[4764]: I0309 13:36:56.440722 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-index-dockercfg-6c56h" Mar 09 13:36:56 crc kubenswrapper[4764]: I0309 13:36:56.444126 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-lnxvv"] Mar 09 13:36:56 crc kubenswrapper[4764]: I0309 13:36:56.511635 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g9v92\" (UniqueName: \"kubernetes.io/projected/15e66009-37fa-4f89-aba2-e39f68c46496-kube-api-access-g9v92\") pod \"openstack-operator-index-lnxvv\" (UID: \"15e66009-37fa-4f89-aba2-e39f68c46496\") " pod="openstack-operators/openstack-operator-index-lnxvv" Mar 09 13:36:56 crc kubenswrapper[4764]: I0309 13:36:56.613173 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g9v92\" (UniqueName: \"kubernetes.io/projected/15e66009-37fa-4f89-aba2-e39f68c46496-kube-api-access-g9v92\") pod \"openstack-operator-index-lnxvv\" (UID: \"15e66009-37fa-4f89-aba2-e39f68c46496\") " pod="openstack-operators/openstack-operator-index-lnxvv" Mar 09 13:36:56 crc kubenswrapper[4764]: I0309 13:36:56.633014 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g9v92\" (UniqueName: \"kubernetes.io/projected/15e66009-37fa-4f89-aba2-e39f68c46496-kube-api-access-g9v92\") pod \"openstack-operator-index-lnxvv\" (UID: 
\"15e66009-37fa-4f89-aba2-e39f68c46496\") " pod="openstack-operators/openstack-operator-index-lnxvv" Mar 09 13:36:56 crc kubenswrapper[4764]: I0309 13:36:56.741732 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-lnxvv" Mar 09 13:36:57 crc kubenswrapper[4764]: I0309 13:36:57.163450 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-lnxvv"] Mar 09 13:36:57 crc kubenswrapper[4764]: W0309 13:36:57.172382 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod15e66009_37fa_4f89_aba2_e39f68c46496.slice/crio-357d479dcf0b97e2bac8502d01fb6d3b44d98edba00f3d6869efb668be985ee8 WatchSource:0}: Error finding container 357d479dcf0b97e2bac8502d01fb6d3b44d98edba00f3d6869efb668be985ee8: Status 404 returned error can't find the container with id 357d479dcf0b97e2bac8502d01fb6d3b44d98edba00f3d6869efb668be985ee8 Mar 09 13:36:58 crc kubenswrapper[4764]: I0309 13:36:58.000175 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-lnxvv" event={"ID":"15e66009-37fa-4f89-aba2-e39f68c46496","Type":"ContainerStarted","Data":"357d479dcf0b97e2bac8502d01fb6d3b44d98edba00f3d6869efb668be985ee8"} Mar 09 13:36:58 crc kubenswrapper[4764]: I0309 13:36:58.371957 4764 patch_prober.go:28] interesting pod/machine-config-daemon-xxczl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 09 13:36:58 crc kubenswrapper[4764]: I0309 13:36:58.372091 4764 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922" containerName="machine-config-daemon" probeResult="failure" output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 09 13:36:59 crc kubenswrapper[4764]: I0309 13:36:59.802749 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-lnxvv"] Mar 09 13:37:00 crc kubenswrapper[4764]: I0309 13:37:00.427717 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-lvrg9"] Mar 09 13:37:00 crc kubenswrapper[4764]: I0309 13:37:00.428499 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-lvrg9" Mar 09 13:37:00 crc kubenswrapper[4764]: I0309 13:37:00.446227 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-lvrg9"] Mar 09 13:37:00 crc kubenswrapper[4764]: I0309 13:37:00.571747 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gxv8l\" (UniqueName: \"kubernetes.io/projected/8e6c087a-8aaa-427c-822b-a274e19cc440-kube-api-access-gxv8l\") pod \"openstack-operator-index-lvrg9\" (UID: \"8e6c087a-8aaa-427c-822b-a274e19cc440\") " pod="openstack-operators/openstack-operator-index-lvrg9" Mar 09 13:37:00 crc kubenswrapper[4764]: I0309 13:37:00.672772 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gxv8l\" (UniqueName: \"kubernetes.io/projected/8e6c087a-8aaa-427c-822b-a274e19cc440-kube-api-access-gxv8l\") pod \"openstack-operator-index-lvrg9\" (UID: \"8e6c087a-8aaa-427c-822b-a274e19cc440\") " pod="openstack-operators/openstack-operator-index-lvrg9" Mar 09 13:37:00 crc kubenswrapper[4764]: I0309 13:37:00.693737 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gxv8l\" (UniqueName: \"kubernetes.io/projected/8e6c087a-8aaa-427c-822b-a274e19cc440-kube-api-access-gxv8l\") pod \"openstack-operator-index-lvrg9\" (UID: 
\"8e6c087a-8aaa-427c-822b-a274e19cc440\") " pod="openstack-operators/openstack-operator-index-lvrg9" Mar 09 13:37:00 crc kubenswrapper[4764]: I0309 13:37:00.813194 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-lvrg9" Mar 09 13:37:01 crc kubenswrapper[4764]: I0309 13:37:01.026751 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-lnxvv" event={"ID":"15e66009-37fa-4f89-aba2-e39f68c46496","Type":"ContainerStarted","Data":"1dc0bce8d8df10f300fba641a99cc5d62b54813faee2b39fbfc95ac2058ac6a6"} Mar 09 13:37:01 crc kubenswrapper[4764]: I0309 13:37:01.026918 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/openstack-operator-index-lnxvv" podUID="15e66009-37fa-4f89-aba2-e39f68c46496" containerName="registry-server" containerID="cri-o://1dc0bce8d8df10f300fba641a99cc5d62b54813faee2b39fbfc95ac2058ac6a6" gracePeriod=2 Mar 09 13:37:01 crc kubenswrapper[4764]: I0309 13:37:01.053048 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-lnxvv" podStartSLOduration=1.572860859 podStartE2EDuration="5.053022653s" podCreationTimestamp="2026-03-09 13:36:56 +0000 UTC" firstStartedPulling="2026-03-09 13:36:57.174733332 +0000 UTC m=+972.424905230" lastFinishedPulling="2026-03-09 13:37:00.654895116 +0000 UTC m=+975.905067024" observedRunningTime="2026-03-09 13:37:01.045719448 +0000 UTC m=+976.295891366" watchObservedRunningTime="2026-03-09 13:37:01.053022653 +0000 UTC m=+976.303194561" Mar 09 13:37:01 crc kubenswrapper[4764]: I0309 13:37:01.252609 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-lvrg9"] Mar 09 13:37:01 crc kubenswrapper[4764]: I0309 13:37:01.432473 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-lnxvv" Mar 09 13:37:01 crc kubenswrapper[4764]: I0309 13:37:01.589020 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g9v92\" (UniqueName: \"kubernetes.io/projected/15e66009-37fa-4f89-aba2-e39f68c46496-kube-api-access-g9v92\") pod \"15e66009-37fa-4f89-aba2-e39f68c46496\" (UID: \"15e66009-37fa-4f89-aba2-e39f68c46496\") " Mar 09 13:37:01 crc kubenswrapper[4764]: I0309 13:37:01.595928 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/15e66009-37fa-4f89-aba2-e39f68c46496-kube-api-access-g9v92" (OuterVolumeSpecName: "kube-api-access-g9v92") pod "15e66009-37fa-4f89-aba2-e39f68c46496" (UID: "15e66009-37fa-4f89-aba2-e39f68c46496"). InnerVolumeSpecName "kube-api-access-g9v92". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:37:01 crc kubenswrapper[4764]: I0309 13:37:01.691192 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g9v92\" (UniqueName: \"kubernetes.io/projected/15e66009-37fa-4f89-aba2-e39f68c46496-kube-api-access-g9v92\") on node \"crc\" DevicePath \"\"" Mar 09 13:37:02 crc kubenswrapper[4764]: I0309 13:37:02.037797 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-lvrg9" event={"ID":"8e6c087a-8aaa-427c-822b-a274e19cc440","Type":"ContainerStarted","Data":"8761408c1f43723e51de7920c8e1301b9c1ef1b34821dca6d09ceaef4a9b756b"} Mar 09 13:37:02 crc kubenswrapper[4764]: I0309 13:37:02.038359 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-lvrg9" event={"ID":"8e6c087a-8aaa-427c-822b-a274e19cc440","Type":"ContainerStarted","Data":"38a5ed38f395b2547ef8de67cb616f1ee84cca2cb343c5104ceec7270c8f4d8a"} Mar 09 13:37:02 crc kubenswrapper[4764]: I0309 13:37:02.040509 4764 generic.go:334] "Generic (PLEG): container finished" 
podID="15e66009-37fa-4f89-aba2-e39f68c46496" containerID="1dc0bce8d8df10f300fba641a99cc5d62b54813faee2b39fbfc95ac2058ac6a6" exitCode=0 Mar 09 13:37:02 crc kubenswrapper[4764]: I0309 13:37:02.040548 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-lnxvv" event={"ID":"15e66009-37fa-4f89-aba2-e39f68c46496","Type":"ContainerDied","Data":"1dc0bce8d8df10f300fba641a99cc5d62b54813faee2b39fbfc95ac2058ac6a6"} Mar 09 13:37:02 crc kubenswrapper[4764]: I0309 13:37:02.040574 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-lnxvv" event={"ID":"15e66009-37fa-4f89-aba2-e39f68c46496","Type":"ContainerDied","Data":"357d479dcf0b97e2bac8502d01fb6d3b44d98edba00f3d6869efb668be985ee8"} Mar 09 13:37:02 crc kubenswrapper[4764]: I0309 13:37:02.040599 4764 scope.go:117] "RemoveContainer" containerID="1dc0bce8d8df10f300fba641a99cc5d62b54813faee2b39fbfc95ac2058ac6a6" Mar 09 13:37:02 crc kubenswrapper[4764]: I0309 13:37:02.040716 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-lnxvv"
Mar 09 13:37:02 crc kubenswrapper[4764]: I0309 13:37:02.063744 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-lvrg9" podStartSLOduration=2.01418457 podStartE2EDuration="2.063712114s" podCreationTimestamp="2026-03-09 13:37:00 +0000 UTC" firstStartedPulling="2026-03-09 13:37:01.266353739 +0000 UTC m=+976.516525647" lastFinishedPulling="2026-03-09 13:37:01.315881283 +0000 UTC m=+976.566053191" observedRunningTime="2026-03-09 13:37:02.056809789 +0000 UTC m=+977.306981697" watchObservedRunningTime="2026-03-09 13:37:02.063712114 +0000 UTC m=+977.313884032"
Mar 09 13:37:02 crc kubenswrapper[4764]: I0309 13:37:02.073961 4764 scope.go:117] "RemoveContainer" containerID="1dc0bce8d8df10f300fba641a99cc5d62b54813faee2b39fbfc95ac2058ac6a6"
Mar 09 13:37:02 crc kubenswrapper[4764]: E0309 13:37:02.074462 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1dc0bce8d8df10f300fba641a99cc5d62b54813faee2b39fbfc95ac2058ac6a6\": container with ID starting with 1dc0bce8d8df10f300fba641a99cc5d62b54813faee2b39fbfc95ac2058ac6a6 not found: ID does not exist" containerID="1dc0bce8d8df10f300fba641a99cc5d62b54813faee2b39fbfc95ac2058ac6a6"
Mar 09 13:37:02 crc kubenswrapper[4764]: I0309 13:37:02.074501 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1dc0bce8d8df10f300fba641a99cc5d62b54813faee2b39fbfc95ac2058ac6a6"} err="failed to get container status \"1dc0bce8d8df10f300fba641a99cc5d62b54813faee2b39fbfc95ac2058ac6a6\": rpc error: code = NotFound desc = could not find container \"1dc0bce8d8df10f300fba641a99cc5d62b54813faee2b39fbfc95ac2058ac6a6\": container with ID starting with 1dc0bce8d8df10f300fba641a99cc5d62b54813faee2b39fbfc95ac2058ac6a6 not found: ID does not exist"
Mar 09 13:37:02 crc kubenswrapper[4764]: I0309 13:37:02.085792 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-lnxvv"]
Mar 09 13:37:02 crc kubenswrapper[4764]: I0309 13:37:02.091540 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/openstack-operator-index-lnxvv"]
Mar 09 13:37:02 crc kubenswrapper[4764]: I0309 13:37:02.841698 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-kl47c"
Mar 09 13:37:03 crc kubenswrapper[4764]: I0309 13:37:03.568192 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="15e66009-37fa-4f89-aba2-e39f68c46496" path="/var/lib/kubelet/pods/15e66009-37fa-4f89-aba2-e39f68c46496/volumes"
Mar 09 13:37:10 crc kubenswrapper[4764]: I0309 13:37:10.212883 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-wbrdf"]
Mar 09 13:37:10 crc kubenswrapper[4764]: E0309 13:37:10.213717 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="15e66009-37fa-4f89-aba2-e39f68c46496" containerName="registry-server"
Mar 09 13:37:10 crc kubenswrapper[4764]: I0309 13:37:10.213728 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="15e66009-37fa-4f89-aba2-e39f68c46496" containerName="registry-server"
Mar 09 13:37:10 crc kubenswrapper[4764]: I0309 13:37:10.213864 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="15e66009-37fa-4f89-aba2-e39f68c46496" containerName="registry-server"
Mar 09 13:37:10 crc kubenswrapper[4764]: I0309 13:37:10.214635 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-wbrdf"
Mar 09 13:37:10 crc kubenswrapper[4764]: I0309 13:37:10.224280 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-wbrdf"]
Mar 09 13:37:10 crc kubenswrapper[4764]: I0309 13:37:10.326276 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5554t\" (UniqueName: \"kubernetes.io/projected/a8dfccd4-6f59-4e38-8beb-d586722f6429-kube-api-access-5554t\") pod \"community-operators-wbrdf\" (UID: \"a8dfccd4-6f59-4e38-8beb-d586722f6429\") " pod="openshift-marketplace/community-operators-wbrdf"
Mar 09 13:37:10 crc kubenswrapper[4764]: I0309 13:37:10.326326 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a8dfccd4-6f59-4e38-8beb-d586722f6429-utilities\") pod \"community-operators-wbrdf\" (UID: \"a8dfccd4-6f59-4e38-8beb-d586722f6429\") " pod="openshift-marketplace/community-operators-wbrdf"
Mar 09 13:37:10 crc kubenswrapper[4764]: I0309 13:37:10.326767 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a8dfccd4-6f59-4e38-8beb-d586722f6429-catalog-content\") pod \"community-operators-wbrdf\" (UID: \"a8dfccd4-6f59-4e38-8beb-d586722f6429\") " pod="openshift-marketplace/community-operators-wbrdf"
Mar 09 13:37:10 crc kubenswrapper[4764]: I0309 13:37:10.427919 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a8dfccd4-6f59-4e38-8beb-d586722f6429-catalog-content\") pod \"community-operators-wbrdf\" (UID: \"a8dfccd4-6f59-4e38-8beb-d586722f6429\") " pod="openshift-marketplace/community-operators-wbrdf"
Mar 09 13:37:10 crc kubenswrapper[4764]: I0309 13:37:10.428398 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5554t\" (UniqueName: \"kubernetes.io/projected/a8dfccd4-6f59-4e38-8beb-d586722f6429-kube-api-access-5554t\") pod \"community-operators-wbrdf\" (UID: \"a8dfccd4-6f59-4e38-8beb-d586722f6429\") " pod="openshift-marketplace/community-operators-wbrdf"
Mar 09 13:37:10 crc kubenswrapper[4764]: I0309 13:37:10.428440 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a8dfccd4-6f59-4e38-8beb-d586722f6429-utilities\") pod \"community-operators-wbrdf\" (UID: \"a8dfccd4-6f59-4e38-8beb-d586722f6429\") " pod="openshift-marketplace/community-operators-wbrdf"
Mar 09 13:37:10 crc kubenswrapper[4764]: I0309 13:37:10.428978 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a8dfccd4-6f59-4e38-8beb-d586722f6429-catalog-content\") pod \"community-operators-wbrdf\" (UID: \"a8dfccd4-6f59-4e38-8beb-d586722f6429\") " pod="openshift-marketplace/community-operators-wbrdf"
Mar 09 13:37:10 crc kubenswrapper[4764]: I0309 13:37:10.429094 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a8dfccd4-6f59-4e38-8beb-d586722f6429-utilities\") pod \"community-operators-wbrdf\" (UID: \"a8dfccd4-6f59-4e38-8beb-d586722f6429\") " pod="openshift-marketplace/community-operators-wbrdf"
Mar 09 13:37:10 crc kubenswrapper[4764]: I0309 13:37:10.448431 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5554t\" (UniqueName: \"kubernetes.io/projected/a8dfccd4-6f59-4e38-8beb-d586722f6429-kube-api-access-5554t\") pod \"community-operators-wbrdf\" (UID: \"a8dfccd4-6f59-4e38-8beb-d586722f6429\") " pod="openshift-marketplace/community-operators-wbrdf"
Mar 09 13:37:10 crc kubenswrapper[4764]: I0309 13:37:10.583760 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-wbrdf"
Mar 09 13:37:10 crc kubenswrapper[4764]: I0309 13:37:10.813784 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-lvrg9"
Mar 09 13:37:10 crc kubenswrapper[4764]: I0309 13:37:10.814811 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/openstack-operator-index-lvrg9"
Mar 09 13:37:10 crc kubenswrapper[4764]: I0309 13:37:10.817346 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-wbrdf"]
Mar 09 13:37:10 crc kubenswrapper[4764]: I0309 13:37:10.872004 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/openstack-operator-index-lvrg9"
Mar 09 13:37:11 crc kubenswrapper[4764]: I0309 13:37:11.105252 4764 generic.go:334] "Generic (PLEG): container finished" podID="a8dfccd4-6f59-4e38-8beb-d586722f6429" containerID="65f7c38a5f862088c540cb251a678acbf56eb7ce24508fe81ee1a45ff576510e" exitCode=0
Mar 09 13:37:11 crc kubenswrapper[4764]: I0309 13:37:11.105352 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wbrdf" event={"ID":"a8dfccd4-6f59-4e38-8beb-d586722f6429","Type":"ContainerDied","Data":"65f7c38a5f862088c540cb251a678acbf56eb7ce24508fe81ee1a45ff576510e"}
Mar 09 13:37:11 crc kubenswrapper[4764]: I0309 13:37:11.105398 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wbrdf" event={"ID":"a8dfccd4-6f59-4e38-8beb-d586722f6429","Type":"ContainerStarted","Data":"3e4da4181002b0bbc30f0995a9b31086d3309299728b9e55d210f7d242dd2b4b"}
Mar 09 13:37:11 crc kubenswrapper[4764]: I0309 13:37:11.131854 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-index-lvrg9"
Mar 09 13:37:13 crc kubenswrapper[4764]: I0309 13:37:13.120451 4764 generic.go:334] "Generic (PLEG): container finished" podID="a8dfccd4-6f59-4e38-8beb-d586722f6429" containerID="83d42e877f0fbd05f43d48b955fbdaf6a30563c45f97fa84b159a579fe3f00b5" exitCode=0
Mar 09 13:37:13 crc kubenswrapper[4764]: I0309 13:37:13.120500 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wbrdf" event={"ID":"a8dfccd4-6f59-4e38-8beb-d586722f6429","Type":"ContainerDied","Data":"83d42e877f0fbd05f43d48b955fbdaf6a30563c45f97fa84b159a579fe3f00b5"}
Mar 09 13:37:14 crc kubenswrapper[4764]: I0309 13:37:14.131347 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wbrdf" event={"ID":"a8dfccd4-6f59-4e38-8beb-d586722f6429","Type":"ContainerStarted","Data":"9eca7f1ea0758281e2a6e18462903a1ceedc6b59b0cf6098bdfe193448c7d9e5"}
Mar 09 13:37:14 crc kubenswrapper[4764]: I0309 13:37:14.150752 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-wbrdf" podStartSLOduration=1.665454113 podStartE2EDuration="4.150728351s" podCreationTimestamp="2026-03-09 13:37:10 +0000 UTC" firstStartedPulling="2026-03-09 13:37:11.107279206 +0000 UTC m=+986.357451114" lastFinishedPulling="2026-03-09 13:37:13.592553444 +0000 UTC m=+988.842725352" observedRunningTime="2026-03-09 13:37:14.149013325 +0000 UTC m=+989.399185233" watchObservedRunningTime="2026-03-09 13:37:14.150728351 +0000 UTC m=+989.400900279"
Mar 09 13:37:18 crc kubenswrapper[4764]: I0309 13:37:18.651433 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/bb5e27b1881e557677c06c4f4def14a4e9a93f970d90508463799e20a9xpql5"]
Mar 09 13:37:18 crc kubenswrapper[4764]: I0309 13:37:18.653670 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/bb5e27b1881e557677c06c4f4def14a4e9a93f970d90508463799e20a9xpql5"
Mar 09 13:37:18 crc kubenswrapper[4764]: I0309 13:37:18.656614 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-78ggt"
Mar 09 13:37:18 crc kubenswrapper[4764]: I0309 13:37:18.662667 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/bb5e27b1881e557677c06c4f4def14a4e9a93f970d90508463799e20a9xpql5"]
Mar 09 13:37:18 crc kubenswrapper[4764]: I0309 13:37:18.769107 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3f89e888-fc0d-48c0-ad4c-978e058ffebd-bundle\") pod \"bb5e27b1881e557677c06c4f4def14a4e9a93f970d90508463799e20a9xpql5\" (UID: \"3f89e888-fc0d-48c0-ad4c-978e058ffebd\") " pod="openstack-operators/bb5e27b1881e557677c06c4f4def14a4e9a93f970d90508463799e20a9xpql5"
Mar 09 13:37:18 crc kubenswrapper[4764]: I0309 13:37:18.769370 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mlzbq\" (UniqueName: \"kubernetes.io/projected/3f89e888-fc0d-48c0-ad4c-978e058ffebd-kube-api-access-mlzbq\") pod \"bb5e27b1881e557677c06c4f4def14a4e9a93f970d90508463799e20a9xpql5\" (UID: \"3f89e888-fc0d-48c0-ad4c-978e058ffebd\") " pod="openstack-operators/bb5e27b1881e557677c06c4f4def14a4e9a93f970d90508463799e20a9xpql5"
Mar 09 13:37:18 crc kubenswrapper[4764]: I0309 13:37:18.769531 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3f89e888-fc0d-48c0-ad4c-978e058ffebd-util\") pod \"bb5e27b1881e557677c06c4f4def14a4e9a93f970d90508463799e20a9xpql5\" (UID: \"3f89e888-fc0d-48c0-ad4c-978e058ffebd\") " pod="openstack-operators/bb5e27b1881e557677c06c4f4def14a4e9a93f970d90508463799e20a9xpql5"
Mar 09 13:37:18 crc kubenswrapper[4764]: I0309 13:37:18.871256 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3f89e888-fc0d-48c0-ad4c-978e058ffebd-bundle\") pod \"bb5e27b1881e557677c06c4f4def14a4e9a93f970d90508463799e20a9xpql5\" (UID: \"3f89e888-fc0d-48c0-ad4c-978e058ffebd\") " pod="openstack-operators/bb5e27b1881e557677c06c4f4def14a4e9a93f970d90508463799e20a9xpql5"
Mar 09 13:37:18 crc kubenswrapper[4764]: I0309 13:37:18.871556 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mlzbq\" (UniqueName: \"kubernetes.io/projected/3f89e888-fc0d-48c0-ad4c-978e058ffebd-kube-api-access-mlzbq\") pod \"bb5e27b1881e557677c06c4f4def14a4e9a93f970d90508463799e20a9xpql5\" (UID: \"3f89e888-fc0d-48c0-ad4c-978e058ffebd\") " pod="openstack-operators/bb5e27b1881e557677c06c4f4def14a4e9a93f970d90508463799e20a9xpql5"
Mar 09 13:37:18 crc kubenswrapper[4764]: I0309 13:37:18.871610 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3f89e888-fc0d-48c0-ad4c-978e058ffebd-util\") pod \"bb5e27b1881e557677c06c4f4def14a4e9a93f970d90508463799e20a9xpql5\" (UID: \"3f89e888-fc0d-48c0-ad4c-978e058ffebd\") " pod="openstack-operators/bb5e27b1881e557677c06c4f4def14a4e9a93f970d90508463799e20a9xpql5"
Mar 09 13:37:18 crc kubenswrapper[4764]: I0309 13:37:18.872325 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3f89e888-fc0d-48c0-ad4c-978e058ffebd-util\") pod \"bb5e27b1881e557677c06c4f4def14a4e9a93f970d90508463799e20a9xpql5\" (UID: \"3f89e888-fc0d-48c0-ad4c-978e058ffebd\") " pod="openstack-operators/bb5e27b1881e557677c06c4f4def14a4e9a93f970d90508463799e20a9xpql5"
Mar 09 13:37:18 crc kubenswrapper[4764]: I0309 13:37:18.872314 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3f89e888-fc0d-48c0-ad4c-978e058ffebd-bundle\") pod \"bb5e27b1881e557677c06c4f4def14a4e9a93f970d90508463799e20a9xpql5\" (UID: \"3f89e888-fc0d-48c0-ad4c-978e058ffebd\") " pod="openstack-operators/bb5e27b1881e557677c06c4f4def14a4e9a93f970d90508463799e20a9xpql5"
Mar 09 13:37:18 crc kubenswrapper[4764]: I0309 13:37:18.904742 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mlzbq\" (UniqueName: \"kubernetes.io/projected/3f89e888-fc0d-48c0-ad4c-978e058ffebd-kube-api-access-mlzbq\") pod \"bb5e27b1881e557677c06c4f4def14a4e9a93f970d90508463799e20a9xpql5\" (UID: \"3f89e888-fc0d-48c0-ad4c-978e058ffebd\") " pod="openstack-operators/bb5e27b1881e557677c06c4f4def14a4e9a93f970d90508463799e20a9xpql5"
Mar 09 13:37:18 crc kubenswrapper[4764]: I0309 13:37:18.975828 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/bb5e27b1881e557677c06c4f4def14a4e9a93f970d90508463799e20a9xpql5"
Mar 09 13:37:19 crc kubenswrapper[4764]: I0309 13:37:19.427367 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/bb5e27b1881e557677c06c4f4def14a4e9a93f970d90508463799e20a9xpql5"]
Mar 09 13:37:20 crc kubenswrapper[4764]: I0309 13:37:20.181444 4764 generic.go:334] "Generic (PLEG): container finished" podID="3f89e888-fc0d-48c0-ad4c-978e058ffebd" containerID="73b3fa69a88f215c55d983ce1eed7ce8947722da1ea80a695e4eb68985582271" exitCode=0
Mar 09 13:37:20 crc kubenswrapper[4764]: I0309 13:37:20.181524 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/bb5e27b1881e557677c06c4f4def14a4e9a93f970d90508463799e20a9xpql5" event={"ID":"3f89e888-fc0d-48c0-ad4c-978e058ffebd","Type":"ContainerDied","Data":"73b3fa69a88f215c55d983ce1eed7ce8947722da1ea80a695e4eb68985582271"}
Mar 09 13:37:20 crc kubenswrapper[4764]: I0309 13:37:20.181568 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/bb5e27b1881e557677c06c4f4def14a4e9a93f970d90508463799e20a9xpql5" event={"ID":"3f89e888-fc0d-48c0-ad4c-978e058ffebd","Type":"ContainerStarted","Data":"3143d48deb25e1228b0b9bf2779c6a99ff527ab1f6ee5b799af9189c2a7983c8"}
Mar 09 13:37:20 crc kubenswrapper[4764]: I0309 13:37:20.583915 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-wbrdf"
Mar 09 13:37:20 crc kubenswrapper[4764]: I0309 13:37:20.583980 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-wbrdf"
Mar 09 13:37:20 crc kubenswrapper[4764]: I0309 13:37:20.636352 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-wbrdf"
Mar 09 13:37:21 crc kubenswrapper[4764]: I0309 13:37:21.193774 4764 generic.go:334] "Generic (PLEG): container finished" podID="3f89e888-fc0d-48c0-ad4c-978e058ffebd" containerID="c761b16c6f8255fa5f43bd5bfb98564a0b3ca85e3beeeee7d200204b4d9f2fef" exitCode=0
Mar 09 13:37:21 crc kubenswrapper[4764]: I0309 13:37:21.193954 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/bb5e27b1881e557677c06c4f4def14a4e9a93f970d90508463799e20a9xpql5" event={"ID":"3f89e888-fc0d-48c0-ad4c-978e058ffebd","Type":"ContainerDied","Data":"c761b16c6f8255fa5f43bd5bfb98564a0b3ca85e3beeeee7d200204b4d9f2fef"}
Mar 09 13:37:21 crc kubenswrapper[4764]: I0309 13:37:21.269200 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-wbrdf"
Mar 09 13:37:22 crc kubenswrapper[4764]: I0309 13:37:22.206321 4764 generic.go:334] "Generic (PLEG): container finished" podID="3f89e888-fc0d-48c0-ad4c-978e058ffebd" containerID="5996dc65947ecc7b229c5bfcde0b46498cf6fc32f67fc10a16c94d68375e7654" exitCode=0
Mar 09 13:37:22 crc kubenswrapper[4764]: I0309 13:37:22.207590 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/bb5e27b1881e557677c06c4f4def14a4e9a93f970d90508463799e20a9xpql5" event={"ID":"3f89e888-fc0d-48c0-ad4c-978e058ffebd","Type":"ContainerDied","Data":"5996dc65947ecc7b229c5bfcde0b46498cf6fc32f67fc10a16c94d68375e7654"}
Mar 09 13:37:23 crc kubenswrapper[4764]: I0309 13:37:23.531405 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/bb5e27b1881e557677c06c4f4def14a4e9a93f970d90508463799e20a9xpql5"
Mar 09 13:37:23 crc kubenswrapper[4764]: I0309 13:37:23.654496 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mlzbq\" (UniqueName: \"kubernetes.io/projected/3f89e888-fc0d-48c0-ad4c-978e058ffebd-kube-api-access-mlzbq\") pod \"3f89e888-fc0d-48c0-ad4c-978e058ffebd\" (UID: \"3f89e888-fc0d-48c0-ad4c-978e058ffebd\") "
Mar 09 13:37:23 crc kubenswrapper[4764]: I0309 13:37:23.654699 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3f89e888-fc0d-48c0-ad4c-978e058ffebd-util\") pod \"3f89e888-fc0d-48c0-ad4c-978e058ffebd\" (UID: \"3f89e888-fc0d-48c0-ad4c-978e058ffebd\") "
Mar 09 13:37:23 crc kubenswrapper[4764]: I0309 13:37:23.654769 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3f89e888-fc0d-48c0-ad4c-978e058ffebd-bundle\") pod \"3f89e888-fc0d-48c0-ad4c-978e058ffebd\" (UID: \"3f89e888-fc0d-48c0-ad4c-978e058ffebd\") "
Mar 09 13:37:23 crc kubenswrapper[4764]: I0309 13:37:23.655770 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3f89e888-fc0d-48c0-ad4c-978e058ffebd-bundle" (OuterVolumeSpecName: "bundle") pod "3f89e888-fc0d-48c0-ad4c-978e058ffebd" (UID: "3f89e888-fc0d-48c0-ad4c-978e058ffebd"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 09 13:37:23 crc kubenswrapper[4764]: I0309 13:37:23.661870 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3f89e888-fc0d-48c0-ad4c-978e058ffebd-kube-api-access-mlzbq" (OuterVolumeSpecName: "kube-api-access-mlzbq") pod "3f89e888-fc0d-48c0-ad4c-978e058ffebd" (UID: "3f89e888-fc0d-48c0-ad4c-978e058ffebd"). InnerVolumeSpecName "kube-api-access-mlzbq". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 13:37:23 crc kubenswrapper[4764]: I0309 13:37:23.669906 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3f89e888-fc0d-48c0-ad4c-978e058ffebd-util" (OuterVolumeSpecName: "util") pod "3f89e888-fc0d-48c0-ad4c-978e058ffebd" (UID: "3f89e888-fc0d-48c0-ad4c-978e058ffebd"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 09 13:37:23 crc kubenswrapper[4764]: I0309 13:37:23.757313 4764 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3f89e888-fc0d-48c0-ad4c-978e058ffebd-util\") on node \"crc\" DevicePath \"\""
Mar 09 13:37:23 crc kubenswrapper[4764]: I0309 13:37:23.757361 4764 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3f89e888-fc0d-48c0-ad4c-978e058ffebd-bundle\") on node \"crc\" DevicePath \"\""
Mar 09 13:37:23 crc kubenswrapper[4764]: I0309 13:37:23.757374 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mlzbq\" (UniqueName: \"kubernetes.io/projected/3f89e888-fc0d-48c0-ad4c-978e058ffebd-kube-api-access-mlzbq\") on node \"crc\" DevicePath \"\""
Mar 09 13:37:23 crc kubenswrapper[4764]: I0309 13:37:23.799163 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-wbrdf"]
Mar 09 13:37:23 crc kubenswrapper[4764]: I0309 13:37:23.799546 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-wbrdf" podUID="a8dfccd4-6f59-4e38-8beb-d586722f6429" containerName="registry-server" containerID="cri-o://9eca7f1ea0758281e2a6e18462903a1ceedc6b59b0cf6098bdfe193448c7d9e5" gracePeriod=2
Mar 09 13:37:24 crc kubenswrapper[4764]: I0309 13:37:24.223315 4764 generic.go:334] "Generic (PLEG): container finished" podID="a8dfccd4-6f59-4e38-8beb-d586722f6429" containerID="9eca7f1ea0758281e2a6e18462903a1ceedc6b59b0cf6098bdfe193448c7d9e5" exitCode=0
Mar 09 13:37:24 crc kubenswrapper[4764]: I0309 13:37:24.223421 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wbrdf" event={"ID":"a8dfccd4-6f59-4e38-8beb-d586722f6429","Type":"ContainerDied","Data":"9eca7f1ea0758281e2a6e18462903a1ceedc6b59b0cf6098bdfe193448c7d9e5"}
Mar 09 13:37:24 crc kubenswrapper[4764]: I0309 13:37:24.226799 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/bb5e27b1881e557677c06c4f4def14a4e9a93f970d90508463799e20a9xpql5" event={"ID":"3f89e888-fc0d-48c0-ad4c-978e058ffebd","Type":"ContainerDied","Data":"3143d48deb25e1228b0b9bf2779c6a99ff527ab1f6ee5b799af9189c2a7983c8"}
Mar 09 13:37:24 crc kubenswrapper[4764]: I0309 13:37:24.226846 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3143d48deb25e1228b0b9bf2779c6a99ff527ab1f6ee5b799af9189c2a7983c8"
Mar 09 13:37:24 crc kubenswrapper[4764]: I0309 13:37:24.226910 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/bb5e27b1881e557677c06c4f4def14a4e9a93f970d90508463799e20a9xpql5"
Mar 09 13:37:24 crc kubenswrapper[4764]: I0309 13:37:24.261510 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-wbrdf"
Mar 09 13:37:24 crc kubenswrapper[4764]: I0309 13:37:24.364746 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a8dfccd4-6f59-4e38-8beb-d586722f6429-catalog-content\") pod \"a8dfccd4-6f59-4e38-8beb-d586722f6429\" (UID: \"a8dfccd4-6f59-4e38-8beb-d586722f6429\") "
Mar 09 13:37:24 crc kubenswrapper[4764]: I0309 13:37:24.364842 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a8dfccd4-6f59-4e38-8beb-d586722f6429-utilities\") pod \"a8dfccd4-6f59-4e38-8beb-d586722f6429\" (UID: \"a8dfccd4-6f59-4e38-8beb-d586722f6429\") "
Mar 09 13:37:24 crc kubenswrapper[4764]: I0309 13:37:24.364980 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5554t\" (UniqueName: \"kubernetes.io/projected/a8dfccd4-6f59-4e38-8beb-d586722f6429-kube-api-access-5554t\") pod \"a8dfccd4-6f59-4e38-8beb-d586722f6429\" (UID: \"a8dfccd4-6f59-4e38-8beb-d586722f6429\") "
Mar 09 13:37:24 crc kubenswrapper[4764]: I0309 13:37:24.366009 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a8dfccd4-6f59-4e38-8beb-d586722f6429-utilities" (OuterVolumeSpecName: "utilities") pod "a8dfccd4-6f59-4e38-8beb-d586722f6429" (UID: "a8dfccd4-6f59-4e38-8beb-d586722f6429"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 09 13:37:24 crc kubenswrapper[4764]: I0309 13:37:24.370847 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a8dfccd4-6f59-4e38-8beb-d586722f6429-kube-api-access-5554t" (OuterVolumeSpecName: "kube-api-access-5554t") pod "a8dfccd4-6f59-4e38-8beb-d586722f6429" (UID: "a8dfccd4-6f59-4e38-8beb-d586722f6429"). InnerVolumeSpecName "kube-api-access-5554t". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 13:37:24 crc kubenswrapper[4764]: I0309 13:37:24.423458 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a8dfccd4-6f59-4e38-8beb-d586722f6429-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a8dfccd4-6f59-4e38-8beb-d586722f6429" (UID: "a8dfccd4-6f59-4e38-8beb-d586722f6429"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 09 13:37:24 crc kubenswrapper[4764]: I0309 13:37:24.466673 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5554t\" (UniqueName: \"kubernetes.io/projected/a8dfccd4-6f59-4e38-8beb-d586722f6429-kube-api-access-5554t\") on node \"crc\" DevicePath \"\""
Mar 09 13:37:24 crc kubenswrapper[4764]: I0309 13:37:24.466726 4764 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a8dfccd4-6f59-4e38-8beb-d586722f6429-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 09 13:37:24 crc kubenswrapper[4764]: I0309 13:37:24.466739 4764 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a8dfccd4-6f59-4e38-8beb-d586722f6429-utilities\") on node \"crc\" DevicePath \"\""
Mar 09 13:37:24 crc kubenswrapper[4764]: I0309 13:37:24.807771 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-m4qqw"]
Mar 09 13:37:24 crc kubenswrapper[4764]: E0309 13:37:24.808152 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3f89e888-fc0d-48c0-ad4c-978e058ffebd" containerName="pull"
Mar 09 13:37:24 crc kubenswrapper[4764]: I0309 13:37:24.808170 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f89e888-fc0d-48c0-ad4c-978e058ffebd" containerName="pull"
Mar 09 13:37:24 crc kubenswrapper[4764]: E0309 13:37:24.808182 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3f89e888-fc0d-48c0-ad4c-978e058ffebd" containerName="util"
Mar 09 13:37:24 crc kubenswrapper[4764]: I0309 13:37:24.808190 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f89e888-fc0d-48c0-ad4c-978e058ffebd" containerName="util"
Mar 09 13:37:24 crc kubenswrapper[4764]: E0309 13:37:24.808199 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3f89e888-fc0d-48c0-ad4c-978e058ffebd" containerName="extract"
Mar 09 13:37:24 crc kubenswrapper[4764]: I0309 13:37:24.808206 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f89e888-fc0d-48c0-ad4c-978e058ffebd" containerName="extract"
Mar 09 13:37:24 crc kubenswrapper[4764]: E0309 13:37:24.808226 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a8dfccd4-6f59-4e38-8beb-d586722f6429" containerName="extract-utilities"
Mar 09 13:37:24 crc kubenswrapper[4764]: I0309 13:37:24.808233 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="a8dfccd4-6f59-4e38-8beb-d586722f6429" containerName="extract-utilities"
Mar 09 13:37:24 crc kubenswrapper[4764]: E0309 13:37:24.808241 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a8dfccd4-6f59-4e38-8beb-d586722f6429" containerName="extract-content"
Mar 09 13:37:24 crc kubenswrapper[4764]: I0309 13:37:24.808249 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="a8dfccd4-6f59-4e38-8beb-d586722f6429" containerName="extract-content"
Mar 09 13:37:24 crc kubenswrapper[4764]: E0309 13:37:24.808260 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a8dfccd4-6f59-4e38-8beb-d586722f6429" containerName="registry-server"
Mar 09 13:37:24 crc kubenswrapper[4764]: I0309 13:37:24.808266 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="a8dfccd4-6f59-4e38-8beb-d586722f6429" containerName="registry-server"
Mar 09 13:37:24 crc kubenswrapper[4764]: I0309 13:37:24.808412 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="a8dfccd4-6f59-4e38-8beb-d586722f6429" containerName="registry-server"
Mar 09 13:37:24 crc kubenswrapper[4764]: I0309 13:37:24.808426 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="3f89e888-fc0d-48c0-ad4c-978e058ffebd" containerName="extract"
Mar 09 13:37:24 crc kubenswrapper[4764]: I0309 13:37:24.809399 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-m4qqw"
Mar 09 13:37:24 crc kubenswrapper[4764]: I0309 13:37:24.848737 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-m4qqw"]
Mar 09 13:37:24 crc kubenswrapper[4764]: I0309 13:37:24.873288 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p6chf\" (UniqueName: \"kubernetes.io/projected/36d190ba-cb10-4d0a-a5f2-b87befbf6f87-kube-api-access-p6chf\") pod \"redhat-marketplace-m4qqw\" (UID: \"36d190ba-cb10-4d0a-a5f2-b87befbf6f87\") " pod="openshift-marketplace/redhat-marketplace-m4qqw"
Mar 09 13:37:24 crc kubenswrapper[4764]: I0309 13:37:24.873395 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/36d190ba-cb10-4d0a-a5f2-b87befbf6f87-utilities\") pod \"redhat-marketplace-m4qqw\" (UID: \"36d190ba-cb10-4d0a-a5f2-b87befbf6f87\") " pod="openshift-marketplace/redhat-marketplace-m4qqw"
Mar 09 13:37:24 crc kubenswrapper[4764]: I0309 13:37:24.873459 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/36d190ba-cb10-4d0a-a5f2-b87befbf6f87-catalog-content\") pod \"redhat-marketplace-m4qqw\" (UID: \"36d190ba-cb10-4d0a-a5f2-b87befbf6f87\") " pod="openshift-marketplace/redhat-marketplace-m4qqw"
Mar 09 13:37:24 crc kubenswrapper[4764]: I0309 13:37:24.975229 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/36d190ba-cb10-4d0a-a5f2-b87befbf6f87-catalog-content\") pod \"redhat-marketplace-m4qqw\" (UID: \"36d190ba-cb10-4d0a-a5f2-b87befbf6f87\") " pod="openshift-marketplace/redhat-marketplace-m4qqw"
Mar 09 13:37:24 crc kubenswrapper[4764]: I0309 13:37:24.975344 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p6chf\" (UniqueName: \"kubernetes.io/projected/36d190ba-cb10-4d0a-a5f2-b87befbf6f87-kube-api-access-p6chf\") pod \"redhat-marketplace-m4qqw\" (UID: \"36d190ba-cb10-4d0a-a5f2-b87befbf6f87\") " pod="openshift-marketplace/redhat-marketplace-m4qqw"
Mar 09 13:37:24 crc kubenswrapper[4764]: I0309 13:37:24.975420 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/36d190ba-cb10-4d0a-a5f2-b87befbf6f87-utilities\") pod \"redhat-marketplace-m4qqw\" (UID: \"36d190ba-cb10-4d0a-a5f2-b87befbf6f87\") " pod="openshift-marketplace/redhat-marketplace-m4qqw"
Mar 09 13:37:24 crc kubenswrapper[4764]: I0309 13:37:24.976139 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/36d190ba-cb10-4d0a-a5f2-b87befbf6f87-utilities\") pod \"redhat-marketplace-m4qqw\" (UID: \"36d190ba-cb10-4d0a-a5f2-b87befbf6f87\") " pod="openshift-marketplace/redhat-marketplace-m4qqw"
Mar 09 13:37:24 crc kubenswrapper[4764]: I0309 13:37:24.976228 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/36d190ba-cb10-4d0a-a5f2-b87befbf6f87-catalog-content\") pod \"redhat-marketplace-m4qqw\" (UID: \"36d190ba-cb10-4d0a-a5f2-b87befbf6f87\") " pod="openshift-marketplace/redhat-marketplace-m4qqw"
Mar 09 13:37:24 crc kubenswrapper[4764]: I0309 13:37:24.994033 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p6chf\" (UniqueName: \"kubernetes.io/projected/36d190ba-cb10-4d0a-a5f2-b87befbf6f87-kube-api-access-p6chf\") pod \"redhat-marketplace-m4qqw\" (UID: \"36d190ba-cb10-4d0a-a5f2-b87befbf6f87\") " pod="openshift-marketplace/redhat-marketplace-m4qqw"
Mar 09 13:37:25 crc kubenswrapper[4764]: I0309 13:37:25.127053 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-m4qqw"
Mar 09 13:37:25 crc kubenswrapper[4764]: I0309 13:37:25.239005 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wbrdf" event={"ID":"a8dfccd4-6f59-4e38-8beb-d586722f6429","Type":"ContainerDied","Data":"3e4da4181002b0bbc30f0995a9b31086d3309299728b9e55d210f7d242dd2b4b"}
Mar 09 13:37:25 crc kubenswrapper[4764]: I0309 13:37:25.239361 4764 scope.go:117] "RemoveContainer" containerID="9eca7f1ea0758281e2a6e18462903a1ceedc6b59b0cf6098bdfe193448c7d9e5"
Mar 09 13:37:25 crc kubenswrapper[4764]: I0309 13:37:25.239515 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-wbrdf"
Mar 09 13:37:25 crc kubenswrapper[4764]: I0309 13:37:25.278772 4764 scope.go:117] "RemoveContainer" containerID="83d42e877f0fbd05f43d48b955fbdaf6a30563c45f97fa84b159a579fe3f00b5"
Mar 09 13:37:25 crc kubenswrapper[4764]: I0309 13:37:25.281238 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-wbrdf"]
Mar 09 13:37:25 crc kubenswrapper[4764]: I0309 13:37:25.286427 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-wbrdf"]
Mar 09 13:37:25 crc kubenswrapper[4764]: I0309 13:37:25.306978 4764 scope.go:117] "RemoveContainer" containerID="65f7c38a5f862088c540cb251a678acbf56eb7ce24508fe81ee1a45ff576510e"
Mar 09 13:37:25 crc kubenswrapper[4764]: I0309 13:37:25.570819 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a8dfccd4-6f59-4e38-8beb-d586722f6429" path="/var/lib/kubelet/pods/a8dfccd4-6f59-4e38-8beb-d586722f6429/volumes"
Mar 09 13:37:25 crc kubenswrapper[4764]: I0309 13:37:25.611676 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-m4qqw"]
Mar 09 13:37:25 crc kubenswrapper[4764]: W0309 13:37:25.622920 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod36d190ba_cb10_4d0a_a5f2_b87befbf6f87.slice/crio-7755051cd41c2e60be705b844644aeaa882858e41de06e49db1ac8b1d1793e1e WatchSource:0}: Error finding container 7755051cd41c2e60be705b844644aeaa882858e41de06e49db1ac8b1d1793e1e: Status 404 returned error can't find the container with id 7755051cd41c2e60be705b844644aeaa882858e41de06e49db1ac8b1d1793e1e
Mar 09 13:37:26 crc kubenswrapper[4764]: I0309 13:37:26.247528 4764 generic.go:334] "Generic (PLEG): container finished" podID="36d190ba-cb10-4d0a-a5f2-b87befbf6f87" containerID="476aec4d944077e73498f316fe9d3fd7b8aa5f1995803ba7f7af28feb66c2ef3" exitCode=0
Mar 09 13:37:26 crc kubenswrapper[4764]: I0309 13:37:26.247597 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-m4qqw" event={"ID":"36d190ba-cb10-4d0a-a5f2-b87befbf6f87","Type":"ContainerDied","Data":"476aec4d944077e73498f316fe9d3fd7b8aa5f1995803ba7f7af28feb66c2ef3"} Mar 09 13:37:26 crc kubenswrapper[4764]: I0309 13:37:26.248016 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-m4qqw" event={"ID":"36d190ba-cb10-4d0a-a5f2-b87befbf6f87","Type":"ContainerStarted","Data":"7755051cd41c2e60be705b844644aeaa882858e41de06e49db1ac8b1d1793e1e"} Mar 09 13:37:27 crc kubenswrapper[4764]: I0309 13:37:27.260463 4764 generic.go:334] "Generic (PLEG): container finished" podID="36d190ba-cb10-4d0a-a5f2-b87befbf6f87" containerID="1760acfca8d45db46ccb61c42fe77b466941c9ff04663728cd5544b31e088cab" exitCode=0 Mar 09 13:37:27 crc kubenswrapper[4764]: I0309 13:37:27.260528 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-m4qqw" event={"ID":"36d190ba-cb10-4d0a-a5f2-b87befbf6f87","Type":"ContainerDied","Data":"1760acfca8d45db46ccb61c42fe77b466941c9ff04663728cd5544b31e088cab"} Mar 09 13:37:28 crc kubenswrapper[4764]: I0309 13:37:28.269404 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-m4qqw" event={"ID":"36d190ba-cb10-4d0a-a5f2-b87befbf6f87","Type":"ContainerStarted","Data":"268126c07789b73f5e804a36c88266544a60c65041565cfbad871a65aa4c42e4"} Mar 09 13:37:28 crc kubenswrapper[4764]: I0309 13:37:28.292161 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-init-6754b7f846-ns9zn"] Mar 09 13:37:28 crc kubenswrapper[4764]: I0309 13:37:28.293495 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-6754b7f846-ns9zn" Mar 09 13:37:28 crc kubenswrapper[4764]: I0309 13:37:28.295623 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-init-dockercfg-44sd7" Mar 09 13:37:28 crc kubenswrapper[4764]: I0309 13:37:28.307063 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-m4qqw" podStartSLOduration=2.8889430369999998 podStartE2EDuration="4.307042293s" podCreationTimestamp="2026-03-09 13:37:24 +0000 UTC" firstStartedPulling="2026-03-09 13:37:26.249890117 +0000 UTC m=+1001.500062025" lastFinishedPulling="2026-03-09 13:37:27.667989373 +0000 UTC m=+1002.918161281" observedRunningTime="2026-03-09 13:37:28.304328451 +0000 UTC m=+1003.554500369" watchObservedRunningTime="2026-03-09 13:37:28.307042293 +0000 UTC m=+1003.557214201" Mar 09 13:37:28 crc kubenswrapper[4764]: I0309 13:37:28.328739 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-6754b7f846-ns9zn"] Mar 09 13:37:28 crc kubenswrapper[4764]: I0309 13:37:28.370030 4764 patch_prober.go:28] interesting pod/machine-config-daemon-xxczl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 09 13:37:28 crc kubenswrapper[4764]: I0309 13:37:28.370092 4764 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 09 13:37:28 crc kubenswrapper[4764]: I0309 13:37:28.429393 4764 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4zfsv\" (UniqueName: \"kubernetes.io/projected/67c57635-59f1-48a2-9823-c86732eabbf6-kube-api-access-4zfsv\") pod \"openstack-operator-controller-init-6754b7f846-ns9zn\" (UID: \"67c57635-59f1-48a2-9823-c86732eabbf6\") " pod="openstack-operators/openstack-operator-controller-init-6754b7f846-ns9zn" Mar 09 13:37:28 crc kubenswrapper[4764]: I0309 13:37:28.531003 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4zfsv\" (UniqueName: \"kubernetes.io/projected/67c57635-59f1-48a2-9823-c86732eabbf6-kube-api-access-4zfsv\") pod \"openstack-operator-controller-init-6754b7f846-ns9zn\" (UID: \"67c57635-59f1-48a2-9823-c86732eabbf6\") " pod="openstack-operators/openstack-operator-controller-init-6754b7f846-ns9zn" Mar 09 13:37:28 crc kubenswrapper[4764]: I0309 13:37:28.562709 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4zfsv\" (UniqueName: \"kubernetes.io/projected/67c57635-59f1-48a2-9823-c86732eabbf6-kube-api-access-4zfsv\") pod \"openstack-operator-controller-init-6754b7f846-ns9zn\" (UID: \"67c57635-59f1-48a2-9823-c86732eabbf6\") " pod="openstack-operators/openstack-operator-controller-init-6754b7f846-ns9zn" Mar 09 13:37:28 crc kubenswrapper[4764]: I0309 13:37:28.609868 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-6754b7f846-ns9zn" Mar 09 13:37:28 crc kubenswrapper[4764]: I0309 13:37:28.892349 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-6754b7f846-ns9zn"] Mar 09 13:37:28 crc kubenswrapper[4764]: W0309 13:37:28.899893 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod67c57635_59f1_48a2_9823_c86732eabbf6.slice/crio-50ed490116b2fe4bc00db782eb11ee4ed848d95032478408fb7b3ab412ea9e99 WatchSource:0}: Error finding container 50ed490116b2fe4bc00db782eb11ee4ed848d95032478408fb7b3ab412ea9e99: Status 404 returned error can't find the container with id 50ed490116b2fe4bc00db782eb11ee4ed848d95032478408fb7b3ab412ea9e99 Mar 09 13:37:29 crc kubenswrapper[4764]: I0309 13:37:29.280575 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-6754b7f846-ns9zn" event={"ID":"67c57635-59f1-48a2-9823-c86732eabbf6","Type":"ContainerStarted","Data":"50ed490116b2fe4bc00db782eb11ee4ed848d95032478408fb7b3ab412ea9e99"} Mar 09 13:37:35 crc kubenswrapper[4764]: I0309 13:37:35.128227 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-m4qqw" Mar 09 13:37:35 crc kubenswrapper[4764]: I0309 13:37:35.129044 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-m4qqw" Mar 09 13:37:35 crc kubenswrapper[4764]: I0309 13:37:35.181620 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-m4qqw" Mar 09 13:37:35 crc kubenswrapper[4764]: I0309 13:37:35.367037 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-6754b7f846-ns9zn" 
event={"ID":"67c57635-59f1-48a2-9823-c86732eabbf6","Type":"ContainerStarted","Data":"c862bc3fd9e6268402af1f5ce9425cb23adf6320f25fe03c8a2fa25c24c088d2"} Mar 09 13:37:35 crc kubenswrapper[4764]: I0309 13:37:35.367357 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-init-6754b7f846-ns9zn" Mar 09 13:37:35 crc kubenswrapper[4764]: I0309 13:37:35.398568 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-init-6754b7f846-ns9zn" podStartSLOduration=1.9172723280000001 podStartE2EDuration="7.39854516s" podCreationTimestamp="2026-03-09 13:37:28 +0000 UTC" firstStartedPulling="2026-03-09 13:37:28.90179087 +0000 UTC m=+1004.151962778" lastFinishedPulling="2026-03-09 13:37:34.383063702 +0000 UTC m=+1009.633235610" observedRunningTime="2026-03-09 13:37:35.397555343 +0000 UTC m=+1010.647727261" watchObservedRunningTime="2026-03-09 13:37:35.39854516 +0000 UTC m=+1010.648717068" Mar 09 13:37:35 crc kubenswrapper[4764]: I0309 13:37:35.423984 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-m4qqw" Mar 09 13:37:35 crc kubenswrapper[4764]: I0309 13:37:35.998504 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-m4qqw"] Mar 09 13:37:37 crc kubenswrapper[4764]: I0309 13:37:37.380075 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-m4qqw" podUID="36d190ba-cb10-4d0a-a5f2-b87befbf6f87" containerName="registry-server" containerID="cri-o://268126c07789b73f5e804a36c88266544a60c65041565cfbad871a65aa4c42e4" gracePeriod=2 Mar 09 13:37:37 crc kubenswrapper[4764]: E0309 13:37:37.503555 4764 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: 
[\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod36d190ba_cb10_4d0a_a5f2_b87befbf6f87.slice/crio-conmon-268126c07789b73f5e804a36c88266544a60c65041565cfbad871a65aa4c42e4.scope\": RecentStats: unable to find data in memory cache]" Mar 09 13:37:37 crc kubenswrapper[4764]: I0309 13:37:37.775606 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-m4qqw" Mar 09 13:37:37 crc kubenswrapper[4764]: I0309 13:37:37.942750 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/36d190ba-cb10-4d0a-a5f2-b87befbf6f87-catalog-content\") pod \"36d190ba-cb10-4d0a-a5f2-b87befbf6f87\" (UID: \"36d190ba-cb10-4d0a-a5f2-b87befbf6f87\") " Mar 09 13:37:37 crc kubenswrapper[4764]: I0309 13:37:37.943166 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p6chf\" (UniqueName: \"kubernetes.io/projected/36d190ba-cb10-4d0a-a5f2-b87befbf6f87-kube-api-access-p6chf\") pod \"36d190ba-cb10-4d0a-a5f2-b87befbf6f87\" (UID: \"36d190ba-cb10-4d0a-a5f2-b87befbf6f87\") " Mar 09 13:37:37 crc kubenswrapper[4764]: I0309 13:37:37.943340 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/36d190ba-cb10-4d0a-a5f2-b87befbf6f87-utilities\") pod \"36d190ba-cb10-4d0a-a5f2-b87befbf6f87\" (UID: \"36d190ba-cb10-4d0a-a5f2-b87befbf6f87\") " Mar 09 13:37:37 crc kubenswrapper[4764]: I0309 13:37:37.944990 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/36d190ba-cb10-4d0a-a5f2-b87befbf6f87-utilities" (OuterVolumeSpecName: "utilities") pod "36d190ba-cb10-4d0a-a5f2-b87befbf6f87" (UID: "36d190ba-cb10-4d0a-a5f2-b87befbf6f87"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 13:37:37 crc kubenswrapper[4764]: I0309 13:37:37.951271 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/36d190ba-cb10-4d0a-a5f2-b87befbf6f87-kube-api-access-p6chf" (OuterVolumeSpecName: "kube-api-access-p6chf") pod "36d190ba-cb10-4d0a-a5f2-b87befbf6f87" (UID: "36d190ba-cb10-4d0a-a5f2-b87befbf6f87"). InnerVolumeSpecName "kube-api-access-p6chf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:37:37 crc kubenswrapper[4764]: I0309 13:37:37.987265 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/36d190ba-cb10-4d0a-a5f2-b87befbf6f87-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "36d190ba-cb10-4d0a-a5f2-b87befbf6f87" (UID: "36d190ba-cb10-4d0a-a5f2-b87befbf6f87"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 13:37:38 crc kubenswrapper[4764]: I0309 13:37:38.046497 4764 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/36d190ba-cb10-4d0a-a5f2-b87befbf6f87-utilities\") on node \"crc\" DevicePath \"\"" Mar 09 13:37:38 crc kubenswrapper[4764]: I0309 13:37:38.046583 4764 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/36d190ba-cb10-4d0a-a5f2-b87befbf6f87-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 09 13:37:38 crc kubenswrapper[4764]: I0309 13:37:38.046606 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p6chf\" (UniqueName: \"kubernetes.io/projected/36d190ba-cb10-4d0a-a5f2-b87befbf6f87-kube-api-access-p6chf\") on node \"crc\" DevicePath \"\"" Mar 09 13:37:38 crc kubenswrapper[4764]: I0309 13:37:38.390216 4764 generic.go:334] "Generic (PLEG): container finished" podID="36d190ba-cb10-4d0a-a5f2-b87befbf6f87" 
containerID="268126c07789b73f5e804a36c88266544a60c65041565cfbad871a65aa4c42e4" exitCode=0 Mar 09 13:37:38 crc kubenswrapper[4764]: I0309 13:37:38.390295 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-m4qqw" event={"ID":"36d190ba-cb10-4d0a-a5f2-b87befbf6f87","Type":"ContainerDied","Data":"268126c07789b73f5e804a36c88266544a60c65041565cfbad871a65aa4c42e4"} Mar 09 13:37:38 crc kubenswrapper[4764]: I0309 13:37:38.390352 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-m4qqw" event={"ID":"36d190ba-cb10-4d0a-a5f2-b87befbf6f87","Type":"ContainerDied","Data":"7755051cd41c2e60be705b844644aeaa882858e41de06e49db1ac8b1d1793e1e"} Mar 09 13:37:38 crc kubenswrapper[4764]: I0309 13:37:38.390380 4764 scope.go:117] "RemoveContainer" containerID="268126c07789b73f5e804a36c88266544a60c65041565cfbad871a65aa4c42e4" Mar 09 13:37:38 crc kubenswrapper[4764]: I0309 13:37:38.390299 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-m4qqw" Mar 09 13:37:38 crc kubenswrapper[4764]: I0309 13:37:38.421541 4764 scope.go:117] "RemoveContainer" containerID="1760acfca8d45db46ccb61c42fe77b466941c9ff04663728cd5544b31e088cab" Mar 09 13:37:38 crc kubenswrapper[4764]: I0309 13:37:38.426380 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-m4qqw"] Mar 09 13:37:38 crc kubenswrapper[4764]: I0309 13:37:38.432788 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-m4qqw"] Mar 09 13:37:38 crc kubenswrapper[4764]: I0309 13:37:38.443172 4764 scope.go:117] "RemoveContainer" containerID="476aec4d944077e73498f316fe9d3fd7b8aa5f1995803ba7f7af28feb66c2ef3" Mar 09 13:37:38 crc kubenswrapper[4764]: I0309 13:37:38.461965 4764 scope.go:117] "RemoveContainer" containerID="268126c07789b73f5e804a36c88266544a60c65041565cfbad871a65aa4c42e4" Mar 09 13:37:38 crc kubenswrapper[4764]: E0309 13:37:38.462441 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"268126c07789b73f5e804a36c88266544a60c65041565cfbad871a65aa4c42e4\": container with ID starting with 268126c07789b73f5e804a36c88266544a60c65041565cfbad871a65aa4c42e4 not found: ID does not exist" containerID="268126c07789b73f5e804a36c88266544a60c65041565cfbad871a65aa4c42e4" Mar 09 13:37:38 crc kubenswrapper[4764]: I0309 13:37:38.462476 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"268126c07789b73f5e804a36c88266544a60c65041565cfbad871a65aa4c42e4"} err="failed to get container status \"268126c07789b73f5e804a36c88266544a60c65041565cfbad871a65aa4c42e4\": rpc error: code = NotFound desc = could not find container \"268126c07789b73f5e804a36c88266544a60c65041565cfbad871a65aa4c42e4\": container with ID starting with 268126c07789b73f5e804a36c88266544a60c65041565cfbad871a65aa4c42e4 not found: 
ID does not exist" Mar 09 13:37:38 crc kubenswrapper[4764]: I0309 13:37:38.462496 4764 scope.go:117] "RemoveContainer" containerID="1760acfca8d45db46ccb61c42fe77b466941c9ff04663728cd5544b31e088cab" Mar 09 13:37:38 crc kubenswrapper[4764]: E0309 13:37:38.462791 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1760acfca8d45db46ccb61c42fe77b466941c9ff04663728cd5544b31e088cab\": container with ID starting with 1760acfca8d45db46ccb61c42fe77b466941c9ff04663728cd5544b31e088cab not found: ID does not exist" containerID="1760acfca8d45db46ccb61c42fe77b466941c9ff04663728cd5544b31e088cab" Mar 09 13:37:38 crc kubenswrapper[4764]: I0309 13:37:38.462813 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1760acfca8d45db46ccb61c42fe77b466941c9ff04663728cd5544b31e088cab"} err="failed to get container status \"1760acfca8d45db46ccb61c42fe77b466941c9ff04663728cd5544b31e088cab\": rpc error: code = NotFound desc = could not find container \"1760acfca8d45db46ccb61c42fe77b466941c9ff04663728cd5544b31e088cab\": container with ID starting with 1760acfca8d45db46ccb61c42fe77b466941c9ff04663728cd5544b31e088cab not found: ID does not exist" Mar 09 13:37:38 crc kubenswrapper[4764]: I0309 13:37:38.462825 4764 scope.go:117] "RemoveContainer" containerID="476aec4d944077e73498f316fe9d3fd7b8aa5f1995803ba7f7af28feb66c2ef3" Mar 09 13:37:38 crc kubenswrapper[4764]: E0309 13:37:38.463064 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"476aec4d944077e73498f316fe9d3fd7b8aa5f1995803ba7f7af28feb66c2ef3\": container with ID starting with 476aec4d944077e73498f316fe9d3fd7b8aa5f1995803ba7f7af28feb66c2ef3 not found: ID does not exist" containerID="476aec4d944077e73498f316fe9d3fd7b8aa5f1995803ba7f7af28feb66c2ef3" Mar 09 13:37:38 crc kubenswrapper[4764]: I0309 13:37:38.463086 4764 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"476aec4d944077e73498f316fe9d3fd7b8aa5f1995803ba7f7af28feb66c2ef3"} err="failed to get container status \"476aec4d944077e73498f316fe9d3fd7b8aa5f1995803ba7f7af28feb66c2ef3\": rpc error: code = NotFound desc = could not find container \"476aec4d944077e73498f316fe9d3fd7b8aa5f1995803ba7f7af28feb66c2ef3\": container with ID starting with 476aec4d944077e73498f316fe9d3fd7b8aa5f1995803ba7f7af28feb66c2ef3 not found: ID does not exist" Mar 09 13:37:39 crc kubenswrapper[4764]: I0309 13:37:39.570815 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="36d190ba-cb10-4d0a-a5f2-b87befbf6f87" path="/var/lib/kubelet/pods/36d190ba-cb10-4d0a-a5f2-b87befbf6f87/volumes" Mar 09 13:37:42 crc kubenswrapper[4764]: I0309 13:37:42.412803 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-jfqln"] Mar 09 13:37:42 crc kubenswrapper[4764]: E0309 13:37:42.413480 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="36d190ba-cb10-4d0a-a5f2-b87befbf6f87" containerName="extract-utilities" Mar 09 13:37:42 crc kubenswrapper[4764]: I0309 13:37:42.413499 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="36d190ba-cb10-4d0a-a5f2-b87befbf6f87" containerName="extract-utilities" Mar 09 13:37:42 crc kubenswrapper[4764]: E0309 13:37:42.413516 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="36d190ba-cb10-4d0a-a5f2-b87befbf6f87" containerName="extract-content" Mar 09 13:37:42 crc kubenswrapper[4764]: I0309 13:37:42.413523 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="36d190ba-cb10-4d0a-a5f2-b87befbf6f87" containerName="extract-content" Mar 09 13:37:42 crc kubenswrapper[4764]: E0309 13:37:42.413548 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="36d190ba-cb10-4d0a-a5f2-b87befbf6f87" containerName="registry-server" Mar 09 13:37:42 crc kubenswrapper[4764]: I0309 13:37:42.413558 4764 
state_mem.go:107] "Deleted CPUSet assignment" podUID="36d190ba-cb10-4d0a-a5f2-b87befbf6f87" containerName="registry-server" Mar 09 13:37:42 crc kubenswrapper[4764]: I0309 13:37:42.413768 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="36d190ba-cb10-4d0a-a5f2-b87befbf6f87" containerName="registry-server" Mar 09 13:37:42 crc kubenswrapper[4764]: I0309 13:37:42.414878 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-jfqln" Mar 09 13:37:42 crc kubenswrapper[4764]: I0309 13:37:42.436758 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-jfqln"] Mar 09 13:37:42 crc kubenswrapper[4764]: I0309 13:37:42.508785 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a77dd9b9-647b-4a75-b754-d7c92507e241-catalog-content\") pod \"certified-operators-jfqln\" (UID: \"a77dd9b9-647b-4a75-b754-d7c92507e241\") " pod="openshift-marketplace/certified-operators-jfqln" Mar 09 13:37:42 crc kubenswrapper[4764]: I0309 13:37:42.508834 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a77dd9b9-647b-4a75-b754-d7c92507e241-utilities\") pod \"certified-operators-jfqln\" (UID: \"a77dd9b9-647b-4a75-b754-d7c92507e241\") " pod="openshift-marketplace/certified-operators-jfqln" Mar 09 13:37:42 crc kubenswrapper[4764]: I0309 13:37:42.508863 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gct72\" (UniqueName: \"kubernetes.io/projected/a77dd9b9-647b-4a75-b754-d7c92507e241-kube-api-access-gct72\") pod \"certified-operators-jfqln\" (UID: \"a77dd9b9-647b-4a75-b754-d7c92507e241\") " pod="openshift-marketplace/certified-operators-jfqln" Mar 09 13:37:42 crc kubenswrapper[4764]: I0309 
13:37:42.611212 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a77dd9b9-647b-4a75-b754-d7c92507e241-catalog-content\") pod \"certified-operators-jfqln\" (UID: \"a77dd9b9-647b-4a75-b754-d7c92507e241\") " pod="openshift-marketplace/certified-operators-jfqln" Mar 09 13:37:42 crc kubenswrapper[4764]: I0309 13:37:42.611706 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a77dd9b9-647b-4a75-b754-d7c92507e241-utilities\") pod \"certified-operators-jfqln\" (UID: \"a77dd9b9-647b-4a75-b754-d7c92507e241\") " pod="openshift-marketplace/certified-operators-jfqln" Mar 09 13:37:42 crc kubenswrapper[4764]: I0309 13:37:42.611830 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a77dd9b9-647b-4a75-b754-d7c92507e241-catalog-content\") pod \"certified-operators-jfqln\" (UID: \"a77dd9b9-647b-4a75-b754-d7c92507e241\") " pod="openshift-marketplace/certified-operators-jfqln" Mar 09 13:37:42 crc kubenswrapper[4764]: I0309 13:37:42.612191 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a77dd9b9-647b-4a75-b754-d7c92507e241-utilities\") pod \"certified-operators-jfqln\" (UID: \"a77dd9b9-647b-4a75-b754-d7c92507e241\") " pod="openshift-marketplace/certified-operators-jfqln" Mar 09 13:37:42 crc kubenswrapper[4764]: I0309 13:37:42.612406 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gct72\" (UniqueName: \"kubernetes.io/projected/a77dd9b9-647b-4a75-b754-d7c92507e241-kube-api-access-gct72\") pod \"certified-operators-jfqln\" (UID: \"a77dd9b9-647b-4a75-b754-d7c92507e241\") " pod="openshift-marketplace/certified-operators-jfqln" Mar 09 13:37:42 crc kubenswrapper[4764]: I0309 13:37:42.639068 4764 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gct72\" (UniqueName: \"kubernetes.io/projected/a77dd9b9-647b-4a75-b754-d7c92507e241-kube-api-access-gct72\") pod \"certified-operators-jfqln\" (UID: \"a77dd9b9-647b-4a75-b754-d7c92507e241\") " pod="openshift-marketplace/certified-operators-jfqln" Mar 09 13:37:42 crc kubenswrapper[4764]: I0309 13:37:42.740022 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-jfqln" Mar 09 13:37:43 crc kubenswrapper[4764]: I0309 13:37:43.202139 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-jfqln"] Mar 09 13:37:43 crc kubenswrapper[4764]: I0309 13:37:43.446632 4764 generic.go:334] "Generic (PLEG): container finished" podID="a77dd9b9-647b-4a75-b754-d7c92507e241" containerID="9cc110aa0cbe4a810cdc27fc5d3549dd4a1ef08a16d0a88aba121e7def0205ec" exitCode=0 Mar 09 13:37:43 crc kubenswrapper[4764]: I0309 13:37:43.446699 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jfqln" event={"ID":"a77dd9b9-647b-4a75-b754-d7c92507e241","Type":"ContainerDied","Data":"9cc110aa0cbe4a810cdc27fc5d3549dd4a1ef08a16d0a88aba121e7def0205ec"} Mar 09 13:37:43 crc kubenswrapper[4764]: I0309 13:37:43.446731 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jfqln" event={"ID":"a77dd9b9-647b-4a75-b754-d7c92507e241","Type":"ContainerStarted","Data":"fea157f4e9abb709a49af60bce7d1a5f75e60dccdc7d1d8642fcea7367aa5768"} Mar 09 13:37:44 crc kubenswrapper[4764]: I0309 13:37:44.455463 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jfqln" event={"ID":"a77dd9b9-647b-4a75-b754-d7c92507e241","Type":"ContainerStarted","Data":"b5fc661ae561d0866cfa65888b69c005d4e29c4b626a7af3e20fd0370ac554ca"} Mar 09 13:37:45 crc kubenswrapper[4764]: I0309 13:37:45.463617 4764 
generic.go:334] "Generic (PLEG): container finished" podID="a77dd9b9-647b-4a75-b754-d7c92507e241" containerID="b5fc661ae561d0866cfa65888b69c005d4e29c4b626a7af3e20fd0370ac554ca" exitCode=0
Mar 09 13:37:45 crc kubenswrapper[4764]: I0309 13:37:45.463697 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jfqln" event={"ID":"a77dd9b9-647b-4a75-b754-d7c92507e241","Type":"ContainerDied","Data":"b5fc661ae561d0866cfa65888b69c005d4e29c4b626a7af3e20fd0370ac554ca"}
Mar 09 13:37:47 crc kubenswrapper[4764]: I0309 13:37:47.480855 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jfqln" event={"ID":"a77dd9b9-647b-4a75-b754-d7c92507e241","Type":"ContainerStarted","Data":"c6becf3bd2bee1ada31af980926b44680472b5286412c9246d30f2b373c59019"}
Mar 09 13:37:47 crc kubenswrapper[4764]: I0309 13:37:47.502713 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-jfqln" podStartSLOduration=2.000163354 podStartE2EDuration="5.502626864s" podCreationTimestamp="2026-03-09 13:37:42 +0000 UTC" firstStartedPulling="2026-03-09 13:37:43.448577372 +0000 UTC m=+1018.698749280" lastFinishedPulling="2026-03-09 13:37:46.951040882 +0000 UTC m=+1022.201212790" observedRunningTime="2026-03-09 13:37:47.498514604 +0000 UTC m=+1022.748686512" watchObservedRunningTime="2026-03-09 13:37:47.502626864 +0000 UTC m=+1022.752798772"
Mar 09 13:37:48 crc kubenswrapper[4764]: I0309 13:37:48.617073 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-init-6754b7f846-ns9zn"
Mar 09 13:37:52 crc kubenswrapper[4764]: I0309 13:37:52.740408 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-jfqln"
Mar 09 13:37:52 crc kubenswrapper[4764]: I0309 13:37:52.740810 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-jfqln"
Mar 09 13:37:52 crc kubenswrapper[4764]: I0309 13:37:52.807189 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-jfqln"
Mar 09 13:37:53 crc kubenswrapper[4764]: I0309 13:37:53.568265 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-jfqln"
Mar 09 13:37:53 crc kubenswrapper[4764]: I0309 13:37:53.612839 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-jfqln"]
Mar 09 13:37:55 crc kubenswrapper[4764]: I0309 13:37:55.549397 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-jfqln" podUID="a77dd9b9-647b-4a75-b754-d7c92507e241" containerName="registry-server" containerID="cri-o://c6becf3bd2bee1ada31af980926b44680472b5286412c9246d30f2b373c59019" gracePeriod=2
Mar 09 13:37:56 crc kubenswrapper[4764]: I0309 13:37:56.567766 4764 generic.go:334] "Generic (PLEG): container finished" podID="a77dd9b9-647b-4a75-b754-d7c92507e241" containerID="c6becf3bd2bee1ada31af980926b44680472b5286412c9246d30f2b373c59019" exitCode=0
Mar 09 13:37:56 crc kubenswrapper[4764]: I0309 13:37:56.567838 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jfqln" event={"ID":"a77dd9b9-647b-4a75-b754-d7c92507e241","Type":"ContainerDied","Data":"c6becf3bd2bee1ada31af980926b44680472b5286412c9246d30f2b373c59019"}
Mar 09 13:37:56 crc kubenswrapper[4764]: I0309 13:37:56.696113 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-jfqln"
Mar 09 13:37:56 crc kubenswrapper[4764]: I0309 13:37:56.765892 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a77dd9b9-647b-4a75-b754-d7c92507e241-catalog-content\") pod \"a77dd9b9-647b-4a75-b754-d7c92507e241\" (UID: \"a77dd9b9-647b-4a75-b754-d7c92507e241\") "
Mar 09 13:37:56 crc kubenswrapper[4764]: I0309 13:37:56.765951 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gct72\" (UniqueName: \"kubernetes.io/projected/a77dd9b9-647b-4a75-b754-d7c92507e241-kube-api-access-gct72\") pod \"a77dd9b9-647b-4a75-b754-d7c92507e241\" (UID: \"a77dd9b9-647b-4a75-b754-d7c92507e241\") "
Mar 09 13:37:56 crc kubenswrapper[4764]: I0309 13:37:56.766062 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a77dd9b9-647b-4a75-b754-d7c92507e241-utilities\") pod \"a77dd9b9-647b-4a75-b754-d7c92507e241\" (UID: \"a77dd9b9-647b-4a75-b754-d7c92507e241\") "
Mar 09 13:37:56 crc kubenswrapper[4764]: I0309 13:37:56.767069 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a77dd9b9-647b-4a75-b754-d7c92507e241-utilities" (OuterVolumeSpecName: "utilities") pod "a77dd9b9-647b-4a75-b754-d7c92507e241" (UID: "a77dd9b9-647b-4a75-b754-d7c92507e241"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 09 13:37:56 crc kubenswrapper[4764]: I0309 13:37:56.774664 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a77dd9b9-647b-4a75-b754-d7c92507e241-kube-api-access-gct72" (OuterVolumeSpecName: "kube-api-access-gct72") pod "a77dd9b9-647b-4a75-b754-d7c92507e241" (UID: "a77dd9b9-647b-4a75-b754-d7c92507e241"). InnerVolumeSpecName "kube-api-access-gct72". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 13:37:56 crc kubenswrapper[4764]: I0309 13:37:56.834236 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a77dd9b9-647b-4a75-b754-d7c92507e241-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a77dd9b9-647b-4a75-b754-d7c92507e241" (UID: "a77dd9b9-647b-4a75-b754-d7c92507e241"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 09 13:37:56 crc kubenswrapper[4764]: I0309 13:37:56.868375 4764 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a77dd9b9-647b-4a75-b754-d7c92507e241-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 09 13:37:56 crc kubenswrapper[4764]: I0309 13:37:56.868413 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gct72\" (UniqueName: \"kubernetes.io/projected/a77dd9b9-647b-4a75-b754-d7c92507e241-kube-api-access-gct72\") on node \"crc\" DevicePath \"\""
Mar 09 13:37:56 crc kubenswrapper[4764]: I0309 13:37:56.868428 4764 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a77dd9b9-647b-4a75-b754-d7c92507e241-utilities\") on node \"crc\" DevicePath \"\""
Mar 09 13:37:57 crc kubenswrapper[4764]: I0309 13:37:57.577685 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jfqln" event={"ID":"a77dd9b9-647b-4a75-b754-d7c92507e241","Type":"ContainerDied","Data":"fea157f4e9abb709a49af60bce7d1a5f75e60dccdc7d1d8642fcea7367aa5768"}
Mar 09 13:37:57 crc kubenswrapper[4764]: I0309 13:37:57.577740 4764 scope.go:117] "RemoveContainer" containerID="c6becf3bd2bee1ada31af980926b44680472b5286412c9246d30f2b373c59019"
Mar 09 13:37:57 crc kubenswrapper[4764]: I0309 13:37:57.577760 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-jfqln"
Mar 09 13:37:57 crc kubenswrapper[4764]: I0309 13:37:57.599064 4764 scope.go:117] "RemoveContainer" containerID="b5fc661ae561d0866cfa65888b69c005d4e29c4b626a7af3e20fd0370ac554ca"
Mar 09 13:37:57 crc kubenswrapper[4764]: I0309 13:37:57.618291 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-jfqln"]
Mar 09 13:37:57 crc kubenswrapper[4764]: I0309 13:37:57.631436 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-jfqln"]
Mar 09 13:37:57 crc kubenswrapper[4764]: I0309 13:37:57.631554 4764 scope.go:117] "RemoveContainer" containerID="9cc110aa0cbe4a810cdc27fc5d3549dd4a1ef08a16d0a88aba121e7def0205ec"
Mar 09 13:37:58 crc kubenswrapper[4764]: I0309 13:37:58.370753 4764 patch_prober.go:28] interesting pod/machine-config-daemon-xxczl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 09 13:37:58 crc kubenswrapper[4764]: I0309 13:37:58.371050 4764 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 09 13:37:58 crc kubenswrapper[4764]: I0309 13:37:58.371095 4764 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-xxczl"
Mar 09 13:37:58 crc kubenswrapper[4764]: I0309 13:37:58.371763 4764 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"bd218b04a62035d950083134e237631295493daecba2baa1eef096560f891819"} pod="openshift-machine-config-operator/machine-config-daemon-xxczl" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Mar 09 13:37:58 crc kubenswrapper[4764]: I0309 13:37:58.371818 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922" containerName="machine-config-daemon" containerID="cri-o://bd218b04a62035d950083134e237631295493daecba2baa1eef096560f891819" gracePeriod=600
Mar 09 13:37:58 crc kubenswrapper[4764]: I0309 13:37:58.604247 4764 generic.go:334] "Generic (PLEG): container finished" podID="6bcdd179-43c2-427c-9fac-7155c122e922" containerID="bd218b04a62035d950083134e237631295493daecba2baa1eef096560f891819" exitCode=0
Mar 09 13:37:58 crc kubenswrapper[4764]: I0309 13:37:58.604363 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" event={"ID":"6bcdd179-43c2-427c-9fac-7155c122e922","Type":"ContainerDied","Data":"bd218b04a62035d950083134e237631295493daecba2baa1eef096560f891819"}
Mar 09 13:37:58 crc kubenswrapper[4764]: I0309 13:37:58.604413 4764 scope.go:117] "RemoveContainer" containerID="3e004d46e664a823c675c177e537cdd2b21dcfc6ff9afcb1b390da1939340817"
Mar 09 13:37:59 crc kubenswrapper[4764]: I0309 13:37:59.569990 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a77dd9b9-647b-4a75-b754-d7c92507e241" path="/var/lib/kubelet/pods/a77dd9b9-647b-4a75-b754-d7c92507e241/volumes"
Mar 09 13:37:59 crc kubenswrapper[4764]: I0309 13:37:59.620383 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" event={"ID":"6bcdd179-43c2-427c-9fac-7155c122e922","Type":"ContainerStarted","Data":"089fede4f6de3963d0297d0b784543ed7a9d38cdefe66f6bbb9aab989dbffbb0"}
Mar 09 13:38:00 crc kubenswrapper[4764]: I0309 13:38:00.144754 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29551058-6mlbf"]
Mar 09 13:38:00 crc kubenswrapper[4764]: E0309 13:38:00.145408 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a77dd9b9-647b-4a75-b754-d7c92507e241" containerName="extract-utilities"
Mar 09 13:38:00 crc kubenswrapper[4764]: I0309 13:38:00.145430 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="a77dd9b9-647b-4a75-b754-d7c92507e241" containerName="extract-utilities"
Mar 09 13:38:00 crc kubenswrapper[4764]: E0309 13:38:00.145448 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a77dd9b9-647b-4a75-b754-d7c92507e241" containerName="extract-content"
Mar 09 13:38:00 crc kubenswrapper[4764]: I0309 13:38:00.145455 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="a77dd9b9-647b-4a75-b754-d7c92507e241" containerName="extract-content"
Mar 09 13:38:00 crc kubenswrapper[4764]: E0309 13:38:00.145471 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a77dd9b9-647b-4a75-b754-d7c92507e241" containerName="registry-server"
Mar 09 13:38:00 crc kubenswrapper[4764]: I0309 13:38:00.145479 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="a77dd9b9-647b-4a75-b754-d7c92507e241" containerName="registry-server"
Mar 09 13:38:00 crc kubenswrapper[4764]: I0309 13:38:00.145618 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="a77dd9b9-647b-4a75-b754-d7c92507e241" containerName="registry-server"
Mar 09 13:38:00 crc kubenswrapper[4764]: I0309 13:38:00.146176 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551058-6mlbf"
Mar 09 13:38:00 crc kubenswrapper[4764]: I0309 13:38:00.147987 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-4s7mr"
Mar 09 13:38:00 crc kubenswrapper[4764]: I0309 13:38:00.148565 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 09 13:38:00 crc kubenswrapper[4764]: I0309 13:38:00.150966 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 09 13:38:00 crc kubenswrapper[4764]: I0309 13:38:00.152923 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551058-6mlbf"]
Mar 09 13:38:00 crc kubenswrapper[4764]: I0309 13:38:00.213702 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mps4k\" (UniqueName: \"kubernetes.io/projected/175910d6-eb27-4000-ac8b-9ea49f05bb8b-kube-api-access-mps4k\") pod \"auto-csr-approver-29551058-6mlbf\" (UID: \"175910d6-eb27-4000-ac8b-9ea49f05bb8b\") " pod="openshift-infra/auto-csr-approver-29551058-6mlbf"
Mar 09 13:38:00 crc kubenswrapper[4764]: I0309 13:38:00.315505 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mps4k\" (UniqueName: \"kubernetes.io/projected/175910d6-eb27-4000-ac8b-9ea49f05bb8b-kube-api-access-mps4k\") pod \"auto-csr-approver-29551058-6mlbf\" (UID: \"175910d6-eb27-4000-ac8b-9ea49f05bb8b\") " pod="openshift-infra/auto-csr-approver-29551058-6mlbf"
Mar 09 13:38:00 crc kubenswrapper[4764]: I0309 13:38:00.338670 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mps4k\" (UniqueName: \"kubernetes.io/projected/175910d6-eb27-4000-ac8b-9ea49f05bb8b-kube-api-access-mps4k\") pod \"auto-csr-approver-29551058-6mlbf\" (UID: \"175910d6-eb27-4000-ac8b-9ea49f05bb8b\") " pod="openshift-infra/auto-csr-approver-29551058-6mlbf"
Mar 09 13:38:00 crc kubenswrapper[4764]: I0309 13:38:00.496537 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551058-6mlbf"
Mar 09 13:38:01 crc kubenswrapper[4764]: I0309 13:38:01.006329 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551058-6mlbf"]
Mar 09 13:38:01 crc kubenswrapper[4764]: I0309 13:38:01.636588 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551058-6mlbf" event={"ID":"175910d6-eb27-4000-ac8b-9ea49f05bb8b","Type":"ContainerStarted","Data":"ce5182a953f7d877c80b916ad134db136116956777eedbeb4d0f7a167e307e06"}
Mar 09 13:38:02 crc kubenswrapper[4764]: I0309 13:38:02.646937 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551058-6mlbf" event={"ID":"175910d6-eb27-4000-ac8b-9ea49f05bb8b","Type":"ContainerStarted","Data":"c0f759ebfc59d520d002517970a3889749936b75779f65069038f6e34fd87723"}
Mar 09 13:38:02 crc kubenswrapper[4764]: I0309 13:38:02.691935 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29551058-6mlbf" podStartSLOduration=1.792670766 podStartE2EDuration="2.691911067s" podCreationTimestamp="2026-03-09 13:38:00 +0000 UTC" firstStartedPulling="2026-03-09 13:38:01.019217915 +0000 UTC m=+1036.269389823" lastFinishedPulling="2026-03-09 13:38:01.918458216 +0000 UTC m=+1037.168630124" observedRunningTime="2026-03-09 13:38:02.689485514 +0000 UTC m=+1037.939657422" watchObservedRunningTime="2026-03-09 13:38:02.691911067 +0000 UTC m=+1037.942082985"
Mar 09 13:38:03 crc kubenswrapper[4764]: I0309 13:38:03.653864 4764 generic.go:334] "Generic (PLEG): container finished" podID="175910d6-eb27-4000-ac8b-9ea49f05bb8b" containerID="c0f759ebfc59d520d002517970a3889749936b75779f65069038f6e34fd87723" exitCode=0
Mar 09 13:38:03 crc kubenswrapper[4764]: I0309 13:38:03.654053 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551058-6mlbf" event={"ID":"175910d6-eb27-4000-ac8b-9ea49f05bb8b","Type":"ContainerDied","Data":"c0f759ebfc59d520d002517970a3889749936b75779f65069038f6e34fd87723"}
Mar 09 13:38:05 crc kubenswrapper[4764]: I0309 13:38:05.033331 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551058-6mlbf"
Mar 09 13:38:05 crc kubenswrapper[4764]: I0309 13:38:05.191616 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mps4k\" (UniqueName: \"kubernetes.io/projected/175910d6-eb27-4000-ac8b-9ea49f05bb8b-kube-api-access-mps4k\") pod \"175910d6-eb27-4000-ac8b-9ea49f05bb8b\" (UID: \"175910d6-eb27-4000-ac8b-9ea49f05bb8b\") "
Mar 09 13:38:05 crc kubenswrapper[4764]: I0309 13:38:05.201428 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/175910d6-eb27-4000-ac8b-9ea49f05bb8b-kube-api-access-mps4k" (OuterVolumeSpecName: "kube-api-access-mps4k") pod "175910d6-eb27-4000-ac8b-9ea49f05bb8b" (UID: "175910d6-eb27-4000-ac8b-9ea49f05bb8b"). InnerVolumeSpecName "kube-api-access-mps4k". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 13:38:05 crc kubenswrapper[4764]: I0309 13:38:05.293822 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mps4k\" (UniqueName: \"kubernetes.io/projected/175910d6-eb27-4000-ac8b-9ea49f05bb8b-kube-api-access-mps4k\") on node \"crc\" DevicePath \"\""
Mar 09 13:38:05 crc kubenswrapper[4764]: I0309 13:38:05.668383 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551058-6mlbf" event={"ID":"175910d6-eb27-4000-ac8b-9ea49f05bb8b","Type":"ContainerDied","Data":"ce5182a953f7d877c80b916ad134db136116956777eedbeb4d0f7a167e307e06"}
Mar 09 13:38:05 crc kubenswrapper[4764]: I0309 13:38:05.668862 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ce5182a953f7d877c80b916ad134db136116956777eedbeb4d0f7a167e307e06"
Mar 09 13:38:05 crc kubenswrapper[4764]: I0309 13:38:05.668462 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551058-6mlbf"
Mar 09 13:38:06 crc kubenswrapper[4764]: I0309 13:38:06.091717 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29551052-t652n"]
Mar 09 13:38:06 crc kubenswrapper[4764]: I0309 13:38:06.096667 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29551052-t652n"]
Mar 09 13:38:07 crc kubenswrapper[4764]: I0309 13:38:07.566970 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ee50d407-01a6-43e7-833e-b803dbb4792f" path="/var/lib/kubelet/pods/ee50d407-01a6-43e7-833e-b803dbb4792f/volumes"
Mar 09 13:38:08 crc kubenswrapper[4764]: I0309 13:38:08.649604 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-6db6876945-82cg8"]
Mar 09 13:38:08 crc kubenswrapper[4764]: E0309 13:38:08.650265 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="175910d6-eb27-4000-ac8b-9ea49f05bb8b" containerName="oc"
Mar 09 13:38:08 crc kubenswrapper[4764]: I0309 13:38:08.650281 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="175910d6-eb27-4000-ac8b-9ea49f05bb8b" containerName="oc"
Mar 09 13:38:08 crc kubenswrapper[4764]: I0309 13:38:08.650427 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="175910d6-eb27-4000-ac8b-9ea49f05bb8b" containerName="oc"
Mar 09 13:38:08 crc kubenswrapper[4764]: I0309 13:38:08.651026 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-6db6876945-82cg8"
Mar 09 13:38:08 crc kubenswrapper[4764]: I0309 13:38:08.654794 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-dockercfg-cdpv6"
Mar 09 13:38:08 crc kubenswrapper[4764]: I0309 13:38:08.656924 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/cinder-operator-controller-manager-55d77d7b5c-nppjq"]
Mar 09 13:38:08 crc kubenswrapper[4764]: I0309 13:38:08.657922 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-55d77d7b5c-nppjq"
Mar 09 13:38:08 crc kubenswrapper[4764]: I0309 13:38:08.665300 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"cinder-operator-controller-manager-dockercfg-5dvsf"
Mar 09 13:38:08 crc kubenswrapper[4764]: I0309 13:38:08.673878 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-6db6876945-82cg8"]
Mar 09 13:38:08 crc kubenswrapper[4764]: I0309 13:38:08.691532 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-55d77d7b5c-nppjq"]
Mar 09 13:38:08 crc kubenswrapper[4764]: I0309 13:38:08.708693 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/designate-operator-controller-manager-5d87c9d997-cmtpc"]
Mar 09 13:38:08 crc kubenswrapper[4764]: I0309 13:38:08.709525 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-5d87c9d997-cmtpc"
Mar 09 13:38:08 crc kubenswrapper[4764]: I0309 13:38:08.717337 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"designate-operator-controller-manager-dockercfg-s8xv6"
Mar 09 13:38:08 crc kubenswrapper[4764]: I0309 13:38:08.737848 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-5d87c9d997-cmtpc"]
Mar 09 13:38:08 crc kubenswrapper[4764]: I0309 13:38:08.742189 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-znvfv\" (UniqueName: \"kubernetes.io/projected/e220a3f1-4dbe-4ee6-9b19-26985fa998cf-kube-api-access-znvfv\") pod \"barbican-operator-controller-manager-6db6876945-82cg8\" (UID: \"e220a3f1-4dbe-4ee6-9b19-26985fa998cf\") " pod="openstack-operators/barbican-operator-controller-manager-6db6876945-82cg8"
Mar 09 13:38:08 crc kubenswrapper[4764]: I0309 13:38:08.742233 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lnz4t\" (UniqueName: \"kubernetes.io/projected/4c271ca0-0c25-46d1-b730-e94f68397e29-kube-api-access-lnz4t\") pod \"cinder-operator-controller-manager-55d77d7b5c-nppjq\" (UID: \"4c271ca0-0c25-46d1-b730-e94f68397e29\") " pod="openstack-operators/cinder-operator-controller-manager-55d77d7b5c-nppjq"
Mar 09 13:38:08 crc kubenswrapper[4764]: I0309 13:38:08.772257 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-64db6967f8-mjf6m"]
Mar 09 13:38:08 crc kubenswrapper[4764]: I0309 13:38:08.773341 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-64db6967f8-mjf6m"
Mar 09 13:38:08 crc kubenswrapper[4764]: I0309 13:38:08.780055 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-924l7"
Mar 09 13:38:08 crc kubenswrapper[4764]: I0309 13:38:08.797420 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-64db6967f8-mjf6m"]
Mar 09 13:38:08 crc kubenswrapper[4764]: I0309 13:38:08.826695 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/heat-operator-controller-manager-cf99c678f-jnmbv"]
Mar 09 13:38:08 crc kubenswrapper[4764]: I0309 13:38:08.827480 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-cf99c678f-jnmbv"
Mar 09 13:38:08 crc kubenswrapper[4764]: I0309 13:38:08.837916 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"heat-operator-controller-manager-dockercfg-7cd2b"
Mar 09 13:38:08 crc kubenswrapper[4764]: I0309 13:38:08.845411 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-95kb4\" (UniqueName: \"kubernetes.io/projected/488ff419-d889-4778-96cf-a11006c49507-kube-api-access-95kb4\") pod \"glance-operator-controller-manager-64db6967f8-mjf6m\" (UID: \"488ff419-d889-4778-96cf-a11006c49507\") " pod="openstack-operators/glance-operator-controller-manager-64db6967f8-mjf6m"
Mar 09 13:38:08 crc kubenswrapper[4764]: I0309 13:38:08.845517 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-znvfv\" (UniqueName: \"kubernetes.io/projected/e220a3f1-4dbe-4ee6-9b19-26985fa998cf-kube-api-access-znvfv\") pod \"barbican-operator-controller-manager-6db6876945-82cg8\" (UID: \"e220a3f1-4dbe-4ee6-9b19-26985fa998cf\") " pod="openstack-operators/barbican-operator-controller-manager-6db6876945-82cg8"
Mar 09 13:38:08 crc kubenswrapper[4764]: I0309 13:38:08.845550 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lnz4t\" (UniqueName: \"kubernetes.io/projected/4c271ca0-0c25-46d1-b730-e94f68397e29-kube-api-access-lnz4t\") pod \"cinder-operator-controller-manager-55d77d7b5c-nppjq\" (UID: \"4c271ca0-0c25-46d1-b730-e94f68397e29\") " pod="openstack-operators/cinder-operator-controller-manager-55d77d7b5c-nppjq"
Mar 09 13:38:08 crc kubenswrapper[4764]: I0309 13:38:08.845627 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lkgj9\" (UniqueName: \"kubernetes.io/projected/725c0dd0-07d1-4a1c-b223-e8bec76cc7ff-kube-api-access-lkgj9\") pod \"designate-operator-controller-manager-5d87c9d997-cmtpc\" (UID: \"725c0dd0-07d1-4a1c-b223-e8bec76cc7ff\") " pod="openstack-operators/designate-operator-controller-manager-5d87c9d997-cmtpc"
Mar 09 13:38:08 crc kubenswrapper[4764]: I0309 13:38:08.860205 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-78bc7f9bd9-5xc2s"]
Mar 09 13:38:08 crc kubenswrapper[4764]: I0309 13:38:08.861134 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-78bc7f9bd9-5xc2s"
Mar 09 13:38:08 crc kubenswrapper[4764]: I0309 13:38:08.867188 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-t6qtj"
Mar 09 13:38:08 crc kubenswrapper[4764]: I0309 13:38:08.872733 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-78bc7f9bd9-5xc2s"]
Mar 09 13:38:08 crc kubenswrapper[4764]: I0309 13:38:08.893633 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lnz4t\" (UniqueName: \"kubernetes.io/projected/4c271ca0-0c25-46d1-b730-e94f68397e29-kube-api-access-lnz4t\") pod \"cinder-operator-controller-manager-55d77d7b5c-nppjq\" (UID: \"4c271ca0-0c25-46d1-b730-e94f68397e29\") " pod="openstack-operators/cinder-operator-controller-manager-55d77d7b5c-nppjq"
Mar 09 13:38:08 crc kubenswrapper[4764]: I0309 13:38:08.894253 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-znvfv\" (UniqueName: \"kubernetes.io/projected/e220a3f1-4dbe-4ee6-9b19-26985fa998cf-kube-api-access-znvfv\") pod \"barbican-operator-controller-manager-6db6876945-82cg8\" (UID: \"e220a3f1-4dbe-4ee6-9b19-26985fa998cf\") " pod="openstack-operators/barbican-operator-controller-manager-6db6876945-82cg8"
Mar 09 13:38:08 crc kubenswrapper[4764]: I0309 13:38:08.901165 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-cf99c678f-jnmbv"]
Mar 09 13:38:08 crc kubenswrapper[4764]: I0309 13:38:08.907763 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ironic-operator-controller-manager-545456dc4-hvpbz"]
Mar 09 13:38:08 crc kubenswrapper[4764]: I0309 13:38:08.919898 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-545456dc4-hvpbz"
Mar 09 13:38:08 crc kubenswrapper[4764]: I0309 13:38:08.920072 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-5995f4446f-m58s9"]
Mar 09 13:38:08 crc kubenswrapper[4764]: I0309 13:38:08.925037 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ironic-operator-controller-manager-dockercfg-j6rd9"
Mar 09 13:38:08 crc kubenswrapper[4764]: I0309 13:38:08.927488 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-5995f4446f-m58s9"
Mar 09 13:38:08 crc kubenswrapper[4764]: I0309 13:38:08.938864 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-webhook-server-cert"
Mar 09 13:38:08 crc kubenswrapper[4764]: I0309 13:38:08.939201 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-s9gg2"
Mar 09 13:38:08 crc kubenswrapper[4764]: I0309 13:38:08.946606 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dkvqv\" (UniqueName: \"kubernetes.io/projected/3da43711-be34-4189-b686-e8e9bc9e7265-kube-api-access-dkvqv\") pod \"horizon-operator-controller-manager-78bc7f9bd9-5xc2s\" (UID: \"3da43711-be34-4189-b686-e8e9bc9e7265\") " pod="openstack-operators/horizon-operator-controller-manager-78bc7f9bd9-5xc2s"
Mar 09 13:38:08 crc kubenswrapper[4764]: I0309 13:38:08.946765 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lkgj9\" (UniqueName: \"kubernetes.io/projected/725c0dd0-07d1-4a1c-b223-e8bec76cc7ff-kube-api-access-lkgj9\") pod \"designate-operator-controller-manager-5d87c9d997-cmtpc\" (UID: \"725c0dd0-07d1-4a1c-b223-e8bec76cc7ff\") " pod="openstack-operators/designate-operator-controller-manager-5d87c9d997-cmtpc"
Mar 09 13:38:08 crc kubenswrapper[4764]: I0309 13:38:08.946793 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-95kb4\" (UniqueName: \"kubernetes.io/projected/488ff419-d889-4778-96cf-a11006c49507-kube-api-access-95kb4\") pod \"glance-operator-controller-manager-64db6967f8-mjf6m\" (UID: \"488ff419-d889-4778-96cf-a11006c49507\") " pod="openstack-operators/glance-operator-controller-manager-64db6967f8-mjf6m"
Mar 09 13:38:08 crc kubenswrapper[4764]: I0309 13:38:08.946825 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2cqd9\" (UniqueName: \"kubernetes.io/projected/7295db10-1c36-4c17-bf1e-4c4a702c201b-kube-api-access-2cqd9\") pod \"heat-operator-controller-manager-cf99c678f-jnmbv\" (UID: \"7295db10-1c36-4c17-bf1e-4c4a702c201b\") " pod="openstack-operators/heat-operator-controller-manager-cf99c678f-jnmbv"
Mar 09 13:38:08 crc kubenswrapper[4764]: I0309 13:38:08.977054 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-6db6876945-82cg8"
Mar 09 13:38:08 crc kubenswrapper[4764]: I0309 13:38:08.983202 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-545456dc4-hvpbz"]
Mar 09 13:38:08 crc kubenswrapper[4764]: I0309 13:38:08.988920 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-55d77d7b5c-nppjq"
Mar 09 13:38:08 crc kubenswrapper[4764]: I0309 13:38:08.989522 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/manila-operator-controller-manager-7c7bcbc569-qhpvs"]
Mar 09 13:38:09 crc kubenswrapper[4764]: I0309 13:38:09.001556 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-7c7bcbc569-qhpvs"
Mar 09 13:38:09 crc kubenswrapper[4764]: I0309 13:38:09.009967 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"manila-operator-controller-manager-dockercfg-xj97p"
Mar 09 13:38:09 crc kubenswrapper[4764]: I0309 13:38:09.015344 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-7c7bcbc569-qhpvs"]
Mar 09 13:38:09 crc kubenswrapper[4764]: I0309 13:38:09.029692 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lkgj9\" (UniqueName: \"kubernetes.io/projected/725c0dd0-07d1-4a1c-b223-e8bec76cc7ff-kube-api-access-lkgj9\") pod \"designate-operator-controller-manager-5d87c9d997-cmtpc\" (UID: \"725c0dd0-07d1-4a1c-b223-e8bec76cc7ff\") " pod="openstack-operators/designate-operator-controller-manager-5d87c9d997-cmtpc"
Mar 09 13:38:09 crc kubenswrapper[4764]: I0309 13:38:09.039054 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-5d87c9d997-cmtpc"
Mar 09 13:38:09 crc kubenswrapper[4764]: I0309 13:38:09.049248 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-5995f4446f-m58s9"]
Mar 09 13:38:09 crc kubenswrapper[4764]: I0309 13:38:09.049993 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bcznz\" (UniqueName: \"kubernetes.io/projected/5cd7eb92-2fae-4978-a5e9-58fa87c63e84-kube-api-access-bcznz\") pod \"manila-operator-controller-manager-7c7bcbc569-qhpvs\" (UID: \"5cd7eb92-2fae-4978-a5e9-58fa87c63e84\") " pod="openstack-operators/manila-operator-controller-manager-7c7bcbc569-qhpvs"
Mar 09 13:38:09 crc kubenswrapper[4764]: I0309 13:38:09.050055 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-652xw\" (UniqueName: \"kubernetes.io/projected/5bd11a3c-79f3-4aa2-9d5a-78ec3c18cb31-kube-api-access-652xw\") pod \"ironic-operator-controller-manager-545456dc4-hvpbz\" (UID: \"5bd11a3c-79f3-4aa2-9d5a-78ec3c18cb31\") " pod="openstack-operators/ironic-operator-controller-manager-545456dc4-hvpbz"
Mar 09 13:38:09 crc kubenswrapper[4764]: I0309 13:38:09.050097 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2cqd9\" (UniqueName: \"kubernetes.io/projected/7295db10-1c36-4c17-bf1e-4c4a702c201b-kube-api-access-2cqd9\") pod \"heat-operator-controller-manager-cf99c678f-jnmbv\" (UID: \"7295db10-1c36-4c17-bf1e-4c4a702c201b\") " pod="openstack-operators/heat-operator-controller-manager-cf99c678f-jnmbv"
Mar 09 13:38:09 crc kubenswrapper[4764]: I0309 13:38:09.050124 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h7z99\" (UniqueName: \"kubernetes.io/projected/bfda7896-83e3-407c-9eb5-74fbc11104f0-kube-api-access-h7z99\") pod \"infra-operator-controller-manager-5995f4446f-m58s9\" (UID: \"bfda7896-83e3-407c-9eb5-74fbc11104f0\") " pod="openstack-operators/infra-operator-controller-manager-5995f4446f-m58s9"
Mar 09 13:38:09 crc kubenswrapper[4764]: I0309 13:38:09.050150 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bfda7896-83e3-407c-9eb5-74fbc11104f0-cert\") pod \"infra-operator-controller-manager-5995f4446f-m58s9\" (UID: \"bfda7896-83e3-407c-9eb5-74fbc11104f0\") " pod="openstack-operators/infra-operator-controller-manager-5995f4446f-m58s9"
Mar 09 13:38:09 crc kubenswrapper[4764]: I0309 13:38:09.050181 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dkvqv\" (UniqueName: \"kubernetes.io/projected/3da43711-be34-4189-b686-e8e9bc9e7265-kube-api-access-dkvqv\") pod \"horizon-operator-controller-manager-78bc7f9bd9-5xc2s\" (UID: \"3da43711-be34-4189-b686-e8e9bc9e7265\") " pod="openstack-operators/horizon-operator-controller-manager-78bc7f9bd9-5xc2s"
Mar 09 13:38:09 crc kubenswrapper[4764]: I0309 13:38:09.058967 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-95kb4\" (UniqueName: \"kubernetes.io/projected/488ff419-d889-4778-96cf-a11006c49507-kube-api-access-95kb4\") pod \"glance-operator-controller-manager-64db6967f8-mjf6m\" (UID: \"488ff419-d889-4778-96cf-a11006c49507\") " pod="openstack-operators/glance-operator-controller-manager-64db6967f8-mjf6m"
Mar 09 13:38:09 crc kubenswrapper[4764]: I0309 13:38:09.088526 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-7c789f89c6-wv2rp"]
Mar 09 13:38:09 crc kubenswrapper[4764]: I0309 13:38:09.089385 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-7c789f89c6-wv2rp"
Mar 09 13:38:09 crc kubenswrapper[4764]: I0309 13:38:09.098998 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-64db6967f8-mjf6m"
Mar 09 13:38:09 crc kubenswrapper[4764]: I0309 13:38:09.101622 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-7c789f89c6-wv2rp"]
Mar 09 13:38:09 crc kubenswrapper[4764]: I0309 13:38:09.121908 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dkvqv\" (UniqueName: \"kubernetes.io/projected/3da43711-be34-4189-b686-e8e9bc9e7265-kube-api-access-dkvqv\") pod \"horizon-operator-controller-manager-78bc7f9bd9-5xc2s\" (UID: \"3da43711-be34-4189-b686-e8e9bc9e7265\") " pod="openstack-operators/horizon-operator-controller-manager-78bc7f9bd9-5xc2s"
Mar 09 13:38:09 crc kubenswrapper[4764]: I0309 13:38:09.136405 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-zqsnk"
Mar 09 13:38:09 crc kubenswrapper[4764]: I0309 13:38:09.142590 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2cqd9\" (UniqueName: \"kubernetes.io/projected/7295db10-1c36-4c17-bf1e-4c4a702c201b-kube-api-access-2cqd9\") pod \"heat-operator-controller-manager-cf99c678f-jnmbv\" (UID: \"7295db10-1c36-4c17-bf1e-4c4a702c201b\") " pod="openstack-operators/heat-operator-controller-manager-cf99c678f-jnmbv"
Mar 09 13:38:09 crc kubenswrapper[4764]: I0309 13:38:09.152556 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bcznz\" (UniqueName: \"kubernetes.io/projected/5cd7eb92-2fae-4978-a5e9-58fa87c63e84-kube-api-access-bcznz\") pod \"manila-operator-controller-manager-7c7bcbc569-qhpvs\" (UID:
\"5cd7eb92-2fae-4978-a5e9-58fa87c63e84\") " pod="openstack-operators/manila-operator-controller-manager-7c7bcbc569-qhpvs" Mar 09 13:38:09 crc kubenswrapper[4764]: I0309 13:38:09.152640 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-652xw\" (UniqueName: \"kubernetes.io/projected/5bd11a3c-79f3-4aa2-9d5a-78ec3c18cb31-kube-api-access-652xw\") pod \"ironic-operator-controller-manager-545456dc4-hvpbz\" (UID: \"5bd11a3c-79f3-4aa2-9d5a-78ec3c18cb31\") " pod="openstack-operators/ironic-operator-controller-manager-545456dc4-hvpbz" Mar 09 13:38:09 crc kubenswrapper[4764]: I0309 13:38:09.152816 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h7z99\" (UniqueName: \"kubernetes.io/projected/bfda7896-83e3-407c-9eb5-74fbc11104f0-kube-api-access-h7z99\") pod \"infra-operator-controller-manager-5995f4446f-m58s9\" (UID: \"bfda7896-83e3-407c-9eb5-74fbc11104f0\") " pod="openstack-operators/infra-operator-controller-manager-5995f4446f-m58s9" Mar 09 13:38:09 crc kubenswrapper[4764]: I0309 13:38:09.152855 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-flbm2\" (UniqueName: \"kubernetes.io/projected/32eb5815-c566-4177-8b47-f756807d4a30-kube-api-access-flbm2\") pod \"keystone-operator-controller-manager-7c789f89c6-wv2rp\" (UID: \"32eb5815-c566-4177-8b47-f756807d4a30\") " pod="openstack-operators/keystone-operator-controller-manager-7c789f89c6-wv2rp" Mar 09 13:38:09 crc kubenswrapper[4764]: I0309 13:38:09.152895 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bfda7896-83e3-407c-9eb5-74fbc11104f0-cert\") pod \"infra-operator-controller-manager-5995f4446f-m58s9\" (UID: \"bfda7896-83e3-407c-9eb5-74fbc11104f0\") " pod="openstack-operators/infra-operator-controller-manager-5995f4446f-m58s9" Mar 09 13:38:09 crc kubenswrapper[4764]: E0309 
13:38:09.153080 4764 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 09 13:38:09 crc kubenswrapper[4764]: E0309 13:38:09.153144 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bfda7896-83e3-407c-9eb5-74fbc11104f0-cert podName:bfda7896-83e3-407c-9eb5-74fbc11104f0 nodeName:}" failed. No retries permitted until 2026-03-09 13:38:09.653123095 +0000 UTC m=+1044.903295003 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/bfda7896-83e3-407c-9eb5-74fbc11104f0-cert") pod "infra-operator-controller-manager-5995f4446f-m58s9" (UID: "bfda7896-83e3-407c-9eb5-74fbc11104f0") : secret "infra-operator-webhook-server-cert" not found Mar 09 13:38:09 crc kubenswrapper[4764]: I0309 13:38:09.170630 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-cf99c678f-jnmbv" Mar 09 13:38:09 crc kubenswrapper[4764]: I0309 13:38:09.201635 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-78bc7f9bd9-5xc2s" Mar 09 13:38:09 crc kubenswrapper[4764]: I0309 13:38:09.202170 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/neutron-operator-controller-manager-54688575f-cgv66"] Mar 09 13:38:09 crc kubenswrapper[4764]: I0309 13:38:09.203563 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-54688575f-cgv66" Mar 09 13:38:09 crc kubenswrapper[4764]: I0309 13:38:09.205947 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h7z99\" (UniqueName: \"kubernetes.io/projected/bfda7896-83e3-407c-9eb5-74fbc11104f0-kube-api-access-h7z99\") pod \"infra-operator-controller-manager-5995f4446f-m58s9\" (UID: \"bfda7896-83e3-407c-9eb5-74fbc11104f0\") " pod="openstack-operators/infra-operator-controller-manager-5995f4446f-m58s9" Mar 09 13:38:09 crc kubenswrapper[4764]: I0309 13:38:09.259431 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-652xw\" (UniqueName: \"kubernetes.io/projected/5bd11a3c-79f3-4aa2-9d5a-78ec3c18cb31-kube-api-access-652xw\") pod \"ironic-operator-controller-manager-545456dc4-hvpbz\" (UID: \"5bd11a3c-79f3-4aa2-9d5a-78ec3c18cb31\") " pod="openstack-operators/ironic-operator-controller-manager-545456dc4-hvpbz" Mar 09 13:38:09 crc kubenswrapper[4764]: I0309 13:38:09.260306 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/nova-operator-controller-manager-74b6b5dc96-dm7rn"] Mar 09 13:38:09 crc kubenswrapper[4764]: I0309 13:38:09.261543 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-74b6b5dc96-dm7rn" Mar 09 13:38:09 crc kubenswrapper[4764]: I0309 13:38:09.262586 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-flbm2\" (UniqueName: \"kubernetes.io/projected/32eb5815-c566-4177-8b47-f756807d4a30-kube-api-access-flbm2\") pod \"keystone-operator-controller-manager-7c789f89c6-wv2rp\" (UID: \"32eb5815-c566-4177-8b47-f756807d4a30\") " pod="openstack-operators/keystone-operator-controller-manager-7c789f89c6-wv2rp" Mar 09 13:38:09 crc kubenswrapper[4764]: I0309 13:38:09.262712 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5rf24\" (UniqueName: \"kubernetes.io/projected/2ddf1e89-9c89-4052-aa1b-6fb84438b86d-kube-api-access-5rf24\") pod \"neutron-operator-controller-manager-54688575f-cgv66\" (UID: \"2ddf1e89-9c89-4052-aa1b-6fb84438b86d\") " pod="openstack-operators/neutron-operator-controller-manager-54688575f-cgv66" Mar 09 13:38:09 crc kubenswrapper[4764]: I0309 13:38:09.268276 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"nova-operator-controller-manager-dockercfg-bhzwg" Mar 09 13:38:09 crc kubenswrapper[4764]: I0309 13:38:09.269503 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-545456dc4-hvpbz" Mar 09 13:38:09 crc kubenswrapper[4764]: I0309 13:38:09.272263 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-7b6bfb6475-vkns5"] Mar 09 13:38:09 crc kubenswrapper[4764]: I0309 13:38:09.273306 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-7b6bfb6475-vkns5" Mar 09 13:38:09 crc kubenswrapper[4764]: I0309 13:38:09.273706 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"neutron-operator-controller-manager-dockercfg-jd85k" Mar 09 13:38:09 crc kubenswrapper[4764]: I0309 13:38:09.274364 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bcznz\" (UniqueName: \"kubernetes.io/projected/5cd7eb92-2fae-4978-a5e9-58fa87c63e84-kube-api-access-bcznz\") pod \"manila-operator-controller-manager-7c7bcbc569-qhpvs\" (UID: \"5cd7eb92-2fae-4978-a5e9-58fa87c63e84\") " pod="openstack-operators/manila-operator-controller-manager-7c7bcbc569-qhpvs" Mar 09 13:38:09 crc kubenswrapper[4764]: I0309 13:38:09.278074 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-68h76" Mar 09 13:38:09 crc kubenswrapper[4764]: I0309 13:38:09.289726 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-54688575f-cgv66"] Mar 09 13:38:09 crc kubenswrapper[4764]: I0309 13:38:09.307899 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-74b6b5dc96-dm7rn"] Mar 09 13:38:09 crc kubenswrapper[4764]: I0309 13:38:09.325758 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-7b6bfb6475-vkns5"] Mar 09 13:38:09 crc kubenswrapper[4764]: I0309 13:38:09.328345 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-flbm2\" (UniqueName: \"kubernetes.io/projected/32eb5815-c566-4177-8b47-f756807d4a30-kube-api-access-flbm2\") pod \"keystone-operator-controller-manager-7c789f89c6-wv2rp\" (UID: \"32eb5815-c566-4177-8b47-f756807d4a30\") " 
pod="openstack-operators/keystone-operator-controller-manager-7c789f89c6-wv2rp" Mar 09 13:38:09 crc kubenswrapper[4764]: I0309 13:38:09.363811 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5rf24\" (UniqueName: \"kubernetes.io/projected/2ddf1e89-9c89-4052-aa1b-6fb84438b86d-kube-api-access-5rf24\") pod \"neutron-operator-controller-manager-54688575f-cgv66\" (UID: \"2ddf1e89-9c89-4052-aa1b-6fb84438b86d\") " pod="openstack-operators/neutron-operator-controller-manager-54688575f-cgv66" Mar 09 13:38:09 crc kubenswrapper[4764]: I0309 13:38:09.364868 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d6jd4\" (UniqueName: \"kubernetes.io/projected/26535a82-8d70-4623-b2b4-7dd1546d48d6-kube-api-access-d6jd4\") pod \"nova-operator-controller-manager-74b6b5dc96-dm7rn\" (UID: \"26535a82-8d70-4623-b2b4-7dd1546d48d6\") " pod="openstack-operators/nova-operator-controller-manager-74b6b5dc96-dm7rn" Mar 09 13:38:09 crc kubenswrapper[4764]: I0309 13:38:09.364957 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cvk8m\" (UniqueName: \"kubernetes.io/projected/da851ddd-2b27-45f0-b149-de32ae21ad91-kube-api-access-cvk8m\") pod \"mariadb-operator-controller-manager-7b6bfb6475-vkns5\" (UID: \"da851ddd-2b27-45f0-b149-de32ae21ad91\") " pod="openstack-operators/mariadb-operator-controller-manager-7b6bfb6475-vkns5" Mar 09 13:38:09 crc kubenswrapper[4764]: I0309 13:38:09.398416 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5rf24\" (UniqueName: \"kubernetes.io/projected/2ddf1e89-9c89-4052-aa1b-6fb84438b86d-kube-api-access-5rf24\") pod \"neutron-operator-controller-manager-54688575f-cgv66\" (UID: \"2ddf1e89-9c89-4052-aa1b-6fb84438b86d\") " pod="openstack-operators/neutron-operator-controller-manager-54688575f-cgv66" Mar 09 13:38:09 crc kubenswrapper[4764]: I0309 
13:38:09.398863 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/octavia-operator-controller-manager-5d86c7ddb7-gv2sm"] Mar 09 13:38:09 crc kubenswrapper[4764]: I0309 13:38:09.400206 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-5d86c7ddb7-gv2sm" Mar 09 13:38:09 crc kubenswrapper[4764]: I0309 13:38:09.413443 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-5d86c7ddb7-gv2sm"] Mar 09 13:38:09 crc kubenswrapper[4764]: I0309 13:38:09.413859 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-7c7bcbc569-qhpvs" Mar 09 13:38:09 crc kubenswrapper[4764]: I0309 13:38:09.424155 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"octavia-operator-controller-manager-dockercfg-92kgv" Mar 09 13:38:09 crc kubenswrapper[4764]: I0309 13:38:09.437452 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ovn-operator-controller-manager-75684d597f-jfgzw"] Mar 09 13:38:09 crc kubenswrapper[4764]: I0309 13:38:09.438576 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-75684d597f-jfgzw" Mar 09 13:38:09 crc kubenswrapper[4764]: I0309 13:38:09.449410 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ovn-operator-controller-manager-dockercfg-bpf6g" Mar 09 13:38:09 crc kubenswrapper[4764]: I0309 13:38:09.453082 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9c94wvm"] Mar 09 13:38:09 crc kubenswrapper[4764]: I0309 13:38:09.461117 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9c94wvm" Mar 09 13:38:09 crc kubenswrapper[4764]: I0309 13:38:09.466761 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cvk8m\" (UniqueName: \"kubernetes.io/projected/da851ddd-2b27-45f0-b149-de32ae21ad91-kube-api-access-cvk8m\") pod \"mariadb-operator-controller-manager-7b6bfb6475-vkns5\" (UID: \"da851ddd-2b27-45f0-b149-de32ae21ad91\") " pod="openstack-operators/mariadb-operator-controller-manager-7b6bfb6475-vkns5" Mar 09 13:38:09 crc kubenswrapper[4764]: I0309 13:38:09.466806 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lz7h2\" (UniqueName: \"kubernetes.io/projected/615473d3-072e-4685-8f32-73a44badf1e2-kube-api-access-lz7h2\") pod \"ovn-operator-controller-manager-75684d597f-jfgzw\" (UID: \"615473d3-072e-4685-8f32-73a44badf1e2\") " pod="openstack-operators/ovn-operator-controller-manager-75684d597f-jfgzw" Mar 09 13:38:09 crc kubenswrapper[4764]: I0309 13:38:09.466895 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d6jd4\" (UniqueName: \"kubernetes.io/projected/26535a82-8d70-4623-b2b4-7dd1546d48d6-kube-api-access-d6jd4\") pod \"nova-operator-controller-manager-74b6b5dc96-dm7rn\" (UID: \"26535a82-8d70-4623-b2b4-7dd1546d48d6\") " pod="openstack-operators/nova-operator-controller-manager-74b6b5dc96-dm7rn" Mar 09 13:38:09 crc kubenswrapper[4764]: I0309 13:38:09.466935 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ms9zj\" (UniqueName: \"kubernetes.io/projected/b54e2237-603a-44ad-a129-04736cf749b2-kube-api-access-ms9zj\") pod \"octavia-operator-controller-manager-5d86c7ddb7-gv2sm\" (UID: \"b54e2237-603a-44ad-a129-04736cf749b2\") " pod="openstack-operators/octavia-operator-controller-manager-5d86c7ddb7-gv2sm" Mar 09 13:38:09 
crc kubenswrapper[4764]: I0309 13:38:09.472406 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-75684d597f-jfgzw"] Mar 09 13:38:09 crc kubenswrapper[4764]: I0309 13:38:09.485755 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/placement-operator-controller-manager-648564c9fc-8ms5w"] Mar 09 13:38:09 crc kubenswrapper[4764]: I0309 13:38:09.489017 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert" Mar 09 13:38:09 crc kubenswrapper[4764]: I0309 13:38:09.489224 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-vftv5" Mar 09 13:38:09 crc kubenswrapper[4764]: I0309 13:38:09.490876 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-648564c9fc-8ms5w" Mar 09 13:38:09 crc kubenswrapper[4764]: I0309 13:38:09.497142 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"placement-operator-controller-manager-dockercfg-bxjrz" Mar 09 13:38:09 crc kubenswrapper[4764]: I0309 13:38:09.509738 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-9b9ff9f4d-bf8w8"] Mar 09 13:38:09 crc kubenswrapper[4764]: I0309 13:38:09.512656 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-9b9ff9f4d-bf8w8" Mar 09 13:38:09 crc kubenswrapper[4764]: I0309 13:38:09.517723 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cvk8m\" (UniqueName: \"kubernetes.io/projected/da851ddd-2b27-45f0-b149-de32ae21ad91-kube-api-access-cvk8m\") pod \"mariadb-operator-controller-manager-7b6bfb6475-vkns5\" (UID: \"da851ddd-2b27-45f0-b149-de32ae21ad91\") " pod="openstack-operators/mariadb-operator-controller-manager-7b6bfb6475-vkns5" Mar 09 13:38:09 crc kubenswrapper[4764]: I0309 13:38:09.518208 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-rgfgd" Mar 09 13:38:09 crc kubenswrapper[4764]: I0309 13:38:09.543678 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-5fdb694969-4cpsz"] Mar 09 13:38:09 crc kubenswrapper[4764]: I0309 13:38:09.545003 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-5fdb694969-4cpsz" Mar 09 13:38:09 crc kubenswrapper[4764]: I0309 13:38:09.555455 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9c94wvm"] Mar 09 13:38:09 crc kubenswrapper[4764]: I0309 13:38:09.566124 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"telemetry-operator-controller-manager-dockercfg-ftds8" Mar 09 13:38:09 crc kubenswrapper[4764]: I0309 13:38:09.566275 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d6jd4\" (UniqueName: \"kubernetes.io/projected/26535a82-8d70-4623-b2b4-7dd1546d48d6-kube-api-access-d6jd4\") pod \"nova-operator-controller-manager-74b6b5dc96-dm7rn\" (UID: \"26535a82-8d70-4623-b2b4-7dd1546d48d6\") " pod="openstack-operators/nova-operator-controller-manager-74b6b5dc96-dm7rn" Mar 09 13:38:09 crc kubenswrapper[4764]: I0309 13:38:09.569142 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lz7h2\" (UniqueName: \"kubernetes.io/projected/615473d3-072e-4685-8f32-73a44badf1e2-kube-api-access-lz7h2\") pod \"ovn-operator-controller-manager-75684d597f-jfgzw\" (UID: \"615473d3-072e-4685-8f32-73a44badf1e2\") " pod="openstack-operators/ovn-operator-controller-manager-75684d597f-jfgzw" Mar 09 13:38:09 crc kubenswrapper[4764]: I0309 13:38:09.569284 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-blgqv\" (UniqueName: \"kubernetes.io/projected/c6a51c61-770b-4b46-9dd3-1f61e6f5ebc8-kube-api-access-blgqv\") pod \"telemetry-operator-controller-manager-5fdb694969-4cpsz\" (UID: \"c6a51c61-770b-4b46-9dd3-1f61e6f5ebc8\") " pod="openstack-operators/telemetry-operator-controller-manager-5fdb694969-4cpsz" Mar 09 13:38:09 crc kubenswrapper[4764]: I0309 13:38:09.569360 4764 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xdwmb\" (UniqueName: \"kubernetes.io/projected/47bd7072-a414-4ce8-800b-753b7054be23-kube-api-access-xdwmb\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9c94wvm\" (UID: \"47bd7072-a414-4ce8-800b-753b7054be23\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9c94wvm" Mar 09 13:38:09 crc kubenswrapper[4764]: I0309 13:38:09.569385 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fhwj4\" (UniqueName: \"kubernetes.io/projected/c44e76b2-0de9-4a5b-93ee-536c6300157f-kube-api-access-fhwj4\") pod \"placement-operator-controller-manager-648564c9fc-8ms5w\" (UID: \"c44e76b2-0de9-4a5b-93ee-536c6300157f\") " pod="openstack-operators/placement-operator-controller-manager-648564c9fc-8ms5w" Mar 09 13:38:09 crc kubenswrapper[4764]: I0309 13:38:09.569491 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ms9zj\" (UniqueName: \"kubernetes.io/projected/b54e2237-603a-44ad-a129-04736cf749b2-kube-api-access-ms9zj\") pod \"octavia-operator-controller-manager-5d86c7ddb7-gv2sm\" (UID: \"b54e2237-603a-44ad-a129-04736cf749b2\") " pod="openstack-operators/octavia-operator-controller-manager-5d86c7ddb7-gv2sm" Mar 09 13:38:09 crc kubenswrapper[4764]: I0309 13:38:09.569556 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/47bd7072-a414-4ce8-800b-753b7054be23-cert\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9c94wvm\" (UID: \"47bd7072-a414-4ce8-800b-753b7054be23\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9c94wvm" Mar 09 13:38:09 crc kubenswrapper[4764]: I0309 13:38:09.569599 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"kube-api-access-rhthv\" (UniqueName: \"kubernetes.io/projected/003210d3-5572-44bd-aae5-d5e24aac16a5-kube-api-access-rhthv\") pod \"swift-operator-controller-manager-9b9ff9f4d-bf8w8\" (UID: \"003210d3-5572-44bd-aae5-d5e24aac16a5\") " pod="openstack-operators/swift-operator-controller-manager-9b9ff9f4d-bf8w8" Mar 09 13:38:09 crc kubenswrapper[4764]: I0309 13:38:09.569501 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-7c789f89c6-wv2rp" Mar 09 13:38:09 crc kubenswrapper[4764]: I0309 13:38:09.583193 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-648564c9fc-8ms5w"] Mar 09 13:38:09 crc kubenswrapper[4764]: I0309 13:38:09.583228 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-9b9ff9f4d-bf8w8"] Mar 09 13:38:09 crc kubenswrapper[4764]: I0309 13:38:09.598416 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lz7h2\" (UniqueName: \"kubernetes.io/projected/615473d3-072e-4685-8f32-73a44badf1e2-kube-api-access-lz7h2\") pod \"ovn-operator-controller-manager-75684d597f-jfgzw\" (UID: \"615473d3-072e-4685-8f32-73a44badf1e2\") " pod="openstack-operators/ovn-operator-controller-manager-75684d597f-jfgzw" Mar 09 13:38:09 crc kubenswrapper[4764]: I0309 13:38:09.601921 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-5fdb694969-4cpsz"] Mar 09 13:38:09 crc kubenswrapper[4764]: I0309 13:38:09.605037 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ms9zj\" (UniqueName: \"kubernetes.io/projected/b54e2237-603a-44ad-a129-04736cf749b2-kube-api-access-ms9zj\") pod \"octavia-operator-controller-manager-5d86c7ddb7-gv2sm\" (UID: \"b54e2237-603a-44ad-a129-04736cf749b2\") " 
pod="openstack-operators/octavia-operator-controller-manager-5d86c7ddb7-gv2sm" Mar 09 13:38:09 crc kubenswrapper[4764]: I0309 13:38:09.629357 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/test-operator-controller-manager-55b5ff4dbb-d65xp"] Mar 09 13:38:09 crc kubenswrapper[4764]: I0309 13:38:09.634218 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-55b5ff4dbb-d65xp" Mar 09 13:38:09 crc kubenswrapper[4764]: I0309 13:38:09.642393 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"test-operator-controller-manager-dockercfg-qrhxx" Mar 09 13:38:09 crc kubenswrapper[4764]: I0309 13:38:09.658018 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-54688575f-cgv66" Mar 09 13:38:09 crc kubenswrapper[4764]: I0309 13:38:09.660445 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-74b6b5dc96-dm7rn" Mar 09 13:38:09 crc kubenswrapper[4764]: I0309 13:38:09.670751 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-controller-manager-bccc79885-7f8nr"] Mar 09 13:38:09 crc kubenswrapper[4764]: I0309 13:38:09.671564 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-bccc79885-7f8nr" Mar 09 13:38:09 crc kubenswrapper[4764]: I0309 13:38:09.671989 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-blgqv\" (UniqueName: \"kubernetes.io/projected/c6a51c61-770b-4b46-9dd3-1f61e6f5ebc8-kube-api-access-blgqv\") pod \"telemetry-operator-controller-manager-5fdb694969-4cpsz\" (UID: \"c6a51c61-770b-4b46-9dd3-1f61e6f5ebc8\") " pod="openstack-operators/telemetry-operator-controller-manager-5fdb694969-4cpsz" Mar 09 13:38:09 crc kubenswrapper[4764]: I0309 13:38:09.672019 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xdwmb\" (UniqueName: \"kubernetes.io/projected/47bd7072-a414-4ce8-800b-753b7054be23-kube-api-access-xdwmb\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9c94wvm\" (UID: \"47bd7072-a414-4ce8-800b-753b7054be23\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9c94wvm" Mar 09 13:38:09 crc kubenswrapper[4764]: I0309 13:38:09.672038 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fhwj4\" (UniqueName: \"kubernetes.io/projected/c44e76b2-0de9-4a5b-93ee-536c6300157f-kube-api-access-fhwj4\") pod \"placement-operator-controller-manager-648564c9fc-8ms5w\" (UID: \"c44e76b2-0de9-4a5b-93ee-536c6300157f\") " pod="openstack-operators/placement-operator-controller-manager-648564c9fc-8ms5w" Mar 09 13:38:09 crc kubenswrapper[4764]: I0309 13:38:09.672091 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bfda7896-83e3-407c-9eb5-74fbc11104f0-cert\") pod \"infra-operator-controller-manager-5995f4446f-m58s9\" (UID: \"bfda7896-83e3-407c-9eb5-74fbc11104f0\") " pod="openstack-operators/infra-operator-controller-manager-5995f4446f-m58s9" Mar 09 13:38:09 crc kubenswrapper[4764]: I0309 13:38:09.672143 
4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/47bd7072-a414-4ce8-800b-753b7054be23-cert\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9c94wvm\" (UID: \"47bd7072-a414-4ce8-800b-753b7054be23\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9c94wvm" Mar 09 13:38:09 crc kubenswrapper[4764]: I0309 13:38:09.672172 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rhthv\" (UniqueName: \"kubernetes.io/projected/003210d3-5572-44bd-aae5-d5e24aac16a5-kube-api-access-rhthv\") pod \"swift-operator-controller-manager-9b9ff9f4d-bf8w8\" (UID: \"003210d3-5572-44bd-aae5-d5e24aac16a5\") " pod="openstack-operators/swift-operator-controller-manager-9b9ff9f4d-bf8w8" Mar 09 13:38:09 crc kubenswrapper[4764]: I0309 13:38:09.672211 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gsvnd\" (UniqueName: \"kubernetes.io/projected/867908a2-f085-4f3d-b569-84c915f730b1-kube-api-access-gsvnd\") pod \"test-operator-controller-manager-55b5ff4dbb-d65xp\" (UID: \"867908a2-f085-4f3d-b569-84c915f730b1\") " pod="openstack-operators/test-operator-controller-manager-55b5ff4dbb-d65xp" Mar 09 13:38:09 crc kubenswrapper[4764]: E0309 13:38:09.672481 4764 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 09 13:38:09 crc kubenswrapper[4764]: E0309 13:38:09.672528 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bfda7896-83e3-407c-9eb5-74fbc11104f0-cert podName:bfda7896-83e3-407c-9eb5-74fbc11104f0 nodeName:}" failed. No retries permitted until 2026-03-09 13:38:10.67251234 +0000 UTC m=+1045.922684248 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/bfda7896-83e3-407c-9eb5-74fbc11104f0-cert") pod "infra-operator-controller-manager-5995f4446f-m58s9" (UID: "bfda7896-83e3-407c-9eb5-74fbc11104f0") : secret "infra-operator-webhook-server-cert" not found Mar 09 13:38:09 crc kubenswrapper[4764]: E0309 13:38:09.673001 4764 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 09 13:38:09 crc kubenswrapper[4764]: E0309 13:38:09.673030 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/47bd7072-a414-4ce8-800b-753b7054be23-cert podName:47bd7072-a414-4ce8-800b-753b7054be23 nodeName:}" failed. No retries permitted until 2026-03-09 13:38:10.173020753 +0000 UTC m=+1045.423192671 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/47bd7072-a414-4ce8-800b-753b7054be23-cert") pod "openstack-baremetal-operator-controller-manager-7c6767dc9c94wvm" (UID: "47bd7072-a414-4ce8-800b-753b7054be23") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 09 13:38:09 crc kubenswrapper[4764]: I0309 13:38:09.675539 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-bccc79885-7f8nr"] Mar 09 13:38:09 crc kubenswrapper[4764]: I0309 13:38:09.675719 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-7b6bfb6475-vkns5" Mar 09 13:38:09 crc kubenswrapper[4764]: I0309 13:38:09.682179 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-controller-manager-dockercfg-lg2pt" Mar 09 13:38:09 crc kubenswrapper[4764]: I0309 13:38:09.695215 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-55b5ff4dbb-d65xp"] Mar 09 13:38:09 crc kubenswrapper[4764]: I0309 13:38:09.705125 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-blgqv\" (UniqueName: \"kubernetes.io/projected/c6a51c61-770b-4b46-9dd3-1f61e6f5ebc8-kube-api-access-blgqv\") pod \"telemetry-operator-controller-manager-5fdb694969-4cpsz\" (UID: \"c6a51c61-770b-4b46-9dd3-1f61e6f5ebc8\") " pod="openstack-operators/telemetry-operator-controller-manager-5fdb694969-4cpsz" Mar 09 13:38:09 crc kubenswrapper[4764]: I0309 13:38:09.716864 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xdwmb\" (UniqueName: \"kubernetes.io/projected/47bd7072-a414-4ce8-800b-753b7054be23-kube-api-access-xdwmb\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9c94wvm\" (UID: \"47bd7072-a414-4ce8-800b-753b7054be23\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9c94wvm" Mar 09 13:38:09 crc kubenswrapper[4764]: I0309 13:38:09.718370 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fhwj4\" (UniqueName: \"kubernetes.io/projected/c44e76b2-0de9-4a5b-93ee-536c6300157f-kube-api-access-fhwj4\") pod \"placement-operator-controller-manager-648564c9fc-8ms5w\" (UID: \"c44e76b2-0de9-4a5b-93ee-536c6300157f\") " pod="openstack-operators/placement-operator-controller-manager-648564c9fc-8ms5w" Mar 09 13:38:09 crc kubenswrapper[4764]: I0309 13:38:09.724191 4764 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-rhthv\" (UniqueName: \"kubernetes.io/projected/003210d3-5572-44bd-aae5-d5e24aac16a5-kube-api-access-rhthv\") pod \"swift-operator-controller-manager-9b9ff9f4d-bf8w8\" (UID: \"003210d3-5572-44bd-aae5-d5e24aac16a5\") " pod="openstack-operators/swift-operator-controller-manager-9b9ff9f4d-bf8w8" Mar 09 13:38:09 crc kubenswrapper[4764]: I0309 13:38:09.748395 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-manager-6746d697b-lr6nx"] Mar 09 13:38:09 crc kubenswrapper[4764]: I0309 13:38:09.750482 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-6746d697b-lr6nx" Mar 09 13:38:09 crc kubenswrapper[4764]: I0309 13:38:09.769049 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-5fdb694969-4cpsz" Mar 09 13:38:09 crc kubenswrapper[4764]: I0309 13:38:09.779703 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"metrics-server-cert" Mar 09 13:38:09 crc kubenswrapper[4764]: I0309 13:38:09.780088 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-p7k4x" Mar 09 13:38:09 crc kubenswrapper[4764]: I0309 13:38:09.780305 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert" Mar 09 13:38:09 crc kubenswrapper[4764]: I0309 13:38:09.783142 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gsvnd\" (UniqueName: \"kubernetes.io/projected/867908a2-f085-4f3d-b569-84c915f730b1-kube-api-access-gsvnd\") pod \"test-operator-controller-manager-55b5ff4dbb-d65xp\" (UID: \"867908a2-f085-4f3d-b569-84c915f730b1\") " pod="openstack-operators/test-operator-controller-manager-55b5ff4dbb-d65xp" 
Mar 09 13:38:09 crc kubenswrapper[4764]: I0309 13:38:09.822447 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-6746d697b-lr6nx"] Mar 09 13:38:09 crc kubenswrapper[4764]: I0309 13:38:09.831141 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gsvnd\" (UniqueName: \"kubernetes.io/projected/867908a2-f085-4f3d-b569-84c915f730b1-kube-api-access-gsvnd\") pod \"test-operator-controller-manager-55b5ff4dbb-d65xp\" (UID: \"867908a2-f085-4f3d-b569-84c915f730b1\") " pod="openstack-operators/test-operator-controller-manager-55b5ff4dbb-d65xp" Mar 09 13:38:09 crc kubenswrapper[4764]: I0309 13:38:09.848334 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-6v2sq"] Mar 09 13:38:09 crc kubenswrapper[4764]: I0309 13:38:09.850527 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-6v2sq" Mar 09 13:38:09 crc kubenswrapper[4764]: I0309 13:38:09.857031 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-6v2sq"] Mar 09 13:38:09 crc kubenswrapper[4764]: I0309 13:38:09.860071 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-5d86c7ddb7-gv2sm" Mar 09 13:38:09 crc kubenswrapper[4764]: I0309 13:38:09.860877 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-controller-manager-dockercfg-w5dcl" Mar 09 13:38:09 crc kubenswrapper[4764]: I0309 13:38:09.880696 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/test-operator-controller-manager-55b5ff4dbb-d65xp" Mar 09 13:38:09 crc kubenswrapper[4764]: I0309 13:38:09.884276 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/e11f44d8-58a5-4fc7-b05b-e2e688647d01-webhook-certs\") pod \"openstack-operator-controller-manager-6746d697b-lr6nx\" (UID: \"e11f44d8-58a5-4fc7-b05b-e2e688647d01\") " pod="openstack-operators/openstack-operator-controller-manager-6746d697b-lr6nx" Mar 09 13:38:09 crc kubenswrapper[4764]: I0309 13:38:09.884343 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6vpnw\" (UniqueName: \"kubernetes.io/projected/e11f44d8-58a5-4fc7-b05b-e2e688647d01-kube-api-access-6vpnw\") pod \"openstack-operator-controller-manager-6746d697b-lr6nx\" (UID: \"e11f44d8-58a5-4fc7-b05b-e2e688647d01\") " pod="openstack-operators/openstack-operator-controller-manager-6746d697b-lr6nx" Mar 09 13:38:09 crc kubenswrapper[4764]: I0309 13:38:09.884372 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h5nrw\" (UniqueName: \"kubernetes.io/projected/f705ec78-e960-4200-b5a6-f3d4310f1bd5-kube-api-access-h5nrw\") pod \"watcher-operator-controller-manager-bccc79885-7f8nr\" (UID: \"f705ec78-e960-4200-b5a6-f3d4310f1bd5\") " pod="openstack-operators/watcher-operator-controller-manager-bccc79885-7f8nr" Mar 09 13:38:09 crc kubenswrapper[4764]: I0309 13:38:09.884754 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e11f44d8-58a5-4fc7-b05b-e2e688647d01-metrics-certs\") pod \"openstack-operator-controller-manager-6746d697b-lr6nx\" (UID: \"e11f44d8-58a5-4fc7-b05b-e2e688647d01\") " pod="openstack-operators/openstack-operator-controller-manager-6746d697b-lr6nx" Mar 09 
13:38:09 crc kubenswrapper[4764]: I0309 13:38:09.902472 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-75684d597f-jfgzw" Mar 09 13:38:09 crc kubenswrapper[4764]: I0309 13:38:09.940180 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-6db6876945-82cg8"] Mar 09 13:38:09 crc kubenswrapper[4764]: I0309 13:38:09.974020 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-648564c9fc-8ms5w" Mar 09 13:38:09 crc kubenswrapper[4764]: W0309 13:38:09.986119 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode220a3f1_4dbe_4ee6_9b19_26985fa998cf.slice/crio-436d33e370c3500040ecc542247302bce24a8a00ee337b46e2f406a44bdaa218 WatchSource:0}: Error finding container 436d33e370c3500040ecc542247302bce24a8a00ee337b46e2f406a44bdaa218: Status 404 returned error can't find the container with id 436d33e370c3500040ecc542247302bce24a8a00ee337b46e2f406a44bdaa218 Mar 09 13:38:09 crc kubenswrapper[4764]: I0309 13:38:09.987017 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kb6g7\" (UniqueName: \"kubernetes.io/projected/01ea99aa-eb21-4799-9557-42c3fb55945a-kube-api-access-kb6g7\") pod \"rabbitmq-cluster-operator-manager-668c99d594-6v2sq\" (UID: \"01ea99aa-eb21-4799-9557-42c3fb55945a\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-6v2sq" Mar 09 13:38:09 crc kubenswrapper[4764]: I0309 13:38:09.987068 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/e11f44d8-58a5-4fc7-b05b-e2e688647d01-webhook-certs\") pod \"openstack-operator-controller-manager-6746d697b-lr6nx\" (UID: 
\"e11f44d8-58a5-4fc7-b05b-e2e688647d01\") " pod="openstack-operators/openstack-operator-controller-manager-6746d697b-lr6nx" Mar 09 13:38:09 crc kubenswrapper[4764]: I0309 13:38:09.987102 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6vpnw\" (UniqueName: \"kubernetes.io/projected/e11f44d8-58a5-4fc7-b05b-e2e688647d01-kube-api-access-6vpnw\") pod \"openstack-operator-controller-manager-6746d697b-lr6nx\" (UID: \"e11f44d8-58a5-4fc7-b05b-e2e688647d01\") " pod="openstack-operators/openstack-operator-controller-manager-6746d697b-lr6nx" Mar 09 13:38:09 crc kubenswrapper[4764]: I0309 13:38:09.987123 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h5nrw\" (UniqueName: \"kubernetes.io/projected/f705ec78-e960-4200-b5a6-f3d4310f1bd5-kube-api-access-h5nrw\") pod \"watcher-operator-controller-manager-bccc79885-7f8nr\" (UID: \"f705ec78-e960-4200-b5a6-f3d4310f1bd5\") " pod="openstack-operators/watcher-operator-controller-manager-bccc79885-7f8nr" Mar 09 13:38:09 crc kubenswrapper[4764]: I0309 13:38:09.987199 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e11f44d8-58a5-4fc7-b05b-e2e688647d01-metrics-certs\") pod \"openstack-operator-controller-manager-6746d697b-lr6nx\" (UID: \"e11f44d8-58a5-4fc7-b05b-e2e688647d01\") " pod="openstack-operators/openstack-operator-controller-manager-6746d697b-lr6nx" Mar 09 13:38:09 crc kubenswrapper[4764]: E0309 13:38:09.989428 4764 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 09 13:38:09 crc kubenswrapper[4764]: E0309 13:38:09.989497 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e11f44d8-58a5-4fc7-b05b-e2e688647d01-webhook-certs podName:e11f44d8-58a5-4fc7-b05b-e2e688647d01 nodeName:}" failed. 
No retries permitted until 2026-03-09 13:38:10.489471456 +0000 UTC m=+1045.739643364 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/e11f44d8-58a5-4fc7-b05b-e2e688647d01-webhook-certs") pod "openstack-operator-controller-manager-6746d697b-lr6nx" (UID: "e11f44d8-58a5-4fc7-b05b-e2e688647d01") : secret "webhook-server-cert" not found Mar 09 13:38:09 crc kubenswrapper[4764]: E0309 13:38:09.990160 4764 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 09 13:38:09 crc kubenswrapper[4764]: E0309 13:38:09.990187 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e11f44d8-58a5-4fc7-b05b-e2e688647d01-metrics-certs podName:e11f44d8-58a5-4fc7-b05b-e2e688647d01 nodeName:}" failed. No retries permitted until 2026-03-09 13:38:10.490178754 +0000 UTC m=+1045.740350662 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e11f44d8-58a5-4fc7-b05b-e2e688647d01-metrics-certs") pod "openstack-operator-controller-manager-6746d697b-lr6nx" (UID: "e11f44d8-58a5-4fc7-b05b-e2e688647d01") : secret "metrics-server-cert" not found Mar 09 13:38:09 crc kubenswrapper[4764]: I0309 13:38:09.999154 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-9b9ff9f4d-bf8w8" Mar 09 13:38:10 crc kubenswrapper[4764]: I0309 13:38:10.017354 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6vpnw\" (UniqueName: \"kubernetes.io/projected/e11f44d8-58a5-4fc7-b05b-e2e688647d01-kube-api-access-6vpnw\") pod \"openstack-operator-controller-manager-6746d697b-lr6nx\" (UID: \"e11f44d8-58a5-4fc7-b05b-e2e688647d01\") " pod="openstack-operators/openstack-operator-controller-manager-6746d697b-lr6nx" Mar 09 13:38:10 crc kubenswrapper[4764]: I0309 13:38:10.024362 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h5nrw\" (UniqueName: \"kubernetes.io/projected/f705ec78-e960-4200-b5a6-f3d4310f1bd5-kube-api-access-h5nrw\") pod \"watcher-operator-controller-manager-bccc79885-7f8nr\" (UID: \"f705ec78-e960-4200-b5a6-f3d4310f1bd5\") " pod="openstack-operators/watcher-operator-controller-manager-bccc79885-7f8nr" Mar 09 13:38:10 crc kubenswrapper[4764]: I0309 13:38:10.089879 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kb6g7\" (UniqueName: \"kubernetes.io/projected/01ea99aa-eb21-4799-9557-42c3fb55945a-kube-api-access-kb6g7\") pod \"rabbitmq-cluster-operator-manager-668c99d594-6v2sq\" (UID: \"01ea99aa-eb21-4799-9557-42c3fb55945a\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-6v2sq" Mar 09 13:38:10 crc kubenswrapper[4764]: I0309 13:38:10.116259 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-55d77d7b5c-nppjq"] Mar 09 13:38:10 crc kubenswrapper[4764]: I0309 13:38:10.117205 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kb6g7\" (UniqueName: \"kubernetes.io/projected/01ea99aa-eb21-4799-9557-42c3fb55945a-kube-api-access-kb6g7\") pod \"rabbitmq-cluster-operator-manager-668c99d594-6v2sq\" (UID: 
\"01ea99aa-eb21-4799-9557-42c3fb55945a\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-6v2sq" Mar 09 13:38:10 crc kubenswrapper[4764]: I0309 13:38:10.128731 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-6v2sq" Mar 09 13:38:10 crc kubenswrapper[4764]: W0309 13:38:10.165417 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4c271ca0_0c25_46d1_b730_e94f68397e29.slice/crio-453cd0556a5a9d4e58f08c125aa16fcec22f7fbc83cd351c18f798adf3681e01 WatchSource:0}: Error finding container 453cd0556a5a9d4e58f08c125aa16fcec22f7fbc83cd351c18f798adf3681e01: Status 404 returned error can't find the container with id 453cd0556a5a9d4e58f08c125aa16fcec22f7fbc83cd351c18f798adf3681e01 Mar 09 13:38:10 crc kubenswrapper[4764]: I0309 13:38:10.191992 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/47bd7072-a414-4ce8-800b-753b7054be23-cert\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9c94wvm\" (UID: \"47bd7072-a414-4ce8-800b-753b7054be23\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9c94wvm" Mar 09 13:38:10 crc kubenswrapper[4764]: E0309 13:38:10.192146 4764 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 09 13:38:10 crc kubenswrapper[4764]: E0309 13:38:10.192196 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/47bd7072-a414-4ce8-800b-753b7054be23-cert podName:47bd7072-a414-4ce8-800b-753b7054be23 nodeName:}" failed. No retries permitted until 2026-03-09 13:38:11.192179772 +0000 UTC m=+1046.442351680 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/47bd7072-a414-4ce8-800b-753b7054be23-cert") pod "openstack-baremetal-operator-controller-manager-7c6767dc9c94wvm" (UID: "47bd7072-a414-4ce8-800b-753b7054be23") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 09 13:38:10 crc kubenswrapper[4764]: I0309 13:38:10.220045 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-bccc79885-7f8nr" Mar 09 13:38:10 crc kubenswrapper[4764]: I0309 13:38:10.499036 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/e11f44d8-58a5-4fc7-b05b-e2e688647d01-webhook-certs\") pod \"openstack-operator-controller-manager-6746d697b-lr6nx\" (UID: \"e11f44d8-58a5-4fc7-b05b-e2e688647d01\") " pod="openstack-operators/openstack-operator-controller-manager-6746d697b-lr6nx" Mar 09 13:38:10 crc kubenswrapper[4764]: E0309 13:38:10.499160 4764 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 09 13:38:10 crc kubenswrapper[4764]: I0309 13:38:10.499176 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e11f44d8-58a5-4fc7-b05b-e2e688647d01-metrics-certs\") pod \"openstack-operator-controller-manager-6746d697b-lr6nx\" (UID: \"e11f44d8-58a5-4fc7-b05b-e2e688647d01\") " pod="openstack-operators/openstack-operator-controller-manager-6746d697b-lr6nx" Mar 09 13:38:10 crc kubenswrapper[4764]: E0309 13:38:10.499216 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e11f44d8-58a5-4fc7-b05b-e2e688647d01-webhook-certs podName:e11f44d8-58a5-4fc7-b05b-e2e688647d01 nodeName:}" failed. No retries permitted until 2026-03-09 13:38:11.499200679 +0000 UTC m=+1046.749372587 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/e11f44d8-58a5-4fc7-b05b-e2e688647d01-webhook-certs") pod "openstack-operator-controller-manager-6746d697b-lr6nx" (UID: "e11f44d8-58a5-4fc7-b05b-e2e688647d01") : secret "webhook-server-cert" not found Mar 09 13:38:10 crc kubenswrapper[4764]: E0309 13:38:10.499258 4764 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 09 13:38:10 crc kubenswrapper[4764]: E0309 13:38:10.499288 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e11f44d8-58a5-4fc7-b05b-e2e688647d01-metrics-certs podName:e11f44d8-58a5-4fc7-b05b-e2e688647d01 nodeName:}" failed. No retries permitted until 2026-03-09 13:38:11.499278651 +0000 UTC m=+1046.749450559 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e11f44d8-58a5-4fc7-b05b-e2e688647d01-metrics-certs") pod "openstack-operator-controller-manager-6746d697b-lr6nx" (UID: "e11f44d8-58a5-4fc7-b05b-e2e688647d01") : secret "metrics-server-cert" not found Mar 09 13:38:10 crc kubenswrapper[4764]: I0309 13:38:10.539356 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-545456dc4-hvpbz"] Mar 09 13:38:10 crc kubenswrapper[4764]: I0309 13:38:10.558060 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-78bc7f9bd9-5xc2s"] Mar 09 13:38:10 crc kubenswrapper[4764]: I0309 13:38:10.572978 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-5d87c9d997-cmtpc"] Mar 09 13:38:10 crc kubenswrapper[4764]: I0309 13:38:10.585441 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-64db6967f8-mjf6m"] Mar 09 13:38:10 crc kubenswrapper[4764]: I0309 13:38:10.597098 
4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-cf99c678f-jnmbv"] Mar 09 13:38:10 crc kubenswrapper[4764]: I0309 13:38:10.660757 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-7c7bcbc569-qhpvs"] Mar 09 13:38:10 crc kubenswrapper[4764]: I0309 13:38:10.667327 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-5fdb694969-4cpsz"] Mar 09 13:38:10 crc kubenswrapper[4764]: W0309 13:38:10.690448 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod32eb5815_c566_4177_8b47_f756807d4a30.slice/crio-e1f885f85ae4d97874d821cbea9fa8a3ef4927592a3615d22ee85fdec3b4a333 WatchSource:0}: Error finding container e1f885f85ae4d97874d821cbea9fa8a3ef4927592a3615d22ee85fdec3b4a333: Status 404 returned error can't find the container with id e1f885f85ae4d97874d821cbea9fa8a3ef4927592a3615d22ee85fdec3b4a333 Mar 09 13:38:10 crc kubenswrapper[4764]: I0309 13:38:10.691483 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-7c789f89c6-wv2rp"] Mar 09 13:38:10 crc kubenswrapper[4764]: I0309 13:38:10.704994 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bfda7896-83e3-407c-9eb5-74fbc11104f0-cert\") pod \"infra-operator-controller-manager-5995f4446f-m58s9\" (UID: \"bfda7896-83e3-407c-9eb5-74fbc11104f0\") " pod="openstack-operators/infra-operator-controller-manager-5995f4446f-m58s9" Mar 09 13:38:10 crc kubenswrapper[4764]: E0309 13:38:10.705155 4764 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 09 13:38:10 crc kubenswrapper[4764]: E0309 13:38:10.705248 4764 nestedpendingoperations.go:348] 
Operation for "{volumeName:kubernetes.io/secret/bfda7896-83e3-407c-9eb5-74fbc11104f0-cert podName:bfda7896-83e3-407c-9eb5-74fbc11104f0 nodeName:}" failed. No retries permitted until 2026-03-09 13:38:12.705223502 +0000 UTC m=+1047.955395410 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/bfda7896-83e3-407c-9eb5-74fbc11104f0-cert") pod "infra-operator-controller-manager-5995f4446f-m58s9" (UID: "bfda7896-83e3-407c-9eb5-74fbc11104f0") : secret "infra-operator-webhook-server-cert" not found Mar 09 13:38:10 crc kubenswrapper[4764]: I0309 13:38:10.806788 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-74b6b5dc96-dm7rn"] Mar 09 13:38:10 crc kubenswrapper[4764]: I0309 13:38:10.819011 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-75684d597f-jfgzw"] Mar 09 13:38:10 crc kubenswrapper[4764]: I0309 13:38:10.820019 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-55d77d7b5c-nppjq" event={"ID":"4c271ca0-0c25-46d1-b730-e94f68397e29","Type":"ContainerStarted","Data":"453cd0556a5a9d4e58f08c125aa16fcec22f7fbc83cd351c18f798adf3681e01"} Mar 09 13:38:10 crc kubenswrapper[4764]: W0309 13:38:10.821088 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod26535a82_8d70_4623_b2b4_7dd1546d48d6.slice/crio-532fdf9d88bb4997585ea9f96d3af86c9cc2f6e163a943806143b437035a35e0 WatchSource:0}: Error finding container 532fdf9d88bb4997585ea9f96d3af86c9cc2f6e163a943806143b437035a35e0: Status 404 returned error can't find the container with id 532fdf9d88bb4997585ea9f96d3af86c9cc2f6e163a943806143b437035a35e0 Mar 09 13:38:10 crc kubenswrapper[4764]: I0309 13:38:10.824861 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/manila-operator-controller-manager-7c7bcbc569-qhpvs" event={"ID":"5cd7eb92-2fae-4978-a5e9-58fa87c63e84","Type":"ContainerStarted","Data":"2f2e45ecb699d7bde9da9d5f11e44d4e5a917a68f0bd3a50c818220f5192f006"} Mar 09 13:38:10 crc kubenswrapper[4764]: I0309 13:38:10.828985 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-545456dc4-hvpbz" event={"ID":"5bd11a3c-79f3-4aa2-9d5a-78ec3c18cb31","Type":"ContainerStarted","Data":"21e4357340aa7356c17959b990234c44d54f135747e955392796bf9e53c7ba5d"} Mar 09 13:38:10 crc kubenswrapper[4764]: I0309 13:38:10.835930 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-54688575f-cgv66"] Mar 09 13:38:10 crc kubenswrapper[4764]: I0309 13:38:10.839131 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-5fdb694969-4cpsz" event={"ID":"c6a51c61-770b-4b46-9dd3-1f61e6f5ebc8","Type":"ContainerStarted","Data":"956bd835dec5019d146939cac22529a5e7bb52fdc032f62c2fd46668000f4d84"} Mar 09 13:38:10 crc kubenswrapper[4764]: I0309 13:38:10.863487 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-7b6bfb6475-vkns5"] Mar 09 13:38:10 crc kubenswrapper[4764]: I0309 13:38:10.863550 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-7c789f89c6-wv2rp" event={"ID":"32eb5815-c566-4177-8b47-f756807d4a30","Type":"ContainerStarted","Data":"e1f885f85ae4d97874d821cbea9fa8a3ef4927592a3615d22ee85fdec3b4a333"} Mar 09 13:38:10 crc kubenswrapper[4764]: I0309 13:38:10.864634 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-5d87c9d997-cmtpc" 
event={"ID":"725c0dd0-07d1-4a1c-b223-e8bec76cc7ff","Type":"ContainerStarted","Data":"e8a90e91a245e03595ceb2876d19da11011a1c5798d0035edbfd9c252f1c077d"} Mar 09 13:38:10 crc kubenswrapper[4764]: I0309 13:38:10.865684 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-6db6876945-82cg8" event={"ID":"e220a3f1-4dbe-4ee6-9b19-26985fa998cf","Type":"ContainerStarted","Data":"436d33e370c3500040ecc542247302bce24a8a00ee337b46e2f406a44bdaa218"} Mar 09 13:38:10 crc kubenswrapper[4764]: I0309 13:38:10.866619 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-78bc7f9bd9-5xc2s" event={"ID":"3da43711-be34-4189-b686-e8e9bc9e7265","Type":"ContainerStarted","Data":"782aacb8ac9540eac4995bb178b4be34655e33dd40680d4d82147b011c3bd1be"} Mar 09 13:38:10 crc kubenswrapper[4764]: I0309 13:38:10.867538 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-64db6967f8-mjf6m" event={"ID":"488ff419-d889-4778-96cf-a11006c49507","Type":"ContainerStarted","Data":"54b9226ff3f8aaf10350199499dd0a3615c3b73553afd9aa930ad22f1cd57f04"} Mar 09 13:38:10 crc kubenswrapper[4764]: I0309 13:38:10.868618 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-cf99c678f-jnmbv" event={"ID":"7295db10-1c36-4c17-bf1e-4c4a702c201b","Type":"ContainerStarted","Data":"ec1981c22097f197cc5c9636d6e6fa3817921dadae9f13097f2fc7d2b9bf02e8"} Mar 09 13:38:10 crc kubenswrapper[4764]: W0309 13:38:10.873551 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podda851ddd_2b27_45f0_b149_de32ae21ad91.slice/crio-f856db644f5da9169430c0c5fab980932b910b982e9db6e7990a9c6003480f6d WatchSource:0}: Error finding container f856db644f5da9169430c0c5fab980932b910b982e9db6e7990a9c6003480f6d: Status 404 returned error can't find the 
container with id f856db644f5da9169430c0c5fab980932b910b982e9db6e7990a9c6003480f6d Mar 09 13:38:10 crc kubenswrapper[4764]: E0309 13:38:10.877512 4764 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/mariadb-operator@sha256:5592ec4a6fbe2c832d1828b51af0b907e5d733d478b6f378a9b2f6d6cf0ac505,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-cvk8m,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod mariadb-operator-controller-manager-7b6bfb6475-vkns5_openstack-operators(da851ddd-2b27-45f0-b149-de32ae21ad91): ErrImagePull: pull QPS exceeded" logger="UnhandledError"
Mar 09 13:38:10 crc kubenswrapper[4764]: E0309 13:38:10.879148 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/mariadb-operator-controller-manager-7b6bfb6475-vkns5" podUID="da851ddd-2b27-45f0-b149-de32ae21ad91"
Mar 09 13:38:10 crc kubenswrapper[4764]: I0309 13:38:10.996822 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-55b5ff4dbb-d65xp"]
Mar 09 13:38:11 crc kubenswrapper[4764]: I0309 13:38:11.007690 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-9b9ff9f4d-bf8w8"]
Mar 09 13:38:11 crc kubenswrapper[4764]: I0309 13:38:11.021662 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-648564c9fc-8ms5w"]
Mar 09 13:38:11 crc kubenswrapper[4764]: E0309 13:38:11.032473 4764 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/swift-operator@sha256:f309cdea8084a4b1e8cbcd732d6e250fd93c55cfd1b48ba9026907c8591faab7,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-rhthv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod swift-operator-controller-manager-9b9ff9f4d-bf8w8_openstack-operators(003210d3-5572-44bd-aae5-d5e24aac16a5): ErrImagePull: pull QPS exceeded" logger="UnhandledError"
Mar 09 13:38:11 crc kubenswrapper[4764]: I0309 13:38:11.032761 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-5d86c7ddb7-gv2sm"]
Mar 09 13:38:11 crc kubenswrapper[4764]: W0309 13:38:11.032937 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc44e76b2_0de9_4a5b_93ee_536c6300157f.slice/crio-efbc7e66c2b8f2bdc691a5de2e13d156a4f5d3ccea628eea1e12b9e80cd0713e WatchSource:0}: Error finding container efbc7e66c2b8f2bdc691a5de2e13d156a4f5d3ccea628eea1e12b9e80cd0713e: Status 404 returned error can't find the container with id efbc7e66c2b8f2bdc691a5de2e13d156a4f5d3ccea628eea1e12b9e80cd0713e
Mar 09 13:38:11 crc kubenswrapper[4764]: E0309 13:38:11.033666 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/swift-operator-controller-manager-9b9ff9f4d-bf8w8" podUID="003210d3-5572-44bd-aae5-d5e24aac16a5"
Mar 09 13:38:11 crc kubenswrapper[4764]: E0309 13:38:11.035611 4764 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/placement-operator@sha256:bb939885bd04593ad03af901adb77ee2a2d18529b328c23288c7cc7a2ba5282e,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-fhwj4,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod placement-operator-controller-manager-648564c9fc-8ms5w_openstack-operators(c44e76b2-0de9-4a5b-93ee-536c6300157f): ErrImagePull: pull QPS exceeded" logger="UnhandledError"
Mar 09 13:38:11 crc kubenswrapper[4764]: E0309 13:38:11.036718 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/placement-operator-controller-manager-648564c9fc-8ms5w" podUID="c44e76b2-0de9-4a5b-93ee-536c6300157f"
Mar 09 13:38:11 crc kubenswrapper[4764]: I0309 13:38:11.042928 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-6v2sq"]
Mar 09 13:38:11 crc kubenswrapper[4764]: W0309 13:38:11.043208 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod01ea99aa_eb21_4799_9557_42c3fb55945a.slice/crio-9bd570e87d64f9de5b72579e1811c0c122d2847d8eca056f67cefdcf7b8d3d6f WatchSource:0}: Error finding container 9bd570e87d64f9de5b72579e1811c0c122d2847d8eca056f67cefdcf7b8d3d6f: Status 404 returned error can't find the container with id 9bd570e87d64f9de5b72579e1811c0c122d2847d8eca056f67cefdcf7b8d3d6f
Mar 09 13:38:11 crc kubenswrapper[4764]: E0309 13:38:11.045109 4764 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:operator,Image:quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2,Command:[/manager],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:9782,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{524288000 0} {} 500Mi BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-kb6g7,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cluster-operator-manager-668c99d594-6v2sq_openstack-operators(01ea99aa-eb21-4799-9557-42c3fb55945a): ErrImagePull: pull QPS exceeded" logger="UnhandledError"
Mar 09 13:38:11 crc kubenswrapper[4764]: E0309 13:38:11.047142 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-6v2sq" podUID="01ea99aa-eb21-4799-9557-42c3fb55945a"
Mar 09 13:38:11 crc kubenswrapper[4764]: I0309 13:38:11.050200 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-bccc79885-7f8nr"]
Mar 09 13:38:11 crc kubenswrapper[4764]: E0309 13:38:11.054591 4764 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/octavia-operator@sha256:2d59045b8d8e6f9c5483c4fdda7c5057218d553200dc4bcf26789980ac1d9abd,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-ms9zj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod octavia-operator-controller-manager-5d86c7ddb7-gv2sm_openstack-operators(b54e2237-603a-44ad-a129-04736cf749b2): ErrImagePull: pull QPS exceeded" logger="UnhandledError"
Mar 09 13:38:11 crc kubenswrapper[4764]: E0309 13:38:11.055943 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/octavia-operator-controller-manager-5d86c7ddb7-gv2sm" podUID="b54e2237-603a-44ad-a129-04736cf749b2"
Mar 09 13:38:11 crc kubenswrapper[4764]: W0309 13:38:11.061050 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf705ec78_e960_4200_b5a6_f3d4310f1bd5.slice/crio-14622df55451775e15d1e15f57b322fc2931130167f7083e5b3dc7fefa133ca3 WatchSource:0}: Error finding container 14622df55451775e15d1e15f57b322fc2931130167f7083e5b3dc7fefa133ca3: Status 404 returned error can't find the container with id 14622df55451775e15d1e15f57b322fc2931130167f7083e5b3dc7fefa133ca3
Mar 09 13:38:11 crc kubenswrapper[4764]: E0309 13:38:11.068923 4764 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/watcher-operator@sha256:06311600a491c689493552e7ff26e36df740fa4e7c143fca874bef19f24afb97,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-h5nrw,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod watcher-operator-controller-manager-bccc79885-7f8nr_openstack-operators(f705ec78-e960-4200-b5a6-f3d4310f1bd5): ErrImagePull: pull QPS exceeded" logger="UnhandledError"
Mar 09 13:38:11 crc kubenswrapper[4764]: E0309 13:38:11.070073 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/watcher-operator-controller-manager-bccc79885-7f8nr" podUID="f705ec78-e960-4200-b5a6-f3d4310f1bd5"
Mar 09 13:38:11 crc kubenswrapper[4764]: I0309 13:38:11.213191 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/47bd7072-a414-4ce8-800b-753b7054be23-cert\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9c94wvm\" (UID: \"47bd7072-a414-4ce8-800b-753b7054be23\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9c94wvm"
Mar 09 13:38:11 crc kubenswrapper[4764]: E0309 13:38:11.213455 4764 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found
Mar 09 13:38:11 crc kubenswrapper[4764]: E0309 13:38:11.213530 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/47bd7072-a414-4ce8-800b-753b7054be23-cert podName:47bd7072-a414-4ce8-800b-753b7054be23 nodeName:}" failed. No retries permitted until 2026-03-09 13:38:13.213509376 +0000 UTC m=+1048.463681284 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/47bd7072-a414-4ce8-800b-753b7054be23-cert") pod "openstack-baremetal-operator-controller-manager-7c6767dc9c94wvm" (UID: "47bd7072-a414-4ce8-800b-753b7054be23") : secret "openstack-baremetal-operator-webhook-server-cert" not found
Mar 09 13:38:11 crc kubenswrapper[4764]: I0309 13:38:11.517388 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e11f44d8-58a5-4fc7-b05b-e2e688647d01-metrics-certs\") pod \"openstack-operator-controller-manager-6746d697b-lr6nx\" (UID: \"e11f44d8-58a5-4fc7-b05b-e2e688647d01\") " pod="openstack-operators/openstack-operator-controller-manager-6746d697b-lr6nx"
Mar 09 13:38:11 crc kubenswrapper[4764]: I0309 13:38:11.517514 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/e11f44d8-58a5-4fc7-b05b-e2e688647d01-webhook-certs\") pod \"openstack-operator-controller-manager-6746d697b-lr6nx\" (UID: \"e11f44d8-58a5-4fc7-b05b-e2e688647d01\") " pod="openstack-operators/openstack-operator-controller-manager-6746d697b-lr6nx"
Mar 09 13:38:11 crc kubenswrapper[4764]: E0309 13:38:11.517710 4764 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found
Mar 09 13:38:11 crc kubenswrapper[4764]: E0309 13:38:11.517775 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e11f44d8-58a5-4fc7-b05b-e2e688647d01-webhook-certs podName:e11f44d8-58a5-4fc7-b05b-e2e688647d01 nodeName:}" failed. No retries permitted until 2026-03-09 13:38:13.51775644 +0000 UTC m=+1048.767928348 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/e11f44d8-58a5-4fc7-b05b-e2e688647d01-webhook-certs") pod "openstack-operator-controller-manager-6746d697b-lr6nx" (UID: "e11f44d8-58a5-4fc7-b05b-e2e688647d01") : secret "webhook-server-cert" not found
Mar 09 13:38:11 crc kubenswrapper[4764]: E0309 13:38:11.518200 4764 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found
Mar 09 13:38:11 crc kubenswrapper[4764]: E0309 13:38:11.518234 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e11f44d8-58a5-4fc7-b05b-e2e688647d01-metrics-certs podName:e11f44d8-58a5-4fc7-b05b-e2e688647d01 nodeName:}" failed. No retries permitted until 2026-03-09 13:38:13.518221903 +0000 UTC m=+1048.768393811 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e11f44d8-58a5-4fc7-b05b-e2e688647d01-metrics-certs") pod "openstack-operator-controller-manager-6746d697b-lr6nx" (UID: "e11f44d8-58a5-4fc7-b05b-e2e688647d01") : secret "metrics-server-cert" not found
Mar 09 13:38:11 crc kubenswrapper[4764]: I0309 13:38:11.889291 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-55b5ff4dbb-d65xp" event={"ID":"867908a2-f085-4f3d-b569-84c915f730b1","Type":"ContainerStarted","Data":"42c5e3cad1bdf17243a5ef9fb0a15003c1b108da10c6678ed364380d3c78156d"}
Mar 09 13:38:11 crc kubenswrapper[4764]: I0309 13:38:11.891519 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-648564c9fc-8ms5w" event={"ID":"c44e76b2-0de9-4a5b-93ee-536c6300157f","Type":"ContainerStarted","Data":"efbc7e66c2b8f2bdc691a5de2e13d156a4f5d3ccea628eea1e12b9e80cd0713e"}
Mar 09 13:38:11 crc kubenswrapper[4764]: I0309 13:38:11.893811 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-75684d597f-jfgzw" event={"ID":"615473d3-072e-4685-8f32-73a44badf1e2","Type":"ContainerStarted","Data":"2c871aa692953c9072929952c5a03cf3e3ee9fe2af4f1e7c6c3ad701482442d9"}
Mar 09 13:38:11 crc kubenswrapper[4764]: E0309 13:38:11.893905 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/placement-operator@sha256:bb939885bd04593ad03af901adb77ee2a2d18529b328c23288c7cc7a2ba5282e\\\"\"" pod="openstack-operators/placement-operator-controller-manager-648564c9fc-8ms5w" podUID="c44e76b2-0de9-4a5b-93ee-536c6300157f"
Mar 09 13:38:11 crc kubenswrapper[4764]: I0309 13:38:11.897231 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-74b6b5dc96-dm7rn" event={"ID":"26535a82-8d70-4623-b2b4-7dd1546d48d6","Type":"ContainerStarted","Data":"532fdf9d88bb4997585ea9f96d3af86c9cc2f6e163a943806143b437035a35e0"}
Mar 09 13:38:11 crc kubenswrapper[4764]: I0309 13:38:11.900977 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-6v2sq" event={"ID":"01ea99aa-eb21-4799-9557-42c3fb55945a","Type":"ContainerStarted","Data":"9bd570e87d64f9de5b72579e1811c0c122d2847d8eca056f67cefdcf7b8d3d6f"}
Mar 09 13:38:11 crc kubenswrapper[4764]: E0309 13:38:11.903543 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-6v2sq" podUID="01ea99aa-eb21-4799-9557-42c3fb55945a"
Mar 09 13:38:11 crc kubenswrapper[4764]: I0309 13:38:11.903609 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-5d86c7ddb7-gv2sm" event={"ID":"b54e2237-603a-44ad-a129-04736cf749b2","Type":"ContainerStarted","Data":"d354f5f451b0d3f76f0897ebe74945c06349882a6ce0bb157af5c3891bd652dd"}
Mar 09 13:38:11 crc kubenswrapper[4764]: I0309 13:38:11.907833 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-7b6bfb6475-vkns5" event={"ID":"da851ddd-2b27-45f0-b149-de32ae21ad91","Type":"ContainerStarted","Data":"f856db644f5da9169430c0c5fab980932b910b982e9db6e7990a9c6003480f6d"}
Mar 09 13:38:11 crc kubenswrapper[4764]: E0309 13:38:11.908047 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/octavia-operator@sha256:2d59045b8d8e6f9c5483c4fdda7c5057218d553200dc4bcf26789980ac1d9abd\\\"\"" pod="openstack-operators/octavia-operator-controller-manager-5d86c7ddb7-gv2sm" podUID="b54e2237-603a-44ad-a129-04736cf749b2"
Mar 09 13:38:11 crc kubenswrapper[4764]: E0309 13:38:11.910772 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/mariadb-operator@sha256:5592ec4a6fbe2c832d1828b51af0b907e5d733d478b6f378a9b2f6d6cf0ac505\\\"\"" pod="openstack-operators/mariadb-operator-controller-manager-7b6bfb6475-vkns5" podUID="da851ddd-2b27-45f0-b149-de32ae21ad91"
Mar 09 13:38:11 crc kubenswrapper[4764]: I0309 13:38:11.914678 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-54688575f-cgv66" event={"ID":"2ddf1e89-9c89-4052-aa1b-6fb84438b86d","Type":"ContainerStarted","Data":"531be4af8b9aa765e5a8e039032a605d6f34837c3864b85c4da7c3cd37d61378"}
Mar 09 13:38:11 crc kubenswrapper[4764]: I0309 13:38:11.924509 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-9b9ff9f4d-bf8w8" event={"ID":"003210d3-5572-44bd-aae5-d5e24aac16a5","Type":"ContainerStarted","Data":"463fcdec7f496474c27aba53d573e8b949915bf1feb7cf322e622bc5b56ad357"}
Mar 09 13:38:11 crc kubenswrapper[4764]: I0309 13:38:11.932050 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-bccc79885-7f8nr" event={"ID":"f705ec78-e960-4200-b5a6-f3d4310f1bd5","Type":"ContainerStarted","Data":"14622df55451775e15d1e15f57b322fc2931130167f7083e5b3dc7fefa133ca3"}
Mar 09 13:38:11 crc kubenswrapper[4764]: E0309 13:38:11.933569 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:f309cdea8084a4b1e8cbcd732d6e250fd93c55cfd1b48ba9026907c8591faab7\\\"\"" pod="openstack-operators/swift-operator-controller-manager-9b9ff9f4d-bf8w8" podUID="003210d3-5572-44bd-aae5-d5e24aac16a5"
Mar 09 13:38:11 crc kubenswrapper[4764]: E0309 13:38:11.934130 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:06311600a491c689493552e7ff26e36df740fa4e7c143fca874bef19f24afb97\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-bccc79885-7f8nr" podUID="f705ec78-e960-4200-b5a6-f3d4310f1bd5"
Mar 09 13:38:12 crc kubenswrapper[4764]: I0309 13:38:12.740464 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bfda7896-83e3-407c-9eb5-74fbc11104f0-cert\") pod \"infra-operator-controller-manager-5995f4446f-m58s9\" (UID: \"bfda7896-83e3-407c-9eb5-74fbc11104f0\") " pod="openstack-operators/infra-operator-controller-manager-5995f4446f-m58s9"
Mar 09 13:38:12 crc kubenswrapper[4764]: E0309 13:38:12.742153 4764 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found
Mar 09 13:38:12 crc kubenswrapper[4764]: E0309 13:38:12.742232 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bfda7896-83e3-407c-9eb5-74fbc11104f0-cert podName:bfda7896-83e3-407c-9eb5-74fbc11104f0 nodeName:}" failed. No retries permitted until 2026-03-09 13:38:16.742207883 +0000 UTC m=+1051.992379791 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/bfda7896-83e3-407c-9eb5-74fbc11104f0-cert") pod "infra-operator-controller-manager-5995f4446f-m58s9" (UID: "bfda7896-83e3-407c-9eb5-74fbc11104f0") : secret "infra-operator-webhook-server-cert" not found
Mar 09 13:38:12 crc kubenswrapper[4764]: E0309 13:38:12.960451 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:06311600a491c689493552e7ff26e36df740fa4e7c143fca874bef19f24afb97\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-bccc79885-7f8nr" podUID="f705ec78-e960-4200-b5a6-f3d4310f1bd5"
Mar 09 13:38:12 crc kubenswrapper[4764]: E0309 13:38:12.960587 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-6v2sq" podUID="01ea99aa-eb21-4799-9557-42c3fb55945a"
Mar 09 13:38:12 crc kubenswrapper[4764]: E0309 13:38:12.960683 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/octavia-operator@sha256:2d59045b8d8e6f9c5483c4fdda7c5057218d553200dc4bcf26789980ac1d9abd\\\"\"" pod="openstack-operators/octavia-operator-controller-manager-5d86c7ddb7-gv2sm" podUID="b54e2237-603a-44ad-a129-04736cf749b2"
Mar 09 13:38:12 crc kubenswrapper[4764]: E0309 13:38:12.960767 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/placement-operator@sha256:bb939885bd04593ad03af901adb77ee2a2d18529b328c23288c7cc7a2ba5282e\\\"\"" pod="openstack-operators/placement-operator-controller-manager-648564c9fc-8ms5w" podUID="c44e76b2-0de9-4a5b-93ee-536c6300157f"
Mar 09 13:38:12 crc kubenswrapper[4764]: E0309 13:38:12.960880 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/mariadb-operator@sha256:5592ec4a6fbe2c832d1828b51af0b907e5d733d478b6f378a9b2f6d6cf0ac505\\\"\"" pod="openstack-operators/mariadb-operator-controller-manager-7b6bfb6475-vkns5" podUID="da851ddd-2b27-45f0-b149-de32ae21ad91"
Mar 09 13:38:12 crc kubenswrapper[4764]: E0309 13:38:12.960935 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:f309cdea8084a4b1e8cbcd732d6e250fd93c55cfd1b48ba9026907c8591faab7\\\"\"" pod="openstack-operators/swift-operator-controller-manager-9b9ff9f4d-bf8w8" podUID="003210d3-5572-44bd-aae5-d5e24aac16a5"
Mar 09 13:38:13 crc kubenswrapper[4764]: I0309 13:38:13.252254 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/47bd7072-a414-4ce8-800b-753b7054be23-cert\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9c94wvm\" (UID: \"47bd7072-a414-4ce8-800b-753b7054be23\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9c94wvm"
Mar 09 13:38:13 crc kubenswrapper[4764]: E0309 13:38:13.252776 4764 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found
Mar 09 13:38:13 crc kubenswrapper[4764]: E0309 13:38:13.252886 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/47bd7072-a414-4ce8-800b-753b7054be23-cert podName:47bd7072-a414-4ce8-800b-753b7054be23 nodeName:}" failed. No retries permitted until 2026-03-09 13:38:17.2528613 +0000 UTC m=+1052.503033208 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/47bd7072-a414-4ce8-800b-753b7054be23-cert") pod "openstack-baremetal-operator-controller-manager-7c6767dc9c94wvm" (UID: "47bd7072-a414-4ce8-800b-753b7054be23") : secret "openstack-baremetal-operator-webhook-server-cert" not found
Mar 09 13:38:13 crc kubenswrapper[4764]: I0309 13:38:13.557441 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/e11f44d8-58a5-4fc7-b05b-e2e688647d01-webhook-certs\") pod \"openstack-operator-controller-manager-6746d697b-lr6nx\" (UID: \"e11f44d8-58a5-4fc7-b05b-e2e688647d01\") " pod="openstack-operators/openstack-operator-controller-manager-6746d697b-lr6nx"
Mar 09 13:38:13 crc kubenswrapper[4764]: I0309 13:38:13.557576 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e11f44d8-58a5-4fc7-b05b-e2e688647d01-metrics-certs\") pod \"openstack-operator-controller-manager-6746d697b-lr6nx\" (UID: \"e11f44d8-58a5-4fc7-b05b-e2e688647d01\") " pod="openstack-operators/openstack-operator-controller-manager-6746d697b-lr6nx"
Mar 09 13:38:13 crc kubenswrapper[4764]: E0309 13:38:13.557787 4764 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found
Mar 09 13:38:13 crc kubenswrapper[4764]: E0309 13:38:13.557855 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e11f44d8-58a5-4fc7-b05b-e2e688647d01-metrics-certs podName:e11f44d8-58a5-4fc7-b05b-e2e688647d01 nodeName:}" failed. No retries permitted until 2026-03-09 13:38:17.557835143 +0000 UTC m=+1052.808007061 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e11f44d8-58a5-4fc7-b05b-e2e688647d01-metrics-certs") pod "openstack-operator-controller-manager-6746d697b-lr6nx" (UID: "e11f44d8-58a5-4fc7-b05b-e2e688647d01") : secret "metrics-server-cert" not found
Mar 09 13:38:13 crc kubenswrapper[4764]: E0309 13:38:13.557989 4764 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found
Mar 09 13:38:13 crc kubenswrapper[4764]: E0309 13:38:13.558109 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e11f44d8-58a5-4fc7-b05b-e2e688647d01-webhook-certs podName:e11f44d8-58a5-4fc7-b05b-e2e688647d01 nodeName:}" failed. No retries permitted until 2026-03-09 13:38:17.55808276 +0000 UTC m=+1052.808254668 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/e11f44d8-58a5-4fc7-b05b-e2e688647d01-webhook-certs") pod "openstack-operator-controller-manager-6746d697b-lr6nx" (UID: "e11f44d8-58a5-4fc7-b05b-e2e688647d01") : secret "webhook-server-cert" not found
Mar 09 13:38:16 crc kubenswrapper[4764]: I0309 13:38:16.824433 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bfda7896-83e3-407c-9eb5-74fbc11104f0-cert\") pod \"infra-operator-controller-manager-5995f4446f-m58s9\" (UID: \"bfda7896-83e3-407c-9eb5-74fbc11104f0\") " pod="openstack-operators/infra-operator-controller-manager-5995f4446f-m58s9"
Mar 09 13:38:16 crc kubenswrapper[4764]: E0309 13:38:16.824619 4764 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found
Mar 09 13:38:16 crc kubenswrapper[4764]: E0309 13:38:16.824719 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bfda7896-83e3-407c-9eb5-74fbc11104f0-cert podName:bfda7896-83e3-407c-9eb5-74fbc11104f0 nodeName:}" failed. No retries permitted until 2026-03-09 13:38:24.824696588 +0000 UTC m=+1060.074868496 (durationBeforeRetry 8s).
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/bfda7896-83e3-407c-9eb5-74fbc11104f0-cert") pod "infra-operator-controller-manager-5995f4446f-m58s9" (UID: "bfda7896-83e3-407c-9eb5-74fbc11104f0") : secret "infra-operator-webhook-server-cert" not found Mar 09 13:38:17 crc kubenswrapper[4764]: I0309 13:38:17.334073 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/47bd7072-a414-4ce8-800b-753b7054be23-cert\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9c94wvm\" (UID: \"47bd7072-a414-4ce8-800b-753b7054be23\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9c94wvm" Mar 09 13:38:17 crc kubenswrapper[4764]: E0309 13:38:17.334317 4764 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 09 13:38:17 crc kubenswrapper[4764]: E0309 13:38:17.334388 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/47bd7072-a414-4ce8-800b-753b7054be23-cert podName:47bd7072-a414-4ce8-800b-753b7054be23 nodeName:}" failed. No retries permitted until 2026-03-09 13:38:25.334371289 +0000 UTC m=+1060.584543187 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/47bd7072-a414-4ce8-800b-753b7054be23-cert") pod "openstack-baremetal-operator-controller-manager-7c6767dc9c94wvm" (UID: "47bd7072-a414-4ce8-800b-753b7054be23") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 09 13:38:17 crc kubenswrapper[4764]: I0309 13:38:17.639186 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e11f44d8-58a5-4fc7-b05b-e2e688647d01-metrics-certs\") pod \"openstack-operator-controller-manager-6746d697b-lr6nx\" (UID: \"e11f44d8-58a5-4fc7-b05b-e2e688647d01\") " pod="openstack-operators/openstack-operator-controller-manager-6746d697b-lr6nx" Mar 09 13:38:17 crc kubenswrapper[4764]: I0309 13:38:17.639274 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/e11f44d8-58a5-4fc7-b05b-e2e688647d01-webhook-certs\") pod \"openstack-operator-controller-manager-6746d697b-lr6nx\" (UID: \"e11f44d8-58a5-4fc7-b05b-e2e688647d01\") " pod="openstack-operators/openstack-operator-controller-manager-6746d697b-lr6nx" Mar 09 13:38:17 crc kubenswrapper[4764]: E0309 13:38:17.639364 4764 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 09 13:38:17 crc kubenswrapper[4764]: E0309 13:38:17.639388 4764 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 09 13:38:17 crc kubenswrapper[4764]: E0309 13:38:17.639423 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e11f44d8-58a5-4fc7-b05b-e2e688647d01-webhook-certs podName:e11f44d8-58a5-4fc7-b05b-e2e688647d01 nodeName:}" failed. No retries permitted until 2026-03-09 13:38:25.639409694 +0000 UTC m=+1060.889581602 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/e11f44d8-58a5-4fc7-b05b-e2e688647d01-webhook-certs") pod "openstack-operator-controller-manager-6746d697b-lr6nx" (UID: "e11f44d8-58a5-4fc7-b05b-e2e688647d01") : secret "webhook-server-cert" not found Mar 09 13:38:17 crc kubenswrapper[4764]: E0309 13:38:17.639438 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e11f44d8-58a5-4fc7-b05b-e2e688647d01-metrics-certs podName:e11f44d8-58a5-4fc7-b05b-e2e688647d01 nodeName:}" failed. No retries permitted until 2026-03-09 13:38:25.639432475 +0000 UTC m=+1060.889604383 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e11f44d8-58a5-4fc7-b05b-e2e688647d01-metrics-certs") pod "openstack-operator-controller-manager-6746d697b-lr6nx" (UID: "e11f44d8-58a5-4fc7-b05b-e2e688647d01") : secret "metrics-server-cert" not found Mar 09 13:38:22 crc kubenswrapper[4764]: E0309 13:38:22.687798 4764 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/barbican-operator@sha256:3f9b0446a124745439306dc3bb7faec8c02c0b6be33f788b9d455fa57fb60120" Mar 09 13:38:22 crc kubenswrapper[4764]: E0309 13:38:22.688596 4764 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/barbican-operator@sha256:3f9b0446a124745439306dc3bb7faec8c02c0b6be33f788b9d455fa57fb60120,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-znvfv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod barbican-operator-controller-manager-6db6876945-82cg8_openstack-operators(e220a3f1-4dbe-4ee6-9b19-26985fa998cf): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 09 13:38:22 crc kubenswrapper[4764]: E0309 13:38:22.689772 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/barbican-operator-controller-manager-6db6876945-82cg8" podUID="e220a3f1-4dbe-4ee6-9b19-26985fa998cf" Mar 09 13:38:23 crc kubenswrapper[4764]: E0309 13:38:23.052920 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/barbican-operator@sha256:3f9b0446a124745439306dc3bb7faec8c02c0b6be33f788b9d455fa57fb60120\\\"\"" pod="openstack-operators/barbican-operator-controller-manager-6db6876945-82cg8" podUID="e220a3f1-4dbe-4ee6-9b19-26985fa998cf" Mar 09 13:38:23 crc kubenswrapper[4764]: E0309 13:38:23.351871 4764 log.go:32] "PullImage from image service 
failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/cinder-operator@sha256:7961c67cfc87de69055f8330771af625f73d857426c4bb17ebb888ead843fff3" Mar 09 13:38:23 crc kubenswrapper[4764]: E0309 13:38:23.352187 4764 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/cinder-operator@sha256:7961c67cfc87de69055f8330771af625f73d857426c4bb17ebb888ead843fff3,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-lnz4t,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-operator-controller-manager-55d77d7b5c-nppjq_openstack-operators(4c271ca0-0c25-46d1-b730-e94f68397e29): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 09 13:38:23 crc kubenswrapper[4764]: E0309 13:38:23.353354 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/cinder-operator-controller-manager-55d77d7b5c-nppjq" podUID="4c271ca0-0c25-46d1-b730-e94f68397e29" Mar 09 13:38:24 crc kubenswrapper[4764]: E0309 13:38:24.010688 4764 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/glance-operator@sha256:81e43c058d9af1d3bc31704010c630bc2a574c2ee388aa0ffe8c7b9621a7d051" Mar 09 13:38:24 crc kubenswrapper[4764]: E0309 13:38:24.012334 4764 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:manager,Image:quay.io/openstack-k8s-operators/glance-operator@sha256:81e43c058d9af1d3bc31704010c630bc2a574c2ee388aa0ffe8c7b9621a7d051,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-95kb4,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod glance-operator-controller-manager-64db6967f8-mjf6m_openstack-operators(488ff419-d889-4778-96cf-a11006c49507): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 09 13:38:24 crc kubenswrapper[4764]: E0309 13:38:24.013600 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/glance-operator-controller-manager-64db6967f8-mjf6m" podUID="488ff419-d889-4778-96cf-a11006c49507" Mar 09 13:38:24 crc kubenswrapper[4764]: E0309 13:38:24.064231 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/cinder-operator@sha256:7961c67cfc87de69055f8330771af625f73d857426c4bb17ebb888ead843fff3\\\"\"" pod="openstack-operators/cinder-operator-controller-manager-55d77d7b5c-nppjq" podUID="4c271ca0-0c25-46d1-b730-e94f68397e29" Mar 09 13:38:24 crc kubenswrapper[4764]: E0309 13:38:24.064601 4764 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/glance-operator@sha256:81e43c058d9af1d3bc31704010c630bc2a574c2ee388aa0ffe8c7b9621a7d051\\\"\"" pod="openstack-operators/glance-operator-controller-manager-64db6967f8-mjf6m" podUID="488ff419-d889-4778-96cf-a11006c49507" Mar 09 13:38:24 crc kubenswrapper[4764]: E0309 13:38:24.504800 4764 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/keystone-operator@sha256:9d723ab33964ee44704eed3223b64e828349d45dee04695434a6fcf4b6807d4c" Mar 09 13:38:24 crc kubenswrapper[4764]: E0309 13:38:24.505025 4764 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/keystone-operator@sha256:9d723ab33964ee44704eed3223b64e828349d45dee04695434a6fcf4b6807d4c,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-flbm2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod keystone-operator-controller-manager-7c789f89c6-wv2rp_openstack-operators(32eb5815-c566-4177-8b47-f756807d4a30): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 09 13:38:24 crc kubenswrapper[4764]: E0309 13:38:24.506106 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/keystone-operator-controller-manager-7c789f89c6-wv2rp" podUID="32eb5815-c566-4177-8b47-f756807d4a30" Mar 09 13:38:24 crc kubenswrapper[4764]: I0309 13:38:24.849535 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bfda7896-83e3-407c-9eb5-74fbc11104f0-cert\") pod 
\"infra-operator-controller-manager-5995f4446f-m58s9\" (UID: \"bfda7896-83e3-407c-9eb5-74fbc11104f0\") " pod="openstack-operators/infra-operator-controller-manager-5995f4446f-m58s9" Mar 09 13:38:24 crc kubenswrapper[4764]: I0309 13:38:24.863337 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bfda7896-83e3-407c-9eb5-74fbc11104f0-cert\") pod \"infra-operator-controller-manager-5995f4446f-m58s9\" (UID: \"bfda7896-83e3-407c-9eb5-74fbc11104f0\") " pod="openstack-operators/infra-operator-controller-manager-5995f4446f-m58s9" Mar 09 13:38:24 crc kubenswrapper[4764]: I0309 13:38:24.917297 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-5995f4446f-m58s9" Mar 09 13:38:25 crc kubenswrapper[4764]: I0309 13:38:25.070224 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-75684d597f-jfgzw" event={"ID":"615473d3-072e-4685-8f32-73a44badf1e2","Type":"ContainerStarted","Data":"ae438d53b49a18ccfc8b19d558db62b8a25bf552861a59f0bb99871f82729c57"} Mar 09 13:38:25 crc kubenswrapper[4764]: I0309 13:38:25.070357 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ovn-operator-controller-manager-75684d597f-jfgzw" Mar 09 13:38:25 crc kubenswrapper[4764]: I0309 13:38:25.072304 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-54688575f-cgv66" event={"ID":"2ddf1e89-9c89-4052-aa1b-6fb84438b86d","Type":"ContainerStarted","Data":"045d880516f76471069e4310ea270473fbf523a5785fd893e4d36f9a22913f9e"} Mar 09 13:38:25 crc kubenswrapper[4764]: I0309 13:38:25.072408 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/neutron-operator-controller-manager-54688575f-cgv66" Mar 09 13:38:25 crc kubenswrapper[4764]: I0309 13:38:25.073693 4764 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-7c7bcbc569-qhpvs" event={"ID":"5cd7eb92-2fae-4978-a5e9-58fa87c63e84","Type":"ContainerStarted","Data":"0e9828b895f4ce1da3226383c620e966fe893adf71a5c073fa63a8068e4716da"} Mar 09 13:38:25 crc kubenswrapper[4764]: I0309 13:38:25.073809 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/manila-operator-controller-manager-7c7bcbc569-qhpvs" Mar 09 13:38:25 crc kubenswrapper[4764]: I0309 13:38:25.074990 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-74b6b5dc96-dm7rn" event={"ID":"26535a82-8d70-4623-b2b4-7dd1546d48d6","Type":"ContainerStarted","Data":"75ac040f402881ef08927b332e8dadd06a3b59ceb485dd8f6e92a2c3c453a3ee"} Mar 09 13:38:25 crc kubenswrapper[4764]: I0309 13:38:25.075077 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-74b6b5dc96-dm7rn" Mar 09 13:38:25 crc kubenswrapper[4764]: I0309 13:38:25.080832 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-55b5ff4dbb-d65xp" event={"ID":"867908a2-f085-4f3d-b569-84c915f730b1","Type":"ContainerStarted","Data":"e6be54291177274fb63d942fce6918f669c313a3bc8ed8821137142306cf6877"} Mar 09 13:38:25 crc kubenswrapper[4764]: I0309 13:38:25.080934 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-55b5ff4dbb-d65xp" Mar 09 13:38:25 crc kubenswrapper[4764]: I0309 13:38:25.105370 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ovn-operator-controller-manager-75684d597f-jfgzw" podStartSLOduration=2.467226953 podStartE2EDuration="16.105349484s" podCreationTimestamp="2026-03-09 13:38:09 +0000 UTC" firstStartedPulling="2026-03-09 13:38:10.838683192 +0000 UTC 
m=+1046.088855100" lastFinishedPulling="2026-03-09 13:38:24.476805723 +0000 UTC m=+1059.726977631" observedRunningTime="2026-03-09 13:38:25.102982832 +0000 UTC m=+1060.353154740" watchObservedRunningTime="2026-03-09 13:38:25.105349484 +0000 UTC m=+1060.355521392" Mar 09 13:38:25 crc kubenswrapper[4764]: I0309 13:38:25.130564 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-5d87c9d997-cmtpc" event={"ID":"725c0dd0-07d1-4a1c-b223-e8bec76cc7ff","Type":"ContainerStarted","Data":"5269a9d24de0b25114963ad28ea8ef517ecd3033e32d02bf1974160e408e95f4"} Mar 09 13:38:25 crc kubenswrapper[4764]: I0309 13:38:25.131465 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/designate-operator-controller-manager-5d87c9d997-cmtpc" Mar 09 13:38:25 crc kubenswrapper[4764]: I0309 13:38:25.143068 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/manila-operator-controller-manager-7c7bcbc569-qhpvs" podStartSLOduration=3.329941973 podStartE2EDuration="17.143051867s" podCreationTimestamp="2026-03-09 13:38:08 +0000 UTC" firstStartedPulling="2026-03-09 13:38:10.663590806 +0000 UTC m=+1045.913762714" lastFinishedPulling="2026-03-09 13:38:24.4767007 +0000 UTC m=+1059.726872608" observedRunningTime="2026-03-09 13:38:25.142273387 +0000 UTC m=+1060.392445305" watchObservedRunningTime="2026-03-09 13:38:25.143051867 +0000 UTC m=+1060.393223785" Mar 09 13:38:25 crc kubenswrapper[4764]: I0309 13:38:25.166097 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-cf99c678f-jnmbv" event={"ID":"7295db10-1c36-4c17-bf1e-4c4a702c201b","Type":"ContainerStarted","Data":"4bc1ca006daeaf3def75bdee29704e431cff99826caa1f181218c14aa36c57d8"} Mar 09 13:38:25 crc kubenswrapper[4764]: I0309 13:38:25.166794 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack-operators/heat-operator-controller-manager-cf99c678f-jnmbv" Mar 09 13:38:25 crc kubenswrapper[4764]: I0309 13:38:25.189611 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/nova-operator-controller-manager-74b6b5dc96-dm7rn" podStartSLOduration=3.551551793 podStartE2EDuration="17.189588241s" podCreationTimestamp="2026-03-09 13:38:08 +0000 UTC" firstStartedPulling="2026-03-09 13:38:10.838664992 +0000 UTC m=+1046.088836900" lastFinishedPulling="2026-03-09 13:38:24.47670144 +0000 UTC m=+1059.726873348" observedRunningTime="2026-03-09 13:38:25.181532171 +0000 UTC m=+1060.431704079" watchObservedRunningTime="2026-03-09 13:38:25.189588241 +0000 UTC m=+1060.439760149" Mar 09 13:38:25 crc kubenswrapper[4764]: I0309 13:38:25.194389 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-545456dc4-hvpbz" event={"ID":"5bd11a3c-79f3-4aa2-9d5a-78ec3c18cb31","Type":"ContainerStarted","Data":"36ae9cc3eb8d582908ec232b9caf649350d5c1ef549bff11c310a5484a27090e"} Mar 09 13:38:25 crc kubenswrapper[4764]: I0309 13:38:25.195256 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-545456dc4-hvpbz" Mar 09 13:38:25 crc kubenswrapper[4764]: I0309 13:38:25.234008 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-5fdb694969-4cpsz" event={"ID":"c6a51c61-770b-4b46-9dd3-1f61e6f5ebc8","Type":"ContainerStarted","Data":"081b4d02546b89537610829d297210a1a219e9ed2e9ccc5e12eda310701d7739"} Mar 09 13:38:25 crc kubenswrapper[4764]: I0309 13:38:25.234759 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/telemetry-operator-controller-manager-5fdb694969-4cpsz" Mar 09 13:38:25 crc kubenswrapper[4764]: I0309 13:38:25.265081 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/horizon-operator-controller-manager-78bc7f9bd9-5xc2s" event={"ID":"3da43711-be34-4189-b686-e8e9bc9e7265","Type":"ContainerStarted","Data":"324c5be969cd92e40a6215cf9da106eb9e0ec3b444e208508566e41cfc6c75ee"}
Mar 09 13:38:25 crc kubenswrapper[4764]: I0309 13:38:25.265437 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-78bc7f9bd9-5xc2s"
Mar 09 13:38:25 crc kubenswrapper[4764]: E0309 13:38:25.273934 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/keystone-operator@sha256:9d723ab33964ee44704eed3223b64e828349d45dee04695434a6fcf4b6807d4c\\\"\"" pod="openstack-operators/keystone-operator-controller-manager-7c789f89c6-wv2rp" podUID="32eb5815-c566-4177-8b47-f756807d4a30"
Mar 09 13:38:25 crc kubenswrapper[4764]: I0309 13:38:25.311485 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/neutron-operator-controller-manager-54688575f-cgv66" podStartSLOduration=3.695826775 podStartE2EDuration="17.311462789s" podCreationTimestamp="2026-03-09 13:38:08 +0000 UTC" firstStartedPulling="2026-03-09 13:38:10.862861663 +0000 UTC m=+1046.113033581" lastFinishedPulling="2026-03-09 13:38:24.478497677 +0000 UTC m=+1059.728669595" observedRunningTime="2026-03-09 13:38:25.303091711 +0000 UTC m=+1060.553263629" watchObservedRunningTime="2026-03-09 13:38:25.311462789 +0000 UTC m=+1060.561634717"
Mar 09 13:38:25 crc kubenswrapper[4764]: I0309 13:38:25.328299 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/test-operator-controller-manager-55b5ff4dbb-d65xp" podStartSLOduration=2.820098716 podStartE2EDuration="16.314248262s" podCreationTimestamp="2026-03-09 13:38:09 +0000 UTC" firstStartedPulling="2026-03-09 13:38:11.023204834 +0000 UTC m=+1046.273376732" lastFinishedPulling="2026-03-09 13:38:24.51735437 +0000 UTC m=+1059.767526278" observedRunningTime="2026-03-09 13:38:25.227996783 +0000 UTC m=+1060.478168691" watchObservedRunningTime="2026-03-09 13:38:25.314248262 +0000 UTC m=+1060.564420170"
Mar 09 13:38:25 crc kubenswrapper[4764]: I0309 13:38:25.371890 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/47bd7072-a414-4ce8-800b-753b7054be23-cert\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9c94wvm\" (UID: \"47bd7072-a414-4ce8-800b-753b7054be23\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9c94wvm"
Mar 09 13:38:25 crc kubenswrapper[4764]: E0309 13:38:25.372036 4764 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found
Mar 09 13:38:25 crc kubenswrapper[4764]: E0309 13:38:25.372084 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/47bd7072-a414-4ce8-800b-753b7054be23-cert podName:47bd7072-a414-4ce8-800b-753b7054be23 nodeName:}" failed. No retries permitted until 2026-03-09 13:38:41.37206978 +0000 UTC m=+1076.622241688 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/47bd7072-a414-4ce8-800b-753b7054be23-cert") pod "openstack-baremetal-operator-controller-manager-7c6767dc9c94wvm" (UID: "47bd7072-a414-4ce8-800b-753b7054be23") : secret "openstack-baremetal-operator-webhook-server-cert" not found
Mar 09 13:38:25 crc kubenswrapper[4764]: I0309 13:38:25.517684 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ironic-operator-controller-manager-545456dc4-hvpbz" podStartSLOduration=3.590751344 podStartE2EDuration="17.517666316s" podCreationTimestamp="2026-03-09 13:38:08 +0000 UTC" firstStartedPulling="2026-03-09 13:38:10.578092346 +0000 UTC m=+1045.828264254" lastFinishedPulling="2026-03-09 13:38:24.505007318 +0000 UTC m=+1059.755179226" observedRunningTime="2026-03-09 13:38:25.44114258 +0000 UTC m=+1060.691314498" watchObservedRunningTime="2026-03-09 13:38:25.517666316 +0000 UTC m=+1060.767838224"
Mar 09 13:38:25 crc kubenswrapper[4764]: I0309 13:38:25.590523 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-controller-manager-78bc7f9bd9-5xc2s" podStartSLOduration=3.676560931 podStartE2EDuration="17.590498945s" podCreationTimestamp="2026-03-09 13:38:08 +0000 UTC" firstStartedPulling="2026-03-09 13:38:10.586189817 +0000 UTC m=+1045.836361725" lastFinishedPulling="2026-03-09 13:38:24.500127831 +0000 UTC m=+1059.750299739" observedRunningTime="2026-03-09 13:38:25.55501544 +0000 UTC m=+1060.805187358" watchObservedRunningTime="2026-03-09 13:38:25.590498945 +0000 UTC m=+1060.840670853"
Mar 09 13:38:25 crc kubenswrapper[4764]: I0309 13:38:25.604688 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/telemetry-operator-controller-manager-5fdb694969-4cpsz" podStartSLOduration=2.777043832 podStartE2EDuration="16.604662835s" podCreationTimestamp="2026-03-09 13:38:09 +0000 UTC" firstStartedPulling="2026-03-09 13:38:10.67718224 +0000 UTC m=+1045.927354148" lastFinishedPulling="2026-03-09 13:38:24.504801243 +0000 UTC m=+1059.754973151" observedRunningTime="2026-03-09 13:38:25.58071015 +0000 UTC m=+1060.830882058" watchObservedRunningTime="2026-03-09 13:38:25.604662835 +0000 UTC m=+1060.854834743"
Mar 09 13:38:25 crc kubenswrapper[4764]: I0309 13:38:25.662602 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/heat-operator-controller-manager-cf99c678f-jnmbv" podStartSLOduration=3.781753806 podStartE2EDuration="17.662576795s" podCreationTimestamp="2026-03-09 13:38:08 +0000 UTC" firstStartedPulling="2026-03-09 13:38:10.619062145 +0000 UTC m=+1045.869234053" lastFinishedPulling="2026-03-09 13:38:24.499885134 +0000 UTC m=+1059.750057042" observedRunningTime="2026-03-09 13:38:25.656139917 +0000 UTC m=+1060.906311835" watchObservedRunningTime="2026-03-09 13:38:25.662576795 +0000 UTC m=+1060.912748713"
Mar 09 13:38:25 crc kubenswrapper[4764]: I0309 13:38:25.673631 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-5995f4446f-m58s9"]
Mar 09 13:38:25 crc kubenswrapper[4764]: I0309 13:38:25.680964 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/e11f44d8-58a5-4fc7-b05b-e2e688647d01-webhook-certs\") pod \"openstack-operator-controller-manager-6746d697b-lr6nx\" (UID: \"e11f44d8-58a5-4fc7-b05b-e2e688647d01\") " pod="openstack-operators/openstack-operator-controller-manager-6746d697b-lr6nx"
Mar 09 13:38:25 crc kubenswrapper[4764]: I0309 13:38:25.681053 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e11f44d8-58a5-4fc7-b05b-e2e688647d01-metrics-certs\") pod \"openstack-operator-controller-manager-6746d697b-lr6nx\" (UID: \"e11f44d8-58a5-4fc7-b05b-e2e688647d01\") " pod="openstack-operators/openstack-operator-controller-manager-6746d697b-lr6nx"
Mar 09 13:38:25 crc kubenswrapper[4764]: E0309 13:38:25.681184 4764 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found
Mar 09 13:38:25 crc kubenswrapper[4764]: E0309 13:38:25.681234 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e11f44d8-58a5-4fc7-b05b-e2e688647d01-metrics-certs podName:e11f44d8-58a5-4fc7-b05b-e2e688647d01 nodeName:}" failed. No retries permitted until 2026-03-09 13:38:41.681219901 +0000 UTC m=+1076.931391799 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e11f44d8-58a5-4fc7-b05b-e2e688647d01-metrics-certs") pod "openstack-operator-controller-manager-6746d697b-lr6nx" (UID: "e11f44d8-58a5-4fc7-b05b-e2e688647d01") : secret "metrics-server-cert" not found
Mar 09 13:38:25 crc kubenswrapper[4764]: E0309 13:38:25.682137 4764 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found
Mar 09 13:38:25 crc kubenswrapper[4764]: E0309 13:38:25.682262 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e11f44d8-58a5-4fc7-b05b-e2e688647d01-webhook-certs podName:e11f44d8-58a5-4fc7-b05b-e2e688647d01 nodeName:}" failed. No retries permitted until 2026-03-09 13:38:41.682225637 +0000 UTC m=+1076.932397545 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/e11f44d8-58a5-4fc7-b05b-e2e688647d01-webhook-certs") pod "openstack-operator-controller-manager-6746d697b-lr6nx" (UID: "e11f44d8-58a5-4fc7-b05b-e2e688647d01") : secret "webhook-server-cert" not found
Mar 09 13:38:25 crc kubenswrapper[4764]: I0309 13:38:25.699130 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/designate-operator-controller-manager-5d87c9d997-cmtpc" podStartSLOduration=3.803510523 podStartE2EDuration="17.699106748s" podCreationTimestamp="2026-03-09 13:38:08 +0000 UTC" firstStartedPulling="2026-03-09 13:38:10.603638062 +0000 UTC m=+1045.853809960" lastFinishedPulling="2026-03-09 13:38:24.499234277 +0000 UTC m=+1059.749406185" observedRunningTime="2026-03-09 13:38:25.673231543 +0000 UTC m=+1060.923403461" watchObservedRunningTime="2026-03-09 13:38:25.699106748 +0000 UTC m=+1060.949278666"
Mar 09 13:38:26 crc kubenswrapper[4764]: I0309 13:38:26.274355 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-5995f4446f-m58s9" event={"ID":"bfda7896-83e3-407c-9eb5-74fbc11104f0","Type":"ContainerStarted","Data":"95e2fad6142dad7b1e60999f1058d9a3d85f897812628e3cf85b60a66627ed9f"}
Mar 09 13:38:29 crc kubenswrapper[4764]: I0309 13:38:29.042167 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/designate-operator-controller-manager-5d87c9d997-cmtpc"
Mar 09 13:38:29 crc kubenswrapper[4764]: I0309 13:38:29.174476 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/heat-operator-controller-manager-cf99c678f-jnmbv"
Mar 09 13:38:29 crc kubenswrapper[4764]: I0309 13:38:29.209406 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-78bc7f9bd9-5xc2s"
Mar 09 13:38:29 crc kubenswrapper[4764]: I0309 13:38:29.272313 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-545456dc4-hvpbz"
Mar 09 13:38:29 crc kubenswrapper[4764]: I0309 13:38:29.419762 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/manila-operator-controller-manager-7c7bcbc569-qhpvs"
Mar 09 13:38:29 crc kubenswrapper[4764]: I0309 13:38:29.661725 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/neutron-operator-controller-manager-54688575f-cgv66"
Mar 09 13:38:29 crc kubenswrapper[4764]: I0309 13:38:29.662880 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/nova-operator-controller-manager-74b6b5dc96-dm7rn"
Mar 09 13:38:29 crc kubenswrapper[4764]: I0309 13:38:29.784405 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/telemetry-operator-controller-manager-5fdb694969-4cpsz"
Mar 09 13:38:29 crc kubenswrapper[4764]: I0309 13:38:29.883920 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-55b5ff4dbb-d65xp"
Mar 09 13:38:29 crc kubenswrapper[4764]: I0309 13:38:29.908734 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ovn-operator-controller-manager-75684d597f-jfgzw"
Mar 09 13:38:35 crc kubenswrapper[4764]: I0309 13:38:35.381271 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-5995f4446f-m58s9" event={"ID":"bfda7896-83e3-407c-9eb5-74fbc11104f0","Type":"ContainerStarted","Data":"974135c1a92e45c27a888b5eaf4c65975b81ae4d8749dd46f10b5bbf6ed4fa9b"}
Mar 09 13:38:35 crc kubenswrapper[4764]: I0309 13:38:35.382091 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-5995f4446f-m58s9"
Mar 09 13:38:35 crc kubenswrapper[4764]: I0309 13:38:35.383135 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-7b6bfb6475-vkns5" event={"ID":"da851ddd-2b27-45f0-b149-de32ae21ad91","Type":"ContainerStarted","Data":"7b0e39b49694b7eca18a30e1ed0d1cc40b8b2bde68d1b29cec6e28254f36d633"}
Mar 09 13:38:35 crc kubenswrapper[4764]: I0309 13:38:35.383321 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-7b6bfb6475-vkns5"
Mar 09 13:38:35 crc kubenswrapper[4764]: I0309 13:38:35.385216 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-9b9ff9f4d-bf8w8" event={"ID":"003210d3-5572-44bd-aae5-d5e24aac16a5","Type":"ContainerStarted","Data":"1794d27705384d6e9f2c74f66cae2eb2e036ea2c06eaec91e618c5c2ed9d195f"}
Mar 09 13:38:35 crc kubenswrapper[4764]: I0309 13:38:35.385513 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-9b9ff9f4d-bf8w8"
Mar 09 13:38:35 crc kubenswrapper[4764]: I0309 13:38:35.387060 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-bccc79885-7f8nr" event={"ID":"f705ec78-e960-4200-b5a6-f3d4310f1bd5","Type":"ContainerStarted","Data":"14d155d97f6558b33920df0a1f0a2fae6e95d124a4be794d77fceb435430053f"}
Mar 09 13:38:35 crc kubenswrapper[4764]: I0309 13:38:35.387982 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-bccc79885-7f8nr"
Mar 09 13:38:35 crc kubenswrapper[4764]: I0309 13:38:35.389743 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-6v2sq" event={"ID":"01ea99aa-eb21-4799-9557-42c3fb55945a","Type":"ContainerStarted","Data":"5fa33257a5e5048798aca7a136a63ea6e6bfb6b1ba9b5a0a5be574c0d80f2544"}
Mar 09 13:38:35 crc kubenswrapper[4764]: I0309 13:38:35.391981 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-5d86c7ddb7-gv2sm" event={"ID":"b54e2237-603a-44ad-a129-04736cf749b2","Type":"ContainerStarted","Data":"fd38715546733b42ed8bcf4f24f786f982a201fe3ddb7aa977cdd779f8d33212"}
Mar 09 13:38:35 crc kubenswrapper[4764]: I0309 13:38:35.392480 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-5d86c7ddb7-gv2sm"
Mar 09 13:38:35 crc kubenswrapper[4764]: I0309 13:38:35.393778 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-648564c9fc-8ms5w" event={"ID":"c44e76b2-0de9-4a5b-93ee-536c6300157f","Type":"ContainerStarted","Data":"70296fbb3ad4f6ca38bff2fddbcb5d04ddb04dbeaadb8abe0f16a454b63045e1"}
Mar 09 13:38:35 crc kubenswrapper[4764]: I0309 13:38:35.394184 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/placement-operator-controller-manager-648564c9fc-8ms5w"
Mar 09 13:38:35 crc kubenswrapper[4764]: I0309 13:38:35.441336 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-5995f4446f-m58s9" podStartSLOduration=18.939661329 podStartE2EDuration="27.441224907s" podCreationTimestamp="2026-03-09 13:38:08 +0000 UTC" firstStartedPulling="2026-03-09 13:38:25.688174192 +0000 UTC m=+1060.938346100" lastFinishedPulling="2026-03-09 13:38:34.18973777 +0000 UTC m=+1069.439909678" observedRunningTime="2026-03-09 13:38:35.410969998 +0000 UTC m=+1070.661141906" watchObservedRunningTime="2026-03-09 13:38:35.441224907 +0000 UTC m=+1070.691396825"
Mar 09 13:38:35 crc kubenswrapper[4764]: I0309 13:38:35.455282 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-controller-manager-bccc79885-7f8nr" podStartSLOduration=3.334425669 podStartE2EDuration="26.455249453s" podCreationTimestamp="2026-03-09 13:38:09 +0000 UTC" firstStartedPulling="2026-03-09 13:38:11.068812474 +0000 UTC m=+1046.318984372" lastFinishedPulling="2026-03-09 13:38:34.189636258 +0000 UTC m=+1069.439808156" observedRunningTime="2026-03-09 13:38:35.439072941 +0000 UTC m=+1070.689244859" watchObservedRunningTime="2026-03-09 13:38:35.455249453 +0000 UTC m=+1070.705421371"
Mar 09 13:38:35 crc kubenswrapper[4764]: I0309 13:38:35.467199 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-6v2sq" podStartSLOduration=3.208447443 podStartE2EDuration="26.467156563s" podCreationTimestamp="2026-03-09 13:38:09 +0000 UTC" firstStartedPulling="2026-03-09 13:38:11.045030573 +0000 UTC m=+1046.295202481" lastFinishedPulling="2026-03-09 13:38:34.303739693 +0000 UTC m=+1069.553911601" observedRunningTime="2026-03-09 13:38:35.466384253 +0000 UTC m=+1070.716556171" watchObservedRunningTime="2026-03-09 13:38:35.467156563 +0000 UTC m=+1070.717328471"
Mar 09 13:38:35 crc kubenswrapper[4764]: I0309 13:38:35.504637 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/octavia-operator-controller-manager-5d86c7ddb7-gv2sm" podStartSLOduration=3.793507241 podStartE2EDuration="26.50460328s" podCreationTimestamp="2026-03-09 13:38:09 +0000 UTC" firstStartedPulling="2026-03-09 13:38:11.054512111 +0000 UTC m=+1046.304684019" lastFinishedPulling="2026-03-09 13:38:33.76560815 +0000 UTC m=+1069.015780058" observedRunningTime="2026-03-09 13:38:35.490977604 +0000 UTC m=+1070.741149522" watchObservedRunningTime="2026-03-09 13:38:35.50460328 +0000 UTC m=+1070.754775188"
Mar 09 13:38:35 crc kubenswrapper[4764]: I0309 13:38:35.522019 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-controller-manager-9b9ff9f4d-bf8w8" podStartSLOduration=3.363851555 podStartE2EDuration="26.521999713s" podCreationTimestamp="2026-03-09 13:38:09 +0000 UTC" firstStartedPulling="2026-03-09 13:38:11.032339662 +0000 UTC m=+1046.282511570" lastFinishedPulling="2026-03-09 13:38:34.19048782 +0000 UTC m=+1069.440659728" observedRunningTime="2026-03-09 13:38:35.516182252 +0000 UTC m=+1070.766354160" watchObservedRunningTime="2026-03-09 13:38:35.521999713 +0000 UTC m=+1070.772171621"
Mar 09 13:38:35 crc kubenswrapper[4764]: I0309 13:38:35.542972 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/placement-operator-controller-manager-648564c9fc-8ms5w" podStartSLOduration=3.384121034 podStartE2EDuration="26.54295669s" podCreationTimestamp="2026-03-09 13:38:09 +0000 UTC" firstStartedPulling="2026-03-09 13:38:11.035370171 +0000 UTC m=+1046.285542079" lastFinishedPulling="2026-03-09 13:38:34.194205827 +0000 UTC m=+1069.444377735" observedRunningTime="2026-03-09 13:38:35.537569489 +0000 UTC m=+1070.787741397" watchObservedRunningTime="2026-03-09 13:38:35.54295669 +0000 UTC m=+1070.793128598"
Mar 09 13:38:35 crc kubenswrapper[4764]: I0309 13:38:35.568103 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-7b6bfb6475-vkns5" podStartSLOduration=4.256144527 podStartE2EDuration="27.568070895s" podCreationTimestamp="2026-03-09 13:38:08 +0000 UTC" firstStartedPulling="2026-03-09 13:38:10.877362061 +0000 UTC m=+1046.127533969" lastFinishedPulling="2026-03-09 13:38:34.189288429 +0000 UTC m=+1069.439460337" observedRunningTime="2026-03-09 13:38:35.561285828 +0000 UTC m=+1070.811457736" watchObservedRunningTime="2026-03-09 13:38:35.568070895 +0000 UTC m=+1070.818242803"
Mar 09 13:38:39 crc kubenswrapper[4764]: I0309 13:38:39.681051 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-7b6bfb6475-vkns5"
Mar 09 13:38:39 crc kubenswrapper[4764]: I0309 13:38:39.864795 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-5d86c7ddb7-gv2sm"
Mar 09 13:38:39 crc kubenswrapper[4764]: I0309 13:38:39.978801 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/placement-operator-controller-manager-648564c9fc-8ms5w"
Mar 09 13:38:40 crc kubenswrapper[4764]: I0309 13:38:40.003562 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-9b9ff9f4d-bf8w8"
Mar 09 13:38:40 crc kubenswrapper[4764]: I0309 13:38:40.224539 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-controller-manager-bccc79885-7f8nr"
Mar 09 13:38:41 crc kubenswrapper[4764]: I0309 13:38:41.387265 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/47bd7072-a414-4ce8-800b-753b7054be23-cert\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9c94wvm\" (UID: \"47bd7072-a414-4ce8-800b-753b7054be23\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9c94wvm"
Mar 09 13:38:41 crc kubenswrapper[4764]: I0309 13:38:41.394950 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/47bd7072-a414-4ce8-800b-753b7054be23-cert\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9c94wvm\" (UID: \"47bd7072-a414-4ce8-800b-753b7054be23\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9c94wvm"
Mar 09 13:38:41 crc kubenswrapper[4764]: I0309 13:38:41.444385 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9c94wvm"
Mar 09 13:38:41 crc kubenswrapper[4764]: I0309 13:38:41.691907 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/e11f44d8-58a5-4fc7-b05b-e2e688647d01-webhook-certs\") pod \"openstack-operator-controller-manager-6746d697b-lr6nx\" (UID: \"e11f44d8-58a5-4fc7-b05b-e2e688647d01\") " pod="openstack-operators/openstack-operator-controller-manager-6746d697b-lr6nx"
Mar 09 13:38:41 crc kubenswrapper[4764]: I0309 13:38:41.692271 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e11f44d8-58a5-4fc7-b05b-e2e688647d01-metrics-certs\") pod \"openstack-operator-controller-manager-6746d697b-lr6nx\" (UID: \"e11f44d8-58a5-4fc7-b05b-e2e688647d01\") " pod="openstack-operators/openstack-operator-controller-manager-6746d697b-lr6nx"
Mar 09 13:38:41 crc kubenswrapper[4764]: I0309 13:38:41.697105 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e11f44d8-58a5-4fc7-b05b-e2e688647d01-metrics-certs\") pod \"openstack-operator-controller-manager-6746d697b-lr6nx\" (UID: \"e11f44d8-58a5-4fc7-b05b-e2e688647d01\") " pod="openstack-operators/openstack-operator-controller-manager-6746d697b-lr6nx"
Mar 09 13:38:41 crc kubenswrapper[4764]: I0309 13:38:41.697243 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/e11f44d8-58a5-4fc7-b05b-e2e688647d01-webhook-certs\") pod \"openstack-operator-controller-manager-6746d697b-lr6nx\" (UID: \"e11f44d8-58a5-4fc7-b05b-e2e688647d01\") " pod="openstack-operators/openstack-operator-controller-manager-6746d697b-lr6nx"
Mar 09 13:38:41 crc kubenswrapper[4764]: I0309 13:38:41.867941 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9c94wvm"]
Mar 09 13:38:41 crc kubenswrapper[4764]: W0309 13:38:41.870992 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod47bd7072_a414_4ce8_800b_753b7054be23.slice/crio-f2d2c63ac396a240739b7e93660470611bb4046555c2f2ed2dcc9f8c30051d83 WatchSource:0}: Error finding container f2d2c63ac396a240739b7e93660470611bb4046555c2f2ed2dcc9f8c30051d83: Status 404 returned error can't find the container with id f2d2c63ac396a240739b7e93660470611bb4046555c2f2ed2dcc9f8c30051d83
Mar 09 13:38:41 crc kubenswrapper[4764]: I0309 13:38:41.899479 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-6746d697b-lr6nx"
Mar 09 13:38:42 crc kubenswrapper[4764]: I0309 13:38:42.328959 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-6746d697b-lr6nx"]
Mar 09 13:38:42 crc kubenswrapper[4764]: W0309 13:38:42.347803 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode11f44d8_58a5_4fc7_b05b_e2e688647d01.slice/crio-9c79be0815ed12162432ecdd81215490c6a51b7f8a732124654ee50eaf6e69fc WatchSource:0}: Error finding container 9c79be0815ed12162432ecdd81215490c6a51b7f8a732124654ee50eaf6e69fc: Status 404 returned error can't find the container with id 9c79be0815ed12162432ecdd81215490c6a51b7f8a732124654ee50eaf6e69fc
Mar 09 13:38:42 crc kubenswrapper[4764]: I0309 13:38:42.448627 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9c94wvm" event={"ID":"47bd7072-a414-4ce8-800b-753b7054be23","Type":"ContainerStarted","Data":"f2d2c63ac396a240739b7e93660470611bb4046555c2f2ed2dcc9f8c30051d83"}
Mar 09 13:38:42 crc kubenswrapper[4764]: I0309 13:38:42.450231 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-6746d697b-lr6nx" event={"ID":"e11f44d8-58a5-4fc7-b05b-e2e688647d01","Type":"ContainerStarted","Data":"9c79be0815ed12162432ecdd81215490c6a51b7f8a732124654ee50eaf6e69fc"}
Mar 09 13:38:42 crc kubenswrapper[4764]: I0309 13:38:42.452034 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-7c789f89c6-wv2rp" event={"ID":"32eb5815-c566-4177-8b47-f756807d4a30","Type":"ContainerStarted","Data":"ec192b101ade6b100e61556ec097b32f471e12e987e05ceaccafb3bb50ae0870"}
Mar 09 13:38:42 crc kubenswrapper[4764]: I0309 13:38:42.452246 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-7c789f89c6-wv2rp"
Mar 09 13:38:42 crc kubenswrapper[4764]: I0309 13:38:42.476779 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-7c789f89c6-wv2rp" podStartSLOduration=2.899313482 podStartE2EDuration="34.476759901s" podCreationTimestamp="2026-03-09 13:38:08 +0000 UTC" firstStartedPulling="2026-03-09 13:38:10.695247031 +0000 UTC m=+1045.945418939" lastFinishedPulling="2026-03-09 13:38:42.27269344 +0000 UTC m=+1077.522865358" observedRunningTime="2026-03-09 13:38:42.471055133 +0000 UTC m=+1077.721227051" watchObservedRunningTime="2026-03-09 13:38:42.476759901 +0000 UTC m=+1077.726931809"
Mar 09 13:38:43 crc kubenswrapper[4764]: I0309 13:38:43.462974 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-64db6967f8-mjf6m" event={"ID":"488ff419-d889-4778-96cf-a11006c49507","Type":"ContainerStarted","Data":"97a74b6ae0cb2a171a7bd5d9948e05701136fa8d27cb2a79d7d1897aa7e69e4f"}
Mar 09 13:38:43 crc kubenswrapper[4764]: I0309 13:38:43.463396 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-64db6967f8-mjf6m"
Mar 09 13:38:43 crc kubenswrapper[4764]: I0309 13:38:43.464905 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-6db6876945-82cg8" event={"ID":"e220a3f1-4dbe-4ee6-9b19-26985fa998cf","Type":"ContainerStarted","Data":"2bcd7da1fd872402812c12b48979d7c3ee03de5d7e45c47c01e79c5313334463"}
Mar 09 13:38:43 crc kubenswrapper[4764]: I0309 13:38:43.465351 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-6db6876945-82cg8"
Mar 09 13:38:43 crc kubenswrapper[4764]: I0309 13:38:43.468151 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-6746d697b-lr6nx" event={"ID":"e11f44d8-58a5-4fc7-b05b-e2e688647d01","Type":"ContainerStarted","Data":"fef489b0b95c2cd0698c79035bece550540b4ca1ec0dbd718ea9cb6e1b8bf684"}
Mar 09 13:38:43 crc kubenswrapper[4764]: I0309 13:38:43.468495 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-6746d697b-lr6nx"
Mar 09 13:38:43 crc kubenswrapper[4764]: I0309 13:38:43.469985 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-55d77d7b5c-nppjq" event={"ID":"4c271ca0-0c25-46d1-b730-e94f68397e29","Type":"ContainerStarted","Data":"c3d0b6dfa252d650ad458c943102b7c5083ddce28606fe88f69dfee63b60abd4"}
Mar 09 13:38:43 crc kubenswrapper[4764]: I0309 13:38:43.470310 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/cinder-operator-controller-manager-55d77d7b5c-nppjq"
Mar 09 13:38:43 crc kubenswrapper[4764]: I0309 13:38:43.483338 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-controller-manager-64db6967f8-mjf6m" podStartSLOduration=3.749033001 podStartE2EDuration="35.48332168s" podCreationTimestamp="2026-03-09 13:38:08 +0000 UTC" firstStartedPulling="2026-03-09 13:38:10.619208648 +0000 UTC m=+1045.869380556" lastFinishedPulling="2026-03-09 13:38:42.353497327 +0000 UTC m=+1077.603669235" observedRunningTime="2026-03-09 13:38:43.476387069 +0000 UTC m=+1078.726558977" watchObservedRunningTime="2026-03-09 13:38:43.48332168 +0000 UTC m=+1078.733493588"
Mar 09 13:38:43 crc kubenswrapper[4764]: I0309 13:38:43.502893 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-controller-manager-6db6876945-82cg8" podStartSLOduration=2.994644228 podStartE2EDuration="35.50287792s" podCreationTimestamp="2026-03-09 13:38:08 +0000 UTC" firstStartedPulling="2026-03-09 13:38:10.064102362 +0000 UTC m=+1045.314274270" lastFinishedPulling="2026-03-09 13:38:42.572336054 +0000 UTC m=+1077.822507962" observedRunningTime="2026-03-09 13:38:43.500377715 +0000 UTC m=+1078.750549623" watchObservedRunningTime="2026-03-09 13:38:43.50287792 +0000 UTC m=+1078.753049818"
Mar 09 13:38:43 crc kubenswrapper[4764]: I0309 13:38:43.539900 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-manager-6746d697b-lr6nx" podStartSLOduration=34.539882035 podStartE2EDuration="34.539882035s" podCreationTimestamp="2026-03-09 13:38:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:38:43.535200943 +0000 UTC m=+1078.785372871" watchObservedRunningTime="2026-03-09 13:38:43.539882035 +0000 UTC m=+1078.790053943"
Mar 09 13:38:43 crc kubenswrapper[4764]: I0309 13:38:43.561133 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/cinder-operator-controller-manager-55d77d7b5c-nppjq" podStartSLOduration=3.16188448 podStartE2EDuration="35.561115819s" podCreationTimestamp="2026-03-09 13:38:08 +0000 UTC" firstStartedPulling="2026-03-09 13:38:10.171798181 +0000 UTC m=+1045.421970079" lastFinishedPulling="2026-03-09 13:38:42.5710295 +0000 UTC m=+1077.821201418" observedRunningTime="2026-03-09 13:38:43.556384536 +0000 UTC m=+1078.806556444" watchObservedRunningTime="2026-03-09 13:38:43.561115819 +0000 UTC m=+1078.811287727"
Mar 09 13:38:44 crc kubenswrapper[4764]: I0309 13:38:44.924890 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-5995f4446f-m58s9"
Mar 09 13:38:45 crc kubenswrapper[4764]: I0309 13:38:45.491613 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9c94wvm" event={"ID":"47bd7072-a414-4ce8-800b-753b7054be23","Type":"ContainerStarted","Data":"debf0a3ca63ffa338695e3592e5d4da09f8382e334d0971aa3fba5cab6bb3a08"}
Mar 09 13:38:45 crc kubenswrapper[4764]: I0309 13:38:45.520947 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9c94wvm" podStartSLOduration=33.953253568 podStartE2EDuration="36.520914508s" podCreationTimestamp="2026-03-09 13:38:09 +0000 UTC" firstStartedPulling="2026-03-09 13:38:41.874147596 +0000 UTC m=+1077.124319514" lastFinishedPulling="2026-03-09 13:38:44.441808546 +0000 UTC m=+1079.691980454" observedRunningTime="2026-03-09 13:38:45.515086136 +0000 UTC m=+1080.765258044" watchObservedRunningTime="2026-03-09 13:38:45.520914508 +0000 UTC m=+1080.771086416"
Mar 09 13:38:46 crc kubenswrapper[4764]: I0309 13:38:46.499612 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9c94wvm"
Mar 09 13:38:48 crc kubenswrapper[4764]: I0309 13:38:48.981034 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-6db6876945-82cg8"
Mar 09 13:38:49 crc kubenswrapper[4764]: I0309 13:38:48.998376 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/cinder-operator-controller-manager-55d77d7b5c-nppjq"
Mar 09 13:38:49 crc kubenswrapper[4764]: I0309 13:38:49.109500 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-controller-manager-64db6967f8-mjf6m"
Mar 09 13:38:49 crc kubenswrapper[4764]: I0309 13:38:49.572564 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-7c789f89c6-wv2rp"
Mar 09 13:38:49 crc kubenswrapper[4764]: I0309 13:38:49.748196 4764 scope.go:117] "RemoveContainer" containerID="492bc9f6bc85937ff3b8aee6d3f29e3e150f1bc426852bf9cdf6e5878286e321"
Mar 09 13:38:51 crc kubenswrapper[4764]: I0309 13:38:51.451626 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9c94wvm"
Mar 09 13:38:51 crc kubenswrapper[4764]: I0309 13:38:51.905701 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-manager-6746d697b-lr6nx"
Mar 09 13:39:10 crc kubenswrapper[4764]: I0309 13:39:10.017161 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-ggxj5"]
Mar 09 13:39:10 crc kubenswrapper[4764]: I0309 13:39:10.019712 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-ggxj5"
Mar 09 13:39:10 crc kubenswrapper[4764]: I0309 13:39:10.022028 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns"
Mar 09 13:39:10 crc kubenswrapper[4764]: I0309 13:39:10.024380 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt"
Mar 09 13:39:10 crc kubenswrapper[4764]: I0309 13:39:10.024861 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-x8stb"
Mar 09 13:39:10 crc kubenswrapper[4764]: I0309 13:39:10.029509 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt"
Mar 09 13:39:10 crc kubenswrapper[4764]: I0309 13:39:10.033941 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-ggxj5"]
Mar 09 13:39:10 crc kubenswrapper[4764]: I0309 13:39:10.144198 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/79abb676-0322-4a75-90f5-743c942073b4-config\") pod \"dnsmasq-dns-675f4bcbfc-ggxj5\" (UID: \"79abb676-0322-4a75-90f5-743c942073b4\") " pod="openstack/dnsmasq-dns-675f4bcbfc-ggxj5"
Mar 09 13:39:10 crc kubenswrapper[4764]: I0309 13:39:10.144269 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wzz2f\" (UniqueName: \"kubernetes.io/projected/79abb676-0322-4a75-90f5-743c942073b4-kube-api-access-wzz2f\") pod \"dnsmasq-dns-675f4bcbfc-ggxj5\" (UID: \"79abb676-0322-4a75-90f5-743c942073b4\") " pod="openstack/dnsmasq-dns-675f4bcbfc-ggxj5"
Mar 09 13:39:10 crc kubenswrapper[4764]: I0309 13:39:10.164218 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-fcc4m"]
Mar 09 13:39:10 crc kubenswrapper[4764]: I0309 13:39:10.166078 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-fcc4m"
Mar 09 13:39:10 crc kubenswrapper[4764]: I0309 13:39:10.169211 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc"
Mar 09 13:39:10 crc kubenswrapper[4764]: I0309 13:39:10.187826 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-fcc4m"]
Mar 09 13:39:10 crc kubenswrapper[4764]: I0309 13:39:10.245734 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/79abb676-0322-4a75-90f5-743c942073b4-config\") pod \"dnsmasq-dns-675f4bcbfc-ggxj5\" (UID: \"79abb676-0322-4a75-90f5-743c942073b4\") " pod="openstack/dnsmasq-dns-675f4bcbfc-ggxj5"
Mar 09 13:39:10 crc kubenswrapper[4764]: I0309 13:39:10.246595 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6bf8d6e3-823d-4f28-b5dd-e4df591df8bd-config\") pod \"dnsmasq-dns-78dd6ddcc-fcc4m\" (UID: \"6bf8d6e3-823d-4f28-b5dd-e4df591df8bd\") " pod="openstack/dnsmasq-dns-78dd6ddcc-fcc4m"
Mar 09 13:39:10 crc kubenswrapper[4764]: I0309 13:39:10.246531 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/79abb676-0322-4a75-90f5-743c942073b4-config\") pod \"dnsmasq-dns-675f4bcbfc-ggxj5\" (UID: \"79abb676-0322-4a75-90f5-743c942073b4\") " pod="openstack/dnsmasq-dns-675f4bcbfc-ggxj5"
Mar 09 13:39:10 crc kubenswrapper[4764]: I0309 13:39:10.246673 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wzz2f\" (UniqueName: \"kubernetes.io/projected/79abb676-0322-4a75-90f5-743c942073b4-kube-api-access-wzz2f\") pod \"dnsmasq-dns-675f4bcbfc-ggxj5\" (UID: \"79abb676-0322-4a75-90f5-743c942073b4\") " pod="openstack/dnsmasq-dns-675f4bcbfc-ggxj5"
Mar 09 13:39:10 crc kubenswrapper[4764]: I0309 13:39:10.246768 4764
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6bf8d6e3-823d-4f28-b5dd-e4df591df8bd-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-fcc4m\" (UID: \"6bf8d6e3-823d-4f28-b5dd-e4df591df8bd\") " pod="openstack/dnsmasq-dns-78dd6ddcc-fcc4m" Mar 09 13:39:10 crc kubenswrapper[4764]: I0309 13:39:10.246812 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8cbdt\" (UniqueName: \"kubernetes.io/projected/6bf8d6e3-823d-4f28-b5dd-e4df591df8bd-kube-api-access-8cbdt\") pod \"dnsmasq-dns-78dd6ddcc-fcc4m\" (UID: \"6bf8d6e3-823d-4f28-b5dd-e4df591df8bd\") " pod="openstack/dnsmasq-dns-78dd6ddcc-fcc4m" Mar 09 13:39:10 crc kubenswrapper[4764]: I0309 13:39:10.272324 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wzz2f\" (UniqueName: \"kubernetes.io/projected/79abb676-0322-4a75-90f5-743c942073b4-kube-api-access-wzz2f\") pod \"dnsmasq-dns-675f4bcbfc-ggxj5\" (UID: \"79abb676-0322-4a75-90f5-743c942073b4\") " pod="openstack/dnsmasq-dns-675f4bcbfc-ggxj5" Mar 09 13:39:10 crc kubenswrapper[4764]: I0309 13:39:10.340137 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-ggxj5" Mar 09 13:39:10 crc kubenswrapper[4764]: I0309 13:39:10.349263 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6bf8d6e3-823d-4f28-b5dd-e4df591df8bd-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-fcc4m\" (UID: \"6bf8d6e3-823d-4f28-b5dd-e4df591df8bd\") " pod="openstack/dnsmasq-dns-78dd6ddcc-fcc4m" Mar 09 13:39:10 crc kubenswrapper[4764]: I0309 13:39:10.349310 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8cbdt\" (UniqueName: \"kubernetes.io/projected/6bf8d6e3-823d-4f28-b5dd-e4df591df8bd-kube-api-access-8cbdt\") pod \"dnsmasq-dns-78dd6ddcc-fcc4m\" (UID: \"6bf8d6e3-823d-4f28-b5dd-e4df591df8bd\") " pod="openstack/dnsmasq-dns-78dd6ddcc-fcc4m" Mar 09 13:39:10 crc kubenswrapper[4764]: I0309 13:39:10.349376 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6bf8d6e3-823d-4f28-b5dd-e4df591df8bd-config\") pod \"dnsmasq-dns-78dd6ddcc-fcc4m\" (UID: \"6bf8d6e3-823d-4f28-b5dd-e4df591df8bd\") " pod="openstack/dnsmasq-dns-78dd6ddcc-fcc4m" Mar 09 13:39:10 crc kubenswrapper[4764]: I0309 13:39:10.350268 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6bf8d6e3-823d-4f28-b5dd-e4df591df8bd-config\") pod \"dnsmasq-dns-78dd6ddcc-fcc4m\" (UID: \"6bf8d6e3-823d-4f28-b5dd-e4df591df8bd\") " pod="openstack/dnsmasq-dns-78dd6ddcc-fcc4m" Mar 09 13:39:10 crc kubenswrapper[4764]: I0309 13:39:10.350373 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6bf8d6e3-823d-4f28-b5dd-e4df591df8bd-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-fcc4m\" (UID: \"6bf8d6e3-823d-4f28-b5dd-e4df591df8bd\") " pod="openstack/dnsmasq-dns-78dd6ddcc-fcc4m" Mar 09 13:39:10 crc kubenswrapper[4764]: I0309 
13:39:10.376419 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8cbdt\" (UniqueName: \"kubernetes.io/projected/6bf8d6e3-823d-4f28-b5dd-e4df591df8bd-kube-api-access-8cbdt\") pod \"dnsmasq-dns-78dd6ddcc-fcc4m\" (UID: \"6bf8d6e3-823d-4f28-b5dd-e4df591df8bd\") " pod="openstack/dnsmasq-dns-78dd6ddcc-fcc4m" Mar 09 13:39:10 crc kubenswrapper[4764]: I0309 13:39:10.487335 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-fcc4m" Mar 09 13:39:10 crc kubenswrapper[4764]: I0309 13:39:10.830376 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-ggxj5"] Mar 09 13:39:10 crc kubenswrapper[4764]: I0309 13:39:10.898681 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-fcc4m"] Mar 09 13:39:10 crc kubenswrapper[4764]: W0309 13:39:10.903222 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6bf8d6e3_823d_4f28_b5dd_e4df591df8bd.slice/crio-790e722a1a82b199681b86b0375d2d82ee82392b6ce4b20994ef141a06eb2c50 WatchSource:0}: Error finding container 790e722a1a82b199681b86b0375d2d82ee82392b6ce4b20994ef141a06eb2c50: Status 404 returned error can't find the container with id 790e722a1a82b199681b86b0375d2d82ee82392b6ce4b20994ef141a06eb2c50 Mar 09 13:39:11 crc kubenswrapper[4764]: I0309 13:39:11.737255 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-ggxj5" event={"ID":"79abb676-0322-4a75-90f5-743c942073b4","Type":"ContainerStarted","Data":"689139240ebd2f6bf1f73c798b5e7965a7b610107fbcfeb6b865f8c4180b122b"} Mar 09 13:39:11 crc kubenswrapper[4764]: I0309 13:39:11.739188 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-fcc4m" 
event={"ID":"6bf8d6e3-823d-4f28-b5dd-e4df591df8bd","Type":"ContainerStarted","Data":"790e722a1a82b199681b86b0375d2d82ee82392b6ce4b20994ef141a06eb2c50"} Mar 09 13:39:12 crc kubenswrapper[4764]: I0309 13:39:12.991278 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-ggxj5"] Mar 09 13:39:13 crc kubenswrapper[4764]: I0309 13:39:13.012751 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-nc2v8"] Mar 09 13:39:13 crc kubenswrapper[4764]: I0309 13:39:13.014497 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5ccc8479f9-nc2v8" Mar 09 13:39:13 crc kubenswrapper[4764]: I0309 13:39:13.025671 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-nc2v8"] Mar 09 13:39:13 crc kubenswrapper[4764]: I0309 13:39:13.042001 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/85351658-0136-4066-b39e-808260c4dae9-config\") pod \"dnsmasq-dns-5ccc8479f9-nc2v8\" (UID: \"85351658-0136-4066-b39e-808260c4dae9\") " pod="openstack/dnsmasq-dns-5ccc8479f9-nc2v8" Mar 09 13:39:13 crc kubenswrapper[4764]: I0309 13:39:13.042068 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/85351658-0136-4066-b39e-808260c4dae9-dns-svc\") pod \"dnsmasq-dns-5ccc8479f9-nc2v8\" (UID: \"85351658-0136-4066-b39e-808260c4dae9\") " pod="openstack/dnsmasq-dns-5ccc8479f9-nc2v8" Mar 09 13:39:13 crc kubenswrapper[4764]: I0309 13:39:13.042236 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ltsrn\" (UniqueName: \"kubernetes.io/projected/85351658-0136-4066-b39e-808260c4dae9-kube-api-access-ltsrn\") pod \"dnsmasq-dns-5ccc8479f9-nc2v8\" (UID: \"85351658-0136-4066-b39e-808260c4dae9\") " 
pod="openstack/dnsmasq-dns-5ccc8479f9-nc2v8" Mar 09 13:39:13 crc kubenswrapper[4764]: I0309 13:39:13.143910 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/85351658-0136-4066-b39e-808260c4dae9-config\") pod \"dnsmasq-dns-5ccc8479f9-nc2v8\" (UID: \"85351658-0136-4066-b39e-808260c4dae9\") " pod="openstack/dnsmasq-dns-5ccc8479f9-nc2v8" Mar 09 13:39:13 crc kubenswrapper[4764]: I0309 13:39:13.143975 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/85351658-0136-4066-b39e-808260c4dae9-dns-svc\") pod \"dnsmasq-dns-5ccc8479f9-nc2v8\" (UID: \"85351658-0136-4066-b39e-808260c4dae9\") " pod="openstack/dnsmasq-dns-5ccc8479f9-nc2v8" Mar 09 13:39:13 crc kubenswrapper[4764]: I0309 13:39:13.144009 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ltsrn\" (UniqueName: \"kubernetes.io/projected/85351658-0136-4066-b39e-808260c4dae9-kube-api-access-ltsrn\") pod \"dnsmasq-dns-5ccc8479f9-nc2v8\" (UID: \"85351658-0136-4066-b39e-808260c4dae9\") " pod="openstack/dnsmasq-dns-5ccc8479f9-nc2v8" Mar 09 13:39:13 crc kubenswrapper[4764]: I0309 13:39:13.145481 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/85351658-0136-4066-b39e-808260c4dae9-config\") pod \"dnsmasq-dns-5ccc8479f9-nc2v8\" (UID: \"85351658-0136-4066-b39e-808260c4dae9\") " pod="openstack/dnsmasq-dns-5ccc8479f9-nc2v8" Mar 09 13:39:13 crc kubenswrapper[4764]: I0309 13:39:13.146030 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/85351658-0136-4066-b39e-808260c4dae9-dns-svc\") pod \"dnsmasq-dns-5ccc8479f9-nc2v8\" (UID: \"85351658-0136-4066-b39e-808260c4dae9\") " pod="openstack/dnsmasq-dns-5ccc8479f9-nc2v8" Mar 09 13:39:13 crc kubenswrapper[4764]: I0309 13:39:13.183393 4764 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ltsrn\" (UniqueName: \"kubernetes.io/projected/85351658-0136-4066-b39e-808260c4dae9-kube-api-access-ltsrn\") pod \"dnsmasq-dns-5ccc8479f9-nc2v8\" (UID: \"85351658-0136-4066-b39e-808260c4dae9\") " pod="openstack/dnsmasq-dns-5ccc8479f9-nc2v8" Mar 09 13:39:13 crc kubenswrapper[4764]: I0309 13:39:13.337842 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-fcc4m"] Mar 09 13:39:13 crc kubenswrapper[4764]: I0309 13:39:13.349135 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5ccc8479f9-nc2v8" Mar 09 13:39:13 crc kubenswrapper[4764]: I0309 13:39:13.375443 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-cwgfl"] Mar 09 13:39:13 crc kubenswrapper[4764]: I0309 13:39:13.377209 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-cwgfl" Mar 09 13:39:13 crc kubenswrapper[4764]: I0309 13:39:13.394719 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-cwgfl"] Mar 09 13:39:13 crc kubenswrapper[4764]: I0309 13:39:13.452411 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7dtxl\" (UniqueName: \"kubernetes.io/projected/04d22384-e765-4ac8-9afa-7a31f4c347b2-kube-api-access-7dtxl\") pod \"dnsmasq-dns-57d769cc4f-cwgfl\" (UID: \"04d22384-e765-4ac8-9afa-7a31f4c347b2\") " pod="openstack/dnsmasq-dns-57d769cc4f-cwgfl" Mar 09 13:39:13 crc kubenswrapper[4764]: I0309 13:39:13.452494 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/04d22384-e765-4ac8-9afa-7a31f4c347b2-config\") pod \"dnsmasq-dns-57d769cc4f-cwgfl\" (UID: \"04d22384-e765-4ac8-9afa-7a31f4c347b2\") " pod="openstack/dnsmasq-dns-57d769cc4f-cwgfl" 
Mar 09 13:39:13 crc kubenswrapper[4764]: I0309 13:39:13.452571 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/04d22384-e765-4ac8-9afa-7a31f4c347b2-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-cwgfl\" (UID: \"04d22384-e765-4ac8-9afa-7a31f4c347b2\") " pod="openstack/dnsmasq-dns-57d769cc4f-cwgfl" Mar 09 13:39:13 crc kubenswrapper[4764]: I0309 13:39:13.555193 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7dtxl\" (UniqueName: \"kubernetes.io/projected/04d22384-e765-4ac8-9afa-7a31f4c347b2-kube-api-access-7dtxl\") pod \"dnsmasq-dns-57d769cc4f-cwgfl\" (UID: \"04d22384-e765-4ac8-9afa-7a31f4c347b2\") " pod="openstack/dnsmasq-dns-57d769cc4f-cwgfl" Mar 09 13:39:13 crc kubenswrapper[4764]: I0309 13:39:13.555269 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/04d22384-e765-4ac8-9afa-7a31f4c347b2-config\") pod \"dnsmasq-dns-57d769cc4f-cwgfl\" (UID: \"04d22384-e765-4ac8-9afa-7a31f4c347b2\") " pod="openstack/dnsmasq-dns-57d769cc4f-cwgfl" Mar 09 13:39:13 crc kubenswrapper[4764]: I0309 13:39:13.555347 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/04d22384-e765-4ac8-9afa-7a31f4c347b2-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-cwgfl\" (UID: \"04d22384-e765-4ac8-9afa-7a31f4c347b2\") " pod="openstack/dnsmasq-dns-57d769cc4f-cwgfl" Mar 09 13:39:13 crc kubenswrapper[4764]: I0309 13:39:13.556409 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/04d22384-e765-4ac8-9afa-7a31f4c347b2-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-cwgfl\" (UID: \"04d22384-e765-4ac8-9afa-7a31f4c347b2\") " pod="openstack/dnsmasq-dns-57d769cc4f-cwgfl" Mar 09 13:39:13 crc kubenswrapper[4764]: I0309 13:39:13.557282 4764 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/04d22384-e765-4ac8-9afa-7a31f4c347b2-config\") pod \"dnsmasq-dns-57d769cc4f-cwgfl\" (UID: \"04d22384-e765-4ac8-9afa-7a31f4c347b2\") " pod="openstack/dnsmasq-dns-57d769cc4f-cwgfl" Mar 09 13:39:13 crc kubenswrapper[4764]: I0309 13:39:13.575902 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7dtxl\" (UniqueName: \"kubernetes.io/projected/04d22384-e765-4ac8-9afa-7a31f4c347b2-kube-api-access-7dtxl\") pod \"dnsmasq-dns-57d769cc4f-cwgfl\" (UID: \"04d22384-e765-4ac8-9afa-7a31f4c347b2\") " pod="openstack/dnsmasq-dns-57d769cc4f-cwgfl" Mar 09 13:39:13 crc kubenswrapper[4764]: I0309 13:39:13.783030 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-cwgfl" Mar 09 13:39:13 crc kubenswrapper[4764]: I0309 13:39:13.813040 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-nc2v8"] Mar 09 13:39:13 crc kubenswrapper[4764]: W0309 13:39:13.819915 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod85351658_0136_4066_b39e_808260c4dae9.slice/crio-cf38f50edd1cd803b2485a43d9bb0cf84c48deb7190f61ca1abae1471ed84bf4 WatchSource:0}: Error finding container cf38f50edd1cd803b2485a43d9bb0cf84c48deb7190f61ca1abae1471ed84bf4: Status 404 returned error can't find the container with id cf38f50edd1cd803b2485a43d9bb0cf84c48deb7190f61ca1abae1471ed84bf4 Mar 09 13:39:14 crc kubenswrapper[4764]: I0309 13:39:14.120432 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-cwgfl"] Mar 09 13:39:14 crc kubenswrapper[4764]: W0309 13:39:14.124607 4764 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod04d22384_e765_4ac8_9afa_7a31f4c347b2.slice/crio-e9eba66ff4604aeb6e4bde7029daea515e55ea3fe70bfdcf16702550cc3774f9 WatchSource:0}: Error finding container e9eba66ff4604aeb6e4bde7029daea515e55ea3fe70bfdcf16702550cc3774f9: Status 404 returned error can't find the container with id e9eba66ff4604aeb6e4bde7029daea515e55ea3fe70bfdcf16702550cc3774f9 Mar 09 13:39:14 crc kubenswrapper[4764]: I0309 13:39:14.168753 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 09 13:39:14 crc kubenswrapper[4764]: I0309 13:39:14.170255 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 09 13:39:14 crc kubenswrapper[4764]: I0309 13:39:14.173269 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Mar 09 13:39:14 crc kubenswrapper[4764]: I0309 13:39:14.173491 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Mar 09 13:39:14 crc kubenswrapper[4764]: I0309 13:39:14.173550 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Mar 09 13:39:14 crc kubenswrapper[4764]: I0309 13:39:14.174673 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-6m67z" Mar 09 13:39:14 crc kubenswrapper[4764]: I0309 13:39:14.175877 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Mar 09 13:39:14 crc kubenswrapper[4764]: I0309 13:39:14.177926 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Mar 09 13:39:14 crc kubenswrapper[4764]: I0309 13:39:14.180754 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Mar 09 13:39:14 crc 
kubenswrapper[4764]: I0309 13:39:14.187672 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 09 13:39:14 crc kubenswrapper[4764]: I0309 13:39:14.375227 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/11c8bf9f-a031-4e56-b1d7-49b407eabaf7-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"11c8bf9f-a031-4e56-b1d7-49b407eabaf7\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 13:39:14 crc kubenswrapper[4764]: I0309 13:39:14.375335 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/11c8bf9f-a031-4e56-b1d7-49b407eabaf7-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"11c8bf9f-a031-4e56-b1d7-49b407eabaf7\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 13:39:14 crc kubenswrapper[4764]: I0309 13:39:14.375399 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/11c8bf9f-a031-4e56-b1d7-49b407eabaf7-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"11c8bf9f-a031-4e56-b1d7-49b407eabaf7\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 13:39:14 crc kubenswrapper[4764]: I0309 13:39:14.375424 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/11c8bf9f-a031-4e56-b1d7-49b407eabaf7-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"11c8bf9f-a031-4e56-b1d7-49b407eabaf7\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 13:39:14 crc kubenswrapper[4764]: I0309 13:39:14.375463 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod 
\"rabbitmq-cell1-server-0\" (UID: \"11c8bf9f-a031-4e56-b1d7-49b407eabaf7\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 13:39:14 crc kubenswrapper[4764]: I0309 13:39:14.375485 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/11c8bf9f-a031-4e56-b1d7-49b407eabaf7-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"11c8bf9f-a031-4e56-b1d7-49b407eabaf7\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 13:39:14 crc kubenswrapper[4764]: I0309 13:39:14.375512 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/11c8bf9f-a031-4e56-b1d7-49b407eabaf7-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"11c8bf9f-a031-4e56-b1d7-49b407eabaf7\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 13:39:14 crc kubenswrapper[4764]: I0309 13:39:14.375548 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/11c8bf9f-a031-4e56-b1d7-49b407eabaf7-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"11c8bf9f-a031-4e56-b1d7-49b407eabaf7\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 13:39:14 crc kubenswrapper[4764]: I0309 13:39:14.375582 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/11c8bf9f-a031-4e56-b1d7-49b407eabaf7-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"11c8bf9f-a031-4e56-b1d7-49b407eabaf7\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 13:39:14 crc kubenswrapper[4764]: I0309 13:39:14.375900 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/11c8bf9f-a031-4e56-b1d7-49b407eabaf7-config-data\") pod 
\"rabbitmq-cell1-server-0\" (UID: \"11c8bf9f-a031-4e56-b1d7-49b407eabaf7\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 13:39:14 crc kubenswrapper[4764]: I0309 13:39:14.376037 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4bzh8\" (UniqueName: \"kubernetes.io/projected/11c8bf9f-a031-4e56-b1d7-49b407eabaf7-kube-api-access-4bzh8\") pod \"rabbitmq-cell1-server-0\" (UID: \"11c8bf9f-a031-4e56-b1d7-49b407eabaf7\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 13:39:14 crc kubenswrapper[4764]: I0309 13:39:14.477450 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/11c8bf9f-a031-4e56-b1d7-49b407eabaf7-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"11c8bf9f-a031-4e56-b1d7-49b407eabaf7\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 13:39:14 crc kubenswrapper[4764]: I0309 13:39:14.477552 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/11c8bf9f-a031-4e56-b1d7-49b407eabaf7-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"11c8bf9f-a031-4e56-b1d7-49b407eabaf7\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 13:39:14 crc kubenswrapper[4764]: I0309 13:39:14.477577 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/11c8bf9f-a031-4e56-b1d7-49b407eabaf7-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"11c8bf9f-a031-4e56-b1d7-49b407eabaf7\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 13:39:14 crc kubenswrapper[4764]: I0309 13:39:14.477619 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"11c8bf9f-a031-4e56-b1d7-49b407eabaf7\") " 
pod="openstack/rabbitmq-cell1-server-0" Mar 09 13:39:14 crc kubenswrapper[4764]: I0309 13:39:14.477655 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/11c8bf9f-a031-4e56-b1d7-49b407eabaf7-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"11c8bf9f-a031-4e56-b1d7-49b407eabaf7\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 13:39:14 crc kubenswrapper[4764]: I0309 13:39:14.477682 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/11c8bf9f-a031-4e56-b1d7-49b407eabaf7-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"11c8bf9f-a031-4e56-b1d7-49b407eabaf7\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 13:39:14 crc kubenswrapper[4764]: I0309 13:39:14.477714 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/11c8bf9f-a031-4e56-b1d7-49b407eabaf7-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"11c8bf9f-a031-4e56-b1d7-49b407eabaf7\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 13:39:14 crc kubenswrapper[4764]: I0309 13:39:14.477760 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/11c8bf9f-a031-4e56-b1d7-49b407eabaf7-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"11c8bf9f-a031-4e56-b1d7-49b407eabaf7\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 13:39:14 crc kubenswrapper[4764]: I0309 13:39:14.477804 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/11c8bf9f-a031-4e56-b1d7-49b407eabaf7-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"11c8bf9f-a031-4e56-b1d7-49b407eabaf7\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 13:39:14 crc kubenswrapper[4764]: I0309 13:39:14.477844 4764 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4bzh8\" (UniqueName: \"kubernetes.io/projected/11c8bf9f-a031-4e56-b1d7-49b407eabaf7-kube-api-access-4bzh8\") pod \"rabbitmq-cell1-server-0\" (UID: \"11c8bf9f-a031-4e56-b1d7-49b407eabaf7\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 13:39:14 crc kubenswrapper[4764]: I0309 13:39:14.477905 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/11c8bf9f-a031-4e56-b1d7-49b407eabaf7-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"11c8bf9f-a031-4e56-b1d7-49b407eabaf7\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 13:39:14 crc kubenswrapper[4764]: I0309 13:39:14.478326 4764 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"11c8bf9f-a031-4e56-b1d7-49b407eabaf7\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/rabbitmq-cell1-server-0" Mar 09 13:39:14 crc kubenswrapper[4764]: I0309 13:39:14.479050 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/11c8bf9f-a031-4e56-b1d7-49b407eabaf7-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"11c8bf9f-a031-4e56-b1d7-49b407eabaf7\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 13:39:14 crc kubenswrapper[4764]: I0309 13:39:14.479506 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/11c8bf9f-a031-4e56-b1d7-49b407eabaf7-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"11c8bf9f-a031-4e56-b1d7-49b407eabaf7\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 13:39:14 crc kubenswrapper[4764]: I0309 13:39:14.487750 4764 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/11c8bf9f-a031-4e56-b1d7-49b407eabaf7-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"11c8bf9f-a031-4e56-b1d7-49b407eabaf7\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 13:39:14 crc kubenswrapper[4764]: I0309 13:39:14.487964 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/11c8bf9f-a031-4e56-b1d7-49b407eabaf7-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"11c8bf9f-a031-4e56-b1d7-49b407eabaf7\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 13:39:14 crc kubenswrapper[4764]: I0309 13:39:14.488800 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/11c8bf9f-a031-4e56-b1d7-49b407eabaf7-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"11c8bf9f-a031-4e56-b1d7-49b407eabaf7\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 13:39:14 crc kubenswrapper[4764]: I0309 13:39:14.493152 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/11c8bf9f-a031-4e56-b1d7-49b407eabaf7-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"11c8bf9f-a031-4e56-b1d7-49b407eabaf7\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 13:39:14 crc kubenswrapper[4764]: I0309 13:39:14.509025 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4bzh8\" (UniqueName: \"kubernetes.io/projected/11c8bf9f-a031-4e56-b1d7-49b407eabaf7-kube-api-access-4bzh8\") pod \"rabbitmq-cell1-server-0\" (UID: \"11c8bf9f-a031-4e56-b1d7-49b407eabaf7\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 13:39:14 crc kubenswrapper[4764]: I0309 13:39:14.510789 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-cell1-server-0\" 
(UID: \"11c8bf9f-a031-4e56-b1d7-49b407eabaf7\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 13:39:14 crc kubenswrapper[4764]: I0309 13:39:14.513135 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/11c8bf9f-a031-4e56-b1d7-49b407eabaf7-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"11c8bf9f-a031-4e56-b1d7-49b407eabaf7\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 13:39:14 crc kubenswrapper[4764]: I0309 13:39:14.513548 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Mar 09 13:39:14 crc kubenswrapper[4764]: I0309 13:39:14.514444 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/11c8bf9f-a031-4e56-b1d7-49b407eabaf7-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"11c8bf9f-a031-4e56-b1d7-49b407eabaf7\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 13:39:14 crc kubenswrapper[4764]: I0309 13:39:14.515830 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 09 13:39:14 crc kubenswrapper[4764]: I0309 13:39:14.519031 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/11c8bf9f-a031-4e56-b1d7-49b407eabaf7-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"11c8bf9f-a031-4e56-b1d7-49b407eabaf7\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 13:39:14 crc kubenswrapper[4764]: I0309 13:39:14.527196 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Mar 09 13:39:14 crc kubenswrapper[4764]: I0309 13:39:14.527423 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Mar 09 13:39:14 crc kubenswrapper[4764]: I0309 13:39:14.527571 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Mar 09 13:39:14 crc kubenswrapper[4764]: I0309 13:39:14.532583 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Mar 09 13:39:14 crc kubenswrapper[4764]: I0309 13:39:14.532912 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Mar 09 13:39:14 crc kubenswrapper[4764]: I0309 13:39:14.534143 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-8dlbf" Mar 09 13:39:14 crc kubenswrapper[4764]: I0309 13:39:14.534297 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Mar 09 13:39:14 crc kubenswrapper[4764]: I0309 13:39:14.539166 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 09 13:39:14 crc kubenswrapper[4764]: I0309 13:39:14.681820 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: 
\"kubernetes.io/secret/507bcef1-e9ef-4eb1-85ce-358209b944bc-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"507bcef1-e9ef-4eb1-85ce-358209b944bc\") " pod="openstack/rabbitmq-server-0" Mar 09 13:39:14 crc kubenswrapper[4764]: I0309 13:39:14.681895 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/507bcef1-e9ef-4eb1-85ce-358209b944bc-pod-info\") pod \"rabbitmq-server-0\" (UID: \"507bcef1-e9ef-4eb1-85ce-358209b944bc\") " pod="openstack/rabbitmq-server-0" Mar 09 13:39:14 crc kubenswrapper[4764]: I0309 13:39:14.681932 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/507bcef1-e9ef-4eb1-85ce-358209b944bc-config-data\") pod \"rabbitmq-server-0\" (UID: \"507bcef1-e9ef-4eb1-85ce-358209b944bc\") " pod="openstack/rabbitmq-server-0" Mar 09 13:39:14 crc kubenswrapper[4764]: I0309 13:39:14.681966 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/507bcef1-e9ef-4eb1-85ce-358209b944bc-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"507bcef1-e9ef-4eb1-85ce-358209b944bc\") " pod="openstack/rabbitmq-server-0" Mar 09 13:39:14 crc kubenswrapper[4764]: I0309 13:39:14.682241 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/507bcef1-e9ef-4eb1-85ce-358209b944bc-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"507bcef1-e9ef-4eb1-85ce-358209b944bc\") " pod="openstack/rabbitmq-server-0" Mar 09 13:39:14 crc kubenswrapper[4764]: I0309 13:39:14.682335 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2w75z\" (UniqueName: 
\"kubernetes.io/projected/507bcef1-e9ef-4eb1-85ce-358209b944bc-kube-api-access-2w75z\") pod \"rabbitmq-server-0\" (UID: \"507bcef1-e9ef-4eb1-85ce-358209b944bc\") " pod="openstack/rabbitmq-server-0" Mar 09 13:39:14 crc kubenswrapper[4764]: I0309 13:39:14.682507 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/507bcef1-e9ef-4eb1-85ce-358209b944bc-server-conf\") pod \"rabbitmq-server-0\" (UID: \"507bcef1-e9ef-4eb1-85ce-358209b944bc\") " pod="openstack/rabbitmq-server-0" Mar 09 13:39:14 crc kubenswrapper[4764]: I0309 13:39:14.682602 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-server-0\" (UID: \"507bcef1-e9ef-4eb1-85ce-358209b944bc\") " pod="openstack/rabbitmq-server-0" Mar 09 13:39:14 crc kubenswrapper[4764]: I0309 13:39:14.684040 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/507bcef1-e9ef-4eb1-85ce-358209b944bc-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"507bcef1-e9ef-4eb1-85ce-358209b944bc\") " pod="openstack/rabbitmq-server-0" Mar 09 13:39:14 crc kubenswrapper[4764]: I0309 13:39:14.684113 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/507bcef1-e9ef-4eb1-85ce-358209b944bc-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"507bcef1-e9ef-4eb1-85ce-358209b944bc\") " pod="openstack/rabbitmq-server-0" Mar 09 13:39:14 crc kubenswrapper[4764]: I0309 13:39:14.684157 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/507bcef1-e9ef-4eb1-85ce-358209b944bc-rabbitmq-tls\") 
pod \"rabbitmq-server-0\" (UID: \"507bcef1-e9ef-4eb1-85ce-358209b944bc\") " pod="openstack/rabbitmq-server-0" Mar 09 13:39:14 crc kubenswrapper[4764]: I0309 13:39:14.768756 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc8479f9-nc2v8" event={"ID":"85351658-0136-4066-b39e-808260c4dae9","Type":"ContainerStarted","Data":"cf38f50edd1cd803b2485a43d9bb0cf84c48deb7190f61ca1abae1471ed84bf4"} Mar 09 13:39:14 crc kubenswrapper[4764]: I0309 13:39:14.770969 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-cwgfl" event={"ID":"04d22384-e765-4ac8-9afa-7a31f4c347b2","Type":"ContainerStarted","Data":"e9eba66ff4604aeb6e4bde7029daea515e55ea3fe70bfdcf16702550cc3774f9"} Mar 09 13:39:14 crc kubenswrapper[4764]: I0309 13:39:14.785571 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/507bcef1-e9ef-4eb1-85ce-358209b944bc-server-conf\") pod \"rabbitmq-server-0\" (UID: \"507bcef1-e9ef-4eb1-85ce-358209b944bc\") " pod="openstack/rabbitmq-server-0" Mar 09 13:39:14 crc kubenswrapper[4764]: I0309 13:39:14.785663 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-server-0\" (UID: \"507bcef1-e9ef-4eb1-85ce-358209b944bc\") " pod="openstack/rabbitmq-server-0" Mar 09 13:39:14 crc kubenswrapper[4764]: I0309 13:39:14.785748 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/507bcef1-e9ef-4eb1-85ce-358209b944bc-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"507bcef1-e9ef-4eb1-85ce-358209b944bc\") " pod="openstack/rabbitmq-server-0" Mar 09 13:39:14 crc kubenswrapper[4764]: I0309 13:39:14.785780 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" 
(UniqueName: \"kubernetes.io/projected/507bcef1-e9ef-4eb1-85ce-358209b944bc-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"507bcef1-e9ef-4eb1-85ce-358209b944bc\") " pod="openstack/rabbitmq-server-0" Mar 09 13:39:14 crc kubenswrapper[4764]: I0309 13:39:14.785802 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/507bcef1-e9ef-4eb1-85ce-358209b944bc-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"507bcef1-e9ef-4eb1-85ce-358209b944bc\") " pod="openstack/rabbitmq-server-0" Mar 09 13:39:14 crc kubenswrapper[4764]: I0309 13:39:14.785825 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/507bcef1-e9ef-4eb1-85ce-358209b944bc-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"507bcef1-e9ef-4eb1-85ce-358209b944bc\") " pod="openstack/rabbitmq-server-0" Mar 09 13:39:14 crc kubenswrapper[4764]: I0309 13:39:14.785846 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/507bcef1-e9ef-4eb1-85ce-358209b944bc-pod-info\") pod \"rabbitmq-server-0\" (UID: \"507bcef1-e9ef-4eb1-85ce-358209b944bc\") " pod="openstack/rabbitmq-server-0" Mar 09 13:39:14 crc kubenswrapper[4764]: I0309 13:39:14.785872 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/507bcef1-e9ef-4eb1-85ce-358209b944bc-config-data\") pod \"rabbitmq-server-0\" (UID: \"507bcef1-e9ef-4eb1-85ce-358209b944bc\") " pod="openstack/rabbitmq-server-0" Mar 09 13:39:14 crc kubenswrapper[4764]: I0309 13:39:14.785893 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/507bcef1-e9ef-4eb1-85ce-358209b944bc-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"507bcef1-e9ef-4eb1-85ce-358209b944bc\") " 
pod="openstack/rabbitmq-server-0" Mar 09 13:39:14 crc kubenswrapper[4764]: I0309 13:39:14.785911 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/507bcef1-e9ef-4eb1-85ce-358209b944bc-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"507bcef1-e9ef-4eb1-85ce-358209b944bc\") " pod="openstack/rabbitmq-server-0" Mar 09 13:39:14 crc kubenswrapper[4764]: I0309 13:39:14.785934 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2w75z\" (UniqueName: \"kubernetes.io/projected/507bcef1-e9ef-4eb1-85ce-358209b944bc-kube-api-access-2w75z\") pod \"rabbitmq-server-0\" (UID: \"507bcef1-e9ef-4eb1-85ce-358209b944bc\") " pod="openstack/rabbitmq-server-0" Mar 09 13:39:14 crc kubenswrapper[4764]: I0309 13:39:14.786017 4764 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-server-0\" (UID: \"507bcef1-e9ef-4eb1-85ce-358209b944bc\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/rabbitmq-server-0" Mar 09 13:39:14 crc kubenswrapper[4764]: I0309 13:39:14.786267 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/507bcef1-e9ef-4eb1-85ce-358209b944bc-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"507bcef1-e9ef-4eb1-85ce-358209b944bc\") " pod="openstack/rabbitmq-server-0" Mar 09 13:39:14 crc kubenswrapper[4764]: I0309 13:39:14.786563 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/507bcef1-e9ef-4eb1-85ce-358209b944bc-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"507bcef1-e9ef-4eb1-85ce-358209b944bc\") " pod="openstack/rabbitmq-server-0" Mar 09 13:39:14 crc kubenswrapper[4764]: I0309 13:39:14.787637 4764 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/507bcef1-e9ef-4eb1-85ce-358209b944bc-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"507bcef1-e9ef-4eb1-85ce-358209b944bc\") " pod="openstack/rabbitmq-server-0" Mar 09 13:39:14 crc kubenswrapper[4764]: I0309 13:39:14.787825 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/507bcef1-e9ef-4eb1-85ce-358209b944bc-config-data\") pod \"rabbitmq-server-0\" (UID: \"507bcef1-e9ef-4eb1-85ce-358209b944bc\") " pod="openstack/rabbitmq-server-0" Mar 09 13:39:14 crc kubenswrapper[4764]: I0309 13:39:14.788105 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/507bcef1-e9ef-4eb1-85ce-358209b944bc-server-conf\") pod \"rabbitmq-server-0\" (UID: \"507bcef1-e9ef-4eb1-85ce-358209b944bc\") " pod="openstack/rabbitmq-server-0" Mar 09 13:39:14 crc kubenswrapper[4764]: I0309 13:39:14.794932 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/507bcef1-e9ef-4eb1-85ce-358209b944bc-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"507bcef1-e9ef-4eb1-85ce-358209b944bc\") " pod="openstack/rabbitmq-server-0" Mar 09 13:39:14 crc kubenswrapper[4764]: I0309 13:39:14.795759 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/507bcef1-e9ef-4eb1-85ce-358209b944bc-pod-info\") pod \"rabbitmq-server-0\" (UID: \"507bcef1-e9ef-4eb1-85ce-358209b944bc\") " pod="openstack/rabbitmq-server-0" Mar 09 13:39:14 crc kubenswrapper[4764]: I0309 13:39:14.796124 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/507bcef1-e9ef-4eb1-85ce-358209b944bc-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: 
\"507bcef1-e9ef-4eb1-85ce-358209b944bc\") " pod="openstack/rabbitmq-server-0" Mar 09 13:39:14 crc kubenswrapper[4764]: I0309 13:39:14.805441 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2w75z\" (UniqueName: \"kubernetes.io/projected/507bcef1-e9ef-4eb1-85ce-358209b944bc-kube-api-access-2w75z\") pod \"rabbitmq-server-0\" (UID: \"507bcef1-e9ef-4eb1-85ce-358209b944bc\") " pod="openstack/rabbitmq-server-0" Mar 09 13:39:14 crc kubenswrapper[4764]: I0309 13:39:14.821048 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/507bcef1-e9ef-4eb1-85ce-358209b944bc-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"507bcef1-e9ef-4eb1-85ce-358209b944bc\") " pod="openstack/rabbitmq-server-0" Mar 09 13:39:14 crc kubenswrapper[4764]: I0309 13:39:14.827981 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 09 13:39:14 crc kubenswrapper[4764]: I0309 13:39:14.829963 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-server-0\" (UID: \"507bcef1-e9ef-4eb1-85ce-358209b944bc\") " pod="openstack/rabbitmq-server-0" Mar 09 13:39:14 crc kubenswrapper[4764]: I0309 13:39:14.905516 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 09 13:39:15 crc kubenswrapper[4764]: I0309 13:39:15.604238 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"] Mar 09 13:39:15 crc kubenswrapper[4764]: I0309 13:39:15.606451 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Mar 09 13:39:15 crc kubenswrapper[4764]: I0309 13:39:15.615747 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc" Mar 09 13:39:15 crc kubenswrapper[4764]: I0309 13:39:15.616304 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data" Mar 09 13:39:15 crc kubenswrapper[4764]: I0309 13:39:15.616369 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-8dgkg" Mar 09 13:39:15 crc kubenswrapper[4764]: I0309 13:39:15.621488 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Mar 09 13:39:15 crc kubenswrapper[4764]: I0309 13:39:15.627947 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle" Mar 09 13:39:15 crc kubenswrapper[4764]: I0309 13:39:15.629063 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts" Mar 09 13:39:15 crc kubenswrapper[4764]: I0309 13:39:15.711161 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/0c87ed75-4285-4084-bce3-ee8dba7671c0-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"0c87ed75-4285-4084-bce3-ee8dba7671c0\") " pod="openstack/openstack-galera-0" Mar 09 13:39:15 crc kubenswrapper[4764]: I0309 13:39:15.711351 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0c87ed75-4285-4084-bce3-ee8dba7671c0-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"0c87ed75-4285-4084-bce3-ee8dba7671c0\") " pod="openstack/openstack-galera-0" Mar 09 13:39:15 crc kubenswrapper[4764]: I0309 13:39:15.711617 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"openstack-galera-0\" (UID: \"0c87ed75-4285-4084-bce3-ee8dba7671c0\") " pod="openstack/openstack-galera-0" Mar 09 13:39:15 crc kubenswrapper[4764]: I0309 13:39:15.711694 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/0c87ed75-4285-4084-bce3-ee8dba7671c0-config-data-default\") pod \"openstack-galera-0\" (UID: \"0c87ed75-4285-4084-bce3-ee8dba7671c0\") " pod="openstack/openstack-galera-0" Mar 09 13:39:15 crc kubenswrapper[4764]: I0309 13:39:15.711802 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4c57n\" (UniqueName: \"kubernetes.io/projected/0c87ed75-4285-4084-bce3-ee8dba7671c0-kube-api-access-4c57n\") pod \"openstack-galera-0\" (UID: \"0c87ed75-4285-4084-bce3-ee8dba7671c0\") " pod="openstack/openstack-galera-0" Mar 09 13:39:15 crc kubenswrapper[4764]: I0309 13:39:15.712003 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0c87ed75-4285-4084-bce3-ee8dba7671c0-operator-scripts\") pod \"openstack-galera-0\" (UID: \"0c87ed75-4285-4084-bce3-ee8dba7671c0\") " pod="openstack/openstack-galera-0" Mar 09 13:39:15 crc kubenswrapper[4764]: I0309 13:39:15.712070 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/0c87ed75-4285-4084-bce3-ee8dba7671c0-config-data-generated\") pod \"openstack-galera-0\" (UID: \"0c87ed75-4285-4084-bce3-ee8dba7671c0\") " pod="openstack/openstack-galera-0" Mar 09 13:39:15 crc kubenswrapper[4764]: I0309 13:39:15.712102 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: 
\"kubernetes.io/configmap/0c87ed75-4285-4084-bce3-ee8dba7671c0-kolla-config\") pod \"openstack-galera-0\" (UID: \"0c87ed75-4285-4084-bce3-ee8dba7671c0\") " pod="openstack/openstack-galera-0" Mar 09 13:39:15 crc kubenswrapper[4764]: I0309 13:39:15.815531 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4c57n\" (UniqueName: \"kubernetes.io/projected/0c87ed75-4285-4084-bce3-ee8dba7671c0-kube-api-access-4c57n\") pod \"openstack-galera-0\" (UID: \"0c87ed75-4285-4084-bce3-ee8dba7671c0\") " pod="openstack/openstack-galera-0" Mar 09 13:39:15 crc kubenswrapper[4764]: I0309 13:39:15.815626 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0c87ed75-4285-4084-bce3-ee8dba7671c0-operator-scripts\") pod \"openstack-galera-0\" (UID: \"0c87ed75-4285-4084-bce3-ee8dba7671c0\") " pod="openstack/openstack-galera-0" Mar 09 13:39:15 crc kubenswrapper[4764]: I0309 13:39:15.815672 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/0c87ed75-4285-4084-bce3-ee8dba7671c0-config-data-generated\") pod \"openstack-galera-0\" (UID: \"0c87ed75-4285-4084-bce3-ee8dba7671c0\") " pod="openstack/openstack-galera-0" Mar 09 13:39:15 crc kubenswrapper[4764]: I0309 13:39:15.815691 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/0c87ed75-4285-4084-bce3-ee8dba7671c0-kolla-config\") pod \"openstack-galera-0\" (UID: \"0c87ed75-4285-4084-bce3-ee8dba7671c0\") " pod="openstack/openstack-galera-0" Mar 09 13:39:15 crc kubenswrapper[4764]: I0309 13:39:15.815722 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/0c87ed75-4285-4084-bce3-ee8dba7671c0-galera-tls-certs\") pod \"openstack-galera-0\" (UID: 
\"0c87ed75-4285-4084-bce3-ee8dba7671c0\") " pod="openstack/openstack-galera-0" Mar 09 13:39:15 crc kubenswrapper[4764]: I0309 13:39:15.815776 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0c87ed75-4285-4084-bce3-ee8dba7671c0-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"0c87ed75-4285-4084-bce3-ee8dba7671c0\") " pod="openstack/openstack-galera-0" Mar 09 13:39:15 crc kubenswrapper[4764]: I0309 13:39:15.815811 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"openstack-galera-0\" (UID: \"0c87ed75-4285-4084-bce3-ee8dba7671c0\") " pod="openstack/openstack-galera-0" Mar 09 13:39:15 crc kubenswrapper[4764]: I0309 13:39:15.815831 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/0c87ed75-4285-4084-bce3-ee8dba7671c0-config-data-default\") pod \"openstack-galera-0\" (UID: \"0c87ed75-4285-4084-bce3-ee8dba7671c0\") " pod="openstack/openstack-galera-0" Mar 09 13:39:15 crc kubenswrapper[4764]: I0309 13:39:15.817866 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0c87ed75-4285-4084-bce3-ee8dba7671c0-operator-scripts\") pod \"openstack-galera-0\" (UID: \"0c87ed75-4285-4084-bce3-ee8dba7671c0\") " pod="openstack/openstack-galera-0" Mar 09 13:39:15 crc kubenswrapper[4764]: I0309 13:39:15.817961 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/0c87ed75-4285-4084-bce3-ee8dba7671c0-config-data-generated\") pod \"openstack-galera-0\" (UID: \"0c87ed75-4285-4084-bce3-ee8dba7671c0\") " pod="openstack/openstack-galera-0" Mar 09 13:39:15 crc kubenswrapper[4764]: I0309 13:39:15.818011 4764 
operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"openstack-galera-0\" (UID: \"0c87ed75-4285-4084-bce3-ee8dba7671c0\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/openstack-galera-0" Mar 09 13:39:15 crc kubenswrapper[4764]: I0309 13:39:15.818275 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/0c87ed75-4285-4084-bce3-ee8dba7671c0-config-data-default\") pod \"openstack-galera-0\" (UID: \"0c87ed75-4285-4084-bce3-ee8dba7671c0\") " pod="openstack/openstack-galera-0" Mar 09 13:39:15 crc kubenswrapper[4764]: I0309 13:39:15.825222 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/0c87ed75-4285-4084-bce3-ee8dba7671c0-kolla-config\") pod \"openstack-galera-0\" (UID: \"0c87ed75-4285-4084-bce3-ee8dba7671c0\") " pod="openstack/openstack-galera-0" Mar 09 13:39:15 crc kubenswrapper[4764]: I0309 13:39:15.841894 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0c87ed75-4285-4084-bce3-ee8dba7671c0-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"0c87ed75-4285-4084-bce3-ee8dba7671c0\") " pod="openstack/openstack-galera-0" Mar 09 13:39:15 crc kubenswrapper[4764]: I0309 13:39:15.848820 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4c57n\" (UniqueName: \"kubernetes.io/projected/0c87ed75-4285-4084-bce3-ee8dba7671c0-kube-api-access-4c57n\") pod \"openstack-galera-0\" (UID: \"0c87ed75-4285-4084-bce3-ee8dba7671c0\") " pod="openstack/openstack-galera-0" Mar 09 13:39:15 crc kubenswrapper[4764]: I0309 13:39:15.856053 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/0c87ed75-4285-4084-bce3-ee8dba7671c0-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"0c87ed75-4285-4084-bce3-ee8dba7671c0\") " pod="openstack/openstack-galera-0" Mar 09 13:39:15 crc kubenswrapper[4764]: I0309 13:39:15.890394 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"openstack-galera-0\" (UID: \"0c87ed75-4285-4084-bce3-ee8dba7671c0\") " pod="openstack/openstack-galera-0" Mar 09 13:39:15 crc kubenswrapper[4764]: I0309 13:39:15.940911 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Mar 09 13:39:16 crc kubenswrapper[4764]: I0309 13:39:16.956431 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 09 13:39:16 crc kubenswrapper[4764]: I0309 13:39:16.958178 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Mar 09 13:39:16 crc kubenswrapper[4764]: I0309 13:39:16.961928 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts" Mar 09 13:39:16 crc kubenswrapper[4764]: I0309 13:39:16.963098 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data" Mar 09 13:39:16 crc kubenswrapper[4764]: I0309 13:39:16.963442 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc" Mar 09 13:39:16 crc kubenswrapper[4764]: I0309 13:39:16.963601 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-cw24b" Mar 09 13:39:16 crc kubenswrapper[4764]: I0309 13:39:16.972441 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 09 13:39:17 crc kubenswrapper[4764]: I0309 13:39:17.065273 4764 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/103cd40b-aa84-4973-8e47-8a67e5994c80-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"103cd40b-aa84-4973-8e47-8a67e5994c80\") " pod="openstack/openstack-cell1-galera-0" Mar 09 13:39:17 crc kubenswrapper[4764]: I0309 13:39:17.065320 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"openstack-cell1-galera-0\" (UID: \"103cd40b-aa84-4973-8e47-8a67e5994c80\") " pod="openstack/openstack-cell1-galera-0" Mar 09 13:39:17 crc kubenswrapper[4764]: I0309 13:39:17.065344 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/103cd40b-aa84-4973-8e47-8a67e5994c80-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"103cd40b-aa84-4973-8e47-8a67e5994c80\") " pod="openstack/openstack-cell1-galera-0" Mar 09 13:39:17 crc kubenswrapper[4764]: I0309 13:39:17.065383 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/103cd40b-aa84-4973-8e47-8a67e5994c80-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"103cd40b-aa84-4973-8e47-8a67e5994c80\") " pod="openstack/openstack-cell1-galera-0" Mar 09 13:39:17 crc kubenswrapper[4764]: I0309 13:39:17.065411 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/103cd40b-aa84-4973-8e47-8a67e5994c80-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"103cd40b-aa84-4973-8e47-8a67e5994c80\") " pod="openstack/openstack-cell1-galera-0" Mar 09 13:39:17 crc kubenswrapper[4764]: I0309 13:39:17.065445 4764 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/103cd40b-aa84-4973-8e47-8a67e5994c80-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"103cd40b-aa84-4973-8e47-8a67e5994c80\") " pod="openstack/openstack-cell1-galera-0" Mar 09 13:39:17 crc kubenswrapper[4764]: I0309 13:39:17.065487 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/103cd40b-aa84-4973-8e47-8a67e5994c80-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"103cd40b-aa84-4973-8e47-8a67e5994c80\") " pod="openstack/openstack-cell1-galera-0" Mar 09 13:39:17 crc kubenswrapper[4764]: I0309 13:39:17.065511 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-stfnd\" (UniqueName: \"kubernetes.io/projected/103cd40b-aa84-4973-8e47-8a67e5994c80-kube-api-access-stfnd\") pod \"openstack-cell1-galera-0\" (UID: \"103cd40b-aa84-4973-8e47-8a67e5994c80\") " pod="openstack/openstack-cell1-galera-0" Mar 09 13:39:17 crc kubenswrapper[4764]: I0309 13:39:17.168068 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/103cd40b-aa84-4973-8e47-8a67e5994c80-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"103cd40b-aa84-4973-8e47-8a67e5994c80\") " pod="openstack/openstack-cell1-galera-0" Mar 09 13:39:17 crc kubenswrapper[4764]: I0309 13:39:17.168131 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"openstack-cell1-galera-0\" (UID: \"103cd40b-aa84-4973-8e47-8a67e5994c80\") " pod="openstack/openstack-cell1-galera-0" Mar 09 13:39:17 crc kubenswrapper[4764]: I0309 13:39:17.168157 4764 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/103cd40b-aa84-4973-8e47-8a67e5994c80-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"103cd40b-aa84-4973-8e47-8a67e5994c80\") " pod="openstack/openstack-cell1-galera-0" Mar 09 13:39:17 crc kubenswrapper[4764]: I0309 13:39:17.168191 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/103cd40b-aa84-4973-8e47-8a67e5994c80-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"103cd40b-aa84-4973-8e47-8a67e5994c80\") " pod="openstack/openstack-cell1-galera-0" Mar 09 13:39:17 crc kubenswrapper[4764]: I0309 13:39:17.168226 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/103cd40b-aa84-4973-8e47-8a67e5994c80-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"103cd40b-aa84-4973-8e47-8a67e5994c80\") " pod="openstack/openstack-cell1-galera-0" Mar 09 13:39:17 crc kubenswrapper[4764]: I0309 13:39:17.168260 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/103cd40b-aa84-4973-8e47-8a67e5994c80-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"103cd40b-aa84-4973-8e47-8a67e5994c80\") " pod="openstack/openstack-cell1-galera-0" Mar 09 13:39:17 crc kubenswrapper[4764]: I0309 13:39:17.168298 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/103cd40b-aa84-4973-8e47-8a67e5994c80-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"103cd40b-aa84-4973-8e47-8a67e5994c80\") " pod="openstack/openstack-cell1-galera-0" Mar 09 13:39:17 crc kubenswrapper[4764]: I0309 13:39:17.168323 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-stfnd\" (UniqueName: 
\"kubernetes.io/projected/103cd40b-aa84-4973-8e47-8a67e5994c80-kube-api-access-stfnd\") pod \"openstack-cell1-galera-0\" (UID: \"103cd40b-aa84-4973-8e47-8a67e5994c80\") " pod="openstack/openstack-cell1-galera-0" Mar 09 13:39:17 crc kubenswrapper[4764]: I0309 13:39:17.170306 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/103cd40b-aa84-4973-8e47-8a67e5994c80-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"103cd40b-aa84-4973-8e47-8a67e5994c80\") " pod="openstack/openstack-cell1-galera-0" Mar 09 13:39:17 crc kubenswrapper[4764]: I0309 13:39:17.170435 4764 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"openstack-cell1-galera-0\" (UID: \"103cd40b-aa84-4973-8e47-8a67e5994c80\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/openstack-cell1-galera-0" Mar 09 13:39:17 crc kubenswrapper[4764]: I0309 13:39:17.171302 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/103cd40b-aa84-4973-8e47-8a67e5994c80-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"103cd40b-aa84-4973-8e47-8a67e5994c80\") " pod="openstack/openstack-cell1-galera-0" Mar 09 13:39:17 crc kubenswrapper[4764]: I0309 13:39:17.172016 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/103cd40b-aa84-4973-8e47-8a67e5994c80-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"103cd40b-aa84-4973-8e47-8a67e5994c80\") " pod="openstack/openstack-cell1-galera-0" Mar 09 13:39:17 crc kubenswrapper[4764]: I0309 13:39:17.174695 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/103cd40b-aa84-4973-8e47-8a67e5994c80-config-data-default\") 
pod \"openstack-cell1-galera-0\" (UID: \"103cd40b-aa84-4973-8e47-8a67e5994c80\") " pod="openstack/openstack-cell1-galera-0" Mar 09 13:39:17 crc kubenswrapper[4764]: I0309 13:39:17.181393 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/103cd40b-aa84-4973-8e47-8a67e5994c80-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"103cd40b-aa84-4973-8e47-8a67e5994c80\") " pod="openstack/openstack-cell1-galera-0" Mar 09 13:39:17 crc kubenswrapper[4764]: I0309 13:39:17.181515 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/103cd40b-aa84-4973-8e47-8a67e5994c80-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"103cd40b-aa84-4973-8e47-8a67e5994c80\") " pod="openstack/openstack-cell1-galera-0" Mar 09 13:39:17 crc kubenswrapper[4764]: I0309 13:39:17.208342 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"openstack-cell1-galera-0\" (UID: \"103cd40b-aa84-4973-8e47-8a67e5994c80\") " pod="openstack/openstack-cell1-galera-0" Mar 09 13:39:17 crc kubenswrapper[4764]: I0309 13:39:17.211308 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-stfnd\" (UniqueName: \"kubernetes.io/projected/103cd40b-aa84-4973-8e47-8a67e5994c80-kube-api-access-stfnd\") pod \"openstack-cell1-galera-0\" (UID: \"103cd40b-aa84-4973-8e47-8a67e5994c80\") " pod="openstack/openstack-cell1-galera-0" Mar 09 13:39:17 crc kubenswrapper[4764]: I0309 13:39:17.295327 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-cell1-galera-0" Mar 09 13:39:17 crc kubenswrapper[4764]: I0309 13:39:17.413900 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"] Mar 09 13:39:17 crc kubenswrapper[4764]: I0309 13:39:17.415011 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Mar 09 13:39:17 crc kubenswrapper[4764]: I0309 13:39:17.419953 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Mar 09 13:39:17 crc kubenswrapper[4764]: I0309 13:39:17.420239 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-69l42" Mar 09 13:39:17 crc kubenswrapper[4764]: I0309 13:39:17.420415 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc" Mar 09 13:39:17 crc kubenswrapper[4764]: I0309 13:39:17.463772 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Mar 09 13:39:17 crc kubenswrapper[4764]: I0309 13:39:17.576738 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/519ac270-ea24-47c1-b4f3-d94b0add96d1-memcached-tls-certs\") pod \"memcached-0\" (UID: \"519ac270-ea24-47c1-b4f3-d94b0add96d1\") " pod="openstack/memcached-0" Mar 09 13:39:17 crc kubenswrapper[4764]: I0309 13:39:17.576787 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7tpjx\" (UniqueName: \"kubernetes.io/projected/519ac270-ea24-47c1-b4f3-d94b0add96d1-kube-api-access-7tpjx\") pod \"memcached-0\" (UID: \"519ac270-ea24-47c1-b4f3-d94b0add96d1\") " pod="openstack/memcached-0" Mar 09 13:39:17 crc kubenswrapper[4764]: I0309 13:39:17.576844 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: 
\"kubernetes.io/configmap/519ac270-ea24-47c1-b4f3-d94b0add96d1-kolla-config\") pod \"memcached-0\" (UID: \"519ac270-ea24-47c1-b4f3-d94b0add96d1\") " pod="openstack/memcached-0" Mar 09 13:39:17 crc kubenswrapper[4764]: I0309 13:39:17.576935 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/519ac270-ea24-47c1-b4f3-d94b0add96d1-config-data\") pod \"memcached-0\" (UID: \"519ac270-ea24-47c1-b4f3-d94b0add96d1\") " pod="openstack/memcached-0" Mar 09 13:39:17 crc kubenswrapper[4764]: I0309 13:39:17.576966 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/519ac270-ea24-47c1-b4f3-d94b0add96d1-combined-ca-bundle\") pod \"memcached-0\" (UID: \"519ac270-ea24-47c1-b4f3-d94b0add96d1\") " pod="openstack/memcached-0" Mar 09 13:39:17 crc kubenswrapper[4764]: I0309 13:39:17.678519 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/519ac270-ea24-47c1-b4f3-d94b0add96d1-kolla-config\") pod \"memcached-0\" (UID: \"519ac270-ea24-47c1-b4f3-d94b0add96d1\") " pod="openstack/memcached-0" Mar 09 13:39:17 crc kubenswrapper[4764]: I0309 13:39:17.678638 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/519ac270-ea24-47c1-b4f3-d94b0add96d1-config-data\") pod \"memcached-0\" (UID: \"519ac270-ea24-47c1-b4f3-d94b0add96d1\") " pod="openstack/memcached-0" Mar 09 13:39:17 crc kubenswrapper[4764]: I0309 13:39:17.678703 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/519ac270-ea24-47c1-b4f3-d94b0add96d1-combined-ca-bundle\") pod \"memcached-0\" (UID: \"519ac270-ea24-47c1-b4f3-d94b0add96d1\") " pod="openstack/memcached-0" Mar 09 13:39:17 
crc kubenswrapper[4764]: I0309 13:39:17.678777 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/519ac270-ea24-47c1-b4f3-d94b0add96d1-memcached-tls-certs\") pod \"memcached-0\" (UID: \"519ac270-ea24-47c1-b4f3-d94b0add96d1\") " pod="openstack/memcached-0" Mar 09 13:39:17 crc kubenswrapper[4764]: I0309 13:39:17.678804 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7tpjx\" (UniqueName: \"kubernetes.io/projected/519ac270-ea24-47c1-b4f3-d94b0add96d1-kube-api-access-7tpjx\") pod \"memcached-0\" (UID: \"519ac270-ea24-47c1-b4f3-d94b0add96d1\") " pod="openstack/memcached-0" Mar 09 13:39:17 crc kubenswrapper[4764]: I0309 13:39:17.680096 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/519ac270-ea24-47c1-b4f3-d94b0add96d1-config-data\") pod \"memcached-0\" (UID: \"519ac270-ea24-47c1-b4f3-d94b0add96d1\") " pod="openstack/memcached-0" Mar 09 13:39:17 crc kubenswrapper[4764]: I0309 13:39:17.681034 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/519ac270-ea24-47c1-b4f3-d94b0add96d1-kolla-config\") pod \"memcached-0\" (UID: \"519ac270-ea24-47c1-b4f3-d94b0add96d1\") " pod="openstack/memcached-0" Mar 09 13:39:17 crc kubenswrapper[4764]: I0309 13:39:17.682145 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/519ac270-ea24-47c1-b4f3-d94b0add96d1-memcached-tls-certs\") pod \"memcached-0\" (UID: \"519ac270-ea24-47c1-b4f3-d94b0add96d1\") " pod="openstack/memcached-0" Mar 09 13:39:17 crc kubenswrapper[4764]: I0309 13:39:17.688351 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/519ac270-ea24-47c1-b4f3-d94b0add96d1-combined-ca-bundle\") pod \"memcached-0\" (UID: \"519ac270-ea24-47c1-b4f3-d94b0add96d1\") " pod="openstack/memcached-0" Mar 09 13:39:17 crc kubenswrapper[4764]: I0309 13:39:17.701168 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7tpjx\" (UniqueName: \"kubernetes.io/projected/519ac270-ea24-47c1-b4f3-d94b0add96d1-kube-api-access-7tpjx\") pod \"memcached-0\" (UID: \"519ac270-ea24-47c1-b4f3-d94b0add96d1\") " pod="openstack/memcached-0" Mar 09 13:39:17 crc kubenswrapper[4764]: I0309 13:39:17.750468 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Mar 09 13:39:19 crc kubenswrapper[4764]: I0309 13:39:19.747294 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Mar 09 13:39:19 crc kubenswrapper[4764]: I0309 13:39:19.751062 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 09 13:39:19 crc kubenswrapper[4764]: I0309 13:39:19.754525 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-zzhlv" Mar 09 13:39:19 crc kubenswrapper[4764]: I0309 13:39:19.759500 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 09 13:39:19 crc kubenswrapper[4764]: I0309 13:39:19.837481 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gphmn\" (UniqueName: \"kubernetes.io/projected/ce660994-4427-4d54-b83c-9c9ec7f64a9d-kube-api-access-gphmn\") pod \"kube-state-metrics-0\" (UID: \"ce660994-4427-4d54-b83c-9c9ec7f64a9d\") " pod="openstack/kube-state-metrics-0" Mar 09 13:39:19 crc kubenswrapper[4764]: I0309 13:39:19.939546 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gphmn\" (UniqueName: 
\"kubernetes.io/projected/ce660994-4427-4d54-b83c-9c9ec7f64a9d-kube-api-access-gphmn\") pod \"kube-state-metrics-0\" (UID: \"ce660994-4427-4d54-b83c-9c9ec7f64a9d\") " pod="openstack/kube-state-metrics-0" Mar 09 13:39:19 crc kubenswrapper[4764]: I0309 13:39:19.973456 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gphmn\" (UniqueName: \"kubernetes.io/projected/ce660994-4427-4d54-b83c-9c9ec7f64a9d-kube-api-access-gphmn\") pod \"kube-state-metrics-0\" (UID: \"ce660994-4427-4d54-b83c-9c9ec7f64a9d\") " pod="openstack/kube-state-metrics-0" Mar 09 13:39:20 crc kubenswrapper[4764]: I0309 13:39:20.082889 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 09 13:39:23 crc kubenswrapper[4764]: I0309 13:39:23.272094 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-qm7vs"] Mar 09 13:39:23 crc kubenswrapper[4764]: I0309 13:39:23.275702 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-qm7vs" Mar 09 13:39:23 crc kubenswrapper[4764]: I0309 13:39:23.278216 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-dwvcr" Mar 09 13:39:23 crc kubenswrapper[4764]: I0309 13:39:23.278823 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovncontroller-ovndbs" Mar 09 13:39:23 crc kubenswrapper[4764]: I0309 13:39:23.280439 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts" Mar 09 13:39:23 crc kubenswrapper[4764]: I0309 13:39:23.289160 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-2zkzm"] Mar 09 13:39:23 crc kubenswrapper[4764]: I0309 13:39:23.290856 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ovs-2zkzm" Mar 09 13:39:23 crc kubenswrapper[4764]: I0309 13:39:23.306618 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-qm7vs"] Mar 09 13:39:23 crc kubenswrapper[4764]: I0309 13:39:23.316432 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-2zkzm"] Mar 09 13:39:23 crc kubenswrapper[4764]: I0309 13:39:23.417892 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/9bbe03cf-76d5-440a-903f-50c382aa3a4e-var-log-ovn\") pod \"ovn-controller-qm7vs\" (UID: \"9bbe03cf-76d5-440a-903f-50c382aa3a4e\") " pod="openstack/ovn-controller-qm7vs" Mar 09 13:39:23 crc kubenswrapper[4764]: I0309 13:39:23.418085 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/9bbe03cf-76d5-440a-903f-50c382aa3a4e-var-run\") pod \"ovn-controller-qm7vs\" (UID: \"9bbe03cf-76d5-440a-903f-50c382aa3a4e\") " pod="openstack/ovn-controller-qm7vs" Mar 09 13:39:23 crc kubenswrapper[4764]: I0309 13:39:23.418265 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/05f9485e-b683-481d-87d3-fb86ebb4a832-var-lib\") pod \"ovn-controller-ovs-2zkzm\" (UID: \"05f9485e-b683-481d-87d3-fb86ebb4a832\") " pod="openstack/ovn-controller-ovs-2zkzm" Mar 09 13:39:23 crc kubenswrapper[4764]: I0309 13:39:23.418471 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p8vwf\" (UniqueName: \"kubernetes.io/projected/9bbe03cf-76d5-440a-903f-50c382aa3a4e-kube-api-access-p8vwf\") pod \"ovn-controller-qm7vs\" (UID: \"9bbe03cf-76d5-440a-903f-50c382aa3a4e\") " pod="openstack/ovn-controller-qm7vs" Mar 09 13:39:23 crc kubenswrapper[4764]: I0309 
13:39:23.418579 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/05f9485e-b683-481d-87d3-fb86ebb4a832-var-run\") pod \"ovn-controller-ovs-2zkzm\" (UID: \"05f9485e-b683-481d-87d3-fb86ebb4a832\") " pod="openstack/ovn-controller-ovs-2zkzm" Mar 09 13:39:23 crc kubenswrapper[4764]: I0309 13:39:23.418621 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/9bbe03cf-76d5-440a-903f-50c382aa3a4e-ovn-controller-tls-certs\") pod \"ovn-controller-qm7vs\" (UID: \"9bbe03cf-76d5-440a-903f-50c382aa3a4e\") " pod="openstack/ovn-controller-qm7vs" Mar 09 13:39:23 crc kubenswrapper[4764]: I0309 13:39:23.418868 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9bbe03cf-76d5-440a-903f-50c382aa3a4e-combined-ca-bundle\") pod \"ovn-controller-qm7vs\" (UID: \"9bbe03cf-76d5-440a-903f-50c382aa3a4e\") " pod="openstack/ovn-controller-qm7vs" Mar 09 13:39:23 crc kubenswrapper[4764]: I0309 13:39:23.418975 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/9bbe03cf-76d5-440a-903f-50c382aa3a4e-var-run-ovn\") pod \"ovn-controller-qm7vs\" (UID: \"9bbe03cf-76d5-440a-903f-50c382aa3a4e\") " pod="openstack/ovn-controller-qm7vs" Mar 09 13:39:23 crc kubenswrapper[4764]: I0309 13:39:23.419339 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9bbe03cf-76d5-440a-903f-50c382aa3a4e-scripts\") pod \"ovn-controller-qm7vs\" (UID: \"9bbe03cf-76d5-440a-903f-50c382aa3a4e\") " pod="openstack/ovn-controller-qm7vs" Mar 09 13:39:23 crc kubenswrapper[4764]: I0309 13:39:23.419443 4764 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/05f9485e-b683-481d-87d3-fb86ebb4a832-scripts\") pod \"ovn-controller-ovs-2zkzm\" (UID: \"05f9485e-b683-481d-87d3-fb86ebb4a832\") " pod="openstack/ovn-controller-ovs-2zkzm" Mar 09 13:39:23 crc kubenswrapper[4764]: I0309 13:39:23.419528 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/05f9485e-b683-481d-87d3-fb86ebb4a832-var-log\") pod \"ovn-controller-ovs-2zkzm\" (UID: \"05f9485e-b683-481d-87d3-fb86ebb4a832\") " pod="openstack/ovn-controller-ovs-2zkzm" Mar 09 13:39:23 crc kubenswrapper[4764]: I0309 13:39:23.419606 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v89d9\" (UniqueName: \"kubernetes.io/projected/05f9485e-b683-481d-87d3-fb86ebb4a832-kube-api-access-v89d9\") pod \"ovn-controller-ovs-2zkzm\" (UID: \"05f9485e-b683-481d-87d3-fb86ebb4a832\") " pod="openstack/ovn-controller-ovs-2zkzm" Mar 09 13:39:23 crc kubenswrapper[4764]: I0309 13:39:23.419693 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/05f9485e-b683-481d-87d3-fb86ebb4a832-etc-ovs\") pod \"ovn-controller-ovs-2zkzm\" (UID: \"05f9485e-b683-481d-87d3-fb86ebb4a832\") " pod="openstack/ovn-controller-ovs-2zkzm" Mar 09 13:39:23 crc kubenswrapper[4764]: I0309 13:39:23.521346 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/9bbe03cf-76d5-440a-903f-50c382aa3a4e-var-run\") pod \"ovn-controller-qm7vs\" (UID: \"9bbe03cf-76d5-440a-903f-50c382aa3a4e\") " pod="openstack/ovn-controller-qm7vs" Mar 09 13:39:23 crc kubenswrapper[4764]: I0309 13:39:23.521403 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/05f9485e-b683-481d-87d3-fb86ebb4a832-var-lib\") pod \"ovn-controller-ovs-2zkzm\" (UID: \"05f9485e-b683-481d-87d3-fb86ebb4a832\") " pod="openstack/ovn-controller-ovs-2zkzm" Mar 09 13:39:23 crc kubenswrapper[4764]: I0309 13:39:23.521438 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p8vwf\" (UniqueName: \"kubernetes.io/projected/9bbe03cf-76d5-440a-903f-50c382aa3a4e-kube-api-access-p8vwf\") pod \"ovn-controller-qm7vs\" (UID: \"9bbe03cf-76d5-440a-903f-50c382aa3a4e\") " pod="openstack/ovn-controller-qm7vs" Mar 09 13:39:23 crc kubenswrapper[4764]: I0309 13:39:23.521470 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/05f9485e-b683-481d-87d3-fb86ebb4a832-var-run\") pod \"ovn-controller-ovs-2zkzm\" (UID: \"05f9485e-b683-481d-87d3-fb86ebb4a832\") " pod="openstack/ovn-controller-ovs-2zkzm" Mar 09 13:39:23 crc kubenswrapper[4764]: I0309 13:39:23.521492 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/9bbe03cf-76d5-440a-903f-50c382aa3a4e-ovn-controller-tls-certs\") pod \"ovn-controller-qm7vs\" (UID: \"9bbe03cf-76d5-440a-903f-50c382aa3a4e\") " pod="openstack/ovn-controller-qm7vs" Mar 09 13:39:23 crc kubenswrapper[4764]: I0309 13:39:23.521519 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9bbe03cf-76d5-440a-903f-50c382aa3a4e-combined-ca-bundle\") pod \"ovn-controller-qm7vs\" (UID: \"9bbe03cf-76d5-440a-903f-50c382aa3a4e\") " pod="openstack/ovn-controller-qm7vs" Mar 09 13:39:23 crc kubenswrapper[4764]: I0309 13:39:23.521549 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: 
\"kubernetes.io/host-path/9bbe03cf-76d5-440a-903f-50c382aa3a4e-var-run-ovn\") pod \"ovn-controller-qm7vs\" (UID: \"9bbe03cf-76d5-440a-903f-50c382aa3a4e\") " pod="openstack/ovn-controller-qm7vs" Mar 09 13:39:23 crc kubenswrapper[4764]: I0309 13:39:23.521589 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9bbe03cf-76d5-440a-903f-50c382aa3a4e-scripts\") pod \"ovn-controller-qm7vs\" (UID: \"9bbe03cf-76d5-440a-903f-50c382aa3a4e\") " pod="openstack/ovn-controller-qm7vs" Mar 09 13:39:23 crc kubenswrapper[4764]: I0309 13:39:23.521616 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/05f9485e-b683-481d-87d3-fb86ebb4a832-scripts\") pod \"ovn-controller-ovs-2zkzm\" (UID: \"05f9485e-b683-481d-87d3-fb86ebb4a832\") " pod="openstack/ovn-controller-ovs-2zkzm" Mar 09 13:39:23 crc kubenswrapper[4764]: I0309 13:39:23.521669 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/05f9485e-b683-481d-87d3-fb86ebb4a832-var-log\") pod \"ovn-controller-ovs-2zkzm\" (UID: \"05f9485e-b683-481d-87d3-fb86ebb4a832\") " pod="openstack/ovn-controller-ovs-2zkzm" Mar 09 13:39:23 crc kubenswrapper[4764]: I0309 13:39:23.521696 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v89d9\" (UniqueName: \"kubernetes.io/projected/05f9485e-b683-481d-87d3-fb86ebb4a832-kube-api-access-v89d9\") pod \"ovn-controller-ovs-2zkzm\" (UID: \"05f9485e-b683-481d-87d3-fb86ebb4a832\") " pod="openstack/ovn-controller-ovs-2zkzm" Mar 09 13:39:23 crc kubenswrapper[4764]: I0309 13:39:23.521727 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/05f9485e-b683-481d-87d3-fb86ebb4a832-etc-ovs\") pod \"ovn-controller-ovs-2zkzm\" (UID: 
\"05f9485e-b683-481d-87d3-fb86ebb4a832\") " pod="openstack/ovn-controller-ovs-2zkzm" Mar 09 13:39:23 crc kubenswrapper[4764]: I0309 13:39:23.521774 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/9bbe03cf-76d5-440a-903f-50c382aa3a4e-var-log-ovn\") pod \"ovn-controller-qm7vs\" (UID: \"9bbe03cf-76d5-440a-903f-50c382aa3a4e\") " pod="openstack/ovn-controller-qm7vs" Mar 09 13:39:23 crc kubenswrapper[4764]: I0309 13:39:23.521980 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/9bbe03cf-76d5-440a-903f-50c382aa3a4e-var-run\") pod \"ovn-controller-qm7vs\" (UID: \"9bbe03cf-76d5-440a-903f-50c382aa3a4e\") " pod="openstack/ovn-controller-qm7vs" Mar 09 13:39:23 crc kubenswrapper[4764]: I0309 13:39:23.522018 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/9bbe03cf-76d5-440a-903f-50c382aa3a4e-var-log-ovn\") pod \"ovn-controller-qm7vs\" (UID: \"9bbe03cf-76d5-440a-903f-50c382aa3a4e\") " pod="openstack/ovn-controller-qm7vs" Mar 09 13:39:23 crc kubenswrapper[4764]: I0309 13:39:23.522126 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/05f9485e-b683-481d-87d3-fb86ebb4a832-var-lib\") pod \"ovn-controller-ovs-2zkzm\" (UID: \"05f9485e-b683-481d-87d3-fb86ebb4a832\") " pod="openstack/ovn-controller-ovs-2zkzm" Mar 09 13:39:23 crc kubenswrapper[4764]: I0309 13:39:23.522145 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/05f9485e-b683-481d-87d3-fb86ebb4a832-var-log\") pod \"ovn-controller-ovs-2zkzm\" (UID: \"05f9485e-b683-481d-87d3-fb86ebb4a832\") " pod="openstack/ovn-controller-ovs-2zkzm" Mar 09 13:39:23 crc kubenswrapper[4764]: I0309 13:39:23.522198 4764 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/9bbe03cf-76d5-440a-903f-50c382aa3a4e-var-run-ovn\") pod \"ovn-controller-qm7vs\" (UID: \"9bbe03cf-76d5-440a-903f-50c382aa3a4e\") " pod="openstack/ovn-controller-qm7vs" Mar 09 13:39:23 crc kubenswrapper[4764]: I0309 13:39:23.522350 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/05f9485e-b683-481d-87d3-fb86ebb4a832-var-run\") pod \"ovn-controller-ovs-2zkzm\" (UID: \"05f9485e-b683-481d-87d3-fb86ebb4a832\") " pod="openstack/ovn-controller-ovs-2zkzm" Mar 09 13:39:23 crc kubenswrapper[4764]: I0309 13:39:23.522512 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/05f9485e-b683-481d-87d3-fb86ebb4a832-etc-ovs\") pod \"ovn-controller-ovs-2zkzm\" (UID: \"05f9485e-b683-481d-87d3-fb86ebb4a832\") " pod="openstack/ovn-controller-ovs-2zkzm" Mar 09 13:39:23 crc kubenswrapper[4764]: I0309 13:39:23.526906 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9bbe03cf-76d5-440a-903f-50c382aa3a4e-scripts\") pod \"ovn-controller-qm7vs\" (UID: \"9bbe03cf-76d5-440a-903f-50c382aa3a4e\") " pod="openstack/ovn-controller-qm7vs" Mar 09 13:39:23 crc kubenswrapper[4764]: I0309 13:39:23.526953 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/05f9485e-b683-481d-87d3-fb86ebb4a832-scripts\") pod \"ovn-controller-ovs-2zkzm\" (UID: \"05f9485e-b683-481d-87d3-fb86ebb4a832\") " pod="openstack/ovn-controller-ovs-2zkzm" Mar 09 13:39:23 crc kubenswrapper[4764]: I0309 13:39:23.528425 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9bbe03cf-76d5-440a-903f-50c382aa3a4e-combined-ca-bundle\") pod \"ovn-controller-qm7vs\" (UID: 
\"9bbe03cf-76d5-440a-903f-50c382aa3a4e\") " pod="openstack/ovn-controller-qm7vs" Mar 09 13:39:23 crc kubenswrapper[4764]: I0309 13:39:23.528490 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/9bbe03cf-76d5-440a-903f-50c382aa3a4e-ovn-controller-tls-certs\") pod \"ovn-controller-qm7vs\" (UID: \"9bbe03cf-76d5-440a-903f-50c382aa3a4e\") " pod="openstack/ovn-controller-qm7vs" Mar 09 13:39:23 crc kubenswrapper[4764]: I0309 13:39:23.548916 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v89d9\" (UniqueName: \"kubernetes.io/projected/05f9485e-b683-481d-87d3-fb86ebb4a832-kube-api-access-v89d9\") pod \"ovn-controller-ovs-2zkzm\" (UID: \"05f9485e-b683-481d-87d3-fb86ebb4a832\") " pod="openstack/ovn-controller-ovs-2zkzm" Mar 09 13:39:23 crc kubenswrapper[4764]: I0309 13:39:23.553335 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p8vwf\" (UniqueName: \"kubernetes.io/projected/9bbe03cf-76d5-440a-903f-50c382aa3a4e-kube-api-access-p8vwf\") pod \"ovn-controller-qm7vs\" (UID: \"9bbe03cf-76d5-440a-903f-50c382aa3a4e\") " pod="openstack/ovn-controller-qm7vs" Mar 09 13:39:23 crc kubenswrapper[4764]: I0309 13:39:23.609430 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-qm7vs" Mar 09 13:39:23 crc kubenswrapper[4764]: I0309 13:39:23.621269 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-2zkzm" Mar 09 13:39:24 crc kubenswrapper[4764]: I0309 13:39:24.140910 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"] Mar 09 13:39:24 crc kubenswrapper[4764]: I0309 13:39:24.143873 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-0" Mar 09 13:39:24 crc kubenswrapper[4764]: I0309 13:39:24.148435 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs" Mar 09 13:39:24 crc kubenswrapper[4764]: I0309 13:39:24.150316 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-krxnp" Mar 09 13:39:24 crc kubenswrapper[4764]: I0309 13:39:24.150602 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts" Mar 09 13:39:24 crc kubenswrapper[4764]: I0309 13:39:24.150840 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config" Mar 09 13:39:24 crc kubenswrapper[4764]: I0309 13:39:24.151020 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics" Mar 09 13:39:24 crc kubenswrapper[4764]: I0309 13:39:24.154861 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Mar 09 13:39:24 crc kubenswrapper[4764]: I0309 13:39:24.239813 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"ovsdbserver-nb-0\" (UID: \"e54bd06b-1ee2-452d-80fb-12fd4fb61c7b\") " pod="openstack/ovsdbserver-nb-0" Mar 09 13:39:24 crc kubenswrapper[4764]: I0309 13:39:24.239983 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e54bd06b-1ee2-452d-80fb-12fd4fb61c7b-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"e54bd06b-1ee2-452d-80fb-12fd4fb61c7b\") " pod="openstack/ovsdbserver-nb-0" Mar 09 13:39:24 crc kubenswrapper[4764]: I0309 13:39:24.240042 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-lwcwk\" (UniqueName: \"kubernetes.io/projected/e54bd06b-1ee2-452d-80fb-12fd4fb61c7b-kube-api-access-lwcwk\") pod \"ovsdbserver-nb-0\" (UID: \"e54bd06b-1ee2-452d-80fb-12fd4fb61c7b\") " pod="openstack/ovsdbserver-nb-0" Mar 09 13:39:24 crc kubenswrapper[4764]: I0309 13:39:24.240197 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e54bd06b-1ee2-452d-80fb-12fd4fb61c7b-config\") pod \"ovsdbserver-nb-0\" (UID: \"e54bd06b-1ee2-452d-80fb-12fd4fb61c7b\") " pod="openstack/ovsdbserver-nb-0" Mar 09 13:39:24 crc kubenswrapper[4764]: I0309 13:39:24.240241 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/e54bd06b-1ee2-452d-80fb-12fd4fb61c7b-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"e54bd06b-1ee2-452d-80fb-12fd4fb61c7b\") " pod="openstack/ovsdbserver-nb-0" Mar 09 13:39:24 crc kubenswrapper[4764]: I0309 13:39:24.240320 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/e54bd06b-1ee2-452d-80fb-12fd4fb61c7b-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"e54bd06b-1ee2-452d-80fb-12fd4fb61c7b\") " pod="openstack/ovsdbserver-nb-0" Mar 09 13:39:24 crc kubenswrapper[4764]: I0309 13:39:24.240452 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e54bd06b-1ee2-452d-80fb-12fd4fb61c7b-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"e54bd06b-1ee2-452d-80fb-12fd4fb61c7b\") " pod="openstack/ovsdbserver-nb-0" Mar 09 13:39:24 crc kubenswrapper[4764]: I0309 13:39:24.240615 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/e54bd06b-1ee2-452d-80fb-12fd4fb61c7b-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"e54bd06b-1ee2-452d-80fb-12fd4fb61c7b\") " pod="openstack/ovsdbserver-nb-0" Mar 09 13:39:24 crc kubenswrapper[4764]: I0309 13:39:24.342553 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e54bd06b-1ee2-452d-80fb-12fd4fb61c7b-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"e54bd06b-1ee2-452d-80fb-12fd4fb61c7b\") " pod="openstack/ovsdbserver-nb-0" Mar 09 13:39:24 crc kubenswrapper[4764]: I0309 13:39:24.342631 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/e54bd06b-1ee2-452d-80fb-12fd4fb61c7b-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"e54bd06b-1ee2-452d-80fb-12fd4fb61c7b\") " pod="openstack/ovsdbserver-nb-0" Mar 09 13:39:24 crc kubenswrapper[4764]: I0309 13:39:24.343098 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"ovsdbserver-nb-0\" (UID: \"e54bd06b-1ee2-452d-80fb-12fd4fb61c7b\") " pod="openstack/ovsdbserver-nb-0" Mar 09 13:39:24 crc kubenswrapper[4764]: I0309 13:39:24.343156 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e54bd06b-1ee2-452d-80fb-12fd4fb61c7b-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"e54bd06b-1ee2-452d-80fb-12fd4fb61c7b\") " pod="openstack/ovsdbserver-nb-0" Mar 09 13:39:24 crc kubenswrapper[4764]: I0309 13:39:24.343181 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lwcwk\" (UniqueName: \"kubernetes.io/projected/e54bd06b-1ee2-452d-80fb-12fd4fb61c7b-kube-api-access-lwcwk\") pod \"ovsdbserver-nb-0\" (UID: \"e54bd06b-1ee2-452d-80fb-12fd4fb61c7b\") " 
pod="openstack/ovsdbserver-nb-0" Mar 09 13:39:24 crc kubenswrapper[4764]: I0309 13:39:24.343244 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e54bd06b-1ee2-452d-80fb-12fd4fb61c7b-config\") pod \"ovsdbserver-nb-0\" (UID: \"e54bd06b-1ee2-452d-80fb-12fd4fb61c7b\") " pod="openstack/ovsdbserver-nb-0" Mar 09 13:39:24 crc kubenswrapper[4764]: I0309 13:39:24.343298 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/e54bd06b-1ee2-452d-80fb-12fd4fb61c7b-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"e54bd06b-1ee2-452d-80fb-12fd4fb61c7b\") " pod="openstack/ovsdbserver-nb-0" Mar 09 13:39:24 crc kubenswrapper[4764]: I0309 13:39:24.343354 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/e54bd06b-1ee2-452d-80fb-12fd4fb61c7b-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"e54bd06b-1ee2-452d-80fb-12fd4fb61c7b\") " pod="openstack/ovsdbserver-nb-0" Mar 09 13:39:24 crc kubenswrapper[4764]: I0309 13:39:24.343510 4764 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"ovsdbserver-nb-0\" (UID: \"e54bd06b-1ee2-452d-80fb-12fd4fb61c7b\") device mount path \"/mnt/openstack/pv11\"" pod="openstack/ovsdbserver-nb-0" Mar 09 13:39:24 crc kubenswrapper[4764]: I0309 13:39:24.344803 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/e54bd06b-1ee2-452d-80fb-12fd4fb61c7b-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"e54bd06b-1ee2-452d-80fb-12fd4fb61c7b\") " pod="openstack/ovsdbserver-nb-0" Mar 09 13:39:24 crc kubenswrapper[4764]: I0309 13:39:24.344952 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/e54bd06b-1ee2-452d-80fb-12fd4fb61c7b-config\") pod \"ovsdbserver-nb-0\" (UID: \"e54bd06b-1ee2-452d-80fb-12fd4fb61c7b\") " pod="openstack/ovsdbserver-nb-0" Mar 09 13:39:24 crc kubenswrapper[4764]: I0309 13:39:24.346881 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e54bd06b-1ee2-452d-80fb-12fd4fb61c7b-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"e54bd06b-1ee2-452d-80fb-12fd4fb61c7b\") " pod="openstack/ovsdbserver-nb-0" Mar 09 13:39:24 crc kubenswrapper[4764]: I0309 13:39:24.350049 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/e54bd06b-1ee2-452d-80fb-12fd4fb61c7b-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"e54bd06b-1ee2-452d-80fb-12fd4fb61c7b\") " pod="openstack/ovsdbserver-nb-0" Mar 09 13:39:24 crc kubenswrapper[4764]: I0309 13:39:24.350577 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/e54bd06b-1ee2-452d-80fb-12fd4fb61c7b-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"e54bd06b-1ee2-452d-80fb-12fd4fb61c7b\") " pod="openstack/ovsdbserver-nb-0" Mar 09 13:39:24 crc kubenswrapper[4764]: I0309 13:39:24.352639 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e54bd06b-1ee2-452d-80fb-12fd4fb61c7b-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"e54bd06b-1ee2-452d-80fb-12fd4fb61c7b\") " pod="openstack/ovsdbserver-nb-0" Mar 09 13:39:24 crc kubenswrapper[4764]: I0309 13:39:24.365854 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lwcwk\" (UniqueName: \"kubernetes.io/projected/e54bd06b-1ee2-452d-80fb-12fd4fb61c7b-kube-api-access-lwcwk\") pod \"ovsdbserver-nb-0\" (UID: \"e54bd06b-1ee2-452d-80fb-12fd4fb61c7b\") " 
pod="openstack/ovsdbserver-nb-0" Mar 09 13:39:24 crc kubenswrapper[4764]: I0309 13:39:24.370070 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"ovsdbserver-nb-0\" (UID: \"e54bd06b-1ee2-452d-80fb-12fd4fb61c7b\") " pod="openstack/ovsdbserver-nb-0" Mar 09 13:39:24 crc kubenswrapper[4764]: I0309 13:39:24.470845 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Mar 09 13:39:26 crc kubenswrapper[4764]: I0309 13:39:26.334674 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"] Mar 09 13:39:26 crc kubenswrapper[4764]: I0309 13:39:26.337119 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Mar 09 13:39:26 crc kubenswrapper[4764]: I0309 13:39:26.344798 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts" Mar 09 13:39:26 crc kubenswrapper[4764]: I0309 13:39:26.345163 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config" Mar 09 13:39:26 crc kubenswrapper[4764]: I0309 13:39:26.345221 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-gcsfc" Mar 09 13:39:26 crc kubenswrapper[4764]: I0309 13:39:26.345365 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs" Mar 09 13:39:26 crc kubenswrapper[4764]: I0309 13:39:26.356023 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Mar 09 13:39:26 crc kubenswrapper[4764]: I0309 13:39:26.490408 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/047aa387-9e35-4ec6-89a9-3be60e47610b-config\") pod \"ovsdbserver-sb-0\" 
(UID: \"047aa387-9e35-4ec6-89a9-3be60e47610b\") " pod="openstack/ovsdbserver-sb-0" Mar 09 13:39:26 crc kubenswrapper[4764]: I0309 13:39:26.490469 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w7286\" (UniqueName: \"kubernetes.io/projected/047aa387-9e35-4ec6-89a9-3be60e47610b-kube-api-access-w7286\") pod \"ovsdbserver-sb-0\" (UID: \"047aa387-9e35-4ec6-89a9-3be60e47610b\") " pod="openstack/ovsdbserver-sb-0" Mar 09 13:39:26 crc kubenswrapper[4764]: I0309 13:39:26.490498 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/047aa387-9e35-4ec6-89a9-3be60e47610b-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"047aa387-9e35-4ec6-89a9-3be60e47610b\") " pod="openstack/ovsdbserver-sb-0" Mar 09 13:39:26 crc kubenswrapper[4764]: I0309 13:39:26.490554 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/047aa387-9e35-4ec6-89a9-3be60e47610b-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"047aa387-9e35-4ec6-89a9-3be60e47610b\") " pod="openstack/ovsdbserver-sb-0" Mar 09 13:39:26 crc kubenswrapper[4764]: I0309 13:39:26.490576 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"ovsdbserver-sb-0\" (UID: \"047aa387-9e35-4ec6-89a9-3be60e47610b\") " pod="openstack/ovsdbserver-sb-0" Mar 09 13:39:26 crc kubenswrapper[4764]: I0309 13:39:26.490623 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/047aa387-9e35-4ec6-89a9-3be60e47610b-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"047aa387-9e35-4ec6-89a9-3be60e47610b\") " 
pod="openstack/ovsdbserver-sb-0" Mar 09 13:39:26 crc kubenswrapper[4764]: I0309 13:39:26.490722 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/047aa387-9e35-4ec6-89a9-3be60e47610b-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"047aa387-9e35-4ec6-89a9-3be60e47610b\") " pod="openstack/ovsdbserver-sb-0" Mar 09 13:39:26 crc kubenswrapper[4764]: I0309 13:39:26.490782 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/047aa387-9e35-4ec6-89a9-3be60e47610b-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"047aa387-9e35-4ec6-89a9-3be60e47610b\") " pod="openstack/ovsdbserver-sb-0" Mar 09 13:39:26 crc kubenswrapper[4764]: I0309 13:39:26.593913 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/047aa387-9e35-4ec6-89a9-3be60e47610b-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"047aa387-9e35-4ec6-89a9-3be60e47610b\") " pod="openstack/ovsdbserver-sb-0" Mar 09 13:39:26 crc kubenswrapper[4764]: I0309 13:39:26.594050 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/047aa387-9e35-4ec6-89a9-3be60e47610b-config\") pod \"ovsdbserver-sb-0\" (UID: \"047aa387-9e35-4ec6-89a9-3be60e47610b\") " pod="openstack/ovsdbserver-sb-0" Mar 09 13:39:26 crc kubenswrapper[4764]: I0309 13:39:26.594136 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w7286\" (UniqueName: \"kubernetes.io/projected/047aa387-9e35-4ec6-89a9-3be60e47610b-kube-api-access-w7286\") pod \"ovsdbserver-sb-0\" (UID: \"047aa387-9e35-4ec6-89a9-3be60e47610b\") " pod="openstack/ovsdbserver-sb-0" Mar 09 13:39:26 crc kubenswrapper[4764]: I0309 13:39:26.594161 4764 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/047aa387-9e35-4ec6-89a9-3be60e47610b-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"047aa387-9e35-4ec6-89a9-3be60e47610b\") " pod="openstack/ovsdbserver-sb-0" Mar 09 13:39:26 crc kubenswrapper[4764]: I0309 13:39:26.594233 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/047aa387-9e35-4ec6-89a9-3be60e47610b-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"047aa387-9e35-4ec6-89a9-3be60e47610b\") " pod="openstack/ovsdbserver-sb-0" Mar 09 13:39:26 crc kubenswrapper[4764]: I0309 13:39:26.594255 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"ovsdbserver-sb-0\" (UID: \"047aa387-9e35-4ec6-89a9-3be60e47610b\") " pod="openstack/ovsdbserver-sb-0" Mar 09 13:39:26 crc kubenswrapper[4764]: I0309 13:39:26.594330 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/047aa387-9e35-4ec6-89a9-3be60e47610b-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"047aa387-9e35-4ec6-89a9-3be60e47610b\") " pod="openstack/ovsdbserver-sb-0" Mar 09 13:39:26 crc kubenswrapper[4764]: I0309 13:39:26.594372 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/047aa387-9e35-4ec6-89a9-3be60e47610b-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"047aa387-9e35-4ec6-89a9-3be60e47610b\") " pod="openstack/ovsdbserver-sb-0" Mar 09 13:39:26 crc kubenswrapper[4764]: I0309 13:39:26.596883 4764 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"ovsdbserver-sb-0\" (UID: 
\"047aa387-9e35-4ec6-89a9-3be60e47610b\") device mount path \"/mnt/openstack/pv12\"" pod="openstack/ovsdbserver-sb-0" Mar 09 13:39:26 crc kubenswrapper[4764]: I0309 13:39:26.597468 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/047aa387-9e35-4ec6-89a9-3be60e47610b-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"047aa387-9e35-4ec6-89a9-3be60e47610b\") " pod="openstack/ovsdbserver-sb-0" Mar 09 13:39:26 crc kubenswrapper[4764]: I0309 13:39:26.597547 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/047aa387-9e35-4ec6-89a9-3be60e47610b-config\") pod \"ovsdbserver-sb-0\" (UID: \"047aa387-9e35-4ec6-89a9-3be60e47610b\") " pod="openstack/ovsdbserver-sb-0" Mar 09 13:39:26 crc kubenswrapper[4764]: I0309 13:39:26.597840 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/047aa387-9e35-4ec6-89a9-3be60e47610b-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"047aa387-9e35-4ec6-89a9-3be60e47610b\") " pod="openstack/ovsdbserver-sb-0" Mar 09 13:39:26 crc kubenswrapper[4764]: I0309 13:39:26.600435 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/047aa387-9e35-4ec6-89a9-3be60e47610b-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"047aa387-9e35-4ec6-89a9-3be60e47610b\") " pod="openstack/ovsdbserver-sb-0" Mar 09 13:39:26 crc kubenswrapper[4764]: I0309 13:39:26.601085 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/047aa387-9e35-4ec6-89a9-3be60e47610b-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"047aa387-9e35-4ec6-89a9-3be60e47610b\") " pod="openstack/ovsdbserver-sb-0" Mar 09 13:39:26 crc kubenswrapper[4764]: I0309 13:39:26.601803 4764 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/047aa387-9e35-4ec6-89a9-3be60e47610b-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"047aa387-9e35-4ec6-89a9-3be60e47610b\") " pod="openstack/ovsdbserver-sb-0" Mar 09 13:39:26 crc kubenswrapper[4764]: I0309 13:39:26.616231 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w7286\" (UniqueName: \"kubernetes.io/projected/047aa387-9e35-4ec6-89a9-3be60e47610b-kube-api-access-w7286\") pod \"ovsdbserver-sb-0\" (UID: \"047aa387-9e35-4ec6-89a9-3be60e47610b\") " pod="openstack/ovsdbserver-sb-0" Mar 09 13:39:26 crc kubenswrapper[4764]: I0309 13:39:26.631078 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"ovsdbserver-sb-0\" (UID: \"047aa387-9e35-4ec6-89a9-3be60e47610b\") " pod="openstack/ovsdbserver-sb-0" Mar 09 13:39:26 crc kubenswrapper[4764]: I0309 13:39:26.671943 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Mar 09 13:39:29 crc kubenswrapper[4764]: E0309 13:39:29.354366 4764 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Mar 09 13:39:29 crc kubenswrapper[4764]: E0309 13:39:29.354917 4764 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:ndfhb5h667h568h584h5f9h58dh565h664h587h597h577h64bh5c4h66fh647hbdh68ch5c5h68dh686h5f7h64hd7hc6h55fh57bh98h57fh87h5fh57fq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-8cbdt,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityCont
ext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-78dd6ddcc-fcc4m_openstack(6bf8d6e3-823d-4f28-b5dd-e4df591df8bd): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 09 13:39:29 crc kubenswrapper[4764]: E0309 13:39:29.356321 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-78dd6ddcc-fcc4m" podUID="6bf8d6e3-823d-4f28-b5dd-e4df591df8bd" Mar 09 13:39:29 crc kubenswrapper[4764]: E0309 13:39:29.387628 4764 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Mar 09 13:39:29 crc kubenswrapper[4764]: E0309 13:39:29.387855 4764 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nfdh5dfhb6h64h676hc4h78h97h669h54chfbh696hb5h54bh5d4h6bh64h644h677h584h5cbh698h9dh5bbh5f8h5b8hcdh644h5c7h694hbfh589q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-ltsrn,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-5ccc8479f9-nc2v8_openstack(85351658-0136-4066-b39e-808260c4dae9): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 09 13:39:29 crc kubenswrapper[4764]: E0309 13:39:29.388956 4764 pod_workers.go:1301] "Error syncing 
pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-5ccc8479f9-nc2v8" podUID="85351658-0136-4066-b39e-808260c4dae9" Mar 09 13:39:29 crc kubenswrapper[4764]: E0309 13:39:29.423783 4764 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Mar 09 13:39:29 crc kubenswrapper[4764]: E0309 13:39:29.423972 4764 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nffh5bdhf4h5f8h79h55h77h58fh56dh7bh6fh578hbch55dh68h56bhd9h65dh57ch658hc9h566h666h688h58h65dh684h5d7h6ch575h5d6h88q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-wzz2f,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-675f4bcbfc-ggxj5_openstack(79abb676-0322-4a75-90f5-743c942073b4): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 09 13:39:29 crc kubenswrapper[4764]: E0309 13:39:29.425186 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" 
pod="openstack/dnsmasq-dns-675f4bcbfc-ggxj5" podUID="79abb676-0322-4a75-90f5-743c942073b4" Mar 09 13:39:29 crc kubenswrapper[4764]: E0309 13:39:29.452820 4764 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Mar 09 13:39:29 crc kubenswrapper[4764]: E0309 13:39:29.452994 4764 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n659h4h664hbh658h587h67ch89h587h8fh679hc6hf9h55fh644h5d5h698h68dh5cdh5ffh669h54ch9h689hb8hd4h5bfhd8h5d7h5fh665h574q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-7dtxl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPoli
cy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-57d769cc4f-cwgfl_openstack(04d22384-e765-4ac8-9afa-7a31f4c347b2): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError"
Mar 09 13:39:29 crc kubenswrapper[4764]: E0309 13:39:29.454276 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-57d769cc4f-cwgfl" podUID="04d22384-e765-4ac8-9afa-7a31f4c347b2"
Mar 09 13:39:29 crc kubenswrapper[4764]: E0309 13:39:29.949160 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified\\\"\"" pod="openstack/dnsmasq-dns-57d769cc4f-cwgfl" podUID="04d22384-e765-4ac8-9afa-7a31f4c347b2"
Mar 09 13:39:29 crc kubenswrapper[4764]: E0309 13:39:29.949607 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified\\\"\"" pod="openstack/dnsmasq-dns-5ccc8479f9-nc2v8" podUID="85351658-0136-4066-b39e-808260c4dae9"
Mar 09 13:39:30 crc kubenswrapper[4764]: I0309 13:39:30.042437 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Mar 09 13:39:30 crc kubenswrapper[4764]: I0309 13:39:30.262964 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"]
Mar 09 13:39:30 crc kubenswrapper[4764]: I0309 13:39:30.276685 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"]
Mar 09 13:39:30 crc kubenswrapper[4764]: I0309 13:39:30.297785 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"]
Mar 09 13:39:30 crc kubenswrapper[4764]: I0309 13:39:30.337404 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"]
Mar 09 13:39:30 crc kubenswrapper[4764]: I0309 13:39:30.423479 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"]
Mar 09 13:39:30 crc kubenswrapper[4764]: I0309 13:39:30.470516 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-qm7vs"]
Mar 09 13:39:30 crc kubenswrapper[4764]: I0309 13:39:30.471546 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-ggxj5"
Mar 09 13:39:30 crc kubenswrapper[4764]: I0309 13:39:30.485661 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"]
Mar 09 13:39:30 crc kubenswrapper[4764]: I0309 13:39:30.487736 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-fcc4m"
Mar 09 13:39:30 crc kubenswrapper[4764]: I0309 13:39:30.517275 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-2zkzm"]
Mar 09 13:39:30 crc kubenswrapper[4764]: I0309 13:39:30.587955 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6bf8d6e3-823d-4f28-b5dd-e4df591df8bd-config\") pod \"6bf8d6e3-823d-4f28-b5dd-e4df591df8bd\" (UID: \"6bf8d6e3-823d-4f28-b5dd-e4df591df8bd\") "
Mar 09 13:39:30 crc kubenswrapper[4764]: I0309 13:39:30.588523 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8cbdt\" (UniqueName: \"kubernetes.io/projected/6bf8d6e3-823d-4f28-b5dd-e4df591df8bd-kube-api-access-8cbdt\") pod \"6bf8d6e3-823d-4f28-b5dd-e4df591df8bd\" (UID: \"6bf8d6e3-823d-4f28-b5dd-e4df591df8bd\") "
Mar 09 13:39:30 crc kubenswrapper[4764]: I0309 13:39:30.588583 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6bf8d6e3-823d-4f28-b5dd-e4df591df8bd-config" (OuterVolumeSpecName: "config") pod "6bf8d6e3-823d-4f28-b5dd-e4df591df8bd" (UID: "6bf8d6e3-823d-4f28-b5dd-e4df591df8bd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 13:39:30 crc kubenswrapper[4764]: I0309 13:39:30.589812 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6bf8d6e3-823d-4f28-b5dd-e4df591df8bd-dns-svc\") pod \"6bf8d6e3-823d-4f28-b5dd-e4df591df8bd\" (UID: \"6bf8d6e3-823d-4f28-b5dd-e4df591df8bd\") "
Mar 09 13:39:30 crc kubenswrapper[4764]: I0309 13:39:30.589909 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/79abb676-0322-4a75-90f5-743c942073b4-config\") pod \"79abb676-0322-4a75-90f5-743c942073b4\" (UID: \"79abb676-0322-4a75-90f5-743c942073b4\") "
Mar 09 13:39:30 crc kubenswrapper[4764]: I0309 13:39:30.589949 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wzz2f\" (UniqueName: \"kubernetes.io/projected/79abb676-0322-4a75-90f5-743c942073b4-kube-api-access-wzz2f\") pod \"79abb676-0322-4a75-90f5-743c942073b4\" (UID: \"79abb676-0322-4a75-90f5-743c942073b4\") "
Mar 09 13:39:30 crc kubenswrapper[4764]: I0309 13:39:30.590356 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6bf8d6e3-823d-4f28-b5dd-e4df591df8bd-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "6bf8d6e3-823d-4f28-b5dd-e4df591df8bd" (UID: "6bf8d6e3-823d-4f28-b5dd-e4df591df8bd"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 13:39:30 crc kubenswrapper[4764]: I0309 13:39:30.590685 4764 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6bf8d6e3-823d-4f28-b5dd-e4df591df8bd-config\") on node \"crc\" DevicePath \"\""
Mar 09 13:39:30 crc kubenswrapper[4764]: I0309 13:39:30.590705 4764 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6bf8d6e3-823d-4f28-b5dd-e4df591df8bd-dns-svc\") on node \"crc\" DevicePath \"\""
Mar 09 13:39:30 crc kubenswrapper[4764]: I0309 13:39:30.590619 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/79abb676-0322-4a75-90f5-743c942073b4-config" (OuterVolumeSpecName: "config") pod "79abb676-0322-4a75-90f5-743c942073b4" (UID: "79abb676-0322-4a75-90f5-743c942073b4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 13:39:30 crc kubenswrapper[4764]: I0309 13:39:30.595360 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/79abb676-0322-4a75-90f5-743c942073b4-kube-api-access-wzz2f" (OuterVolumeSpecName: "kube-api-access-wzz2f") pod "79abb676-0322-4a75-90f5-743c942073b4" (UID: "79abb676-0322-4a75-90f5-743c942073b4"). InnerVolumeSpecName "kube-api-access-wzz2f". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 13:39:30 crc kubenswrapper[4764]: I0309 13:39:30.595413 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6bf8d6e3-823d-4f28-b5dd-e4df591df8bd-kube-api-access-8cbdt" (OuterVolumeSpecName: "kube-api-access-8cbdt") pod "6bf8d6e3-823d-4f28-b5dd-e4df591df8bd" (UID: "6bf8d6e3-823d-4f28-b5dd-e4df591df8bd"). InnerVolumeSpecName "kube-api-access-8cbdt". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 13:39:30 crc kubenswrapper[4764]: I0309 13:39:30.692456 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8cbdt\" (UniqueName: \"kubernetes.io/projected/6bf8d6e3-823d-4f28-b5dd-e4df591df8bd-kube-api-access-8cbdt\") on node \"crc\" DevicePath \"\""
Mar 09 13:39:30 crc kubenswrapper[4764]: I0309 13:39:30.692498 4764 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/79abb676-0322-4a75-90f5-743c942073b4-config\") on node \"crc\" DevicePath \"\""
Mar 09 13:39:30 crc kubenswrapper[4764]: I0309 13:39:30.692514 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wzz2f\" (UniqueName: \"kubernetes.io/projected/79abb676-0322-4a75-90f5-743c942073b4-kube-api-access-wzz2f\") on node \"crc\" DevicePath \"\""
Mar 09 13:39:30 crc kubenswrapper[4764]: I0309 13:39:30.957616 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"519ac270-ea24-47c1-b4f3-d94b0add96d1","Type":"ContainerStarted","Data":"8b96065c71ce956d7ca9e3f9b0dc1039ec60f27e5692b9eabf4e01ffb71c54e6"}
Mar 09 13:39:30 crc kubenswrapper[4764]: I0309 13:39:30.962451 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-ggxj5" event={"ID":"79abb676-0322-4a75-90f5-743c942073b4","Type":"ContainerDied","Data":"689139240ebd2f6bf1f73c798b5e7965a7b610107fbcfeb6b865f8c4180b122b"}
Mar 09 13:39:30 crc kubenswrapper[4764]: I0309 13:39:30.962463 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-ggxj5"
Mar 09 13:39:30 crc kubenswrapper[4764]: I0309 13:39:30.965189 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"103cd40b-aa84-4973-8e47-8a67e5994c80","Type":"ContainerStarted","Data":"d4896fdeaab9e0ab6795bf0feabe6e710312aa60d40358c1313f6f61fd1384b8"}
Mar 09 13:39:30 crc kubenswrapper[4764]: I0309 13:39:30.968134 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"11c8bf9f-a031-4e56-b1d7-49b407eabaf7","Type":"ContainerStarted","Data":"b97c921b54e1f12956d845171f6d90fe64a80d32c024a23960cca4b47667dc15"}
Mar 09 13:39:30 crc kubenswrapper[4764]: I0309 13:39:30.970797 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-fcc4m"
Mar 09 13:39:30 crc kubenswrapper[4764]: I0309 13:39:30.970797 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-fcc4m" event={"ID":"6bf8d6e3-823d-4f28-b5dd-e4df591df8bd","Type":"ContainerDied","Data":"790e722a1a82b199681b86b0375d2d82ee82392b6ce4b20994ef141a06eb2c50"}
Mar 09 13:39:30 crc kubenswrapper[4764]: I0309 13:39:30.973432 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"ce660994-4427-4d54-b83c-9c9ec7f64a9d","Type":"ContainerStarted","Data":"b8a8570b31f5838a5c44b32bc16eef1004cf79cfc6a6ee8f31255abe6b100221"}
Mar 09 13:39:30 crc kubenswrapper[4764]: I0309 13:39:30.983118 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"047aa387-9e35-4ec6-89a9-3be60e47610b","Type":"ContainerStarted","Data":"73534d7403f4d4b8a3d758d05ae643c12076d7d324d41d9ae662fa58e1d331bb"}
Mar 09 13:39:30 crc kubenswrapper[4764]: I0309 13:39:30.985608 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-2zkzm" event={"ID":"05f9485e-b683-481d-87d3-fb86ebb4a832","Type":"ContainerStarted","Data":"0c9065ceb2c5b59137c3a5f22de500855f80722abaf07ce360f91a72b093f0aa"}
Mar 09 13:39:30 crc kubenswrapper[4764]: I0309 13:39:30.986672 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"0c87ed75-4285-4084-bce3-ee8dba7671c0","Type":"ContainerStarted","Data":"cd15c32a15b75663d44df8d62c33aca309dbb77a08f8075368a95588ab9bf712"}
Mar 09 13:39:30 crc kubenswrapper[4764]: I0309 13:39:30.991613 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"507bcef1-e9ef-4eb1-85ce-358209b944bc","Type":"ContainerStarted","Data":"ee9809e2cf751402688e9f6828a75759ba83ac17c29d13b65aa1aa2a2afdc207"}
Mar 09 13:39:30 crc kubenswrapper[4764]: I0309 13:39:30.992894 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-qm7vs" event={"ID":"9bbe03cf-76d5-440a-903f-50c382aa3a4e","Type":"ContainerStarted","Data":"5868a99ec8e1266049239845f3b47de2af15a5d3b1e44885fe764fb14f411078"}
Mar 09 13:39:31 crc kubenswrapper[4764]: I0309 13:39:31.037120 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-ggxj5"]
Mar 09 13:39:31 crc kubenswrapper[4764]: I0309 13:39:31.054095 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-ggxj5"]
Mar 09 13:39:31 crc kubenswrapper[4764]: I0309 13:39:31.079041 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-fcc4m"]
Mar 09 13:39:31 crc kubenswrapper[4764]: I0309 13:39:31.086148 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-fcc4m"]
Mar 09 13:39:31 crc kubenswrapper[4764]: I0309 13:39:31.585489 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6bf8d6e3-823d-4f28-b5dd-e4df591df8bd" path="/var/lib/kubelet/pods/6bf8d6e3-823d-4f28-b5dd-e4df591df8bd/volumes"
Mar 09 13:39:31 crc kubenswrapper[4764]: I0309 13:39:31.586317 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="79abb676-0322-4a75-90f5-743c942073b4" path="/var/lib/kubelet/pods/79abb676-0322-4a75-90f5-743c942073b4/volumes"
Mar 09 13:39:31 crc kubenswrapper[4764]: I0309 13:39:31.605353 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"]
Mar 09 13:39:33 crc kubenswrapper[4764]: W0309 13:39:33.812148 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode54bd06b_1ee2_452d_80fb_12fd4fb61c7b.slice/crio-587d898cabcd704974d0b76f839425d05f6d9c4e36b24c70baeee270aea0c8dc WatchSource:0}: Error finding container 587d898cabcd704974d0b76f839425d05f6d9c4e36b24c70baeee270aea0c8dc: Status 404 returned error can't find the container with id 587d898cabcd704974d0b76f839425d05f6d9c4e36b24c70baeee270aea0c8dc
Mar 09 13:39:34 crc kubenswrapper[4764]: I0309 13:39:34.018945 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"e54bd06b-1ee2-452d-80fb-12fd4fb61c7b","Type":"ContainerStarted","Data":"587d898cabcd704974d0b76f839425d05f6d9c4e36b24c70baeee270aea0c8dc"}
Mar 09 13:39:39 crc kubenswrapper[4764]: I0309 13:39:39.075741 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"519ac270-ea24-47c1-b4f3-d94b0add96d1","Type":"ContainerStarted","Data":"857b2ed5ce566d3011852236c58b722132628d1f16a978cb03c1aa03305707b9"}
Mar 09 13:39:39 crc kubenswrapper[4764]: I0309 13:39:39.076488 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0"
Mar 09 13:39:39 crc kubenswrapper[4764]: I0309 13:39:39.077206 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"0c87ed75-4285-4084-bce3-ee8dba7671c0","Type":"ContainerStarted","Data":"ee6313eb1e523335aac7fc5b10eeb2f3e2a30c9ec3b48d04c221cffc105acf19"}
Mar 09 13:39:39 crc kubenswrapper[4764]: I0309 13:39:39.117440 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=14.756977368 podStartE2EDuration="22.117418617s" podCreationTimestamp="2026-03-09 13:39:17 +0000 UTC" firstStartedPulling="2026-03-09 13:39:30.286180492 +0000 UTC m=+1125.536352400" lastFinishedPulling="2026-03-09 13:39:37.646621741 +0000 UTC m=+1132.896793649" observedRunningTime="2026-03-09 13:39:39.100064455 +0000 UTC m=+1134.350236363" watchObservedRunningTime="2026-03-09 13:39:39.117418617 +0000 UTC m=+1134.367590535"
Mar 09 13:39:40 crc kubenswrapper[4764]: I0309 13:39:40.095585 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-qm7vs" event={"ID":"9bbe03cf-76d5-440a-903f-50c382aa3a4e","Type":"ContainerStarted","Data":"7780fec5f70c7a0c4ccec4952ccb60908acc7914c6f1cd488d5262ad3acf8061"}
Mar 09 13:39:40 crc kubenswrapper[4764]: I0309 13:39:40.096203 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-qm7vs"
Mar 09 13:39:40 crc kubenswrapper[4764]: I0309 13:39:40.099826 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"ce660994-4427-4d54-b83c-9c9ec7f64a9d","Type":"ContainerStarted","Data":"9847def4972a082de5de5ce66af1aea1d5c2f1147d6cb0e73ffadb728f7879a5"}
Mar 09 13:39:40 crc kubenswrapper[4764]: I0309 13:39:40.099978 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0"
Mar 09 13:39:40 crc kubenswrapper[4764]: I0309 13:39:40.104607 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"103cd40b-aa84-4973-8e47-8a67e5994c80","Type":"ContainerStarted","Data":"de2693cfb592190f100e10043f32ff98b97a17851959b75e973a39d89f3be6c8"}
Mar 09 13:39:40 crc kubenswrapper[4764]: I0309 13:39:40.108631 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"047aa387-9e35-4ec6-89a9-3be60e47610b","Type":"ContainerStarted","Data":"b40ecda29016254d1a36a05d5e6839eebe91e78d5ca273840367053a0ddf1f53"}
Mar 09 13:39:40 crc kubenswrapper[4764]: I0309 13:39:40.113712 4764 generic.go:334] "Generic (PLEG): container finished" podID="05f9485e-b683-481d-87d3-fb86ebb4a832" containerID="2b0a7ff734ca1505391a107d091f7de45231761d7a49f25ccee267462bd84773" exitCode=0
Mar 09 13:39:40 crc kubenswrapper[4764]: I0309 13:39:40.115091 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-2zkzm" event={"ID":"05f9485e-b683-481d-87d3-fb86ebb4a832","Type":"ContainerDied","Data":"2b0a7ff734ca1505391a107d091f7de45231761d7a49f25ccee267462bd84773"}
Mar 09 13:39:40 crc kubenswrapper[4764]: I0309 13:39:40.117994 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"e54bd06b-1ee2-452d-80fb-12fd4fb61c7b","Type":"ContainerStarted","Data":"e3787b08ede6c8e8ef08219c9e0ba6cf1a7c8bd5c292aead18b3f917f6e8d996"}
Mar 09 13:39:40 crc kubenswrapper[4764]: I0309 13:39:40.118698 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-qm7vs" podStartSLOduration=9.481128873 podStartE2EDuration="17.118676299s" podCreationTimestamp="2026-03-09 13:39:23 +0000 UTC" firstStartedPulling="2026-03-09 13:39:30.491988179 +0000 UTC m=+1125.742160087" lastFinishedPulling="2026-03-09 13:39:38.129535605 +0000 UTC m=+1133.379707513" observedRunningTime="2026-03-09 13:39:40.116353798 +0000 UTC m=+1135.366525706" watchObservedRunningTime="2026-03-09 13:39:40.118676299 +0000 UTC m=+1135.368848207"
Mar 09 13:39:40 crc kubenswrapper[4764]: I0309 13:39:40.190598 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=13.088438162 podStartE2EDuration="21.190574274s" podCreationTimestamp="2026-03-09 13:39:19 +0000 UTC" firstStartedPulling="2026-03-09 13:39:30.510095301 +0000 UTC m=+1125.760267209" lastFinishedPulling="2026-03-09 13:39:38.612231413 +0000 UTC m=+1133.862403321" observedRunningTime="2026-03-09 13:39:40.183108199 +0000 UTC m=+1135.433280117" watchObservedRunningTime="2026-03-09 13:39:40.190574274 +0000 UTC m=+1135.440746182"
Mar 09 13:39:41 crc kubenswrapper[4764]: I0309 13:39:41.131851 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"507bcef1-e9ef-4eb1-85ce-358209b944bc","Type":"ContainerStarted","Data":"f2c108c98ecfd46a140cf2428c8bd7a9e4bb88eab84b2da0f3783d087eb2e601"}
Mar 09 13:39:41 crc kubenswrapper[4764]: I0309 13:39:41.143302 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"11c8bf9f-a031-4e56-b1d7-49b407eabaf7","Type":"ContainerStarted","Data":"fd0a79b758702c401b2bbc87884a5f0ad37053c07ad4da0c2c69610d9e0509c2"}
Mar 09 13:39:41 crc kubenswrapper[4764]: I0309 13:39:41.146758 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-2zkzm" event={"ID":"05f9485e-b683-481d-87d3-fb86ebb4a832","Type":"ContainerStarted","Data":"6d06d27d33c7238bf1afccbff4afe123a96246645b51d5bfa4856ce504137597"}
Mar 09 13:39:41 crc kubenswrapper[4764]: I0309 13:39:41.146787 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-2zkzm" event={"ID":"05f9485e-b683-481d-87d3-fb86ebb4a832","Type":"ContainerStarted","Data":"2c3a94d19625a148abe11e06ba10b6b5dc33c7a969c78936633b8e6bacf89e8a"}
Mar 09 13:39:41 crc kubenswrapper[4764]: I0309 13:39:41.222245 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-2zkzm" podStartSLOduration=11.103769899 podStartE2EDuration="18.222216267s" podCreationTimestamp="2026-03-09 13:39:23 +0000 UTC" firstStartedPulling="2026-03-09 13:39:30.530427762 +0000 UTC m=+1125.780599670" lastFinishedPulling="2026-03-09 13:39:37.64887413 +0000 UTC m=+1132.899046038" observedRunningTime="2026-03-09 13:39:41.218752546 +0000 UTC m=+1136.468924464" watchObservedRunningTime="2026-03-09 13:39:41.222216267 +0000 UTC m=+1136.472388195"
Mar 09 13:39:42 crc kubenswrapper[4764]: I0309 13:39:42.160733 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-2zkzm"
Mar 09 13:39:42 crc kubenswrapper[4764]: I0309 13:39:42.161092 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-2zkzm"
Mar 09 13:39:44 crc kubenswrapper[4764]: I0309 13:39:44.184992 4764 generic.go:334] "Generic (PLEG): container finished" podID="103cd40b-aa84-4973-8e47-8a67e5994c80" containerID="de2693cfb592190f100e10043f32ff98b97a17851959b75e973a39d89f3be6c8" exitCode=0
Mar 09 13:39:44 crc kubenswrapper[4764]: I0309 13:39:44.185096 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"103cd40b-aa84-4973-8e47-8a67e5994c80","Type":"ContainerDied","Data":"de2693cfb592190f100e10043f32ff98b97a17851959b75e973a39d89f3be6c8"}
Mar 09 13:39:44 crc kubenswrapper[4764]: I0309 13:39:44.187694 4764 generic.go:334] "Generic (PLEG): container finished" podID="0c87ed75-4285-4084-bce3-ee8dba7671c0" containerID="ee6313eb1e523335aac7fc5b10eeb2f3e2a30c9ec3b48d04c221cffc105acf19" exitCode=0
Mar 09 13:39:44 crc kubenswrapper[4764]: I0309 13:39:44.187727 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"0c87ed75-4285-4084-bce3-ee8dba7671c0","Type":"ContainerDied","Data":"ee6313eb1e523335aac7fc5b10eeb2f3e2a30c9ec3b48d04c221cffc105acf19"}
Mar 09 13:39:46 crc kubenswrapper[4764]: I0309 13:39:46.693634 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-8ctgr"]
Mar 09 13:39:46 crc kubenswrapper[4764]: I0309 13:39:46.695423 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-8ctgr"
Mar 09 13:39:46 crc kubenswrapper[4764]: I0309 13:39:46.697826 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config"
Mar 09 13:39:46 crc kubenswrapper[4764]: I0309 13:39:46.708462 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-8ctgr"]
Mar 09 13:39:46 crc kubenswrapper[4764]: I0309 13:39:46.832352 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4db14a6b-d372-48be-86a1-bf651618b4a4-combined-ca-bundle\") pod \"ovn-controller-metrics-8ctgr\" (UID: \"4db14a6b-d372-48be-86a1-bf651618b4a4\") " pod="openstack/ovn-controller-metrics-8ctgr"
Mar 09 13:39:46 crc kubenswrapper[4764]: I0309 13:39:46.832426 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/4db14a6b-d372-48be-86a1-bf651618b4a4-ovs-rundir\") pod \"ovn-controller-metrics-8ctgr\" (UID: \"4db14a6b-d372-48be-86a1-bf651618b4a4\") " pod="openstack/ovn-controller-metrics-8ctgr"
Mar 09 13:39:46 crc kubenswrapper[4764]: I0309 13:39:46.832685 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4db14a6b-d372-48be-86a1-bf651618b4a4-config\") pod \"ovn-controller-metrics-8ctgr\" (UID: \"4db14a6b-d372-48be-86a1-bf651618b4a4\") " pod="openstack/ovn-controller-metrics-8ctgr"
Mar 09 13:39:46 crc kubenswrapper[4764]: I0309 13:39:46.832948 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/4db14a6b-d372-48be-86a1-bf651618b4a4-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-8ctgr\" (UID: \"4db14a6b-d372-48be-86a1-bf651618b4a4\") " pod="openstack/ovn-controller-metrics-8ctgr"
Mar 09 13:39:46 crc kubenswrapper[4764]: I0309 13:39:46.833159 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/4db14a6b-d372-48be-86a1-bf651618b4a4-ovn-rundir\") pod \"ovn-controller-metrics-8ctgr\" (UID: \"4db14a6b-d372-48be-86a1-bf651618b4a4\") " pod="openstack/ovn-controller-metrics-8ctgr"
Mar 09 13:39:46 crc kubenswrapper[4764]: I0309 13:39:46.833428 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vgb6g\" (UniqueName: \"kubernetes.io/projected/4db14a6b-d372-48be-86a1-bf651618b4a4-kube-api-access-vgb6g\") pod \"ovn-controller-metrics-8ctgr\" (UID: \"4db14a6b-d372-48be-86a1-bf651618b4a4\") " pod="openstack/ovn-controller-metrics-8ctgr"
Mar 09 13:39:46 crc kubenswrapper[4764]: I0309 13:39:46.887150 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-nc2v8"]
Mar 09 13:39:46 crc kubenswrapper[4764]: I0309 13:39:46.929276 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5bf47b49b7-fmxtl"]
Mar 09 13:39:46 crc kubenswrapper[4764]: I0309 13:39:46.931327 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5bf47b49b7-fmxtl"
Mar 09 13:39:46 crc kubenswrapper[4764]: I0309 13:39:46.935741 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4db14a6b-d372-48be-86a1-bf651618b4a4-combined-ca-bundle\") pod \"ovn-controller-metrics-8ctgr\" (UID: \"4db14a6b-d372-48be-86a1-bf651618b4a4\") " pod="openstack/ovn-controller-metrics-8ctgr"
Mar 09 13:39:46 crc kubenswrapper[4764]: I0309 13:39:46.935811 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/4db14a6b-d372-48be-86a1-bf651618b4a4-ovs-rundir\") pod \"ovn-controller-metrics-8ctgr\" (UID: \"4db14a6b-d372-48be-86a1-bf651618b4a4\") " pod="openstack/ovn-controller-metrics-8ctgr"
Mar 09 13:39:46 crc kubenswrapper[4764]: I0309 13:39:46.935849 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4db14a6b-d372-48be-86a1-bf651618b4a4-config\") pod \"ovn-controller-metrics-8ctgr\" (UID: \"4db14a6b-d372-48be-86a1-bf651618b4a4\") " pod="openstack/ovn-controller-metrics-8ctgr"
Mar 09 13:39:46 crc kubenswrapper[4764]: I0309 13:39:46.935881 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/4db14a6b-d372-48be-86a1-bf651618b4a4-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-8ctgr\" (UID: \"4db14a6b-d372-48be-86a1-bf651618b4a4\") " pod="openstack/ovn-controller-metrics-8ctgr"
Mar 09 13:39:46 crc kubenswrapper[4764]: I0309 13:39:46.935917 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/4db14a6b-d372-48be-86a1-bf651618b4a4-ovn-rundir\") pod \"ovn-controller-metrics-8ctgr\" (UID: \"4db14a6b-d372-48be-86a1-bf651618b4a4\") " pod="openstack/ovn-controller-metrics-8ctgr"
Mar 09 13:39:46 crc kubenswrapper[4764]: I0309 13:39:46.935972 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vgb6g\" (UniqueName: \"kubernetes.io/projected/4db14a6b-d372-48be-86a1-bf651618b4a4-kube-api-access-vgb6g\") pod \"ovn-controller-metrics-8ctgr\" (UID: \"4db14a6b-d372-48be-86a1-bf651618b4a4\") " pod="openstack/ovn-controller-metrics-8ctgr"
Mar 09 13:39:46 crc kubenswrapper[4764]: I0309 13:39:46.941730 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb"
Mar 09 13:39:46 crc kubenswrapper[4764]: I0309 13:39:46.942570 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/4db14a6b-d372-48be-86a1-bf651618b4a4-ovs-rundir\") pod \"ovn-controller-metrics-8ctgr\" (UID: \"4db14a6b-d372-48be-86a1-bf651618b4a4\") " pod="openstack/ovn-controller-metrics-8ctgr"
Mar 09 13:39:46 crc kubenswrapper[4764]: I0309 13:39:46.943008 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4db14a6b-d372-48be-86a1-bf651618b4a4-config\") pod \"ovn-controller-metrics-8ctgr\" (UID: \"4db14a6b-d372-48be-86a1-bf651618b4a4\") " pod="openstack/ovn-controller-metrics-8ctgr"
Mar 09 13:39:46 crc kubenswrapper[4764]: I0309 13:39:46.943135 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/4db14a6b-d372-48be-86a1-bf651618b4a4-ovn-rundir\") pod \"ovn-controller-metrics-8ctgr\" (UID: \"4db14a6b-d372-48be-86a1-bf651618b4a4\") " pod="openstack/ovn-controller-metrics-8ctgr"
Mar 09 13:39:46 crc kubenswrapper[4764]: I0309 13:39:46.945900 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4db14a6b-d372-48be-86a1-bf651618b4a4-combined-ca-bundle\") pod \"ovn-controller-metrics-8ctgr\" (UID: \"4db14a6b-d372-48be-86a1-bf651618b4a4\") " pod="openstack/ovn-controller-metrics-8ctgr"
Mar 09 13:39:46 crc kubenswrapper[4764]: I0309 13:39:46.946593 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5bf47b49b7-fmxtl"]
Mar 09 13:39:46 crc kubenswrapper[4764]: I0309 13:39:46.948403 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/4db14a6b-d372-48be-86a1-bf651618b4a4-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-8ctgr\" (UID: \"4db14a6b-d372-48be-86a1-bf651618b4a4\") " pod="openstack/ovn-controller-metrics-8ctgr"
Mar 09 13:39:46 crc kubenswrapper[4764]: I0309 13:39:46.968337 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vgb6g\" (UniqueName: \"kubernetes.io/projected/4db14a6b-d372-48be-86a1-bf651618b4a4-kube-api-access-vgb6g\") pod \"ovn-controller-metrics-8ctgr\" (UID: \"4db14a6b-d372-48be-86a1-bf651618b4a4\") " pod="openstack/ovn-controller-metrics-8ctgr"
Mar 09 13:39:47 crc kubenswrapper[4764]: I0309 13:39:47.014363 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-8ctgr"
Mar 09 13:39:47 crc kubenswrapper[4764]: I0309 13:39:47.038370 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2bb93a9a-6443-4352-b7ae-64f953af06c3-dns-svc\") pod \"dnsmasq-dns-5bf47b49b7-fmxtl\" (UID: \"2bb93a9a-6443-4352-b7ae-64f953af06c3\") " pod="openstack/dnsmasq-dns-5bf47b49b7-fmxtl"
Mar 09 13:39:47 crc kubenswrapper[4764]: I0309 13:39:47.038467 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2bb93a9a-6443-4352-b7ae-64f953af06c3-ovsdbserver-nb\") pod \"dnsmasq-dns-5bf47b49b7-fmxtl\" (UID: \"2bb93a9a-6443-4352-b7ae-64f953af06c3\") " pod="openstack/dnsmasq-dns-5bf47b49b7-fmxtl"
Mar 09 13:39:47 crc kubenswrapper[4764]: I0309 13:39:47.038626 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2bb93a9a-6443-4352-b7ae-64f953af06c3-config\") pod \"dnsmasq-dns-5bf47b49b7-fmxtl\" (UID: \"2bb93a9a-6443-4352-b7ae-64f953af06c3\") " pod="openstack/dnsmasq-dns-5bf47b49b7-fmxtl"
Mar 09 13:39:47 crc kubenswrapper[4764]: I0309 13:39:47.038812 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kl6fl\" (UniqueName: \"kubernetes.io/projected/2bb93a9a-6443-4352-b7ae-64f953af06c3-kube-api-access-kl6fl\") pod \"dnsmasq-dns-5bf47b49b7-fmxtl\" (UID: \"2bb93a9a-6443-4352-b7ae-64f953af06c3\") " pod="openstack/dnsmasq-dns-5bf47b49b7-fmxtl"
Mar 09 13:39:47 crc kubenswrapper[4764]: I0309 13:39:47.086584 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-cwgfl"]
Mar 09 13:39:47 crc kubenswrapper[4764]: I0309 13:39:47.114195 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-8554648995-p289h"]
Mar 09 13:39:47 crc kubenswrapper[4764]: I0309 13:39:47.115662 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8554648995-p289h"
Mar 09 13:39:47 crc kubenswrapper[4764]: I0309 13:39:47.117967 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb"
Mar 09 13:39:47 crc kubenswrapper[4764]: I0309 13:39:47.128622 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8554648995-p289h"]
Mar 09 13:39:47 crc kubenswrapper[4764]: I0309 13:39:47.141229 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2bb93a9a-6443-4352-b7ae-64f953af06c3-config\") pod \"dnsmasq-dns-5bf47b49b7-fmxtl\" (UID: \"2bb93a9a-6443-4352-b7ae-64f953af06c3\") " pod="openstack/dnsmasq-dns-5bf47b49b7-fmxtl"
Mar 09 13:39:47 crc kubenswrapper[4764]: I0309 13:39:47.141356 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kl6fl\" (UniqueName: \"kubernetes.io/projected/2bb93a9a-6443-4352-b7ae-64f953af06c3-kube-api-access-kl6fl\") pod \"dnsmasq-dns-5bf47b49b7-fmxtl\" (UID: \"2bb93a9a-6443-4352-b7ae-64f953af06c3\") " pod="openstack/dnsmasq-dns-5bf47b49b7-fmxtl"
Mar 09 13:39:47 crc kubenswrapper[4764]: I0309 13:39:47.141406 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2bb93a9a-6443-4352-b7ae-64f953af06c3-dns-svc\") pod \"dnsmasq-dns-5bf47b49b7-fmxtl\" (UID: \"2bb93a9a-6443-4352-b7ae-64f953af06c3\") " pod="openstack/dnsmasq-dns-5bf47b49b7-fmxtl"
Mar 09 13:39:47 crc kubenswrapper[4764]: I0309 13:39:47.141433 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2bb93a9a-6443-4352-b7ae-64f953af06c3-ovsdbserver-nb\") pod \"dnsmasq-dns-5bf47b49b7-fmxtl\" (UID: \"2bb93a9a-6443-4352-b7ae-64f953af06c3\") " pod="openstack/dnsmasq-dns-5bf47b49b7-fmxtl"
Mar 09 13:39:47 crc kubenswrapper[4764]: I0309 13:39:47.142247 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2bb93a9a-6443-4352-b7ae-64f953af06c3-config\") pod \"dnsmasq-dns-5bf47b49b7-fmxtl\" (UID: \"2bb93a9a-6443-4352-b7ae-64f953af06c3\") " pod="openstack/dnsmasq-dns-5bf47b49b7-fmxtl"
Mar 09 13:39:47 crc kubenswrapper[4764]: I0309 13:39:47.142358 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2bb93a9a-6443-4352-b7ae-64f953af06c3-ovsdbserver-nb\") pod \"dnsmasq-dns-5bf47b49b7-fmxtl\" (UID: \"2bb93a9a-6443-4352-b7ae-64f953af06c3\") " pod="openstack/dnsmasq-dns-5bf47b49b7-fmxtl"
Mar 09 13:39:47 crc kubenswrapper[4764]: I0309 13:39:47.142687 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2bb93a9a-6443-4352-b7ae-64f953af06c3-dns-svc\") pod \"dnsmasq-dns-5bf47b49b7-fmxtl\" (UID: \"2bb93a9a-6443-4352-b7ae-64f953af06c3\") " pod="openstack/dnsmasq-dns-5bf47b49b7-fmxtl"
Mar 09 13:39:47 crc kubenswrapper[4764]: I0309 13:39:47.162628 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kl6fl\" (UniqueName: \"kubernetes.io/projected/2bb93a9a-6443-4352-b7ae-64f953af06c3-kube-api-access-kl6fl\") pod \"dnsmasq-dns-5bf47b49b7-fmxtl\" (UID: \"2bb93a9a-6443-4352-b7ae-64f953af06c3\") " pod="openstack/dnsmasq-dns-5bf47b49b7-fmxtl"
Mar 09 13:39:47 crc kubenswrapper[4764]: I0309 13:39:47.243451 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e921061b-2a0f-4b22-beb1-0d52993dc06b-ovsdbserver-nb\") pod \"dnsmasq-dns-8554648995-p289h\" (UID: \"e921061b-2a0f-4b22-beb1-0d52993dc06b\") "
pod="openstack/dnsmasq-dns-8554648995-p289h" Mar 09 13:39:47 crc kubenswrapper[4764]: I0309 13:39:47.243552 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e921061b-2a0f-4b22-beb1-0d52993dc06b-dns-svc\") pod \"dnsmasq-dns-8554648995-p289h\" (UID: \"e921061b-2a0f-4b22-beb1-0d52993dc06b\") " pod="openstack/dnsmasq-dns-8554648995-p289h" Mar 09 13:39:47 crc kubenswrapper[4764]: I0309 13:39:47.243577 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e921061b-2a0f-4b22-beb1-0d52993dc06b-config\") pod \"dnsmasq-dns-8554648995-p289h\" (UID: \"e921061b-2a0f-4b22-beb1-0d52993dc06b\") " pod="openstack/dnsmasq-dns-8554648995-p289h" Mar 09 13:39:47 crc kubenswrapper[4764]: I0309 13:39:47.243662 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e921061b-2a0f-4b22-beb1-0d52993dc06b-ovsdbserver-sb\") pod \"dnsmasq-dns-8554648995-p289h\" (UID: \"e921061b-2a0f-4b22-beb1-0d52993dc06b\") " pod="openstack/dnsmasq-dns-8554648995-p289h" Mar 09 13:39:47 crc kubenswrapper[4764]: I0309 13:39:47.243700 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xkvzb\" (UniqueName: \"kubernetes.io/projected/e921061b-2a0f-4b22-beb1-0d52993dc06b-kube-api-access-xkvzb\") pod \"dnsmasq-dns-8554648995-p289h\" (UID: \"e921061b-2a0f-4b22-beb1-0d52993dc06b\") " pod="openstack/dnsmasq-dns-8554648995-p289h" Mar 09 13:39:47 crc kubenswrapper[4764]: I0309 13:39:47.325873 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5bf47b49b7-fmxtl" Mar 09 13:39:47 crc kubenswrapper[4764]: I0309 13:39:47.345482 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e921061b-2a0f-4b22-beb1-0d52993dc06b-ovsdbserver-nb\") pod \"dnsmasq-dns-8554648995-p289h\" (UID: \"e921061b-2a0f-4b22-beb1-0d52993dc06b\") " pod="openstack/dnsmasq-dns-8554648995-p289h" Mar 09 13:39:47 crc kubenswrapper[4764]: I0309 13:39:47.345568 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e921061b-2a0f-4b22-beb1-0d52993dc06b-dns-svc\") pod \"dnsmasq-dns-8554648995-p289h\" (UID: \"e921061b-2a0f-4b22-beb1-0d52993dc06b\") " pod="openstack/dnsmasq-dns-8554648995-p289h" Mar 09 13:39:47 crc kubenswrapper[4764]: I0309 13:39:47.345590 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e921061b-2a0f-4b22-beb1-0d52993dc06b-config\") pod \"dnsmasq-dns-8554648995-p289h\" (UID: \"e921061b-2a0f-4b22-beb1-0d52993dc06b\") " pod="openstack/dnsmasq-dns-8554648995-p289h" Mar 09 13:39:47 crc kubenswrapper[4764]: I0309 13:39:47.345635 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e921061b-2a0f-4b22-beb1-0d52993dc06b-ovsdbserver-sb\") pod \"dnsmasq-dns-8554648995-p289h\" (UID: \"e921061b-2a0f-4b22-beb1-0d52993dc06b\") " pod="openstack/dnsmasq-dns-8554648995-p289h" Mar 09 13:39:47 crc kubenswrapper[4764]: I0309 13:39:47.345680 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xkvzb\" (UniqueName: \"kubernetes.io/projected/e921061b-2a0f-4b22-beb1-0d52993dc06b-kube-api-access-xkvzb\") pod \"dnsmasq-dns-8554648995-p289h\" (UID: \"e921061b-2a0f-4b22-beb1-0d52993dc06b\") " pod="openstack/dnsmasq-dns-8554648995-p289h" 
Mar 09 13:39:47 crc kubenswrapper[4764]: I0309 13:39:47.346589 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e921061b-2a0f-4b22-beb1-0d52993dc06b-ovsdbserver-nb\") pod \"dnsmasq-dns-8554648995-p289h\" (UID: \"e921061b-2a0f-4b22-beb1-0d52993dc06b\") " pod="openstack/dnsmasq-dns-8554648995-p289h" Mar 09 13:39:47 crc kubenswrapper[4764]: I0309 13:39:47.346792 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e921061b-2a0f-4b22-beb1-0d52993dc06b-ovsdbserver-sb\") pod \"dnsmasq-dns-8554648995-p289h\" (UID: \"e921061b-2a0f-4b22-beb1-0d52993dc06b\") " pod="openstack/dnsmasq-dns-8554648995-p289h" Mar 09 13:39:47 crc kubenswrapper[4764]: I0309 13:39:47.346792 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e921061b-2a0f-4b22-beb1-0d52993dc06b-dns-svc\") pod \"dnsmasq-dns-8554648995-p289h\" (UID: \"e921061b-2a0f-4b22-beb1-0d52993dc06b\") " pod="openstack/dnsmasq-dns-8554648995-p289h" Mar 09 13:39:47 crc kubenswrapper[4764]: I0309 13:39:47.346897 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e921061b-2a0f-4b22-beb1-0d52993dc06b-config\") pod \"dnsmasq-dns-8554648995-p289h\" (UID: \"e921061b-2a0f-4b22-beb1-0d52993dc06b\") " pod="openstack/dnsmasq-dns-8554648995-p289h" Mar 09 13:39:47 crc kubenswrapper[4764]: I0309 13:39:47.364846 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xkvzb\" (UniqueName: \"kubernetes.io/projected/e921061b-2a0f-4b22-beb1-0d52993dc06b-kube-api-access-xkvzb\") pod \"dnsmasq-dns-8554648995-p289h\" (UID: \"e921061b-2a0f-4b22-beb1-0d52993dc06b\") " pod="openstack/dnsmasq-dns-8554648995-p289h" Mar 09 13:39:47 crc kubenswrapper[4764]: I0309 13:39:47.434296 4764 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openstack/dnsmasq-dns-8554648995-p289h" Mar 09 13:39:47 crc kubenswrapper[4764]: I0309 13:39:47.753089 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0" Mar 09 13:39:48 crc kubenswrapper[4764]: I0309 13:39:48.861316 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5bf47b49b7-fmxtl"] Mar 09 13:39:48 crc kubenswrapper[4764]: W0309 13:39:48.908365 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2bb93a9a_6443_4352_b7ae_64f953af06c3.slice/crio-28b675926fb711438a050500ec110912822d9d8bb336d51faa38b7f7dfaac442 WatchSource:0}: Error finding container 28b675926fb711438a050500ec110912822d9d8bb336d51faa38b7f7dfaac442: Status 404 returned error can't find the container with id 28b675926fb711438a050500ec110912822d9d8bb336d51faa38b7f7dfaac442 Mar 09 13:39:48 crc kubenswrapper[4764]: I0309 13:39:48.944794 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8554648995-p289h"] Mar 09 13:39:48 crc kubenswrapper[4764]: I0309 13:39:48.952018 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-8ctgr"] Mar 09 13:39:48 crc kubenswrapper[4764]: W0309 13:39:48.967795 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4db14a6b_d372_48be_86a1_bf651618b4a4.slice/crio-5f1f3b695c9092d35c13f63640cf706db0ae1c1f6b92f018ab38db40ef8ce375 WatchSource:0}: Error finding container 5f1f3b695c9092d35c13f63640cf706db0ae1c1f6b92f018ab38db40ef8ce375: Status 404 returned error can't find the container with id 5f1f3b695c9092d35c13f63640cf706db0ae1c1f6b92f018ab38db40ef8ce375 Mar 09 13:39:49 crc kubenswrapper[4764]: I0309 13:39:49.236876 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" 
event={"ID":"103cd40b-aa84-4973-8e47-8a67e5994c80","Type":"ContainerStarted","Data":"410872c79b144ea284f97f2a3d87683399483aba478ba3a14219e2f185e1ae16"} Mar 09 13:39:49 crc kubenswrapper[4764]: I0309 13:39:49.248302 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"047aa387-9e35-4ec6-89a9-3be60e47610b","Type":"ContainerStarted","Data":"e73ea01d875b730903dcad2e132535b2d49a086bb92f6db3352cfa5ee5ca2450"} Mar 09 13:39:49 crc kubenswrapper[4764]: I0309 13:39:49.250696 4764 generic.go:334] "Generic (PLEG): container finished" podID="85351658-0136-4066-b39e-808260c4dae9" containerID="ec052281dc06284d14011281ec9b60b7e02248f374cf9034b717280c98b8f271" exitCode=0 Mar 09 13:39:49 crc kubenswrapper[4764]: I0309 13:39:49.250749 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc8479f9-nc2v8" event={"ID":"85351658-0136-4066-b39e-808260c4dae9","Type":"ContainerDied","Data":"ec052281dc06284d14011281ec9b60b7e02248f374cf9034b717280c98b8f271"} Mar 09 13:39:49 crc kubenswrapper[4764]: I0309 13:39:49.258616 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"e54bd06b-1ee2-452d-80fb-12fd4fb61c7b","Type":"ContainerStarted","Data":"735fc93c908deb58f7b6ff73db8f24473e71aafae363e9eb4e59d8cba57c104f"} Mar 09 13:39:49 crc kubenswrapper[4764]: I0309 13:39:49.260567 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8554648995-p289h" event={"ID":"e921061b-2a0f-4b22-beb1-0d52993dc06b","Type":"ContainerStarted","Data":"9c01a77a060dbab4de5d1ba1f06fcd3807020da1983c9df162b0099cb08b09d0"} Mar 09 13:39:49 crc kubenswrapper[4764]: I0309 13:39:49.261710 4764 generic.go:334] "Generic (PLEG): container finished" podID="2bb93a9a-6443-4352-b7ae-64f953af06c3" containerID="b6ee8ca17d3dd32257565eee931384458b6f3e51cb8dbc64e3792bc8375f65d4" exitCode=0 Mar 09 13:39:49 crc kubenswrapper[4764]: I0309 13:39:49.261755 4764 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/dnsmasq-dns-5bf47b49b7-fmxtl" event={"ID":"2bb93a9a-6443-4352-b7ae-64f953af06c3","Type":"ContainerDied","Data":"b6ee8ca17d3dd32257565eee931384458b6f3e51cb8dbc64e3792bc8375f65d4"} Mar 09 13:39:49 crc kubenswrapper[4764]: I0309 13:39:49.261772 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5bf47b49b7-fmxtl" event={"ID":"2bb93a9a-6443-4352-b7ae-64f953af06c3","Type":"ContainerStarted","Data":"28b675926fb711438a050500ec110912822d9d8bb336d51faa38b7f7dfaac442"} Mar 09 13:39:49 crc kubenswrapper[4764]: I0309 13:39:49.266503 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"0c87ed75-4285-4084-bce3-ee8dba7671c0","Type":"ContainerStarted","Data":"13a98e60c52c501f0ea3d76503821122c0bc72169ccfcb9e1c871be7d09f9f96"} Mar 09 13:39:49 crc kubenswrapper[4764]: I0309 13:39:49.275351 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=26.419149893 podStartE2EDuration="34.27533139s" podCreationTimestamp="2026-03-09 13:39:15 +0000 UTC" firstStartedPulling="2026-03-09 13:39:30.275063922 +0000 UTC m=+1125.525235830" lastFinishedPulling="2026-03-09 13:39:38.131245399 +0000 UTC m=+1133.381417327" observedRunningTime="2026-03-09 13:39:49.26615364 +0000 UTC m=+1144.516325548" watchObservedRunningTime="2026-03-09 13:39:49.27533139 +0000 UTC m=+1144.525503298" Mar 09 13:39:49 crc kubenswrapper[4764]: I0309 13:39:49.281565 4764 generic.go:334] "Generic (PLEG): container finished" podID="04d22384-e765-4ac8-9afa-7a31f4c347b2" containerID="12c1d1328197b6dafdcdd8647a30e40939b8ce0dfd16c52057876ad12cecfc4a" exitCode=0 Mar 09 13:39:49 crc kubenswrapper[4764]: I0309 13:39:49.281623 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-cwgfl" 
event={"ID":"04d22384-e765-4ac8-9afa-7a31f4c347b2","Type":"ContainerDied","Data":"12c1d1328197b6dafdcdd8647a30e40939b8ce0dfd16c52057876ad12cecfc4a"} Mar 09 13:39:49 crc kubenswrapper[4764]: I0309 13:39:49.284478 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-8ctgr" event={"ID":"4db14a6b-d372-48be-86a1-bf651618b4a4","Type":"ContainerStarted","Data":"5f1f3b695c9092d35c13f63640cf706db0ae1c1f6b92f018ab38db40ef8ce375"} Mar 09 13:39:49 crc kubenswrapper[4764]: I0309 13:39:49.331390 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=6.358309222 podStartE2EDuration="24.331371651s" podCreationTimestamp="2026-03-09 13:39:25 +0000 UTC" firstStartedPulling="2026-03-09 13:39:30.42529054 +0000 UTC m=+1125.675462448" lastFinishedPulling="2026-03-09 13:39:48.398352979 +0000 UTC m=+1143.648524877" observedRunningTime="2026-03-09 13:39:49.314581033 +0000 UTC m=+1144.564752951" watchObservedRunningTime="2026-03-09 13:39:49.331371651 +0000 UTC m=+1144.581543559" Mar 09 13:39:49 crc kubenswrapper[4764]: I0309 13:39:49.350995 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=11.757765874 podStartE2EDuration="26.350971512s" podCreationTimestamp="2026-03-09 13:39:23 +0000 UTC" firstStartedPulling="2026-03-09 13:39:33.830109592 +0000 UTC m=+1129.080281500" lastFinishedPulling="2026-03-09 13:39:48.42331524 +0000 UTC m=+1143.673487138" observedRunningTime="2026-03-09 13:39:49.344249487 +0000 UTC m=+1144.594421395" watchObservedRunningTime="2026-03-09 13:39:49.350971512 +0000 UTC m=+1144.601143420" Mar 09 13:39:49 crc kubenswrapper[4764]: I0309 13:39:49.423492 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=27.881124361 podStartE2EDuration="35.423460053s" podCreationTimestamp="2026-03-09 13:39:14 +0000 UTC" 
firstStartedPulling="2026-03-09 13:39:30.272748322 +0000 UTC m=+1125.522920230" lastFinishedPulling="2026-03-09 13:39:37.815084014 +0000 UTC m=+1133.065255922" observedRunningTime="2026-03-09 13:39:49.41759339 +0000 UTC m=+1144.667765308" watchObservedRunningTime="2026-03-09 13:39:49.423460053 +0000 UTC m=+1144.673631961" Mar 09 13:39:49 crc kubenswrapper[4764]: I0309 13:39:49.471321 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0" Mar 09 13:39:49 crc kubenswrapper[4764]: I0309 13:39:49.734525 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5ccc8479f9-nc2v8" Mar 09 13:39:49 crc kubenswrapper[4764]: I0309 13:39:49.739465 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-cwgfl" Mar 09 13:39:49 crc kubenswrapper[4764]: I0309 13:39:49.897365 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/85351658-0136-4066-b39e-808260c4dae9-config\") pod \"85351658-0136-4066-b39e-808260c4dae9\" (UID: \"85351658-0136-4066-b39e-808260c4dae9\") " Mar 09 13:39:49 crc kubenswrapper[4764]: I0309 13:39:49.897607 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ltsrn\" (UniqueName: \"kubernetes.io/projected/85351658-0136-4066-b39e-808260c4dae9-kube-api-access-ltsrn\") pod \"85351658-0136-4066-b39e-808260c4dae9\" (UID: \"85351658-0136-4066-b39e-808260c4dae9\") " Mar 09 13:39:49 crc kubenswrapper[4764]: I0309 13:39:49.897638 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/85351658-0136-4066-b39e-808260c4dae9-dns-svc\") pod \"85351658-0136-4066-b39e-808260c4dae9\" (UID: \"85351658-0136-4066-b39e-808260c4dae9\") " Mar 09 13:39:49 crc kubenswrapper[4764]: I0309 13:39:49.897691 4764 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/04d22384-e765-4ac8-9afa-7a31f4c347b2-dns-svc\") pod \"04d22384-e765-4ac8-9afa-7a31f4c347b2\" (UID: \"04d22384-e765-4ac8-9afa-7a31f4c347b2\") " Mar 09 13:39:49 crc kubenswrapper[4764]: I0309 13:39:49.897740 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/04d22384-e765-4ac8-9afa-7a31f4c347b2-config\") pod \"04d22384-e765-4ac8-9afa-7a31f4c347b2\" (UID: \"04d22384-e765-4ac8-9afa-7a31f4c347b2\") " Mar 09 13:39:49 crc kubenswrapper[4764]: I0309 13:39:49.897770 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7dtxl\" (UniqueName: \"kubernetes.io/projected/04d22384-e765-4ac8-9afa-7a31f4c347b2-kube-api-access-7dtxl\") pod \"04d22384-e765-4ac8-9afa-7a31f4c347b2\" (UID: \"04d22384-e765-4ac8-9afa-7a31f4c347b2\") " Mar 09 13:39:49 crc kubenswrapper[4764]: I0309 13:39:49.902962 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/85351658-0136-4066-b39e-808260c4dae9-kube-api-access-ltsrn" (OuterVolumeSpecName: "kube-api-access-ltsrn") pod "85351658-0136-4066-b39e-808260c4dae9" (UID: "85351658-0136-4066-b39e-808260c4dae9"). InnerVolumeSpecName "kube-api-access-ltsrn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:39:49 crc kubenswrapper[4764]: I0309 13:39:49.903940 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/04d22384-e765-4ac8-9afa-7a31f4c347b2-kube-api-access-7dtxl" (OuterVolumeSpecName: "kube-api-access-7dtxl") pod "04d22384-e765-4ac8-9afa-7a31f4c347b2" (UID: "04d22384-e765-4ac8-9afa-7a31f4c347b2"). InnerVolumeSpecName "kube-api-access-7dtxl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:39:49 crc kubenswrapper[4764]: I0309 13:39:49.919610 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/04d22384-e765-4ac8-9afa-7a31f4c347b2-config" (OuterVolumeSpecName: "config") pod "04d22384-e765-4ac8-9afa-7a31f4c347b2" (UID: "04d22384-e765-4ac8-9afa-7a31f4c347b2"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:39:49 crc kubenswrapper[4764]: I0309 13:39:49.920832 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/85351658-0136-4066-b39e-808260c4dae9-config" (OuterVolumeSpecName: "config") pod "85351658-0136-4066-b39e-808260c4dae9" (UID: "85351658-0136-4066-b39e-808260c4dae9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:39:49 crc kubenswrapper[4764]: I0309 13:39:49.922048 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/85351658-0136-4066-b39e-808260c4dae9-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "85351658-0136-4066-b39e-808260c4dae9" (UID: "85351658-0136-4066-b39e-808260c4dae9"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:39:49 crc kubenswrapper[4764]: I0309 13:39:49.925134 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/04d22384-e765-4ac8-9afa-7a31f4c347b2-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "04d22384-e765-4ac8-9afa-7a31f4c347b2" (UID: "04d22384-e765-4ac8-9afa-7a31f4c347b2"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:39:49 crc kubenswrapper[4764]: I0309 13:39:49.999746 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ltsrn\" (UniqueName: \"kubernetes.io/projected/85351658-0136-4066-b39e-808260c4dae9-kube-api-access-ltsrn\") on node \"crc\" DevicePath \"\"" Mar 09 13:39:49 crc kubenswrapper[4764]: I0309 13:39:49.999791 4764 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/85351658-0136-4066-b39e-808260c4dae9-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 09 13:39:49 crc kubenswrapper[4764]: I0309 13:39:49.999802 4764 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/04d22384-e765-4ac8-9afa-7a31f4c347b2-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 09 13:39:49 crc kubenswrapper[4764]: I0309 13:39:49.999811 4764 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/04d22384-e765-4ac8-9afa-7a31f4c347b2-config\") on node \"crc\" DevicePath \"\"" Mar 09 13:39:49 crc kubenswrapper[4764]: I0309 13:39:49.999821 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7dtxl\" (UniqueName: \"kubernetes.io/projected/04d22384-e765-4ac8-9afa-7a31f4c347b2-kube-api-access-7dtxl\") on node \"crc\" DevicePath \"\"" Mar 09 13:39:49 crc kubenswrapper[4764]: I0309 13:39:49.999830 4764 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/85351658-0136-4066-b39e-808260c4dae9-config\") on node \"crc\" DevicePath \"\"" Mar 09 13:39:50 crc kubenswrapper[4764]: I0309 13:39:50.087027 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Mar 09 13:39:50 crc kubenswrapper[4764]: I0309 13:39:50.295990 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5bf47b49b7-fmxtl" 
event={"ID":"2bb93a9a-6443-4352-b7ae-64f953af06c3","Type":"ContainerStarted","Data":"c1e00e2d332968b73536bcf21385c897383efbfb11adcc7b64e82bf17537d564"} Mar 09 13:39:50 crc kubenswrapper[4764]: I0309 13:39:50.296460 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5bf47b49b7-fmxtl" Mar 09 13:39:50 crc kubenswrapper[4764]: I0309 13:39:50.301779 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5ccc8479f9-nc2v8" Mar 09 13:39:50 crc kubenswrapper[4764]: I0309 13:39:50.301776 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc8479f9-nc2v8" event={"ID":"85351658-0136-4066-b39e-808260c4dae9","Type":"ContainerDied","Data":"cf38f50edd1cd803b2485a43d9bb0cf84c48deb7190f61ca1abae1471ed84bf4"} Mar 09 13:39:50 crc kubenswrapper[4764]: I0309 13:39:50.301854 4764 scope.go:117] "RemoveContainer" containerID="ec052281dc06284d14011281ec9b60b7e02248f374cf9034b717280c98b8f271" Mar 09 13:39:50 crc kubenswrapper[4764]: I0309 13:39:50.304335 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-cwgfl" event={"ID":"04d22384-e765-4ac8-9afa-7a31f4c347b2","Type":"ContainerDied","Data":"e9eba66ff4604aeb6e4bde7029daea515e55ea3fe70bfdcf16702550cc3774f9"} Mar 09 13:39:50 crc kubenswrapper[4764]: I0309 13:39:50.304427 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-cwgfl" Mar 09 13:39:50 crc kubenswrapper[4764]: I0309 13:39:50.306116 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-8ctgr" event={"ID":"4db14a6b-d372-48be-86a1-bf651618b4a4","Type":"ContainerStarted","Data":"4c99c68cd386dd5f2ee4e470f12775336204f36e86a09f448217410fe83d4556"} Mar 09 13:39:50 crc kubenswrapper[4764]: I0309 13:39:50.309222 4764 generic.go:334] "Generic (PLEG): container finished" podID="e921061b-2a0f-4b22-beb1-0d52993dc06b" containerID="7c314d6ef6d0a466bb5e22ea5e084d1cbdc6ed4e3e9ab6aee110b11fa331f5fb" exitCode=0 Mar 09 13:39:50 crc kubenswrapper[4764]: I0309 13:39:50.309413 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8554648995-p289h" event={"ID":"e921061b-2a0f-4b22-beb1-0d52993dc06b","Type":"ContainerDied","Data":"7c314d6ef6d0a466bb5e22ea5e084d1cbdc6ed4e3e9ab6aee110b11fa331f5fb"} Mar 09 13:39:50 crc kubenswrapper[4764]: I0309 13:39:50.326029 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5bf47b49b7-fmxtl" podStartSLOduration=4.32600759 podStartE2EDuration="4.32600759s" podCreationTimestamp="2026-03-09 13:39:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:39:50.324709166 +0000 UTC m=+1145.574881084" watchObservedRunningTime="2026-03-09 13:39:50.32600759 +0000 UTC m=+1145.576179498" Mar 09 13:39:50 crc kubenswrapper[4764]: I0309 13:39:50.341011 4764 scope.go:117] "RemoveContainer" containerID="12c1d1328197b6dafdcdd8647a30e40939b8ce0dfd16c52057876ad12cecfc4a" Mar 09 13:39:50 crc kubenswrapper[4764]: I0309 13:39:50.492716 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-8ctgr" podStartSLOduration=4.492689267 podStartE2EDuration="4.492689267s" podCreationTimestamp="2026-03-09 13:39:46 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:39:50.380140912 +0000 UTC m=+1145.630312830" watchObservedRunningTime="2026-03-09 13:39:50.492689267 +0000 UTC m=+1145.742861175" Mar 09 13:39:50 crc kubenswrapper[4764]: I0309 13:39:50.578036 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-nc2v8"] Mar 09 13:39:50 crc kubenswrapper[4764]: I0309 13:39:50.587280 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-nc2v8"] Mar 09 13:39:50 crc kubenswrapper[4764]: I0309 13:39:50.603720 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-cwgfl"] Mar 09 13:39:50 crc kubenswrapper[4764]: I0309 13:39:50.606552 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-cwgfl"] Mar 09 13:39:50 crc kubenswrapper[4764]: I0309 13:39:50.673632 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0" Mar 09 13:39:50 crc kubenswrapper[4764]: I0309 13:39:50.715076 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0" Mar 09 13:39:51 crc kubenswrapper[4764]: I0309 13:39:51.319801 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8554648995-p289h" event={"ID":"e921061b-2a0f-4b22-beb1-0d52993dc06b","Type":"ContainerStarted","Data":"fce1b5bc8a63a55a1345ac17564049805c448dc5638e52c8bc6cdc086dbbb27d"} Mar 09 13:39:51 crc kubenswrapper[4764]: I0309 13:39:51.319948 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-8554648995-p289h" Mar 09 13:39:51 crc kubenswrapper[4764]: I0309 13:39:51.322706 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0" Mar 09 13:39:51 crc kubenswrapper[4764]: I0309 13:39:51.379223 4764 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0" Mar 09 13:39:51 crc kubenswrapper[4764]: I0309 13:39:51.416312 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-8554648995-p289h" podStartSLOduration=4.416292992 podStartE2EDuration="4.416292992s" podCreationTimestamp="2026-03-09 13:39:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:39:51.344636383 +0000 UTC m=+1146.594808301" watchObservedRunningTime="2026-03-09 13:39:51.416292992 +0000 UTC m=+1146.666464900" Mar 09 13:39:51 crc kubenswrapper[4764]: I0309 13:39:51.471961 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0" Mar 09 13:39:51 crc kubenswrapper[4764]: I0309 13:39:51.518472 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0" Mar 09 13:39:51 crc kubenswrapper[4764]: I0309 13:39:51.569994 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="04d22384-e765-4ac8-9afa-7a31f4c347b2" path="/var/lib/kubelet/pods/04d22384-e765-4ac8-9afa-7a31f4c347b2/volumes" Mar 09 13:39:51 crc kubenswrapper[4764]: I0309 13:39:51.570502 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="85351658-0136-4066-b39e-808260c4dae9" path="/var/lib/kubelet/pods/85351658-0136-4066-b39e-808260c4dae9/volumes" Mar 09 13:39:52 crc kubenswrapper[4764]: I0309 13:39:52.484386 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0" Mar 09 13:39:52 crc kubenswrapper[4764]: I0309 13:39:52.689021 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"] Mar 09 13:39:52 crc kubenswrapper[4764]: E0309 13:39:52.689404 4764 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="04d22384-e765-4ac8-9afa-7a31f4c347b2" containerName="init" Mar 09 13:39:52 crc kubenswrapper[4764]: I0309 13:39:52.689425 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="04d22384-e765-4ac8-9afa-7a31f4c347b2" containerName="init" Mar 09 13:39:52 crc kubenswrapper[4764]: E0309 13:39:52.689466 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="85351658-0136-4066-b39e-808260c4dae9" containerName="init" Mar 09 13:39:52 crc kubenswrapper[4764]: I0309 13:39:52.689473 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="85351658-0136-4066-b39e-808260c4dae9" containerName="init" Mar 09 13:39:52 crc kubenswrapper[4764]: I0309 13:39:52.689634 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="85351658-0136-4066-b39e-808260c4dae9" containerName="init" Mar 09 13:39:52 crc kubenswrapper[4764]: I0309 13:39:52.691209 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="04d22384-e765-4ac8-9afa-7a31f4c347b2" containerName="init" Mar 09 13:39:52 crc kubenswrapper[4764]: I0309 13:39:52.692341 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-northd-0" Mar 09 13:39:52 crc kubenswrapper[4764]: I0309 13:39:52.695047 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs" Mar 09 13:39:52 crc kubenswrapper[4764]: I0309 13:39:52.695333 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config" Mar 09 13:39:52 crc kubenswrapper[4764]: I0309 13:39:52.695535 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts" Mar 09 13:39:52 crc kubenswrapper[4764]: I0309 13:39:52.696268 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-xvphx" Mar 09 13:39:52 crc kubenswrapper[4764]: I0309 13:39:52.720710 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Mar 09 13:39:52 crc kubenswrapper[4764]: I0309 13:39:52.765893 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c76h5\" (UniqueName: \"kubernetes.io/projected/142a1ef0-f024-4a81-85de-72435cd72d9e-kube-api-access-c76h5\") pod \"ovn-northd-0\" (UID: \"142a1ef0-f024-4a81-85de-72435cd72d9e\") " pod="openstack/ovn-northd-0" Mar 09 13:39:52 crc kubenswrapper[4764]: I0309 13:39:52.765968 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/142a1ef0-f024-4a81-85de-72435cd72d9e-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"142a1ef0-f024-4a81-85de-72435cd72d9e\") " pod="openstack/ovn-northd-0" Mar 09 13:39:52 crc kubenswrapper[4764]: I0309 13:39:52.766033 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/142a1ef0-f024-4a81-85de-72435cd72d9e-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"142a1ef0-f024-4a81-85de-72435cd72d9e\") " 
pod="openstack/ovn-northd-0" Mar 09 13:39:52 crc kubenswrapper[4764]: I0309 13:39:52.766066 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/142a1ef0-f024-4a81-85de-72435cd72d9e-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"142a1ef0-f024-4a81-85de-72435cd72d9e\") " pod="openstack/ovn-northd-0" Mar 09 13:39:52 crc kubenswrapper[4764]: I0309 13:39:52.766125 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/142a1ef0-f024-4a81-85de-72435cd72d9e-config\") pod \"ovn-northd-0\" (UID: \"142a1ef0-f024-4a81-85de-72435cd72d9e\") " pod="openstack/ovn-northd-0" Mar 09 13:39:52 crc kubenswrapper[4764]: I0309 13:39:52.766299 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/142a1ef0-f024-4a81-85de-72435cd72d9e-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"142a1ef0-f024-4a81-85de-72435cd72d9e\") " pod="openstack/ovn-northd-0" Mar 09 13:39:52 crc kubenswrapper[4764]: I0309 13:39:52.766371 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/142a1ef0-f024-4a81-85de-72435cd72d9e-scripts\") pod \"ovn-northd-0\" (UID: \"142a1ef0-f024-4a81-85de-72435cd72d9e\") " pod="openstack/ovn-northd-0" Mar 09 13:39:52 crc kubenswrapper[4764]: I0309 13:39:52.868226 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/142a1ef0-f024-4a81-85de-72435cd72d9e-config\") pod \"ovn-northd-0\" (UID: \"142a1ef0-f024-4a81-85de-72435cd72d9e\") " pod="openstack/ovn-northd-0" Mar 09 13:39:52 crc kubenswrapper[4764]: I0309 13:39:52.868297 4764 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/142a1ef0-f024-4a81-85de-72435cd72d9e-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"142a1ef0-f024-4a81-85de-72435cd72d9e\") " pod="openstack/ovn-northd-0" Mar 09 13:39:52 crc kubenswrapper[4764]: I0309 13:39:52.868336 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/142a1ef0-f024-4a81-85de-72435cd72d9e-scripts\") pod \"ovn-northd-0\" (UID: \"142a1ef0-f024-4a81-85de-72435cd72d9e\") " pod="openstack/ovn-northd-0" Mar 09 13:39:52 crc kubenswrapper[4764]: I0309 13:39:52.868396 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c76h5\" (UniqueName: \"kubernetes.io/projected/142a1ef0-f024-4a81-85de-72435cd72d9e-kube-api-access-c76h5\") pod \"ovn-northd-0\" (UID: \"142a1ef0-f024-4a81-85de-72435cd72d9e\") " pod="openstack/ovn-northd-0" Mar 09 13:39:52 crc kubenswrapper[4764]: I0309 13:39:52.868442 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/142a1ef0-f024-4a81-85de-72435cd72d9e-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"142a1ef0-f024-4a81-85de-72435cd72d9e\") " pod="openstack/ovn-northd-0" Mar 09 13:39:52 crc kubenswrapper[4764]: I0309 13:39:52.868498 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/142a1ef0-f024-4a81-85de-72435cd72d9e-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"142a1ef0-f024-4a81-85de-72435cd72d9e\") " pod="openstack/ovn-northd-0" Mar 09 13:39:52 crc kubenswrapper[4764]: I0309 13:39:52.868533 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/142a1ef0-f024-4a81-85de-72435cd72d9e-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"142a1ef0-f024-4a81-85de-72435cd72d9e\") 
" pod="openstack/ovn-northd-0" Mar 09 13:39:52 crc kubenswrapper[4764]: I0309 13:39:52.869086 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/142a1ef0-f024-4a81-85de-72435cd72d9e-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"142a1ef0-f024-4a81-85de-72435cd72d9e\") " pod="openstack/ovn-northd-0" Mar 09 13:39:52 crc kubenswrapper[4764]: I0309 13:39:52.869510 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/142a1ef0-f024-4a81-85de-72435cd72d9e-config\") pod \"ovn-northd-0\" (UID: \"142a1ef0-f024-4a81-85de-72435cd72d9e\") " pod="openstack/ovn-northd-0" Mar 09 13:39:52 crc kubenswrapper[4764]: I0309 13:39:52.869583 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/142a1ef0-f024-4a81-85de-72435cd72d9e-scripts\") pod \"ovn-northd-0\" (UID: \"142a1ef0-f024-4a81-85de-72435cd72d9e\") " pod="openstack/ovn-northd-0" Mar 09 13:39:52 crc kubenswrapper[4764]: I0309 13:39:52.878451 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/142a1ef0-f024-4a81-85de-72435cd72d9e-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"142a1ef0-f024-4a81-85de-72435cd72d9e\") " pod="openstack/ovn-northd-0" Mar 09 13:39:52 crc kubenswrapper[4764]: I0309 13:39:52.878812 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/142a1ef0-f024-4a81-85de-72435cd72d9e-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"142a1ef0-f024-4a81-85de-72435cd72d9e\") " pod="openstack/ovn-northd-0" Mar 09 13:39:52 crc kubenswrapper[4764]: I0309 13:39:52.880482 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/142a1ef0-f024-4a81-85de-72435cd72d9e-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"142a1ef0-f024-4a81-85de-72435cd72d9e\") " pod="openstack/ovn-northd-0" Mar 09 13:39:52 crc kubenswrapper[4764]: I0309 13:39:52.892336 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c76h5\" (UniqueName: \"kubernetes.io/projected/142a1ef0-f024-4a81-85de-72435cd72d9e-kube-api-access-c76h5\") pod \"ovn-northd-0\" (UID: \"142a1ef0-f024-4a81-85de-72435cd72d9e\") " pod="openstack/ovn-northd-0" Mar 09 13:39:53 crc kubenswrapper[4764]: I0309 13:39:53.049799 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Mar 09 13:39:53 crc kubenswrapper[4764]: I0309 13:39:53.505684 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Mar 09 13:39:54 crc kubenswrapper[4764]: I0309 13:39:54.352981 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"142a1ef0-f024-4a81-85de-72435cd72d9e","Type":"ContainerStarted","Data":"ccf88e978a422fc942b0c5235260f37090e1f60b98b5997f974460cdc3f6c062"} Mar 09 13:39:55 crc kubenswrapper[4764]: I0309 13:39:55.363569 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"142a1ef0-f024-4a81-85de-72435cd72d9e","Type":"ContainerStarted","Data":"6f05851cc84a9a82ca7126142476a5945146d39221905b5e2c6ca997048ccdc9"} Mar 09 13:39:55 crc kubenswrapper[4764]: I0309 13:39:55.363941 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"142a1ef0-f024-4a81-85de-72435cd72d9e","Type":"ContainerStarted","Data":"f8b383ab68c5e5e270fd806d9d8701126761656254d8c76342087690668262ca"} Mar 09 13:39:55 crc kubenswrapper[4764]: I0309 13:39:55.364110 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0" Mar 09 13:39:55 crc kubenswrapper[4764]: I0309 13:39:55.418299 4764 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=2.422010255 podStartE2EDuration="3.418273096s" podCreationTimestamp="2026-03-09 13:39:52 +0000 UTC" firstStartedPulling="2026-03-09 13:39:53.511973943 +0000 UTC m=+1148.762145851" lastFinishedPulling="2026-03-09 13:39:54.508236784 +0000 UTC m=+1149.758408692" observedRunningTime="2026-03-09 13:39:55.410296508 +0000 UTC m=+1150.660468416" watchObservedRunningTime="2026-03-09 13:39:55.418273096 +0000 UTC m=+1150.668445004" Mar 09 13:39:55 crc kubenswrapper[4764]: I0309 13:39:55.942221 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0" Mar 09 13:39:55 crc kubenswrapper[4764]: I0309 13:39:55.942636 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0" Mar 09 13:39:56 crc kubenswrapper[4764]: I0309 13:39:56.015281 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0" Mar 09 13:39:56 crc kubenswrapper[4764]: I0309 13:39:56.449165 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0" Mar 09 13:39:57 crc kubenswrapper[4764]: I0309 13:39:57.295868 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0" Mar 09 13:39:57 crc kubenswrapper[4764]: I0309 13:39:57.296365 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0" Mar 09 13:39:57 crc kubenswrapper[4764]: I0309 13:39:57.329575 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5bf47b49b7-fmxtl" Mar 09 13:39:57 crc kubenswrapper[4764]: I0309 13:39:57.427786 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0" Mar 09 13:39:57 crc kubenswrapper[4764]: I0309 
13:39:57.440025 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-8554648995-p289h" Mar 09 13:39:57 crc kubenswrapper[4764]: I0309 13:39:57.512024 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5bf47b49b7-fmxtl"] Mar 09 13:39:57 crc kubenswrapper[4764]: I0309 13:39:57.512243 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5bf47b49b7-fmxtl" podUID="2bb93a9a-6443-4352-b7ae-64f953af06c3" containerName="dnsmasq-dns" containerID="cri-o://c1e00e2d332968b73536bcf21385c897383efbfb11adcc7b64e82bf17537d564" gracePeriod=10 Mar 09 13:39:57 crc kubenswrapper[4764]: I0309 13:39:57.557420 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0" Mar 09 13:39:57 crc kubenswrapper[4764]: I0309 13:39:57.995671 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-d7e9-account-create-update-n7gsb"] Mar 09 13:39:57 crc kubenswrapper[4764]: I0309 13:39:57.997447 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-d7e9-account-create-update-n7gsb" Mar 09 13:39:57 crc kubenswrapper[4764]: I0309 13:39:57.999594 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5bf47b49b7-fmxtl" Mar 09 13:39:58 crc kubenswrapper[4764]: I0309 13:39:57.999992 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Mar 09 13:39:58 crc kubenswrapper[4764]: I0309 13:39:58.019677 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-d7e9-account-create-update-n7gsb"] Mar 09 13:39:58 crc kubenswrapper[4764]: I0309 13:39:58.028833 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-wkxnp"] Mar 09 13:39:58 crc kubenswrapper[4764]: E0309 13:39:58.039874 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2bb93a9a-6443-4352-b7ae-64f953af06c3" containerName="dnsmasq-dns" Mar 09 13:39:58 crc kubenswrapper[4764]: I0309 13:39:58.039916 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="2bb93a9a-6443-4352-b7ae-64f953af06c3" containerName="dnsmasq-dns" Mar 09 13:39:58 crc kubenswrapper[4764]: E0309 13:39:58.039970 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2bb93a9a-6443-4352-b7ae-64f953af06c3" containerName="init" Mar 09 13:39:58 crc kubenswrapper[4764]: I0309 13:39:58.039982 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="2bb93a9a-6443-4352-b7ae-64f953af06c3" containerName="init" Mar 09 13:39:58 crc kubenswrapper[4764]: I0309 13:39:58.040363 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="2bb93a9a-6443-4352-b7ae-64f953af06c3" containerName="dnsmasq-dns" Mar 09 13:39:58 crc kubenswrapper[4764]: I0309 13:39:58.041097 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-wkxnp"] Mar 09 13:39:58 crc kubenswrapper[4764]: I0309 13:39:58.041202 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-wkxnp" Mar 09 13:39:58 crc kubenswrapper[4764]: I0309 13:39:58.071596 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2bb93a9a-6443-4352-b7ae-64f953af06c3-ovsdbserver-nb\") pod \"2bb93a9a-6443-4352-b7ae-64f953af06c3\" (UID: \"2bb93a9a-6443-4352-b7ae-64f953af06c3\") " Mar 09 13:39:58 crc kubenswrapper[4764]: I0309 13:39:58.071716 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2bb93a9a-6443-4352-b7ae-64f953af06c3-config\") pod \"2bb93a9a-6443-4352-b7ae-64f953af06c3\" (UID: \"2bb93a9a-6443-4352-b7ae-64f953af06c3\") " Mar 09 13:39:58 crc kubenswrapper[4764]: I0309 13:39:58.071740 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2bb93a9a-6443-4352-b7ae-64f953af06c3-dns-svc\") pod \"2bb93a9a-6443-4352-b7ae-64f953af06c3\" (UID: \"2bb93a9a-6443-4352-b7ae-64f953af06c3\") " Mar 09 13:39:58 crc kubenswrapper[4764]: I0309 13:39:58.071781 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kl6fl\" (UniqueName: \"kubernetes.io/projected/2bb93a9a-6443-4352-b7ae-64f953af06c3-kube-api-access-kl6fl\") pod \"2bb93a9a-6443-4352-b7ae-64f953af06c3\" (UID: \"2bb93a9a-6443-4352-b7ae-64f953af06c3\") " Mar 09 13:39:58 crc kubenswrapper[4764]: I0309 13:39:58.072059 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vdsck\" (UniqueName: \"kubernetes.io/projected/75f29150-3689-48a6-9248-b6774f85fcd2-kube-api-access-vdsck\") pod \"glance-db-create-wkxnp\" (UID: \"75f29150-3689-48a6-9248-b6774f85fcd2\") " pod="openstack/glance-db-create-wkxnp" Mar 09 13:39:58 crc kubenswrapper[4764]: I0309 13:39:58.072087 4764 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/693ba99b-99d0-4b09-9f49-9deefe05abac-operator-scripts\") pod \"glance-d7e9-account-create-update-n7gsb\" (UID: \"693ba99b-99d0-4b09-9f49-9deefe05abac\") " pod="openstack/glance-d7e9-account-create-update-n7gsb" Mar 09 13:39:58 crc kubenswrapper[4764]: I0309 13:39:58.072150 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2hszn\" (UniqueName: \"kubernetes.io/projected/693ba99b-99d0-4b09-9f49-9deefe05abac-kube-api-access-2hszn\") pod \"glance-d7e9-account-create-update-n7gsb\" (UID: \"693ba99b-99d0-4b09-9f49-9deefe05abac\") " pod="openstack/glance-d7e9-account-create-update-n7gsb" Mar 09 13:39:58 crc kubenswrapper[4764]: I0309 13:39:58.072222 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/75f29150-3689-48a6-9248-b6774f85fcd2-operator-scripts\") pod \"glance-db-create-wkxnp\" (UID: \"75f29150-3689-48a6-9248-b6774f85fcd2\") " pod="openstack/glance-db-create-wkxnp" Mar 09 13:39:58 crc kubenswrapper[4764]: I0309 13:39:58.084938 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2bb93a9a-6443-4352-b7ae-64f953af06c3-kube-api-access-kl6fl" (OuterVolumeSpecName: "kube-api-access-kl6fl") pod "2bb93a9a-6443-4352-b7ae-64f953af06c3" (UID: "2bb93a9a-6443-4352-b7ae-64f953af06c3"). InnerVolumeSpecName "kube-api-access-kl6fl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:39:58 crc kubenswrapper[4764]: I0309 13:39:58.133270 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2bb93a9a-6443-4352-b7ae-64f953af06c3-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "2bb93a9a-6443-4352-b7ae-64f953af06c3" (UID: "2bb93a9a-6443-4352-b7ae-64f953af06c3"). 
InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:39:58 crc kubenswrapper[4764]: I0309 13:39:58.144240 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2bb93a9a-6443-4352-b7ae-64f953af06c3-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "2bb93a9a-6443-4352-b7ae-64f953af06c3" (UID: "2bb93a9a-6443-4352-b7ae-64f953af06c3"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:39:58 crc kubenswrapper[4764]: I0309 13:39:58.145511 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2bb93a9a-6443-4352-b7ae-64f953af06c3-config" (OuterVolumeSpecName: "config") pod "2bb93a9a-6443-4352-b7ae-64f953af06c3" (UID: "2bb93a9a-6443-4352-b7ae-64f953af06c3"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:39:58 crc kubenswrapper[4764]: I0309 13:39:58.174258 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/75f29150-3689-48a6-9248-b6774f85fcd2-operator-scripts\") pod \"glance-db-create-wkxnp\" (UID: \"75f29150-3689-48a6-9248-b6774f85fcd2\") " pod="openstack/glance-db-create-wkxnp" Mar 09 13:39:58 crc kubenswrapper[4764]: I0309 13:39:58.174364 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vdsck\" (UniqueName: \"kubernetes.io/projected/75f29150-3689-48a6-9248-b6774f85fcd2-kube-api-access-vdsck\") pod \"glance-db-create-wkxnp\" (UID: \"75f29150-3689-48a6-9248-b6774f85fcd2\") " pod="openstack/glance-db-create-wkxnp" Mar 09 13:39:58 crc kubenswrapper[4764]: I0309 13:39:58.174388 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/693ba99b-99d0-4b09-9f49-9deefe05abac-operator-scripts\") pod 
\"glance-d7e9-account-create-update-n7gsb\" (UID: \"693ba99b-99d0-4b09-9f49-9deefe05abac\") " pod="openstack/glance-d7e9-account-create-update-n7gsb" Mar 09 13:39:58 crc kubenswrapper[4764]: I0309 13:39:58.174438 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2hszn\" (UniqueName: \"kubernetes.io/projected/693ba99b-99d0-4b09-9f49-9deefe05abac-kube-api-access-2hszn\") pod \"glance-d7e9-account-create-update-n7gsb\" (UID: \"693ba99b-99d0-4b09-9f49-9deefe05abac\") " pod="openstack/glance-d7e9-account-create-update-n7gsb" Mar 09 13:39:58 crc kubenswrapper[4764]: I0309 13:39:58.174628 4764 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2bb93a9a-6443-4352-b7ae-64f953af06c3-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 09 13:39:58 crc kubenswrapper[4764]: I0309 13:39:58.175077 4764 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2bb93a9a-6443-4352-b7ae-64f953af06c3-config\") on node \"crc\" DevicePath \"\"" Mar 09 13:39:58 crc kubenswrapper[4764]: I0309 13:39:58.175110 4764 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2bb93a9a-6443-4352-b7ae-64f953af06c3-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 09 13:39:58 crc kubenswrapper[4764]: I0309 13:39:58.175123 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kl6fl\" (UniqueName: \"kubernetes.io/projected/2bb93a9a-6443-4352-b7ae-64f953af06c3-kube-api-access-kl6fl\") on node \"crc\" DevicePath \"\"" Mar 09 13:39:58 crc kubenswrapper[4764]: I0309 13:39:58.175397 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/75f29150-3689-48a6-9248-b6774f85fcd2-operator-scripts\") pod \"glance-db-create-wkxnp\" (UID: \"75f29150-3689-48a6-9248-b6774f85fcd2\") " 
pod="openstack/glance-db-create-wkxnp" Mar 09 13:39:58 crc kubenswrapper[4764]: I0309 13:39:58.175514 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/693ba99b-99d0-4b09-9f49-9deefe05abac-operator-scripts\") pod \"glance-d7e9-account-create-update-n7gsb\" (UID: \"693ba99b-99d0-4b09-9f49-9deefe05abac\") " pod="openstack/glance-d7e9-account-create-update-n7gsb" Mar 09 13:39:58 crc kubenswrapper[4764]: I0309 13:39:58.192933 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2hszn\" (UniqueName: \"kubernetes.io/projected/693ba99b-99d0-4b09-9f49-9deefe05abac-kube-api-access-2hszn\") pod \"glance-d7e9-account-create-update-n7gsb\" (UID: \"693ba99b-99d0-4b09-9f49-9deefe05abac\") " pod="openstack/glance-d7e9-account-create-update-n7gsb" Mar 09 13:39:58 crc kubenswrapper[4764]: I0309 13:39:58.194588 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vdsck\" (UniqueName: \"kubernetes.io/projected/75f29150-3689-48a6-9248-b6774f85fcd2-kube-api-access-vdsck\") pod \"glance-db-create-wkxnp\" (UID: \"75f29150-3689-48a6-9248-b6774f85fcd2\") " pod="openstack/glance-db-create-wkxnp" Mar 09 13:39:58 crc kubenswrapper[4764]: I0309 13:39:58.324705 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-d7e9-account-create-update-n7gsb" Mar 09 13:39:58 crc kubenswrapper[4764]: I0309 13:39:58.364924 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-wkxnp" Mar 09 13:39:58 crc kubenswrapper[4764]: I0309 13:39:58.370240 4764 patch_prober.go:28] interesting pod/machine-config-daemon-xxczl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 09 13:39:58 crc kubenswrapper[4764]: I0309 13:39:58.370285 4764 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 09 13:39:58 crc kubenswrapper[4764]: I0309 13:39:58.390128 4764 generic.go:334] "Generic (PLEG): container finished" podID="2bb93a9a-6443-4352-b7ae-64f953af06c3" containerID="c1e00e2d332968b73536bcf21385c897383efbfb11adcc7b64e82bf17537d564" exitCode=0 Mar 09 13:39:58 crc kubenswrapper[4764]: I0309 13:39:58.390240 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5bf47b49b7-fmxtl" Mar 09 13:39:58 crc kubenswrapper[4764]: I0309 13:39:58.390474 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5bf47b49b7-fmxtl" event={"ID":"2bb93a9a-6443-4352-b7ae-64f953af06c3","Type":"ContainerDied","Data":"c1e00e2d332968b73536bcf21385c897383efbfb11adcc7b64e82bf17537d564"} Mar 09 13:39:58 crc kubenswrapper[4764]: I0309 13:39:58.390588 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5bf47b49b7-fmxtl" event={"ID":"2bb93a9a-6443-4352-b7ae-64f953af06c3","Type":"ContainerDied","Data":"28b675926fb711438a050500ec110912822d9d8bb336d51faa38b7f7dfaac442"} Mar 09 13:39:58 crc kubenswrapper[4764]: I0309 13:39:58.390713 4764 scope.go:117] "RemoveContainer" containerID="c1e00e2d332968b73536bcf21385c897383efbfb11adcc7b64e82bf17537d564" Mar 09 13:39:58 crc kubenswrapper[4764]: I0309 13:39:58.433086 4764 scope.go:117] "RemoveContainer" containerID="b6ee8ca17d3dd32257565eee931384458b6f3e51cb8dbc64e3792bc8375f65d4" Mar 09 13:39:58 crc kubenswrapper[4764]: I0309 13:39:58.437292 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5bf47b49b7-fmxtl"] Mar 09 13:39:58 crc kubenswrapper[4764]: I0309 13:39:58.447324 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5bf47b49b7-fmxtl"] Mar 09 13:39:58 crc kubenswrapper[4764]: I0309 13:39:58.485944 4764 scope.go:117] "RemoveContainer" containerID="c1e00e2d332968b73536bcf21385c897383efbfb11adcc7b64e82bf17537d564" Mar 09 13:39:58 crc kubenswrapper[4764]: E0309 13:39:58.495333 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c1e00e2d332968b73536bcf21385c897383efbfb11adcc7b64e82bf17537d564\": container with ID starting with c1e00e2d332968b73536bcf21385c897383efbfb11adcc7b64e82bf17537d564 not found: ID does not exist" 
containerID="c1e00e2d332968b73536bcf21385c897383efbfb11adcc7b64e82bf17537d564" Mar 09 13:39:58 crc kubenswrapper[4764]: I0309 13:39:58.495385 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c1e00e2d332968b73536bcf21385c897383efbfb11adcc7b64e82bf17537d564"} err="failed to get container status \"c1e00e2d332968b73536bcf21385c897383efbfb11adcc7b64e82bf17537d564\": rpc error: code = NotFound desc = could not find container \"c1e00e2d332968b73536bcf21385c897383efbfb11adcc7b64e82bf17537d564\": container with ID starting with c1e00e2d332968b73536bcf21385c897383efbfb11adcc7b64e82bf17537d564 not found: ID does not exist" Mar 09 13:39:58 crc kubenswrapper[4764]: I0309 13:39:58.495420 4764 scope.go:117] "RemoveContainer" containerID="b6ee8ca17d3dd32257565eee931384458b6f3e51cb8dbc64e3792bc8375f65d4" Mar 09 13:39:58 crc kubenswrapper[4764]: E0309 13:39:58.498910 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b6ee8ca17d3dd32257565eee931384458b6f3e51cb8dbc64e3792bc8375f65d4\": container with ID starting with b6ee8ca17d3dd32257565eee931384458b6f3e51cb8dbc64e3792bc8375f65d4 not found: ID does not exist" containerID="b6ee8ca17d3dd32257565eee931384458b6f3e51cb8dbc64e3792bc8375f65d4" Mar 09 13:39:58 crc kubenswrapper[4764]: I0309 13:39:58.498940 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b6ee8ca17d3dd32257565eee931384458b6f3e51cb8dbc64e3792bc8375f65d4"} err="failed to get container status \"b6ee8ca17d3dd32257565eee931384458b6f3e51cb8dbc64e3792bc8375f65d4\": rpc error: code = NotFound desc = could not find container \"b6ee8ca17d3dd32257565eee931384458b6f3e51cb8dbc64e3792bc8375f65d4\": container with ID starting with b6ee8ca17d3dd32257565eee931384458b6f3e51cb8dbc64e3792bc8375f65d4 not found: ID does not exist" Mar 09 13:39:58 crc kubenswrapper[4764]: I0309 13:39:58.764420 4764 kubelet.go:2421] 
"SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-66ln9"] Mar 09 13:39:58 crc kubenswrapper[4764]: I0309 13:39:58.774886 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-66ln9" Mar 09 13:39:58 crc kubenswrapper[4764]: I0309 13:39:58.781973 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-66ln9"] Mar 09 13:39:58 crc kubenswrapper[4764]: I0309 13:39:58.829078 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-d7e9-account-create-update-n7gsb"] Mar 09 13:39:58 crc kubenswrapper[4764]: I0309 13:39:58.894283 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c9pfh\" (UniqueName: \"kubernetes.io/projected/811ef770-3be6-4f3b-9fc3-dee4df710c4f-kube-api-access-c9pfh\") pod \"keystone-db-create-66ln9\" (UID: \"811ef770-3be6-4f3b-9fc3-dee4df710c4f\") " pod="openstack/keystone-db-create-66ln9" Mar 09 13:39:58 crc kubenswrapper[4764]: I0309 13:39:58.895068 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/811ef770-3be6-4f3b-9fc3-dee4df710c4f-operator-scripts\") pod \"keystone-db-create-66ln9\" (UID: \"811ef770-3be6-4f3b-9fc3-dee4df710c4f\") " pod="openstack/keystone-db-create-66ln9" Mar 09 13:39:58 crc kubenswrapper[4764]: I0309 13:39:58.924023 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-594d-account-create-update-dxsw5"] Mar 09 13:39:58 crc kubenswrapper[4764]: I0309 13:39:58.934693 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-594d-account-create-update-dxsw5"
Mar 09 13:39:58 crc kubenswrapper[4764]: I0309 13:39:58.945360 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret"
Mar 09 13:39:58 crc kubenswrapper[4764]: I0309 13:39:58.970273 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-594d-account-create-update-dxsw5"]
Mar 09 13:39:58 crc kubenswrapper[4764]: I0309 13:39:58.992953 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-kn2lh"]
Mar 09 13:39:58 crc kubenswrapper[4764]: I0309 13:39:58.994576 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-kn2lh"
Mar 09 13:39:58 crc kubenswrapper[4764]: I0309 13:39:58.997911 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c9pfh\" (UniqueName: \"kubernetes.io/projected/811ef770-3be6-4f3b-9fc3-dee4df710c4f-kube-api-access-c9pfh\") pod \"keystone-db-create-66ln9\" (UID: \"811ef770-3be6-4f3b-9fc3-dee4df710c4f\") " pod="openstack/keystone-db-create-66ln9"
Mar 09 13:39:58 crc kubenswrapper[4764]: I0309 13:39:58.998042 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/811ef770-3be6-4f3b-9fc3-dee4df710c4f-operator-scripts\") pod \"keystone-db-create-66ln9\" (UID: \"811ef770-3be6-4f3b-9fc3-dee4df710c4f\") " pod="openstack/keystone-db-create-66ln9"
Mar 09 13:39:59 crc kubenswrapper[4764]: I0309 13:39:58.999051 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/811ef770-3be6-4f3b-9fc3-dee4df710c4f-operator-scripts\") pod \"keystone-db-create-66ln9\" (UID: \"811ef770-3be6-4f3b-9fc3-dee4df710c4f\") " pod="openstack/keystone-db-create-66ln9"
Mar 09 13:39:59 crc kubenswrapper[4764]: I0309 13:39:59.028974 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-kn2lh"]
Mar 09 13:39:59 crc kubenswrapper[4764]: I0309 13:39:59.054402 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-wkxnp"]
Mar 09 13:39:59 crc kubenswrapper[4764]: I0309 13:39:59.067559 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c9pfh\" (UniqueName: \"kubernetes.io/projected/811ef770-3be6-4f3b-9fc3-dee4df710c4f-kube-api-access-c9pfh\") pod \"keystone-db-create-66ln9\" (UID: \"811ef770-3be6-4f3b-9fc3-dee4df710c4f\") " pod="openstack/keystone-db-create-66ln9"
Mar 09 13:39:59 crc kubenswrapper[4764]: I0309 13:39:59.081363 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-0f8b-account-create-update-mxbcn"]
Mar 09 13:39:59 crc kubenswrapper[4764]: I0309 13:39:59.082808 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-0f8b-account-create-update-mxbcn"
Mar 09 13:39:59 crc kubenswrapper[4764]: I0309 13:39:59.087461 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret"
Mar 09 13:39:59 crc kubenswrapper[4764]: I0309 13:39:59.092327 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-0f8b-account-create-update-mxbcn"]
Mar 09 13:39:59 crc kubenswrapper[4764]: I0309 13:39:59.100695 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-665kq\" (UniqueName: \"kubernetes.io/projected/7d681487-9af9-48e3-bb79-569b8c7bf26d-kube-api-access-665kq\") pod \"placement-db-create-kn2lh\" (UID: \"7d681487-9af9-48e3-bb79-569b8c7bf26d\") " pod="openstack/placement-db-create-kn2lh"
Mar 09 13:39:59 crc kubenswrapper[4764]: I0309 13:39:59.100767 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/01e4dc90-6790-447b-ac2a-d2dfcde88d17-operator-scripts\") pod \"keystone-594d-account-create-update-dxsw5\" (UID: \"01e4dc90-6790-447b-ac2a-d2dfcde88d17\") " pod="openstack/keystone-594d-account-create-update-dxsw5"
Mar 09 13:39:59 crc kubenswrapper[4764]: I0309 13:39:59.100928 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7d681487-9af9-48e3-bb79-569b8c7bf26d-operator-scripts\") pod \"placement-db-create-kn2lh\" (UID: \"7d681487-9af9-48e3-bb79-569b8c7bf26d\") " pod="openstack/placement-db-create-kn2lh"
Mar 09 13:39:59 crc kubenswrapper[4764]: I0309 13:39:59.101076 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lpt8k\" (UniqueName: \"kubernetes.io/projected/01e4dc90-6790-447b-ac2a-d2dfcde88d17-kube-api-access-lpt8k\") pod \"keystone-594d-account-create-update-dxsw5\" (UID: \"01e4dc90-6790-447b-ac2a-d2dfcde88d17\") " pod="openstack/keystone-594d-account-create-update-dxsw5"
Mar 09 13:39:59 crc kubenswrapper[4764]: I0309 13:39:59.174020 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-66ln9"
Mar 09 13:39:59 crc kubenswrapper[4764]: I0309 13:39:59.204215 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7d681487-9af9-48e3-bb79-569b8c7bf26d-operator-scripts\") pod \"placement-db-create-kn2lh\" (UID: \"7d681487-9af9-48e3-bb79-569b8c7bf26d\") " pod="openstack/placement-db-create-kn2lh"
Mar 09 13:39:59 crc kubenswrapper[4764]: I0309 13:39:59.204330 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rst47\" (UniqueName: \"kubernetes.io/projected/9d27c011-b8dd-4f14-9833-413f7a8faf8a-kube-api-access-rst47\") pod \"placement-0f8b-account-create-update-mxbcn\" (UID: \"9d27c011-b8dd-4f14-9833-413f7a8faf8a\") " pod="openstack/placement-0f8b-account-create-update-mxbcn"
Mar 09 13:39:59 crc kubenswrapper[4764]: I0309 13:39:59.204395 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lpt8k\" (UniqueName: \"kubernetes.io/projected/01e4dc90-6790-447b-ac2a-d2dfcde88d17-kube-api-access-lpt8k\") pod \"keystone-594d-account-create-update-dxsw5\" (UID: \"01e4dc90-6790-447b-ac2a-d2dfcde88d17\") " pod="openstack/keystone-594d-account-create-update-dxsw5"
Mar 09 13:39:59 crc kubenswrapper[4764]: I0309 13:39:59.204453 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9d27c011-b8dd-4f14-9833-413f7a8faf8a-operator-scripts\") pod \"placement-0f8b-account-create-update-mxbcn\" (UID: \"9d27c011-b8dd-4f14-9833-413f7a8faf8a\") " pod="openstack/placement-0f8b-account-create-update-mxbcn"
Mar 09 13:39:59 crc kubenswrapper[4764]: I0309 13:39:59.204504 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-665kq\" (UniqueName: \"kubernetes.io/projected/7d681487-9af9-48e3-bb79-569b8c7bf26d-kube-api-access-665kq\") pod \"placement-db-create-kn2lh\" (UID: \"7d681487-9af9-48e3-bb79-569b8c7bf26d\") " pod="openstack/placement-db-create-kn2lh"
Mar 09 13:39:59 crc kubenswrapper[4764]: I0309 13:39:59.204661 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/01e4dc90-6790-447b-ac2a-d2dfcde88d17-operator-scripts\") pod \"keystone-594d-account-create-update-dxsw5\" (UID: \"01e4dc90-6790-447b-ac2a-d2dfcde88d17\") " pod="openstack/keystone-594d-account-create-update-dxsw5"
Mar 09 13:39:59 crc kubenswrapper[4764]: I0309 13:39:59.205225 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7d681487-9af9-48e3-bb79-569b8c7bf26d-operator-scripts\") pod \"placement-db-create-kn2lh\" (UID: \"7d681487-9af9-48e3-bb79-569b8c7bf26d\") " pod="openstack/placement-db-create-kn2lh"
Mar 09 13:39:59 crc kubenswrapper[4764]: I0309 13:39:59.205792 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/01e4dc90-6790-447b-ac2a-d2dfcde88d17-operator-scripts\") pod \"keystone-594d-account-create-update-dxsw5\" (UID: \"01e4dc90-6790-447b-ac2a-d2dfcde88d17\") " pod="openstack/keystone-594d-account-create-update-dxsw5"
Mar 09 13:39:59 crc kubenswrapper[4764]: I0309 13:39:59.228709 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lpt8k\" (UniqueName: \"kubernetes.io/projected/01e4dc90-6790-447b-ac2a-d2dfcde88d17-kube-api-access-lpt8k\") pod \"keystone-594d-account-create-update-dxsw5\" (UID: \"01e4dc90-6790-447b-ac2a-d2dfcde88d17\") " pod="openstack/keystone-594d-account-create-update-dxsw5"
Mar 09 13:39:59 crc kubenswrapper[4764]: I0309 13:39:59.232237 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-665kq\" (UniqueName: \"kubernetes.io/projected/7d681487-9af9-48e3-bb79-569b8c7bf26d-kube-api-access-665kq\") pod \"placement-db-create-kn2lh\" (UID: \"7d681487-9af9-48e3-bb79-569b8c7bf26d\") " pod="openstack/placement-db-create-kn2lh"
Mar 09 13:39:59 crc kubenswrapper[4764]: I0309 13:39:59.306987 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rst47\" (UniqueName: \"kubernetes.io/projected/9d27c011-b8dd-4f14-9833-413f7a8faf8a-kube-api-access-rst47\") pod \"placement-0f8b-account-create-update-mxbcn\" (UID: \"9d27c011-b8dd-4f14-9833-413f7a8faf8a\") " pod="openstack/placement-0f8b-account-create-update-mxbcn"
Mar 09 13:39:59 crc kubenswrapper[4764]: I0309 13:39:59.307443 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9d27c011-b8dd-4f14-9833-413f7a8faf8a-operator-scripts\") pod \"placement-0f8b-account-create-update-mxbcn\" (UID: \"9d27c011-b8dd-4f14-9833-413f7a8faf8a\") " pod="openstack/placement-0f8b-account-create-update-mxbcn"
Mar 09 13:39:59 crc kubenswrapper[4764]: I0309 13:39:59.308250 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9d27c011-b8dd-4f14-9833-413f7a8faf8a-operator-scripts\") pod \"placement-0f8b-account-create-update-mxbcn\" (UID: \"9d27c011-b8dd-4f14-9833-413f7a8faf8a\") " pod="openstack/placement-0f8b-account-create-update-mxbcn"
Mar 09 13:39:59 crc kubenswrapper[4764]: I0309 13:39:59.330669 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-594d-account-create-update-dxsw5"
Mar 09 13:39:59 crc kubenswrapper[4764]: I0309 13:39:59.350443 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rst47\" (UniqueName: \"kubernetes.io/projected/9d27c011-b8dd-4f14-9833-413f7a8faf8a-kube-api-access-rst47\") pod \"placement-0f8b-account-create-update-mxbcn\" (UID: \"9d27c011-b8dd-4f14-9833-413f7a8faf8a\") " pod="openstack/placement-0f8b-account-create-update-mxbcn"
Mar 09 13:39:59 crc kubenswrapper[4764]: I0309 13:39:59.415805 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-d7e9-account-create-update-n7gsb" event={"ID":"693ba99b-99d0-4b09-9f49-9deefe05abac","Type":"ContainerStarted","Data":"da76ff68a5f6a4d7d50712be1138bbea55547749d69b4508eda52be21fd47ce6"}
Mar 09 13:39:59 crc kubenswrapper[4764]: I0309 13:39:59.415852 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-d7e9-account-create-update-n7gsb" event={"ID":"693ba99b-99d0-4b09-9f49-9deefe05abac","Type":"ContainerStarted","Data":"f80fe4762d2af82804e15db7d3b8d4627b64c32ef1882fdf483222746daef33d"}
Mar 09 13:39:59 crc kubenswrapper[4764]: I0309 13:39:59.438172 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-wkxnp" event={"ID":"75f29150-3689-48a6-9248-b6774f85fcd2","Type":"ContainerStarted","Data":"98058d7a077d55dcc4dc3081744bc24f8ca7e82793695f8748e2850e19fbd5a3"}
Mar 09 13:39:59 crc kubenswrapper[4764]: I0309 13:39:59.438225 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-wkxnp" event={"ID":"75f29150-3689-48a6-9248-b6774f85fcd2","Type":"ContainerStarted","Data":"f6bcb6e597f958bade23e793909cd2f5b0c44f96f937977251a4fc97d7a51cda"}
Mar 09 13:39:59 crc kubenswrapper[4764]: I0309 13:39:59.444510 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-d7e9-account-create-update-n7gsb" podStartSLOduration=2.444484843 podStartE2EDuration="2.444484843s" podCreationTimestamp="2026-03-09 13:39:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:39:59.437881491 +0000 UTC m=+1154.688053419" watchObservedRunningTime="2026-03-09 13:39:59.444484843 +0000 UTC m=+1154.694656761"
Mar 09 13:39:59 crc kubenswrapper[4764]: I0309 13:39:59.462537 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-kn2lh"
Mar 09 13:39:59 crc kubenswrapper[4764]: I0309 13:39:59.463734 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-0f8b-account-create-update-mxbcn"
Mar 09 13:39:59 crc kubenswrapper[4764]: I0309 13:39:59.471991 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-create-wkxnp" podStartSLOduration=2.471955059 podStartE2EDuration="2.471955059s" podCreationTimestamp="2026-03-09 13:39:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:39:59.46315196 +0000 UTC m=+1154.713323868" watchObservedRunningTime="2026-03-09 13:39:59.471955059 +0000 UTC m=+1154.722126967"
Mar 09 13:39:59 crc kubenswrapper[4764]: I0309 13:39:59.603462 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2bb93a9a-6443-4352-b7ae-64f953af06c3" path="/var/lib/kubelet/pods/2bb93a9a-6443-4352-b7ae-64f953af06c3/volumes"
Mar 09 13:39:59 crc kubenswrapper[4764]: I0309 13:39:59.748365 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-66ln9"]
Mar 09 13:39:59 crc kubenswrapper[4764]: I0309 13:39:59.909529 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-594d-account-create-update-dxsw5"]
Mar 09 13:40:00 crc kubenswrapper[4764]: I0309 13:40:00.122152 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-kn2lh"]
Mar 09 13:40:00 crc kubenswrapper[4764]: I0309 13:40:00.148692 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29551060-n7f54"]
Mar 09 13:40:00 crc kubenswrapper[4764]: I0309 13:40:00.154321 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551060-n7f54"
Mar 09 13:40:00 crc kubenswrapper[4764]: I0309 13:40:00.157385 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551060-n7f54"]
Mar 09 13:40:00 crc kubenswrapper[4764]: I0309 13:40:00.160429 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 09 13:40:00 crc kubenswrapper[4764]: I0309 13:40:00.160477 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-4s7mr"
Mar 09 13:40:00 crc kubenswrapper[4764]: I0309 13:40:00.161039 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 09 13:40:00 crc kubenswrapper[4764]: I0309 13:40:00.225320 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gb5hw\" (UniqueName: \"kubernetes.io/projected/16623a65-1bef-4faa-a891-bae0a7d04977-kube-api-access-gb5hw\") pod \"auto-csr-approver-29551060-n7f54\" (UID: \"16623a65-1bef-4faa-a891-bae0a7d04977\") " pod="openshift-infra/auto-csr-approver-29551060-n7f54"
Mar 09 13:40:00 crc kubenswrapper[4764]: I0309 13:40:00.241616 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-0f8b-account-create-update-mxbcn"]
Mar 09 13:40:00 crc kubenswrapper[4764]: I0309 13:40:00.327497 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gb5hw\" (UniqueName: \"kubernetes.io/projected/16623a65-1bef-4faa-a891-bae0a7d04977-kube-api-access-gb5hw\") pod \"auto-csr-approver-29551060-n7f54\" (UID: \"16623a65-1bef-4faa-a891-bae0a7d04977\") " pod="openshift-infra/auto-csr-approver-29551060-n7f54"
Mar 09 13:40:00 crc kubenswrapper[4764]: I0309 13:40:00.352961 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gb5hw\" (UniqueName: \"kubernetes.io/projected/16623a65-1bef-4faa-a891-bae0a7d04977-kube-api-access-gb5hw\") pod \"auto-csr-approver-29551060-n7f54\" (UID: \"16623a65-1bef-4faa-a891-bae0a7d04977\") " pod="openshift-infra/auto-csr-approver-29551060-n7f54"
Mar 09 13:40:00 crc kubenswrapper[4764]: I0309 13:40:00.451678 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-594d-account-create-update-dxsw5" event={"ID":"01e4dc90-6790-447b-ac2a-d2dfcde88d17","Type":"ContainerStarted","Data":"6de1cfcabddc41034a2e1c58fb3ef2484991e0985876cbbdc822fe1df3641db1"}
Mar 09 13:40:00 crc kubenswrapper[4764]: I0309 13:40:00.451752 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-594d-account-create-update-dxsw5" event={"ID":"01e4dc90-6790-447b-ac2a-d2dfcde88d17","Type":"ContainerStarted","Data":"6db0dcbeaa0e445d9563d80bdb4ee91587520750c60db46e9ceacf66f6d60c54"}
Mar 09 13:40:00 crc kubenswrapper[4764]: I0309 13:40:00.455406 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-0f8b-account-create-update-mxbcn" event={"ID":"9d27c011-b8dd-4f14-9833-413f7a8faf8a","Type":"ContainerStarted","Data":"01ef1c726b7e7270c862ba5b9e31c73fecc7bf0ea188cc226f1bf52e6cb5af33"}
Mar 09 13:40:00 crc kubenswrapper[4764]: I0309 13:40:00.455440 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-0f8b-account-create-update-mxbcn" event={"ID":"9d27c011-b8dd-4f14-9833-413f7a8faf8a","Type":"ContainerStarted","Data":"1f18354cc3ede52419192c5d865dc6321532287310adb5cffdafe9ba08f30940"}
Mar 09 13:40:00 crc kubenswrapper[4764]: I0309 13:40:00.460691 4764 generic.go:334] "Generic (PLEG): container finished" podID="693ba99b-99d0-4b09-9f49-9deefe05abac" containerID="da76ff68a5f6a4d7d50712be1138bbea55547749d69b4508eda52be21fd47ce6" exitCode=0
Mar 09 13:40:00 crc kubenswrapper[4764]: I0309 13:40:00.460775 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-d7e9-account-create-update-n7gsb" event={"ID":"693ba99b-99d0-4b09-9f49-9deefe05abac","Type":"ContainerDied","Data":"da76ff68a5f6a4d7d50712be1138bbea55547749d69b4508eda52be21fd47ce6"}
Mar 09 13:40:00 crc kubenswrapper[4764]: I0309 13:40:00.463236 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-kn2lh" event={"ID":"7d681487-9af9-48e3-bb79-569b8c7bf26d","Type":"ContainerStarted","Data":"6bccf5846329079b445a150aae1cbe1d8637bb12a3644ba9afce7863cefc0fe8"}
Mar 09 13:40:00 crc kubenswrapper[4764]: I0309 13:40:00.463291 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-kn2lh" event={"ID":"7d681487-9af9-48e3-bb79-569b8c7bf26d","Type":"ContainerStarted","Data":"cb35dabb9f5f0edd526374363673fcbd6955aa3235f600a648fe50273b0796dd"}
Mar 09 13:40:00 crc kubenswrapper[4764]: I0309 13:40:00.465087 4764 generic.go:334] "Generic (PLEG): container finished" podID="75f29150-3689-48a6-9248-b6774f85fcd2" containerID="98058d7a077d55dcc4dc3081744bc24f8ca7e82793695f8748e2850e19fbd5a3" exitCode=0
Mar 09 13:40:00 crc kubenswrapper[4764]: I0309 13:40:00.465197 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-wkxnp" event={"ID":"75f29150-3689-48a6-9248-b6774f85fcd2","Type":"ContainerDied","Data":"98058d7a077d55dcc4dc3081744bc24f8ca7e82793695f8748e2850e19fbd5a3"}
Mar 09 13:40:00 crc kubenswrapper[4764]: I0309 13:40:00.490781 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551060-n7f54"
Mar 09 13:40:00 crc kubenswrapper[4764]: I0309 13:40:00.499965 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-66ln9" event={"ID":"811ef770-3be6-4f3b-9fc3-dee4df710c4f","Type":"ContainerStarted","Data":"cb8ff00b99c398bb890b473678e6ce951f3d71cc55317946f469bc84cba9ae54"}
Mar 09 13:40:00 crc kubenswrapper[4764]: I0309 13:40:00.500052 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-66ln9" event={"ID":"811ef770-3be6-4f3b-9fc3-dee4df710c4f","Type":"ContainerStarted","Data":"616b45c0cfdfa8c2edf2207afc781be5cb32d60b419fa2c0f9c02788b1b97cc1"}
Mar 09 13:40:00 crc kubenswrapper[4764]: I0309 13:40:00.511243 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-594d-account-create-update-dxsw5" podStartSLOduration=2.511205151 podStartE2EDuration="2.511205151s" podCreationTimestamp="2026-03-09 13:39:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:40:00.482795351 +0000 UTC m=+1155.732967259" watchObservedRunningTime="2026-03-09 13:40:00.511205151 +0000 UTC m=+1155.761377049"
Mar 09 13:40:00 crc kubenswrapper[4764]: I0309 13:40:00.546892 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-0f8b-account-create-update-mxbcn" podStartSLOduration=1.546859171 podStartE2EDuration="1.546859171s" podCreationTimestamp="2026-03-09 13:39:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:40:00.537363824 +0000 UTC m=+1155.787535732" watchObservedRunningTime="2026-03-09 13:40:00.546859171 +0000 UTC m=+1155.797031079"
Mar 09 13:40:00 crc kubenswrapper[4764]: I0309 13:40:00.634029 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-create-kn2lh" podStartSLOduration=2.6339984039999997 podStartE2EDuration="2.633998404s" podCreationTimestamp="2026-03-09 13:39:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:40:00.602264176 +0000 UTC m=+1155.852436084" watchObservedRunningTime="2026-03-09 13:40:00.633998404 +0000 UTC m=+1155.884170312"
Mar 09 13:40:00 crc kubenswrapper[4764]: I0309 13:40:00.648858 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-create-66ln9" podStartSLOduration=2.648834021 podStartE2EDuration="2.648834021s" podCreationTimestamp="2026-03-09 13:39:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:40:00.640565375 +0000 UTC m=+1155.890737283" watchObservedRunningTime="2026-03-09 13:40:00.648834021 +0000 UTC m=+1155.899005929"
Mar 09 13:40:01 crc kubenswrapper[4764]: I0309 13:40:01.067787 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551060-n7f54"]
Mar 09 13:40:01 crc kubenswrapper[4764]: I0309 13:40:01.087110 4764 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Mar 09 13:40:01 crc kubenswrapper[4764]: I0309 13:40:01.512717 4764 generic.go:334] "Generic (PLEG): container finished" podID="01e4dc90-6790-447b-ac2a-d2dfcde88d17" containerID="6de1cfcabddc41034a2e1c58fb3ef2484991e0985876cbbdc822fe1df3641db1" exitCode=0
Mar 09 13:40:01 crc kubenswrapper[4764]: I0309 13:40:01.512851 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-594d-account-create-update-dxsw5" event={"ID":"01e4dc90-6790-447b-ac2a-d2dfcde88d17","Type":"ContainerDied","Data":"6de1cfcabddc41034a2e1c58fb3ef2484991e0985876cbbdc822fe1df3641db1"}
Mar 09 13:40:01 crc kubenswrapper[4764]: I0309 13:40:01.517289 4764 generic.go:334] "Generic (PLEG): container finished" podID="9d27c011-b8dd-4f14-9833-413f7a8faf8a" containerID="01ef1c726b7e7270c862ba5b9e31c73fecc7bf0ea188cc226f1bf52e6cb5af33" exitCode=0
Mar 09 13:40:01 crc kubenswrapper[4764]: I0309 13:40:01.517339 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-0f8b-account-create-update-mxbcn" event={"ID":"9d27c011-b8dd-4f14-9833-413f7a8faf8a","Type":"ContainerDied","Data":"01ef1c726b7e7270c862ba5b9e31c73fecc7bf0ea188cc226f1bf52e6cb5af33"}
Mar 09 13:40:01 crc kubenswrapper[4764]: I0309 13:40:01.519457 4764 generic.go:334] "Generic (PLEG): container finished" podID="7d681487-9af9-48e3-bb79-569b8c7bf26d" containerID="6bccf5846329079b445a150aae1cbe1d8637bb12a3644ba9afce7863cefc0fe8" exitCode=0
Mar 09 13:40:01 crc kubenswrapper[4764]: I0309 13:40:01.519607 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-kn2lh" event={"ID":"7d681487-9af9-48e3-bb79-569b8c7bf26d","Type":"ContainerDied","Data":"6bccf5846329079b445a150aae1cbe1d8637bb12a3644ba9afce7863cefc0fe8"}
Mar 09 13:40:01 crc kubenswrapper[4764]: I0309 13:40:01.522321 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551060-n7f54" event={"ID":"16623a65-1bef-4faa-a891-bae0a7d04977","Type":"ContainerStarted","Data":"7002d9bc70f8ed4dbd1fb5ae2c17202e34dbcfe2e8bc11fc9b82eda847bd6796"}
Mar 09 13:40:01 crc kubenswrapper[4764]: I0309 13:40:01.530768 4764 generic.go:334] "Generic (PLEG): container finished" podID="811ef770-3be6-4f3b-9fc3-dee4df710c4f" containerID="cb8ff00b99c398bb890b473678e6ce951f3d71cc55317946f469bc84cba9ae54" exitCode=0
Mar 09 13:40:01 crc kubenswrapper[4764]: I0309 13:40:01.531228 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-66ln9" event={"ID":"811ef770-3be6-4f3b-9fc3-dee4df710c4f","Type":"ContainerDied","Data":"cb8ff00b99c398bb890b473678e6ce951f3d71cc55317946f469bc84cba9ae54"}
Mar 09 13:40:02 crc kubenswrapper[4764]: I0309 13:40:02.032203 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-d7e9-account-create-update-n7gsb"
Mar 09 13:40:02 crc kubenswrapper[4764]: I0309 13:40:02.039414 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-wkxnp"
Mar 09 13:40:02 crc kubenswrapper[4764]: I0309 13:40:02.170521 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2hszn\" (UniqueName: \"kubernetes.io/projected/693ba99b-99d0-4b09-9f49-9deefe05abac-kube-api-access-2hszn\") pod \"693ba99b-99d0-4b09-9f49-9deefe05abac\" (UID: \"693ba99b-99d0-4b09-9f49-9deefe05abac\") "
Mar 09 13:40:02 crc kubenswrapper[4764]: I0309 13:40:02.170628 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/75f29150-3689-48a6-9248-b6774f85fcd2-operator-scripts\") pod \"75f29150-3689-48a6-9248-b6774f85fcd2\" (UID: \"75f29150-3689-48a6-9248-b6774f85fcd2\") "
Mar 09 13:40:02 crc kubenswrapper[4764]: I0309 13:40:02.170705 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vdsck\" (UniqueName: \"kubernetes.io/projected/75f29150-3689-48a6-9248-b6774f85fcd2-kube-api-access-vdsck\") pod \"75f29150-3689-48a6-9248-b6774f85fcd2\" (UID: \"75f29150-3689-48a6-9248-b6774f85fcd2\") "
Mar 09 13:40:02 crc kubenswrapper[4764]: I0309 13:40:02.170912 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/693ba99b-99d0-4b09-9f49-9deefe05abac-operator-scripts\") pod \"693ba99b-99d0-4b09-9f49-9deefe05abac\" (UID: \"693ba99b-99d0-4b09-9f49-9deefe05abac\") "
Mar 09 13:40:02 crc kubenswrapper[4764]: I0309 13:40:02.171610 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/75f29150-3689-48a6-9248-b6774f85fcd2-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "75f29150-3689-48a6-9248-b6774f85fcd2" (UID: "75f29150-3689-48a6-9248-b6774f85fcd2"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 13:40:02 crc kubenswrapper[4764]: I0309 13:40:02.171996 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/693ba99b-99d0-4b09-9f49-9deefe05abac-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "693ba99b-99d0-4b09-9f49-9deefe05abac" (UID: "693ba99b-99d0-4b09-9f49-9deefe05abac"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 13:40:02 crc kubenswrapper[4764]: I0309 13:40:02.184007 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/75f29150-3689-48a6-9248-b6774f85fcd2-kube-api-access-vdsck" (OuterVolumeSpecName: "kube-api-access-vdsck") pod "75f29150-3689-48a6-9248-b6774f85fcd2" (UID: "75f29150-3689-48a6-9248-b6774f85fcd2"). InnerVolumeSpecName "kube-api-access-vdsck". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 13:40:02 crc kubenswrapper[4764]: I0309 13:40:02.184104 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/693ba99b-99d0-4b09-9f49-9deefe05abac-kube-api-access-2hszn" (OuterVolumeSpecName: "kube-api-access-2hszn") pod "693ba99b-99d0-4b09-9f49-9deefe05abac" (UID: "693ba99b-99d0-4b09-9f49-9deefe05abac"). InnerVolumeSpecName "kube-api-access-2hszn". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 13:40:02 crc kubenswrapper[4764]: I0309 13:40:02.273513 4764 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/693ba99b-99d0-4b09-9f49-9deefe05abac-operator-scripts\") on node \"crc\" DevicePath \"\""
Mar 09 13:40:02 crc kubenswrapper[4764]: I0309 13:40:02.273551 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2hszn\" (UniqueName: \"kubernetes.io/projected/693ba99b-99d0-4b09-9f49-9deefe05abac-kube-api-access-2hszn\") on node \"crc\" DevicePath \"\""
Mar 09 13:40:02 crc kubenswrapper[4764]: I0309 13:40:02.273567 4764 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/75f29150-3689-48a6-9248-b6774f85fcd2-operator-scripts\") on node \"crc\" DevicePath \"\""
Mar 09 13:40:02 crc kubenswrapper[4764]: I0309 13:40:02.273577 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vdsck\" (UniqueName: \"kubernetes.io/projected/75f29150-3689-48a6-9248-b6774f85fcd2-kube-api-access-vdsck\") on node \"crc\" DevicePath \"\""
Mar 09 13:40:02 crc kubenswrapper[4764]: I0309 13:40:02.548825 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-wkxnp" event={"ID":"75f29150-3689-48a6-9248-b6774f85fcd2","Type":"ContainerDied","Data":"f6bcb6e597f958bade23e793909cd2f5b0c44f96f937977251a4fc97d7a51cda"}
Mar 09 13:40:02 crc kubenswrapper[4764]: I0309 13:40:02.548883 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f6bcb6e597f958bade23e793909cd2f5b0c44f96f937977251a4fc97d7a51cda"
Mar 09 13:40:02 crc kubenswrapper[4764]: I0309 13:40:02.548840 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-wkxnp"
Mar 09 13:40:02 crc kubenswrapper[4764]: I0309 13:40:02.552336 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-d7e9-account-create-update-n7gsb"
Mar 09 13:40:02 crc kubenswrapper[4764]: I0309 13:40:02.555920 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-d7e9-account-create-update-n7gsb" event={"ID":"693ba99b-99d0-4b09-9f49-9deefe05abac","Type":"ContainerDied","Data":"f80fe4762d2af82804e15db7d3b8d4627b64c32ef1882fdf483222746daef33d"}
Mar 09 13:40:02 crc kubenswrapper[4764]: I0309 13:40:02.555970 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f80fe4762d2af82804e15db7d3b8d4627b64c32ef1882fdf483222746daef33d"
Mar 09 13:40:02 crc kubenswrapper[4764]: I0309 13:40:02.923583 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-kn2lh"
Mar 09 13:40:02 crc kubenswrapper[4764]: I0309 13:40:02.985498 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-665kq\" (UniqueName: \"kubernetes.io/projected/7d681487-9af9-48e3-bb79-569b8c7bf26d-kube-api-access-665kq\") pod \"7d681487-9af9-48e3-bb79-569b8c7bf26d\" (UID: \"7d681487-9af9-48e3-bb79-569b8c7bf26d\") "
Mar 09 13:40:02 crc kubenswrapper[4764]: I0309 13:40:02.985583 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7d681487-9af9-48e3-bb79-569b8c7bf26d-operator-scripts\") pod \"7d681487-9af9-48e3-bb79-569b8c7bf26d\" (UID: \"7d681487-9af9-48e3-bb79-569b8c7bf26d\") "
Mar 09 13:40:02 crc kubenswrapper[4764]: I0309 13:40:02.986275 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7d681487-9af9-48e3-bb79-569b8c7bf26d-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "7d681487-9af9-48e3-bb79-569b8c7bf26d" (UID: "7d681487-9af9-48e3-bb79-569b8c7bf26d"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 13:40:02 crc kubenswrapper[4764]: I0309 13:40:02.992316 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7d681487-9af9-48e3-bb79-569b8c7bf26d-kube-api-access-665kq" (OuterVolumeSpecName: "kube-api-access-665kq") pod "7d681487-9af9-48e3-bb79-569b8c7bf26d" (UID: "7d681487-9af9-48e3-bb79-569b8c7bf26d"). InnerVolumeSpecName "kube-api-access-665kq". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 13:40:03 crc kubenswrapper[4764]: I0309 13:40:03.089163 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-665kq\" (UniqueName: \"kubernetes.io/projected/7d681487-9af9-48e3-bb79-569b8c7bf26d-kube-api-access-665kq\") on node \"crc\" DevicePath \"\""
Mar 09 13:40:03 crc kubenswrapper[4764]: I0309 13:40:03.089222 4764 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7d681487-9af9-48e3-bb79-569b8c7bf26d-operator-scripts\") on node \"crc\" DevicePath \"\""
Mar 09 13:40:03 crc kubenswrapper[4764]: I0309 13:40:03.214232 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-594d-account-create-update-dxsw5"
Mar 09 13:40:03 crc kubenswrapper[4764]: I0309 13:40:03.222625 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-0f8b-account-create-update-mxbcn"
Mar 09 13:40:03 crc kubenswrapper[4764]: I0309 13:40:03.229993 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-66ln9"
Mar 09 13:40:03 crc kubenswrapper[4764]: I0309 13:40:03.291731 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rst47\" (UniqueName: \"kubernetes.io/projected/9d27c011-b8dd-4f14-9833-413f7a8faf8a-kube-api-access-rst47\") pod \"9d27c011-b8dd-4f14-9833-413f7a8faf8a\" (UID: \"9d27c011-b8dd-4f14-9833-413f7a8faf8a\") "
Mar 09 13:40:03 crc kubenswrapper[4764]: I0309 13:40:03.292290 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9d27c011-b8dd-4f14-9833-413f7a8faf8a-operator-scripts\") pod \"9d27c011-b8dd-4f14-9833-413f7a8faf8a\" (UID: \"9d27c011-b8dd-4f14-9833-413f7a8faf8a\") "
Mar 09 13:40:03 crc kubenswrapper[4764]: I0309 13:40:03.292385 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/811ef770-3be6-4f3b-9fc3-dee4df710c4f-operator-scripts\") pod \"811ef770-3be6-4f3b-9fc3-dee4df710c4f\" (UID: \"811ef770-3be6-4f3b-9fc3-dee4df710c4f\") "
Mar 09 13:40:03 crc kubenswrapper[4764]: I0309 13:40:03.292475 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c9pfh\" (UniqueName: \"kubernetes.io/projected/811ef770-3be6-4f3b-9fc3-dee4df710c4f-kube-api-access-c9pfh\") pod \"811ef770-3be6-4f3b-9fc3-dee4df710c4f\" (UID: \"811ef770-3be6-4f3b-9fc3-dee4df710c4f\") "
Mar 09 13:40:03 crc kubenswrapper[4764]: I0309 13:40:03.292520 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lpt8k\" (UniqueName: \"kubernetes.io/projected/01e4dc90-6790-447b-ac2a-d2dfcde88d17-kube-api-access-lpt8k\") pod \"01e4dc90-6790-447b-ac2a-d2dfcde88d17\" (UID: \"01e4dc90-6790-447b-ac2a-d2dfcde88d17\") "
Mar 09 13:40:03 crc kubenswrapper[4764]: I0309 13:40:03.292672 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/01e4dc90-6790-447b-ac2a-d2dfcde88d17-operator-scripts\") pod \"01e4dc90-6790-447b-ac2a-d2dfcde88d17\" (UID: \"01e4dc90-6790-447b-ac2a-d2dfcde88d17\") "
Mar 09 13:40:03 crc kubenswrapper[4764]: I0309 13:40:03.294662 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01e4dc90-6790-447b-ac2a-d2dfcde88d17-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "01e4dc90-6790-447b-ac2a-d2dfcde88d17" (UID: "01e4dc90-6790-447b-ac2a-d2dfcde88d17"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 13:40:03 crc kubenswrapper[4764]: I0309 13:40:03.296324 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/811ef770-3be6-4f3b-9fc3-dee4df710c4f-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "811ef770-3be6-4f3b-9fc3-dee4df710c4f" (UID: "811ef770-3be6-4f3b-9fc3-dee4df710c4f"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 13:40:03 crc kubenswrapper[4764]: I0309 13:40:03.296529 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d27c011-b8dd-4f14-9833-413f7a8faf8a-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "9d27c011-b8dd-4f14-9833-413f7a8faf8a" (UID: "9d27c011-b8dd-4f14-9833-413f7a8faf8a"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 13:40:03 crc kubenswrapper[4764]: I0309 13:40:03.299387 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d27c011-b8dd-4f14-9833-413f7a8faf8a-kube-api-access-rst47" (OuterVolumeSpecName: "kube-api-access-rst47") pod "9d27c011-b8dd-4f14-9833-413f7a8faf8a" (UID: "9d27c011-b8dd-4f14-9833-413f7a8faf8a"). InnerVolumeSpecName "kube-api-access-rst47". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 13:40:03 crc kubenswrapper[4764]: I0309 13:40:03.299484 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/811ef770-3be6-4f3b-9fc3-dee4df710c4f-kube-api-access-c9pfh" (OuterVolumeSpecName: "kube-api-access-c9pfh") pod "811ef770-3be6-4f3b-9fc3-dee4df710c4f" (UID: "811ef770-3be6-4f3b-9fc3-dee4df710c4f"). InnerVolumeSpecName "kube-api-access-c9pfh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 13:40:03 crc kubenswrapper[4764]: I0309 13:40:03.300415 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01e4dc90-6790-447b-ac2a-d2dfcde88d17-kube-api-access-lpt8k" (OuterVolumeSpecName: "kube-api-access-lpt8k") pod "01e4dc90-6790-447b-ac2a-d2dfcde88d17" (UID: "01e4dc90-6790-447b-ac2a-d2dfcde88d17"). InnerVolumeSpecName "kube-api-access-lpt8k". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 13:40:03 crc kubenswrapper[4764]: I0309 13:40:03.395397 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rst47\" (UniqueName: \"kubernetes.io/projected/9d27c011-b8dd-4f14-9833-413f7a8faf8a-kube-api-access-rst47\") on node \"crc\" DevicePath \"\""
Mar 09 13:40:03 crc kubenswrapper[4764]: I0309 13:40:03.395446 4764 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9d27c011-b8dd-4f14-9833-413f7a8faf8a-operator-scripts\") on node \"crc\" DevicePath \"\""
Mar 09 13:40:03 crc kubenswrapper[4764]: I0309 13:40:03.395464 4764 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/811ef770-3be6-4f3b-9fc3-dee4df710c4f-operator-scripts\") on node \"crc\" DevicePath \"\""
Mar 09 13:40:03 crc kubenswrapper[4764]: I0309 13:40:03.395477 4764 reconciler_common.go:293] "Volume detached for volume
\"kube-api-access-c9pfh\" (UniqueName: \"kubernetes.io/projected/811ef770-3be6-4f3b-9fc3-dee4df710c4f-kube-api-access-c9pfh\") on node \"crc\" DevicePath \"\"" Mar 09 13:40:03 crc kubenswrapper[4764]: I0309 13:40:03.395490 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lpt8k\" (UniqueName: \"kubernetes.io/projected/01e4dc90-6790-447b-ac2a-d2dfcde88d17-kube-api-access-lpt8k\") on node \"crc\" DevicePath \"\"" Mar 09 13:40:03 crc kubenswrapper[4764]: I0309 13:40:03.395507 4764 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/01e4dc90-6790-447b-ac2a-d2dfcde88d17-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 13:40:03 crc kubenswrapper[4764]: I0309 13:40:03.572191 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-594d-account-create-update-dxsw5" Mar 09 13:40:03 crc kubenswrapper[4764]: I0309 13:40:03.579065 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-0f8b-account-create-update-mxbcn" Mar 09 13:40:03 crc kubenswrapper[4764]: I0309 13:40:03.583433 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-594d-account-create-update-dxsw5" event={"ID":"01e4dc90-6790-447b-ac2a-d2dfcde88d17","Type":"ContainerDied","Data":"6db0dcbeaa0e445d9563d80bdb4ee91587520750c60db46e9ceacf66f6d60c54"} Mar 09 13:40:03 crc kubenswrapper[4764]: I0309 13:40:03.583487 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6db0dcbeaa0e445d9563d80bdb4ee91587520750c60db46e9ceacf66f6d60c54" Mar 09 13:40:03 crc kubenswrapper[4764]: I0309 13:40:03.583500 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-0f8b-account-create-update-mxbcn" event={"ID":"9d27c011-b8dd-4f14-9833-413f7a8faf8a","Type":"ContainerDied","Data":"1f18354cc3ede52419192c5d865dc6321532287310adb5cffdafe9ba08f30940"} Mar 09 13:40:03 crc kubenswrapper[4764]: I0309 13:40:03.583518 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1f18354cc3ede52419192c5d865dc6321532287310adb5cffdafe9ba08f30940" Mar 09 13:40:03 crc kubenswrapper[4764]: I0309 13:40:03.586222 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-kn2lh" event={"ID":"7d681487-9af9-48e3-bb79-569b8c7bf26d","Type":"ContainerDied","Data":"cb35dabb9f5f0edd526374363673fcbd6955aa3235f600a648fe50273b0796dd"} Mar 09 13:40:03 crc kubenswrapper[4764]: I0309 13:40:03.586266 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cb35dabb9f5f0edd526374363673fcbd6955aa3235f600a648fe50273b0796dd" Mar 09 13:40:03 crc kubenswrapper[4764]: I0309 13:40:03.586461 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-kn2lh" Mar 09 13:40:03 crc kubenswrapper[4764]: I0309 13:40:03.593372 4764 generic.go:334] "Generic (PLEG): container finished" podID="16623a65-1bef-4faa-a891-bae0a7d04977" containerID="e53ea6806faef5cb682bac2b7668fda78ebb3a7b30791520082d7ab2a13f5aa0" exitCode=0 Mar 09 13:40:03 crc kubenswrapper[4764]: I0309 13:40:03.593496 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551060-n7f54" event={"ID":"16623a65-1bef-4faa-a891-bae0a7d04977","Type":"ContainerDied","Data":"e53ea6806faef5cb682bac2b7668fda78ebb3a7b30791520082d7ab2a13f5aa0"} Mar 09 13:40:03 crc kubenswrapper[4764]: I0309 13:40:03.595215 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-66ln9" event={"ID":"811ef770-3be6-4f3b-9fc3-dee4df710c4f","Type":"ContainerDied","Data":"616b45c0cfdfa8c2edf2207afc781be5cb32d60b419fa2c0f9c02788b1b97cc1"} Mar 09 13:40:03 crc kubenswrapper[4764]: I0309 13:40:03.595256 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="616b45c0cfdfa8c2edf2207afc781be5cb32d60b419fa2c0f9c02788b1b97cc1" Mar 09 13:40:03 crc kubenswrapper[4764]: I0309 13:40:03.595322 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-66ln9" Mar 09 13:40:04 crc kubenswrapper[4764]: I0309 13:40:04.524041 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-rvfs9"] Mar 09 13:40:04 crc kubenswrapper[4764]: E0309 13:40:04.524700 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9d27c011-b8dd-4f14-9833-413f7a8faf8a" containerName="mariadb-account-create-update" Mar 09 13:40:04 crc kubenswrapper[4764]: I0309 13:40:04.524716 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d27c011-b8dd-4f14-9833-413f7a8faf8a" containerName="mariadb-account-create-update" Mar 09 13:40:04 crc kubenswrapper[4764]: E0309 13:40:04.524739 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="75f29150-3689-48a6-9248-b6774f85fcd2" containerName="mariadb-database-create" Mar 09 13:40:04 crc kubenswrapper[4764]: I0309 13:40:04.524745 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="75f29150-3689-48a6-9248-b6774f85fcd2" containerName="mariadb-database-create" Mar 09 13:40:04 crc kubenswrapper[4764]: E0309 13:40:04.524760 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7d681487-9af9-48e3-bb79-569b8c7bf26d" containerName="mariadb-database-create" Mar 09 13:40:04 crc kubenswrapper[4764]: I0309 13:40:04.524766 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d681487-9af9-48e3-bb79-569b8c7bf26d" containerName="mariadb-database-create" Mar 09 13:40:04 crc kubenswrapper[4764]: E0309 13:40:04.524779 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01e4dc90-6790-447b-ac2a-d2dfcde88d17" containerName="mariadb-account-create-update" Mar 09 13:40:04 crc kubenswrapper[4764]: I0309 13:40:04.524786 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="01e4dc90-6790-447b-ac2a-d2dfcde88d17" containerName="mariadb-account-create-update" Mar 09 13:40:04 crc kubenswrapper[4764]: E0309 13:40:04.524797 4764 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="811ef770-3be6-4f3b-9fc3-dee4df710c4f" containerName="mariadb-database-create" Mar 09 13:40:04 crc kubenswrapper[4764]: I0309 13:40:04.524803 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="811ef770-3be6-4f3b-9fc3-dee4df710c4f" containerName="mariadb-database-create" Mar 09 13:40:04 crc kubenswrapper[4764]: E0309 13:40:04.524814 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="693ba99b-99d0-4b09-9f49-9deefe05abac" containerName="mariadb-account-create-update" Mar 09 13:40:04 crc kubenswrapper[4764]: I0309 13:40:04.524820 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="693ba99b-99d0-4b09-9f49-9deefe05abac" containerName="mariadb-account-create-update" Mar 09 13:40:04 crc kubenswrapper[4764]: I0309 13:40:04.524979 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="9d27c011-b8dd-4f14-9833-413f7a8faf8a" containerName="mariadb-account-create-update" Mar 09 13:40:04 crc kubenswrapper[4764]: I0309 13:40:04.524990 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="811ef770-3be6-4f3b-9fc3-dee4df710c4f" containerName="mariadb-database-create" Mar 09 13:40:04 crc kubenswrapper[4764]: I0309 13:40:04.524999 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="01e4dc90-6790-447b-ac2a-d2dfcde88d17" containerName="mariadb-account-create-update" Mar 09 13:40:04 crc kubenswrapper[4764]: I0309 13:40:04.525007 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="7d681487-9af9-48e3-bb79-569b8c7bf26d" containerName="mariadb-database-create" Mar 09 13:40:04 crc kubenswrapper[4764]: I0309 13:40:04.525018 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="75f29150-3689-48a6-9248-b6774f85fcd2" containerName="mariadb-database-create" Mar 09 13:40:04 crc kubenswrapper[4764]: I0309 13:40:04.525029 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="693ba99b-99d0-4b09-9f49-9deefe05abac" 
containerName="mariadb-account-create-update" Mar 09 13:40:04 crc kubenswrapper[4764]: I0309 13:40:04.525596 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-rvfs9" Mar 09 13:40:04 crc kubenswrapper[4764]: I0309 13:40:04.534610 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-rvfs9"] Mar 09 13:40:04 crc kubenswrapper[4764]: I0309 13:40:04.534972 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-mariadb-root-db-secret" Mar 09 13:40:04 crc kubenswrapper[4764]: I0309 13:40:04.618918 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b28dc3f9-47a1-436c-865c-4d98e6ba960c-operator-scripts\") pod \"root-account-create-update-rvfs9\" (UID: \"b28dc3f9-47a1-436c-865c-4d98e6ba960c\") " pod="openstack/root-account-create-update-rvfs9" Mar 09 13:40:04 crc kubenswrapper[4764]: I0309 13:40:04.619442 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6ftcz\" (UniqueName: \"kubernetes.io/projected/b28dc3f9-47a1-436c-865c-4d98e6ba960c-kube-api-access-6ftcz\") pod \"root-account-create-update-rvfs9\" (UID: \"b28dc3f9-47a1-436c-865c-4d98e6ba960c\") " pod="openstack/root-account-create-update-rvfs9" Mar 09 13:40:04 crc kubenswrapper[4764]: I0309 13:40:04.723072 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6ftcz\" (UniqueName: \"kubernetes.io/projected/b28dc3f9-47a1-436c-865c-4d98e6ba960c-kube-api-access-6ftcz\") pod \"root-account-create-update-rvfs9\" (UID: \"b28dc3f9-47a1-436c-865c-4d98e6ba960c\") " pod="openstack/root-account-create-update-rvfs9" Mar 09 13:40:04 crc kubenswrapper[4764]: I0309 13:40:04.723717 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b28dc3f9-47a1-436c-865c-4d98e6ba960c-operator-scripts\") pod \"root-account-create-update-rvfs9\" (UID: \"b28dc3f9-47a1-436c-865c-4d98e6ba960c\") " pod="openstack/root-account-create-update-rvfs9" Mar 09 13:40:04 crc kubenswrapper[4764]: I0309 13:40:04.724899 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b28dc3f9-47a1-436c-865c-4d98e6ba960c-operator-scripts\") pod \"root-account-create-update-rvfs9\" (UID: \"b28dc3f9-47a1-436c-865c-4d98e6ba960c\") " pod="openstack/root-account-create-update-rvfs9" Mar 09 13:40:04 crc kubenswrapper[4764]: I0309 13:40:04.746412 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6ftcz\" (UniqueName: \"kubernetes.io/projected/b28dc3f9-47a1-436c-865c-4d98e6ba960c-kube-api-access-6ftcz\") pod \"root-account-create-update-rvfs9\" (UID: \"b28dc3f9-47a1-436c-865c-4d98e6ba960c\") " pod="openstack/root-account-create-update-rvfs9" Mar 09 13:40:04 crc kubenswrapper[4764]: I0309 13:40:04.854133 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-rvfs9" Mar 09 13:40:05 crc kubenswrapper[4764]: I0309 13:40:05.000417 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551060-n7f54" Mar 09 13:40:05 crc kubenswrapper[4764]: I0309 13:40:05.134203 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gb5hw\" (UniqueName: \"kubernetes.io/projected/16623a65-1bef-4faa-a891-bae0a7d04977-kube-api-access-gb5hw\") pod \"16623a65-1bef-4faa-a891-bae0a7d04977\" (UID: \"16623a65-1bef-4faa-a891-bae0a7d04977\") " Mar 09 13:40:05 crc kubenswrapper[4764]: I0309 13:40:05.152943 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/16623a65-1bef-4faa-a891-bae0a7d04977-kube-api-access-gb5hw" (OuterVolumeSpecName: "kube-api-access-gb5hw") pod "16623a65-1bef-4faa-a891-bae0a7d04977" (UID: "16623a65-1bef-4faa-a891-bae0a7d04977"). InnerVolumeSpecName "kube-api-access-gb5hw". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:40:05 crc kubenswrapper[4764]: I0309 13:40:05.237116 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gb5hw\" (UniqueName: \"kubernetes.io/projected/16623a65-1bef-4faa-a891-bae0a7d04977-kube-api-access-gb5hw\") on node \"crc\" DevicePath \"\"" Mar 09 13:40:05 crc kubenswrapper[4764]: I0309 13:40:05.336717 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-rvfs9"] Mar 09 13:40:05 crc kubenswrapper[4764]: W0309 13:40:05.336807 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb28dc3f9_47a1_436c_865c_4d98e6ba960c.slice/crio-c8e1b592a881fc9ae62941eafc71a37253c86e963d35cd07e1fd548914aa0be6 WatchSource:0}: Error finding container c8e1b592a881fc9ae62941eafc71a37253c86e963d35cd07e1fd548914aa0be6: Status 404 returned error can't find the container with id c8e1b592a881fc9ae62941eafc71a37253c86e963d35cd07e1fd548914aa0be6 Mar 09 13:40:05 crc kubenswrapper[4764]: I0309 13:40:05.612983 4764 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/root-account-create-update-rvfs9" event={"ID":"b28dc3f9-47a1-436c-865c-4d98e6ba960c","Type":"ContainerStarted","Data":"c8e1b592a881fc9ae62941eafc71a37253c86e963d35cd07e1fd548914aa0be6"} Mar 09 13:40:05 crc kubenswrapper[4764]: I0309 13:40:05.614701 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551060-n7f54" event={"ID":"16623a65-1bef-4faa-a891-bae0a7d04977","Type":"ContainerDied","Data":"7002d9bc70f8ed4dbd1fb5ae2c17202e34dbcfe2e8bc11fc9b82eda847bd6796"} Mar 09 13:40:05 crc kubenswrapper[4764]: I0309 13:40:05.614732 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7002d9bc70f8ed4dbd1fb5ae2c17202e34dbcfe2e8bc11fc9b82eda847bd6796" Mar 09 13:40:05 crc kubenswrapper[4764]: I0309 13:40:05.614815 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551060-n7f54" Mar 09 13:40:06 crc kubenswrapper[4764]: I0309 13:40:06.101094 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29551054-n54vp"] Mar 09 13:40:06 crc kubenswrapper[4764]: I0309 13:40:06.110234 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29551054-n54vp"] Mar 09 13:40:06 crc kubenswrapper[4764]: I0309 13:40:06.627233 4764 generic.go:334] "Generic (PLEG): container finished" podID="b28dc3f9-47a1-436c-865c-4d98e6ba960c" containerID="e74ba0435a2089e000c8df414a7a0d71d67c7b4a00cc240811cbefae67ccd184" exitCode=0 Mar 09 13:40:06 crc kubenswrapper[4764]: I0309 13:40:06.627284 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-rvfs9" event={"ID":"b28dc3f9-47a1-436c-865c-4d98e6ba960c","Type":"ContainerDied","Data":"e74ba0435a2089e000c8df414a7a0d71d67c7b4a00cc240811cbefae67ccd184"} Mar 09 13:40:07 crc kubenswrapper[4764]: I0309 13:40:07.571778 4764 kubelet_volumes.go:163] "Cleaned up orphaned 
pod volumes dir" podUID="034371f5-4d6d-4a44-9678-9093ffaf3f9d" path="/var/lib/kubelet/pods/034371f5-4d6d-4a44-9678-9093ffaf3f9d/volumes" Mar 09 13:40:07 crc kubenswrapper[4764]: I0309 13:40:07.969739 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-rvfs9" Mar 09 13:40:08 crc kubenswrapper[4764]: I0309 13:40:08.101842 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ftcz\" (UniqueName: \"kubernetes.io/projected/b28dc3f9-47a1-436c-865c-4d98e6ba960c-kube-api-access-6ftcz\") pod \"b28dc3f9-47a1-436c-865c-4d98e6ba960c\" (UID: \"b28dc3f9-47a1-436c-865c-4d98e6ba960c\") " Mar 09 13:40:08 crc kubenswrapper[4764]: I0309 13:40:08.101942 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b28dc3f9-47a1-436c-865c-4d98e6ba960c-operator-scripts\") pod \"b28dc3f9-47a1-436c-865c-4d98e6ba960c\" (UID: \"b28dc3f9-47a1-436c-865c-4d98e6ba960c\") " Mar 09 13:40:08 crc kubenswrapper[4764]: I0309 13:40:08.107826 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b28dc3f9-47a1-436c-865c-4d98e6ba960c-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "b28dc3f9-47a1-436c-865c-4d98e6ba960c" (UID: "b28dc3f9-47a1-436c-865c-4d98e6ba960c"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:40:08 crc kubenswrapper[4764]: I0309 13:40:08.110405 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b28dc3f9-47a1-436c-865c-4d98e6ba960c-kube-api-access-6ftcz" (OuterVolumeSpecName: "kube-api-access-6ftcz") pod "b28dc3f9-47a1-436c-865c-4d98e6ba960c" (UID: "b28dc3f9-47a1-436c-865c-4d98e6ba960c"). InnerVolumeSpecName "kube-api-access-6ftcz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:40:08 crc kubenswrapper[4764]: I0309 13:40:08.204563 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ftcz\" (UniqueName: \"kubernetes.io/projected/b28dc3f9-47a1-436c-865c-4d98e6ba960c-kube-api-access-6ftcz\") on node \"crc\" DevicePath \"\"" Mar 09 13:40:08 crc kubenswrapper[4764]: I0309 13:40:08.204602 4764 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b28dc3f9-47a1-436c-865c-4d98e6ba960c-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 13:40:08 crc kubenswrapper[4764]: I0309 13:40:08.251927 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-vzxr2"] Mar 09 13:40:08 crc kubenswrapper[4764]: E0309 13:40:08.252347 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b28dc3f9-47a1-436c-865c-4d98e6ba960c" containerName="mariadb-account-create-update" Mar 09 13:40:08 crc kubenswrapper[4764]: I0309 13:40:08.252374 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="b28dc3f9-47a1-436c-865c-4d98e6ba960c" containerName="mariadb-account-create-update" Mar 09 13:40:08 crc kubenswrapper[4764]: E0309 13:40:08.252423 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="16623a65-1bef-4faa-a891-bae0a7d04977" containerName="oc" Mar 09 13:40:08 crc kubenswrapper[4764]: I0309 13:40:08.252433 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="16623a65-1bef-4faa-a891-bae0a7d04977" containerName="oc" Mar 09 13:40:08 crc kubenswrapper[4764]: I0309 13:40:08.252630 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="16623a65-1bef-4faa-a891-bae0a7d04977" containerName="oc" Mar 09 13:40:08 crc kubenswrapper[4764]: I0309 13:40:08.252680 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="b28dc3f9-47a1-436c-865c-4d98e6ba960c" containerName="mariadb-account-create-update" Mar 09 13:40:08 crc 
kubenswrapper[4764]: I0309 13:40:08.253339 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-vzxr2" Mar 09 13:40:08 crc kubenswrapper[4764]: I0309 13:40:08.255556 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-8p7c7" Mar 09 13:40:08 crc kubenswrapper[4764]: I0309 13:40:08.256637 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data" Mar 09 13:40:08 crc kubenswrapper[4764]: I0309 13:40:08.268334 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-vzxr2"] Mar 09 13:40:08 crc kubenswrapper[4764]: I0309 13:40:08.409353 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/29e20119-f7d3-4b10-82c3-afbfa462c831-db-sync-config-data\") pod \"glance-db-sync-vzxr2\" (UID: \"29e20119-f7d3-4b10-82c3-afbfa462c831\") " pod="openstack/glance-db-sync-vzxr2" Mar 09 13:40:08 crc kubenswrapper[4764]: I0309 13:40:08.409417 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/29e20119-f7d3-4b10-82c3-afbfa462c831-config-data\") pod \"glance-db-sync-vzxr2\" (UID: \"29e20119-f7d3-4b10-82c3-afbfa462c831\") " pod="openstack/glance-db-sync-vzxr2" Mar 09 13:40:08 crc kubenswrapper[4764]: I0309 13:40:08.409612 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29e20119-f7d3-4b10-82c3-afbfa462c831-combined-ca-bundle\") pod \"glance-db-sync-vzxr2\" (UID: \"29e20119-f7d3-4b10-82c3-afbfa462c831\") " pod="openstack/glance-db-sync-vzxr2" Mar 09 13:40:08 crc kubenswrapper[4764]: I0309 13:40:08.409682 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-2pflc\" (UniqueName: \"kubernetes.io/projected/29e20119-f7d3-4b10-82c3-afbfa462c831-kube-api-access-2pflc\") pod \"glance-db-sync-vzxr2\" (UID: \"29e20119-f7d3-4b10-82c3-afbfa462c831\") " pod="openstack/glance-db-sync-vzxr2" Mar 09 13:40:08 crc kubenswrapper[4764]: I0309 13:40:08.511393 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29e20119-f7d3-4b10-82c3-afbfa462c831-combined-ca-bundle\") pod \"glance-db-sync-vzxr2\" (UID: \"29e20119-f7d3-4b10-82c3-afbfa462c831\") " pod="openstack/glance-db-sync-vzxr2" Mar 09 13:40:08 crc kubenswrapper[4764]: I0309 13:40:08.511445 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2pflc\" (UniqueName: \"kubernetes.io/projected/29e20119-f7d3-4b10-82c3-afbfa462c831-kube-api-access-2pflc\") pod \"glance-db-sync-vzxr2\" (UID: \"29e20119-f7d3-4b10-82c3-afbfa462c831\") " pod="openstack/glance-db-sync-vzxr2" Mar 09 13:40:08 crc kubenswrapper[4764]: I0309 13:40:08.511515 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/29e20119-f7d3-4b10-82c3-afbfa462c831-db-sync-config-data\") pod \"glance-db-sync-vzxr2\" (UID: \"29e20119-f7d3-4b10-82c3-afbfa462c831\") " pod="openstack/glance-db-sync-vzxr2" Mar 09 13:40:08 crc kubenswrapper[4764]: I0309 13:40:08.511543 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/29e20119-f7d3-4b10-82c3-afbfa462c831-config-data\") pod \"glance-db-sync-vzxr2\" (UID: \"29e20119-f7d3-4b10-82c3-afbfa462c831\") " pod="openstack/glance-db-sync-vzxr2" Mar 09 13:40:08 crc kubenswrapper[4764]: I0309 13:40:08.516157 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/29e20119-f7d3-4b10-82c3-afbfa462c831-config-data\") pod 
\"glance-db-sync-vzxr2\" (UID: \"29e20119-f7d3-4b10-82c3-afbfa462c831\") " pod="openstack/glance-db-sync-vzxr2" Mar 09 13:40:08 crc kubenswrapper[4764]: I0309 13:40:08.517052 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/29e20119-f7d3-4b10-82c3-afbfa462c831-db-sync-config-data\") pod \"glance-db-sync-vzxr2\" (UID: \"29e20119-f7d3-4b10-82c3-afbfa462c831\") " pod="openstack/glance-db-sync-vzxr2" Mar 09 13:40:08 crc kubenswrapper[4764]: I0309 13:40:08.518559 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29e20119-f7d3-4b10-82c3-afbfa462c831-combined-ca-bundle\") pod \"glance-db-sync-vzxr2\" (UID: \"29e20119-f7d3-4b10-82c3-afbfa462c831\") " pod="openstack/glance-db-sync-vzxr2" Mar 09 13:40:08 crc kubenswrapper[4764]: I0309 13:40:08.529325 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2pflc\" (UniqueName: \"kubernetes.io/projected/29e20119-f7d3-4b10-82c3-afbfa462c831-kube-api-access-2pflc\") pod \"glance-db-sync-vzxr2\" (UID: \"29e20119-f7d3-4b10-82c3-afbfa462c831\") " pod="openstack/glance-db-sync-vzxr2" Mar 09 13:40:08 crc kubenswrapper[4764]: I0309 13:40:08.573020 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-vzxr2" Mar 09 13:40:08 crc kubenswrapper[4764]: I0309 13:40:08.650831 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-rvfs9" event={"ID":"b28dc3f9-47a1-436c-865c-4d98e6ba960c","Type":"ContainerDied","Data":"c8e1b592a881fc9ae62941eafc71a37253c86e963d35cd07e1fd548914aa0be6"} Mar 09 13:40:08 crc kubenswrapper[4764]: I0309 13:40:08.650866 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c8e1b592a881fc9ae62941eafc71a37253c86e963d35cd07e1fd548914aa0be6" Mar 09 13:40:08 crc kubenswrapper[4764]: I0309 13:40:08.650926 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-rvfs9" Mar 09 13:40:09 crc kubenswrapper[4764]: I0309 13:40:09.151392 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-vzxr2"] Mar 09 13:40:09 crc kubenswrapper[4764]: I0309 13:40:09.658612 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-vzxr2" event={"ID":"29e20119-f7d3-4b10-82c3-afbfa462c831","Type":"ContainerStarted","Data":"bf52819fae166b95cbd52539da218f51f13c6084d42cea34252b0b03900eaee5"} Mar 09 13:40:10 crc kubenswrapper[4764]: I0309 13:40:10.942111 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-rvfs9"] Mar 09 13:40:10 crc kubenswrapper[4764]: I0309 13:40:10.948802 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-rvfs9"] Mar 09 13:40:11 crc kubenswrapper[4764]: I0309 13:40:11.571400 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b28dc3f9-47a1-436c-865c-4d98e6ba960c" path="/var/lib/kubelet/pods/b28dc3f9-47a1-436c-865c-4d98e6ba960c/volumes" Mar 09 13:40:12 crc kubenswrapper[4764]: I0309 13:40:12.685829 4764 generic.go:334] "Generic (PLEG): container finished" 
podID="11c8bf9f-a031-4e56-b1d7-49b407eabaf7" containerID="fd0a79b758702c401b2bbc87884a5f0ad37053c07ad4da0c2c69610d9e0509c2" exitCode=0 Mar 09 13:40:12 crc kubenswrapper[4764]: I0309 13:40:12.685910 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"11c8bf9f-a031-4e56-b1d7-49b407eabaf7","Type":"ContainerDied","Data":"fd0a79b758702c401b2bbc87884a5f0ad37053c07ad4da0c2c69610d9e0509c2"} Mar 09 13:40:12 crc kubenswrapper[4764]: I0309 13:40:12.692073 4764 generic.go:334] "Generic (PLEG): container finished" podID="507bcef1-e9ef-4eb1-85ce-358209b944bc" containerID="f2c108c98ecfd46a140cf2428c8bd7a9e4bb88eab84b2da0f3783d087eb2e601" exitCode=0 Mar 09 13:40:12 crc kubenswrapper[4764]: I0309 13:40:12.692158 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"507bcef1-e9ef-4eb1-85ce-358209b944bc","Type":"ContainerDied","Data":"f2c108c98ecfd46a140cf2428c8bd7a9e4bb88eab84b2da0f3783d087eb2e601"} Mar 09 13:40:13 crc kubenswrapper[4764]: I0309 13:40:13.153995 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0" Mar 09 13:40:13 crc kubenswrapper[4764]: I0309 13:40:13.658595 4764 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-qm7vs" podUID="9bbe03cf-76d5-440a-903f-50c382aa3a4e" containerName="ovn-controller" probeResult="failure" output=< Mar 09 13:40:13 crc kubenswrapper[4764]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Mar 09 13:40:13 crc kubenswrapper[4764]: > Mar 09 13:40:13 crc kubenswrapper[4764]: I0309 13:40:13.666686 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-2zkzm" Mar 09 13:40:13 crc kubenswrapper[4764]: I0309 13:40:13.680926 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-2zkzm" Mar 09 13:40:13 crc 
kubenswrapper[4764]: I0309 13:40:13.704688 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"11c8bf9f-a031-4e56-b1d7-49b407eabaf7","Type":"ContainerStarted","Data":"c051454f6de8558a84ceb371da4edc9fd851c2e829bf18eb804c0c4e657ef0ee"} Mar 09 13:40:13 crc kubenswrapper[4764]: I0309 13:40:13.705959 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Mar 09 13:40:13 crc kubenswrapper[4764]: I0309 13:40:13.715547 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"507bcef1-e9ef-4eb1-85ce-358209b944bc","Type":"ContainerStarted","Data":"bed2e6d72dc23d14fbbd197ac01731271bf14e49fb44cf788ecc2cd8bf1b2828"} Mar 09 13:40:13 crc kubenswrapper[4764]: I0309 13:40:13.716074 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Mar 09 13:40:13 crc kubenswrapper[4764]: I0309 13:40:13.734673 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=53.050281617 podStartE2EDuration="1m0.734628852s" podCreationTimestamp="2026-03-09 13:39:13 +0000 UTC" firstStartedPulling="2026-03-09 13:39:30.053943006 +0000 UTC m=+1125.304114924" lastFinishedPulling="2026-03-09 13:39:37.738290251 +0000 UTC m=+1132.988462159" observedRunningTime="2026-03-09 13:40:13.733423032 +0000 UTC m=+1168.983594970" watchObservedRunningTime="2026-03-09 13:40:13.734628852 +0000 UTC m=+1168.984800770" Mar 09 13:40:13 crc kubenswrapper[4764]: I0309 13:40:13.770334 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=52.988453527 podStartE2EDuration="1m0.770306306s" podCreationTimestamp="2026-03-09 13:39:13 +0000 UTC" firstStartedPulling="2026-03-09 13:39:30.259112606 +0000 UTC m=+1125.509284514" lastFinishedPulling="2026-03-09 13:39:38.040965385 +0000 UTC 
m=+1133.291137293" observedRunningTime="2026-03-09 13:40:13.764637344 +0000 UTC m=+1169.014809272" watchObservedRunningTime="2026-03-09 13:40:13.770306306 +0000 UTC m=+1169.020478214" Mar 09 13:40:14 crc kubenswrapper[4764]: I0309 13:40:14.030107 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-qm7vs-config-pddrl"] Mar 09 13:40:14 crc kubenswrapper[4764]: I0309 13:40:14.031450 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-qm7vs-config-pddrl" Mar 09 13:40:14 crc kubenswrapper[4764]: I0309 13:40:14.034253 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Mar 09 13:40:14 crc kubenswrapper[4764]: I0309 13:40:14.056608 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-qm7vs-config-pddrl"] Mar 09 13:40:14 crc kubenswrapper[4764]: I0309 13:40:14.142991 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/d3a4b4d4-cc62-4a80-91f9-fa0c2e98292c-var-run\") pod \"ovn-controller-qm7vs-config-pddrl\" (UID: \"d3a4b4d4-cc62-4a80-91f9-fa0c2e98292c\") " pod="openstack/ovn-controller-qm7vs-config-pddrl" Mar 09 13:40:14 crc kubenswrapper[4764]: I0309 13:40:14.143087 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/d3a4b4d4-cc62-4a80-91f9-fa0c2e98292c-additional-scripts\") pod \"ovn-controller-qm7vs-config-pddrl\" (UID: \"d3a4b4d4-cc62-4a80-91f9-fa0c2e98292c\") " pod="openstack/ovn-controller-qm7vs-config-pddrl" Mar 09 13:40:14 crc kubenswrapper[4764]: I0309 13:40:14.143125 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hgwlt\" (UniqueName: 
\"kubernetes.io/projected/d3a4b4d4-cc62-4a80-91f9-fa0c2e98292c-kube-api-access-hgwlt\") pod \"ovn-controller-qm7vs-config-pddrl\" (UID: \"d3a4b4d4-cc62-4a80-91f9-fa0c2e98292c\") " pod="openstack/ovn-controller-qm7vs-config-pddrl" Mar 09 13:40:14 crc kubenswrapper[4764]: I0309 13:40:14.143181 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d3a4b4d4-cc62-4a80-91f9-fa0c2e98292c-scripts\") pod \"ovn-controller-qm7vs-config-pddrl\" (UID: \"d3a4b4d4-cc62-4a80-91f9-fa0c2e98292c\") " pod="openstack/ovn-controller-qm7vs-config-pddrl" Mar 09 13:40:14 crc kubenswrapper[4764]: I0309 13:40:14.143260 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/d3a4b4d4-cc62-4a80-91f9-fa0c2e98292c-var-log-ovn\") pod \"ovn-controller-qm7vs-config-pddrl\" (UID: \"d3a4b4d4-cc62-4a80-91f9-fa0c2e98292c\") " pod="openstack/ovn-controller-qm7vs-config-pddrl" Mar 09 13:40:14 crc kubenswrapper[4764]: I0309 13:40:14.143282 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/d3a4b4d4-cc62-4a80-91f9-fa0c2e98292c-var-run-ovn\") pod \"ovn-controller-qm7vs-config-pddrl\" (UID: \"d3a4b4d4-cc62-4a80-91f9-fa0c2e98292c\") " pod="openstack/ovn-controller-qm7vs-config-pddrl" Mar 09 13:40:14 crc kubenswrapper[4764]: I0309 13:40:14.244802 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d3a4b4d4-cc62-4a80-91f9-fa0c2e98292c-scripts\") pod \"ovn-controller-qm7vs-config-pddrl\" (UID: \"d3a4b4d4-cc62-4a80-91f9-fa0c2e98292c\") " pod="openstack/ovn-controller-qm7vs-config-pddrl" Mar 09 13:40:14 crc kubenswrapper[4764]: I0309 13:40:14.244918 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/d3a4b4d4-cc62-4a80-91f9-fa0c2e98292c-var-log-ovn\") pod \"ovn-controller-qm7vs-config-pddrl\" (UID: \"d3a4b4d4-cc62-4a80-91f9-fa0c2e98292c\") " pod="openstack/ovn-controller-qm7vs-config-pddrl" Mar 09 13:40:14 crc kubenswrapper[4764]: I0309 13:40:14.244944 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/d3a4b4d4-cc62-4a80-91f9-fa0c2e98292c-var-run-ovn\") pod \"ovn-controller-qm7vs-config-pddrl\" (UID: \"d3a4b4d4-cc62-4a80-91f9-fa0c2e98292c\") " pod="openstack/ovn-controller-qm7vs-config-pddrl" Mar 09 13:40:14 crc kubenswrapper[4764]: I0309 13:40:14.244997 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/d3a4b4d4-cc62-4a80-91f9-fa0c2e98292c-var-run\") pod \"ovn-controller-qm7vs-config-pddrl\" (UID: \"d3a4b4d4-cc62-4a80-91f9-fa0c2e98292c\") " pod="openstack/ovn-controller-qm7vs-config-pddrl" Mar 09 13:40:14 crc kubenswrapper[4764]: I0309 13:40:14.245045 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/d3a4b4d4-cc62-4a80-91f9-fa0c2e98292c-additional-scripts\") pod \"ovn-controller-qm7vs-config-pddrl\" (UID: \"d3a4b4d4-cc62-4a80-91f9-fa0c2e98292c\") " pod="openstack/ovn-controller-qm7vs-config-pddrl" Mar 09 13:40:14 crc kubenswrapper[4764]: I0309 13:40:14.245077 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hgwlt\" (UniqueName: \"kubernetes.io/projected/d3a4b4d4-cc62-4a80-91f9-fa0c2e98292c-kube-api-access-hgwlt\") pod \"ovn-controller-qm7vs-config-pddrl\" (UID: \"d3a4b4d4-cc62-4a80-91f9-fa0c2e98292c\") " pod="openstack/ovn-controller-qm7vs-config-pddrl" Mar 09 13:40:14 crc kubenswrapper[4764]: I0309 13:40:14.247899 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" 
(UniqueName: \"kubernetes.io/configmap/d3a4b4d4-cc62-4a80-91f9-fa0c2e98292c-scripts\") pod \"ovn-controller-qm7vs-config-pddrl\" (UID: \"d3a4b4d4-cc62-4a80-91f9-fa0c2e98292c\") " pod="openstack/ovn-controller-qm7vs-config-pddrl" Mar 09 13:40:14 crc kubenswrapper[4764]: I0309 13:40:14.247995 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/d3a4b4d4-cc62-4a80-91f9-fa0c2e98292c-var-run\") pod \"ovn-controller-qm7vs-config-pddrl\" (UID: \"d3a4b4d4-cc62-4a80-91f9-fa0c2e98292c\") " pod="openstack/ovn-controller-qm7vs-config-pddrl" Mar 09 13:40:14 crc kubenswrapper[4764]: I0309 13:40:14.248006 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/d3a4b4d4-cc62-4a80-91f9-fa0c2e98292c-var-run-ovn\") pod \"ovn-controller-qm7vs-config-pddrl\" (UID: \"d3a4b4d4-cc62-4a80-91f9-fa0c2e98292c\") " pod="openstack/ovn-controller-qm7vs-config-pddrl" Mar 09 13:40:14 crc kubenswrapper[4764]: I0309 13:40:14.248130 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/d3a4b4d4-cc62-4a80-91f9-fa0c2e98292c-var-log-ovn\") pod \"ovn-controller-qm7vs-config-pddrl\" (UID: \"d3a4b4d4-cc62-4a80-91f9-fa0c2e98292c\") " pod="openstack/ovn-controller-qm7vs-config-pddrl" Mar 09 13:40:14 crc kubenswrapper[4764]: I0309 13:40:14.248481 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/d3a4b4d4-cc62-4a80-91f9-fa0c2e98292c-additional-scripts\") pod \"ovn-controller-qm7vs-config-pddrl\" (UID: \"d3a4b4d4-cc62-4a80-91f9-fa0c2e98292c\") " pod="openstack/ovn-controller-qm7vs-config-pddrl" Mar 09 13:40:14 crc kubenswrapper[4764]: I0309 13:40:14.274184 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hgwlt\" (UniqueName: 
\"kubernetes.io/projected/d3a4b4d4-cc62-4a80-91f9-fa0c2e98292c-kube-api-access-hgwlt\") pod \"ovn-controller-qm7vs-config-pddrl\" (UID: \"d3a4b4d4-cc62-4a80-91f9-fa0c2e98292c\") " pod="openstack/ovn-controller-qm7vs-config-pddrl" Mar 09 13:40:14 crc kubenswrapper[4764]: I0309 13:40:14.362728 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-qm7vs-config-pddrl" Mar 09 13:40:14 crc kubenswrapper[4764]: I0309 13:40:14.829399 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-qm7vs-config-pddrl"] Mar 09 13:40:14 crc kubenswrapper[4764]: W0309 13:40:14.839475 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd3a4b4d4_cc62_4a80_91f9_fa0c2e98292c.slice/crio-222c901f3647541831b70903ff46ece77edb02628f24984141b48907d046ac11 WatchSource:0}: Error finding container 222c901f3647541831b70903ff46ece77edb02628f24984141b48907d046ac11: Status 404 returned error can't find the container with id 222c901f3647541831b70903ff46ece77edb02628f24984141b48907d046ac11 Mar 09 13:40:15 crc kubenswrapper[4764]: I0309 13:40:15.736138 4764 generic.go:334] "Generic (PLEG): container finished" podID="d3a4b4d4-cc62-4a80-91f9-fa0c2e98292c" containerID="f865dbb42b23c282654ed11e8388916b4f7e6868644331082fbaed39e3cbb723" exitCode=0 Mar 09 13:40:15 crc kubenswrapper[4764]: I0309 13:40:15.736734 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-qm7vs-config-pddrl" event={"ID":"d3a4b4d4-cc62-4a80-91f9-fa0c2e98292c","Type":"ContainerDied","Data":"f865dbb42b23c282654ed11e8388916b4f7e6868644331082fbaed39e3cbb723"} Mar 09 13:40:15 crc kubenswrapper[4764]: I0309 13:40:15.736769 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-qm7vs-config-pddrl" 
event={"ID":"d3a4b4d4-cc62-4a80-91f9-fa0c2e98292c","Type":"ContainerStarted","Data":"222c901f3647541831b70903ff46ece77edb02628f24984141b48907d046ac11"} Mar 09 13:40:15 crc kubenswrapper[4764]: I0309 13:40:15.962537 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-htpf8"] Mar 09 13:40:15 crc kubenswrapper[4764]: I0309 13:40:15.965244 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-htpf8" Mar 09 13:40:15 crc kubenswrapper[4764]: I0309 13:40:15.967795 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-htpf8"] Mar 09 13:40:15 crc kubenswrapper[4764]: I0309 13:40:15.970499 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-mariadb-root-db-secret" Mar 09 13:40:16 crc kubenswrapper[4764]: I0309 13:40:16.080249 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w89w5\" (UniqueName: \"kubernetes.io/projected/88e0f4c9-1553-4aca-83f2-e0461ddf062b-kube-api-access-w89w5\") pod \"root-account-create-update-htpf8\" (UID: \"88e0f4c9-1553-4aca-83f2-e0461ddf062b\") " pod="openstack/root-account-create-update-htpf8" Mar 09 13:40:16 crc kubenswrapper[4764]: I0309 13:40:16.080375 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/88e0f4c9-1553-4aca-83f2-e0461ddf062b-operator-scripts\") pod \"root-account-create-update-htpf8\" (UID: \"88e0f4c9-1553-4aca-83f2-e0461ddf062b\") " pod="openstack/root-account-create-update-htpf8" Mar 09 13:40:16 crc kubenswrapper[4764]: I0309 13:40:16.183885 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w89w5\" (UniqueName: \"kubernetes.io/projected/88e0f4c9-1553-4aca-83f2-e0461ddf062b-kube-api-access-w89w5\") pod 
\"root-account-create-update-htpf8\" (UID: \"88e0f4c9-1553-4aca-83f2-e0461ddf062b\") " pod="openstack/root-account-create-update-htpf8" Mar 09 13:40:16 crc kubenswrapper[4764]: I0309 13:40:16.185383 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/88e0f4c9-1553-4aca-83f2-e0461ddf062b-operator-scripts\") pod \"root-account-create-update-htpf8\" (UID: \"88e0f4c9-1553-4aca-83f2-e0461ddf062b\") " pod="openstack/root-account-create-update-htpf8" Mar 09 13:40:16 crc kubenswrapper[4764]: I0309 13:40:16.186452 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/88e0f4c9-1553-4aca-83f2-e0461ddf062b-operator-scripts\") pod \"root-account-create-update-htpf8\" (UID: \"88e0f4c9-1553-4aca-83f2-e0461ddf062b\") " pod="openstack/root-account-create-update-htpf8" Mar 09 13:40:16 crc kubenswrapper[4764]: I0309 13:40:16.224295 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w89w5\" (UniqueName: \"kubernetes.io/projected/88e0f4c9-1553-4aca-83f2-e0461ddf062b-kube-api-access-w89w5\") pod \"root-account-create-update-htpf8\" (UID: \"88e0f4c9-1553-4aca-83f2-e0461ddf062b\") " pod="openstack/root-account-create-update-htpf8" Mar 09 13:40:16 crc kubenswrapper[4764]: I0309 13:40:16.289584 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-htpf8" Mar 09 13:40:18 crc kubenswrapper[4764]: I0309 13:40:18.661360 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-qm7vs" Mar 09 13:40:22 crc kubenswrapper[4764]: I0309 13:40:22.117212 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-qm7vs-config-pddrl" Mar 09 13:40:22 crc kubenswrapper[4764]: I0309 13:40:22.301082 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hgwlt\" (UniqueName: \"kubernetes.io/projected/d3a4b4d4-cc62-4a80-91f9-fa0c2e98292c-kube-api-access-hgwlt\") pod \"d3a4b4d4-cc62-4a80-91f9-fa0c2e98292c\" (UID: \"d3a4b4d4-cc62-4a80-91f9-fa0c2e98292c\") " Mar 09 13:40:22 crc kubenswrapper[4764]: I0309 13:40:22.301505 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d3a4b4d4-cc62-4a80-91f9-fa0c2e98292c-scripts\") pod \"d3a4b4d4-cc62-4a80-91f9-fa0c2e98292c\" (UID: \"d3a4b4d4-cc62-4a80-91f9-fa0c2e98292c\") " Mar 09 13:40:22 crc kubenswrapper[4764]: I0309 13:40:22.301638 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/d3a4b4d4-cc62-4a80-91f9-fa0c2e98292c-additional-scripts\") pod \"d3a4b4d4-cc62-4a80-91f9-fa0c2e98292c\" (UID: \"d3a4b4d4-cc62-4a80-91f9-fa0c2e98292c\") " Mar 09 13:40:22 crc kubenswrapper[4764]: I0309 13:40:22.301711 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/d3a4b4d4-cc62-4a80-91f9-fa0c2e98292c-var-run-ovn\") pod \"d3a4b4d4-cc62-4a80-91f9-fa0c2e98292c\" (UID: \"d3a4b4d4-cc62-4a80-91f9-fa0c2e98292c\") " Mar 09 13:40:22 crc kubenswrapper[4764]: I0309 13:40:22.301775 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/d3a4b4d4-cc62-4a80-91f9-fa0c2e98292c-var-run\") pod \"d3a4b4d4-cc62-4a80-91f9-fa0c2e98292c\" (UID: \"d3a4b4d4-cc62-4a80-91f9-fa0c2e98292c\") " Mar 09 13:40:22 crc kubenswrapper[4764]: I0309 13:40:22.301843 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" 
(UniqueName: \"kubernetes.io/host-path/d3a4b4d4-cc62-4a80-91f9-fa0c2e98292c-var-log-ovn\") pod \"d3a4b4d4-cc62-4a80-91f9-fa0c2e98292c\" (UID: \"d3a4b4d4-cc62-4a80-91f9-fa0c2e98292c\") " Mar 09 13:40:22 crc kubenswrapper[4764]: I0309 13:40:22.301923 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d3a4b4d4-cc62-4a80-91f9-fa0c2e98292c-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "d3a4b4d4-cc62-4a80-91f9-fa0c2e98292c" (UID: "d3a4b4d4-cc62-4a80-91f9-fa0c2e98292c"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 09 13:40:22 crc kubenswrapper[4764]: I0309 13:40:22.301997 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d3a4b4d4-cc62-4a80-91f9-fa0c2e98292c-var-run" (OuterVolumeSpecName: "var-run") pod "d3a4b4d4-cc62-4a80-91f9-fa0c2e98292c" (UID: "d3a4b4d4-cc62-4a80-91f9-fa0c2e98292c"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 09 13:40:22 crc kubenswrapper[4764]: I0309 13:40:22.302077 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d3a4b4d4-cc62-4a80-91f9-fa0c2e98292c-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "d3a4b4d4-cc62-4a80-91f9-fa0c2e98292c" (UID: "d3a4b4d4-cc62-4a80-91f9-fa0c2e98292c"). InnerVolumeSpecName "var-log-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 09 13:40:22 crc kubenswrapper[4764]: I0309 13:40:22.302716 4764 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/d3a4b4d4-cc62-4a80-91f9-fa0c2e98292c-var-log-ovn\") on node \"crc\" DevicePath \"\"" Mar 09 13:40:22 crc kubenswrapper[4764]: I0309 13:40:22.302751 4764 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/d3a4b4d4-cc62-4a80-91f9-fa0c2e98292c-var-run-ovn\") on node \"crc\" DevicePath \"\"" Mar 09 13:40:22 crc kubenswrapper[4764]: I0309 13:40:22.302764 4764 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/d3a4b4d4-cc62-4a80-91f9-fa0c2e98292c-var-run\") on node \"crc\" DevicePath \"\"" Mar 09 13:40:22 crc kubenswrapper[4764]: I0309 13:40:22.303401 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d3a4b4d4-cc62-4a80-91f9-fa0c2e98292c-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "d3a4b4d4-cc62-4a80-91f9-fa0c2e98292c" (UID: "d3a4b4d4-cc62-4a80-91f9-fa0c2e98292c"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:40:22 crc kubenswrapper[4764]: I0309 13:40:22.304072 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d3a4b4d4-cc62-4a80-91f9-fa0c2e98292c-scripts" (OuterVolumeSpecName: "scripts") pod "d3a4b4d4-cc62-4a80-91f9-fa0c2e98292c" (UID: "d3a4b4d4-cc62-4a80-91f9-fa0c2e98292c"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:40:22 crc kubenswrapper[4764]: I0309 13:40:22.305922 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d3a4b4d4-cc62-4a80-91f9-fa0c2e98292c-kube-api-access-hgwlt" (OuterVolumeSpecName: "kube-api-access-hgwlt") pod "d3a4b4d4-cc62-4a80-91f9-fa0c2e98292c" (UID: "d3a4b4d4-cc62-4a80-91f9-fa0c2e98292c"). InnerVolumeSpecName "kube-api-access-hgwlt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:40:22 crc kubenswrapper[4764]: I0309 13:40:22.404952 4764 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/d3a4b4d4-cc62-4a80-91f9-fa0c2e98292c-additional-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 13:40:22 crc kubenswrapper[4764]: I0309 13:40:22.405000 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hgwlt\" (UniqueName: \"kubernetes.io/projected/d3a4b4d4-cc62-4a80-91f9-fa0c2e98292c-kube-api-access-hgwlt\") on node \"crc\" DevicePath \"\"" Mar 09 13:40:22 crc kubenswrapper[4764]: I0309 13:40:22.405015 4764 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d3a4b4d4-cc62-4a80-91f9-fa0c2e98292c-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 13:40:22 crc kubenswrapper[4764]: I0309 13:40:22.536971 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-htpf8"] Mar 09 13:40:22 crc kubenswrapper[4764]: W0309 13:40:22.550445 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod88e0f4c9_1553_4aca_83f2_e0461ddf062b.slice/crio-26051806488a9ea5ced1bc0738712a4f97689b2959961164a42a4d12b30ca188 WatchSource:0}: Error finding container 26051806488a9ea5ced1bc0738712a4f97689b2959961164a42a4d12b30ca188: Status 404 returned error can't find the container with id 
26051806488a9ea5ced1bc0738712a4f97689b2959961164a42a4d12b30ca188 Mar 09 13:40:22 crc kubenswrapper[4764]: I0309 13:40:22.797915 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-vzxr2" event={"ID":"29e20119-f7d3-4b10-82c3-afbfa462c831","Type":"ContainerStarted","Data":"06f1bf05b0fa436ae59020308ffb3c9a4a73f8d30f844f8ee3828c34f9dbc702"} Mar 09 13:40:22 crc kubenswrapper[4764]: I0309 13:40:22.800948 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-qm7vs-config-pddrl" Mar 09 13:40:22 crc kubenswrapper[4764]: I0309 13:40:22.800998 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-qm7vs-config-pddrl" event={"ID":"d3a4b4d4-cc62-4a80-91f9-fa0c2e98292c","Type":"ContainerDied","Data":"222c901f3647541831b70903ff46ece77edb02628f24984141b48907d046ac11"} Mar 09 13:40:22 crc kubenswrapper[4764]: I0309 13:40:22.801050 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="222c901f3647541831b70903ff46ece77edb02628f24984141b48907d046ac11" Mar 09 13:40:22 crc kubenswrapper[4764]: I0309 13:40:22.804051 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-htpf8" event={"ID":"88e0f4c9-1553-4aca-83f2-e0461ddf062b","Type":"ContainerStarted","Data":"ccd695e4102eac8ad1bb7aec25a2c1252558ea5824f39855b788c7a3324a1c3b"} Mar 09 13:40:22 crc kubenswrapper[4764]: I0309 13:40:22.804094 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-htpf8" event={"ID":"88e0f4c9-1553-4aca-83f2-e0461ddf062b","Type":"ContainerStarted","Data":"26051806488a9ea5ced1bc0738712a4f97689b2959961164a42a4d12b30ca188"} Mar 09 13:40:22 crc kubenswrapper[4764]: I0309 13:40:22.822120 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-vzxr2" podStartSLOduration=1.8062364720000001 podStartE2EDuration="14.822093359s" 
podCreationTimestamp="2026-03-09 13:40:08 +0000 UTC" firstStartedPulling="2026-03-09 13:40:09.155374838 +0000 UTC m=+1164.405546746" lastFinishedPulling="2026-03-09 13:40:22.171231725 +0000 UTC m=+1177.421403633" observedRunningTime="2026-03-09 13:40:22.819462883 +0000 UTC m=+1178.069634791" watchObservedRunningTime="2026-03-09 13:40:22.822093359 +0000 UTC m=+1178.072265267" Mar 09 13:40:22 crc kubenswrapper[4764]: I0309 13:40:22.847062 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/root-account-create-update-htpf8" podStartSLOduration=7.847032414 podStartE2EDuration="7.847032414s" podCreationTimestamp="2026-03-09 13:40:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:40:22.840761967 +0000 UTC m=+1178.090933895" watchObservedRunningTime="2026-03-09 13:40:22.847032414 +0000 UTC m=+1178.097204342" Mar 09 13:40:23 crc kubenswrapper[4764]: I0309 13:40:23.251958 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-qm7vs-config-pddrl"] Mar 09 13:40:23 crc kubenswrapper[4764]: I0309 13:40:23.260093 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-qm7vs-config-pddrl"] Mar 09 13:40:23 crc kubenswrapper[4764]: I0309 13:40:23.355119 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-qm7vs-config-xgh6g"] Mar 09 13:40:23 crc kubenswrapper[4764]: E0309 13:40:23.355563 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d3a4b4d4-cc62-4a80-91f9-fa0c2e98292c" containerName="ovn-config" Mar 09 13:40:23 crc kubenswrapper[4764]: I0309 13:40:23.355589 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3a4b4d4-cc62-4a80-91f9-fa0c2e98292c" containerName="ovn-config" Mar 09 13:40:23 crc kubenswrapper[4764]: I0309 13:40:23.355812 4764 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="d3a4b4d4-cc62-4a80-91f9-fa0c2e98292c" containerName="ovn-config" Mar 09 13:40:23 crc kubenswrapper[4764]: I0309 13:40:23.356577 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-qm7vs-config-xgh6g" Mar 09 13:40:23 crc kubenswrapper[4764]: I0309 13:40:23.359265 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Mar 09 13:40:23 crc kubenswrapper[4764]: I0309 13:40:23.374514 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-qm7vs-config-xgh6g"] Mar 09 13:40:23 crc kubenswrapper[4764]: I0309 13:40:23.423390 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/186bfdab-9518-4d38-9f43-a0eafa335ed9-additional-scripts\") pod \"ovn-controller-qm7vs-config-xgh6g\" (UID: \"186bfdab-9518-4d38-9f43-a0eafa335ed9\") " pod="openstack/ovn-controller-qm7vs-config-xgh6g" Mar 09 13:40:23 crc kubenswrapper[4764]: I0309 13:40:23.423445 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-879pf\" (UniqueName: \"kubernetes.io/projected/186bfdab-9518-4d38-9f43-a0eafa335ed9-kube-api-access-879pf\") pod \"ovn-controller-qm7vs-config-xgh6g\" (UID: \"186bfdab-9518-4d38-9f43-a0eafa335ed9\") " pod="openstack/ovn-controller-qm7vs-config-xgh6g" Mar 09 13:40:23 crc kubenswrapper[4764]: I0309 13:40:23.423667 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/186bfdab-9518-4d38-9f43-a0eafa335ed9-var-run-ovn\") pod \"ovn-controller-qm7vs-config-xgh6g\" (UID: \"186bfdab-9518-4d38-9f43-a0eafa335ed9\") " pod="openstack/ovn-controller-qm7vs-config-xgh6g" Mar 09 13:40:23 crc kubenswrapper[4764]: I0309 13:40:23.423786 4764 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/186bfdab-9518-4d38-9f43-a0eafa335ed9-scripts\") pod \"ovn-controller-qm7vs-config-xgh6g\" (UID: \"186bfdab-9518-4d38-9f43-a0eafa335ed9\") " pod="openstack/ovn-controller-qm7vs-config-xgh6g" Mar 09 13:40:23 crc kubenswrapper[4764]: I0309 13:40:23.423922 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/186bfdab-9518-4d38-9f43-a0eafa335ed9-var-run\") pod \"ovn-controller-qm7vs-config-xgh6g\" (UID: \"186bfdab-9518-4d38-9f43-a0eafa335ed9\") " pod="openstack/ovn-controller-qm7vs-config-xgh6g" Mar 09 13:40:23 crc kubenswrapper[4764]: I0309 13:40:23.424044 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/186bfdab-9518-4d38-9f43-a0eafa335ed9-var-log-ovn\") pod \"ovn-controller-qm7vs-config-xgh6g\" (UID: \"186bfdab-9518-4d38-9f43-a0eafa335ed9\") " pod="openstack/ovn-controller-qm7vs-config-xgh6g" Mar 09 13:40:23 crc kubenswrapper[4764]: I0309 13:40:23.525730 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/186bfdab-9518-4d38-9f43-a0eafa335ed9-scripts\") pod \"ovn-controller-qm7vs-config-xgh6g\" (UID: \"186bfdab-9518-4d38-9f43-a0eafa335ed9\") " pod="openstack/ovn-controller-qm7vs-config-xgh6g" Mar 09 13:40:23 crc kubenswrapper[4764]: I0309 13:40:23.528402 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/186bfdab-9518-4d38-9f43-a0eafa335ed9-var-run\") pod \"ovn-controller-qm7vs-config-xgh6g\" (UID: \"186bfdab-9518-4d38-9f43-a0eafa335ed9\") " pod="openstack/ovn-controller-qm7vs-config-xgh6g" Mar 09 13:40:23 crc kubenswrapper[4764]: I0309 13:40:23.528473 4764 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/186bfdab-9518-4d38-9f43-a0eafa335ed9-var-log-ovn\") pod \"ovn-controller-qm7vs-config-xgh6g\" (UID: \"186bfdab-9518-4d38-9f43-a0eafa335ed9\") " pod="openstack/ovn-controller-qm7vs-config-xgh6g" Mar 09 13:40:23 crc kubenswrapper[4764]: I0309 13:40:23.528549 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/186bfdab-9518-4d38-9f43-a0eafa335ed9-additional-scripts\") pod \"ovn-controller-qm7vs-config-xgh6g\" (UID: \"186bfdab-9518-4d38-9f43-a0eafa335ed9\") " pod="openstack/ovn-controller-qm7vs-config-xgh6g" Mar 09 13:40:23 crc kubenswrapper[4764]: I0309 13:40:23.528572 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-879pf\" (UniqueName: \"kubernetes.io/projected/186bfdab-9518-4d38-9f43-a0eafa335ed9-kube-api-access-879pf\") pod \"ovn-controller-qm7vs-config-xgh6g\" (UID: \"186bfdab-9518-4d38-9f43-a0eafa335ed9\") " pod="openstack/ovn-controller-qm7vs-config-xgh6g" Mar 09 13:40:23 crc kubenswrapper[4764]: I0309 13:40:23.528741 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/186bfdab-9518-4d38-9f43-a0eafa335ed9-var-run\") pod \"ovn-controller-qm7vs-config-xgh6g\" (UID: \"186bfdab-9518-4d38-9f43-a0eafa335ed9\") " pod="openstack/ovn-controller-qm7vs-config-xgh6g" Mar 09 13:40:23 crc kubenswrapper[4764]: I0309 13:40:23.528775 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/186bfdab-9518-4d38-9f43-a0eafa335ed9-var-run-ovn\") pod \"ovn-controller-qm7vs-config-xgh6g\" (UID: \"186bfdab-9518-4d38-9f43-a0eafa335ed9\") " pod="openstack/ovn-controller-qm7vs-config-xgh6g" Mar 09 13:40:23 crc kubenswrapper[4764]: I0309 13:40:23.528870 4764 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/186bfdab-9518-4d38-9f43-a0eafa335ed9-var-log-ovn\") pod \"ovn-controller-qm7vs-config-xgh6g\" (UID: \"186bfdab-9518-4d38-9f43-a0eafa335ed9\") " pod="openstack/ovn-controller-qm7vs-config-xgh6g" Mar 09 13:40:23 crc kubenswrapper[4764]: I0309 13:40:23.528959 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/186bfdab-9518-4d38-9f43-a0eafa335ed9-var-run-ovn\") pod \"ovn-controller-qm7vs-config-xgh6g\" (UID: \"186bfdab-9518-4d38-9f43-a0eafa335ed9\") " pod="openstack/ovn-controller-qm7vs-config-xgh6g" Mar 09 13:40:23 crc kubenswrapper[4764]: I0309 13:40:23.529201 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/186bfdab-9518-4d38-9f43-a0eafa335ed9-scripts\") pod \"ovn-controller-qm7vs-config-xgh6g\" (UID: \"186bfdab-9518-4d38-9f43-a0eafa335ed9\") " pod="openstack/ovn-controller-qm7vs-config-xgh6g" Mar 09 13:40:23 crc kubenswrapper[4764]: I0309 13:40:23.529480 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/186bfdab-9518-4d38-9f43-a0eafa335ed9-additional-scripts\") pod \"ovn-controller-qm7vs-config-xgh6g\" (UID: \"186bfdab-9518-4d38-9f43-a0eafa335ed9\") " pod="openstack/ovn-controller-qm7vs-config-xgh6g" Mar 09 13:40:23 crc kubenswrapper[4764]: I0309 13:40:23.555072 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-879pf\" (UniqueName: \"kubernetes.io/projected/186bfdab-9518-4d38-9f43-a0eafa335ed9-kube-api-access-879pf\") pod \"ovn-controller-qm7vs-config-xgh6g\" (UID: \"186bfdab-9518-4d38-9f43-a0eafa335ed9\") " pod="openstack/ovn-controller-qm7vs-config-xgh6g" Mar 09 13:40:23 crc kubenswrapper[4764]: I0309 13:40:23.572493 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="d3a4b4d4-cc62-4a80-91f9-fa0c2e98292c" path="/var/lib/kubelet/pods/d3a4b4d4-cc62-4a80-91f9-fa0c2e98292c/volumes" Mar 09 13:40:23 crc kubenswrapper[4764]: I0309 13:40:23.676454 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-qm7vs-config-xgh6g" Mar 09 13:40:23 crc kubenswrapper[4764]: I0309 13:40:23.844215 4764 generic.go:334] "Generic (PLEG): container finished" podID="88e0f4c9-1553-4aca-83f2-e0461ddf062b" containerID="ccd695e4102eac8ad1bb7aec25a2c1252558ea5824f39855b788c7a3324a1c3b" exitCode=0 Mar 09 13:40:23 crc kubenswrapper[4764]: I0309 13:40:23.845041 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-htpf8" event={"ID":"88e0f4c9-1553-4aca-83f2-e0461ddf062b","Type":"ContainerDied","Data":"ccd695e4102eac8ad1bb7aec25a2c1252558ea5824f39855b788c7a3324a1c3b"} Mar 09 13:40:24 crc kubenswrapper[4764]: W0309 13:40:24.197479 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod186bfdab_9518_4d38_9f43_a0eafa335ed9.slice/crio-182ebbef535a87278100e77191a38749232657d5601f48b86924346516887649 WatchSource:0}: Error finding container 182ebbef535a87278100e77191a38749232657d5601f48b86924346516887649: Status 404 returned error can't find the container with id 182ebbef535a87278100e77191a38749232657d5601f48b86924346516887649 Mar 09 13:40:24 crc kubenswrapper[4764]: I0309 13:40:24.198613 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-qm7vs-config-xgh6g"] Mar 09 13:40:24 crc kubenswrapper[4764]: I0309 13:40:24.831963 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Mar 09 13:40:24 crc kubenswrapper[4764]: I0309 13:40:24.857269 4764 generic.go:334] "Generic (PLEG): container finished" podID="186bfdab-9518-4d38-9f43-a0eafa335ed9" 
containerID="8959c8a0509c3231f91de89d62e7a7d8e52e391a4941e4498d6d53a3afa1efea" exitCode=0 Mar 09 13:40:24 crc kubenswrapper[4764]: I0309 13:40:24.857593 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-qm7vs-config-xgh6g" event={"ID":"186bfdab-9518-4d38-9f43-a0eafa335ed9","Type":"ContainerDied","Data":"8959c8a0509c3231f91de89d62e7a7d8e52e391a4941e4498d6d53a3afa1efea"} Mar 09 13:40:24 crc kubenswrapper[4764]: I0309 13:40:24.857692 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-qm7vs-config-xgh6g" event={"ID":"186bfdab-9518-4d38-9f43-a0eafa335ed9","Type":"ContainerStarted","Data":"182ebbef535a87278100e77191a38749232657d5601f48b86924346516887649"} Mar 09 13:40:24 crc kubenswrapper[4764]: I0309 13:40:24.909938 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Mar 09 13:40:25 crc kubenswrapper[4764]: I0309 13:40:25.178042 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-htpf8" Mar 09 13:40:25 crc kubenswrapper[4764]: I0309 13:40:25.270214 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/88e0f4c9-1553-4aca-83f2-e0461ddf062b-operator-scripts\") pod \"88e0f4c9-1553-4aca-83f2-e0461ddf062b\" (UID: \"88e0f4c9-1553-4aca-83f2-e0461ddf062b\") " Mar 09 13:40:25 crc kubenswrapper[4764]: I0309 13:40:25.270375 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w89w5\" (UniqueName: \"kubernetes.io/projected/88e0f4c9-1553-4aca-83f2-e0461ddf062b-kube-api-access-w89w5\") pod \"88e0f4c9-1553-4aca-83f2-e0461ddf062b\" (UID: \"88e0f4c9-1553-4aca-83f2-e0461ddf062b\") " Mar 09 13:40:25 crc kubenswrapper[4764]: I0309 13:40:25.272038 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/88e0f4c9-1553-4aca-83f2-e0461ddf062b-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "88e0f4c9-1553-4aca-83f2-e0461ddf062b" (UID: "88e0f4c9-1553-4aca-83f2-e0461ddf062b"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:40:25 crc kubenswrapper[4764]: I0309 13:40:25.282490 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/88e0f4c9-1553-4aca-83f2-e0461ddf062b-kube-api-access-w89w5" (OuterVolumeSpecName: "kube-api-access-w89w5") pod "88e0f4c9-1553-4aca-83f2-e0461ddf062b" (UID: "88e0f4c9-1553-4aca-83f2-e0461ddf062b"). InnerVolumeSpecName "kube-api-access-w89w5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:40:25 crc kubenswrapper[4764]: I0309 13:40:25.372478 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w89w5\" (UniqueName: \"kubernetes.io/projected/88e0f4c9-1553-4aca-83f2-e0461ddf062b-kube-api-access-w89w5\") on node \"crc\" DevicePath \"\"" Mar 09 13:40:25 crc kubenswrapper[4764]: I0309 13:40:25.372523 4764 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/88e0f4c9-1553-4aca-83f2-e0461ddf062b-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 13:40:25 crc kubenswrapper[4764]: I0309 13:40:25.871021 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-htpf8" event={"ID":"88e0f4c9-1553-4aca-83f2-e0461ddf062b","Type":"ContainerDied","Data":"26051806488a9ea5ced1bc0738712a4f97689b2959961164a42a4d12b30ca188"} Mar 09 13:40:25 crc kubenswrapper[4764]: I0309 13:40:25.871105 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="26051806488a9ea5ced1bc0738712a4f97689b2959961164a42a4d12b30ca188" Mar 09 13:40:25 crc kubenswrapper[4764]: I0309 13:40:25.871036 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-htpf8" Mar 09 13:40:26 crc kubenswrapper[4764]: I0309 13:40:26.209851 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-qm7vs-config-xgh6g" Mar 09 13:40:26 crc kubenswrapper[4764]: I0309 13:40:26.395614 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/186bfdab-9518-4d38-9f43-a0eafa335ed9-additional-scripts\") pod \"186bfdab-9518-4d38-9f43-a0eafa335ed9\" (UID: \"186bfdab-9518-4d38-9f43-a0eafa335ed9\") " Mar 09 13:40:26 crc kubenswrapper[4764]: I0309 13:40:26.395835 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/186bfdab-9518-4d38-9f43-a0eafa335ed9-var-run\") pod \"186bfdab-9518-4d38-9f43-a0eafa335ed9\" (UID: \"186bfdab-9518-4d38-9f43-a0eafa335ed9\") " Mar 09 13:40:26 crc kubenswrapper[4764]: I0309 13:40:26.395911 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/186bfdab-9518-4d38-9f43-a0eafa335ed9-var-log-ovn\") pod \"186bfdab-9518-4d38-9f43-a0eafa335ed9\" (UID: \"186bfdab-9518-4d38-9f43-a0eafa335ed9\") " Mar 09 13:40:26 crc kubenswrapper[4764]: I0309 13:40:26.395944 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-879pf\" (UniqueName: \"kubernetes.io/projected/186bfdab-9518-4d38-9f43-a0eafa335ed9-kube-api-access-879pf\") pod \"186bfdab-9518-4d38-9f43-a0eafa335ed9\" (UID: \"186bfdab-9518-4d38-9f43-a0eafa335ed9\") " Mar 09 13:40:26 crc kubenswrapper[4764]: I0309 13:40:26.396019 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/186bfdab-9518-4d38-9f43-a0eafa335ed9-scripts\") pod \"186bfdab-9518-4d38-9f43-a0eafa335ed9\" (UID: \"186bfdab-9518-4d38-9f43-a0eafa335ed9\") " Mar 09 13:40:26 crc kubenswrapper[4764]: I0309 13:40:26.396041 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/host-path/186bfdab-9518-4d38-9f43-a0eafa335ed9-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "186bfdab-9518-4d38-9f43-a0eafa335ed9" (UID: "186bfdab-9518-4d38-9f43-a0eafa335ed9"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 09 13:40:26 crc kubenswrapper[4764]: I0309 13:40:26.396019 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/186bfdab-9518-4d38-9f43-a0eafa335ed9-var-run" (OuterVolumeSpecName: "var-run") pod "186bfdab-9518-4d38-9f43-a0eafa335ed9" (UID: "186bfdab-9518-4d38-9f43-a0eafa335ed9"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 09 13:40:26 crc kubenswrapper[4764]: I0309 13:40:26.396128 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/186bfdab-9518-4d38-9f43-a0eafa335ed9-var-run-ovn\") pod \"186bfdab-9518-4d38-9f43-a0eafa335ed9\" (UID: \"186bfdab-9518-4d38-9f43-a0eafa335ed9\") " Mar 09 13:40:26 crc kubenswrapper[4764]: I0309 13:40:26.396290 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/186bfdab-9518-4d38-9f43-a0eafa335ed9-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "186bfdab-9518-4d38-9f43-a0eafa335ed9" (UID: "186bfdab-9518-4d38-9f43-a0eafa335ed9"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 09 13:40:26 crc kubenswrapper[4764]: I0309 13:40:26.396742 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/186bfdab-9518-4d38-9f43-a0eafa335ed9-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "186bfdab-9518-4d38-9f43-a0eafa335ed9" (UID: "186bfdab-9518-4d38-9f43-a0eafa335ed9"). InnerVolumeSpecName "additional-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:40:26 crc kubenswrapper[4764]: I0309 13:40:26.396760 4764 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/186bfdab-9518-4d38-9f43-a0eafa335ed9-var-run-ovn\") on node \"crc\" DevicePath \"\"" Mar 09 13:40:26 crc kubenswrapper[4764]: I0309 13:40:26.396775 4764 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/186bfdab-9518-4d38-9f43-a0eafa335ed9-var-run\") on node \"crc\" DevicePath \"\"" Mar 09 13:40:26 crc kubenswrapper[4764]: I0309 13:40:26.396784 4764 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/186bfdab-9518-4d38-9f43-a0eafa335ed9-var-log-ovn\") on node \"crc\" DevicePath \"\"" Mar 09 13:40:26 crc kubenswrapper[4764]: I0309 13:40:26.397101 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/186bfdab-9518-4d38-9f43-a0eafa335ed9-scripts" (OuterVolumeSpecName: "scripts") pod "186bfdab-9518-4d38-9f43-a0eafa335ed9" (UID: "186bfdab-9518-4d38-9f43-a0eafa335ed9"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:40:26 crc kubenswrapper[4764]: I0309 13:40:26.401396 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/186bfdab-9518-4d38-9f43-a0eafa335ed9-kube-api-access-879pf" (OuterVolumeSpecName: "kube-api-access-879pf") pod "186bfdab-9518-4d38-9f43-a0eafa335ed9" (UID: "186bfdab-9518-4d38-9f43-a0eafa335ed9"). InnerVolumeSpecName "kube-api-access-879pf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:40:26 crc kubenswrapper[4764]: I0309 13:40:26.498833 4764 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/186bfdab-9518-4d38-9f43-a0eafa335ed9-additional-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 13:40:26 crc kubenswrapper[4764]: I0309 13:40:26.498880 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-879pf\" (UniqueName: \"kubernetes.io/projected/186bfdab-9518-4d38-9f43-a0eafa335ed9-kube-api-access-879pf\") on node \"crc\" DevicePath \"\"" Mar 09 13:40:26 crc kubenswrapper[4764]: I0309 13:40:26.498894 4764 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/186bfdab-9518-4d38-9f43-a0eafa335ed9-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 13:40:26 crc kubenswrapper[4764]: I0309 13:40:26.810323 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-mqv59"] Mar 09 13:40:26 crc kubenswrapper[4764]: E0309 13:40:26.810792 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="88e0f4c9-1553-4aca-83f2-e0461ddf062b" containerName="mariadb-account-create-update" Mar 09 13:40:26 crc kubenswrapper[4764]: I0309 13:40:26.810818 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="88e0f4c9-1553-4aca-83f2-e0461ddf062b" containerName="mariadb-account-create-update" Mar 09 13:40:26 crc kubenswrapper[4764]: E0309 13:40:26.810848 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="186bfdab-9518-4d38-9f43-a0eafa335ed9" containerName="ovn-config" Mar 09 13:40:26 crc kubenswrapper[4764]: I0309 13:40:26.810856 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="186bfdab-9518-4d38-9f43-a0eafa335ed9" containerName="ovn-config" Mar 09 13:40:26 crc kubenswrapper[4764]: I0309 13:40:26.811056 4764 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="186bfdab-9518-4d38-9f43-a0eafa335ed9" containerName="ovn-config" Mar 09 13:40:26 crc kubenswrapper[4764]: I0309 13:40:26.811081 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="88e0f4c9-1553-4aca-83f2-e0461ddf062b" containerName="mariadb-account-create-update" Mar 09 13:40:26 crc kubenswrapper[4764]: I0309 13:40:26.811820 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-mqv59" Mar 09 13:40:26 crc kubenswrapper[4764]: I0309 13:40:26.817846 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-mqv59"] Mar 09 13:40:26 crc kubenswrapper[4764]: I0309 13:40:26.881600 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-qm7vs-config-xgh6g" event={"ID":"186bfdab-9518-4d38-9f43-a0eafa335ed9","Type":"ContainerDied","Data":"182ebbef535a87278100e77191a38749232657d5601f48b86924346516887649"} Mar 09 13:40:26 crc kubenswrapper[4764]: I0309 13:40:26.881662 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="182ebbef535a87278100e77191a38749232657d5601f48b86924346516887649" Mar 09 13:40:26 crc kubenswrapper[4764]: I0309 13:40:26.881686 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-qm7vs-config-xgh6g" Mar 09 13:40:26 crc kubenswrapper[4764]: I0309 13:40:26.906071 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/46124175-b282-444f-8d9c-0397e35cf8ae-operator-scripts\") pod \"cinder-db-create-mqv59\" (UID: \"46124175-b282-444f-8d9c-0397e35cf8ae\") " pod="openstack/cinder-db-create-mqv59" Mar 09 13:40:26 crc kubenswrapper[4764]: I0309 13:40:26.906195 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6zqq7\" (UniqueName: \"kubernetes.io/projected/46124175-b282-444f-8d9c-0397e35cf8ae-kube-api-access-6zqq7\") pod \"cinder-db-create-mqv59\" (UID: \"46124175-b282-444f-8d9c-0397e35cf8ae\") " pod="openstack/cinder-db-create-mqv59" Mar 09 13:40:26 crc kubenswrapper[4764]: I0309 13:40:26.942123 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-15a9-account-create-update-5s8tj"] Mar 09 13:40:26 crc kubenswrapper[4764]: I0309 13:40:26.943296 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-15a9-account-create-update-5s8tj" Mar 09 13:40:26 crc kubenswrapper[4764]: I0309 13:40:26.945713 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret" Mar 09 13:40:26 crc kubenswrapper[4764]: I0309 13:40:26.958288 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-15a9-account-create-update-5s8tj"] Mar 09 13:40:27 crc kubenswrapper[4764]: I0309 13:40:27.003704 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-thhcb"] Mar 09 13:40:27 crc kubenswrapper[4764]: I0309 13:40:27.005048 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-thhcb" Mar 09 13:40:27 crc kubenswrapper[4764]: I0309 13:40:27.010355 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/46124175-b282-444f-8d9c-0397e35cf8ae-operator-scripts\") pod \"cinder-db-create-mqv59\" (UID: \"46124175-b282-444f-8d9c-0397e35cf8ae\") " pod="openstack/cinder-db-create-mqv59" Mar 09 13:40:27 crc kubenswrapper[4764]: I0309 13:40:27.010438 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6zqq7\" (UniqueName: \"kubernetes.io/projected/46124175-b282-444f-8d9c-0397e35cf8ae-kube-api-access-6zqq7\") pod \"cinder-db-create-mqv59\" (UID: \"46124175-b282-444f-8d9c-0397e35cf8ae\") " pod="openstack/cinder-db-create-mqv59" Mar 09 13:40:27 crc kubenswrapper[4764]: I0309 13:40:27.011605 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/46124175-b282-444f-8d9c-0397e35cf8ae-operator-scripts\") pod \"cinder-db-create-mqv59\" (UID: \"46124175-b282-444f-8d9c-0397e35cf8ae\") " pod="openstack/cinder-db-create-mqv59" Mar 09 13:40:27 crc kubenswrapper[4764]: I0309 13:40:27.016113 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-thhcb"] Mar 09 13:40:27 crc kubenswrapper[4764]: I0309 13:40:27.038633 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6zqq7\" (UniqueName: \"kubernetes.io/projected/46124175-b282-444f-8d9c-0397e35cf8ae-kube-api-access-6zqq7\") pod \"cinder-db-create-mqv59\" (UID: \"46124175-b282-444f-8d9c-0397e35cf8ae\") " pod="openstack/cinder-db-create-mqv59" Mar 09 13:40:27 crc kubenswrapper[4764]: I0309 13:40:27.112779 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ppxms\" (UniqueName: 
\"kubernetes.io/projected/642b5df5-dec0-47cc-9595-02b254277452-kube-api-access-ppxms\") pod \"barbican-db-create-thhcb\" (UID: \"642b5df5-dec0-47cc-9595-02b254277452\") " pod="openstack/barbican-db-create-thhcb" Mar 09 13:40:27 crc kubenswrapper[4764]: I0309 13:40:27.113205 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/82410bc0-aa4c-450d-8fbc-67cfb9dd615b-operator-scripts\") pod \"cinder-15a9-account-create-update-5s8tj\" (UID: \"82410bc0-aa4c-450d-8fbc-67cfb9dd615b\") " pod="openstack/cinder-15a9-account-create-update-5s8tj" Mar 09 13:40:27 crc kubenswrapper[4764]: I0309 13:40:27.113261 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sbcrf\" (UniqueName: \"kubernetes.io/projected/82410bc0-aa4c-450d-8fbc-67cfb9dd615b-kube-api-access-sbcrf\") pod \"cinder-15a9-account-create-update-5s8tj\" (UID: \"82410bc0-aa4c-450d-8fbc-67cfb9dd615b\") " pod="openstack/cinder-15a9-account-create-update-5s8tj" Mar 09 13:40:27 crc kubenswrapper[4764]: I0309 13:40:27.113297 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/642b5df5-dec0-47cc-9595-02b254277452-operator-scripts\") pod \"barbican-db-create-thhcb\" (UID: \"642b5df5-dec0-47cc-9595-02b254277452\") " pod="openstack/barbican-db-create-thhcb" Mar 09 13:40:27 crc kubenswrapper[4764]: I0309 13:40:27.133359 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-mqv59" Mar 09 13:40:27 crc kubenswrapper[4764]: I0309 13:40:27.176980 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-rjx7v"] Mar 09 13:40:27 crc kubenswrapper[4764]: I0309 13:40:27.178396 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-rjx7v" Mar 09 13:40:27 crc kubenswrapper[4764]: I0309 13:40:27.182290 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Mar 09 13:40:27 crc kubenswrapper[4764]: I0309 13:40:27.182412 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Mar 09 13:40:27 crc kubenswrapper[4764]: I0309 13:40:27.182679 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-4hfql" Mar 09 13:40:27 crc kubenswrapper[4764]: I0309 13:40:27.183406 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Mar 09 13:40:27 crc kubenswrapper[4764]: I0309 13:40:27.191848 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-rjx7v"] Mar 09 13:40:27 crc kubenswrapper[4764]: I0309 13:40:27.201731 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-gkf9g"] Mar 09 13:40:27 crc kubenswrapper[4764]: I0309 13:40:27.202755 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-gkf9g" Mar 09 13:40:27 crc kubenswrapper[4764]: I0309 13:40:27.219051 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/642b5df5-dec0-47cc-9595-02b254277452-operator-scripts\") pod \"barbican-db-create-thhcb\" (UID: \"642b5df5-dec0-47cc-9595-02b254277452\") " pod="openstack/barbican-db-create-thhcb" Mar 09 13:40:27 crc kubenswrapper[4764]: I0309 13:40:27.219158 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ppxms\" (UniqueName: \"kubernetes.io/projected/642b5df5-dec0-47cc-9595-02b254277452-kube-api-access-ppxms\") pod \"barbican-db-create-thhcb\" (UID: \"642b5df5-dec0-47cc-9595-02b254277452\") " pod="openstack/barbican-db-create-thhcb" Mar 09 13:40:27 crc kubenswrapper[4764]: I0309 13:40:27.219211 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/82410bc0-aa4c-450d-8fbc-67cfb9dd615b-operator-scripts\") pod \"cinder-15a9-account-create-update-5s8tj\" (UID: \"82410bc0-aa4c-450d-8fbc-67cfb9dd615b\") " pod="openstack/cinder-15a9-account-create-update-5s8tj" Mar 09 13:40:27 crc kubenswrapper[4764]: I0309 13:40:27.219250 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sbcrf\" (UniqueName: \"kubernetes.io/projected/82410bc0-aa4c-450d-8fbc-67cfb9dd615b-kube-api-access-sbcrf\") pod \"cinder-15a9-account-create-update-5s8tj\" (UID: \"82410bc0-aa4c-450d-8fbc-67cfb9dd615b\") " pod="openstack/cinder-15a9-account-create-update-5s8tj" Mar 09 13:40:27 crc kubenswrapper[4764]: I0309 13:40:27.220332 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/82410bc0-aa4c-450d-8fbc-67cfb9dd615b-operator-scripts\") pod \"cinder-15a9-account-create-update-5s8tj\" (UID: 
\"82410bc0-aa4c-450d-8fbc-67cfb9dd615b\") " pod="openstack/cinder-15a9-account-create-update-5s8tj" Mar 09 13:40:27 crc kubenswrapper[4764]: I0309 13:40:27.220558 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/642b5df5-dec0-47cc-9595-02b254277452-operator-scripts\") pod \"barbican-db-create-thhcb\" (UID: \"642b5df5-dec0-47cc-9595-02b254277452\") " pod="openstack/barbican-db-create-thhcb" Mar 09 13:40:27 crc kubenswrapper[4764]: I0309 13:40:27.233448 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-6d6f-account-create-update-qxm8j"] Mar 09 13:40:27 crc kubenswrapper[4764]: I0309 13:40:27.234588 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-6d6f-account-create-update-qxm8j" Mar 09 13:40:27 crc kubenswrapper[4764]: I0309 13:40:27.240790 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret" Mar 09 13:40:27 crc kubenswrapper[4764]: I0309 13:40:27.249276 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sbcrf\" (UniqueName: \"kubernetes.io/projected/82410bc0-aa4c-450d-8fbc-67cfb9dd615b-kube-api-access-sbcrf\") pod \"cinder-15a9-account-create-update-5s8tj\" (UID: \"82410bc0-aa4c-450d-8fbc-67cfb9dd615b\") " pod="openstack/cinder-15a9-account-create-update-5s8tj" Mar 09 13:40:27 crc kubenswrapper[4764]: I0309 13:40:27.254422 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-gkf9g"] Mar 09 13:40:27 crc kubenswrapper[4764]: I0309 13:40:27.259989 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ppxms\" (UniqueName: \"kubernetes.io/projected/642b5df5-dec0-47cc-9595-02b254277452-kube-api-access-ppxms\") pod \"barbican-db-create-thhcb\" (UID: \"642b5df5-dec0-47cc-9595-02b254277452\") " pod="openstack/barbican-db-create-thhcb" Mar 09 13:40:27 crc 
kubenswrapper[4764]: I0309 13:40:27.276984 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-15a9-account-create-update-5s8tj" Mar 09 13:40:27 crc kubenswrapper[4764]: I0309 13:40:27.286721 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-6d6f-account-create-update-qxm8j"] Mar 09 13:40:27 crc kubenswrapper[4764]: I0309 13:40:27.326080 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-thhcb" Mar 09 13:40:27 crc kubenswrapper[4764]: I0309 13:40:27.327804 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1bf72fda-56e5-427c-b2d0-8267613d8a9e-operator-scripts\") pod \"neutron-db-create-gkf9g\" (UID: \"1bf72fda-56e5-427c-b2d0-8267613d8a9e\") " pod="openstack/neutron-db-create-gkf9g" Mar 09 13:40:27 crc kubenswrapper[4764]: I0309 13:40:27.327867 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gw7zq\" (UniqueName: \"kubernetes.io/projected/1bf72fda-56e5-427c-b2d0-8267613d8a9e-kube-api-access-gw7zq\") pod \"neutron-db-create-gkf9g\" (UID: \"1bf72fda-56e5-427c-b2d0-8267613d8a9e\") " pod="openstack/neutron-db-create-gkf9g" Mar 09 13:40:27 crc kubenswrapper[4764]: I0309 13:40:27.328097 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ea9388c2-526b-49ff-8f42-03ca66ae08dd-config-data\") pod \"keystone-db-sync-rjx7v\" (UID: \"ea9388c2-526b-49ff-8f42-03ca66ae08dd\") " pod="openstack/keystone-db-sync-rjx7v" Mar 09 13:40:27 crc kubenswrapper[4764]: I0309 13:40:27.328120 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/ea9388c2-526b-49ff-8f42-03ca66ae08dd-combined-ca-bundle\") pod \"keystone-db-sync-rjx7v\" (UID: \"ea9388c2-526b-49ff-8f42-03ca66ae08dd\") " pod="openstack/keystone-db-sync-rjx7v" Mar 09 13:40:27 crc kubenswrapper[4764]: I0309 13:40:27.328165 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q8bhv\" (UniqueName: \"kubernetes.io/projected/ea9388c2-526b-49ff-8f42-03ca66ae08dd-kube-api-access-q8bhv\") pod \"keystone-db-sync-rjx7v\" (UID: \"ea9388c2-526b-49ff-8f42-03ca66ae08dd\") " pod="openstack/keystone-db-sync-rjx7v" Mar 09 13:40:27 crc kubenswrapper[4764]: I0309 13:40:27.376847 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-a166-account-create-update-kswwc"] Mar 09 13:40:27 crc kubenswrapper[4764]: I0309 13:40:27.378337 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-a166-account-create-update-kswwc" Mar 09 13:40:27 crc kubenswrapper[4764]: I0309 13:40:27.384328 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-qm7vs-config-xgh6g"] Mar 09 13:40:27 crc kubenswrapper[4764]: I0309 13:40:27.395164 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret" Mar 09 13:40:27 crc kubenswrapper[4764]: I0309 13:40:27.395228 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-qm7vs-config-xgh6g"] Mar 09 13:40:27 crc kubenswrapper[4764]: I0309 13:40:27.395799 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-a166-account-create-update-kswwc"] Mar 09 13:40:27 crc kubenswrapper[4764]: I0309 13:40:27.429891 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ea9388c2-526b-49ff-8f42-03ca66ae08dd-config-data\") pod \"keystone-db-sync-rjx7v\" (UID: \"ea9388c2-526b-49ff-8f42-03ca66ae08dd\") " 
pod="openstack/keystone-db-sync-rjx7v" Mar 09 13:40:27 crc kubenswrapper[4764]: I0309 13:40:27.430307 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ea9388c2-526b-49ff-8f42-03ca66ae08dd-combined-ca-bundle\") pod \"keystone-db-sync-rjx7v\" (UID: \"ea9388c2-526b-49ff-8f42-03ca66ae08dd\") " pod="openstack/keystone-db-sync-rjx7v" Mar 09 13:40:27 crc kubenswrapper[4764]: I0309 13:40:27.430354 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q8bhv\" (UniqueName: \"kubernetes.io/projected/ea9388c2-526b-49ff-8f42-03ca66ae08dd-kube-api-access-q8bhv\") pod \"keystone-db-sync-rjx7v\" (UID: \"ea9388c2-526b-49ff-8f42-03ca66ae08dd\") " pod="openstack/keystone-db-sync-rjx7v" Mar 09 13:40:27 crc kubenswrapper[4764]: I0309 13:40:27.430399 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5b86f9b8-6493-4a60-85b3-12057a6a8f65-operator-scripts\") pod \"barbican-6d6f-account-create-update-qxm8j\" (UID: \"5b86f9b8-6493-4a60-85b3-12057a6a8f65\") " pod="openstack/barbican-6d6f-account-create-update-qxm8j" Mar 09 13:40:27 crc kubenswrapper[4764]: I0309 13:40:27.430433 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wsb9v\" (UniqueName: \"kubernetes.io/projected/5b86f9b8-6493-4a60-85b3-12057a6a8f65-kube-api-access-wsb9v\") pod \"barbican-6d6f-account-create-update-qxm8j\" (UID: \"5b86f9b8-6493-4a60-85b3-12057a6a8f65\") " pod="openstack/barbican-6d6f-account-create-update-qxm8j" Mar 09 13:40:27 crc kubenswrapper[4764]: I0309 13:40:27.430482 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1bf72fda-56e5-427c-b2d0-8267613d8a9e-operator-scripts\") pod \"neutron-db-create-gkf9g\" (UID: 
\"1bf72fda-56e5-427c-b2d0-8267613d8a9e\") " pod="openstack/neutron-db-create-gkf9g" Mar 09 13:40:27 crc kubenswrapper[4764]: I0309 13:40:27.430508 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gw7zq\" (UniqueName: \"kubernetes.io/projected/1bf72fda-56e5-427c-b2d0-8267613d8a9e-kube-api-access-gw7zq\") pod \"neutron-db-create-gkf9g\" (UID: \"1bf72fda-56e5-427c-b2d0-8267613d8a9e\") " pod="openstack/neutron-db-create-gkf9g" Mar 09 13:40:27 crc kubenswrapper[4764]: I0309 13:40:27.443141 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1bf72fda-56e5-427c-b2d0-8267613d8a9e-operator-scripts\") pod \"neutron-db-create-gkf9g\" (UID: \"1bf72fda-56e5-427c-b2d0-8267613d8a9e\") " pod="openstack/neutron-db-create-gkf9g" Mar 09 13:40:27 crc kubenswrapper[4764]: I0309 13:40:27.444670 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ea9388c2-526b-49ff-8f42-03ca66ae08dd-combined-ca-bundle\") pod \"keystone-db-sync-rjx7v\" (UID: \"ea9388c2-526b-49ff-8f42-03ca66ae08dd\") " pod="openstack/keystone-db-sync-rjx7v" Mar 09 13:40:27 crc kubenswrapper[4764]: I0309 13:40:27.449401 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ea9388c2-526b-49ff-8f42-03ca66ae08dd-config-data\") pod \"keystone-db-sync-rjx7v\" (UID: \"ea9388c2-526b-49ff-8f42-03ca66ae08dd\") " pod="openstack/keystone-db-sync-rjx7v" Mar 09 13:40:27 crc kubenswrapper[4764]: I0309 13:40:27.467471 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gw7zq\" (UniqueName: \"kubernetes.io/projected/1bf72fda-56e5-427c-b2d0-8267613d8a9e-kube-api-access-gw7zq\") pod \"neutron-db-create-gkf9g\" (UID: \"1bf72fda-56e5-427c-b2d0-8267613d8a9e\") " pod="openstack/neutron-db-create-gkf9g" Mar 09 13:40:27 crc 
kubenswrapper[4764]: I0309 13:40:27.471183 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q8bhv\" (UniqueName: \"kubernetes.io/projected/ea9388c2-526b-49ff-8f42-03ca66ae08dd-kube-api-access-q8bhv\") pod \"keystone-db-sync-rjx7v\" (UID: \"ea9388c2-526b-49ff-8f42-03ca66ae08dd\") " pod="openstack/keystone-db-sync-rjx7v" Mar 09 13:40:27 crc kubenswrapper[4764]: I0309 13:40:27.534518 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5b86f9b8-6493-4a60-85b3-12057a6a8f65-operator-scripts\") pod \"barbican-6d6f-account-create-update-qxm8j\" (UID: \"5b86f9b8-6493-4a60-85b3-12057a6a8f65\") " pod="openstack/barbican-6d6f-account-create-update-qxm8j" Mar 09 13:40:27 crc kubenswrapper[4764]: I0309 13:40:27.534581 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wsb9v\" (UniqueName: \"kubernetes.io/projected/5b86f9b8-6493-4a60-85b3-12057a6a8f65-kube-api-access-wsb9v\") pod \"barbican-6d6f-account-create-update-qxm8j\" (UID: \"5b86f9b8-6493-4a60-85b3-12057a6a8f65\") " pod="openstack/barbican-6d6f-account-create-update-qxm8j" Mar 09 13:40:27 crc kubenswrapper[4764]: I0309 13:40:27.534701 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6tdsb\" (UniqueName: \"kubernetes.io/projected/ad7d32c2-ffe4-43d5-8640-6219f863bc2a-kube-api-access-6tdsb\") pod \"neutron-a166-account-create-update-kswwc\" (UID: \"ad7d32c2-ffe4-43d5-8640-6219f863bc2a\") " pod="openstack/neutron-a166-account-create-update-kswwc" Mar 09 13:40:27 crc kubenswrapper[4764]: I0309 13:40:27.534767 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ad7d32c2-ffe4-43d5-8640-6219f863bc2a-operator-scripts\") pod \"neutron-a166-account-create-update-kswwc\" (UID: 
\"ad7d32c2-ffe4-43d5-8640-6219f863bc2a\") " pod="openstack/neutron-a166-account-create-update-kswwc" Mar 09 13:40:27 crc kubenswrapper[4764]: I0309 13:40:27.535661 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5b86f9b8-6493-4a60-85b3-12057a6a8f65-operator-scripts\") pod \"barbican-6d6f-account-create-update-qxm8j\" (UID: \"5b86f9b8-6493-4a60-85b3-12057a6a8f65\") " pod="openstack/barbican-6d6f-account-create-update-qxm8j" Mar 09 13:40:27 crc kubenswrapper[4764]: I0309 13:40:27.563188 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wsb9v\" (UniqueName: \"kubernetes.io/projected/5b86f9b8-6493-4a60-85b3-12057a6a8f65-kube-api-access-wsb9v\") pod \"barbican-6d6f-account-create-update-qxm8j\" (UID: \"5b86f9b8-6493-4a60-85b3-12057a6a8f65\") " pod="openstack/barbican-6d6f-account-create-update-qxm8j" Mar 09 13:40:27 crc kubenswrapper[4764]: I0309 13:40:27.584528 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-rjx7v" Mar 09 13:40:27 crc kubenswrapper[4764]: I0309 13:40:27.592950 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="186bfdab-9518-4d38-9f43-a0eafa335ed9" path="/var/lib/kubelet/pods/186bfdab-9518-4d38-9f43-a0eafa335ed9/volumes" Mar 09 13:40:27 crc kubenswrapper[4764]: I0309 13:40:27.605090 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-6d6f-account-create-update-qxm8j" Mar 09 13:40:27 crc kubenswrapper[4764]: I0309 13:40:27.636760 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6tdsb\" (UniqueName: \"kubernetes.io/projected/ad7d32c2-ffe4-43d5-8640-6219f863bc2a-kube-api-access-6tdsb\") pod \"neutron-a166-account-create-update-kswwc\" (UID: \"ad7d32c2-ffe4-43d5-8640-6219f863bc2a\") " pod="openstack/neutron-a166-account-create-update-kswwc" Mar 09 13:40:27 crc kubenswrapper[4764]: I0309 13:40:27.636866 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ad7d32c2-ffe4-43d5-8640-6219f863bc2a-operator-scripts\") pod \"neutron-a166-account-create-update-kswwc\" (UID: \"ad7d32c2-ffe4-43d5-8640-6219f863bc2a\") " pod="openstack/neutron-a166-account-create-update-kswwc" Mar 09 13:40:27 crc kubenswrapper[4764]: I0309 13:40:27.637627 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ad7d32c2-ffe4-43d5-8640-6219f863bc2a-operator-scripts\") pod \"neutron-a166-account-create-update-kswwc\" (UID: \"ad7d32c2-ffe4-43d5-8640-6219f863bc2a\") " pod="openstack/neutron-a166-account-create-update-kswwc" Mar 09 13:40:27 crc kubenswrapper[4764]: I0309 13:40:27.638049 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-gkf9g" Mar 09 13:40:27 crc kubenswrapper[4764]: I0309 13:40:27.676059 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6tdsb\" (UniqueName: \"kubernetes.io/projected/ad7d32c2-ffe4-43d5-8640-6219f863bc2a-kube-api-access-6tdsb\") pod \"neutron-a166-account-create-update-kswwc\" (UID: \"ad7d32c2-ffe4-43d5-8640-6219f863bc2a\") " pod="openstack/neutron-a166-account-create-update-kswwc" Mar 09 13:40:27 crc kubenswrapper[4764]: I0309 13:40:27.828804 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-a166-account-create-update-kswwc" Mar 09 13:40:27 crc kubenswrapper[4764]: I0309 13:40:27.887336 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-15a9-account-create-update-5s8tj"] Mar 09 13:40:27 crc kubenswrapper[4764]: W0309 13:40:27.916819 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod82410bc0_aa4c_450d_8fbc_67cfb9dd615b.slice/crio-0b8142b3cc805480a67a46802a0365da053e4e9868c09964599a921ca210d192 WatchSource:0}: Error finding container 0b8142b3cc805480a67a46802a0365da053e4e9868c09964599a921ca210d192: Status 404 returned error can't find the container with id 0b8142b3cc805480a67a46802a0365da053e4e9868c09964599a921ca210d192 Mar 09 13:40:27 crc kubenswrapper[4764]: I0309 13:40:27.932980 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-mqv59"] Mar 09 13:40:28 crc kubenswrapper[4764]: W0309 13:40:28.222470 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod642b5df5_dec0_47cc_9595_02b254277452.slice/crio-d83672538bf04116631d7414d3375d19a02bf6520cb37838591c270b1cfadc5f WatchSource:0}: Error finding container d83672538bf04116631d7414d3375d19a02bf6520cb37838591c270b1cfadc5f: Status 404 returned error 
can't find the container with id d83672538bf04116631d7414d3375d19a02bf6520cb37838591c270b1cfadc5f Mar 09 13:40:28 crc kubenswrapper[4764]: I0309 13:40:28.223287 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-thhcb"] Mar 09 13:40:28 crc kubenswrapper[4764]: I0309 13:40:28.370513 4764 patch_prober.go:28] interesting pod/machine-config-daemon-xxczl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 09 13:40:28 crc kubenswrapper[4764]: I0309 13:40:28.370582 4764 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 09 13:40:28 crc kubenswrapper[4764]: I0309 13:40:28.373862 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-gkf9g"] Mar 09 13:40:28 crc kubenswrapper[4764]: W0309 13:40:28.383972 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1bf72fda_56e5_427c_b2d0_8267613d8a9e.slice/crio-e51ce3016afe58e70de7013784d516c067ad727cd5eede5c2da838d45dc7748c WatchSource:0}: Error finding container e51ce3016afe58e70de7013784d516c067ad727cd5eede5c2da838d45dc7748c: Status 404 returned error can't find the container with id e51ce3016afe58e70de7013784d516c067ad727cd5eede5c2da838d45dc7748c Mar 09 13:40:28 crc kubenswrapper[4764]: I0309 13:40:28.934397 4764 generic.go:334] "Generic (PLEG): container finished" podID="82410bc0-aa4c-450d-8fbc-67cfb9dd615b" containerID="339bad8557439d7afa5c329a435cd35a995246fc604b25e4a8e250f1a7b99368" exitCode=0 Mar 09 13:40:28 crc 
kubenswrapper[4764]: I0309 13:40:28.934630 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-15a9-account-create-update-5s8tj" event={"ID":"82410bc0-aa4c-450d-8fbc-67cfb9dd615b","Type":"ContainerDied","Data":"339bad8557439d7afa5c329a435cd35a995246fc604b25e4a8e250f1a7b99368"} Mar 09 13:40:28 crc kubenswrapper[4764]: I0309 13:40:28.934931 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-15a9-account-create-update-5s8tj" event={"ID":"82410bc0-aa4c-450d-8fbc-67cfb9dd615b","Type":"ContainerStarted","Data":"0b8142b3cc805480a67a46802a0365da053e4e9868c09964599a921ca210d192"} Mar 09 13:40:28 crc kubenswrapper[4764]: I0309 13:40:28.938468 4764 generic.go:334] "Generic (PLEG): container finished" podID="46124175-b282-444f-8d9c-0397e35cf8ae" containerID="da404af8a75d74966ac6aec712c910704da97b03d85b23af80911b87587e1ef7" exitCode=0 Mar 09 13:40:28 crc kubenswrapper[4764]: I0309 13:40:28.938610 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-mqv59" event={"ID":"46124175-b282-444f-8d9c-0397e35cf8ae","Type":"ContainerDied","Data":"da404af8a75d74966ac6aec712c910704da97b03d85b23af80911b87587e1ef7"} Mar 09 13:40:28 crc kubenswrapper[4764]: I0309 13:40:28.938684 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-mqv59" event={"ID":"46124175-b282-444f-8d9c-0397e35cf8ae","Type":"ContainerStarted","Data":"8e28314c4efa3469246836e0ce638aaaf4d3b302e3f0249c8ebd090715366479"} Mar 09 13:40:28 crc kubenswrapper[4764]: I0309 13:40:28.944317 4764 generic.go:334] "Generic (PLEG): container finished" podID="642b5df5-dec0-47cc-9595-02b254277452" containerID="6b21bf7421d738d162c19d823aaa8d5749330a4586aa8ead5eb3f5fbb8ebcd4e" exitCode=0 Mar 09 13:40:28 crc kubenswrapper[4764]: I0309 13:40:28.944407 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-thhcb" 
event={"ID":"642b5df5-dec0-47cc-9595-02b254277452","Type":"ContainerDied","Data":"6b21bf7421d738d162c19d823aaa8d5749330a4586aa8ead5eb3f5fbb8ebcd4e"} Mar 09 13:40:28 crc kubenswrapper[4764]: I0309 13:40:28.944455 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-thhcb" event={"ID":"642b5df5-dec0-47cc-9595-02b254277452","Type":"ContainerStarted","Data":"d83672538bf04116631d7414d3375d19a02bf6520cb37838591c270b1cfadc5f"} Mar 09 13:40:28 crc kubenswrapper[4764]: I0309 13:40:28.947184 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-gkf9g" event={"ID":"1bf72fda-56e5-427c-b2d0-8267613d8a9e","Type":"ContainerStarted","Data":"8799a3258f6d77e33d1068648c90a3371654d54435ab51ff0aa268e01baf2e9b"} Mar 09 13:40:28 crc kubenswrapper[4764]: I0309 13:40:28.947215 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-gkf9g" event={"ID":"1bf72fda-56e5-427c-b2d0-8267613d8a9e","Type":"ContainerStarted","Data":"e51ce3016afe58e70de7013784d516c067ad727cd5eede5c2da838d45dc7748c"} Mar 09 13:40:29 crc kubenswrapper[4764]: I0309 13:40:29.043410 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-create-gkf9g" podStartSLOduration=2.043393455 podStartE2EDuration="2.043393455s" podCreationTimestamp="2026-03-09 13:40:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:40:29.039755383 +0000 UTC m=+1184.289927291" watchObservedRunningTime="2026-03-09 13:40:29.043393455 +0000 UTC m=+1184.293565363" Mar 09 13:40:29 crc kubenswrapper[4764]: I0309 13:40:29.230135 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-rjx7v"] Mar 09 13:40:29 crc kubenswrapper[4764]: I0309 13:40:29.239772 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-6d6f-account-create-update-qxm8j"] Mar 09 13:40:29 crc 
kubenswrapper[4764]: W0309 13:40:29.263913 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5b86f9b8_6493_4a60_85b3_12057a6a8f65.slice/crio-e0c6dbc0703a920e008f15365138493c9f2dff91ea4203cb37d8d4b845814dc6 WatchSource:0}: Error finding container e0c6dbc0703a920e008f15365138493c9f2dff91ea4203cb37d8d4b845814dc6: Status 404 returned error can't find the container with id e0c6dbc0703a920e008f15365138493c9f2dff91ea4203cb37d8d4b845814dc6 Mar 09 13:40:29 crc kubenswrapper[4764]: I0309 13:40:29.318467 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-a166-account-create-update-kswwc"] Mar 09 13:40:29 crc kubenswrapper[4764]: W0309 13:40:29.326720 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podad7d32c2_ffe4_43d5_8640_6219f863bc2a.slice/crio-37c18bc0866de7085e3cc23a30681608a62529af59013b7f38c34f1a01bfb0af WatchSource:0}: Error finding container 37c18bc0866de7085e3cc23a30681608a62529af59013b7f38c34f1a01bfb0af: Status 404 returned error can't find the container with id 37c18bc0866de7085e3cc23a30681608a62529af59013b7f38c34f1a01bfb0af Mar 09 13:40:29 crc kubenswrapper[4764]: I0309 13:40:29.961382 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-rjx7v" event={"ID":"ea9388c2-526b-49ff-8f42-03ca66ae08dd","Type":"ContainerStarted","Data":"250d5ff9e06681bc763b0d0c4d24353835d78a5fc9756881a6c43044e79b8237"} Mar 09 13:40:29 crc kubenswrapper[4764]: I0309 13:40:29.963427 4764 generic.go:334] "Generic (PLEG): container finished" podID="ad7d32c2-ffe4-43d5-8640-6219f863bc2a" containerID="cf71be9d097dd827ce8f90129691408cafc7024d1662f6cee61aed9d868f11b5" exitCode=0 Mar 09 13:40:29 crc kubenswrapper[4764]: I0309 13:40:29.963545 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-a166-account-create-update-kswwc" 
event={"ID":"ad7d32c2-ffe4-43d5-8640-6219f863bc2a","Type":"ContainerDied","Data":"cf71be9d097dd827ce8f90129691408cafc7024d1662f6cee61aed9d868f11b5"} Mar 09 13:40:29 crc kubenswrapper[4764]: I0309 13:40:29.963753 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-a166-account-create-update-kswwc" event={"ID":"ad7d32c2-ffe4-43d5-8640-6219f863bc2a","Type":"ContainerStarted","Data":"37c18bc0866de7085e3cc23a30681608a62529af59013b7f38c34f1a01bfb0af"} Mar 09 13:40:29 crc kubenswrapper[4764]: I0309 13:40:29.966556 4764 generic.go:334] "Generic (PLEG): container finished" podID="5b86f9b8-6493-4a60-85b3-12057a6a8f65" containerID="63ff43316b127bbf8f65a169da3d3d1192d7c9af3a1493dff44991331dc6c723" exitCode=0 Mar 09 13:40:29 crc kubenswrapper[4764]: I0309 13:40:29.966627 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-6d6f-account-create-update-qxm8j" event={"ID":"5b86f9b8-6493-4a60-85b3-12057a6a8f65","Type":"ContainerDied","Data":"63ff43316b127bbf8f65a169da3d3d1192d7c9af3a1493dff44991331dc6c723"} Mar 09 13:40:29 crc kubenswrapper[4764]: I0309 13:40:29.966671 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-6d6f-account-create-update-qxm8j" event={"ID":"5b86f9b8-6493-4a60-85b3-12057a6a8f65","Type":"ContainerStarted","Data":"e0c6dbc0703a920e008f15365138493c9f2dff91ea4203cb37d8d4b845814dc6"} Mar 09 13:40:29 crc kubenswrapper[4764]: I0309 13:40:29.970768 4764 generic.go:334] "Generic (PLEG): container finished" podID="1bf72fda-56e5-427c-b2d0-8267613d8a9e" containerID="8799a3258f6d77e33d1068648c90a3371654d54435ab51ff0aa268e01baf2e9b" exitCode=0 Mar 09 13:40:29 crc kubenswrapper[4764]: I0309 13:40:29.970998 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-gkf9g" event={"ID":"1bf72fda-56e5-427c-b2d0-8267613d8a9e","Type":"ContainerDied","Data":"8799a3258f6d77e33d1068648c90a3371654d54435ab51ff0aa268e01baf2e9b"} Mar 09 13:40:30 crc kubenswrapper[4764]: I0309 
13:40:30.452117 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-thhcb" Mar 09 13:40:30 crc kubenswrapper[4764]: I0309 13:40:30.462256 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-mqv59" Mar 09 13:40:30 crc kubenswrapper[4764]: I0309 13:40:30.464970 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-15a9-account-create-update-5s8tj" Mar 09 13:40:30 crc kubenswrapper[4764]: I0309 13:40:30.605018 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/82410bc0-aa4c-450d-8fbc-67cfb9dd615b-operator-scripts\") pod \"82410bc0-aa4c-450d-8fbc-67cfb9dd615b\" (UID: \"82410bc0-aa4c-450d-8fbc-67cfb9dd615b\") " Mar 09 13:40:30 crc kubenswrapper[4764]: I0309 13:40:30.605264 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/642b5df5-dec0-47cc-9595-02b254277452-operator-scripts\") pod \"642b5df5-dec0-47cc-9595-02b254277452\" (UID: \"642b5df5-dec0-47cc-9595-02b254277452\") " Mar 09 13:40:30 crc kubenswrapper[4764]: I0309 13:40:30.605326 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6zqq7\" (UniqueName: \"kubernetes.io/projected/46124175-b282-444f-8d9c-0397e35cf8ae-kube-api-access-6zqq7\") pod \"46124175-b282-444f-8d9c-0397e35cf8ae\" (UID: \"46124175-b282-444f-8d9c-0397e35cf8ae\") " Mar 09 13:40:30 crc kubenswrapper[4764]: I0309 13:40:30.605423 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sbcrf\" (UniqueName: \"kubernetes.io/projected/82410bc0-aa4c-450d-8fbc-67cfb9dd615b-kube-api-access-sbcrf\") pod \"82410bc0-aa4c-450d-8fbc-67cfb9dd615b\" (UID: \"82410bc0-aa4c-450d-8fbc-67cfb9dd615b\") " Mar 09 13:40:30 crc 
kubenswrapper[4764]: I0309 13:40:30.605456 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ppxms\" (UniqueName: \"kubernetes.io/projected/642b5df5-dec0-47cc-9595-02b254277452-kube-api-access-ppxms\") pod \"642b5df5-dec0-47cc-9595-02b254277452\" (UID: \"642b5df5-dec0-47cc-9595-02b254277452\") " Mar 09 13:40:30 crc kubenswrapper[4764]: I0309 13:40:30.605506 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/46124175-b282-444f-8d9c-0397e35cf8ae-operator-scripts\") pod \"46124175-b282-444f-8d9c-0397e35cf8ae\" (UID: \"46124175-b282-444f-8d9c-0397e35cf8ae\") " Mar 09 13:40:30 crc kubenswrapper[4764]: I0309 13:40:30.606874 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/46124175-b282-444f-8d9c-0397e35cf8ae-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "46124175-b282-444f-8d9c-0397e35cf8ae" (UID: "46124175-b282-444f-8d9c-0397e35cf8ae"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:40:30 crc kubenswrapper[4764]: I0309 13:40:30.606942 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/642b5df5-dec0-47cc-9595-02b254277452-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "642b5df5-dec0-47cc-9595-02b254277452" (UID: "642b5df5-dec0-47cc-9595-02b254277452"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:40:30 crc kubenswrapper[4764]: I0309 13:40:30.607550 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/82410bc0-aa4c-450d-8fbc-67cfb9dd615b-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "82410bc0-aa4c-450d-8fbc-67cfb9dd615b" (UID: "82410bc0-aa4c-450d-8fbc-67cfb9dd615b"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:40:30 crc kubenswrapper[4764]: I0309 13:40:30.614579 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/46124175-b282-444f-8d9c-0397e35cf8ae-kube-api-access-6zqq7" (OuterVolumeSpecName: "kube-api-access-6zqq7") pod "46124175-b282-444f-8d9c-0397e35cf8ae" (UID: "46124175-b282-444f-8d9c-0397e35cf8ae"). InnerVolumeSpecName "kube-api-access-6zqq7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:40:30 crc kubenswrapper[4764]: I0309 13:40:30.615090 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/642b5df5-dec0-47cc-9595-02b254277452-kube-api-access-ppxms" (OuterVolumeSpecName: "kube-api-access-ppxms") pod "642b5df5-dec0-47cc-9595-02b254277452" (UID: "642b5df5-dec0-47cc-9595-02b254277452"). InnerVolumeSpecName "kube-api-access-ppxms". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:40:30 crc kubenswrapper[4764]: I0309 13:40:30.620768 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/82410bc0-aa4c-450d-8fbc-67cfb9dd615b-kube-api-access-sbcrf" (OuterVolumeSpecName: "kube-api-access-sbcrf") pod "82410bc0-aa4c-450d-8fbc-67cfb9dd615b" (UID: "82410bc0-aa4c-450d-8fbc-67cfb9dd615b"). InnerVolumeSpecName "kube-api-access-sbcrf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:40:30 crc kubenswrapper[4764]: I0309 13:40:30.710259 4764 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/642b5df5-dec0-47cc-9595-02b254277452-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 13:40:30 crc kubenswrapper[4764]: I0309 13:40:30.710296 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6zqq7\" (UniqueName: \"kubernetes.io/projected/46124175-b282-444f-8d9c-0397e35cf8ae-kube-api-access-6zqq7\") on node \"crc\" DevicePath \"\"" Mar 09 13:40:30 crc kubenswrapper[4764]: I0309 13:40:30.710308 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sbcrf\" (UniqueName: \"kubernetes.io/projected/82410bc0-aa4c-450d-8fbc-67cfb9dd615b-kube-api-access-sbcrf\") on node \"crc\" DevicePath \"\"" Mar 09 13:40:30 crc kubenswrapper[4764]: I0309 13:40:30.710317 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ppxms\" (UniqueName: \"kubernetes.io/projected/642b5df5-dec0-47cc-9595-02b254277452-kube-api-access-ppxms\") on node \"crc\" DevicePath \"\"" Mar 09 13:40:30 crc kubenswrapper[4764]: I0309 13:40:30.710330 4764 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/46124175-b282-444f-8d9c-0397e35cf8ae-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 13:40:30 crc kubenswrapper[4764]: I0309 13:40:30.710341 4764 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/82410bc0-aa4c-450d-8fbc-67cfb9dd615b-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 13:40:30 crc kubenswrapper[4764]: I0309 13:40:30.989403 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-thhcb" 
event={"ID":"642b5df5-dec0-47cc-9595-02b254277452","Type":"ContainerDied","Data":"d83672538bf04116631d7414d3375d19a02bf6520cb37838591c270b1cfadc5f"} Mar 09 13:40:30 crc kubenswrapper[4764]: I0309 13:40:30.991804 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d83672538bf04116631d7414d3375d19a02bf6520cb37838591c270b1cfadc5f" Mar 09 13:40:30 crc kubenswrapper[4764]: I0309 13:40:30.989421 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-thhcb" Mar 09 13:40:30 crc kubenswrapper[4764]: I0309 13:40:30.992682 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-15a9-account-create-update-5s8tj" event={"ID":"82410bc0-aa4c-450d-8fbc-67cfb9dd615b","Type":"ContainerDied","Data":"0b8142b3cc805480a67a46802a0365da053e4e9868c09964599a921ca210d192"} Mar 09 13:40:30 crc kubenswrapper[4764]: I0309 13:40:30.992715 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-15a9-account-create-update-5s8tj" Mar 09 13:40:30 crc kubenswrapper[4764]: I0309 13:40:30.992764 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0b8142b3cc805480a67a46802a0365da053e4e9868c09964599a921ca210d192" Mar 09 13:40:30 crc kubenswrapper[4764]: I0309 13:40:30.995686 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-mqv59" event={"ID":"46124175-b282-444f-8d9c-0397e35cf8ae","Type":"ContainerDied","Data":"8e28314c4efa3469246836e0ce638aaaf4d3b302e3f0249c8ebd090715366479"} Mar 09 13:40:30 crc kubenswrapper[4764]: I0309 13:40:30.995733 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8e28314c4efa3469246836e0ce638aaaf4d3b302e3f0249c8ebd090715366479" Mar 09 13:40:30 crc kubenswrapper[4764]: I0309 13:40:30.995754 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-mqv59" Mar 09 13:40:31 crc kubenswrapper[4764]: I0309 13:40:31.378509 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-gkf9g" Mar 09 13:40:31 crc kubenswrapper[4764]: I0309 13:40:31.386676 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-a166-account-create-update-kswwc" Mar 09 13:40:31 crc kubenswrapper[4764]: I0309 13:40:31.409324 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-6d6f-account-create-update-qxm8j" Mar 09 13:40:31 crc kubenswrapper[4764]: I0309 13:40:31.531451 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5b86f9b8-6493-4a60-85b3-12057a6a8f65-operator-scripts\") pod \"5b86f9b8-6493-4a60-85b3-12057a6a8f65\" (UID: \"5b86f9b8-6493-4a60-85b3-12057a6a8f65\") " Mar 09 13:40:31 crc kubenswrapper[4764]: I0309 13:40:31.531541 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wsb9v\" (UniqueName: \"kubernetes.io/projected/5b86f9b8-6493-4a60-85b3-12057a6a8f65-kube-api-access-wsb9v\") pod \"5b86f9b8-6493-4a60-85b3-12057a6a8f65\" (UID: \"5b86f9b8-6493-4a60-85b3-12057a6a8f65\") " Mar 09 13:40:31 crc kubenswrapper[4764]: I0309 13:40:31.531618 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6tdsb\" (UniqueName: \"kubernetes.io/projected/ad7d32c2-ffe4-43d5-8640-6219f863bc2a-kube-api-access-6tdsb\") pod \"ad7d32c2-ffe4-43d5-8640-6219f863bc2a\" (UID: \"ad7d32c2-ffe4-43d5-8640-6219f863bc2a\") " Mar 09 13:40:31 crc kubenswrapper[4764]: I0309 13:40:31.531866 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/1bf72fda-56e5-427c-b2d0-8267613d8a9e-operator-scripts\") pod \"1bf72fda-56e5-427c-b2d0-8267613d8a9e\" (UID: \"1bf72fda-56e5-427c-b2d0-8267613d8a9e\") " Mar 09 13:40:31 crc kubenswrapper[4764]: I0309 13:40:31.531913 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gw7zq\" (UniqueName: \"kubernetes.io/projected/1bf72fda-56e5-427c-b2d0-8267613d8a9e-kube-api-access-gw7zq\") pod \"1bf72fda-56e5-427c-b2d0-8267613d8a9e\" (UID: \"1bf72fda-56e5-427c-b2d0-8267613d8a9e\") " Mar 09 13:40:31 crc kubenswrapper[4764]: I0309 13:40:31.531997 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ad7d32c2-ffe4-43d5-8640-6219f863bc2a-operator-scripts\") pod \"ad7d32c2-ffe4-43d5-8640-6219f863bc2a\" (UID: \"ad7d32c2-ffe4-43d5-8640-6219f863bc2a\") " Mar 09 13:40:31 crc kubenswrapper[4764]: I0309 13:40:31.532583 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5b86f9b8-6493-4a60-85b3-12057a6a8f65-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "5b86f9b8-6493-4a60-85b3-12057a6a8f65" (UID: "5b86f9b8-6493-4a60-85b3-12057a6a8f65"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:40:31 crc kubenswrapper[4764]: I0309 13:40:31.532902 4764 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5b86f9b8-6493-4a60-85b3-12057a6a8f65-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 13:40:31 crc kubenswrapper[4764]: I0309 13:40:31.533005 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf72fda-56e5-427c-b2d0-8267613d8a9e-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "1bf72fda-56e5-427c-b2d0-8267613d8a9e" (UID: "1bf72fda-56e5-427c-b2d0-8267613d8a9e"). 
InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:40:31 crc kubenswrapper[4764]: I0309 13:40:31.533703 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ad7d32c2-ffe4-43d5-8640-6219f863bc2a-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "ad7d32c2-ffe4-43d5-8640-6219f863bc2a" (UID: "ad7d32c2-ffe4-43d5-8640-6219f863bc2a"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:40:31 crc kubenswrapper[4764]: I0309 13:40:31.541978 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ad7d32c2-ffe4-43d5-8640-6219f863bc2a-kube-api-access-6tdsb" (OuterVolumeSpecName: "kube-api-access-6tdsb") pod "ad7d32c2-ffe4-43d5-8640-6219f863bc2a" (UID: "ad7d32c2-ffe4-43d5-8640-6219f863bc2a"). InnerVolumeSpecName "kube-api-access-6tdsb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:40:31 crc kubenswrapper[4764]: I0309 13:40:31.545667 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf72fda-56e5-427c-b2d0-8267613d8a9e-kube-api-access-gw7zq" (OuterVolumeSpecName: "kube-api-access-gw7zq") pod "1bf72fda-56e5-427c-b2d0-8267613d8a9e" (UID: "1bf72fda-56e5-427c-b2d0-8267613d8a9e"). InnerVolumeSpecName "kube-api-access-gw7zq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:40:31 crc kubenswrapper[4764]: I0309 13:40:31.545868 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b86f9b8-6493-4a60-85b3-12057a6a8f65-kube-api-access-wsb9v" (OuterVolumeSpecName: "kube-api-access-wsb9v") pod "5b86f9b8-6493-4a60-85b3-12057a6a8f65" (UID: "5b86f9b8-6493-4a60-85b3-12057a6a8f65"). InnerVolumeSpecName "kube-api-access-wsb9v". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:40:31 crc kubenswrapper[4764]: I0309 13:40:31.634768 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6tdsb\" (UniqueName: \"kubernetes.io/projected/ad7d32c2-ffe4-43d5-8640-6219f863bc2a-kube-api-access-6tdsb\") on node \"crc\" DevicePath \"\"" Mar 09 13:40:31 crc kubenswrapper[4764]: I0309 13:40:31.634846 4764 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1bf72fda-56e5-427c-b2d0-8267613d8a9e-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 13:40:31 crc kubenswrapper[4764]: I0309 13:40:31.634906 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gw7zq\" (UniqueName: \"kubernetes.io/projected/1bf72fda-56e5-427c-b2d0-8267613d8a9e-kube-api-access-gw7zq\") on node \"crc\" DevicePath \"\"" Mar 09 13:40:31 crc kubenswrapper[4764]: I0309 13:40:31.634924 4764 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ad7d32c2-ffe4-43d5-8640-6219f863bc2a-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 13:40:31 crc kubenswrapper[4764]: I0309 13:40:31.634937 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wsb9v\" (UniqueName: \"kubernetes.io/projected/5b86f9b8-6493-4a60-85b3-12057a6a8f65-kube-api-access-wsb9v\") on node \"crc\" DevicePath \"\"" Mar 09 13:40:32 crc kubenswrapper[4764]: I0309 13:40:32.016696 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-a166-account-create-update-kswwc" Mar 09 13:40:32 crc kubenswrapper[4764]: I0309 13:40:32.016817 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-a166-account-create-update-kswwc" event={"ID":"ad7d32c2-ffe4-43d5-8640-6219f863bc2a","Type":"ContainerDied","Data":"37c18bc0866de7085e3cc23a30681608a62529af59013b7f38c34f1a01bfb0af"} Mar 09 13:40:32 crc kubenswrapper[4764]: I0309 13:40:32.016890 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="37c18bc0866de7085e3cc23a30681608a62529af59013b7f38c34f1a01bfb0af" Mar 09 13:40:32 crc kubenswrapper[4764]: I0309 13:40:32.026018 4764 generic.go:334] "Generic (PLEG): container finished" podID="29e20119-f7d3-4b10-82c3-afbfa462c831" containerID="06f1bf05b0fa436ae59020308ffb3c9a4a73f8d30f844f8ee3828c34f9dbc702" exitCode=0 Mar 09 13:40:32 crc kubenswrapper[4764]: I0309 13:40:32.026210 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-vzxr2" event={"ID":"29e20119-f7d3-4b10-82c3-afbfa462c831","Type":"ContainerDied","Data":"06f1bf05b0fa436ae59020308ffb3c9a4a73f8d30f844f8ee3828c34f9dbc702"} Mar 09 13:40:32 crc kubenswrapper[4764]: I0309 13:40:32.030748 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-6d6f-account-create-update-qxm8j" event={"ID":"5b86f9b8-6493-4a60-85b3-12057a6a8f65","Type":"ContainerDied","Data":"e0c6dbc0703a920e008f15365138493c9f2dff91ea4203cb37d8d4b845814dc6"} Mar 09 13:40:32 crc kubenswrapper[4764]: I0309 13:40:32.030787 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e0c6dbc0703a920e008f15365138493c9f2dff91ea4203cb37d8d4b845814dc6" Mar 09 13:40:32 crc kubenswrapper[4764]: I0309 13:40:32.030867 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-6d6f-account-create-update-qxm8j" Mar 09 13:40:32 crc kubenswrapper[4764]: I0309 13:40:32.033491 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-gkf9g" event={"ID":"1bf72fda-56e5-427c-b2d0-8267613d8a9e","Type":"ContainerDied","Data":"e51ce3016afe58e70de7013784d516c067ad727cd5eede5c2da838d45dc7748c"} Mar 09 13:40:32 crc kubenswrapper[4764]: I0309 13:40:32.033516 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e51ce3016afe58e70de7013784d516c067ad727cd5eede5c2da838d45dc7748c" Mar 09 13:40:32 crc kubenswrapper[4764]: I0309 13:40:32.033556 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-gkf9g" Mar 09 13:40:34 crc kubenswrapper[4764]: I0309 13:40:34.494238 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-vzxr2" Mar 09 13:40:34 crc kubenswrapper[4764]: I0309 13:40:34.596531 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/29e20119-f7d3-4b10-82c3-afbfa462c831-config-data\") pod \"29e20119-f7d3-4b10-82c3-afbfa462c831\" (UID: \"29e20119-f7d3-4b10-82c3-afbfa462c831\") " Mar 09 13:40:34 crc kubenswrapper[4764]: I0309 13:40:34.596652 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/29e20119-f7d3-4b10-82c3-afbfa462c831-db-sync-config-data\") pod \"29e20119-f7d3-4b10-82c3-afbfa462c831\" (UID: \"29e20119-f7d3-4b10-82c3-afbfa462c831\") " Mar 09 13:40:34 crc kubenswrapper[4764]: I0309 13:40:34.596691 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29e20119-f7d3-4b10-82c3-afbfa462c831-combined-ca-bundle\") pod \"29e20119-f7d3-4b10-82c3-afbfa462c831\" (UID: 
\"29e20119-f7d3-4b10-82c3-afbfa462c831\") " Mar 09 13:40:35 crc kubenswrapper[4764]: I0309 13:40:35.061086 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/29e20119-f7d3-4b10-82c3-afbfa462c831-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "29e20119-f7d3-4b10-82c3-afbfa462c831" (UID: "29e20119-f7d3-4b10-82c3-afbfa462c831"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:40:35 crc kubenswrapper[4764]: I0309 13:40:35.067187 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2pflc\" (UniqueName: \"kubernetes.io/projected/29e20119-f7d3-4b10-82c3-afbfa462c831-kube-api-access-2pflc\") pod \"29e20119-f7d3-4b10-82c3-afbfa462c831\" (UID: \"29e20119-f7d3-4b10-82c3-afbfa462c831\") " Mar 09 13:40:35 crc kubenswrapper[4764]: I0309 13:40:35.067590 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/29e20119-f7d3-4b10-82c3-afbfa462c831-db-sync-config-data\") pod \"29e20119-f7d3-4b10-82c3-afbfa462c831\" (UID: \"29e20119-f7d3-4b10-82c3-afbfa462c831\") " Mar 09 13:40:35 crc kubenswrapper[4764]: W0309 13:40:35.069085 4764 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/29e20119-f7d3-4b10-82c3-afbfa462c831/volumes/kubernetes.io~secret/db-sync-config-data Mar 09 13:40:35 crc kubenswrapper[4764]: I0309 13:40:35.069117 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/29e20119-f7d3-4b10-82c3-afbfa462c831-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "29e20119-f7d3-4b10-82c3-afbfa462c831" (UID: "29e20119-f7d3-4b10-82c3-afbfa462c831"). InnerVolumeSpecName "db-sync-config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:40:35 crc kubenswrapper[4764]: I0309 13:40:35.069493 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/29e20119-f7d3-4b10-82c3-afbfa462c831-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "29e20119-f7d3-4b10-82c3-afbfa462c831" (UID: "29e20119-f7d3-4b10-82c3-afbfa462c831"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:40:35 crc kubenswrapper[4764]: I0309 13:40:35.074461 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/29e20119-f7d3-4b10-82c3-afbfa462c831-kube-api-access-2pflc" (OuterVolumeSpecName: "kube-api-access-2pflc") pod "29e20119-f7d3-4b10-82c3-afbfa462c831" (UID: "29e20119-f7d3-4b10-82c3-afbfa462c831"). InnerVolumeSpecName "kube-api-access-2pflc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:40:35 crc kubenswrapper[4764]: I0309 13:40:35.077155 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2pflc\" (UniqueName: \"kubernetes.io/projected/29e20119-f7d3-4b10-82c3-afbfa462c831-kube-api-access-2pflc\") on node \"crc\" DevicePath \"\"" Mar 09 13:40:35 crc kubenswrapper[4764]: I0309 13:40:35.077198 4764 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/29e20119-f7d3-4b10-82c3-afbfa462c831-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Mar 09 13:40:35 crc kubenswrapper[4764]: I0309 13:40:35.077220 4764 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29e20119-f7d3-4b10-82c3-afbfa462c831-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 13:40:35 crc kubenswrapper[4764]: I0309 13:40:35.086952 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-vzxr2" 
event={"ID":"29e20119-f7d3-4b10-82c3-afbfa462c831","Type":"ContainerDied","Data":"bf52819fae166b95cbd52539da218f51f13c6084d42cea34252b0b03900eaee5"} Mar 09 13:40:35 crc kubenswrapper[4764]: I0309 13:40:35.087036 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bf52819fae166b95cbd52539da218f51f13c6084d42cea34252b0b03900eaee5" Mar 09 13:40:35 crc kubenswrapper[4764]: I0309 13:40:35.087554 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-vzxr2" Mar 09 13:40:35 crc kubenswrapper[4764]: I0309 13:40:35.093332 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/29e20119-f7d3-4b10-82c3-afbfa462c831-config-data" (OuterVolumeSpecName: "config-data") pod "29e20119-f7d3-4b10-82c3-afbfa462c831" (UID: "29e20119-f7d3-4b10-82c3-afbfa462c831"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:40:35 crc kubenswrapper[4764]: I0309 13:40:35.111991 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-rjx7v" podStartSLOduration=3.008077395 podStartE2EDuration="8.111971865s" podCreationTimestamp="2026-03-09 13:40:27 +0000 UTC" firstStartedPulling="2026-03-09 13:40:29.244579077 +0000 UTC m=+1184.494750985" lastFinishedPulling="2026-03-09 13:40:34.348473547 +0000 UTC m=+1189.598645455" observedRunningTime="2026-03-09 13:40:35.109653127 +0000 UTC m=+1190.359825055" watchObservedRunningTime="2026-03-09 13:40:35.111971865 +0000 UTC m=+1190.362143773" Mar 09 13:40:35 crc kubenswrapper[4764]: I0309 13:40:35.179471 4764 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/29e20119-f7d3-4b10-82c3-afbfa462c831-config-data\") on node \"crc\" DevicePath \"\"" Mar 09 13:40:35 crc kubenswrapper[4764]: I0309 13:40:35.922784 4764 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/dnsmasq-dns-554567b4f7-28h2r"] Mar 09 13:40:35 crc kubenswrapper[4764]: E0309 13:40:35.923269 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ad7d32c2-ffe4-43d5-8640-6219f863bc2a" containerName="mariadb-account-create-update" Mar 09 13:40:35 crc kubenswrapper[4764]: I0309 13:40:35.923288 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad7d32c2-ffe4-43d5-8640-6219f863bc2a" containerName="mariadb-account-create-update" Mar 09 13:40:35 crc kubenswrapper[4764]: E0309 13:40:35.923300 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="46124175-b282-444f-8d9c-0397e35cf8ae" containerName="mariadb-database-create" Mar 09 13:40:35 crc kubenswrapper[4764]: I0309 13:40:35.923308 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="46124175-b282-444f-8d9c-0397e35cf8ae" containerName="mariadb-database-create" Mar 09 13:40:35 crc kubenswrapper[4764]: E0309 13:40:35.923319 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="642b5df5-dec0-47cc-9595-02b254277452" containerName="mariadb-database-create" Mar 09 13:40:35 crc kubenswrapper[4764]: I0309 13:40:35.923329 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="642b5df5-dec0-47cc-9595-02b254277452" containerName="mariadb-database-create" Mar 09 13:40:35 crc kubenswrapper[4764]: E0309 13:40:35.923345 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="82410bc0-aa4c-450d-8fbc-67cfb9dd615b" containerName="mariadb-account-create-update" Mar 09 13:40:35 crc kubenswrapper[4764]: I0309 13:40:35.923352 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="82410bc0-aa4c-450d-8fbc-67cfb9dd615b" containerName="mariadb-account-create-update" Mar 09 13:40:35 crc kubenswrapper[4764]: E0309 13:40:35.923365 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5b86f9b8-6493-4a60-85b3-12057a6a8f65" containerName="mariadb-account-create-update" Mar 09 13:40:35 crc kubenswrapper[4764]: I0309 
13:40:35.923374 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="5b86f9b8-6493-4a60-85b3-12057a6a8f65" containerName="mariadb-account-create-update" Mar 09 13:40:35 crc kubenswrapper[4764]: E0309 13:40:35.923387 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="29e20119-f7d3-4b10-82c3-afbfa462c831" containerName="glance-db-sync" Mar 09 13:40:35 crc kubenswrapper[4764]: I0309 13:40:35.923395 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="29e20119-f7d3-4b10-82c3-afbfa462c831" containerName="glance-db-sync" Mar 09 13:40:35 crc kubenswrapper[4764]: E0309 13:40:35.923416 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1bf72fda-56e5-427c-b2d0-8267613d8a9e" containerName="mariadb-database-create" Mar 09 13:40:35 crc kubenswrapper[4764]: I0309 13:40:35.923434 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="1bf72fda-56e5-427c-b2d0-8267613d8a9e" containerName="mariadb-database-create" Mar 09 13:40:35 crc kubenswrapper[4764]: I0309 13:40:35.923585 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="1bf72fda-56e5-427c-b2d0-8267613d8a9e" containerName="mariadb-database-create" Mar 09 13:40:35 crc kubenswrapper[4764]: I0309 13:40:35.923596 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="5b86f9b8-6493-4a60-85b3-12057a6a8f65" containerName="mariadb-account-create-update" Mar 09 13:40:35 crc kubenswrapper[4764]: I0309 13:40:35.923606 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="29e20119-f7d3-4b10-82c3-afbfa462c831" containerName="glance-db-sync" Mar 09 13:40:35 crc kubenswrapper[4764]: I0309 13:40:35.923616 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="46124175-b282-444f-8d9c-0397e35cf8ae" containerName="mariadb-database-create" Mar 09 13:40:35 crc kubenswrapper[4764]: I0309 13:40:35.923627 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="82410bc0-aa4c-450d-8fbc-67cfb9dd615b" 
containerName="mariadb-account-create-update" Mar 09 13:40:35 crc kubenswrapper[4764]: I0309 13:40:35.923635 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="ad7d32c2-ffe4-43d5-8640-6219f863bc2a" containerName="mariadb-account-create-update" Mar 09 13:40:35 crc kubenswrapper[4764]: I0309 13:40:35.923668 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="642b5df5-dec0-47cc-9595-02b254277452" containerName="mariadb-database-create" Mar 09 13:40:35 crc kubenswrapper[4764]: I0309 13:40:35.924517 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-554567b4f7-28h2r" Mar 09 13:40:35 crc kubenswrapper[4764]: I0309 13:40:35.941570 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-554567b4f7-28h2r"] Mar 09 13:40:36 crc kubenswrapper[4764]: I0309 13:40:36.094044 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bb8bce39-6992-4785-a460-24d6def57630-ovsdbserver-nb\") pod \"dnsmasq-dns-554567b4f7-28h2r\" (UID: \"bb8bce39-6992-4785-a460-24d6def57630\") " pod="openstack/dnsmasq-dns-554567b4f7-28h2r" Mar 09 13:40:36 crc kubenswrapper[4764]: I0309 13:40:36.094423 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xwv6l\" (UniqueName: \"kubernetes.io/projected/bb8bce39-6992-4785-a460-24d6def57630-kube-api-access-xwv6l\") pod \"dnsmasq-dns-554567b4f7-28h2r\" (UID: \"bb8bce39-6992-4785-a460-24d6def57630\") " pod="openstack/dnsmasq-dns-554567b4f7-28h2r" Mar 09 13:40:36 crc kubenswrapper[4764]: I0309 13:40:36.094493 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bb8bce39-6992-4785-a460-24d6def57630-config\") pod \"dnsmasq-dns-554567b4f7-28h2r\" (UID: \"bb8bce39-6992-4785-a460-24d6def57630\") " 
pod="openstack/dnsmasq-dns-554567b4f7-28h2r" Mar 09 13:40:36 crc kubenswrapper[4764]: I0309 13:40:36.094517 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bb8bce39-6992-4785-a460-24d6def57630-dns-svc\") pod \"dnsmasq-dns-554567b4f7-28h2r\" (UID: \"bb8bce39-6992-4785-a460-24d6def57630\") " pod="openstack/dnsmasq-dns-554567b4f7-28h2r" Mar 09 13:40:36 crc kubenswrapper[4764]: I0309 13:40:36.094543 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bb8bce39-6992-4785-a460-24d6def57630-ovsdbserver-sb\") pod \"dnsmasq-dns-554567b4f7-28h2r\" (UID: \"bb8bce39-6992-4785-a460-24d6def57630\") " pod="openstack/dnsmasq-dns-554567b4f7-28h2r" Mar 09 13:40:36 crc kubenswrapper[4764]: I0309 13:40:36.100342 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-rjx7v" event={"ID":"ea9388c2-526b-49ff-8f42-03ca66ae08dd","Type":"ContainerStarted","Data":"bffbf528e6b05eccf8c0e302264ead1f4f679cc141d19e6aa19cb09ac29fba17"} Mar 09 13:40:36 crc kubenswrapper[4764]: I0309 13:40:36.197131 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bb8bce39-6992-4785-a460-24d6def57630-ovsdbserver-nb\") pod \"dnsmasq-dns-554567b4f7-28h2r\" (UID: \"bb8bce39-6992-4785-a460-24d6def57630\") " pod="openstack/dnsmasq-dns-554567b4f7-28h2r" Mar 09 13:40:36 crc kubenswrapper[4764]: I0309 13:40:36.197209 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xwv6l\" (UniqueName: \"kubernetes.io/projected/bb8bce39-6992-4785-a460-24d6def57630-kube-api-access-xwv6l\") pod \"dnsmasq-dns-554567b4f7-28h2r\" (UID: \"bb8bce39-6992-4785-a460-24d6def57630\") " pod="openstack/dnsmasq-dns-554567b4f7-28h2r" Mar 09 13:40:36 crc kubenswrapper[4764]: I0309 
13:40:36.197324 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bb8bce39-6992-4785-a460-24d6def57630-config\") pod \"dnsmasq-dns-554567b4f7-28h2r\" (UID: \"bb8bce39-6992-4785-a460-24d6def57630\") " pod="openstack/dnsmasq-dns-554567b4f7-28h2r" Mar 09 13:40:36 crc kubenswrapper[4764]: I0309 13:40:36.197351 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bb8bce39-6992-4785-a460-24d6def57630-dns-svc\") pod \"dnsmasq-dns-554567b4f7-28h2r\" (UID: \"bb8bce39-6992-4785-a460-24d6def57630\") " pod="openstack/dnsmasq-dns-554567b4f7-28h2r" Mar 09 13:40:36 crc kubenswrapper[4764]: I0309 13:40:36.197383 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bb8bce39-6992-4785-a460-24d6def57630-ovsdbserver-sb\") pod \"dnsmasq-dns-554567b4f7-28h2r\" (UID: \"bb8bce39-6992-4785-a460-24d6def57630\") " pod="openstack/dnsmasq-dns-554567b4f7-28h2r" Mar 09 13:40:36 crc kubenswrapper[4764]: I0309 13:40:36.198443 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bb8bce39-6992-4785-a460-24d6def57630-ovsdbserver-sb\") pod \"dnsmasq-dns-554567b4f7-28h2r\" (UID: \"bb8bce39-6992-4785-a460-24d6def57630\") " pod="openstack/dnsmasq-dns-554567b4f7-28h2r" Mar 09 13:40:36 crc kubenswrapper[4764]: I0309 13:40:36.199009 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bb8bce39-6992-4785-a460-24d6def57630-config\") pod \"dnsmasq-dns-554567b4f7-28h2r\" (UID: \"bb8bce39-6992-4785-a460-24d6def57630\") " pod="openstack/dnsmasq-dns-554567b4f7-28h2r" Mar 09 13:40:36 crc kubenswrapper[4764]: I0309 13:40:36.199476 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/bb8bce39-6992-4785-a460-24d6def57630-dns-svc\") pod \"dnsmasq-dns-554567b4f7-28h2r\" (UID: \"bb8bce39-6992-4785-a460-24d6def57630\") " pod="openstack/dnsmasq-dns-554567b4f7-28h2r" Mar 09 13:40:36 crc kubenswrapper[4764]: I0309 13:40:36.201674 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bb8bce39-6992-4785-a460-24d6def57630-ovsdbserver-nb\") pod \"dnsmasq-dns-554567b4f7-28h2r\" (UID: \"bb8bce39-6992-4785-a460-24d6def57630\") " pod="openstack/dnsmasq-dns-554567b4f7-28h2r" Mar 09 13:40:36 crc kubenswrapper[4764]: I0309 13:40:36.234725 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xwv6l\" (UniqueName: \"kubernetes.io/projected/bb8bce39-6992-4785-a460-24d6def57630-kube-api-access-xwv6l\") pod \"dnsmasq-dns-554567b4f7-28h2r\" (UID: \"bb8bce39-6992-4785-a460-24d6def57630\") " pod="openstack/dnsmasq-dns-554567b4f7-28h2r" Mar 09 13:40:36 crc kubenswrapper[4764]: I0309 13:40:36.246911 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-554567b4f7-28h2r" Mar 09 13:40:36 crc kubenswrapper[4764]: I0309 13:40:36.720900 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-554567b4f7-28h2r"] Mar 09 13:40:36 crc kubenswrapper[4764]: W0309 13:40:36.725326 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbb8bce39_6992_4785_a460_24d6def57630.slice/crio-e0c540986c124380f8b3c90bbd614819fd8fb0c94ebb67116f54a09ce532fcfa WatchSource:0}: Error finding container e0c540986c124380f8b3c90bbd614819fd8fb0c94ebb67116f54a09ce532fcfa: Status 404 returned error can't find the container with id e0c540986c124380f8b3c90bbd614819fd8fb0c94ebb67116f54a09ce532fcfa Mar 09 13:40:37 crc kubenswrapper[4764]: I0309 13:40:37.112180 4764 generic.go:334] "Generic (PLEG): container finished" podID="bb8bce39-6992-4785-a460-24d6def57630" containerID="0a568401ccdf9f344a39bc852ee338a2a7b840ec0f974c1b01a9b85e341b5181" exitCode=0 Mar 09 13:40:37 crc kubenswrapper[4764]: I0309 13:40:37.112296 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-554567b4f7-28h2r" event={"ID":"bb8bce39-6992-4785-a460-24d6def57630","Type":"ContainerDied","Data":"0a568401ccdf9f344a39bc852ee338a2a7b840ec0f974c1b01a9b85e341b5181"} Mar 09 13:40:37 crc kubenswrapper[4764]: I0309 13:40:37.114700 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-554567b4f7-28h2r" event={"ID":"bb8bce39-6992-4785-a460-24d6def57630","Type":"ContainerStarted","Data":"e0c540986c124380f8b3c90bbd614819fd8fb0c94ebb67116f54a09ce532fcfa"} Mar 09 13:40:38 crc kubenswrapper[4764]: I0309 13:40:38.126287 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-554567b4f7-28h2r" event={"ID":"bb8bce39-6992-4785-a460-24d6def57630","Type":"ContainerStarted","Data":"645a2f0f2f39fa20266b4afd9cec3444922a6377324ddf9bf9434ae55054aa8e"} Mar 09 13:40:38 crc 
kubenswrapper[4764]: I0309 13:40:38.126757 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-554567b4f7-28h2r" Mar 09 13:40:38 crc kubenswrapper[4764]: I0309 13:40:38.149186 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-554567b4f7-28h2r" podStartSLOduration=3.149154181 podStartE2EDuration="3.149154181s" podCreationTimestamp="2026-03-09 13:40:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:40:38.143612372 +0000 UTC m=+1193.393784280" watchObservedRunningTime="2026-03-09 13:40:38.149154181 +0000 UTC m=+1193.399326089" Mar 09 13:40:39 crc kubenswrapper[4764]: I0309 13:40:39.137399 4764 generic.go:334] "Generic (PLEG): container finished" podID="ea9388c2-526b-49ff-8f42-03ca66ae08dd" containerID="bffbf528e6b05eccf8c0e302264ead1f4f679cc141d19e6aa19cb09ac29fba17" exitCode=0 Mar 09 13:40:39 crc kubenswrapper[4764]: I0309 13:40:39.137534 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-rjx7v" event={"ID":"ea9388c2-526b-49ff-8f42-03ca66ae08dd","Type":"ContainerDied","Data":"bffbf528e6b05eccf8c0e302264ead1f4f679cc141d19e6aa19cb09ac29fba17"} Mar 09 13:40:40 crc kubenswrapper[4764]: I0309 13:40:40.499612 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-rjx7v" Mar 09 13:40:40 crc kubenswrapper[4764]: I0309 13:40:40.573531 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ea9388c2-526b-49ff-8f42-03ca66ae08dd-config-data\") pod \"ea9388c2-526b-49ff-8f42-03ca66ae08dd\" (UID: \"ea9388c2-526b-49ff-8f42-03ca66ae08dd\") " Mar 09 13:40:40 crc kubenswrapper[4764]: I0309 13:40:40.573692 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ea9388c2-526b-49ff-8f42-03ca66ae08dd-combined-ca-bundle\") pod \"ea9388c2-526b-49ff-8f42-03ca66ae08dd\" (UID: \"ea9388c2-526b-49ff-8f42-03ca66ae08dd\") " Mar 09 13:40:40 crc kubenswrapper[4764]: I0309 13:40:40.573727 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q8bhv\" (UniqueName: \"kubernetes.io/projected/ea9388c2-526b-49ff-8f42-03ca66ae08dd-kube-api-access-q8bhv\") pod \"ea9388c2-526b-49ff-8f42-03ca66ae08dd\" (UID: \"ea9388c2-526b-49ff-8f42-03ca66ae08dd\") " Mar 09 13:40:40 crc kubenswrapper[4764]: I0309 13:40:40.580329 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ea9388c2-526b-49ff-8f42-03ca66ae08dd-kube-api-access-q8bhv" (OuterVolumeSpecName: "kube-api-access-q8bhv") pod "ea9388c2-526b-49ff-8f42-03ca66ae08dd" (UID: "ea9388c2-526b-49ff-8f42-03ca66ae08dd"). InnerVolumeSpecName "kube-api-access-q8bhv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:40:40 crc kubenswrapper[4764]: I0309 13:40:40.604620 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ea9388c2-526b-49ff-8f42-03ca66ae08dd-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ea9388c2-526b-49ff-8f42-03ca66ae08dd" (UID: "ea9388c2-526b-49ff-8f42-03ca66ae08dd"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:40:40 crc kubenswrapper[4764]: I0309 13:40:40.617798 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ea9388c2-526b-49ff-8f42-03ca66ae08dd-config-data" (OuterVolumeSpecName: "config-data") pod "ea9388c2-526b-49ff-8f42-03ca66ae08dd" (UID: "ea9388c2-526b-49ff-8f42-03ca66ae08dd"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:40:40 crc kubenswrapper[4764]: I0309 13:40:40.681424 4764 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ea9388c2-526b-49ff-8f42-03ca66ae08dd-config-data\") on node \"crc\" DevicePath \"\"" Mar 09 13:40:40 crc kubenswrapper[4764]: I0309 13:40:40.681465 4764 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ea9388c2-526b-49ff-8f42-03ca66ae08dd-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 13:40:40 crc kubenswrapper[4764]: I0309 13:40:40.681481 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q8bhv\" (UniqueName: \"kubernetes.io/projected/ea9388c2-526b-49ff-8f42-03ca66ae08dd-kube-api-access-q8bhv\") on node \"crc\" DevicePath \"\"" Mar 09 13:40:41 crc kubenswrapper[4764]: I0309 13:40:41.155737 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-rjx7v" event={"ID":"ea9388c2-526b-49ff-8f42-03ca66ae08dd","Type":"ContainerDied","Data":"250d5ff9e06681bc763b0d0c4d24353835d78a5fc9756881a6c43044e79b8237"} Mar 09 13:40:41 crc kubenswrapper[4764]: I0309 13:40:41.155782 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="250d5ff9e06681bc763b0d0c4d24353835d78a5fc9756881a6c43044e79b8237" Mar 09 13:40:41 crc kubenswrapper[4764]: I0309 13:40:41.156319 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-rjx7v" Mar 09 13:40:41 crc kubenswrapper[4764]: I0309 13:40:41.408125 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-554567b4f7-28h2r"] Mar 09 13:40:41 crc kubenswrapper[4764]: I0309 13:40:41.408418 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-554567b4f7-28h2r" podUID="bb8bce39-6992-4785-a460-24d6def57630" containerName="dnsmasq-dns" containerID="cri-o://645a2f0f2f39fa20266b4afd9cec3444922a6377324ddf9bf9434ae55054aa8e" gracePeriod=10 Mar 09 13:40:41 crc kubenswrapper[4764]: I0309 13:40:41.455052 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-67795cd9-n77tv"] Mar 09 13:40:41 crc kubenswrapper[4764]: E0309 13:40:41.455434 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea9388c2-526b-49ff-8f42-03ca66ae08dd" containerName="keystone-db-sync" Mar 09 13:40:41 crc kubenswrapper[4764]: I0309 13:40:41.455450 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea9388c2-526b-49ff-8f42-03ca66ae08dd" containerName="keystone-db-sync" Mar 09 13:40:41 crc kubenswrapper[4764]: I0309 13:40:41.455619 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="ea9388c2-526b-49ff-8f42-03ca66ae08dd" containerName="keystone-db-sync" Mar 09 13:40:41 crc kubenswrapper[4764]: I0309 13:40:41.456504 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-67795cd9-n77tv" Mar 09 13:40:41 crc kubenswrapper[4764]: I0309 13:40:41.464022 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-mtsrr"] Mar 09 13:40:41 crc kubenswrapper[4764]: I0309 13:40:41.465237 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-mtsrr" Mar 09 13:40:41 crc kubenswrapper[4764]: I0309 13:40:41.476063 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Mar 09 13:40:41 crc kubenswrapper[4764]: I0309 13:40:41.476248 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Mar 09 13:40:41 crc kubenswrapper[4764]: I0309 13:40:41.476274 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Mar 09 13:40:41 crc kubenswrapper[4764]: I0309 13:40:41.476467 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Mar 09 13:40:41 crc kubenswrapper[4764]: I0309 13:40:41.476814 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-4hfql" Mar 09 13:40:41 crc kubenswrapper[4764]: I0309 13:40:41.495723 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-67795cd9-n77tv"] Mar 09 13:40:41 crc kubenswrapper[4764]: I0309 13:40:41.533716 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-mtsrr"] Mar 09 13:40:41 crc kubenswrapper[4764]: I0309 13:40:41.629947 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/be29c97d-4c8d-4e5b-9e12-07cdc4df5a62-credential-keys\") pod \"keystone-bootstrap-mtsrr\" (UID: \"be29c97d-4c8d-4e5b-9e12-07cdc4df5a62\") " pod="openstack/keystone-bootstrap-mtsrr" Mar 09 13:40:41 crc kubenswrapper[4764]: I0309 13:40:41.630545 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2a710e46-50b7-4069-b15c-ee3d19bc06e0-ovsdbserver-sb\") pod \"dnsmasq-dns-67795cd9-n77tv\" (UID: \"2a710e46-50b7-4069-b15c-ee3d19bc06e0\") " 
pod="openstack/dnsmasq-dns-67795cd9-n77tv" Mar 09 13:40:41 crc kubenswrapper[4764]: I0309 13:40:41.630848 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2a710e46-50b7-4069-b15c-ee3d19bc06e0-config\") pod \"dnsmasq-dns-67795cd9-n77tv\" (UID: \"2a710e46-50b7-4069-b15c-ee3d19bc06e0\") " pod="openstack/dnsmasq-dns-67795cd9-n77tv" Mar 09 13:40:41 crc kubenswrapper[4764]: I0309 13:40:41.630949 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2a710e46-50b7-4069-b15c-ee3d19bc06e0-dns-svc\") pod \"dnsmasq-dns-67795cd9-n77tv\" (UID: \"2a710e46-50b7-4069-b15c-ee3d19bc06e0\") " pod="openstack/dnsmasq-dns-67795cd9-n77tv" Mar 09 13:40:41 crc kubenswrapper[4764]: I0309 13:40:41.630987 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be29c97d-4c8d-4e5b-9e12-07cdc4df5a62-combined-ca-bundle\") pod \"keystone-bootstrap-mtsrr\" (UID: \"be29c97d-4c8d-4e5b-9e12-07cdc4df5a62\") " pod="openstack/keystone-bootstrap-mtsrr" Mar 09 13:40:41 crc kubenswrapper[4764]: I0309 13:40:41.631142 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/be29c97d-4c8d-4e5b-9e12-07cdc4df5a62-scripts\") pod \"keystone-bootstrap-mtsrr\" (UID: \"be29c97d-4c8d-4e5b-9e12-07cdc4df5a62\") " pod="openstack/keystone-bootstrap-mtsrr" Mar 09 13:40:41 crc kubenswrapper[4764]: I0309 13:40:41.631215 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2a710e46-50b7-4069-b15c-ee3d19bc06e0-ovsdbserver-nb\") pod \"dnsmasq-dns-67795cd9-n77tv\" (UID: \"2a710e46-50b7-4069-b15c-ee3d19bc06e0\") " 
pod="openstack/dnsmasq-dns-67795cd9-n77tv" Mar 09 13:40:41 crc kubenswrapper[4764]: I0309 13:40:41.631233 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t6nvs\" (UniqueName: \"kubernetes.io/projected/be29c97d-4c8d-4e5b-9e12-07cdc4df5a62-kube-api-access-t6nvs\") pod \"keystone-bootstrap-mtsrr\" (UID: \"be29c97d-4c8d-4e5b-9e12-07cdc4df5a62\") " pod="openstack/keystone-bootstrap-mtsrr" Mar 09 13:40:41 crc kubenswrapper[4764]: I0309 13:40:41.631353 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hs6wz\" (UniqueName: \"kubernetes.io/projected/2a710e46-50b7-4069-b15c-ee3d19bc06e0-kube-api-access-hs6wz\") pod \"dnsmasq-dns-67795cd9-n77tv\" (UID: \"2a710e46-50b7-4069-b15c-ee3d19bc06e0\") " pod="openstack/dnsmasq-dns-67795cd9-n77tv" Mar 09 13:40:41 crc kubenswrapper[4764]: I0309 13:40:41.631402 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/be29c97d-4c8d-4e5b-9e12-07cdc4df5a62-config-data\") pod \"keystone-bootstrap-mtsrr\" (UID: \"be29c97d-4c8d-4e5b-9e12-07cdc4df5a62\") " pod="openstack/keystone-bootstrap-mtsrr" Mar 09 13:40:41 crc kubenswrapper[4764]: I0309 13:40:41.631464 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/be29c97d-4c8d-4e5b-9e12-07cdc4df5a62-fernet-keys\") pod \"keystone-bootstrap-mtsrr\" (UID: \"be29c97d-4c8d-4e5b-9e12-07cdc4df5a62\") " pod="openstack/keystone-bootstrap-mtsrr" Mar 09 13:40:41 crc kubenswrapper[4764]: I0309 13:40:41.658698 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-dp5x6"] Mar 09 13:40:41 crc kubenswrapper[4764]: I0309 13:40:41.659920 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-dp5x6" Mar 09 13:40:41 crc kubenswrapper[4764]: I0309 13:40:41.669473 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Mar 09 13:40:41 crc kubenswrapper[4764]: I0309 13:40:41.672502 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-4vt4n" Mar 09 13:40:41 crc kubenswrapper[4764]: I0309 13:40:41.672536 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Mar 09 13:40:41 crc kubenswrapper[4764]: I0309 13:40:41.677247 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-dp5x6"] Mar 09 13:40:41 crc kubenswrapper[4764]: I0309 13:40:41.716609 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 09 13:40:41 crc kubenswrapper[4764]: I0309 13:40:41.724763 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 09 13:40:41 crc kubenswrapper[4764]: I0309 13:40:41.734004 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 09 13:40:41 crc kubenswrapper[4764]: I0309 13:40:41.734279 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 09 13:40:41 crc kubenswrapper[4764]: I0309 13:40:41.735342 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hs6wz\" (UniqueName: \"kubernetes.io/projected/2a710e46-50b7-4069-b15c-ee3d19bc06e0-kube-api-access-hs6wz\") pod \"dnsmasq-dns-67795cd9-n77tv\" (UID: \"2a710e46-50b7-4069-b15c-ee3d19bc06e0\") " pod="openstack/dnsmasq-dns-67795cd9-n77tv" Mar 09 13:40:41 crc kubenswrapper[4764]: I0309 13:40:41.735395 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/be29c97d-4c8d-4e5b-9e12-07cdc4df5a62-config-data\") pod \"keystone-bootstrap-mtsrr\" (UID: \"be29c97d-4c8d-4e5b-9e12-07cdc4df5a62\") " pod="openstack/keystone-bootstrap-mtsrr" Mar 09 13:40:41 crc kubenswrapper[4764]: I0309 13:40:41.735422 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/be29c97d-4c8d-4e5b-9e12-07cdc4df5a62-fernet-keys\") pod \"keystone-bootstrap-mtsrr\" (UID: \"be29c97d-4c8d-4e5b-9e12-07cdc4df5a62\") " pod="openstack/keystone-bootstrap-mtsrr" Mar 09 13:40:41 crc kubenswrapper[4764]: I0309 13:40:41.735454 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/be29c97d-4c8d-4e5b-9e12-07cdc4df5a62-credential-keys\") pod \"keystone-bootstrap-mtsrr\" (UID: \"be29c97d-4c8d-4e5b-9e12-07cdc4df5a62\") " pod="openstack/keystone-bootstrap-mtsrr" Mar 09 13:40:41 crc kubenswrapper[4764]: I0309 13:40:41.735473 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2a710e46-50b7-4069-b15c-ee3d19bc06e0-ovsdbserver-sb\") pod \"dnsmasq-dns-67795cd9-n77tv\" (UID: \"2a710e46-50b7-4069-b15c-ee3d19bc06e0\") " pod="openstack/dnsmasq-dns-67795cd9-n77tv" Mar 09 13:40:41 crc kubenswrapper[4764]: I0309 13:40:41.735492 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2a710e46-50b7-4069-b15c-ee3d19bc06e0-config\") pod \"dnsmasq-dns-67795cd9-n77tv\" (UID: \"2a710e46-50b7-4069-b15c-ee3d19bc06e0\") " pod="openstack/dnsmasq-dns-67795cd9-n77tv" Mar 09 13:40:41 crc kubenswrapper[4764]: I0309 13:40:41.735528 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2a710e46-50b7-4069-b15c-ee3d19bc06e0-dns-svc\") pod \"dnsmasq-dns-67795cd9-n77tv\" (UID: 
\"2a710e46-50b7-4069-b15c-ee3d19bc06e0\") " pod="openstack/dnsmasq-dns-67795cd9-n77tv" Mar 09 13:40:41 crc kubenswrapper[4764]: I0309 13:40:41.735549 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be29c97d-4c8d-4e5b-9e12-07cdc4df5a62-combined-ca-bundle\") pod \"keystone-bootstrap-mtsrr\" (UID: \"be29c97d-4c8d-4e5b-9e12-07cdc4df5a62\") " pod="openstack/keystone-bootstrap-mtsrr" Mar 09 13:40:41 crc kubenswrapper[4764]: I0309 13:40:41.735606 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/be29c97d-4c8d-4e5b-9e12-07cdc4df5a62-scripts\") pod \"keystone-bootstrap-mtsrr\" (UID: \"be29c97d-4c8d-4e5b-9e12-07cdc4df5a62\") " pod="openstack/keystone-bootstrap-mtsrr" Mar 09 13:40:41 crc kubenswrapper[4764]: I0309 13:40:41.735637 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2a710e46-50b7-4069-b15c-ee3d19bc06e0-ovsdbserver-nb\") pod \"dnsmasq-dns-67795cd9-n77tv\" (UID: \"2a710e46-50b7-4069-b15c-ee3d19bc06e0\") " pod="openstack/dnsmasq-dns-67795cd9-n77tv" Mar 09 13:40:41 crc kubenswrapper[4764]: I0309 13:40:41.735666 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t6nvs\" (UniqueName: \"kubernetes.io/projected/be29c97d-4c8d-4e5b-9e12-07cdc4df5a62-kube-api-access-t6nvs\") pod \"keystone-bootstrap-mtsrr\" (UID: \"be29c97d-4c8d-4e5b-9e12-07cdc4df5a62\") " pod="openstack/keystone-bootstrap-mtsrr" Mar 09 13:40:41 crc kubenswrapper[4764]: I0309 13:40:41.736975 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2a710e46-50b7-4069-b15c-ee3d19bc06e0-ovsdbserver-sb\") pod \"dnsmasq-dns-67795cd9-n77tv\" (UID: \"2a710e46-50b7-4069-b15c-ee3d19bc06e0\") " pod="openstack/dnsmasq-dns-67795cd9-n77tv" Mar 
09 13:40:41 crc kubenswrapper[4764]: I0309 13:40:41.738112 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2a710e46-50b7-4069-b15c-ee3d19bc06e0-config\") pod \"dnsmasq-dns-67795cd9-n77tv\" (UID: \"2a710e46-50b7-4069-b15c-ee3d19bc06e0\") " pod="openstack/dnsmasq-dns-67795cd9-n77tv" Mar 09 13:40:41 crc kubenswrapper[4764]: I0309 13:40:41.739308 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2a710e46-50b7-4069-b15c-ee3d19bc06e0-dns-svc\") pod \"dnsmasq-dns-67795cd9-n77tv\" (UID: \"2a710e46-50b7-4069-b15c-ee3d19bc06e0\") " pod="openstack/dnsmasq-dns-67795cd9-n77tv" Mar 09 13:40:41 crc kubenswrapper[4764]: I0309 13:40:41.739867 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2a710e46-50b7-4069-b15c-ee3d19bc06e0-ovsdbserver-nb\") pod \"dnsmasq-dns-67795cd9-n77tv\" (UID: \"2a710e46-50b7-4069-b15c-ee3d19bc06e0\") " pod="openstack/dnsmasq-dns-67795cd9-n77tv" Mar 09 13:40:41 crc kubenswrapper[4764]: I0309 13:40:41.751096 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/be29c97d-4c8d-4e5b-9e12-07cdc4df5a62-fernet-keys\") pod \"keystone-bootstrap-mtsrr\" (UID: \"be29c97d-4c8d-4e5b-9e12-07cdc4df5a62\") " pod="openstack/keystone-bootstrap-mtsrr" Mar 09 13:40:41 crc kubenswrapper[4764]: I0309 13:40:41.752258 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/be29c97d-4c8d-4e5b-9e12-07cdc4df5a62-scripts\") pod \"keystone-bootstrap-mtsrr\" (UID: \"be29c97d-4c8d-4e5b-9e12-07cdc4df5a62\") " pod="openstack/keystone-bootstrap-mtsrr" Mar 09 13:40:41 crc kubenswrapper[4764]: I0309 13:40:41.767264 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/be29c97d-4c8d-4e5b-9e12-07cdc4df5a62-config-data\") pod \"keystone-bootstrap-mtsrr\" (UID: \"be29c97d-4c8d-4e5b-9e12-07cdc4df5a62\") " pod="openstack/keystone-bootstrap-mtsrr" Mar 09 13:40:41 crc kubenswrapper[4764]: I0309 13:40:41.780095 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 09 13:40:41 crc kubenswrapper[4764]: I0309 13:40:41.783336 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/be29c97d-4c8d-4e5b-9e12-07cdc4df5a62-credential-keys\") pod \"keystone-bootstrap-mtsrr\" (UID: \"be29c97d-4c8d-4e5b-9e12-07cdc4df5a62\") " pod="openstack/keystone-bootstrap-mtsrr" Mar 09 13:40:41 crc kubenswrapper[4764]: I0309 13:40:41.788107 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be29c97d-4c8d-4e5b-9e12-07cdc4df5a62-combined-ca-bundle\") pod \"keystone-bootstrap-mtsrr\" (UID: \"be29c97d-4c8d-4e5b-9e12-07cdc4df5a62\") " pod="openstack/keystone-bootstrap-mtsrr" Mar 09 13:40:41 crc kubenswrapper[4764]: I0309 13:40:41.798415 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hs6wz\" (UniqueName: \"kubernetes.io/projected/2a710e46-50b7-4069-b15c-ee3d19bc06e0-kube-api-access-hs6wz\") pod \"dnsmasq-dns-67795cd9-n77tv\" (UID: \"2a710e46-50b7-4069-b15c-ee3d19bc06e0\") " pod="openstack/dnsmasq-dns-67795cd9-n77tv" Mar 09 13:40:41 crc kubenswrapper[4764]: I0309 13:40:41.802111 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-cmhtp"] Mar 09 13:40:41 crc kubenswrapper[4764]: I0309 13:40:41.803983 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-cmhtp" Mar 09 13:40:41 crc kubenswrapper[4764]: I0309 13:40:41.812533 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Mar 09 13:40:41 crc kubenswrapper[4764]: I0309 13:40:41.812784 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-fn2ft" Mar 09 13:40:41 crc kubenswrapper[4764]: I0309 13:40:41.812944 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Mar 09 13:40:41 crc kubenswrapper[4764]: I0309 13:40:41.827352 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-cmhtp"] Mar 09 13:40:41 crc kubenswrapper[4764]: I0309 13:40:41.828093 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t6nvs\" (UniqueName: \"kubernetes.io/projected/be29c97d-4c8d-4e5b-9e12-07cdc4df5a62-kube-api-access-t6nvs\") pod \"keystone-bootstrap-mtsrr\" (UID: \"be29c97d-4c8d-4e5b-9e12-07cdc4df5a62\") " pod="openstack/keystone-bootstrap-mtsrr" Mar 09 13:40:41 crc kubenswrapper[4764]: I0309 13:40:41.841552 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/74146b7d-9780-4d2d-9454-853296f88955-config-data\") pod \"cinder-db-sync-dp5x6\" (UID: \"74146b7d-9780-4d2d-9454-853296f88955\") " pod="openstack/cinder-db-sync-dp5x6" Mar 09 13:40:41 crc kubenswrapper[4764]: I0309 13:40:41.841706 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/861cdd7d-b563-4009-9c33-a5c64d6ffae9-scripts\") pod \"ceilometer-0\" (UID: \"861cdd7d-b563-4009-9c33-a5c64d6ffae9\") " pod="openstack/ceilometer-0" Mar 09 13:40:41 crc kubenswrapper[4764]: I0309 13:40:41.841754 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/861cdd7d-b563-4009-9c33-a5c64d6ffae9-log-httpd\") pod \"ceilometer-0\" (UID: \"861cdd7d-b563-4009-9c33-a5c64d6ffae9\") " pod="openstack/ceilometer-0" Mar 09 13:40:41 crc kubenswrapper[4764]: I0309 13:40:41.841772 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/861cdd7d-b563-4009-9c33-a5c64d6ffae9-run-httpd\") pod \"ceilometer-0\" (UID: \"861cdd7d-b563-4009-9c33-a5c64d6ffae9\") " pod="openstack/ceilometer-0" Mar 09 13:40:41 crc kubenswrapper[4764]: I0309 13:40:41.841799 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/861cdd7d-b563-4009-9c33-a5c64d6ffae9-config-data\") pod \"ceilometer-0\" (UID: \"861cdd7d-b563-4009-9c33-a5c64d6ffae9\") " pod="openstack/ceilometer-0" Mar 09 13:40:41 crc kubenswrapper[4764]: I0309 13:40:41.841820 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/861cdd7d-b563-4009-9c33-a5c64d6ffae9-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"861cdd7d-b563-4009-9c33-a5c64d6ffae9\") " pod="openstack/ceilometer-0" Mar 09 13:40:41 crc kubenswrapper[4764]: I0309 13:40:41.841842 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/74146b7d-9780-4d2d-9454-853296f88955-scripts\") pod \"cinder-db-sync-dp5x6\" (UID: \"74146b7d-9780-4d2d-9454-853296f88955\") " pod="openstack/cinder-db-sync-dp5x6" Mar 09 13:40:41 crc kubenswrapper[4764]: I0309 13:40:41.841874 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-khssq\" (UniqueName: 
\"kubernetes.io/projected/74146b7d-9780-4d2d-9454-853296f88955-kube-api-access-khssq\") pod \"cinder-db-sync-dp5x6\" (UID: \"74146b7d-9780-4d2d-9454-853296f88955\") " pod="openstack/cinder-db-sync-dp5x6" Mar 09 13:40:41 crc kubenswrapper[4764]: I0309 13:40:41.841911 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mj2kg\" (UniqueName: \"kubernetes.io/projected/861cdd7d-b563-4009-9c33-a5c64d6ffae9-kube-api-access-mj2kg\") pod \"ceilometer-0\" (UID: \"861cdd7d-b563-4009-9c33-a5c64d6ffae9\") " pod="openstack/ceilometer-0" Mar 09 13:40:41 crc kubenswrapper[4764]: I0309 13:40:41.841938 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/74146b7d-9780-4d2d-9454-853296f88955-etc-machine-id\") pod \"cinder-db-sync-dp5x6\" (UID: \"74146b7d-9780-4d2d-9454-853296f88955\") " pod="openstack/cinder-db-sync-dp5x6" Mar 09 13:40:41 crc kubenswrapper[4764]: I0309 13:40:41.841953 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/74146b7d-9780-4d2d-9454-853296f88955-db-sync-config-data\") pod \"cinder-db-sync-dp5x6\" (UID: \"74146b7d-9780-4d2d-9454-853296f88955\") " pod="openstack/cinder-db-sync-dp5x6" Mar 09 13:40:41 crc kubenswrapper[4764]: I0309 13:40:41.841969 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74146b7d-9780-4d2d-9454-853296f88955-combined-ca-bundle\") pod \"cinder-db-sync-dp5x6\" (UID: \"74146b7d-9780-4d2d-9454-853296f88955\") " pod="openstack/cinder-db-sync-dp5x6" Mar 09 13:40:41 crc kubenswrapper[4764]: I0309 13:40:41.842027 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/861cdd7d-b563-4009-9c33-a5c64d6ffae9-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"861cdd7d-b563-4009-9c33-a5c64d6ffae9\") " pod="openstack/ceilometer-0" Mar 09 13:40:41 crc kubenswrapper[4764]: I0309 13:40:41.874544 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-67795cd9-n77tv" Mar 09 13:40:41 crc kubenswrapper[4764]: I0309 13:40:41.902198 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-x9gvc"] Mar 09 13:40:41 crc kubenswrapper[4764]: I0309 13:40:41.904335 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-x9gvc" Mar 09 13:40:41 crc kubenswrapper[4764]: I0309 13:40:41.911572 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Mar 09 13:40:41 crc kubenswrapper[4764]: I0309 13:40:41.911809 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-j8zg7" Mar 09 13:40:41 crc kubenswrapper[4764]: I0309 13:40:41.920822 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-mtsrr" Mar 09 13:40:41 crc kubenswrapper[4764]: I0309 13:40:41.940749 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-x9gvc"] Mar 09 13:40:41 crc kubenswrapper[4764]: I0309 13:40:41.943769 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/861cdd7d-b563-4009-9c33-a5c64d6ffae9-scripts\") pod \"ceilometer-0\" (UID: \"861cdd7d-b563-4009-9c33-a5c64d6ffae9\") " pod="openstack/ceilometer-0" Mar 09 13:40:41 crc kubenswrapper[4764]: I0309 13:40:41.943893 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/34466abc-30eb-4a0c-b4ea-50b5ab368fa1-config\") pod \"neutron-db-sync-cmhtp\" (UID: \"34466abc-30eb-4a0c-b4ea-50b5ab368fa1\") " pod="openstack/neutron-db-sync-cmhtp" Mar 09 13:40:41 crc kubenswrapper[4764]: I0309 13:40:41.943937 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/861cdd7d-b563-4009-9c33-a5c64d6ffae9-log-httpd\") pod \"ceilometer-0\" (UID: \"861cdd7d-b563-4009-9c33-a5c64d6ffae9\") " pod="openstack/ceilometer-0" Mar 09 13:40:41 crc kubenswrapper[4764]: I0309 13:40:41.943954 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/861cdd7d-b563-4009-9c33-a5c64d6ffae9-run-httpd\") pod \"ceilometer-0\" (UID: \"861cdd7d-b563-4009-9c33-a5c64d6ffae9\") " pod="openstack/ceilometer-0" Mar 09 13:40:41 crc kubenswrapper[4764]: I0309 13:40:41.943978 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/861cdd7d-b563-4009-9c33-a5c64d6ffae9-config-data\") pod \"ceilometer-0\" (UID: \"861cdd7d-b563-4009-9c33-a5c64d6ffae9\") " pod="openstack/ceilometer-0" Mar 09 13:40:41 
crc kubenswrapper[4764]: I0309 13:40:41.943996 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/861cdd7d-b563-4009-9c33-a5c64d6ffae9-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"861cdd7d-b563-4009-9c33-a5c64d6ffae9\") " pod="openstack/ceilometer-0" Mar 09 13:40:41 crc kubenswrapper[4764]: I0309 13:40:41.944037 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/74146b7d-9780-4d2d-9454-853296f88955-scripts\") pod \"cinder-db-sync-dp5x6\" (UID: \"74146b7d-9780-4d2d-9454-853296f88955\") " pod="openstack/cinder-db-sync-dp5x6" Mar 09 13:40:41 crc kubenswrapper[4764]: I0309 13:40:41.944060 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-khssq\" (UniqueName: \"kubernetes.io/projected/74146b7d-9780-4d2d-9454-853296f88955-kube-api-access-khssq\") pod \"cinder-db-sync-dp5x6\" (UID: \"74146b7d-9780-4d2d-9454-853296f88955\") " pod="openstack/cinder-db-sync-dp5x6" Mar 09 13:40:41 crc kubenswrapper[4764]: I0309 13:40:41.944094 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mj2kg\" (UniqueName: \"kubernetes.io/projected/861cdd7d-b563-4009-9c33-a5c64d6ffae9-kube-api-access-mj2kg\") pod \"ceilometer-0\" (UID: \"861cdd7d-b563-4009-9c33-a5c64d6ffae9\") " pod="openstack/ceilometer-0" Mar 09 13:40:41 crc kubenswrapper[4764]: I0309 13:40:41.944118 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/74146b7d-9780-4d2d-9454-853296f88955-etc-machine-id\") pod \"cinder-db-sync-dp5x6\" (UID: \"74146b7d-9780-4d2d-9454-853296f88955\") " pod="openstack/cinder-db-sync-dp5x6" Mar 09 13:40:41 crc kubenswrapper[4764]: I0309 13:40:41.944146 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/74146b7d-9780-4d2d-9454-853296f88955-db-sync-config-data\") pod \"cinder-db-sync-dp5x6\" (UID: \"74146b7d-9780-4d2d-9454-853296f88955\") " pod="openstack/cinder-db-sync-dp5x6" Mar 09 13:40:41 crc kubenswrapper[4764]: I0309 13:40:41.944162 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74146b7d-9780-4d2d-9454-853296f88955-combined-ca-bundle\") pod \"cinder-db-sync-dp5x6\" (UID: \"74146b7d-9780-4d2d-9454-853296f88955\") " pod="openstack/cinder-db-sync-dp5x6" Mar 09 13:40:41 crc kubenswrapper[4764]: I0309 13:40:41.944203 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34466abc-30eb-4a0c-b4ea-50b5ab368fa1-combined-ca-bundle\") pod \"neutron-db-sync-cmhtp\" (UID: \"34466abc-30eb-4a0c-b4ea-50b5ab368fa1\") " pod="openstack/neutron-db-sync-cmhtp" Mar 09 13:40:41 crc kubenswrapper[4764]: I0309 13:40:41.944224 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/861cdd7d-b563-4009-9c33-a5c64d6ffae9-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"861cdd7d-b563-4009-9c33-a5c64d6ffae9\") " pod="openstack/ceilometer-0" Mar 09 13:40:41 crc kubenswrapper[4764]: I0309 13:40:41.944242 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8lj9f\" (UniqueName: \"kubernetes.io/projected/34466abc-30eb-4a0c-b4ea-50b5ab368fa1-kube-api-access-8lj9f\") pod \"neutron-db-sync-cmhtp\" (UID: \"34466abc-30eb-4a0c-b4ea-50b5ab368fa1\") " pod="openstack/neutron-db-sync-cmhtp" Mar 09 13:40:41 crc kubenswrapper[4764]: I0309 13:40:41.944264 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/74146b7d-9780-4d2d-9454-853296f88955-config-data\") pod \"cinder-db-sync-dp5x6\" (UID: \"74146b7d-9780-4d2d-9454-853296f88955\") " pod="openstack/cinder-db-sync-dp5x6" Mar 09 13:40:41 crc kubenswrapper[4764]: I0309 13:40:41.946300 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/861cdd7d-b563-4009-9c33-a5c64d6ffae9-run-httpd\") pod \"ceilometer-0\" (UID: \"861cdd7d-b563-4009-9c33-a5c64d6ffae9\") " pod="openstack/ceilometer-0" Mar 09 13:40:41 crc kubenswrapper[4764]: I0309 13:40:41.946753 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/861cdd7d-b563-4009-9c33-a5c64d6ffae9-log-httpd\") pod \"ceilometer-0\" (UID: \"861cdd7d-b563-4009-9c33-a5c64d6ffae9\") " pod="openstack/ceilometer-0" Mar 09 13:40:41 crc kubenswrapper[4764]: I0309 13:40:41.950440 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-67795cd9-n77tv"] Mar 09 13:40:41 crc kubenswrapper[4764]: I0309 13:40:41.950545 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/74146b7d-9780-4d2d-9454-853296f88955-etc-machine-id\") pod \"cinder-db-sync-dp5x6\" (UID: \"74146b7d-9780-4d2d-9454-853296f88955\") " pod="openstack/cinder-db-sync-dp5x6" Mar 09 13:40:41 crc kubenswrapper[4764]: I0309 13:40:41.952109 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/861cdd7d-b563-4009-9c33-a5c64d6ffae9-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"861cdd7d-b563-4009-9c33-a5c64d6ffae9\") " pod="openstack/ceilometer-0" Mar 09 13:40:41 crc kubenswrapper[4764]: I0309 13:40:41.960450 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/74146b7d-9780-4d2d-9454-853296f88955-combined-ca-bundle\") pod \"cinder-db-sync-dp5x6\" (UID: \"74146b7d-9780-4d2d-9454-853296f88955\") " pod="openstack/cinder-db-sync-dp5x6" Mar 09 13:40:41 crc kubenswrapper[4764]: I0309 13:40:41.960847 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/74146b7d-9780-4d2d-9454-853296f88955-db-sync-config-data\") pod \"cinder-db-sync-dp5x6\" (UID: \"74146b7d-9780-4d2d-9454-853296f88955\") " pod="openstack/cinder-db-sync-dp5x6" Mar 09 13:40:41 crc kubenswrapper[4764]: I0309 13:40:41.969035 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/74146b7d-9780-4d2d-9454-853296f88955-config-data\") pod \"cinder-db-sync-dp5x6\" (UID: \"74146b7d-9780-4d2d-9454-853296f88955\") " pod="openstack/cinder-db-sync-dp5x6" Mar 09 13:40:41 crc kubenswrapper[4764]: I0309 13:40:41.973573 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/861cdd7d-b563-4009-9c33-a5c64d6ffae9-scripts\") pod \"ceilometer-0\" (UID: \"861cdd7d-b563-4009-9c33-a5c64d6ffae9\") " pod="openstack/ceilometer-0" Mar 09 13:40:41 crc kubenswrapper[4764]: I0309 13:40:41.976220 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/861cdd7d-b563-4009-9c33-a5c64d6ffae9-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"861cdd7d-b563-4009-9c33-a5c64d6ffae9\") " pod="openstack/ceilometer-0" Mar 09 13:40:41 crc kubenswrapper[4764]: I0309 13:40:41.980753 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/861cdd7d-b563-4009-9c33-a5c64d6ffae9-config-data\") pod \"ceilometer-0\" (UID: \"861cdd7d-b563-4009-9c33-a5c64d6ffae9\") " pod="openstack/ceilometer-0" Mar 09 13:40:41 crc kubenswrapper[4764]: I0309 
13:40:41.986211 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/74146b7d-9780-4d2d-9454-853296f88955-scripts\") pod \"cinder-db-sync-dp5x6\" (UID: \"74146b7d-9780-4d2d-9454-853296f88955\") " pod="openstack/cinder-db-sync-dp5x6" Mar 09 13:40:42 crc kubenswrapper[4764]: I0309 13:40:42.009479 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-khssq\" (UniqueName: \"kubernetes.io/projected/74146b7d-9780-4d2d-9454-853296f88955-kube-api-access-khssq\") pod \"cinder-db-sync-dp5x6\" (UID: \"74146b7d-9780-4d2d-9454-853296f88955\") " pod="openstack/cinder-db-sync-dp5x6" Mar 09 13:40:42 crc kubenswrapper[4764]: I0309 13:40:42.013625 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mj2kg\" (UniqueName: \"kubernetes.io/projected/861cdd7d-b563-4009-9c33-a5c64d6ffae9-kube-api-access-mj2kg\") pod \"ceilometer-0\" (UID: \"861cdd7d-b563-4009-9c33-a5c64d6ffae9\") " pod="openstack/ceilometer-0" Mar 09 13:40:42 crc kubenswrapper[4764]: I0309 13:40:42.020904 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5b6dbdb6f5-47q58"] Mar 09 13:40:42 crc kubenswrapper[4764]: I0309 13:40:42.029342 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5b6dbdb6f5-47q58" Mar 09 13:40:42 crc kubenswrapper[4764]: I0309 13:40:42.039759 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-bnpcj"] Mar 09 13:40:42 crc kubenswrapper[4764]: I0309 13:40:42.042469 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-bnpcj" Mar 09 13:40:42 crc kubenswrapper[4764]: I0309 13:40:42.045918 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7jvrd\" (UniqueName: \"kubernetes.io/projected/cb54f57d-afb6-4e53-be9a-4b22573a9450-kube-api-access-7jvrd\") pod \"barbican-db-sync-x9gvc\" (UID: \"cb54f57d-afb6-4e53-be9a-4b22573a9450\") " pod="openstack/barbican-db-sync-x9gvc" Mar 09 13:40:42 crc kubenswrapper[4764]: I0309 13:40:42.045983 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34466abc-30eb-4a0c-b4ea-50b5ab368fa1-combined-ca-bundle\") pod \"neutron-db-sync-cmhtp\" (UID: \"34466abc-30eb-4a0c-b4ea-50b5ab368fa1\") " pod="openstack/neutron-db-sync-cmhtp" Mar 09 13:40:42 crc kubenswrapper[4764]: I0309 13:40:42.046008 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8lj9f\" (UniqueName: \"kubernetes.io/projected/34466abc-30eb-4a0c-b4ea-50b5ab368fa1-kube-api-access-8lj9f\") pod \"neutron-db-sync-cmhtp\" (UID: \"34466abc-30eb-4a0c-b4ea-50b5ab368fa1\") " pod="openstack/neutron-db-sync-cmhtp" Mar 09 13:40:42 crc kubenswrapper[4764]: I0309 13:40:42.046038 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/cb54f57d-afb6-4e53-be9a-4b22573a9450-db-sync-config-data\") pod \"barbican-db-sync-x9gvc\" (UID: \"cb54f57d-afb6-4e53-be9a-4b22573a9450\") " pod="openstack/barbican-db-sync-x9gvc" Mar 09 13:40:42 crc kubenswrapper[4764]: I0309 13:40:42.046086 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb54f57d-afb6-4e53-be9a-4b22573a9450-combined-ca-bundle\") pod \"barbican-db-sync-x9gvc\" (UID: 
\"cb54f57d-afb6-4e53-be9a-4b22573a9450\") " pod="openstack/barbican-db-sync-x9gvc" Mar 09 13:40:42 crc kubenswrapper[4764]: I0309 13:40:42.046120 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/34466abc-30eb-4a0c-b4ea-50b5ab368fa1-config\") pod \"neutron-db-sync-cmhtp\" (UID: \"34466abc-30eb-4a0c-b4ea-50b5ab368fa1\") " pod="openstack/neutron-db-sync-cmhtp" Mar 09 13:40:42 crc kubenswrapper[4764]: I0309 13:40:42.048104 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Mar 09 13:40:42 crc kubenswrapper[4764]: I0309 13:40:42.048276 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Mar 09 13:40:42 crc kubenswrapper[4764]: I0309 13:40:42.048443 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-l4tqv" Mar 09 13:40:42 crc kubenswrapper[4764]: I0309 13:40:42.050332 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34466abc-30eb-4a0c-b4ea-50b5ab368fa1-combined-ca-bundle\") pod \"neutron-db-sync-cmhtp\" (UID: \"34466abc-30eb-4a0c-b4ea-50b5ab368fa1\") " pod="openstack/neutron-db-sync-cmhtp" Mar 09 13:40:42 crc kubenswrapper[4764]: I0309 13:40:42.052599 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/34466abc-30eb-4a0c-b4ea-50b5ab368fa1-config\") pod \"neutron-db-sync-cmhtp\" (UID: \"34466abc-30eb-4a0c-b4ea-50b5ab368fa1\") " pod="openstack/neutron-db-sync-cmhtp" Mar 09 13:40:42 crc kubenswrapper[4764]: I0309 13:40:42.061202 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5b6dbdb6f5-47q58"] Mar 09 13:40:42 crc kubenswrapper[4764]: I0309 13:40:42.083671 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8lj9f\" 
(UniqueName: \"kubernetes.io/projected/34466abc-30eb-4a0c-b4ea-50b5ab368fa1-kube-api-access-8lj9f\") pod \"neutron-db-sync-cmhtp\" (UID: \"34466abc-30eb-4a0c-b4ea-50b5ab368fa1\") " pod="openstack/neutron-db-sync-cmhtp" Mar 09 13:40:42 crc kubenswrapper[4764]: I0309 13:40:42.085604 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-bnpcj"] Mar 09 13:40:42 crc kubenswrapper[4764]: I0309 13:40:42.126522 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-554567b4f7-28h2r" Mar 09 13:40:42 crc kubenswrapper[4764]: I0309 13:40:42.149103 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1004910c-0db4-4e3d-aac5-358a557ee268-scripts\") pod \"placement-db-sync-bnpcj\" (UID: \"1004910c-0db4-4e3d-aac5-358a557ee268\") " pod="openstack/placement-db-sync-bnpcj" Mar 09 13:40:42 crc kubenswrapper[4764]: I0309 13:40:42.149177 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1004910c-0db4-4e3d-aac5-358a557ee268-combined-ca-bundle\") pod \"placement-db-sync-bnpcj\" (UID: \"1004910c-0db4-4e3d-aac5-358a557ee268\") " pod="openstack/placement-db-sync-bnpcj" Mar 09 13:40:42 crc kubenswrapper[4764]: I0309 13:40:42.149217 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7jvrd\" (UniqueName: \"kubernetes.io/projected/cb54f57d-afb6-4e53-be9a-4b22573a9450-kube-api-access-7jvrd\") pod \"barbican-db-sync-x9gvc\" (UID: \"cb54f57d-afb6-4e53-be9a-4b22573a9450\") " pod="openstack/barbican-db-sync-x9gvc" Mar 09 13:40:42 crc kubenswrapper[4764]: I0309 13:40:42.149257 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/f99ecd3e-1d5b-4f3d-80d7-9a401171fef1-ovsdbserver-nb\") pod \"dnsmasq-dns-5b6dbdb6f5-47q58\" (UID: \"f99ecd3e-1d5b-4f3d-80d7-9a401171fef1\") " pod="openstack/dnsmasq-dns-5b6dbdb6f5-47q58" Mar 09 13:40:42 crc kubenswrapper[4764]: I0309 13:40:42.149297 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1004910c-0db4-4e3d-aac5-358a557ee268-config-data\") pod \"placement-db-sync-bnpcj\" (UID: \"1004910c-0db4-4e3d-aac5-358a557ee268\") " pod="openstack/placement-db-sync-bnpcj" Mar 09 13:40:42 crc kubenswrapper[4764]: I0309 13:40:42.149323 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f99ecd3e-1d5b-4f3d-80d7-9a401171fef1-ovsdbserver-sb\") pod \"dnsmasq-dns-5b6dbdb6f5-47q58\" (UID: \"f99ecd3e-1d5b-4f3d-80d7-9a401171fef1\") " pod="openstack/dnsmasq-dns-5b6dbdb6f5-47q58" Mar 09 13:40:42 crc kubenswrapper[4764]: I0309 13:40:42.149347 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/cb54f57d-afb6-4e53-be9a-4b22573a9450-db-sync-config-data\") pod \"barbican-db-sync-x9gvc\" (UID: \"cb54f57d-afb6-4e53-be9a-4b22573a9450\") " pod="openstack/barbican-db-sync-x9gvc" Mar 09 13:40:42 crc kubenswrapper[4764]: I0309 13:40:42.159913 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb54f57d-afb6-4e53-be9a-4b22573a9450-combined-ca-bundle\") pod \"barbican-db-sync-x9gvc\" (UID: \"cb54f57d-afb6-4e53-be9a-4b22573a9450\") " pod="openstack/barbican-db-sync-x9gvc" Mar 09 13:40:42 crc kubenswrapper[4764]: I0309 13:40:42.159997 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/f99ecd3e-1d5b-4f3d-80d7-9a401171fef1-config\") pod \"dnsmasq-dns-5b6dbdb6f5-47q58\" (UID: \"f99ecd3e-1d5b-4f3d-80d7-9a401171fef1\") " pod="openstack/dnsmasq-dns-5b6dbdb6f5-47q58" Mar 09 13:40:42 crc kubenswrapper[4764]: I0309 13:40:42.160069 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1004910c-0db4-4e3d-aac5-358a557ee268-logs\") pod \"placement-db-sync-bnpcj\" (UID: \"1004910c-0db4-4e3d-aac5-358a557ee268\") " pod="openstack/placement-db-sync-bnpcj" Mar 09 13:40:42 crc kubenswrapper[4764]: I0309 13:40:42.160096 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v9qxl\" (UniqueName: \"kubernetes.io/projected/1004910c-0db4-4e3d-aac5-358a557ee268-kube-api-access-v9qxl\") pod \"placement-db-sync-bnpcj\" (UID: \"1004910c-0db4-4e3d-aac5-358a557ee268\") " pod="openstack/placement-db-sync-bnpcj" Mar 09 13:40:42 crc kubenswrapper[4764]: I0309 13:40:42.160144 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f99ecd3e-1d5b-4f3d-80d7-9a401171fef1-dns-svc\") pod \"dnsmasq-dns-5b6dbdb6f5-47q58\" (UID: \"f99ecd3e-1d5b-4f3d-80d7-9a401171fef1\") " pod="openstack/dnsmasq-dns-5b6dbdb6f5-47q58" Mar 09 13:40:42 crc kubenswrapper[4764]: I0309 13:40:42.160160 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p4km7\" (UniqueName: \"kubernetes.io/projected/f99ecd3e-1d5b-4f3d-80d7-9a401171fef1-kube-api-access-p4km7\") pod \"dnsmasq-dns-5b6dbdb6f5-47q58\" (UID: \"f99ecd3e-1d5b-4f3d-80d7-9a401171fef1\") " pod="openstack/dnsmasq-dns-5b6dbdb6f5-47q58" Mar 09 13:40:42 crc kubenswrapper[4764]: I0309 13:40:42.166037 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: 
\"kubernetes.io/secret/cb54f57d-afb6-4e53-be9a-4b22573a9450-db-sync-config-data\") pod \"barbican-db-sync-x9gvc\" (UID: \"cb54f57d-afb6-4e53-be9a-4b22573a9450\") " pod="openstack/barbican-db-sync-x9gvc" Mar 09 13:40:42 crc kubenswrapper[4764]: I0309 13:40:42.169545 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb54f57d-afb6-4e53-be9a-4b22573a9450-combined-ca-bundle\") pod \"barbican-db-sync-x9gvc\" (UID: \"cb54f57d-afb6-4e53-be9a-4b22573a9450\") " pod="openstack/barbican-db-sync-x9gvc" Mar 09 13:40:42 crc kubenswrapper[4764]: I0309 13:40:42.180574 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7jvrd\" (UniqueName: \"kubernetes.io/projected/cb54f57d-afb6-4e53-be9a-4b22573a9450-kube-api-access-7jvrd\") pod \"barbican-db-sync-x9gvc\" (UID: \"cb54f57d-afb6-4e53-be9a-4b22573a9450\") " pod="openstack/barbican-db-sync-x9gvc" Mar 09 13:40:42 crc kubenswrapper[4764]: I0309 13:40:42.187258 4764 generic.go:334] "Generic (PLEG): container finished" podID="bb8bce39-6992-4785-a460-24d6def57630" containerID="645a2f0f2f39fa20266b4afd9cec3444922a6377324ddf9bf9434ae55054aa8e" exitCode=0 Mar 09 13:40:42 crc kubenswrapper[4764]: I0309 13:40:42.187326 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-554567b4f7-28h2r" event={"ID":"bb8bce39-6992-4785-a460-24d6def57630","Type":"ContainerDied","Data":"645a2f0f2f39fa20266b4afd9cec3444922a6377324ddf9bf9434ae55054aa8e"} Mar 09 13:40:42 crc kubenswrapper[4764]: I0309 13:40:42.187357 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-554567b4f7-28h2r" event={"ID":"bb8bce39-6992-4785-a460-24d6def57630","Type":"ContainerDied","Data":"e0c540986c124380f8b3c90bbd614819fd8fb0c94ebb67116f54a09ce532fcfa"} Mar 09 13:40:42 crc kubenswrapper[4764]: I0309 13:40:42.187376 4764 scope.go:117] "RemoveContainer" 
containerID="645a2f0f2f39fa20266b4afd9cec3444922a6377324ddf9bf9434ae55054aa8e" Mar 09 13:40:42 crc kubenswrapper[4764]: I0309 13:40:42.187590 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-554567b4f7-28h2r" Mar 09 13:40:42 crc kubenswrapper[4764]: I0309 13:40:42.188699 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 09 13:40:42 crc kubenswrapper[4764]: I0309 13:40:42.209030 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-cmhtp" Mar 09 13:40:42 crc kubenswrapper[4764]: I0309 13:40:42.243692 4764 scope.go:117] "RemoveContainer" containerID="0a568401ccdf9f344a39bc852ee338a2a7b840ec0f974c1b01a9b85e341b5181" Mar 09 13:40:42 crc kubenswrapper[4764]: I0309 13:40:42.264685 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bb8bce39-6992-4785-a460-24d6def57630-ovsdbserver-nb\") pod \"bb8bce39-6992-4785-a460-24d6def57630\" (UID: \"bb8bce39-6992-4785-a460-24d6def57630\") " Mar 09 13:40:42 crc kubenswrapper[4764]: I0309 13:40:42.264785 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xwv6l\" (UniqueName: \"kubernetes.io/projected/bb8bce39-6992-4785-a460-24d6def57630-kube-api-access-xwv6l\") pod \"bb8bce39-6992-4785-a460-24d6def57630\" (UID: \"bb8bce39-6992-4785-a460-24d6def57630\") " Mar 09 13:40:42 crc kubenswrapper[4764]: I0309 13:40:42.264878 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bb8bce39-6992-4785-a460-24d6def57630-ovsdbserver-sb\") pod \"bb8bce39-6992-4785-a460-24d6def57630\" (UID: \"bb8bce39-6992-4785-a460-24d6def57630\") " Mar 09 13:40:42 crc kubenswrapper[4764]: I0309 13:40:42.264993 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bb8bce39-6992-4785-a460-24d6def57630-config\") pod \"bb8bce39-6992-4785-a460-24d6def57630\" (UID: \"bb8bce39-6992-4785-a460-24d6def57630\") " Mar 09 13:40:42 crc kubenswrapper[4764]: I0309 13:40:42.265049 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bb8bce39-6992-4785-a460-24d6def57630-dns-svc\") pod \"bb8bce39-6992-4785-a460-24d6def57630\" (UID: \"bb8bce39-6992-4785-a460-24d6def57630\") " Mar 09 13:40:42 crc kubenswrapper[4764]: I0309 13:40:42.265389 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1004910c-0db4-4e3d-aac5-358a557ee268-config-data\") pod \"placement-db-sync-bnpcj\" (UID: \"1004910c-0db4-4e3d-aac5-358a557ee268\") " pod="openstack/placement-db-sync-bnpcj" Mar 09 13:40:42 crc kubenswrapper[4764]: I0309 13:40:42.265422 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f99ecd3e-1d5b-4f3d-80d7-9a401171fef1-ovsdbserver-sb\") pod \"dnsmasq-dns-5b6dbdb6f5-47q58\" (UID: \"f99ecd3e-1d5b-4f3d-80d7-9a401171fef1\") " pod="openstack/dnsmasq-dns-5b6dbdb6f5-47q58" Mar 09 13:40:42 crc kubenswrapper[4764]: I0309 13:40:42.265485 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f99ecd3e-1d5b-4f3d-80d7-9a401171fef1-config\") pod \"dnsmasq-dns-5b6dbdb6f5-47q58\" (UID: \"f99ecd3e-1d5b-4f3d-80d7-9a401171fef1\") " pod="openstack/dnsmasq-dns-5b6dbdb6f5-47q58" Mar 09 13:40:42 crc kubenswrapper[4764]: I0309 13:40:42.265517 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1004910c-0db4-4e3d-aac5-358a557ee268-logs\") pod \"placement-db-sync-bnpcj\" (UID: \"1004910c-0db4-4e3d-aac5-358a557ee268\") 
" pod="openstack/placement-db-sync-bnpcj" Mar 09 13:40:42 crc kubenswrapper[4764]: I0309 13:40:42.265537 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v9qxl\" (UniqueName: \"kubernetes.io/projected/1004910c-0db4-4e3d-aac5-358a557ee268-kube-api-access-v9qxl\") pod \"placement-db-sync-bnpcj\" (UID: \"1004910c-0db4-4e3d-aac5-358a557ee268\") " pod="openstack/placement-db-sync-bnpcj" Mar 09 13:40:42 crc kubenswrapper[4764]: I0309 13:40:42.265557 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f99ecd3e-1d5b-4f3d-80d7-9a401171fef1-dns-svc\") pod \"dnsmasq-dns-5b6dbdb6f5-47q58\" (UID: \"f99ecd3e-1d5b-4f3d-80d7-9a401171fef1\") " pod="openstack/dnsmasq-dns-5b6dbdb6f5-47q58" Mar 09 13:40:42 crc kubenswrapper[4764]: I0309 13:40:42.265575 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p4km7\" (UniqueName: \"kubernetes.io/projected/f99ecd3e-1d5b-4f3d-80d7-9a401171fef1-kube-api-access-p4km7\") pod \"dnsmasq-dns-5b6dbdb6f5-47q58\" (UID: \"f99ecd3e-1d5b-4f3d-80d7-9a401171fef1\") " pod="openstack/dnsmasq-dns-5b6dbdb6f5-47q58" Mar 09 13:40:42 crc kubenswrapper[4764]: I0309 13:40:42.265596 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1004910c-0db4-4e3d-aac5-358a557ee268-scripts\") pod \"placement-db-sync-bnpcj\" (UID: \"1004910c-0db4-4e3d-aac5-358a557ee268\") " pod="openstack/placement-db-sync-bnpcj" Mar 09 13:40:42 crc kubenswrapper[4764]: I0309 13:40:42.265620 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1004910c-0db4-4e3d-aac5-358a557ee268-combined-ca-bundle\") pod \"placement-db-sync-bnpcj\" (UID: \"1004910c-0db4-4e3d-aac5-358a557ee268\") " pod="openstack/placement-db-sync-bnpcj" Mar 09 13:40:42 crc kubenswrapper[4764]: 
I0309 13:40:42.265694 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f99ecd3e-1d5b-4f3d-80d7-9a401171fef1-ovsdbserver-nb\") pod \"dnsmasq-dns-5b6dbdb6f5-47q58\" (UID: \"f99ecd3e-1d5b-4f3d-80d7-9a401171fef1\") " pod="openstack/dnsmasq-dns-5b6dbdb6f5-47q58" Mar 09 13:40:42 crc kubenswrapper[4764]: I0309 13:40:42.266617 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f99ecd3e-1d5b-4f3d-80d7-9a401171fef1-ovsdbserver-nb\") pod \"dnsmasq-dns-5b6dbdb6f5-47q58\" (UID: \"f99ecd3e-1d5b-4f3d-80d7-9a401171fef1\") " pod="openstack/dnsmasq-dns-5b6dbdb6f5-47q58" Mar 09 13:40:42 crc kubenswrapper[4764]: I0309 13:40:42.268677 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f99ecd3e-1d5b-4f3d-80d7-9a401171fef1-config\") pod \"dnsmasq-dns-5b6dbdb6f5-47q58\" (UID: \"f99ecd3e-1d5b-4f3d-80d7-9a401171fef1\") " pod="openstack/dnsmasq-dns-5b6dbdb6f5-47q58" Mar 09 13:40:42 crc kubenswrapper[4764]: I0309 13:40:42.268691 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1004910c-0db4-4e3d-aac5-358a557ee268-logs\") pod \"placement-db-sync-bnpcj\" (UID: \"1004910c-0db4-4e3d-aac5-358a557ee268\") " pod="openstack/placement-db-sync-bnpcj" Mar 09 13:40:42 crc kubenswrapper[4764]: I0309 13:40:42.269690 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f99ecd3e-1d5b-4f3d-80d7-9a401171fef1-dns-svc\") pod \"dnsmasq-dns-5b6dbdb6f5-47q58\" (UID: \"f99ecd3e-1d5b-4f3d-80d7-9a401171fef1\") " pod="openstack/dnsmasq-dns-5b6dbdb6f5-47q58" Mar 09 13:40:42 crc kubenswrapper[4764]: I0309 13:40:42.274523 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/f99ecd3e-1d5b-4f3d-80d7-9a401171fef1-ovsdbserver-sb\") pod \"dnsmasq-dns-5b6dbdb6f5-47q58\" (UID: \"f99ecd3e-1d5b-4f3d-80d7-9a401171fef1\") " pod="openstack/dnsmasq-dns-5b6dbdb6f5-47q58" Mar 09 13:40:42 crc kubenswrapper[4764]: I0309 13:40:42.291974 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bb8bce39-6992-4785-a460-24d6def57630-kube-api-access-xwv6l" (OuterVolumeSpecName: "kube-api-access-xwv6l") pod "bb8bce39-6992-4785-a460-24d6def57630" (UID: "bb8bce39-6992-4785-a460-24d6def57630"). InnerVolumeSpecName "kube-api-access-xwv6l". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:40:42 crc kubenswrapper[4764]: I0309 13:40:42.292203 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1004910c-0db4-4e3d-aac5-358a557ee268-scripts\") pod \"placement-db-sync-bnpcj\" (UID: \"1004910c-0db4-4e3d-aac5-358a557ee268\") " pod="openstack/placement-db-sync-bnpcj" Mar 09 13:40:42 crc kubenswrapper[4764]: I0309 13:40:42.296707 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1004910c-0db4-4e3d-aac5-358a557ee268-config-data\") pod \"placement-db-sync-bnpcj\" (UID: \"1004910c-0db4-4e3d-aac5-358a557ee268\") " pod="openstack/placement-db-sync-bnpcj" Mar 09 13:40:42 crc kubenswrapper[4764]: I0309 13:40:42.297290 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p4km7\" (UniqueName: \"kubernetes.io/projected/f99ecd3e-1d5b-4f3d-80d7-9a401171fef1-kube-api-access-p4km7\") pod \"dnsmasq-dns-5b6dbdb6f5-47q58\" (UID: \"f99ecd3e-1d5b-4f3d-80d7-9a401171fef1\") " pod="openstack/dnsmasq-dns-5b6dbdb6f5-47q58" Mar 09 13:40:42 crc kubenswrapper[4764]: I0309 13:40:42.297765 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-dp5x6" Mar 09 13:40:42 crc kubenswrapper[4764]: I0309 13:40:42.300594 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v9qxl\" (UniqueName: \"kubernetes.io/projected/1004910c-0db4-4e3d-aac5-358a557ee268-kube-api-access-v9qxl\") pod \"placement-db-sync-bnpcj\" (UID: \"1004910c-0db4-4e3d-aac5-358a557ee268\") " pod="openstack/placement-db-sync-bnpcj" Mar 09 13:40:42 crc kubenswrapper[4764]: I0309 13:40:42.302359 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1004910c-0db4-4e3d-aac5-358a557ee268-combined-ca-bundle\") pod \"placement-db-sync-bnpcj\" (UID: \"1004910c-0db4-4e3d-aac5-358a557ee268\") " pod="openstack/placement-db-sync-bnpcj" Mar 09 13:40:42 crc kubenswrapper[4764]: I0309 13:40:42.341335 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-x9gvc" Mar 09 13:40:42 crc kubenswrapper[4764]: I0309 13:40:42.345751 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bb8bce39-6992-4785-a460-24d6def57630-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "bb8bce39-6992-4785-a460-24d6def57630" (UID: "bb8bce39-6992-4785-a460-24d6def57630"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:40:42 crc kubenswrapper[4764]: I0309 13:40:42.346888 4764 scope.go:117] "RemoveContainer" containerID="645a2f0f2f39fa20266b4afd9cec3444922a6377324ddf9bf9434ae55054aa8e" Mar 09 13:40:42 crc kubenswrapper[4764]: E0309 13:40:42.347973 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"645a2f0f2f39fa20266b4afd9cec3444922a6377324ddf9bf9434ae55054aa8e\": container with ID starting with 645a2f0f2f39fa20266b4afd9cec3444922a6377324ddf9bf9434ae55054aa8e not found: ID does not exist" containerID="645a2f0f2f39fa20266b4afd9cec3444922a6377324ddf9bf9434ae55054aa8e" Mar 09 13:40:42 crc kubenswrapper[4764]: I0309 13:40:42.348022 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"645a2f0f2f39fa20266b4afd9cec3444922a6377324ddf9bf9434ae55054aa8e"} err="failed to get container status \"645a2f0f2f39fa20266b4afd9cec3444922a6377324ddf9bf9434ae55054aa8e\": rpc error: code = NotFound desc = could not find container \"645a2f0f2f39fa20266b4afd9cec3444922a6377324ddf9bf9434ae55054aa8e\": container with ID starting with 645a2f0f2f39fa20266b4afd9cec3444922a6377324ddf9bf9434ae55054aa8e not found: ID does not exist" Mar 09 13:40:42 crc kubenswrapper[4764]: I0309 13:40:42.348051 4764 scope.go:117] "RemoveContainer" containerID="0a568401ccdf9f344a39bc852ee338a2a7b840ec0f974c1b01a9b85e341b5181" Mar 09 13:40:42 crc kubenswrapper[4764]: E0309 13:40:42.348408 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0a568401ccdf9f344a39bc852ee338a2a7b840ec0f974c1b01a9b85e341b5181\": container with ID starting with 0a568401ccdf9f344a39bc852ee338a2a7b840ec0f974c1b01a9b85e341b5181 not found: ID does not exist" containerID="0a568401ccdf9f344a39bc852ee338a2a7b840ec0f974c1b01a9b85e341b5181" Mar 09 13:40:42 crc kubenswrapper[4764]: I0309 13:40:42.348432 
4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0a568401ccdf9f344a39bc852ee338a2a7b840ec0f974c1b01a9b85e341b5181"} err="failed to get container status \"0a568401ccdf9f344a39bc852ee338a2a7b840ec0f974c1b01a9b85e341b5181\": rpc error: code = NotFound desc = could not find container \"0a568401ccdf9f344a39bc852ee338a2a7b840ec0f974c1b01a9b85e341b5181\": container with ID starting with 0a568401ccdf9f344a39bc852ee338a2a7b840ec0f974c1b01a9b85e341b5181 not found: ID does not exist" Mar 09 13:40:42 crc kubenswrapper[4764]: I0309 13:40:42.367409 4764 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bb8bce39-6992-4785-a460-24d6def57630-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 09 13:40:42 crc kubenswrapper[4764]: I0309 13:40:42.367569 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xwv6l\" (UniqueName: \"kubernetes.io/projected/bb8bce39-6992-4785-a460-24d6def57630-kube-api-access-xwv6l\") on node \"crc\" DevicePath \"\"" Mar 09 13:40:42 crc kubenswrapper[4764]: I0309 13:40:42.367403 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bb8bce39-6992-4785-a460-24d6def57630-config" (OuterVolumeSpecName: "config") pod "bb8bce39-6992-4785-a460-24d6def57630" (UID: "bb8bce39-6992-4785-a460-24d6def57630"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:40:42 crc kubenswrapper[4764]: I0309 13:40:42.376753 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5b6dbdb6f5-47q58" Mar 09 13:40:42 crc kubenswrapper[4764]: I0309 13:40:42.393372 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-bnpcj" Mar 09 13:40:42 crc kubenswrapper[4764]: I0309 13:40:42.403196 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bb8bce39-6992-4785-a460-24d6def57630-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "bb8bce39-6992-4785-a460-24d6def57630" (UID: "bb8bce39-6992-4785-a460-24d6def57630"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:40:42 crc kubenswrapper[4764]: I0309 13:40:42.404369 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bb8bce39-6992-4785-a460-24d6def57630-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "bb8bce39-6992-4785-a460-24d6def57630" (UID: "bb8bce39-6992-4785-a460-24d6def57630"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:40:42 crc kubenswrapper[4764]: I0309 13:40:42.468712 4764 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bb8bce39-6992-4785-a460-24d6def57630-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 09 13:40:42 crc kubenswrapper[4764]: I0309 13:40:42.468743 4764 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bb8bce39-6992-4785-a460-24d6def57630-config\") on node \"crc\" DevicePath \"\"" Mar 09 13:40:42 crc kubenswrapper[4764]: I0309 13:40:42.468752 4764 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bb8bce39-6992-4785-a460-24d6def57630-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 09 13:40:42 crc kubenswrapper[4764]: I0309 13:40:42.554686 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-554567b4f7-28h2r"] Mar 09 13:40:42 crc kubenswrapper[4764]: I0309 13:40:42.562016 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/dnsmasq-dns-554567b4f7-28h2r"] Mar 09 13:40:42 crc kubenswrapper[4764]: I0309 13:40:42.613666 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-mtsrr"] Mar 09 13:40:42 crc kubenswrapper[4764]: I0309 13:40:42.638656 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-67795cd9-n77tv"] Mar 09 13:40:42 crc kubenswrapper[4764]: I0309 13:40:42.874726 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 09 13:40:43 crc kubenswrapper[4764]: I0309 13:40:43.065410 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-cmhtp"] Mar 09 13:40:43 crc kubenswrapper[4764]: I0309 13:40:43.224173 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-cmhtp" event={"ID":"34466abc-30eb-4a0c-b4ea-50b5ab368fa1","Type":"ContainerStarted","Data":"7aa42fe65ff3f50e964cbe57813dcd4f0b9049443a2784caec6019418fe3ac3d"} Mar 09 13:40:43 crc kubenswrapper[4764]: I0309 13:40:43.247217 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-mtsrr" event={"ID":"be29c97d-4c8d-4e5b-9e12-07cdc4df5a62","Type":"ContainerStarted","Data":"14bfbdfc3fcd7dcb9efec055a62cdfeec5d1e0e90e453a55c55ffd01ca49ad5a"} Mar 09 13:40:43 crc kubenswrapper[4764]: I0309 13:40:43.247266 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-mtsrr" event={"ID":"be29c97d-4c8d-4e5b-9e12-07cdc4df5a62","Type":"ContainerStarted","Data":"964dc46977581745072ef70168c2ac93d99beadaf259b69a6032dafaefc2a059"} Mar 09 13:40:43 crc kubenswrapper[4764]: I0309 13:40:43.254307 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-dp5x6"] Mar 09 13:40:43 crc kubenswrapper[4764]: I0309 13:40:43.283176 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"861cdd7d-b563-4009-9c33-a5c64d6ffae9","Type":"ContainerStarted","Data":"6eab7083503ac11b2f955fc7b67907a3eb60734c7262ad63357465f6782429c0"} Mar 09 13:40:43 crc kubenswrapper[4764]: I0309 13:40:43.286964 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-x9gvc"] Mar 09 13:40:43 crc kubenswrapper[4764]: I0309 13:40:43.299061 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5b6dbdb6f5-47q58"] Mar 09 13:40:43 crc kubenswrapper[4764]: I0309 13:40:43.307520 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-mtsrr" podStartSLOduration=2.307490315 podStartE2EDuration="2.307490315s" podCreationTimestamp="2026-03-09 13:40:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:40:43.28536864 +0000 UTC m=+1198.535540548" watchObservedRunningTime="2026-03-09 13:40:43.307490315 +0000 UTC m=+1198.557662223" Mar 09 13:40:43 crc kubenswrapper[4764]: I0309 13:40:43.325425 4764 generic.go:334] "Generic (PLEG): container finished" podID="2a710e46-50b7-4069-b15c-ee3d19bc06e0" containerID="550203bc7f1c151fba9e233eabd544aa66da7e46f4d007a4f7cc69ce9aa3acf6" exitCode=0 Mar 09 13:40:43 crc kubenswrapper[4764]: I0309 13:40:43.325478 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67795cd9-n77tv" event={"ID":"2a710e46-50b7-4069-b15c-ee3d19bc06e0","Type":"ContainerDied","Data":"550203bc7f1c151fba9e233eabd544aa66da7e46f4d007a4f7cc69ce9aa3acf6"} Mar 09 13:40:43 crc kubenswrapper[4764]: I0309 13:40:43.325507 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67795cd9-n77tv" event={"ID":"2a710e46-50b7-4069-b15c-ee3d19bc06e0","Type":"ContainerStarted","Data":"c4a9f43a5d65d0ed33c7939a176de965247fc3684b2222485a9ce263feb9c4e8"} Mar 09 13:40:43 crc kubenswrapper[4764]: I0309 13:40:43.352941 4764 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-bnpcj"] Mar 09 13:40:43 crc kubenswrapper[4764]: I0309 13:40:43.627241 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bb8bce39-6992-4785-a460-24d6def57630" path="/var/lib/kubelet/pods/bb8bce39-6992-4785-a460-24d6def57630/volumes" Mar 09 13:40:43 crc kubenswrapper[4764]: I0309 13:40:43.825467 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-67795cd9-n77tv" Mar 09 13:40:43 crc kubenswrapper[4764]: I0309 13:40:43.887332 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 09 13:40:43 crc kubenswrapper[4764]: I0309 13:40:43.909603 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2a710e46-50b7-4069-b15c-ee3d19bc06e0-ovsdbserver-sb\") pod \"2a710e46-50b7-4069-b15c-ee3d19bc06e0\" (UID: \"2a710e46-50b7-4069-b15c-ee3d19bc06e0\") " Mar 09 13:40:43 crc kubenswrapper[4764]: I0309 13:40:43.909785 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hs6wz\" (UniqueName: \"kubernetes.io/projected/2a710e46-50b7-4069-b15c-ee3d19bc06e0-kube-api-access-hs6wz\") pod \"2a710e46-50b7-4069-b15c-ee3d19bc06e0\" (UID: \"2a710e46-50b7-4069-b15c-ee3d19bc06e0\") " Mar 09 13:40:43 crc kubenswrapper[4764]: I0309 13:40:43.909830 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2a710e46-50b7-4069-b15c-ee3d19bc06e0-config\") pod \"2a710e46-50b7-4069-b15c-ee3d19bc06e0\" (UID: \"2a710e46-50b7-4069-b15c-ee3d19bc06e0\") " Mar 09 13:40:43 crc kubenswrapper[4764]: I0309 13:40:43.909905 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2a710e46-50b7-4069-b15c-ee3d19bc06e0-ovsdbserver-nb\") pod 
\"2a710e46-50b7-4069-b15c-ee3d19bc06e0\" (UID: \"2a710e46-50b7-4069-b15c-ee3d19bc06e0\") " Mar 09 13:40:43 crc kubenswrapper[4764]: I0309 13:40:43.909977 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2a710e46-50b7-4069-b15c-ee3d19bc06e0-dns-svc\") pod \"2a710e46-50b7-4069-b15c-ee3d19bc06e0\" (UID: \"2a710e46-50b7-4069-b15c-ee3d19bc06e0\") " Mar 09 13:40:43 crc kubenswrapper[4764]: I0309 13:40:43.944842 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2a710e46-50b7-4069-b15c-ee3d19bc06e0-kube-api-access-hs6wz" (OuterVolumeSpecName: "kube-api-access-hs6wz") pod "2a710e46-50b7-4069-b15c-ee3d19bc06e0" (UID: "2a710e46-50b7-4069-b15c-ee3d19bc06e0"). InnerVolumeSpecName "kube-api-access-hs6wz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:40:43 crc kubenswrapper[4764]: I0309 13:40:43.961427 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2a710e46-50b7-4069-b15c-ee3d19bc06e0-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "2a710e46-50b7-4069-b15c-ee3d19bc06e0" (UID: "2a710e46-50b7-4069-b15c-ee3d19bc06e0"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:40:43 crc kubenswrapper[4764]: I0309 13:40:43.989233 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2a710e46-50b7-4069-b15c-ee3d19bc06e0-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "2a710e46-50b7-4069-b15c-ee3d19bc06e0" (UID: "2a710e46-50b7-4069-b15c-ee3d19bc06e0"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:40:44 crc kubenswrapper[4764]: I0309 13:40:44.017029 4764 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2a710e46-50b7-4069-b15c-ee3d19bc06e0-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 09 13:40:44 crc kubenswrapper[4764]: I0309 13:40:44.017065 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hs6wz\" (UniqueName: \"kubernetes.io/projected/2a710e46-50b7-4069-b15c-ee3d19bc06e0-kube-api-access-hs6wz\") on node \"crc\" DevicePath \"\"" Mar 09 13:40:44 crc kubenswrapper[4764]: I0309 13:40:44.017081 4764 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2a710e46-50b7-4069-b15c-ee3d19bc06e0-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 09 13:40:44 crc kubenswrapper[4764]: I0309 13:40:44.018492 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2a710e46-50b7-4069-b15c-ee3d19bc06e0-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "2a710e46-50b7-4069-b15c-ee3d19bc06e0" (UID: "2a710e46-50b7-4069-b15c-ee3d19bc06e0"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:40:44 crc kubenswrapper[4764]: I0309 13:40:44.022578 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2a710e46-50b7-4069-b15c-ee3d19bc06e0-config" (OuterVolumeSpecName: "config") pod "2a710e46-50b7-4069-b15c-ee3d19bc06e0" (UID: "2a710e46-50b7-4069-b15c-ee3d19bc06e0"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:40:44 crc kubenswrapper[4764]: I0309 13:40:44.119171 4764 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2a710e46-50b7-4069-b15c-ee3d19bc06e0-config\") on node \"crc\" DevicePath \"\"" Mar 09 13:40:44 crc kubenswrapper[4764]: I0309 13:40:44.119204 4764 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2a710e46-50b7-4069-b15c-ee3d19bc06e0-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 09 13:40:44 crc kubenswrapper[4764]: I0309 13:40:44.354163 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67795cd9-n77tv" event={"ID":"2a710e46-50b7-4069-b15c-ee3d19bc06e0","Type":"ContainerDied","Data":"c4a9f43a5d65d0ed33c7939a176de965247fc3684b2222485a9ce263feb9c4e8"} Mar 09 13:40:44 crc kubenswrapper[4764]: I0309 13:40:44.354211 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-67795cd9-n77tv" Mar 09 13:40:44 crc kubenswrapper[4764]: I0309 13:40:44.354240 4764 scope.go:117] "RemoveContainer" containerID="550203bc7f1c151fba9e233eabd544aa66da7e46f4d007a4f7cc69ce9aa3acf6" Mar 09 13:40:44 crc kubenswrapper[4764]: I0309 13:40:44.356885 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-cmhtp" event={"ID":"34466abc-30eb-4a0c-b4ea-50b5ab368fa1","Type":"ContainerStarted","Data":"77b865a9fe4889c82ec1f58b4a21addc379165d3a6897c87913c6042aa1f357b"} Mar 09 13:40:44 crc kubenswrapper[4764]: I0309 13:40:44.360955 4764 generic.go:334] "Generic (PLEG): container finished" podID="f99ecd3e-1d5b-4f3d-80d7-9a401171fef1" containerID="643b445a5cdf8e66cc772b07777e2b70383e28ddead01aaa535ab1ef9fb46333" exitCode=0 Mar 09 13:40:44 crc kubenswrapper[4764]: I0309 13:40:44.361007 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b6dbdb6f5-47q58" event={"ID":"f99ecd3e-1d5b-4f3d-80d7-9a401171fef1","Type":"ContainerDied","Data":"643b445a5cdf8e66cc772b07777e2b70383e28ddead01aaa535ab1ef9fb46333"} Mar 09 13:40:44 crc kubenswrapper[4764]: I0309 13:40:44.361025 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b6dbdb6f5-47q58" event={"ID":"f99ecd3e-1d5b-4f3d-80d7-9a401171fef1","Type":"ContainerStarted","Data":"29094e7809129e1e7698bb9912d54d76c17888d5bfac065889c5bd5838e4b71c"} Mar 09 13:40:44 crc kubenswrapper[4764]: I0309 13:40:44.367476 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-bnpcj" event={"ID":"1004910c-0db4-4e3d-aac5-358a557ee268","Type":"ContainerStarted","Data":"4a62e7bc40a3d288cb2716e66a0b772309aa68a835605c18dfc2a81767e2ae07"} Mar 09 13:40:44 crc kubenswrapper[4764]: I0309 13:40:44.372358 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-dp5x6" 
event={"ID":"74146b7d-9780-4d2d-9454-853296f88955","Type":"ContainerStarted","Data":"eedf3d5fa18d75dee38a49a615e9d2f0a831d8c6f13a33063d424dd9831c9a81"} Mar 09 13:40:44 crc kubenswrapper[4764]: I0309 13:40:44.388166 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-x9gvc" event={"ID":"cb54f57d-afb6-4e53-be9a-4b22573a9450","Type":"ContainerStarted","Data":"7611ab820e5631fb861e6ee00f8e6a6553e8f141cb68d0a4556a1a0beeb23d3b"} Mar 09 13:40:44 crc kubenswrapper[4764]: I0309 13:40:44.408605 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-cmhtp" podStartSLOduration=3.408579764 podStartE2EDuration="3.408579764s" podCreationTimestamp="2026-03-09 13:40:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:40:44.383520116 +0000 UTC m=+1199.633692024" watchObservedRunningTime="2026-03-09 13:40:44.408579764 +0000 UTC m=+1199.658751662" Mar 09 13:40:44 crc kubenswrapper[4764]: I0309 13:40:44.448368 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-67795cd9-n77tv"] Mar 09 13:40:44 crc kubenswrapper[4764]: I0309 13:40:44.457859 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-67795cd9-n77tv"] Mar 09 13:40:45 crc kubenswrapper[4764]: I0309 13:40:45.435069 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b6dbdb6f5-47q58" event={"ID":"f99ecd3e-1d5b-4f3d-80d7-9a401171fef1","Type":"ContainerStarted","Data":"1c12c99553c3fded7ba52a026664e1bff10eba1ac57dc2b2484eb57141804c22"} Mar 09 13:40:45 crc kubenswrapper[4764]: I0309 13:40:45.435489 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5b6dbdb6f5-47q58" Mar 09 13:40:45 crc kubenswrapper[4764]: I0309 13:40:45.474096 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/dnsmasq-dns-5b6dbdb6f5-47q58" podStartSLOduration=4.47407667 podStartE2EDuration="4.47407667s" podCreationTimestamp="2026-03-09 13:40:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:40:45.466778937 +0000 UTC m=+1200.716950855" watchObservedRunningTime="2026-03-09 13:40:45.47407667 +0000 UTC m=+1200.724248578" Mar 09 13:40:45 crc kubenswrapper[4764]: I0309 13:40:45.595574 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2a710e46-50b7-4069-b15c-ee3d19bc06e0" path="/var/lib/kubelet/pods/2a710e46-50b7-4069-b15c-ee3d19bc06e0/volumes" Mar 09 13:40:48 crc kubenswrapper[4764]: I0309 13:40:48.464913 4764 generic.go:334] "Generic (PLEG): container finished" podID="be29c97d-4c8d-4e5b-9e12-07cdc4df5a62" containerID="14bfbdfc3fcd7dcb9efec055a62cdfeec5d1e0e90e453a55c55ffd01ca49ad5a" exitCode=0 Mar 09 13:40:48 crc kubenswrapper[4764]: I0309 13:40:48.465006 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-mtsrr" event={"ID":"be29c97d-4c8d-4e5b-9e12-07cdc4df5a62","Type":"ContainerDied","Data":"14bfbdfc3fcd7dcb9efec055a62cdfeec5d1e0e90e453a55c55ffd01ca49ad5a"} Mar 09 13:40:49 crc kubenswrapper[4764]: I0309 13:40:49.866049 4764 scope.go:117] "RemoveContainer" containerID="33a445a6adcd631bbd08283597c203355f22e04e83cf13fd98ba9c1f71c00bd6" Mar 09 13:40:51 crc kubenswrapper[4764]: I0309 13:40:51.951056 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-mtsrr" Mar 09 13:40:52 crc kubenswrapper[4764]: I0309 13:40:52.125519 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be29c97d-4c8d-4e5b-9e12-07cdc4df5a62-combined-ca-bundle\") pod \"be29c97d-4c8d-4e5b-9e12-07cdc4df5a62\" (UID: \"be29c97d-4c8d-4e5b-9e12-07cdc4df5a62\") " Mar 09 13:40:52 crc kubenswrapper[4764]: I0309 13:40:52.125707 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t6nvs\" (UniqueName: \"kubernetes.io/projected/be29c97d-4c8d-4e5b-9e12-07cdc4df5a62-kube-api-access-t6nvs\") pod \"be29c97d-4c8d-4e5b-9e12-07cdc4df5a62\" (UID: \"be29c97d-4c8d-4e5b-9e12-07cdc4df5a62\") " Mar 09 13:40:52 crc kubenswrapper[4764]: I0309 13:40:52.125740 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/be29c97d-4c8d-4e5b-9e12-07cdc4df5a62-credential-keys\") pod \"be29c97d-4c8d-4e5b-9e12-07cdc4df5a62\" (UID: \"be29c97d-4c8d-4e5b-9e12-07cdc4df5a62\") " Mar 09 13:40:52 crc kubenswrapper[4764]: I0309 13:40:52.127058 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/be29c97d-4c8d-4e5b-9e12-07cdc4df5a62-scripts\") pod \"be29c97d-4c8d-4e5b-9e12-07cdc4df5a62\" (UID: \"be29c97d-4c8d-4e5b-9e12-07cdc4df5a62\") " Mar 09 13:40:52 crc kubenswrapper[4764]: I0309 13:40:52.127194 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/be29c97d-4c8d-4e5b-9e12-07cdc4df5a62-config-data\") pod \"be29c97d-4c8d-4e5b-9e12-07cdc4df5a62\" (UID: \"be29c97d-4c8d-4e5b-9e12-07cdc4df5a62\") " Mar 09 13:40:52 crc kubenswrapper[4764]: I0309 13:40:52.127235 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" 
(UniqueName: \"kubernetes.io/secret/be29c97d-4c8d-4e5b-9e12-07cdc4df5a62-fernet-keys\") pod \"be29c97d-4c8d-4e5b-9e12-07cdc4df5a62\" (UID: \"be29c97d-4c8d-4e5b-9e12-07cdc4df5a62\") " Mar 09 13:40:52 crc kubenswrapper[4764]: I0309 13:40:52.134679 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/be29c97d-4c8d-4e5b-9e12-07cdc4df5a62-kube-api-access-t6nvs" (OuterVolumeSpecName: "kube-api-access-t6nvs") pod "be29c97d-4c8d-4e5b-9e12-07cdc4df5a62" (UID: "be29c97d-4c8d-4e5b-9e12-07cdc4df5a62"). InnerVolumeSpecName "kube-api-access-t6nvs". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:40:52 crc kubenswrapper[4764]: I0309 13:40:52.137106 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/be29c97d-4c8d-4e5b-9e12-07cdc4df5a62-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "be29c97d-4c8d-4e5b-9e12-07cdc4df5a62" (UID: "be29c97d-4c8d-4e5b-9e12-07cdc4df5a62"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:40:52 crc kubenswrapper[4764]: I0309 13:40:52.137182 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/be29c97d-4c8d-4e5b-9e12-07cdc4df5a62-scripts" (OuterVolumeSpecName: "scripts") pod "be29c97d-4c8d-4e5b-9e12-07cdc4df5a62" (UID: "be29c97d-4c8d-4e5b-9e12-07cdc4df5a62"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:40:52 crc kubenswrapper[4764]: I0309 13:40:52.155420 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/be29c97d-4c8d-4e5b-9e12-07cdc4df5a62-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "be29c97d-4c8d-4e5b-9e12-07cdc4df5a62" (UID: "be29c97d-4c8d-4e5b-9e12-07cdc4df5a62"). InnerVolumeSpecName "credential-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:40:52 crc kubenswrapper[4764]: I0309 13:40:52.158390 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/be29c97d-4c8d-4e5b-9e12-07cdc4df5a62-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "be29c97d-4c8d-4e5b-9e12-07cdc4df5a62" (UID: "be29c97d-4c8d-4e5b-9e12-07cdc4df5a62"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:40:52 crc kubenswrapper[4764]: I0309 13:40:52.164206 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/be29c97d-4c8d-4e5b-9e12-07cdc4df5a62-config-data" (OuterVolumeSpecName: "config-data") pod "be29c97d-4c8d-4e5b-9e12-07cdc4df5a62" (UID: "be29c97d-4c8d-4e5b-9e12-07cdc4df5a62"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:40:52 crc kubenswrapper[4764]: I0309 13:40:52.231228 4764 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/be29c97d-4c8d-4e5b-9e12-07cdc4df5a62-config-data\") on node \"crc\" DevicePath \"\"" Mar 09 13:40:52 crc kubenswrapper[4764]: I0309 13:40:52.231273 4764 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/be29c97d-4c8d-4e5b-9e12-07cdc4df5a62-fernet-keys\") on node \"crc\" DevicePath \"\"" Mar 09 13:40:52 crc kubenswrapper[4764]: I0309 13:40:52.231284 4764 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be29c97d-4c8d-4e5b-9e12-07cdc4df5a62-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 13:40:52 crc kubenswrapper[4764]: I0309 13:40:52.231298 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t6nvs\" (UniqueName: \"kubernetes.io/projected/be29c97d-4c8d-4e5b-9e12-07cdc4df5a62-kube-api-access-t6nvs\") on node \"crc\" DevicePath 
\"\"" Mar 09 13:40:52 crc kubenswrapper[4764]: I0309 13:40:52.231309 4764 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/be29c97d-4c8d-4e5b-9e12-07cdc4df5a62-credential-keys\") on node \"crc\" DevicePath \"\"" Mar 09 13:40:52 crc kubenswrapper[4764]: I0309 13:40:52.231318 4764 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/be29c97d-4c8d-4e5b-9e12-07cdc4df5a62-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 13:40:52 crc kubenswrapper[4764]: I0309 13:40:52.378970 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5b6dbdb6f5-47q58" Mar 09 13:40:52 crc kubenswrapper[4764]: I0309 13:40:52.473715 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8554648995-p289h"] Mar 09 13:40:52 crc kubenswrapper[4764]: I0309 13:40:52.474116 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-8554648995-p289h" podUID="e921061b-2a0f-4b22-beb1-0d52993dc06b" containerName="dnsmasq-dns" containerID="cri-o://fce1b5bc8a63a55a1345ac17564049805c448dc5638e52c8bc6cdc086dbbb27d" gracePeriod=10 Mar 09 13:40:52 crc kubenswrapper[4764]: I0309 13:40:52.527735 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-mtsrr" event={"ID":"be29c97d-4c8d-4e5b-9e12-07cdc4df5a62","Type":"ContainerDied","Data":"964dc46977581745072ef70168c2ac93d99beadaf259b69a6032dafaefc2a059"} Mar 09 13:40:52 crc kubenswrapper[4764]: I0309 13:40:52.527786 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="964dc46977581745072ef70168c2ac93d99beadaf259b69a6032dafaefc2a059" Mar 09 13:40:52 crc kubenswrapper[4764]: I0309 13:40:52.527874 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-mtsrr" Mar 09 13:40:53 crc kubenswrapper[4764]: I0309 13:40:53.042306 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-mtsrr"] Mar 09 13:40:53 crc kubenswrapper[4764]: I0309 13:40:53.049626 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-mtsrr"] Mar 09 13:40:53 crc kubenswrapper[4764]: I0309 13:40:53.167637 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-9sj6m"] Mar 09 13:40:53 crc kubenswrapper[4764]: E0309 13:40:53.168045 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bb8bce39-6992-4785-a460-24d6def57630" containerName="init" Mar 09 13:40:53 crc kubenswrapper[4764]: I0309 13:40:53.168064 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb8bce39-6992-4785-a460-24d6def57630" containerName="init" Mar 09 13:40:53 crc kubenswrapper[4764]: E0309 13:40:53.168082 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="be29c97d-4c8d-4e5b-9e12-07cdc4df5a62" containerName="keystone-bootstrap" Mar 09 13:40:53 crc kubenswrapper[4764]: I0309 13:40:53.168091 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="be29c97d-4c8d-4e5b-9e12-07cdc4df5a62" containerName="keystone-bootstrap" Mar 09 13:40:53 crc kubenswrapper[4764]: E0309 13:40:53.168114 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bb8bce39-6992-4785-a460-24d6def57630" containerName="dnsmasq-dns" Mar 09 13:40:53 crc kubenswrapper[4764]: I0309 13:40:53.168121 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb8bce39-6992-4785-a460-24d6def57630" containerName="dnsmasq-dns" Mar 09 13:40:53 crc kubenswrapper[4764]: E0309 13:40:53.168135 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2a710e46-50b7-4069-b15c-ee3d19bc06e0" containerName="init" Mar 09 13:40:53 crc kubenswrapper[4764]: I0309 13:40:53.168140 4764 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="2a710e46-50b7-4069-b15c-ee3d19bc06e0" containerName="init" Mar 09 13:40:53 crc kubenswrapper[4764]: I0309 13:40:53.168294 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="bb8bce39-6992-4785-a460-24d6def57630" containerName="dnsmasq-dns" Mar 09 13:40:53 crc kubenswrapper[4764]: I0309 13:40:53.168307 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="2a710e46-50b7-4069-b15c-ee3d19bc06e0" containerName="init" Mar 09 13:40:53 crc kubenswrapper[4764]: I0309 13:40:53.168326 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="be29c97d-4c8d-4e5b-9e12-07cdc4df5a62" containerName="keystone-bootstrap" Mar 09 13:40:53 crc kubenswrapper[4764]: I0309 13:40:53.168938 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-9sj6m" Mar 09 13:40:53 crc kubenswrapper[4764]: I0309 13:40:53.174616 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Mar 09 13:40:53 crc kubenswrapper[4764]: I0309 13:40:53.174880 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Mar 09 13:40:53 crc kubenswrapper[4764]: I0309 13:40:53.174931 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-4hfql" Mar 09 13:40:53 crc kubenswrapper[4764]: I0309 13:40:53.175221 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Mar 09 13:40:53 crc kubenswrapper[4764]: I0309 13:40:53.175253 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Mar 09 13:40:53 crc kubenswrapper[4764]: I0309 13:40:53.192638 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-9sj6m"] Mar 09 13:40:53 crc kubenswrapper[4764]: I0309 13:40:53.270511 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"fernet-keys\" (UniqueName: \"kubernetes.io/secret/2a338463-1443-4863-830e-0621abc3ed15-fernet-keys\") pod \"keystone-bootstrap-9sj6m\" (UID: \"2a338463-1443-4863-830e-0621abc3ed15\") " pod="openstack/keystone-bootstrap-9sj6m" Mar 09 13:40:53 crc kubenswrapper[4764]: I0309 13:40:53.270547 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2a338463-1443-4863-830e-0621abc3ed15-config-data\") pod \"keystone-bootstrap-9sj6m\" (UID: \"2a338463-1443-4863-830e-0621abc3ed15\") " pod="openstack/keystone-bootstrap-9sj6m" Mar 09 13:40:53 crc kubenswrapper[4764]: I0309 13:40:53.270570 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a338463-1443-4863-830e-0621abc3ed15-combined-ca-bundle\") pod \"keystone-bootstrap-9sj6m\" (UID: \"2a338463-1443-4863-830e-0621abc3ed15\") " pod="openstack/keystone-bootstrap-9sj6m" Mar 09 13:40:53 crc kubenswrapper[4764]: I0309 13:40:53.270602 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/2a338463-1443-4863-830e-0621abc3ed15-credential-keys\") pod \"keystone-bootstrap-9sj6m\" (UID: \"2a338463-1443-4863-830e-0621abc3ed15\") " pod="openstack/keystone-bootstrap-9sj6m" Mar 09 13:40:53 crc kubenswrapper[4764]: I0309 13:40:53.270637 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mw9hw\" (UniqueName: \"kubernetes.io/projected/2a338463-1443-4863-830e-0621abc3ed15-kube-api-access-mw9hw\") pod \"keystone-bootstrap-9sj6m\" (UID: \"2a338463-1443-4863-830e-0621abc3ed15\") " pod="openstack/keystone-bootstrap-9sj6m" Mar 09 13:40:53 crc kubenswrapper[4764]: I0309 13:40:53.270693 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"scripts\" (UniqueName: \"kubernetes.io/secret/2a338463-1443-4863-830e-0621abc3ed15-scripts\") pod \"keystone-bootstrap-9sj6m\" (UID: \"2a338463-1443-4863-830e-0621abc3ed15\") " pod="openstack/keystone-bootstrap-9sj6m" Mar 09 13:40:53 crc kubenswrapper[4764]: I0309 13:40:53.371898 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/2a338463-1443-4863-830e-0621abc3ed15-credential-keys\") pod \"keystone-bootstrap-9sj6m\" (UID: \"2a338463-1443-4863-830e-0621abc3ed15\") " pod="openstack/keystone-bootstrap-9sj6m" Mar 09 13:40:53 crc kubenswrapper[4764]: I0309 13:40:53.371973 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mw9hw\" (UniqueName: \"kubernetes.io/projected/2a338463-1443-4863-830e-0621abc3ed15-kube-api-access-mw9hw\") pod \"keystone-bootstrap-9sj6m\" (UID: \"2a338463-1443-4863-830e-0621abc3ed15\") " pod="openstack/keystone-bootstrap-9sj6m" Mar 09 13:40:53 crc kubenswrapper[4764]: I0309 13:40:53.372026 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2a338463-1443-4863-830e-0621abc3ed15-scripts\") pod \"keystone-bootstrap-9sj6m\" (UID: \"2a338463-1443-4863-830e-0621abc3ed15\") " pod="openstack/keystone-bootstrap-9sj6m" Mar 09 13:40:53 crc kubenswrapper[4764]: I0309 13:40:53.372096 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/2a338463-1443-4863-830e-0621abc3ed15-fernet-keys\") pod \"keystone-bootstrap-9sj6m\" (UID: \"2a338463-1443-4863-830e-0621abc3ed15\") " pod="openstack/keystone-bootstrap-9sj6m" Mar 09 13:40:53 crc kubenswrapper[4764]: I0309 13:40:53.372116 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2a338463-1443-4863-830e-0621abc3ed15-config-data\") pod 
\"keystone-bootstrap-9sj6m\" (UID: \"2a338463-1443-4863-830e-0621abc3ed15\") " pod="openstack/keystone-bootstrap-9sj6m" Mar 09 13:40:53 crc kubenswrapper[4764]: I0309 13:40:53.372135 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a338463-1443-4863-830e-0621abc3ed15-combined-ca-bundle\") pod \"keystone-bootstrap-9sj6m\" (UID: \"2a338463-1443-4863-830e-0621abc3ed15\") " pod="openstack/keystone-bootstrap-9sj6m" Mar 09 13:40:53 crc kubenswrapper[4764]: I0309 13:40:53.377570 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/2a338463-1443-4863-830e-0621abc3ed15-credential-keys\") pod \"keystone-bootstrap-9sj6m\" (UID: \"2a338463-1443-4863-830e-0621abc3ed15\") " pod="openstack/keystone-bootstrap-9sj6m" Mar 09 13:40:53 crc kubenswrapper[4764]: I0309 13:40:53.378776 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/2a338463-1443-4863-830e-0621abc3ed15-fernet-keys\") pod \"keystone-bootstrap-9sj6m\" (UID: \"2a338463-1443-4863-830e-0621abc3ed15\") " pod="openstack/keystone-bootstrap-9sj6m" Mar 09 13:40:53 crc kubenswrapper[4764]: I0309 13:40:53.380390 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2a338463-1443-4863-830e-0621abc3ed15-scripts\") pod \"keystone-bootstrap-9sj6m\" (UID: \"2a338463-1443-4863-830e-0621abc3ed15\") " pod="openstack/keystone-bootstrap-9sj6m" Mar 09 13:40:53 crc kubenswrapper[4764]: I0309 13:40:53.383738 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2a338463-1443-4863-830e-0621abc3ed15-config-data\") pod \"keystone-bootstrap-9sj6m\" (UID: \"2a338463-1443-4863-830e-0621abc3ed15\") " pod="openstack/keystone-bootstrap-9sj6m" Mar 09 13:40:53 crc 
kubenswrapper[4764]: I0309 13:40:53.394512 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a338463-1443-4863-830e-0621abc3ed15-combined-ca-bundle\") pod \"keystone-bootstrap-9sj6m\" (UID: \"2a338463-1443-4863-830e-0621abc3ed15\") " pod="openstack/keystone-bootstrap-9sj6m" Mar 09 13:40:53 crc kubenswrapper[4764]: I0309 13:40:53.401262 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mw9hw\" (UniqueName: \"kubernetes.io/projected/2a338463-1443-4863-830e-0621abc3ed15-kube-api-access-mw9hw\") pod \"keystone-bootstrap-9sj6m\" (UID: \"2a338463-1443-4863-830e-0621abc3ed15\") " pod="openstack/keystone-bootstrap-9sj6m" Mar 09 13:40:53 crc kubenswrapper[4764]: I0309 13:40:53.487393 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-9sj6m" Mar 09 13:40:53 crc kubenswrapper[4764]: I0309 13:40:53.543850 4764 generic.go:334] "Generic (PLEG): container finished" podID="e921061b-2a0f-4b22-beb1-0d52993dc06b" containerID="fce1b5bc8a63a55a1345ac17564049805c448dc5638e52c8bc6cdc086dbbb27d" exitCode=0 Mar 09 13:40:53 crc kubenswrapper[4764]: I0309 13:40:53.543898 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8554648995-p289h" event={"ID":"e921061b-2a0f-4b22-beb1-0d52993dc06b","Type":"ContainerDied","Data":"fce1b5bc8a63a55a1345ac17564049805c448dc5638e52c8bc6cdc086dbbb27d"} Mar 09 13:40:53 crc kubenswrapper[4764]: I0309 13:40:53.570866 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="be29c97d-4c8d-4e5b-9e12-07cdc4df5a62" path="/var/lib/kubelet/pods/be29c97d-4c8d-4e5b-9e12-07cdc4df5a62/volumes" Mar 09 13:40:57 crc kubenswrapper[4764]: I0309 13:40:57.436111 4764 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-8554648995-p289h" podUID="e921061b-2a0f-4b22-beb1-0d52993dc06b" containerName="dnsmasq-dns" 
probeResult="failure" output="dial tcp 10.217.0.113:5353: connect: connection refused" Mar 09 13:40:58 crc kubenswrapper[4764]: I0309 13:40:58.370904 4764 patch_prober.go:28] interesting pod/machine-config-daemon-xxczl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 09 13:40:58 crc kubenswrapper[4764]: I0309 13:40:58.370963 4764 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 09 13:40:58 crc kubenswrapper[4764]: I0309 13:40:58.371015 4764 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" Mar 09 13:40:58 crc kubenswrapper[4764]: I0309 13:40:58.371771 4764 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"089fede4f6de3963d0297d0b784543ed7a9d38cdefe66f6bbb9aab989dbffbb0"} pod="openshift-machine-config-operator/machine-config-daemon-xxczl" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 09 13:40:58 crc kubenswrapper[4764]: I0309 13:40:58.371823 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922" containerName="machine-config-daemon" containerID="cri-o://089fede4f6de3963d0297d0b784543ed7a9d38cdefe66f6bbb9aab989dbffbb0" gracePeriod=600 Mar 09 13:40:58 crc kubenswrapper[4764]: I0309 13:40:58.596373 4764 generic.go:334] "Generic (PLEG): container finished" 
podID="6bcdd179-43c2-427c-9fac-7155c122e922" containerID="089fede4f6de3963d0297d0b784543ed7a9d38cdefe66f6bbb9aab989dbffbb0" exitCode=0 Mar 09 13:40:58 crc kubenswrapper[4764]: I0309 13:40:58.596439 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" event={"ID":"6bcdd179-43c2-427c-9fac-7155c122e922","Type":"ContainerDied","Data":"089fede4f6de3963d0297d0b784543ed7a9d38cdefe66f6bbb9aab989dbffbb0"} Mar 09 13:40:58 crc kubenswrapper[4764]: I0309 13:40:58.596544 4764 scope.go:117] "RemoveContainer" containerID="bd218b04a62035d950083134e237631295493daecba2baa1eef096560f891819" Mar 09 13:41:02 crc kubenswrapper[4764]: I0309 13:41:02.636265 4764 generic.go:334] "Generic (PLEG): container finished" podID="34466abc-30eb-4a0c-b4ea-50b5ab368fa1" containerID="77b865a9fe4889c82ec1f58b4a21addc379165d3a6897c87913c6042aa1f357b" exitCode=0 Mar 09 13:41:02 crc kubenswrapper[4764]: I0309 13:41:02.636349 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-cmhtp" event={"ID":"34466abc-30eb-4a0c-b4ea-50b5ab368fa1","Type":"ContainerDied","Data":"77b865a9fe4889c82ec1f58b4a21addc379165d3a6897c87913c6042aa1f357b"} Mar 09 13:41:02 crc kubenswrapper[4764]: E0309 13:41:02.759669 4764 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified" Mar 09 13:41:02 crc kubenswrapper[4764]: E0309 13:41:02.759878 4764 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:barbican-db-sync,Image:quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified,Command:[/bin/bash],Args:[-c barbican-manage db 
upgrade],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/barbican/barbican.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-7jvrd,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42403,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42403,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod barbican-db-sync-x9gvc_openstack(cb54f57d-afb6-4e53-be9a-4b22573a9450): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 09 13:41:02 crc kubenswrapper[4764]: E0309 13:41:02.761034 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/barbican-db-sync-x9gvc" 
podUID="cb54f57d-afb6-4e53-be9a-4b22573a9450" Mar 09 13:41:03 crc kubenswrapper[4764]: E0309 13:41:03.651060 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified\\\"\"" pod="openstack/barbican-db-sync-x9gvc" podUID="cb54f57d-afb6-4e53-be9a-4b22573a9450" Mar 09 13:41:03 crc kubenswrapper[4764]: E0309 13:41:03.941132 4764 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified" Mar 09 13:41:03 crc kubenswrapper[4764]: E0309 13:41:03.941690 4764 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:cinder-db-sync,Image:quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-machine-id,ReadOnly:true,MountPath:/etc/machine-id,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/config-data/merged,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/cinder/cinder.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-khssq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin
:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-db-sync-dp5x6_openstack(74146b7d-9780-4d2d-9454-853296f88955): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 09 13:41:03 crc kubenswrapper[4764]: E0309 13:41:03.942906 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/cinder-db-sync-dp5x6" podUID="74146b7d-9780-4d2d-9454-853296f88955" Mar 09 13:41:04 crc kubenswrapper[4764]: I0309 13:41:04.233025 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8554648995-p289h" Mar 09 13:41:04 crc kubenswrapper[4764]: I0309 13:41:04.283261 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-cmhtp" Mar 09 13:41:04 crc kubenswrapper[4764]: I0309 13:41:04.373961 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-9sj6m"] Mar 09 13:41:04 crc kubenswrapper[4764]: W0309 13:41:04.385879 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2a338463_1443_4863_830e_0621abc3ed15.slice/crio-38fa86d2f8bb849de0695f8ca7f22c2d7e73ace5965981aa2e9b4258c6a40283 WatchSource:0}: Error finding container 38fa86d2f8bb849de0695f8ca7f22c2d7e73ace5965981aa2e9b4258c6a40283: Status 404 returned error can't find the container with id 38fa86d2f8bb849de0695f8ca7f22c2d7e73ace5965981aa2e9b4258c6a40283 Mar 09 13:41:04 crc kubenswrapper[4764]: I0309 13:41:04.392487 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/34466abc-30eb-4a0c-b4ea-50b5ab368fa1-config\") pod \"34466abc-30eb-4a0c-b4ea-50b5ab368fa1\" (UID: \"34466abc-30eb-4a0c-b4ea-50b5ab368fa1\") " Mar 09 13:41:04 crc kubenswrapper[4764]: I0309 13:41:04.392630 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34466abc-30eb-4a0c-b4ea-50b5ab368fa1-combined-ca-bundle\") pod \"34466abc-30eb-4a0c-b4ea-50b5ab368fa1\" (UID: \"34466abc-30eb-4a0c-b4ea-50b5ab368fa1\") " Mar 09 13:41:04 crc kubenswrapper[4764]: I0309 13:41:04.392702 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xkvzb\" (UniqueName: \"kubernetes.io/projected/e921061b-2a0f-4b22-beb1-0d52993dc06b-kube-api-access-xkvzb\") pod \"e921061b-2a0f-4b22-beb1-0d52993dc06b\" (UID: \"e921061b-2a0f-4b22-beb1-0d52993dc06b\") " Mar 09 13:41:04 crc kubenswrapper[4764]: I0309 13:41:04.392787 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" 
(UniqueName: \"kubernetes.io/configmap/e921061b-2a0f-4b22-beb1-0d52993dc06b-ovsdbserver-sb\") pod \"e921061b-2a0f-4b22-beb1-0d52993dc06b\" (UID: \"e921061b-2a0f-4b22-beb1-0d52993dc06b\") " Mar 09 13:41:04 crc kubenswrapper[4764]: I0309 13:41:04.392937 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e921061b-2a0f-4b22-beb1-0d52993dc06b-config\") pod \"e921061b-2a0f-4b22-beb1-0d52993dc06b\" (UID: \"e921061b-2a0f-4b22-beb1-0d52993dc06b\") " Mar 09 13:41:04 crc kubenswrapper[4764]: I0309 13:41:04.392988 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8lj9f\" (UniqueName: \"kubernetes.io/projected/34466abc-30eb-4a0c-b4ea-50b5ab368fa1-kube-api-access-8lj9f\") pod \"34466abc-30eb-4a0c-b4ea-50b5ab368fa1\" (UID: \"34466abc-30eb-4a0c-b4ea-50b5ab368fa1\") " Mar 09 13:41:04 crc kubenswrapper[4764]: I0309 13:41:04.393023 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e921061b-2a0f-4b22-beb1-0d52993dc06b-ovsdbserver-nb\") pod \"e921061b-2a0f-4b22-beb1-0d52993dc06b\" (UID: \"e921061b-2a0f-4b22-beb1-0d52993dc06b\") " Mar 09 13:41:04 crc kubenswrapper[4764]: I0309 13:41:04.393091 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e921061b-2a0f-4b22-beb1-0d52993dc06b-dns-svc\") pod \"e921061b-2a0f-4b22-beb1-0d52993dc06b\" (UID: \"e921061b-2a0f-4b22-beb1-0d52993dc06b\") " Mar 09 13:41:04 crc kubenswrapper[4764]: I0309 13:41:04.397464 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/34466abc-30eb-4a0c-b4ea-50b5ab368fa1-kube-api-access-8lj9f" (OuterVolumeSpecName: "kube-api-access-8lj9f") pod "34466abc-30eb-4a0c-b4ea-50b5ab368fa1" (UID: "34466abc-30eb-4a0c-b4ea-50b5ab368fa1"). InnerVolumeSpecName "kube-api-access-8lj9f". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:41:04 crc kubenswrapper[4764]: I0309 13:41:04.397325 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e921061b-2a0f-4b22-beb1-0d52993dc06b-kube-api-access-xkvzb" (OuterVolumeSpecName: "kube-api-access-xkvzb") pod "e921061b-2a0f-4b22-beb1-0d52993dc06b" (UID: "e921061b-2a0f-4b22-beb1-0d52993dc06b"). InnerVolumeSpecName "kube-api-access-xkvzb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:41:04 crc kubenswrapper[4764]: I0309 13:41:04.422623 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/34466abc-30eb-4a0c-b4ea-50b5ab368fa1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "34466abc-30eb-4a0c-b4ea-50b5ab368fa1" (UID: "34466abc-30eb-4a0c-b4ea-50b5ab368fa1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:41:04 crc kubenswrapper[4764]: I0309 13:41:04.434931 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/34466abc-30eb-4a0c-b4ea-50b5ab368fa1-config" (OuterVolumeSpecName: "config") pod "34466abc-30eb-4a0c-b4ea-50b5ab368fa1" (UID: "34466abc-30eb-4a0c-b4ea-50b5ab368fa1"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:41:04 crc kubenswrapper[4764]: I0309 13:41:04.443779 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e921061b-2a0f-4b22-beb1-0d52993dc06b-config" (OuterVolumeSpecName: "config") pod "e921061b-2a0f-4b22-beb1-0d52993dc06b" (UID: "e921061b-2a0f-4b22-beb1-0d52993dc06b"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:41:04 crc kubenswrapper[4764]: I0309 13:41:04.443864 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e921061b-2a0f-4b22-beb1-0d52993dc06b-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "e921061b-2a0f-4b22-beb1-0d52993dc06b" (UID: "e921061b-2a0f-4b22-beb1-0d52993dc06b"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:41:04 crc kubenswrapper[4764]: I0309 13:41:04.444421 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e921061b-2a0f-4b22-beb1-0d52993dc06b-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "e921061b-2a0f-4b22-beb1-0d52993dc06b" (UID: "e921061b-2a0f-4b22-beb1-0d52993dc06b"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:41:04 crc kubenswrapper[4764]: I0309 13:41:04.470242 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e921061b-2a0f-4b22-beb1-0d52993dc06b-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "e921061b-2a0f-4b22-beb1-0d52993dc06b" (UID: "e921061b-2a0f-4b22-beb1-0d52993dc06b"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:41:04 crc kubenswrapper[4764]: I0309 13:41:04.495028 4764 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e921061b-2a0f-4b22-beb1-0d52993dc06b-config\") on node \"crc\" DevicePath \"\"" Mar 09 13:41:04 crc kubenswrapper[4764]: I0309 13:41:04.495064 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8lj9f\" (UniqueName: \"kubernetes.io/projected/34466abc-30eb-4a0c-b4ea-50b5ab368fa1-kube-api-access-8lj9f\") on node \"crc\" DevicePath \"\"" Mar 09 13:41:04 crc kubenswrapper[4764]: I0309 13:41:04.495078 4764 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e921061b-2a0f-4b22-beb1-0d52993dc06b-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 09 13:41:04 crc kubenswrapper[4764]: I0309 13:41:04.495088 4764 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e921061b-2a0f-4b22-beb1-0d52993dc06b-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 09 13:41:04 crc kubenswrapper[4764]: I0309 13:41:04.495100 4764 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/34466abc-30eb-4a0c-b4ea-50b5ab368fa1-config\") on node \"crc\" DevicePath \"\"" Mar 09 13:41:04 crc kubenswrapper[4764]: I0309 13:41:04.495109 4764 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34466abc-30eb-4a0c-b4ea-50b5ab368fa1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 13:41:04 crc kubenswrapper[4764]: I0309 13:41:04.495117 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xkvzb\" (UniqueName: \"kubernetes.io/projected/e921061b-2a0f-4b22-beb1-0d52993dc06b-kube-api-access-xkvzb\") on node \"crc\" DevicePath \"\"" Mar 09 13:41:04 crc kubenswrapper[4764]: I0309 13:41:04.495125 4764 
reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e921061b-2a0f-4b22-beb1-0d52993dc06b-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 09 13:41:04 crc kubenswrapper[4764]: I0309 13:41:04.661804 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8554648995-p289h" event={"ID":"e921061b-2a0f-4b22-beb1-0d52993dc06b","Type":"ContainerDied","Data":"9c01a77a060dbab4de5d1ba1f06fcd3807020da1983c9df162b0099cb08b09d0"} Mar 09 13:41:04 crc kubenswrapper[4764]: I0309 13:41:04.661863 4764 scope.go:117] "RemoveContainer" containerID="fce1b5bc8a63a55a1345ac17564049805c448dc5638e52c8bc6cdc086dbbb27d" Mar 09 13:41:04 crc kubenswrapper[4764]: I0309 13:41:04.661967 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8554648995-p289h" Mar 09 13:41:04 crc kubenswrapper[4764]: I0309 13:41:04.664553 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-cmhtp" Mar 09 13:41:04 crc kubenswrapper[4764]: I0309 13:41:04.664566 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-cmhtp" event={"ID":"34466abc-30eb-4a0c-b4ea-50b5ab368fa1","Type":"ContainerDied","Data":"7aa42fe65ff3f50e964cbe57813dcd4f0b9049443a2784caec6019418fe3ac3d"} Mar 09 13:41:04 crc kubenswrapper[4764]: I0309 13:41:04.664680 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7aa42fe65ff3f50e964cbe57813dcd4f0b9049443a2784caec6019418fe3ac3d" Mar 09 13:41:04 crc kubenswrapper[4764]: I0309 13:41:04.667357 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-bnpcj" event={"ID":"1004910c-0db4-4e3d-aac5-358a557ee268","Type":"ContainerStarted","Data":"ec23e6a8c58e7f0bc7312e60b63b0c20314c347b59fda500de1bb63ca3c5e6c6"} Mar 09 13:41:04 crc kubenswrapper[4764]: I0309 13:41:04.670071 4764 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/keystone-bootstrap-9sj6m" event={"ID":"2a338463-1443-4863-830e-0621abc3ed15","Type":"ContainerStarted","Data":"bcb17cd274a1a85d7439c286f89b2351717de890e1de960b1f945d832ef377e9"} Mar 09 13:41:04 crc kubenswrapper[4764]: I0309 13:41:04.670111 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-9sj6m" event={"ID":"2a338463-1443-4863-830e-0621abc3ed15","Type":"ContainerStarted","Data":"38fa86d2f8bb849de0695f8ca7f22c2d7e73ace5965981aa2e9b4258c6a40283"} Mar 09 13:41:04 crc kubenswrapper[4764]: I0309 13:41:04.682705 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" event={"ID":"6bcdd179-43c2-427c-9fac-7155c122e922","Type":"ContainerStarted","Data":"69810f80271be58962525ba5a5a37ce68d5e172f7c253c440a5a45f3f3beab77"} Mar 09 13:41:04 crc kubenswrapper[4764]: I0309 13:41:04.688527 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"861cdd7d-b563-4009-9c33-a5c64d6ffae9","Type":"ContainerStarted","Data":"2486ace2409cc48f14efc281a2eed3fa9e4956d145642ec1061f64d96ea18a5f"} Mar 09 13:41:04 crc kubenswrapper[4764]: E0309 13:41:04.704041 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified\\\"\"" pod="openstack/cinder-db-sync-dp5x6" podUID="74146b7d-9780-4d2d-9454-853296f88955" Mar 09 13:41:04 crc kubenswrapper[4764]: I0309 13:41:04.704349 4764 scope.go:117] "RemoveContainer" containerID="7c314d6ef6d0a466bb5e22ea5e084d1cbdc6ed4e3e9ab6aee110b11fa331f5fb" Mar 09 13:41:04 crc kubenswrapper[4764]: I0309 13:41:04.707769 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-bnpcj" podStartSLOduration=3.192289793 podStartE2EDuration="23.707681248s" 
podCreationTimestamp="2026-03-09 13:40:41 +0000 UTC" firstStartedPulling="2026-03-09 13:40:43.392378153 +0000 UTC m=+1198.642550061" lastFinishedPulling="2026-03-09 13:41:03.907769608 +0000 UTC m=+1219.157941516" observedRunningTime="2026-03-09 13:41:04.702613631 +0000 UTC m=+1219.952785529" watchObservedRunningTime="2026-03-09 13:41:04.707681248 +0000 UTC m=+1219.957853156" Mar 09 13:41:04 crc kubenswrapper[4764]: I0309 13:41:04.724002 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-9sj6m" podStartSLOduration=11.723951326 podStartE2EDuration="11.723951326s" podCreationTimestamp="2026-03-09 13:40:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:41:04.722343166 +0000 UTC m=+1219.972515104" watchObservedRunningTime="2026-03-09 13:41:04.723951326 +0000 UTC m=+1219.974123244" Mar 09 13:41:04 crc kubenswrapper[4764]: I0309 13:41:04.772710 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8554648995-p289h"] Mar 09 13:41:04 crc kubenswrapper[4764]: I0309 13:41:04.781188 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-8554648995-p289h"] Mar 09 13:41:04 crc kubenswrapper[4764]: I0309 13:41:04.841699 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5f66db59b9-sh5rn"] Mar 09 13:41:04 crc kubenswrapper[4764]: E0309 13:41:04.842267 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e921061b-2a0f-4b22-beb1-0d52993dc06b" containerName="dnsmasq-dns" Mar 09 13:41:04 crc kubenswrapper[4764]: I0309 13:41:04.842288 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="e921061b-2a0f-4b22-beb1-0d52993dc06b" containerName="dnsmasq-dns" Mar 09 13:41:04 crc kubenswrapper[4764]: E0309 13:41:04.842306 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="34466abc-30eb-4a0c-b4ea-50b5ab368fa1" 
containerName="neutron-db-sync" Mar 09 13:41:04 crc kubenswrapper[4764]: I0309 13:41:04.842313 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="34466abc-30eb-4a0c-b4ea-50b5ab368fa1" containerName="neutron-db-sync" Mar 09 13:41:04 crc kubenswrapper[4764]: E0309 13:41:04.842344 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e921061b-2a0f-4b22-beb1-0d52993dc06b" containerName="init" Mar 09 13:41:04 crc kubenswrapper[4764]: I0309 13:41:04.842351 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="e921061b-2a0f-4b22-beb1-0d52993dc06b" containerName="init" Mar 09 13:41:04 crc kubenswrapper[4764]: I0309 13:41:04.842529 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="34466abc-30eb-4a0c-b4ea-50b5ab368fa1" containerName="neutron-db-sync" Mar 09 13:41:04 crc kubenswrapper[4764]: I0309 13:41:04.842547 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="e921061b-2a0f-4b22-beb1-0d52993dc06b" containerName="dnsmasq-dns" Mar 09 13:41:04 crc kubenswrapper[4764]: I0309 13:41:04.844141 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5f66db59b9-sh5rn" Mar 09 13:41:04 crc kubenswrapper[4764]: I0309 13:41:04.854335 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5f66db59b9-sh5rn"] Mar 09 13:41:04 crc kubenswrapper[4764]: I0309 13:41:04.964282 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-6b74bc6bc6-vsxl5"] Mar 09 13:41:04 crc kubenswrapper[4764]: I0309 13:41:04.966583 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-6b74bc6bc6-vsxl5" Mar 09 13:41:04 crc kubenswrapper[4764]: I0309 13:41:04.970015 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Mar 09 13:41:04 crc kubenswrapper[4764]: I0309 13:41:04.970496 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Mar 09 13:41:04 crc kubenswrapper[4764]: I0309 13:41:04.970925 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-ovndbs" Mar 09 13:41:04 crc kubenswrapper[4764]: I0309 13:41:04.971223 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-fn2ft" Mar 09 13:41:04 crc kubenswrapper[4764]: I0309 13:41:04.979427 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-6b74bc6bc6-vsxl5"] Mar 09 13:41:05 crc kubenswrapper[4764]: I0309 13:41:05.006431 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5b2b268a-adc9-46ca-908a-d30ab8543059-config\") pod \"dnsmasq-dns-5f66db59b9-sh5rn\" (UID: \"5b2b268a-adc9-46ca-908a-d30ab8543059\") " pod="openstack/dnsmasq-dns-5f66db59b9-sh5rn" Mar 09 13:41:05 crc kubenswrapper[4764]: I0309 13:41:05.006494 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f92p7\" (UniqueName: \"kubernetes.io/projected/5b2b268a-adc9-46ca-908a-d30ab8543059-kube-api-access-f92p7\") pod \"dnsmasq-dns-5f66db59b9-sh5rn\" (UID: \"5b2b268a-adc9-46ca-908a-d30ab8543059\") " pod="openstack/dnsmasq-dns-5f66db59b9-sh5rn" Mar 09 13:41:05 crc kubenswrapper[4764]: I0309 13:41:05.006544 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5b2b268a-adc9-46ca-908a-d30ab8543059-ovsdbserver-sb\") pod 
\"dnsmasq-dns-5f66db59b9-sh5rn\" (UID: \"5b2b268a-adc9-46ca-908a-d30ab8543059\") " pod="openstack/dnsmasq-dns-5f66db59b9-sh5rn" Mar 09 13:41:05 crc kubenswrapper[4764]: I0309 13:41:05.006564 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5b2b268a-adc9-46ca-908a-d30ab8543059-dns-svc\") pod \"dnsmasq-dns-5f66db59b9-sh5rn\" (UID: \"5b2b268a-adc9-46ca-908a-d30ab8543059\") " pod="openstack/dnsmasq-dns-5f66db59b9-sh5rn" Mar 09 13:41:05 crc kubenswrapper[4764]: I0309 13:41:05.006614 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5b2b268a-adc9-46ca-908a-d30ab8543059-ovsdbserver-nb\") pod \"dnsmasq-dns-5f66db59b9-sh5rn\" (UID: \"5b2b268a-adc9-46ca-908a-d30ab8543059\") " pod="openstack/dnsmasq-dns-5f66db59b9-sh5rn" Mar 09 13:41:05 crc kubenswrapper[4764]: I0309 13:41:05.108494 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xnqrq\" (UniqueName: \"kubernetes.io/projected/50610296-d076-4c9f-ac34-a976202ce135-kube-api-access-xnqrq\") pod \"neutron-6b74bc6bc6-vsxl5\" (UID: \"50610296-d076-4c9f-ac34-a976202ce135\") " pod="openstack/neutron-6b74bc6bc6-vsxl5" Mar 09 13:41:05 crc kubenswrapper[4764]: I0309 13:41:05.108613 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/50610296-d076-4c9f-ac34-a976202ce135-httpd-config\") pod \"neutron-6b74bc6bc6-vsxl5\" (UID: \"50610296-d076-4c9f-ac34-a976202ce135\") " pod="openstack/neutron-6b74bc6bc6-vsxl5" Mar 09 13:41:05 crc kubenswrapper[4764]: I0309 13:41:05.108666 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/50610296-d076-4c9f-ac34-a976202ce135-combined-ca-bundle\") pod \"neutron-6b74bc6bc6-vsxl5\" (UID: \"50610296-d076-4c9f-ac34-a976202ce135\") " pod="openstack/neutron-6b74bc6bc6-vsxl5"
Mar 09 13:41:05 crc kubenswrapper[4764]: I0309 13:41:05.108699 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5b2b268a-adc9-46ca-908a-d30ab8543059-config\") pod \"dnsmasq-dns-5f66db59b9-sh5rn\" (UID: \"5b2b268a-adc9-46ca-908a-d30ab8543059\") " pod="openstack/dnsmasq-dns-5f66db59b9-sh5rn"
Mar 09 13:41:05 crc kubenswrapper[4764]: I0309 13:41:05.108728 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f92p7\" (UniqueName: \"kubernetes.io/projected/5b2b268a-adc9-46ca-908a-d30ab8543059-kube-api-access-f92p7\") pod \"dnsmasq-dns-5f66db59b9-sh5rn\" (UID: \"5b2b268a-adc9-46ca-908a-d30ab8543059\") " pod="openstack/dnsmasq-dns-5f66db59b9-sh5rn"
Mar 09 13:41:05 crc kubenswrapper[4764]: I0309 13:41:05.108762 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5b2b268a-adc9-46ca-908a-d30ab8543059-ovsdbserver-sb\") pod \"dnsmasq-dns-5f66db59b9-sh5rn\" (UID: \"5b2b268a-adc9-46ca-908a-d30ab8543059\") " pod="openstack/dnsmasq-dns-5f66db59b9-sh5rn"
Mar 09 13:41:05 crc kubenswrapper[4764]: I0309 13:41:05.108781 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5b2b268a-adc9-46ca-908a-d30ab8543059-dns-svc\") pod \"dnsmasq-dns-5f66db59b9-sh5rn\" (UID: \"5b2b268a-adc9-46ca-908a-d30ab8543059\") " pod="openstack/dnsmasq-dns-5f66db59b9-sh5rn"
Mar 09 13:41:05 crc kubenswrapper[4764]: I0309 13:41:05.108809 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/50610296-d076-4c9f-ac34-a976202ce135-ovndb-tls-certs\") pod \"neutron-6b74bc6bc6-vsxl5\" (UID: \"50610296-d076-4c9f-ac34-a976202ce135\") " pod="openstack/neutron-6b74bc6bc6-vsxl5"
Mar 09 13:41:05 crc kubenswrapper[4764]: I0309 13:41:05.108835 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5b2b268a-adc9-46ca-908a-d30ab8543059-ovsdbserver-nb\") pod \"dnsmasq-dns-5f66db59b9-sh5rn\" (UID: \"5b2b268a-adc9-46ca-908a-d30ab8543059\") " pod="openstack/dnsmasq-dns-5f66db59b9-sh5rn"
Mar 09 13:41:05 crc kubenswrapper[4764]: I0309 13:41:05.108857 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/50610296-d076-4c9f-ac34-a976202ce135-config\") pod \"neutron-6b74bc6bc6-vsxl5\" (UID: \"50610296-d076-4c9f-ac34-a976202ce135\") " pod="openstack/neutron-6b74bc6bc6-vsxl5"
Mar 09 13:41:05 crc kubenswrapper[4764]: I0309 13:41:05.110005 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5b2b268a-adc9-46ca-908a-d30ab8543059-ovsdbserver-sb\") pod \"dnsmasq-dns-5f66db59b9-sh5rn\" (UID: \"5b2b268a-adc9-46ca-908a-d30ab8543059\") " pod="openstack/dnsmasq-dns-5f66db59b9-sh5rn"
Mar 09 13:41:05 crc kubenswrapper[4764]: I0309 13:41:05.110232 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5b2b268a-adc9-46ca-908a-d30ab8543059-dns-svc\") pod \"dnsmasq-dns-5f66db59b9-sh5rn\" (UID: \"5b2b268a-adc9-46ca-908a-d30ab8543059\") " pod="openstack/dnsmasq-dns-5f66db59b9-sh5rn"
Mar 09 13:41:05 crc kubenswrapper[4764]: I0309 13:41:05.110853 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5b2b268a-adc9-46ca-908a-d30ab8543059-config\") pod \"dnsmasq-dns-5f66db59b9-sh5rn\" (UID: \"5b2b268a-adc9-46ca-908a-d30ab8543059\") " pod="openstack/dnsmasq-dns-5f66db59b9-sh5rn"
Mar 09 13:41:05 crc kubenswrapper[4764]: I0309 13:41:05.111269 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5b2b268a-adc9-46ca-908a-d30ab8543059-ovsdbserver-nb\") pod \"dnsmasq-dns-5f66db59b9-sh5rn\" (UID: \"5b2b268a-adc9-46ca-908a-d30ab8543059\") " pod="openstack/dnsmasq-dns-5f66db59b9-sh5rn"
Mar 09 13:41:05 crc kubenswrapper[4764]: I0309 13:41:05.146229 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f92p7\" (UniqueName: \"kubernetes.io/projected/5b2b268a-adc9-46ca-908a-d30ab8543059-kube-api-access-f92p7\") pod \"dnsmasq-dns-5f66db59b9-sh5rn\" (UID: \"5b2b268a-adc9-46ca-908a-d30ab8543059\") " pod="openstack/dnsmasq-dns-5f66db59b9-sh5rn"
Mar 09 13:41:05 crc kubenswrapper[4764]: I0309 13:41:05.183494 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5f66db59b9-sh5rn"
Mar 09 13:41:05 crc kubenswrapper[4764]: I0309 13:41:05.210038 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/50610296-d076-4c9f-ac34-a976202ce135-ovndb-tls-certs\") pod \"neutron-6b74bc6bc6-vsxl5\" (UID: \"50610296-d076-4c9f-ac34-a976202ce135\") " pod="openstack/neutron-6b74bc6bc6-vsxl5"
Mar 09 13:41:05 crc kubenswrapper[4764]: I0309 13:41:05.210092 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/50610296-d076-4c9f-ac34-a976202ce135-config\") pod \"neutron-6b74bc6bc6-vsxl5\" (UID: \"50610296-d076-4c9f-ac34-a976202ce135\") " pod="openstack/neutron-6b74bc6bc6-vsxl5"
Mar 09 13:41:05 crc kubenswrapper[4764]: I0309 13:41:05.210143 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xnqrq\" (UniqueName: \"kubernetes.io/projected/50610296-d076-4c9f-ac34-a976202ce135-kube-api-access-xnqrq\") pod \"neutron-6b74bc6bc6-vsxl5\" (UID: \"50610296-d076-4c9f-ac34-a976202ce135\") " pod="openstack/neutron-6b74bc6bc6-vsxl5"
Mar 09 13:41:05 crc kubenswrapper[4764]: I0309 13:41:05.210188 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/50610296-d076-4c9f-ac34-a976202ce135-httpd-config\") pod \"neutron-6b74bc6bc6-vsxl5\" (UID: \"50610296-d076-4c9f-ac34-a976202ce135\") " pod="openstack/neutron-6b74bc6bc6-vsxl5"
Mar 09 13:41:05 crc kubenswrapper[4764]: I0309 13:41:05.210230 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/50610296-d076-4c9f-ac34-a976202ce135-combined-ca-bundle\") pod \"neutron-6b74bc6bc6-vsxl5\" (UID: \"50610296-d076-4c9f-ac34-a976202ce135\") " pod="openstack/neutron-6b74bc6bc6-vsxl5"
Mar 09 13:41:05 crc kubenswrapper[4764]: I0309 13:41:05.218577 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/50610296-d076-4c9f-ac34-a976202ce135-combined-ca-bundle\") pod \"neutron-6b74bc6bc6-vsxl5\" (UID: \"50610296-d076-4c9f-ac34-a976202ce135\") " pod="openstack/neutron-6b74bc6bc6-vsxl5"
Mar 09 13:41:05 crc kubenswrapper[4764]: I0309 13:41:05.218576 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/50610296-d076-4c9f-ac34-a976202ce135-ovndb-tls-certs\") pod \"neutron-6b74bc6bc6-vsxl5\" (UID: \"50610296-d076-4c9f-ac34-a976202ce135\") " pod="openstack/neutron-6b74bc6bc6-vsxl5"
Mar 09 13:41:05 crc kubenswrapper[4764]: I0309 13:41:05.218983 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/50610296-d076-4c9f-ac34-a976202ce135-config\") pod \"neutron-6b74bc6bc6-vsxl5\" (UID: \"50610296-d076-4c9f-ac34-a976202ce135\") " pod="openstack/neutron-6b74bc6bc6-vsxl5"
Mar 09 13:41:05 crc kubenswrapper[4764]: I0309 13:41:05.220009 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/50610296-d076-4c9f-ac34-a976202ce135-httpd-config\") pod \"neutron-6b74bc6bc6-vsxl5\" (UID: \"50610296-d076-4c9f-ac34-a976202ce135\") " pod="openstack/neutron-6b74bc6bc6-vsxl5"
Mar 09 13:41:05 crc kubenswrapper[4764]: I0309 13:41:05.239010 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xnqrq\" (UniqueName: \"kubernetes.io/projected/50610296-d076-4c9f-ac34-a976202ce135-kube-api-access-xnqrq\") pod \"neutron-6b74bc6bc6-vsxl5\" (UID: \"50610296-d076-4c9f-ac34-a976202ce135\") " pod="openstack/neutron-6b74bc6bc6-vsxl5"
Mar 09 13:41:05 crc kubenswrapper[4764]: I0309 13:41:05.287124 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-6b74bc6bc6-vsxl5"
Mar 09 13:41:05 crc kubenswrapper[4764]: I0309 13:41:05.596728 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e921061b-2a0f-4b22-beb1-0d52993dc06b" path="/var/lib/kubelet/pods/e921061b-2a0f-4b22-beb1-0d52993dc06b/volumes"
Mar 09 13:41:05 crc kubenswrapper[4764]: I0309 13:41:05.792327 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5f66db59b9-sh5rn"]
Mar 09 13:41:06 crc kubenswrapper[4764]: I0309 13:41:06.017226 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-6b74bc6bc6-vsxl5"]
Mar 09 13:41:06 crc kubenswrapper[4764]: I0309 13:41:06.739846 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6b74bc6bc6-vsxl5" event={"ID":"50610296-d076-4c9f-ac34-a976202ce135","Type":"ContainerStarted","Data":"508370eb92a6887d3b1ecf26300791f931d8f328026a6caf81b9a1ce10c18bed"}
Mar 09 13:41:06 crc kubenswrapper[4764]: I0309 13:41:06.741154 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6b74bc6bc6-vsxl5" event={"ID":"50610296-d076-4c9f-ac34-a976202ce135","Type":"ContainerStarted","Data":"bd4c896bc38b604cb19726769c37db30c4145f3642057a166913e3d7cfd24c8f"}
Mar 09 13:41:06 crc kubenswrapper[4764]: I0309 13:41:06.748594 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"861cdd7d-b563-4009-9c33-a5c64d6ffae9","Type":"ContainerStarted","Data":"4db2afcb94ef0a26dd49d9961d5346346b289d4dbcccce08d916a8fe7f319ea2"}
Mar 09 13:41:06 crc kubenswrapper[4764]: I0309 13:41:06.752782 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f66db59b9-sh5rn" event={"ID":"5b2b268a-adc9-46ca-908a-d30ab8543059","Type":"ContainerStarted","Data":"8419a9151f23a452d93428da7d46b2dd5d9147864269035236c144959618c6a4"}
Mar 09 13:41:06 crc kubenswrapper[4764]: I0309 13:41:06.753054 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f66db59b9-sh5rn" event={"ID":"5b2b268a-adc9-46ca-908a-d30ab8543059","Type":"ContainerStarted","Data":"d82d6336b352da3e8f6c37dd12f8ae1d4a1f5e3d8a23342826dc108286471929"}
Mar 09 13:41:07 crc kubenswrapper[4764]: I0309 13:41:07.216657 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-6c97985d69-khcvg"]
Mar 09 13:41:07 crc kubenswrapper[4764]: I0309 13:41:07.228634 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-6c97985d69-khcvg"
Mar 09 13:41:07 crc kubenswrapper[4764]: I0309 13:41:07.232280 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-public-svc"
Mar 09 13:41:07 crc kubenswrapper[4764]: I0309 13:41:07.238292 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-internal-svc"
Mar 09 13:41:07 crc kubenswrapper[4764]: I0309 13:41:07.240026 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-6c97985d69-khcvg"]
Mar 09 13:41:07 crc kubenswrapper[4764]: I0309 13:41:07.377068 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bnnlq\" (UniqueName: \"kubernetes.io/projected/5d65fa53-be02-4d11-b300-5cb4629c03da-kube-api-access-bnnlq\") pod \"neutron-6c97985d69-khcvg\" (UID: \"5d65fa53-be02-4d11-b300-5cb4629c03da\") " pod="openstack/neutron-6c97985d69-khcvg"
Mar 09 13:41:07 crc kubenswrapper[4764]: I0309 13:41:07.377131 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5d65fa53-be02-4d11-b300-5cb4629c03da-public-tls-certs\") pod \"neutron-6c97985d69-khcvg\" (UID: \"5d65fa53-be02-4d11-b300-5cb4629c03da\") " pod="openstack/neutron-6c97985d69-khcvg"
Mar 09 13:41:07 crc kubenswrapper[4764]: I0309 13:41:07.377339 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/5d65fa53-be02-4d11-b300-5cb4629c03da-config\") pod \"neutron-6c97985d69-khcvg\" (UID: \"5d65fa53-be02-4d11-b300-5cb4629c03da\") " pod="openstack/neutron-6c97985d69-khcvg"
Mar 09 13:41:07 crc kubenswrapper[4764]: I0309 13:41:07.377602 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/5d65fa53-be02-4d11-b300-5cb4629c03da-httpd-config\") pod \"neutron-6c97985d69-khcvg\" (UID: \"5d65fa53-be02-4d11-b300-5cb4629c03da\") " pod="openstack/neutron-6c97985d69-khcvg"
Mar 09 13:41:07 crc kubenswrapper[4764]: I0309 13:41:07.377636 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d65fa53-be02-4d11-b300-5cb4629c03da-combined-ca-bundle\") pod \"neutron-6c97985d69-khcvg\" (UID: \"5d65fa53-be02-4d11-b300-5cb4629c03da\") " pod="openstack/neutron-6c97985d69-khcvg"
Mar 09 13:41:07 crc kubenswrapper[4764]: I0309 13:41:07.377677 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/5d65fa53-be02-4d11-b300-5cb4629c03da-ovndb-tls-certs\") pod \"neutron-6c97985d69-khcvg\" (UID: \"5d65fa53-be02-4d11-b300-5cb4629c03da\") " pod="openstack/neutron-6c97985d69-khcvg"
Mar 09 13:41:07 crc kubenswrapper[4764]: I0309 13:41:07.377730 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5d65fa53-be02-4d11-b300-5cb4629c03da-internal-tls-certs\") pod \"neutron-6c97985d69-khcvg\" (UID: \"5d65fa53-be02-4d11-b300-5cb4629c03da\") " pod="openstack/neutron-6c97985d69-khcvg"
Mar 09 13:41:07 crc kubenswrapper[4764]: I0309 13:41:07.436834 4764 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-8554648995-p289h" podUID="e921061b-2a0f-4b22-beb1-0d52993dc06b" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.113:5353: i/o timeout"
Mar 09 13:41:07 crc kubenswrapper[4764]: I0309 13:41:07.480042 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/5d65fa53-be02-4d11-b300-5cb4629c03da-config\") pod \"neutron-6c97985d69-khcvg\" (UID: \"5d65fa53-be02-4d11-b300-5cb4629c03da\") " pod="openstack/neutron-6c97985d69-khcvg"
Mar 09 13:41:07 crc kubenswrapper[4764]: I0309 13:41:07.480182 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/5d65fa53-be02-4d11-b300-5cb4629c03da-httpd-config\") pod \"neutron-6c97985d69-khcvg\" (UID: \"5d65fa53-be02-4d11-b300-5cb4629c03da\") " pod="openstack/neutron-6c97985d69-khcvg"
Mar 09 13:41:07 crc kubenswrapper[4764]: I0309 13:41:07.480258 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d65fa53-be02-4d11-b300-5cb4629c03da-combined-ca-bundle\") pod \"neutron-6c97985d69-khcvg\" (UID: \"5d65fa53-be02-4d11-b300-5cb4629c03da\") " pod="openstack/neutron-6c97985d69-khcvg"
Mar 09 13:41:07 crc kubenswrapper[4764]: I0309 13:41:07.480294 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/5d65fa53-be02-4d11-b300-5cb4629c03da-ovndb-tls-certs\") pod \"neutron-6c97985d69-khcvg\" (UID: \"5d65fa53-be02-4d11-b300-5cb4629c03da\") " pod="openstack/neutron-6c97985d69-khcvg"
Mar 09 13:41:07 crc kubenswrapper[4764]: I0309 13:41:07.480337 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5d65fa53-be02-4d11-b300-5cb4629c03da-internal-tls-certs\") pod \"neutron-6c97985d69-khcvg\" (UID: \"5d65fa53-be02-4d11-b300-5cb4629c03da\") " pod="openstack/neutron-6c97985d69-khcvg"
Mar 09 13:41:07 crc kubenswrapper[4764]: I0309 13:41:07.480574 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bnnlq\" (UniqueName: \"kubernetes.io/projected/5d65fa53-be02-4d11-b300-5cb4629c03da-kube-api-access-bnnlq\") pod \"neutron-6c97985d69-khcvg\" (UID: \"5d65fa53-be02-4d11-b300-5cb4629c03da\") " pod="openstack/neutron-6c97985d69-khcvg"
Mar 09 13:41:07 crc kubenswrapper[4764]: I0309 13:41:07.480629 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5d65fa53-be02-4d11-b300-5cb4629c03da-public-tls-certs\") pod \"neutron-6c97985d69-khcvg\" (UID: \"5d65fa53-be02-4d11-b300-5cb4629c03da\") " pod="openstack/neutron-6c97985d69-khcvg"
Mar 09 13:41:07 crc kubenswrapper[4764]: I0309 13:41:07.488735 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/5d65fa53-be02-4d11-b300-5cb4629c03da-config\") pod \"neutron-6c97985d69-khcvg\" (UID: \"5d65fa53-be02-4d11-b300-5cb4629c03da\") " pod="openstack/neutron-6c97985d69-khcvg"
Mar 09 13:41:07 crc kubenswrapper[4764]: I0309 13:41:07.490503 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d65fa53-be02-4d11-b300-5cb4629c03da-combined-ca-bundle\") pod \"neutron-6c97985d69-khcvg\" (UID: \"5d65fa53-be02-4d11-b300-5cb4629c03da\") " pod="openstack/neutron-6c97985d69-khcvg"
Mar 09 13:41:07 crc kubenswrapper[4764]: I0309 13:41:07.496502 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5d65fa53-be02-4d11-b300-5cb4629c03da-public-tls-certs\") pod \"neutron-6c97985d69-khcvg\" (UID: \"5d65fa53-be02-4d11-b300-5cb4629c03da\") " pod="openstack/neutron-6c97985d69-khcvg"
Mar 09 13:41:07 crc kubenswrapper[4764]: I0309 13:41:07.498345 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/5d65fa53-be02-4d11-b300-5cb4629c03da-httpd-config\") pod \"neutron-6c97985d69-khcvg\" (UID: \"5d65fa53-be02-4d11-b300-5cb4629c03da\") " pod="openstack/neutron-6c97985d69-khcvg"
Mar 09 13:41:07 crc kubenswrapper[4764]: I0309 13:41:07.506584 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bnnlq\" (UniqueName: \"kubernetes.io/projected/5d65fa53-be02-4d11-b300-5cb4629c03da-kube-api-access-bnnlq\") pod \"neutron-6c97985d69-khcvg\" (UID: \"5d65fa53-be02-4d11-b300-5cb4629c03da\") " pod="openstack/neutron-6c97985d69-khcvg"
Mar 09 13:41:07 crc kubenswrapper[4764]: I0309 13:41:07.509142 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5d65fa53-be02-4d11-b300-5cb4629c03da-internal-tls-certs\") pod \"neutron-6c97985d69-khcvg\" (UID: \"5d65fa53-be02-4d11-b300-5cb4629c03da\") " pod="openstack/neutron-6c97985d69-khcvg"
Mar 09 13:41:07 crc kubenswrapper[4764]: I0309 13:41:07.518086 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/5d65fa53-be02-4d11-b300-5cb4629c03da-ovndb-tls-certs\") pod \"neutron-6c97985d69-khcvg\" (UID: \"5d65fa53-be02-4d11-b300-5cb4629c03da\") " pod="openstack/neutron-6c97985d69-khcvg"
Mar 09 13:41:07 crc kubenswrapper[4764]: I0309 13:41:07.551316 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-6c97985d69-khcvg"
Mar 09 13:41:07 crc kubenswrapper[4764]: I0309 13:41:07.764557 4764 generic.go:334] "Generic (PLEG): container finished" podID="1004910c-0db4-4e3d-aac5-358a557ee268" containerID="ec23e6a8c58e7f0bc7312e60b63b0c20314c347b59fda500de1bb63ca3c5e6c6" exitCode=0
Mar 09 13:41:07 crc kubenswrapper[4764]: I0309 13:41:07.764672 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-bnpcj" event={"ID":"1004910c-0db4-4e3d-aac5-358a557ee268","Type":"ContainerDied","Data":"ec23e6a8c58e7f0bc7312e60b63b0c20314c347b59fda500de1bb63ca3c5e6c6"}
Mar 09 13:41:07 crc kubenswrapper[4764]: I0309 13:41:07.778425 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6b74bc6bc6-vsxl5" event={"ID":"50610296-d076-4c9f-ac34-a976202ce135","Type":"ContainerStarted","Data":"40fc71094cc1ee70bba232001951743f253a32a95993837575a49f9b6e385df9"}
Mar 09 13:41:07 crc kubenswrapper[4764]: I0309 13:41:07.779470 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-6b74bc6bc6-vsxl5"
Mar 09 13:41:07 crc kubenswrapper[4764]: I0309 13:41:07.810174 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-6b74bc6bc6-vsxl5" podStartSLOduration=3.810140571 podStartE2EDuration="3.810140571s" podCreationTimestamp="2026-03-09 13:41:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:41:07.809781502 +0000 UTC m=+1223.059953420" watchObservedRunningTime="2026-03-09 13:41:07.810140571 +0000 UTC m=+1223.060312479"
Mar 09 13:41:07 crc kubenswrapper[4764]: I0309 13:41:07.810799 4764 generic.go:334] "Generic (PLEG): container finished" podID="5b2b268a-adc9-46ca-908a-d30ab8543059" containerID="8419a9151f23a452d93428da7d46b2dd5d9147864269035236c144959618c6a4" exitCode=0
Mar 09 13:41:07 crc kubenswrapper[4764]: I0309 13:41:07.810941 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f66db59b9-sh5rn" event={"ID":"5b2b268a-adc9-46ca-908a-d30ab8543059","Type":"ContainerDied","Data":"8419a9151f23a452d93428da7d46b2dd5d9147864269035236c144959618c6a4"}
Mar 09 13:41:07 crc kubenswrapper[4764]: I0309 13:41:07.810977 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f66db59b9-sh5rn" event={"ID":"5b2b268a-adc9-46ca-908a-d30ab8543059","Type":"ContainerStarted","Data":"20dab72734e5903795d478bac44970ac57f0690f8f454dee9dae223fbd3e92ac"}
Mar 09 13:41:07 crc kubenswrapper[4764]: I0309 13:41:07.812370 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5f66db59b9-sh5rn"
Mar 09 13:41:07 crc kubenswrapper[4764]: I0309 13:41:07.914685 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5f66db59b9-sh5rn" podStartSLOduration=3.9146523 podStartE2EDuration="3.9146523s" podCreationTimestamp="2026-03-09 13:41:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:41:07.859897658 +0000 UTC m=+1223.110069566" watchObservedRunningTime="2026-03-09 13:41:07.9146523 +0000 UTC m=+1223.164824208"
Mar 09 13:41:08 crc kubenswrapper[4764]: I0309 13:41:08.183315 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-6c97985d69-khcvg"]
Mar 09 13:41:08 crc kubenswrapper[4764]: I0309 13:41:08.820962 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6c97985d69-khcvg" event={"ID":"5d65fa53-be02-4d11-b300-5cb4629c03da","Type":"ContainerStarted","Data":"63c3a5316fc5ec3fc301ebe753725e716ab876289ed6944f416ebd4b78894abb"}
Mar 09 13:41:08 crc kubenswrapper[4764]: I0309 13:41:08.824007 4764 generic.go:334] "Generic (PLEG): container finished" podID="2a338463-1443-4863-830e-0621abc3ed15" containerID="bcb17cd274a1a85d7439c286f89b2351717de890e1de960b1f945d832ef377e9" exitCode=0
Mar 09 13:41:08 crc kubenswrapper[4764]: I0309 13:41:08.824064 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-9sj6m" event={"ID":"2a338463-1443-4863-830e-0621abc3ed15","Type":"ContainerDied","Data":"bcb17cd274a1a85d7439c286f89b2351717de890e1de960b1f945d832ef377e9"}
Mar 09 13:41:09 crc kubenswrapper[4764]: I0309 13:41:09.298483 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-bnpcj"
Mar 09 13:41:09 crc kubenswrapper[4764]: I0309 13:41:09.426954 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v9qxl\" (UniqueName: \"kubernetes.io/projected/1004910c-0db4-4e3d-aac5-358a557ee268-kube-api-access-v9qxl\") pod \"1004910c-0db4-4e3d-aac5-358a557ee268\" (UID: \"1004910c-0db4-4e3d-aac5-358a557ee268\") "
Mar 09 13:41:09 crc kubenswrapper[4764]: I0309 13:41:09.427225 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1004910c-0db4-4e3d-aac5-358a557ee268-logs\") pod \"1004910c-0db4-4e3d-aac5-358a557ee268\" (UID: \"1004910c-0db4-4e3d-aac5-358a557ee268\") "
Mar 09 13:41:09 crc kubenswrapper[4764]: I0309 13:41:09.427259 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1004910c-0db4-4e3d-aac5-358a557ee268-scripts\") pod \"1004910c-0db4-4e3d-aac5-358a557ee268\" (UID: \"1004910c-0db4-4e3d-aac5-358a557ee268\") "
Mar 09 13:41:09 crc kubenswrapper[4764]: I0309 13:41:09.427294 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1004910c-0db4-4e3d-aac5-358a557ee268-config-data\") pod \"1004910c-0db4-4e3d-aac5-358a557ee268\" (UID: \"1004910c-0db4-4e3d-aac5-358a557ee268\") "
Mar 09 13:41:09 crc kubenswrapper[4764]: I0309 13:41:09.427316 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1004910c-0db4-4e3d-aac5-358a557ee268-combined-ca-bundle\") pod \"1004910c-0db4-4e3d-aac5-358a557ee268\" (UID: \"1004910c-0db4-4e3d-aac5-358a557ee268\") "
Mar 09 13:41:09 crc kubenswrapper[4764]: I0309 13:41:09.427912 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1004910c-0db4-4e3d-aac5-358a557ee268-logs" (OuterVolumeSpecName: "logs") pod "1004910c-0db4-4e3d-aac5-358a557ee268" (UID: "1004910c-0db4-4e3d-aac5-358a557ee268"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 09 13:41:09 crc kubenswrapper[4764]: I0309 13:41:09.437236 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1004910c-0db4-4e3d-aac5-358a557ee268-scripts" (OuterVolumeSpecName: "scripts") pod "1004910c-0db4-4e3d-aac5-358a557ee268" (UID: "1004910c-0db4-4e3d-aac5-358a557ee268"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 13:41:09 crc kubenswrapper[4764]: I0309 13:41:09.456159 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1004910c-0db4-4e3d-aac5-358a557ee268-kube-api-access-v9qxl" (OuterVolumeSpecName: "kube-api-access-v9qxl") pod "1004910c-0db4-4e3d-aac5-358a557ee268" (UID: "1004910c-0db4-4e3d-aac5-358a557ee268"). InnerVolumeSpecName "kube-api-access-v9qxl". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 13:41:09 crc kubenswrapper[4764]: I0309 13:41:09.460447 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1004910c-0db4-4e3d-aac5-358a557ee268-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1004910c-0db4-4e3d-aac5-358a557ee268" (UID: "1004910c-0db4-4e3d-aac5-358a557ee268"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 13:41:09 crc kubenswrapper[4764]: I0309 13:41:09.461974 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1004910c-0db4-4e3d-aac5-358a557ee268-config-data" (OuterVolumeSpecName: "config-data") pod "1004910c-0db4-4e3d-aac5-358a557ee268" (UID: "1004910c-0db4-4e3d-aac5-358a557ee268"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 13:41:09 crc kubenswrapper[4764]: I0309 13:41:09.529690 4764 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1004910c-0db4-4e3d-aac5-358a557ee268-logs\") on node \"crc\" DevicePath \"\""
Mar 09 13:41:09 crc kubenswrapper[4764]: I0309 13:41:09.529733 4764 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1004910c-0db4-4e3d-aac5-358a557ee268-scripts\") on node \"crc\" DevicePath \"\""
Mar 09 13:41:09 crc kubenswrapper[4764]: I0309 13:41:09.529746 4764 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1004910c-0db4-4e3d-aac5-358a557ee268-config-data\") on node \"crc\" DevicePath \"\""
Mar 09 13:41:09 crc kubenswrapper[4764]: I0309 13:41:09.529760 4764 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1004910c-0db4-4e3d-aac5-358a557ee268-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 09 13:41:09 crc kubenswrapper[4764]: I0309 13:41:09.529776 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v9qxl\" (UniqueName: \"kubernetes.io/projected/1004910c-0db4-4e3d-aac5-358a557ee268-kube-api-access-v9qxl\") on node \"crc\" DevicePath \"\""
Mar 09 13:41:09 crc kubenswrapper[4764]: I0309 13:41:09.859675 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-bnpcj" event={"ID":"1004910c-0db4-4e3d-aac5-358a557ee268","Type":"ContainerDied","Data":"4a62e7bc40a3d288cb2716e66a0b772309aa68a835605c18dfc2a81767e2ae07"}
Mar 09 13:41:09 crc kubenswrapper[4764]: I0309 13:41:09.859733 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4a62e7bc40a3d288cb2716e66a0b772309aa68a835605c18dfc2a81767e2ae07"
Mar 09 13:41:09 crc kubenswrapper[4764]: I0309 13:41:09.859770 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-bnpcj"
Mar 09 13:41:09 crc kubenswrapper[4764]: I0309 13:41:09.871559 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6c97985d69-khcvg" event={"ID":"5d65fa53-be02-4d11-b300-5cb4629c03da","Type":"ContainerStarted","Data":"af86c5d555559ee78e3558628ecf704c4c51493754120be1f860b79edb606896"}
Mar 09 13:41:09 crc kubenswrapper[4764]: I0309 13:41:09.900349 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-586d68b4fd-xj4tk"]
Mar 09 13:41:09 crc kubenswrapper[4764]: E0309 13:41:09.900841 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1004910c-0db4-4e3d-aac5-358a557ee268" containerName="placement-db-sync"
Mar 09 13:41:09 crc kubenswrapper[4764]: I0309 13:41:09.900862 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="1004910c-0db4-4e3d-aac5-358a557ee268" containerName="placement-db-sync"
Mar 09 13:41:09 crc kubenswrapper[4764]: I0309 13:41:09.901336 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="1004910c-0db4-4e3d-aac5-358a557ee268" containerName="placement-db-sync"
Mar 09 13:41:09 crc kubenswrapper[4764]: I0309 13:41:09.902270 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-586d68b4fd-xj4tk"
Mar 09 13:41:09 crc kubenswrapper[4764]: I0309 13:41:09.908132 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-public-svc"
Mar 09 13:41:09 crc kubenswrapper[4764]: I0309 13:41:09.913828 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data"
Mar 09 13:41:09 crc kubenswrapper[4764]: I0309 13:41:09.914183 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts"
Mar 09 13:41:09 crc kubenswrapper[4764]: I0309 13:41:09.914431 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-l4tqv"
Mar 09 13:41:09 crc kubenswrapper[4764]: I0309 13:41:09.914600 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-internal-svc"
Mar 09 13:41:09 crc kubenswrapper[4764]: I0309 13:41:09.926478 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-586d68b4fd-xj4tk"]
Mar 09 13:41:10 crc kubenswrapper[4764]: I0309 13:41:10.040373 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e89507a7-04a1-444b-b38a-40b001ec079a-config-data\") pod \"placement-586d68b4fd-xj4tk\" (UID: \"e89507a7-04a1-444b-b38a-40b001ec079a\") " pod="openstack/placement-586d68b4fd-xj4tk"
Mar 09 13:41:10 crc kubenswrapper[4764]: I0309 13:41:10.040508 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e89507a7-04a1-444b-b38a-40b001ec079a-combined-ca-bundle\") pod \"placement-586d68b4fd-xj4tk\" (UID: \"e89507a7-04a1-444b-b38a-40b001ec079a\") " pod="openstack/placement-586d68b4fd-xj4tk"
Mar 09 13:41:10 crc kubenswrapper[4764]: I0309 13:41:10.040579 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e89507a7-04a1-444b-b38a-40b001ec079a-internal-tls-certs\") pod \"placement-586d68b4fd-xj4tk\" (UID: \"e89507a7-04a1-444b-b38a-40b001ec079a\") " pod="openstack/placement-586d68b4fd-xj4tk"
Mar 09 13:41:10 crc kubenswrapper[4764]: I0309 13:41:10.040611 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e89507a7-04a1-444b-b38a-40b001ec079a-public-tls-certs\") pod \"placement-586d68b4fd-xj4tk\" (UID: \"e89507a7-04a1-444b-b38a-40b001ec079a\") " pod="openstack/placement-586d68b4fd-xj4tk"
Mar 09 13:41:10 crc kubenswrapper[4764]: I0309 13:41:10.040769 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e89507a7-04a1-444b-b38a-40b001ec079a-scripts\") pod \"placement-586d68b4fd-xj4tk\" (UID: \"e89507a7-04a1-444b-b38a-40b001ec079a\") " pod="openstack/placement-586d68b4fd-xj4tk"
Mar 09 13:41:10 crc kubenswrapper[4764]: I0309 13:41:10.040799 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j8mnp\" (UniqueName: \"kubernetes.io/projected/e89507a7-04a1-444b-b38a-40b001ec079a-kube-api-access-j8mnp\") pod \"placement-586d68b4fd-xj4tk\" (UID: \"e89507a7-04a1-444b-b38a-40b001ec079a\") " pod="openstack/placement-586d68b4fd-xj4tk"
Mar 09 13:41:10 crc kubenswrapper[4764]: I0309 13:41:10.041044 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e89507a7-04a1-444b-b38a-40b001ec079a-logs\") pod \"placement-586d68b4fd-xj4tk\" (UID: \"e89507a7-04a1-444b-b38a-40b001ec079a\") " pod="openstack/placement-586d68b4fd-xj4tk"
Mar 09 13:41:10 crc kubenswrapper[4764]: I0309 13:41:10.145240 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e89507a7-04a1-444b-b38a-40b001ec079a-config-data\") pod \"placement-586d68b4fd-xj4tk\" (UID: \"e89507a7-04a1-444b-b38a-40b001ec079a\") " pod="openstack/placement-586d68b4fd-xj4tk"
Mar 09 13:41:10 crc kubenswrapper[4764]: I0309 13:41:10.145357 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e89507a7-04a1-444b-b38a-40b001ec079a-combined-ca-bundle\") pod \"placement-586d68b4fd-xj4tk\" (UID: \"e89507a7-04a1-444b-b38a-40b001ec079a\") " pod="openstack/placement-586d68b4fd-xj4tk"
Mar 09 13:41:10 crc kubenswrapper[4764]: I0309 13:41:10.145408 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e89507a7-04a1-444b-b38a-40b001ec079a-internal-tls-certs\") pod \"placement-586d68b4fd-xj4tk\" (UID: \"e89507a7-04a1-444b-b38a-40b001ec079a\") " pod="openstack/placement-586d68b4fd-xj4tk"
Mar 09 13:41:10 crc kubenswrapper[4764]: I0309 13:41:10.145433 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e89507a7-04a1-444b-b38a-40b001ec079a-public-tls-certs\") pod \"placement-586d68b4fd-xj4tk\" (UID: \"e89507a7-04a1-444b-b38a-40b001ec079a\") " pod="openstack/placement-586d68b4fd-xj4tk"
Mar 09 13:41:10 crc kubenswrapper[4764]: I0309 13:41:10.145460 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e89507a7-04a1-444b-b38a-40b001ec079a-scripts\") pod \"placement-586d68b4fd-xj4tk\" (UID: \"e89507a7-04a1-444b-b38a-40b001ec079a\") " pod="openstack/placement-586d68b4fd-xj4tk"
Mar 09 13:41:10 crc kubenswrapper[4764]: I0309 13:41:10.145757 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j8mnp\" (UniqueName: \"kubernetes.io/projected/e89507a7-04a1-444b-b38a-40b001ec079a-kube-api-access-j8mnp\") pod \"placement-586d68b4fd-xj4tk\" (UID: \"e89507a7-04a1-444b-b38a-40b001ec079a\") " pod="openstack/placement-586d68b4fd-xj4tk"
Mar 09 13:41:10 crc kubenswrapper[4764]: I0309 13:41:10.145811 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e89507a7-04a1-444b-b38a-40b001ec079a-logs\") pod \"placement-586d68b4fd-xj4tk\" (UID: \"e89507a7-04a1-444b-b38a-40b001ec079a\") " pod="openstack/placement-586d68b4fd-xj4tk"
Mar 09 13:41:10 crc kubenswrapper[4764]: I0309 13:41:10.150599 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e89507a7-04a1-444b-b38a-40b001ec079a-logs\") pod \"placement-586d68b4fd-xj4tk\" (UID: \"e89507a7-04a1-444b-b38a-40b001ec079a\") " pod="openstack/placement-586d68b4fd-xj4tk"
Mar 09 13:41:10 crc kubenswrapper[4764]: I0309 13:41:10.151499 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e89507a7-04a1-444b-b38a-40b001ec079a-internal-tls-certs\") pod \"placement-586d68b4fd-xj4tk\" (UID: \"e89507a7-04a1-444b-b38a-40b001ec079a\") " pod="openstack/placement-586d68b4fd-xj4tk"
Mar 09 13:41:10 crc kubenswrapper[4764]: I0309 13:41:10.152189 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e89507a7-04a1-444b-b38a-40b001ec079a-config-data\") pod \"placement-586d68b4fd-xj4tk\" (UID: \"e89507a7-04a1-444b-b38a-40b001ec079a\") " pod="openstack/placement-586d68b4fd-xj4tk"
Mar 09 13:41:10 crc kubenswrapper[4764]: I0309 13:41:10.168574 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e89507a7-04a1-444b-b38a-40b001ec079a-public-tls-certs\") pod
\"placement-586d68b4fd-xj4tk\" (UID: \"e89507a7-04a1-444b-b38a-40b001ec079a\") " pod="openstack/placement-586d68b4fd-xj4tk" Mar 09 13:41:10 crc kubenswrapper[4764]: I0309 13:41:10.169638 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e89507a7-04a1-444b-b38a-40b001ec079a-combined-ca-bundle\") pod \"placement-586d68b4fd-xj4tk\" (UID: \"e89507a7-04a1-444b-b38a-40b001ec079a\") " pod="openstack/placement-586d68b4fd-xj4tk" Mar 09 13:41:10 crc kubenswrapper[4764]: I0309 13:41:10.173916 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j8mnp\" (UniqueName: \"kubernetes.io/projected/e89507a7-04a1-444b-b38a-40b001ec079a-kube-api-access-j8mnp\") pod \"placement-586d68b4fd-xj4tk\" (UID: \"e89507a7-04a1-444b-b38a-40b001ec079a\") " pod="openstack/placement-586d68b4fd-xj4tk" Mar 09 13:41:10 crc kubenswrapper[4764]: I0309 13:41:10.190238 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e89507a7-04a1-444b-b38a-40b001ec079a-scripts\") pod \"placement-586d68b4fd-xj4tk\" (UID: \"e89507a7-04a1-444b-b38a-40b001ec079a\") " pod="openstack/placement-586d68b4fd-xj4tk" Mar 09 13:41:10 crc kubenswrapper[4764]: I0309 13:41:10.244508 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-586d68b4fd-xj4tk" Mar 09 13:41:10 crc kubenswrapper[4764]: I0309 13:41:10.863873 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-9sj6m" Mar 09 13:41:10 crc kubenswrapper[4764]: I0309 13:41:10.901745 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-9sj6m" event={"ID":"2a338463-1443-4863-830e-0621abc3ed15","Type":"ContainerDied","Data":"38fa86d2f8bb849de0695f8ca7f22c2d7e73ace5965981aa2e9b4258c6a40283"} Mar 09 13:41:10 crc kubenswrapper[4764]: I0309 13:41:10.901792 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="38fa86d2f8bb849de0695f8ca7f22c2d7e73ace5965981aa2e9b4258c6a40283" Mar 09 13:41:10 crc kubenswrapper[4764]: I0309 13:41:10.901921 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-9sj6m" Mar 09 13:41:10 crc kubenswrapper[4764]: I0309 13:41:10.967128 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a338463-1443-4863-830e-0621abc3ed15-combined-ca-bundle\") pod \"2a338463-1443-4863-830e-0621abc3ed15\" (UID: \"2a338463-1443-4863-830e-0621abc3ed15\") " Mar 09 13:41:10 crc kubenswrapper[4764]: I0309 13:41:10.967587 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mw9hw\" (UniqueName: \"kubernetes.io/projected/2a338463-1443-4863-830e-0621abc3ed15-kube-api-access-mw9hw\") pod \"2a338463-1443-4863-830e-0621abc3ed15\" (UID: \"2a338463-1443-4863-830e-0621abc3ed15\") " Mar 09 13:41:10 crc kubenswrapper[4764]: I0309 13:41:10.967668 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2a338463-1443-4863-830e-0621abc3ed15-config-data\") pod \"2a338463-1443-4863-830e-0621abc3ed15\" (UID: \"2a338463-1443-4863-830e-0621abc3ed15\") " Mar 09 13:41:10 crc kubenswrapper[4764]: I0309 13:41:10.967697 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"fernet-keys\" (UniqueName: \"kubernetes.io/secret/2a338463-1443-4863-830e-0621abc3ed15-fernet-keys\") pod \"2a338463-1443-4863-830e-0621abc3ed15\" (UID: \"2a338463-1443-4863-830e-0621abc3ed15\") " Mar 09 13:41:10 crc kubenswrapper[4764]: I0309 13:41:10.967956 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/2a338463-1443-4863-830e-0621abc3ed15-credential-keys\") pod \"2a338463-1443-4863-830e-0621abc3ed15\" (UID: \"2a338463-1443-4863-830e-0621abc3ed15\") " Mar 09 13:41:10 crc kubenswrapper[4764]: I0309 13:41:10.968017 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2a338463-1443-4863-830e-0621abc3ed15-scripts\") pod \"2a338463-1443-4863-830e-0621abc3ed15\" (UID: \"2a338463-1443-4863-830e-0621abc3ed15\") " Mar 09 13:41:10 crc kubenswrapper[4764]: I0309 13:41:10.973611 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2a338463-1443-4863-830e-0621abc3ed15-kube-api-access-mw9hw" (OuterVolumeSpecName: "kube-api-access-mw9hw") pod "2a338463-1443-4863-830e-0621abc3ed15" (UID: "2a338463-1443-4863-830e-0621abc3ed15"). InnerVolumeSpecName "kube-api-access-mw9hw". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:41:10 crc kubenswrapper[4764]: I0309 13:41:10.973629 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2a338463-1443-4863-830e-0621abc3ed15-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "2a338463-1443-4863-830e-0621abc3ed15" (UID: "2a338463-1443-4863-830e-0621abc3ed15"). InnerVolumeSpecName "fernet-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:41:10 crc kubenswrapper[4764]: I0309 13:41:10.975087 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2a338463-1443-4863-830e-0621abc3ed15-scripts" (OuterVolumeSpecName: "scripts") pod "2a338463-1443-4863-830e-0621abc3ed15" (UID: "2a338463-1443-4863-830e-0621abc3ed15"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:41:10 crc kubenswrapper[4764]: I0309 13:41:10.990614 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2a338463-1443-4863-830e-0621abc3ed15-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "2a338463-1443-4863-830e-0621abc3ed15" (UID: "2a338463-1443-4863-830e-0621abc3ed15"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:41:10 crc kubenswrapper[4764]: I0309 13:41:10.997332 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2a338463-1443-4863-830e-0621abc3ed15-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2a338463-1443-4863-830e-0621abc3ed15" (UID: "2a338463-1443-4863-830e-0621abc3ed15"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:41:11 crc kubenswrapper[4764]: I0309 13:41:11.008614 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2a338463-1443-4863-830e-0621abc3ed15-config-data" (OuterVolumeSpecName: "config-data") pod "2a338463-1443-4863-830e-0621abc3ed15" (UID: "2a338463-1443-4863-830e-0621abc3ed15"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:41:11 crc kubenswrapper[4764]: I0309 13:41:11.070048 4764 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2a338463-1443-4863-830e-0621abc3ed15-config-data\") on node \"crc\" DevicePath \"\"" Mar 09 13:41:11 crc kubenswrapper[4764]: I0309 13:41:11.070318 4764 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/2a338463-1443-4863-830e-0621abc3ed15-fernet-keys\") on node \"crc\" DevicePath \"\"" Mar 09 13:41:11 crc kubenswrapper[4764]: I0309 13:41:11.070407 4764 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/2a338463-1443-4863-830e-0621abc3ed15-credential-keys\") on node \"crc\" DevicePath \"\"" Mar 09 13:41:11 crc kubenswrapper[4764]: I0309 13:41:11.070491 4764 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2a338463-1443-4863-830e-0621abc3ed15-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 13:41:11 crc kubenswrapper[4764]: I0309 13:41:11.070548 4764 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a338463-1443-4863-830e-0621abc3ed15-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 13:41:11 crc kubenswrapper[4764]: I0309 13:41:11.070602 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mw9hw\" (UniqueName: \"kubernetes.io/projected/2a338463-1443-4863-830e-0621abc3ed15-kube-api-access-mw9hw\") on node \"crc\" DevicePath \"\"" Mar 09 13:41:11 crc kubenswrapper[4764]: I0309 13:41:11.977917 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-759c9c64fb-nwls6"] Mar 09 13:41:11 crc kubenswrapper[4764]: E0309 13:41:11.979409 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2a338463-1443-4863-830e-0621abc3ed15" 
containerName="keystone-bootstrap" Mar 09 13:41:11 crc kubenswrapper[4764]: I0309 13:41:11.979476 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="2a338463-1443-4863-830e-0621abc3ed15" containerName="keystone-bootstrap" Mar 09 13:41:11 crc kubenswrapper[4764]: I0309 13:41:11.979729 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="2a338463-1443-4863-830e-0621abc3ed15" containerName="keystone-bootstrap" Mar 09 13:41:11 crc kubenswrapper[4764]: I0309 13:41:11.980704 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-759c9c64fb-nwls6" Mar 09 13:41:11 crc kubenswrapper[4764]: I0309 13:41:11.983501 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Mar 09 13:41:11 crc kubenswrapper[4764]: I0309 13:41:11.984918 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc" Mar 09 13:41:11 crc kubenswrapper[4764]: I0309 13:41:11.985217 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Mar 09 13:41:11 crc kubenswrapper[4764]: I0309 13:41:11.985413 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc" Mar 09 13:41:11 crc kubenswrapper[4764]: I0309 13:41:11.985600 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-4hfql" Mar 09 13:41:11 crc kubenswrapper[4764]: I0309 13:41:11.985766 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Mar 09 13:41:11 crc kubenswrapper[4764]: I0309 13:41:11.993409 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-759c9c64fb-nwls6"] Mar 09 13:41:12 crc kubenswrapper[4764]: I0309 13:41:12.089193 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/48b871c4-f2e8-44e9-9268-54920414c084-scripts\") pod \"keystone-759c9c64fb-nwls6\" (UID: \"48b871c4-f2e8-44e9-9268-54920414c084\") " pod="openstack/keystone-759c9c64fb-nwls6" Mar 09 13:41:12 crc kubenswrapper[4764]: I0309 13:41:12.089280 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/48b871c4-f2e8-44e9-9268-54920414c084-fernet-keys\") pod \"keystone-759c9c64fb-nwls6\" (UID: \"48b871c4-f2e8-44e9-9268-54920414c084\") " pod="openstack/keystone-759c9c64fb-nwls6" Mar 09 13:41:12 crc kubenswrapper[4764]: I0309 13:41:12.089304 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48b871c4-f2e8-44e9-9268-54920414c084-combined-ca-bundle\") pod \"keystone-759c9c64fb-nwls6\" (UID: \"48b871c4-f2e8-44e9-9268-54920414c084\") " pod="openstack/keystone-759c9c64fb-nwls6" Mar 09 13:41:12 crc kubenswrapper[4764]: I0309 13:41:12.089335 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/48b871c4-f2e8-44e9-9268-54920414c084-internal-tls-certs\") pod \"keystone-759c9c64fb-nwls6\" (UID: \"48b871c4-f2e8-44e9-9268-54920414c084\") " pod="openstack/keystone-759c9c64fb-nwls6" Mar 09 13:41:12 crc kubenswrapper[4764]: I0309 13:41:12.089354 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/48b871c4-f2e8-44e9-9268-54920414c084-credential-keys\") pod \"keystone-759c9c64fb-nwls6\" (UID: \"48b871c4-f2e8-44e9-9268-54920414c084\") " pod="openstack/keystone-759c9c64fb-nwls6" Mar 09 13:41:12 crc kubenswrapper[4764]: I0309 13:41:12.089384 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xcw5v\" 
(UniqueName: \"kubernetes.io/projected/48b871c4-f2e8-44e9-9268-54920414c084-kube-api-access-xcw5v\") pod \"keystone-759c9c64fb-nwls6\" (UID: \"48b871c4-f2e8-44e9-9268-54920414c084\") " pod="openstack/keystone-759c9c64fb-nwls6" Mar 09 13:41:12 crc kubenswrapper[4764]: I0309 13:41:12.089410 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/48b871c4-f2e8-44e9-9268-54920414c084-config-data\") pod \"keystone-759c9c64fb-nwls6\" (UID: \"48b871c4-f2e8-44e9-9268-54920414c084\") " pod="openstack/keystone-759c9c64fb-nwls6" Mar 09 13:41:12 crc kubenswrapper[4764]: I0309 13:41:12.089451 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/48b871c4-f2e8-44e9-9268-54920414c084-public-tls-certs\") pod \"keystone-759c9c64fb-nwls6\" (UID: \"48b871c4-f2e8-44e9-9268-54920414c084\") " pod="openstack/keystone-759c9c64fb-nwls6" Mar 09 13:41:12 crc kubenswrapper[4764]: I0309 13:41:12.191041 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/48b871c4-f2e8-44e9-9268-54920414c084-internal-tls-certs\") pod \"keystone-759c9c64fb-nwls6\" (UID: \"48b871c4-f2e8-44e9-9268-54920414c084\") " pod="openstack/keystone-759c9c64fb-nwls6" Mar 09 13:41:12 crc kubenswrapper[4764]: I0309 13:41:12.191116 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/48b871c4-f2e8-44e9-9268-54920414c084-credential-keys\") pod \"keystone-759c9c64fb-nwls6\" (UID: \"48b871c4-f2e8-44e9-9268-54920414c084\") " pod="openstack/keystone-759c9c64fb-nwls6" Mar 09 13:41:12 crc kubenswrapper[4764]: I0309 13:41:12.191144 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xcw5v\" (UniqueName: 
\"kubernetes.io/projected/48b871c4-f2e8-44e9-9268-54920414c084-kube-api-access-xcw5v\") pod \"keystone-759c9c64fb-nwls6\" (UID: \"48b871c4-f2e8-44e9-9268-54920414c084\") " pod="openstack/keystone-759c9c64fb-nwls6" Mar 09 13:41:12 crc kubenswrapper[4764]: I0309 13:41:12.191174 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/48b871c4-f2e8-44e9-9268-54920414c084-config-data\") pod \"keystone-759c9c64fb-nwls6\" (UID: \"48b871c4-f2e8-44e9-9268-54920414c084\") " pod="openstack/keystone-759c9c64fb-nwls6" Mar 09 13:41:12 crc kubenswrapper[4764]: I0309 13:41:12.191223 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/48b871c4-f2e8-44e9-9268-54920414c084-public-tls-certs\") pod \"keystone-759c9c64fb-nwls6\" (UID: \"48b871c4-f2e8-44e9-9268-54920414c084\") " pod="openstack/keystone-759c9c64fb-nwls6" Mar 09 13:41:12 crc kubenswrapper[4764]: I0309 13:41:12.191279 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/48b871c4-f2e8-44e9-9268-54920414c084-scripts\") pod \"keystone-759c9c64fb-nwls6\" (UID: \"48b871c4-f2e8-44e9-9268-54920414c084\") " pod="openstack/keystone-759c9c64fb-nwls6" Mar 09 13:41:12 crc kubenswrapper[4764]: I0309 13:41:12.191322 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/48b871c4-f2e8-44e9-9268-54920414c084-fernet-keys\") pod \"keystone-759c9c64fb-nwls6\" (UID: \"48b871c4-f2e8-44e9-9268-54920414c084\") " pod="openstack/keystone-759c9c64fb-nwls6" Mar 09 13:41:12 crc kubenswrapper[4764]: I0309 13:41:12.191343 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48b871c4-f2e8-44e9-9268-54920414c084-combined-ca-bundle\") pod 
\"keystone-759c9c64fb-nwls6\" (UID: \"48b871c4-f2e8-44e9-9268-54920414c084\") " pod="openstack/keystone-759c9c64fb-nwls6" Mar 09 13:41:12 crc kubenswrapper[4764]: I0309 13:41:12.203915 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48b871c4-f2e8-44e9-9268-54920414c084-combined-ca-bundle\") pod \"keystone-759c9c64fb-nwls6\" (UID: \"48b871c4-f2e8-44e9-9268-54920414c084\") " pod="openstack/keystone-759c9c64fb-nwls6" Mar 09 13:41:12 crc kubenswrapper[4764]: I0309 13:41:12.204782 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/48b871c4-f2e8-44e9-9268-54920414c084-credential-keys\") pod \"keystone-759c9c64fb-nwls6\" (UID: \"48b871c4-f2e8-44e9-9268-54920414c084\") " pod="openstack/keystone-759c9c64fb-nwls6" Mar 09 13:41:12 crc kubenswrapper[4764]: I0309 13:41:12.212811 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/48b871c4-f2e8-44e9-9268-54920414c084-internal-tls-certs\") pod \"keystone-759c9c64fb-nwls6\" (UID: \"48b871c4-f2e8-44e9-9268-54920414c084\") " pod="openstack/keystone-759c9c64fb-nwls6" Mar 09 13:41:12 crc kubenswrapper[4764]: I0309 13:41:12.213298 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/48b871c4-f2e8-44e9-9268-54920414c084-fernet-keys\") pod \"keystone-759c9c64fb-nwls6\" (UID: \"48b871c4-f2e8-44e9-9268-54920414c084\") " pod="openstack/keystone-759c9c64fb-nwls6" Mar 09 13:41:12 crc kubenswrapper[4764]: I0309 13:41:12.213372 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/48b871c4-f2e8-44e9-9268-54920414c084-config-data\") pod \"keystone-759c9c64fb-nwls6\" (UID: \"48b871c4-f2e8-44e9-9268-54920414c084\") " pod="openstack/keystone-759c9c64fb-nwls6" Mar 09 
13:41:12 crc kubenswrapper[4764]: I0309 13:41:12.213850 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/48b871c4-f2e8-44e9-9268-54920414c084-scripts\") pod \"keystone-759c9c64fb-nwls6\" (UID: \"48b871c4-f2e8-44e9-9268-54920414c084\") " pod="openstack/keystone-759c9c64fb-nwls6" Mar 09 13:41:12 crc kubenswrapper[4764]: I0309 13:41:12.223625 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/48b871c4-f2e8-44e9-9268-54920414c084-public-tls-certs\") pod \"keystone-759c9c64fb-nwls6\" (UID: \"48b871c4-f2e8-44e9-9268-54920414c084\") " pod="openstack/keystone-759c9c64fb-nwls6" Mar 09 13:41:12 crc kubenswrapper[4764]: I0309 13:41:12.240298 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xcw5v\" (UniqueName: \"kubernetes.io/projected/48b871c4-f2e8-44e9-9268-54920414c084-kube-api-access-xcw5v\") pod \"keystone-759c9c64fb-nwls6\" (UID: \"48b871c4-f2e8-44e9-9268-54920414c084\") " pod="openstack/keystone-759c9c64fb-nwls6" Mar 09 13:41:12 crc kubenswrapper[4764]: I0309 13:41:12.308427 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-759c9c64fb-nwls6" Mar 09 13:41:12 crc kubenswrapper[4764]: I0309 13:41:12.433071 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-f85c59cb-gm4df"] Mar 09 13:41:12 crc kubenswrapper[4764]: I0309 13:41:12.436414 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-f85c59cb-gm4df" Mar 09 13:41:12 crc kubenswrapper[4764]: I0309 13:41:12.458603 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-f85c59cb-gm4df"] Mar 09 13:41:12 crc kubenswrapper[4764]: I0309 13:41:12.601610 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2a26a533-a42a-4553-96b3-922ad860ca7a-public-tls-certs\") pod \"placement-f85c59cb-gm4df\" (UID: \"2a26a533-a42a-4553-96b3-922ad860ca7a\") " pod="openstack/placement-f85c59cb-gm4df" Mar 09 13:41:12 crc kubenswrapper[4764]: I0309 13:41:12.601781 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2a26a533-a42a-4553-96b3-922ad860ca7a-config-data\") pod \"placement-f85c59cb-gm4df\" (UID: \"2a26a533-a42a-4553-96b3-922ad860ca7a\") " pod="openstack/placement-f85c59cb-gm4df" Mar 09 13:41:12 crc kubenswrapper[4764]: I0309 13:41:12.601817 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vvp6f\" (UniqueName: \"kubernetes.io/projected/2a26a533-a42a-4553-96b3-922ad860ca7a-kube-api-access-vvp6f\") pod \"placement-f85c59cb-gm4df\" (UID: \"2a26a533-a42a-4553-96b3-922ad860ca7a\") " pod="openstack/placement-f85c59cb-gm4df" Mar 09 13:41:12 crc kubenswrapper[4764]: I0309 13:41:12.601845 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2a26a533-a42a-4553-96b3-922ad860ca7a-logs\") pod \"placement-f85c59cb-gm4df\" (UID: \"2a26a533-a42a-4553-96b3-922ad860ca7a\") " pod="openstack/placement-f85c59cb-gm4df" Mar 09 13:41:12 crc kubenswrapper[4764]: I0309 13:41:12.601905 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/2a26a533-a42a-4553-96b3-922ad860ca7a-combined-ca-bundle\") pod \"placement-f85c59cb-gm4df\" (UID: \"2a26a533-a42a-4553-96b3-922ad860ca7a\") " pod="openstack/placement-f85c59cb-gm4df" Mar 09 13:41:12 crc kubenswrapper[4764]: I0309 13:41:12.601964 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2a26a533-a42a-4553-96b3-922ad860ca7a-internal-tls-certs\") pod \"placement-f85c59cb-gm4df\" (UID: \"2a26a533-a42a-4553-96b3-922ad860ca7a\") " pod="openstack/placement-f85c59cb-gm4df" Mar 09 13:41:12 crc kubenswrapper[4764]: I0309 13:41:12.601996 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2a26a533-a42a-4553-96b3-922ad860ca7a-scripts\") pod \"placement-f85c59cb-gm4df\" (UID: \"2a26a533-a42a-4553-96b3-922ad860ca7a\") " pod="openstack/placement-f85c59cb-gm4df" Mar 09 13:41:12 crc kubenswrapper[4764]: I0309 13:41:12.706130 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2a26a533-a42a-4553-96b3-922ad860ca7a-config-data\") pod \"placement-f85c59cb-gm4df\" (UID: \"2a26a533-a42a-4553-96b3-922ad860ca7a\") " pod="openstack/placement-f85c59cb-gm4df" Mar 09 13:41:12 crc kubenswrapper[4764]: I0309 13:41:12.706190 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vvp6f\" (UniqueName: \"kubernetes.io/projected/2a26a533-a42a-4553-96b3-922ad860ca7a-kube-api-access-vvp6f\") pod \"placement-f85c59cb-gm4df\" (UID: \"2a26a533-a42a-4553-96b3-922ad860ca7a\") " pod="openstack/placement-f85c59cb-gm4df" Mar 09 13:41:12 crc kubenswrapper[4764]: I0309 13:41:12.706229 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/2a26a533-a42a-4553-96b3-922ad860ca7a-logs\") pod \"placement-f85c59cb-gm4df\" (UID: \"2a26a533-a42a-4553-96b3-922ad860ca7a\") " pod="openstack/placement-f85c59cb-gm4df" Mar 09 13:41:12 crc kubenswrapper[4764]: I0309 13:41:12.706273 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a26a533-a42a-4553-96b3-922ad860ca7a-combined-ca-bundle\") pod \"placement-f85c59cb-gm4df\" (UID: \"2a26a533-a42a-4553-96b3-922ad860ca7a\") " pod="openstack/placement-f85c59cb-gm4df" Mar 09 13:41:12 crc kubenswrapper[4764]: I0309 13:41:12.706876 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2a26a533-a42a-4553-96b3-922ad860ca7a-internal-tls-certs\") pod \"placement-f85c59cb-gm4df\" (UID: \"2a26a533-a42a-4553-96b3-922ad860ca7a\") " pod="openstack/placement-f85c59cb-gm4df" Mar 09 13:41:12 crc kubenswrapper[4764]: I0309 13:41:12.707041 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2a26a533-a42a-4553-96b3-922ad860ca7a-scripts\") pod \"placement-f85c59cb-gm4df\" (UID: \"2a26a533-a42a-4553-96b3-922ad860ca7a\") " pod="openstack/placement-f85c59cb-gm4df" Mar 09 13:41:12 crc kubenswrapper[4764]: I0309 13:41:12.707108 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2a26a533-a42a-4553-96b3-922ad860ca7a-public-tls-certs\") pod \"placement-f85c59cb-gm4df\" (UID: \"2a26a533-a42a-4553-96b3-922ad860ca7a\") " pod="openstack/placement-f85c59cb-gm4df" Mar 09 13:41:12 crc kubenswrapper[4764]: I0309 13:41:12.708391 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2a26a533-a42a-4553-96b3-922ad860ca7a-logs\") pod \"placement-f85c59cb-gm4df\" (UID: 
\"2a26a533-a42a-4553-96b3-922ad860ca7a\") " pod="openstack/placement-f85c59cb-gm4df" Mar 09 13:41:12 crc kubenswrapper[4764]: I0309 13:41:12.711638 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2a26a533-a42a-4553-96b3-922ad860ca7a-internal-tls-certs\") pod \"placement-f85c59cb-gm4df\" (UID: \"2a26a533-a42a-4553-96b3-922ad860ca7a\") " pod="openstack/placement-f85c59cb-gm4df" Mar 09 13:41:12 crc kubenswrapper[4764]: I0309 13:41:12.713137 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2a26a533-a42a-4553-96b3-922ad860ca7a-scripts\") pod \"placement-f85c59cb-gm4df\" (UID: \"2a26a533-a42a-4553-96b3-922ad860ca7a\") " pod="openstack/placement-f85c59cb-gm4df" Mar 09 13:41:12 crc kubenswrapper[4764]: I0309 13:41:12.713342 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2a26a533-a42a-4553-96b3-922ad860ca7a-config-data\") pod \"placement-f85c59cb-gm4df\" (UID: \"2a26a533-a42a-4553-96b3-922ad860ca7a\") " pod="openstack/placement-f85c59cb-gm4df" Mar 09 13:41:12 crc kubenswrapper[4764]: I0309 13:41:12.721425 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a26a533-a42a-4553-96b3-922ad860ca7a-combined-ca-bundle\") pod \"placement-f85c59cb-gm4df\" (UID: \"2a26a533-a42a-4553-96b3-922ad860ca7a\") " pod="openstack/placement-f85c59cb-gm4df" Mar 09 13:41:12 crc kubenswrapper[4764]: I0309 13:41:12.725265 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vvp6f\" (UniqueName: \"kubernetes.io/projected/2a26a533-a42a-4553-96b3-922ad860ca7a-kube-api-access-vvp6f\") pod \"placement-f85c59cb-gm4df\" (UID: \"2a26a533-a42a-4553-96b3-922ad860ca7a\") " pod="openstack/placement-f85c59cb-gm4df" Mar 09 13:41:12 crc kubenswrapper[4764]: I0309 
13:41:12.725315 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2a26a533-a42a-4553-96b3-922ad860ca7a-public-tls-certs\") pod \"placement-f85c59cb-gm4df\" (UID: \"2a26a533-a42a-4553-96b3-922ad860ca7a\") " pod="openstack/placement-f85c59cb-gm4df" Mar 09 13:41:12 crc kubenswrapper[4764]: I0309 13:41:12.765812 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-f85c59cb-gm4df" Mar 09 13:41:13 crc kubenswrapper[4764]: I0309 13:41:13.773636 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-586d68b4fd-xj4tk"] Mar 09 13:41:13 crc kubenswrapper[4764]: W0309 13:41:13.783106 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode89507a7_04a1_444b_b38a_40b001ec079a.slice/crio-521edccf936d07531fd9f618513cede620e92cebf6edb91c3eb8e6d1d0940b40 WatchSource:0}: Error finding container 521edccf936d07531fd9f618513cede620e92cebf6edb91c3eb8e6d1d0940b40: Status 404 returned error can't find the container with id 521edccf936d07531fd9f618513cede620e92cebf6edb91c3eb8e6d1d0940b40 Mar 09 13:41:13 crc kubenswrapper[4764]: I0309 13:41:13.872384 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-f85c59cb-gm4df"] Mar 09 13:41:13 crc kubenswrapper[4764]: I0309 13:41:13.937877 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"861cdd7d-b563-4009-9c33-a5c64d6ffae9","Type":"ContainerStarted","Data":"d869817a82b9f339ae214ccda8eaf5aa4fa4f2796d726fb166b8fa1a3f7e2eff"} Mar 09 13:41:13 crc kubenswrapper[4764]: I0309 13:41:13.939552 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-586d68b4fd-xj4tk" event={"ID":"e89507a7-04a1-444b-b38a-40b001ec079a","Type":"ContainerStarted","Data":"521edccf936d07531fd9f618513cede620e92cebf6edb91c3eb8e6d1d0940b40"} Mar 09 13:41:13 
crc kubenswrapper[4764]: I0309 13:41:13.940701 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-f85c59cb-gm4df" event={"ID":"2a26a533-a42a-4553-96b3-922ad860ca7a","Type":"ContainerStarted","Data":"dc2f6cf195a7a4d81bbe22d2d05f6dab5cd71f4bbb5aa5f4f57465a6ba1dbaf0"} Mar 09 13:41:13 crc kubenswrapper[4764]: I0309 13:41:13.942696 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6c97985d69-khcvg" event={"ID":"5d65fa53-be02-4d11-b300-5cb4629c03da","Type":"ContainerStarted","Data":"cff13fa571b5cb0d7d8cf860c4a1efdb55518757df6a7508bbd7dca053bc0e44"} Mar 09 13:41:13 crc kubenswrapper[4764]: I0309 13:41:13.945094 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-6c97985d69-khcvg" Mar 09 13:41:13 crc kubenswrapper[4764]: I0309 13:41:13.970992 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-759c9c64fb-nwls6"] Mar 09 13:41:14 crc kubenswrapper[4764]: I0309 13:41:14.002587 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-6c97985d69-khcvg" podStartSLOduration=7.002559994 podStartE2EDuration="7.002559994s" podCreationTimestamp="2026-03-09 13:41:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:41:13.990290006 +0000 UTC m=+1229.240461914" watchObservedRunningTime="2026-03-09 13:41:14.002559994 +0000 UTC m=+1229.252731902" Mar 09 13:41:14 crc kubenswrapper[4764]: I0309 13:41:14.952052 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-f85c59cb-gm4df" event={"ID":"2a26a533-a42a-4553-96b3-922ad860ca7a","Type":"ContainerStarted","Data":"0428883faa796852458c72233641e433d39f3470e2adf94bdc2955214fca65d0"} Mar 09 13:41:14 crc kubenswrapper[4764]: I0309 13:41:14.952407 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-f85c59cb-gm4df" Mar 09 
13:41:14 crc kubenswrapper[4764]: I0309 13:41:14.952428 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-f85c59cb-gm4df" Mar 09 13:41:14 crc kubenswrapper[4764]: I0309 13:41:14.952442 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-f85c59cb-gm4df" event={"ID":"2a26a533-a42a-4553-96b3-922ad860ca7a","Type":"ContainerStarted","Data":"1a183d4acff4926d12823fed1357a77c6cf0ac1cd4db68a8320ca592acf90a8d"} Mar 09 13:41:14 crc kubenswrapper[4764]: I0309 13:41:14.955042 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-759c9c64fb-nwls6" event={"ID":"48b871c4-f2e8-44e9-9268-54920414c084","Type":"ContainerStarted","Data":"2f8fad3993132a09c0f6aa7d6a907699fbfb2983fa82f5bdb6eabaff82cf0739"} Mar 09 13:41:14 crc kubenswrapper[4764]: I0309 13:41:14.955120 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-759c9c64fb-nwls6" Mar 09 13:41:14 crc kubenswrapper[4764]: I0309 13:41:14.955131 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-759c9c64fb-nwls6" event={"ID":"48b871c4-f2e8-44e9-9268-54920414c084","Type":"ContainerStarted","Data":"a621183f66c1c30c816f1b516021cdd36424bc24a400ea319637f857bdd39514"} Mar 09 13:41:14 crc kubenswrapper[4764]: I0309 13:41:14.957546 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-586d68b4fd-xj4tk" event={"ID":"e89507a7-04a1-444b-b38a-40b001ec079a","Type":"ContainerStarted","Data":"6939747e3ae2dc764ad9ad287190d81cffed4e3b8fef6031a7e72a9769a88873"} Mar 09 13:41:14 crc kubenswrapper[4764]: I0309 13:41:14.957578 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-586d68b4fd-xj4tk" event={"ID":"e89507a7-04a1-444b-b38a-40b001ec079a","Type":"ContainerStarted","Data":"846d1c4f6ed0185cda05e320aba446c1f9b2558f2a5da71ef147be109e27b726"} Mar 09 13:41:14 crc kubenswrapper[4764]: I0309 13:41:14.987312 4764 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-f85c59cb-gm4df" podStartSLOduration=2.987289876 podStartE2EDuration="2.987289876s" podCreationTimestamp="2026-03-09 13:41:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:41:14.973750827 +0000 UTC m=+1230.223922745" watchObservedRunningTime="2026-03-09 13:41:14.987289876 +0000 UTC m=+1230.237461784" Mar 09 13:41:15 crc kubenswrapper[4764]: I0309 13:41:15.001821 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-759c9c64fb-nwls6" podStartSLOduration=4.00178865 podStartE2EDuration="4.00178865s" podCreationTimestamp="2026-03-09 13:41:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:41:14.994467086 +0000 UTC m=+1230.244639004" watchObservedRunningTime="2026-03-09 13:41:15.00178865 +0000 UTC m=+1230.251960568" Mar 09 13:41:15 crc kubenswrapper[4764]: I0309 13:41:15.025045 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-586d68b4fd-xj4tk" podStartSLOduration=6.025023392 podStartE2EDuration="6.025023392s" podCreationTimestamp="2026-03-09 13:41:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:41:15.0165684 +0000 UTC m=+1230.266740308" watchObservedRunningTime="2026-03-09 13:41:15.025023392 +0000 UTC m=+1230.275195320" Mar 09 13:41:15 crc kubenswrapper[4764]: I0309 13:41:15.185957 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5f66db59b9-sh5rn" Mar 09 13:41:15 crc kubenswrapper[4764]: I0309 13:41:15.251994 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5b6dbdb6f5-47q58"] Mar 09 13:41:15 crc 
kubenswrapper[4764]: I0309 13:41:15.252292 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5b6dbdb6f5-47q58" podUID="f99ecd3e-1d5b-4f3d-80d7-9a401171fef1" containerName="dnsmasq-dns" containerID="cri-o://1c12c99553c3fded7ba52a026664e1bff10eba1ac57dc2b2484eb57141804c22" gracePeriod=10 Mar 09 13:41:15 crc kubenswrapper[4764]: I0309 13:41:15.794073 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5b6dbdb6f5-47q58" Mar 09 13:41:15 crc kubenswrapper[4764]: I0309 13:41:15.865095 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f99ecd3e-1d5b-4f3d-80d7-9a401171fef1-ovsdbserver-sb\") pod \"f99ecd3e-1d5b-4f3d-80d7-9a401171fef1\" (UID: \"f99ecd3e-1d5b-4f3d-80d7-9a401171fef1\") " Mar 09 13:41:15 crc kubenswrapper[4764]: I0309 13:41:15.866038 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f99ecd3e-1d5b-4f3d-80d7-9a401171fef1-ovsdbserver-nb\") pod \"f99ecd3e-1d5b-4f3d-80d7-9a401171fef1\" (UID: \"f99ecd3e-1d5b-4f3d-80d7-9a401171fef1\") " Mar 09 13:41:15 crc kubenswrapper[4764]: I0309 13:41:15.866307 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f99ecd3e-1d5b-4f3d-80d7-9a401171fef1-config\") pod \"f99ecd3e-1d5b-4f3d-80d7-9a401171fef1\" (UID: \"f99ecd3e-1d5b-4f3d-80d7-9a401171fef1\") " Mar 09 13:41:15 crc kubenswrapper[4764]: I0309 13:41:15.866467 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f99ecd3e-1d5b-4f3d-80d7-9a401171fef1-dns-svc\") pod \"f99ecd3e-1d5b-4f3d-80d7-9a401171fef1\" (UID: \"f99ecd3e-1d5b-4f3d-80d7-9a401171fef1\") " Mar 09 13:41:15 crc kubenswrapper[4764]: I0309 13:41:15.866493 4764 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p4km7\" (UniqueName: \"kubernetes.io/projected/f99ecd3e-1d5b-4f3d-80d7-9a401171fef1-kube-api-access-p4km7\") pod \"f99ecd3e-1d5b-4f3d-80d7-9a401171fef1\" (UID: \"f99ecd3e-1d5b-4f3d-80d7-9a401171fef1\") " Mar 09 13:41:15 crc kubenswrapper[4764]: I0309 13:41:15.875062 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f99ecd3e-1d5b-4f3d-80d7-9a401171fef1-kube-api-access-p4km7" (OuterVolumeSpecName: "kube-api-access-p4km7") pod "f99ecd3e-1d5b-4f3d-80d7-9a401171fef1" (UID: "f99ecd3e-1d5b-4f3d-80d7-9a401171fef1"). InnerVolumeSpecName "kube-api-access-p4km7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:41:15 crc kubenswrapper[4764]: I0309 13:41:15.919041 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f99ecd3e-1d5b-4f3d-80d7-9a401171fef1-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "f99ecd3e-1d5b-4f3d-80d7-9a401171fef1" (UID: "f99ecd3e-1d5b-4f3d-80d7-9a401171fef1"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:41:15 crc kubenswrapper[4764]: I0309 13:41:15.928177 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f99ecd3e-1d5b-4f3d-80d7-9a401171fef1-config" (OuterVolumeSpecName: "config") pod "f99ecd3e-1d5b-4f3d-80d7-9a401171fef1" (UID: "f99ecd3e-1d5b-4f3d-80d7-9a401171fef1"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:41:15 crc kubenswrapper[4764]: I0309 13:41:15.947622 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f99ecd3e-1d5b-4f3d-80d7-9a401171fef1-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "f99ecd3e-1d5b-4f3d-80d7-9a401171fef1" (UID: "f99ecd3e-1d5b-4f3d-80d7-9a401171fef1"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:41:15 crc kubenswrapper[4764]: I0309 13:41:15.967628 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f99ecd3e-1d5b-4f3d-80d7-9a401171fef1-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "f99ecd3e-1d5b-4f3d-80d7-9a401171fef1" (UID: "f99ecd3e-1d5b-4f3d-80d7-9a401171fef1"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:41:15 crc kubenswrapper[4764]: I0309 13:41:15.968390 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f99ecd3e-1d5b-4f3d-80d7-9a401171fef1-ovsdbserver-nb\") pod \"f99ecd3e-1d5b-4f3d-80d7-9a401171fef1\" (UID: \"f99ecd3e-1d5b-4f3d-80d7-9a401171fef1\") " Mar 09 13:41:15 crc kubenswrapper[4764]: W0309 13:41:15.969170 4764 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/f99ecd3e-1d5b-4f3d-80d7-9a401171fef1/volumes/kubernetes.io~configmap/ovsdbserver-nb Mar 09 13:41:15 crc kubenswrapper[4764]: I0309 13:41:15.969275 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f99ecd3e-1d5b-4f3d-80d7-9a401171fef1-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "f99ecd3e-1d5b-4f3d-80d7-9a401171fef1" (UID: "f99ecd3e-1d5b-4f3d-80d7-9a401171fef1"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:41:15 crc kubenswrapper[4764]: I0309 13:41:15.971497 4764 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f99ecd3e-1d5b-4f3d-80d7-9a401171fef1-config\") on node \"crc\" DevicePath \"\"" Mar 09 13:41:15 crc kubenswrapper[4764]: I0309 13:41:15.971695 4764 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f99ecd3e-1d5b-4f3d-80d7-9a401171fef1-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 09 13:41:15 crc kubenswrapper[4764]: I0309 13:41:15.971718 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p4km7\" (UniqueName: \"kubernetes.io/projected/f99ecd3e-1d5b-4f3d-80d7-9a401171fef1-kube-api-access-p4km7\") on node \"crc\" DevicePath \"\"" Mar 09 13:41:15 crc kubenswrapper[4764]: I0309 13:41:15.971733 4764 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f99ecd3e-1d5b-4f3d-80d7-9a401171fef1-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 09 13:41:15 crc kubenswrapper[4764]: I0309 13:41:15.972084 4764 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f99ecd3e-1d5b-4f3d-80d7-9a401171fef1-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 09 13:41:15 crc kubenswrapper[4764]: I0309 13:41:15.972851 4764 generic.go:334] "Generic (PLEG): container finished" podID="f99ecd3e-1d5b-4f3d-80d7-9a401171fef1" containerID="1c12c99553c3fded7ba52a026664e1bff10eba1ac57dc2b2484eb57141804c22" exitCode=0 Mar 09 13:41:15 crc kubenswrapper[4764]: I0309 13:41:15.973804 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b6dbdb6f5-47q58" event={"ID":"f99ecd3e-1d5b-4f3d-80d7-9a401171fef1","Type":"ContainerDied","Data":"1c12c99553c3fded7ba52a026664e1bff10eba1ac57dc2b2484eb57141804c22"} Mar 09 13:41:15 crc kubenswrapper[4764]: I0309 
13:41:15.973854 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5b6dbdb6f5-47q58" Mar 09 13:41:15 crc kubenswrapper[4764]: I0309 13:41:15.974092 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b6dbdb6f5-47q58" event={"ID":"f99ecd3e-1d5b-4f3d-80d7-9a401171fef1","Type":"ContainerDied","Data":"29094e7809129e1e7698bb9912d54d76c17888d5bfac065889c5bd5838e4b71c"} Mar 09 13:41:15 crc kubenswrapper[4764]: I0309 13:41:15.974124 4764 scope.go:117] "RemoveContainer" containerID="1c12c99553c3fded7ba52a026664e1bff10eba1ac57dc2b2484eb57141804c22" Mar 09 13:41:15 crc kubenswrapper[4764]: I0309 13:41:15.975076 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-586d68b4fd-xj4tk" Mar 09 13:41:15 crc kubenswrapper[4764]: I0309 13:41:15.975770 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-586d68b4fd-xj4tk" Mar 09 13:41:16 crc kubenswrapper[4764]: I0309 13:41:16.022921 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5b6dbdb6f5-47q58"] Mar 09 13:41:16 crc kubenswrapper[4764]: I0309 13:41:16.027728 4764 scope.go:117] "RemoveContainer" containerID="643b445a5cdf8e66cc772b07777e2b70383e28ddead01aaa535ab1ef9fb46333" Mar 09 13:41:16 crc kubenswrapper[4764]: I0309 13:41:16.029932 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5b6dbdb6f5-47q58"] Mar 09 13:41:16 crc kubenswrapper[4764]: I0309 13:41:16.059314 4764 scope.go:117] "RemoveContainer" containerID="1c12c99553c3fded7ba52a026664e1bff10eba1ac57dc2b2484eb57141804c22" Mar 09 13:41:16 crc kubenswrapper[4764]: E0309 13:41:16.060106 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1c12c99553c3fded7ba52a026664e1bff10eba1ac57dc2b2484eb57141804c22\": container with ID starting with 
1c12c99553c3fded7ba52a026664e1bff10eba1ac57dc2b2484eb57141804c22 not found: ID does not exist" containerID="1c12c99553c3fded7ba52a026664e1bff10eba1ac57dc2b2484eb57141804c22" Mar 09 13:41:16 crc kubenswrapper[4764]: I0309 13:41:16.060165 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1c12c99553c3fded7ba52a026664e1bff10eba1ac57dc2b2484eb57141804c22"} err="failed to get container status \"1c12c99553c3fded7ba52a026664e1bff10eba1ac57dc2b2484eb57141804c22\": rpc error: code = NotFound desc = could not find container \"1c12c99553c3fded7ba52a026664e1bff10eba1ac57dc2b2484eb57141804c22\": container with ID starting with 1c12c99553c3fded7ba52a026664e1bff10eba1ac57dc2b2484eb57141804c22 not found: ID does not exist" Mar 09 13:41:16 crc kubenswrapper[4764]: I0309 13:41:16.060188 4764 scope.go:117] "RemoveContainer" containerID="643b445a5cdf8e66cc772b07777e2b70383e28ddead01aaa535ab1ef9fb46333" Mar 09 13:41:16 crc kubenswrapper[4764]: E0309 13:41:16.060741 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"643b445a5cdf8e66cc772b07777e2b70383e28ddead01aaa535ab1ef9fb46333\": container with ID starting with 643b445a5cdf8e66cc772b07777e2b70383e28ddead01aaa535ab1ef9fb46333 not found: ID does not exist" containerID="643b445a5cdf8e66cc772b07777e2b70383e28ddead01aaa535ab1ef9fb46333" Mar 09 13:41:16 crc kubenswrapper[4764]: I0309 13:41:16.060768 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"643b445a5cdf8e66cc772b07777e2b70383e28ddead01aaa535ab1ef9fb46333"} err="failed to get container status \"643b445a5cdf8e66cc772b07777e2b70383e28ddead01aaa535ab1ef9fb46333\": rpc error: code = NotFound desc = could not find container \"643b445a5cdf8e66cc772b07777e2b70383e28ddead01aaa535ab1ef9fb46333\": container with ID starting with 643b445a5cdf8e66cc772b07777e2b70383e28ddead01aaa535ab1ef9fb46333 not found: ID does not 
exist" Mar 09 13:41:16 crc kubenswrapper[4764]: I0309 13:41:16.985696 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-x9gvc" event={"ID":"cb54f57d-afb6-4e53-be9a-4b22573a9450","Type":"ContainerStarted","Data":"e95470c676ddedadba89efafc3707fbce528908b0bfa879405e032128d81cc49"} Mar 09 13:41:17 crc kubenswrapper[4764]: I0309 13:41:17.007283 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-x9gvc" podStartSLOduration=3.2118654429999998 podStartE2EDuration="36.007266077s" podCreationTimestamp="2026-03-09 13:40:41 +0000 UTC" firstStartedPulling="2026-03-09 13:40:43.339839636 +0000 UTC m=+1198.590011544" lastFinishedPulling="2026-03-09 13:41:16.13524027 +0000 UTC m=+1231.385412178" observedRunningTime="2026-03-09 13:41:17.001731309 +0000 UTC m=+1232.251903217" watchObservedRunningTime="2026-03-09 13:41:17.007266077 +0000 UTC m=+1232.257437985" Mar 09 13:41:17 crc kubenswrapper[4764]: I0309 13:41:17.571175 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f99ecd3e-1d5b-4f3d-80d7-9a401171fef1" path="/var/lib/kubelet/pods/f99ecd3e-1d5b-4f3d-80d7-9a401171fef1/volumes" Mar 09 13:41:19 crc kubenswrapper[4764]: I0309 13:41:19.012355 4764 generic.go:334] "Generic (PLEG): container finished" podID="cb54f57d-afb6-4e53-be9a-4b22573a9450" containerID="e95470c676ddedadba89efafc3707fbce528908b0bfa879405e032128d81cc49" exitCode=0 Mar 09 13:41:19 crc kubenswrapper[4764]: I0309 13:41:19.012525 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-x9gvc" event={"ID":"cb54f57d-afb6-4e53-be9a-4b22573a9450","Type":"ContainerDied","Data":"e95470c676ddedadba89efafc3707fbce528908b0bfa879405e032128d81cc49"} Mar 09 13:41:21 crc kubenswrapper[4764]: I0309 13:41:21.618220 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-x9gvc" Mar 09 13:41:21 crc kubenswrapper[4764]: I0309 13:41:21.689676 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7jvrd\" (UniqueName: \"kubernetes.io/projected/cb54f57d-afb6-4e53-be9a-4b22573a9450-kube-api-access-7jvrd\") pod \"cb54f57d-afb6-4e53-be9a-4b22573a9450\" (UID: \"cb54f57d-afb6-4e53-be9a-4b22573a9450\") " Mar 09 13:41:21 crc kubenswrapper[4764]: I0309 13:41:21.689750 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/cb54f57d-afb6-4e53-be9a-4b22573a9450-db-sync-config-data\") pod \"cb54f57d-afb6-4e53-be9a-4b22573a9450\" (UID: \"cb54f57d-afb6-4e53-be9a-4b22573a9450\") " Mar 09 13:41:21 crc kubenswrapper[4764]: I0309 13:41:21.689822 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb54f57d-afb6-4e53-be9a-4b22573a9450-combined-ca-bundle\") pod \"cb54f57d-afb6-4e53-be9a-4b22573a9450\" (UID: \"cb54f57d-afb6-4e53-be9a-4b22573a9450\") " Mar 09 13:41:21 crc kubenswrapper[4764]: I0309 13:41:21.701674 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cb54f57d-afb6-4e53-be9a-4b22573a9450-kube-api-access-7jvrd" (OuterVolumeSpecName: "kube-api-access-7jvrd") pod "cb54f57d-afb6-4e53-be9a-4b22573a9450" (UID: "cb54f57d-afb6-4e53-be9a-4b22573a9450"). InnerVolumeSpecName "kube-api-access-7jvrd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:41:21 crc kubenswrapper[4764]: I0309 13:41:21.703854 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cb54f57d-afb6-4e53-be9a-4b22573a9450-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "cb54f57d-afb6-4e53-be9a-4b22573a9450" (UID: "cb54f57d-afb6-4e53-be9a-4b22573a9450"). 
InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:41:21 crc kubenswrapper[4764]: I0309 13:41:21.720389 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cb54f57d-afb6-4e53-be9a-4b22573a9450-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cb54f57d-afb6-4e53-be9a-4b22573a9450" (UID: "cb54f57d-afb6-4e53-be9a-4b22573a9450"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:41:21 crc kubenswrapper[4764]: I0309 13:41:21.793594 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7jvrd\" (UniqueName: \"kubernetes.io/projected/cb54f57d-afb6-4e53-be9a-4b22573a9450-kube-api-access-7jvrd\") on node \"crc\" DevicePath \"\"" Mar 09 13:41:21 crc kubenswrapper[4764]: I0309 13:41:21.793635 4764 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/cb54f57d-afb6-4e53-be9a-4b22573a9450-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Mar 09 13:41:21 crc kubenswrapper[4764]: I0309 13:41:21.793656 4764 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb54f57d-afb6-4e53-be9a-4b22573a9450-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 13:41:22 crc kubenswrapper[4764]: I0309 13:41:22.060715 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-x9gvc" Mar 09 13:41:22 crc kubenswrapper[4764]: I0309 13:41:22.060714 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-x9gvc" event={"ID":"cb54f57d-afb6-4e53-be9a-4b22573a9450","Type":"ContainerDied","Data":"7611ab820e5631fb861e6ee00f8e6a6553e8f141cb68d0a4556a1a0beeb23d3b"} Mar 09 13:41:22 crc kubenswrapper[4764]: I0309 13:41:22.060853 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7611ab820e5631fb861e6ee00f8e6a6553e8f141cb68d0a4556a1a0beeb23d3b" Mar 09 13:41:22 crc kubenswrapper[4764]: I0309 13:41:22.872693 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-789c56cf69-2dj2c"] Mar 09 13:41:22 crc kubenswrapper[4764]: E0309 13:41:22.873900 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f99ecd3e-1d5b-4f3d-80d7-9a401171fef1" containerName="dnsmasq-dns" Mar 09 13:41:22 crc kubenswrapper[4764]: I0309 13:41:22.873915 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="f99ecd3e-1d5b-4f3d-80d7-9a401171fef1" containerName="dnsmasq-dns" Mar 09 13:41:22 crc kubenswrapper[4764]: E0309 13:41:22.873936 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f99ecd3e-1d5b-4f3d-80d7-9a401171fef1" containerName="init" Mar 09 13:41:22 crc kubenswrapper[4764]: I0309 13:41:22.873942 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="f99ecd3e-1d5b-4f3d-80d7-9a401171fef1" containerName="init" Mar 09 13:41:22 crc kubenswrapper[4764]: E0309 13:41:22.873959 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cb54f57d-afb6-4e53-be9a-4b22573a9450" containerName="barbican-db-sync" Mar 09 13:41:22 crc kubenswrapper[4764]: I0309 13:41:22.873966 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="cb54f57d-afb6-4e53-be9a-4b22573a9450" containerName="barbican-db-sync" Mar 09 13:41:22 crc kubenswrapper[4764]: I0309 13:41:22.874152 4764 
memory_manager.go:354] "RemoveStaleState removing state" podUID="f99ecd3e-1d5b-4f3d-80d7-9a401171fef1" containerName="dnsmasq-dns" Mar 09 13:41:22 crc kubenswrapper[4764]: I0309 13:41:22.874169 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="cb54f57d-afb6-4e53-be9a-4b22573a9450" containerName="barbican-db-sync" Mar 09 13:41:22 crc kubenswrapper[4764]: I0309 13:41:22.880163 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-789c56cf69-2dj2c" Mar 09 13:41:22 crc kubenswrapper[4764]: I0309 13:41:22.882320 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data" Mar 09 13:41:22 crc kubenswrapper[4764]: I0309 13:41:22.882666 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Mar 09 13:41:22 crc kubenswrapper[4764]: I0309 13:41:22.885456 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-j8zg7" Mar 09 13:41:22 crc kubenswrapper[4764]: I0309 13:41:22.901404 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-7c6f54974-hws5g"] Mar 09 13:41:22 crc kubenswrapper[4764]: I0309 13:41:22.903087 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-7c6f54974-hws5g"
Mar 09 13:41:22 crc kubenswrapper[4764]: I0309 13:41:22.906050 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data"
Mar 09 13:41:22 crc kubenswrapper[4764]: I0309 13:41:22.918763 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-789c56cf69-2dj2c"]
Mar 09 13:41:22 crc kubenswrapper[4764]: I0309 13:41:22.948365 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-7c6f54974-hws5g"]
Mar 09 13:41:23 crc kubenswrapper[4764]: I0309 13:41:23.018011 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a18071d3-1164-4080-9095-919bb5349bb8-combined-ca-bundle\") pod \"barbican-worker-789c56cf69-2dj2c\" (UID: \"a18071d3-1164-4080-9095-919bb5349bb8\") " pod="openstack/barbican-worker-789c56cf69-2dj2c"
Mar 09 13:41:23 crc kubenswrapper[4764]: I0309 13:41:23.018070 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wf2cd\" (UniqueName: \"kubernetes.io/projected/154490f8-97ab-4703-a96c-16b6d5f7a178-kube-api-access-wf2cd\") pod \"barbican-keystone-listener-7c6f54974-hws5g\" (UID: \"154490f8-97ab-4703-a96c-16b6d5f7a178\") " pod="openstack/barbican-keystone-listener-7c6f54974-hws5g"
Mar 09 13:41:23 crc kubenswrapper[4764]: I0309 13:41:23.018106 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a18071d3-1164-4080-9095-919bb5349bb8-config-data\") pod \"barbican-worker-789c56cf69-2dj2c\" (UID: \"a18071d3-1164-4080-9095-919bb5349bb8\") " pod="openstack/barbican-worker-789c56cf69-2dj2c"
Mar 09 13:41:23 crc kubenswrapper[4764]: I0309 13:41:23.018144 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a18071d3-1164-4080-9095-919bb5349bb8-config-data-custom\") pod \"barbican-worker-789c56cf69-2dj2c\" (UID: \"a18071d3-1164-4080-9095-919bb5349bb8\") " pod="openstack/barbican-worker-789c56cf69-2dj2c"
Mar 09 13:41:23 crc kubenswrapper[4764]: I0309 13:41:23.018170 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/154490f8-97ab-4703-a96c-16b6d5f7a178-config-data-custom\") pod \"barbican-keystone-listener-7c6f54974-hws5g\" (UID: \"154490f8-97ab-4703-a96c-16b6d5f7a178\") " pod="openstack/barbican-keystone-listener-7c6f54974-hws5g"
Mar 09 13:41:23 crc kubenswrapper[4764]: I0309 13:41:23.018205 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/154490f8-97ab-4703-a96c-16b6d5f7a178-logs\") pod \"barbican-keystone-listener-7c6f54974-hws5g\" (UID: \"154490f8-97ab-4703-a96c-16b6d5f7a178\") " pod="openstack/barbican-keystone-listener-7c6f54974-hws5g"
Mar 09 13:41:23 crc kubenswrapper[4764]: I0309 13:41:23.018254 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/154490f8-97ab-4703-a96c-16b6d5f7a178-config-data\") pod \"barbican-keystone-listener-7c6f54974-hws5g\" (UID: \"154490f8-97ab-4703-a96c-16b6d5f7a178\") " pod="openstack/barbican-keystone-listener-7c6f54974-hws5g"
Mar 09 13:41:23 crc kubenswrapper[4764]: I0309 13:41:23.018276 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zzfmq\" (UniqueName: \"kubernetes.io/projected/a18071d3-1164-4080-9095-919bb5349bb8-kube-api-access-zzfmq\") pod \"barbican-worker-789c56cf69-2dj2c\" (UID: \"a18071d3-1164-4080-9095-919bb5349bb8\") " pod="openstack/barbican-worker-789c56cf69-2dj2c"
Mar 09 13:41:23 crc kubenswrapper[4764]: I0309 13:41:23.018317 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/154490f8-97ab-4703-a96c-16b6d5f7a178-combined-ca-bundle\") pod \"barbican-keystone-listener-7c6f54974-hws5g\" (UID: \"154490f8-97ab-4703-a96c-16b6d5f7a178\") " pod="openstack/barbican-keystone-listener-7c6f54974-hws5g"
Mar 09 13:41:23 crc kubenswrapper[4764]: I0309 13:41:23.018344 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a18071d3-1164-4080-9095-919bb5349bb8-logs\") pod \"barbican-worker-789c56cf69-2dj2c\" (UID: \"a18071d3-1164-4080-9095-919bb5349bb8\") " pod="openstack/barbican-worker-789c56cf69-2dj2c"
Mar 09 13:41:23 crc kubenswrapper[4764]: I0309 13:41:23.022880 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-869f779d85-qsc97"]
Mar 09 13:41:23 crc kubenswrapper[4764]: I0309 13:41:23.024572 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-869f779d85-qsc97"
Mar 09 13:41:23 crc kubenswrapper[4764]: I0309 13:41:23.054461 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-869f779d85-qsc97"]
Mar 09 13:41:23 crc kubenswrapper[4764]: I0309 13:41:23.120410 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a18071d3-1164-4080-9095-919bb5349bb8-combined-ca-bundle\") pod \"barbican-worker-789c56cf69-2dj2c\" (UID: \"a18071d3-1164-4080-9095-919bb5349bb8\") " pod="openstack/barbican-worker-789c56cf69-2dj2c"
Mar 09 13:41:23 crc kubenswrapper[4764]: I0309 13:41:23.120457 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wf2cd\" (UniqueName: \"kubernetes.io/projected/154490f8-97ab-4703-a96c-16b6d5f7a178-kube-api-access-wf2cd\") pod \"barbican-keystone-listener-7c6f54974-hws5g\" (UID: \"154490f8-97ab-4703-a96c-16b6d5f7a178\") " pod="openstack/barbican-keystone-listener-7c6f54974-hws5g"
Mar 09 13:41:23 crc kubenswrapper[4764]: I0309 13:41:23.120516 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a18071d3-1164-4080-9095-919bb5349bb8-config-data\") pod \"barbican-worker-789c56cf69-2dj2c\" (UID: \"a18071d3-1164-4080-9095-919bb5349bb8\") " pod="openstack/barbican-worker-789c56cf69-2dj2c"
Mar 09 13:41:23 crc kubenswrapper[4764]: I0309 13:41:23.120571 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c38cb253-b945-43b3-8dcd-209682d40f11-ovsdbserver-nb\") pod \"dnsmasq-dns-869f779d85-qsc97\" (UID: \"c38cb253-b945-43b3-8dcd-209682d40f11\") " pod="openstack/dnsmasq-dns-869f779d85-qsc97"
Mar 09 13:41:23 crc kubenswrapper[4764]: I0309 13:41:23.120685 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a18071d3-1164-4080-9095-919bb5349bb8-config-data-custom\") pod \"barbican-worker-789c56cf69-2dj2c\" (UID: \"a18071d3-1164-4080-9095-919bb5349bb8\") " pod="openstack/barbican-worker-789c56cf69-2dj2c"
Mar 09 13:41:23 crc kubenswrapper[4764]: I0309 13:41:23.120715 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c38cb253-b945-43b3-8dcd-209682d40f11-dns-svc\") pod \"dnsmasq-dns-869f779d85-qsc97\" (UID: \"c38cb253-b945-43b3-8dcd-209682d40f11\") " pod="openstack/dnsmasq-dns-869f779d85-qsc97"
Mar 09 13:41:23 crc kubenswrapper[4764]: I0309 13:41:23.120913 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/154490f8-97ab-4703-a96c-16b6d5f7a178-config-data-custom\") pod \"barbican-keystone-listener-7c6f54974-hws5g\" (UID: \"154490f8-97ab-4703-a96c-16b6d5f7a178\") " pod="openstack/barbican-keystone-listener-7c6f54974-hws5g"
Mar 09 13:41:23 crc kubenswrapper[4764]: I0309 13:41:23.120962 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/154490f8-97ab-4703-a96c-16b6d5f7a178-logs\") pod \"barbican-keystone-listener-7c6f54974-hws5g\" (UID: \"154490f8-97ab-4703-a96c-16b6d5f7a178\") " pod="openstack/barbican-keystone-listener-7c6f54974-hws5g"
Mar 09 13:41:23 crc kubenswrapper[4764]: I0309 13:41:23.121070 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c38cb253-b945-43b3-8dcd-209682d40f11-config\") pod \"dnsmasq-dns-869f779d85-qsc97\" (UID: \"c38cb253-b945-43b3-8dcd-209682d40f11\") " pod="openstack/dnsmasq-dns-869f779d85-qsc97"
Mar 09 13:41:23 crc kubenswrapper[4764]: I0309 13:41:23.121669 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-489r4\" (UniqueName: \"kubernetes.io/projected/c38cb253-b945-43b3-8dcd-209682d40f11-kube-api-access-489r4\") pod \"dnsmasq-dns-869f779d85-qsc97\" (UID: \"c38cb253-b945-43b3-8dcd-209682d40f11\") " pod="openstack/dnsmasq-dns-869f779d85-qsc97"
Mar 09 13:41:23 crc kubenswrapper[4764]: I0309 13:41:23.121711 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/154490f8-97ab-4703-a96c-16b6d5f7a178-config-data\") pod \"barbican-keystone-listener-7c6f54974-hws5g\" (UID: \"154490f8-97ab-4703-a96c-16b6d5f7a178\") " pod="openstack/barbican-keystone-listener-7c6f54974-hws5g"
Mar 09 13:41:23 crc kubenswrapper[4764]: I0309 13:41:23.121736 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zzfmq\" (UniqueName: \"kubernetes.io/projected/a18071d3-1164-4080-9095-919bb5349bb8-kube-api-access-zzfmq\") pod \"barbican-worker-789c56cf69-2dj2c\" (UID: \"a18071d3-1164-4080-9095-919bb5349bb8\") " pod="openstack/barbican-worker-789c56cf69-2dj2c"
Mar 09 13:41:23 crc kubenswrapper[4764]: I0309 13:41:23.121823 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/154490f8-97ab-4703-a96c-16b6d5f7a178-combined-ca-bundle\") pod \"barbican-keystone-listener-7c6f54974-hws5g\" (UID: \"154490f8-97ab-4703-a96c-16b6d5f7a178\") " pod="openstack/barbican-keystone-listener-7c6f54974-hws5g"
Mar 09 13:41:23 crc kubenswrapper[4764]: I0309 13:41:23.121858 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a18071d3-1164-4080-9095-919bb5349bb8-logs\") pod \"barbican-worker-789c56cf69-2dj2c\" (UID: \"a18071d3-1164-4080-9095-919bb5349bb8\") " pod="openstack/barbican-worker-789c56cf69-2dj2c"
Mar 09 13:41:23 crc kubenswrapper[4764]: I0309 13:41:23.121878 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c38cb253-b945-43b3-8dcd-209682d40f11-ovsdbserver-sb\") pod \"dnsmasq-dns-869f779d85-qsc97\" (UID: \"c38cb253-b945-43b3-8dcd-209682d40f11\") " pod="openstack/dnsmasq-dns-869f779d85-qsc97"
Mar 09 13:41:23 crc kubenswrapper[4764]: I0309 13:41:23.122419 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/154490f8-97ab-4703-a96c-16b6d5f7a178-logs\") pod \"barbican-keystone-listener-7c6f54974-hws5g\" (UID: \"154490f8-97ab-4703-a96c-16b6d5f7a178\") " pod="openstack/barbican-keystone-listener-7c6f54974-hws5g"
Mar 09 13:41:23 crc kubenswrapper[4764]: I0309 13:41:23.124662 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-dp5x6" event={"ID":"74146b7d-9780-4d2d-9454-853296f88955","Type":"ContainerStarted","Data":"1a80c7f5de2d815f10d1b147b52b1c923bf4e3266278afd841a98b5bc66a8ad6"}
Mar 09 13:41:23 crc kubenswrapper[4764]: I0309 13:41:23.126544 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a18071d3-1164-4080-9095-919bb5349bb8-logs\") pod \"barbican-worker-789c56cf69-2dj2c\" (UID: \"a18071d3-1164-4080-9095-919bb5349bb8\") " pod="openstack/barbican-worker-789c56cf69-2dj2c"
Mar 09 13:41:23 crc kubenswrapper[4764]: I0309 13:41:23.135546 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a18071d3-1164-4080-9095-919bb5349bb8-config-data-custom\") pod \"barbican-worker-789c56cf69-2dj2c\" (UID: \"a18071d3-1164-4080-9095-919bb5349bb8\") " pod="openstack/barbican-worker-789c56cf69-2dj2c"
Mar 09 13:41:23 crc kubenswrapper[4764]: I0309 13:41:23.136286 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a18071d3-1164-4080-9095-919bb5349bb8-combined-ca-bundle\") pod \"barbican-worker-789c56cf69-2dj2c\" (UID: \"a18071d3-1164-4080-9095-919bb5349bb8\") " pod="openstack/barbican-worker-789c56cf69-2dj2c"
Mar 09 13:41:23 crc kubenswrapper[4764]: I0309 13:41:23.145716 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zzfmq\" (UniqueName: \"kubernetes.io/projected/a18071d3-1164-4080-9095-919bb5349bb8-kube-api-access-zzfmq\") pod \"barbican-worker-789c56cf69-2dj2c\" (UID: \"a18071d3-1164-4080-9095-919bb5349bb8\") " pod="openstack/barbican-worker-789c56cf69-2dj2c"
Mar 09 13:41:23 crc kubenswrapper[4764]: I0309 13:41:23.155981 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/154490f8-97ab-4703-a96c-16b6d5f7a178-combined-ca-bundle\") pod \"barbican-keystone-listener-7c6f54974-hws5g\" (UID: \"154490f8-97ab-4703-a96c-16b6d5f7a178\") " pod="openstack/barbican-keystone-listener-7c6f54974-hws5g"
Mar 09 13:41:23 crc kubenswrapper[4764]: I0309 13:41:23.159636 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/154490f8-97ab-4703-a96c-16b6d5f7a178-config-data-custom\") pod \"barbican-keystone-listener-7c6f54974-hws5g\" (UID: \"154490f8-97ab-4703-a96c-16b6d5f7a178\") " pod="openstack/barbican-keystone-listener-7c6f54974-hws5g"
Mar 09 13:41:23 crc kubenswrapper[4764]: I0309 13:41:23.160748 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a18071d3-1164-4080-9095-919bb5349bb8-config-data\") pod \"barbican-worker-789c56cf69-2dj2c\" (UID: \"a18071d3-1164-4080-9095-919bb5349bb8\") " pod="openstack/barbican-worker-789c56cf69-2dj2c"
Mar 09 13:41:23 crc kubenswrapper[4764]: I0309 13:41:23.161218 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wf2cd\" (UniqueName: \"kubernetes.io/projected/154490f8-97ab-4703-a96c-16b6d5f7a178-kube-api-access-wf2cd\") pod \"barbican-keystone-listener-7c6f54974-hws5g\" (UID: \"154490f8-97ab-4703-a96c-16b6d5f7a178\") " pod="openstack/barbican-keystone-listener-7c6f54974-hws5g"
Mar 09 13:41:23 crc kubenswrapper[4764]: I0309 13:41:23.168130 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"861cdd7d-b563-4009-9c33-a5c64d6ffae9","Type":"ContainerStarted","Data":"49c5c69c52f65c2c21bb505679de7692f04c3f794601e412d337315878f9e046"}
Mar 09 13:41:23 crc kubenswrapper[4764]: I0309 13:41:23.168671 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="861cdd7d-b563-4009-9c33-a5c64d6ffae9" containerName="ceilometer-central-agent" containerID="cri-o://2486ace2409cc48f14efc281a2eed3fa9e4956d145642ec1061f64d96ea18a5f" gracePeriod=30
Mar 09 13:41:23 crc kubenswrapper[4764]: I0309 13:41:23.168922 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="861cdd7d-b563-4009-9c33-a5c64d6ffae9" containerName="ceilometer-notification-agent" containerID="cri-o://4db2afcb94ef0a26dd49d9961d5346346b289d4dbcccce08d916a8fe7f319ea2" gracePeriod=30
Mar 09 13:41:23 crc kubenswrapper[4764]: I0309 13:41:23.168947 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0"
Mar 09 13:41:23 crc kubenswrapper[4764]: I0309 13:41:23.168964 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="861cdd7d-b563-4009-9c33-a5c64d6ffae9" containerName="sg-core" containerID="cri-o://d869817a82b9f339ae214ccda8eaf5aa4fa4f2796d726fb166b8fa1a3f7e2eff" gracePeriod=30
Mar 09 13:41:23 crc kubenswrapper[4764]: I0309 13:41:23.169216 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="861cdd7d-b563-4009-9c33-a5c64d6ffae9" containerName="proxy-httpd" containerID="cri-o://49c5c69c52f65c2c21bb505679de7692f04c3f794601e412d337315878f9e046" gracePeriod=30
Mar 09 13:41:23 crc kubenswrapper[4764]: I0309 13:41:23.171729 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/154490f8-97ab-4703-a96c-16b6d5f7a178-config-data\") pod \"barbican-keystone-listener-7c6f54974-hws5g\" (UID: \"154490f8-97ab-4703-a96c-16b6d5f7a178\") " pod="openstack/barbican-keystone-listener-7c6f54974-hws5g"
Mar 09 13:41:23 crc kubenswrapper[4764]: I0309 13:41:23.175176 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-dp5x6" podStartSLOduration=3.418861012 podStartE2EDuration="42.175154395s" podCreationTimestamp="2026-03-09 13:40:41 +0000 UTC" firstStartedPulling="2026-03-09 13:40:43.324675096 +0000 UTC m=+1198.574847004" lastFinishedPulling="2026-03-09 13:41:22.080968479 +0000 UTC m=+1237.331140387" observedRunningTime="2026-03-09 13:41:23.15862092 +0000 UTC m=+1238.408792828" watchObservedRunningTime="2026-03-09 13:41:23.175154395 +0000 UTC m=+1238.425326323"
Mar 09 13:41:23 crc kubenswrapper[4764]: I0309 13:41:23.212554 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-789c56cf69-2dj2c"
Mar 09 13:41:23 crc kubenswrapper[4764]: I0309 13:41:23.234719 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-7c6f54974-hws5g"
Mar 09 13:41:23 crc kubenswrapper[4764]: I0309 13:41:23.242701 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c38cb253-b945-43b3-8dcd-209682d40f11-dns-svc\") pod \"dnsmasq-dns-869f779d85-qsc97\" (UID: \"c38cb253-b945-43b3-8dcd-209682d40f11\") " pod="openstack/dnsmasq-dns-869f779d85-qsc97"
Mar 09 13:41:23 crc kubenswrapper[4764]: I0309 13:41:23.242901 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c38cb253-b945-43b3-8dcd-209682d40f11-config\") pod \"dnsmasq-dns-869f779d85-qsc97\" (UID: \"c38cb253-b945-43b3-8dcd-209682d40f11\") " pod="openstack/dnsmasq-dns-869f779d85-qsc97"
Mar 09 13:41:23 crc kubenswrapper[4764]: I0309 13:41:23.242969 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-489r4\" (UniqueName: \"kubernetes.io/projected/c38cb253-b945-43b3-8dcd-209682d40f11-kube-api-access-489r4\") pod \"dnsmasq-dns-869f779d85-qsc97\" (UID: \"c38cb253-b945-43b3-8dcd-209682d40f11\") " pod="openstack/dnsmasq-dns-869f779d85-qsc97"
Mar 09 13:41:23 crc kubenswrapper[4764]: I0309 13:41:23.243193 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c38cb253-b945-43b3-8dcd-209682d40f11-ovsdbserver-sb\") pod \"dnsmasq-dns-869f779d85-qsc97\" (UID: \"c38cb253-b945-43b3-8dcd-209682d40f11\") " pod="openstack/dnsmasq-dns-869f779d85-qsc97"
Mar 09 13:41:23 crc kubenswrapper[4764]: I0309 13:41:23.246099 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c38cb253-b945-43b3-8dcd-209682d40f11-config\") pod \"dnsmasq-dns-869f779d85-qsc97\" (UID: \"c38cb253-b945-43b3-8dcd-209682d40f11\") " pod="openstack/dnsmasq-dns-869f779d85-qsc97"
Mar 09 13:41:23 crc kubenswrapper[4764]: I0309 13:41:23.247101 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c38cb253-b945-43b3-8dcd-209682d40f11-ovsdbserver-sb\") pod \"dnsmasq-dns-869f779d85-qsc97\" (UID: \"c38cb253-b945-43b3-8dcd-209682d40f11\") " pod="openstack/dnsmasq-dns-869f779d85-qsc97"
Mar 09 13:41:23 crc kubenswrapper[4764]: I0309 13:41:23.255331 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c38cb253-b945-43b3-8dcd-209682d40f11-ovsdbserver-nb\") pod \"dnsmasq-dns-869f779d85-qsc97\" (UID: \"c38cb253-b945-43b3-8dcd-209682d40f11\") " pod="openstack/dnsmasq-dns-869f779d85-qsc97"
Mar 09 13:41:23 crc kubenswrapper[4764]: I0309 13:41:23.256384 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c38cb253-b945-43b3-8dcd-209682d40f11-ovsdbserver-nb\") pod \"dnsmasq-dns-869f779d85-qsc97\" (UID: \"c38cb253-b945-43b3-8dcd-209682d40f11\") " pod="openstack/dnsmasq-dns-869f779d85-qsc97"
Mar 09 13:41:23 crc kubenswrapper[4764]: I0309 13:41:23.261049 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c38cb253-b945-43b3-8dcd-209682d40f11-dns-svc\") pod \"dnsmasq-dns-869f779d85-qsc97\" (UID: \"c38cb253-b945-43b3-8dcd-209682d40f11\") " pod="openstack/dnsmasq-dns-869f779d85-qsc97"
Mar 09 13:41:23 crc kubenswrapper[4764]: I0309 13:41:23.302314 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=3.062579511 podStartE2EDuration="42.302287331s" podCreationTimestamp="2026-03-09 13:40:41 +0000 UTC" firstStartedPulling="2026-03-09 13:40:42.860709556 +0000 UTC m=+1198.110881464" lastFinishedPulling="2026-03-09 13:41:22.100417376 +0000 UTC m=+1237.350589284" observedRunningTime="2026-03-09 13:41:23.218898061 +0000 UTC m=+1238.469069969" watchObservedRunningTime="2026-03-09 13:41:23.302287331 +0000 UTC m=+1238.552459249"
Mar 09 13:41:23 crc kubenswrapper[4764]: I0309 13:41:23.302805 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-489r4\" (UniqueName: \"kubernetes.io/projected/c38cb253-b945-43b3-8dcd-209682d40f11-kube-api-access-489r4\") pod \"dnsmasq-dns-869f779d85-qsc97\" (UID: \"c38cb253-b945-43b3-8dcd-209682d40f11\") " pod="openstack/dnsmasq-dns-869f779d85-qsc97"
Mar 09 13:41:23 crc kubenswrapper[4764]: I0309 13:41:23.323872 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-7dcdc6487b-j8w75"]
Mar 09 13:41:23 crc kubenswrapper[4764]: I0309 13:41:23.326337 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-7dcdc6487b-j8w75"
Mar 09 13:41:23 crc kubenswrapper[4764]: I0309 13:41:23.332078 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data"
Mar 09 13:41:23 crc kubenswrapper[4764]: I0309 13:41:23.347538 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-7dcdc6487b-j8w75"]
Mar 09 13:41:23 crc kubenswrapper[4764]: I0309 13:41:23.363844 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-869f779d85-qsc97"
Mar 09 13:41:23 crc kubenswrapper[4764]: I0309 13:41:23.461315 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/60a250ee-f349-4bab-b5b4-b402289210a6-config-data-custom\") pod \"barbican-api-7dcdc6487b-j8w75\" (UID: \"60a250ee-f349-4bab-b5b4-b402289210a6\") " pod="openstack/barbican-api-7dcdc6487b-j8w75"
Mar 09 13:41:23 crc kubenswrapper[4764]: I0309 13:41:23.461917 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gstx5\" (UniqueName: \"kubernetes.io/projected/60a250ee-f349-4bab-b5b4-b402289210a6-kube-api-access-gstx5\") pod \"barbican-api-7dcdc6487b-j8w75\" (UID: \"60a250ee-f349-4bab-b5b4-b402289210a6\") " pod="openstack/barbican-api-7dcdc6487b-j8w75"
Mar 09 13:41:23 crc kubenswrapper[4764]: I0309 13:41:23.461975 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/60a250ee-f349-4bab-b5b4-b402289210a6-logs\") pod \"barbican-api-7dcdc6487b-j8w75\" (UID: \"60a250ee-f349-4bab-b5b4-b402289210a6\") " pod="openstack/barbican-api-7dcdc6487b-j8w75"
Mar 09 13:41:23 crc kubenswrapper[4764]: I0309 13:41:23.462013 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60a250ee-f349-4bab-b5b4-b402289210a6-combined-ca-bundle\") pod \"barbican-api-7dcdc6487b-j8w75\" (UID: \"60a250ee-f349-4bab-b5b4-b402289210a6\") " pod="openstack/barbican-api-7dcdc6487b-j8w75"
Mar 09 13:41:23 crc kubenswrapper[4764]: I0309 13:41:23.462043 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/60a250ee-f349-4bab-b5b4-b402289210a6-config-data\") pod \"barbican-api-7dcdc6487b-j8w75\" (UID: \"60a250ee-f349-4bab-b5b4-b402289210a6\") " pod="openstack/barbican-api-7dcdc6487b-j8w75"
Mar 09 13:41:23 crc kubenswrapper[4764]: I0309 13:41:23.567174 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gstx5\" (UniqueName: \"kubernetes.io/projected/60a250ee-f349-4bab-b5b4-b402289210a6-kube-api-access-gstx5\") pod \"barbican-api-7dcdc6487b-j8w75\" (UID: \"60a250ee-f349-4bab-b5b4-b402289210a6\") " pod="openstack/barbican-api-7dcdc6487b-j8w75"
Mar 09 13:41:23 crc kubenswrapper[4764]: I0309 13:41:23.567312 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/60a250ee-f349-4bab-b5b4-b402289210a6-logs\") pod \"barbican-api-7dcdc6487b-j8w75\" (UID: \"60a250ee-f349-4bab-b5b4-b402289210a6\") " pod="openstack/barbican-api-7dcdc6487b-j8w75"
Mar 09 13:41:23 crc kubenswrapper[4764]: I0309 13:41:23.567352 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60a250ee-f349-4bab-b5b4-b402289210a6-combined-ca-bundle\") pod \"barbican-api-7dcdc6487b-j8w75\" (UID: \"60a250ee-f349-4bab-b5b4-b402289210a6\") " pod="openstack/barbican-api-7dcdc6487b-j8w75"
Mar 09 13:41:23 crc kubenswrapper[4764]: I0309 13:41:23.567385 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/60a250ee-f349-4bab-b5b4-b402289210a6-config-data\") pod \"barbican-api-7dcdc6487b-j8w75\" (UID: \"60a250ee-f349-4bab-b5b4-b402289210a6\") " pod="openstack/barbican-api-7dcdc6487b-j8w75"
Mar 09 13:41:23 crc kubenswrapper[4764]: I0309 13:41:23.567434 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/60a250ee-f349-4bab-b5b4-b402289210a6-config-data-custom\") pod \"barbican-api-7dcdc6487b-j8w75\" (UID: \"60a250ee-f349-4bab-b5b4-b402289210a6\") " pod="openstack/barbican-api-7dcdc6487b-j8w75"
Mar 09 13:41:23 crc kubenswrapper[4764]: I0309 13:41:23.568002 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/60a250ee-f349-4bab-b5b4-b402289210a6-logs\") pod \"barbican-api-7dcdc6487b-j8w75\" (UID: \"60a250ee-f349-4bab-b5b4-b402289210a6\") " pod="openstack/barbican-api-7dcdc6487b-j8w75"
Mar 09 13:41:23 crc kubenswrapper[4764]: I0309 13:41:23.577481 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/60a250ee-f349-4bab-b5b4-b402289210a6-config-data-custom\") pod \"barbican-api-7dcdc6487b-j8w75\" (UID: \"60a250ee-f349-4bab-b5b4-b402289210a6\") " pod="openstack/barbican-api-7dcdc6487b-j8w75"
Mar 09 13:41:23 crc kubenswrapper[4764]: I0309 13:41:23.581313 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60a250ee-f349-4bab-b5b4-b402289210a6-combined-ca-bundle\") pod \"barbican-api-7dcdc6487b-j8w75\" (UID: \"60a250ee-f349-4bab-b5b4-b402289210a6\") " pod="openstack/barbican-api-7dcdc6487b-j8w75"
Mar 09 13:41:23 crc kubenswrapper[4764]: I0309 13:41:23.584498 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/60a250ee-f349-4bab-b5b4-b402289210a6-config-data\") pod \"barbican-api-7dcdc6487b-j8w75\" (UID: \"60a250ee-f349-4bab-b5b4-b402289210a6\") " pod="openstack/barbican-api-7dcdc6487b-j8w75"
Mar 09 13:41:23 crc kubenswrapper[4764]: I0309 13:41:23.618101 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gstx5\" (UniqueName: \"kubernetes.io/projected/60a250ee-f349-4bab-b5b4-b402289210a6-kube-api-access-gstx5\") pod \"barbican-api-7dcdc6487b-j8w75\" (UID: \"60a250ee-f349-4bab-b5b4-b402289210a6\") " pod="openstack/barbican-api-7dcdc6487b-j8w75"
Mar 09 13:41:23 crc kubenswrapper[4764]: I0309 13:41:23.684610 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-7dcdc6487b-j8w75"
Mar 09 13:41:23 crc kubenswrapper[4764]: I0309 13:41:23.908350 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-7c6f54974-hws5g"]
Mar 09 13:41:23 crc kubenswrapper[4764]: W0309 13:41:23.916985 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod154490f8_97ab_4703_a96c_16b6d5f7a178.slice/crio-f9d6f5454e96631579c882ada452e520ff7962cd96029c1aa57007ee607c3b72 WatchSource:0}: Error finding container f9d6f5454e96631579c882ada452e520ff7962cd96029c1aa57007ee607c3b72: Status 404 returned error can't find the container with id f9d6f5454e96631579c882ada452e520ff7962cd96029c1aa57007ee607c3b72
Mar 09 13:41:23 crc kubenswrapper[4764]: I0309 13:41:23.988962 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-789c56cf69-2dj2c"]
Mar 09 13:41:23 crc kubenswrapper[4764]: W0309 13:41:23.995145 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda18071d3_1164_4080_9095_919bb5349bb8.slice/crio-6368fca578eaa6b79a350a7b8be6f5443ef3b3a49140f98edc5341fb7208ad48 WatchSource:0}: Error finding container 6368fca578eaa6b79a350a7b8be6f5443ef3b3a49140f98edc5341fb7208ad48: Status 404 returned error can't find the container with id 6368fca578eaa6b79a350a7b8be6f5443ef3b3a49140f98edc5341fb7208ad48
Mar 09 13:41:24 crc kubenswrapper[4764]: I0309 13:41:24.020420 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-869f779d85-qsc97"]
Mar 09 13:41:24 crc kubenswrapper[4764]: W0309 13:41:24.023406 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc38cb253_b945_43b3_8dcd_209682d40f11.slice/crio-744dd94a1b4869c7dc37123bb898cb66f67f114bb38f59e8f906fc2e985184ea WatchSource:0}: Error finding container 744dd94a1b4869c7dc37123bb898cb66f67f114bb38f59e8f906fc2e985184ea: Status 404 returned error can't find the container with id 744dd94a1b4869c7dc37123bb898cb66f67f114bb38f59e8f906fc2e985184ea
Mar 09 13:41:24 crc kubenswrapper[4764]: I0309 13:41:24.179287 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-869f779d85-qsc97" event={"ID":"c38cb253-b945-43b3-8dcd-209682d40f11","Type":"ContainerStarted","Data":"744dd94a1b4869c7dc37123bb898cb66f67f114bb38f59e8f906fc2e985184ea"}
Mar 09 13:41:24 crc kubenswrapper[4764]: I0309 13:41:24.181721 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-789c56cf69-2dj2c" event={"ID":"a18071d3-1164-4080-9095-919bb5349bb8","Type":"ContainerStarted","Data":"6368fca578eaa6b79a350a7b8be6f5443ef3b3a49140f98edc5341fb7208ad48"}
Mar 09 13:41:24 crc kubenswrapper[4764]: I0309 13:41:24.184565 4764 generic.go:334] "Generic (PLEG): container finished" podID="861cdd7d-b563-4009-9c33-a5c64d6ffae9" containerID="49c5c69c52f65c2c21bb505679de7692f04c3f794601e412d337315878f9e046" exitCode=0
Mar 09 13:41:24 crc kubenswrapper[4764]: I0309 13:41:24.184588 4764 generic.go:334] "Generic (PLEG): container finished" podID="861cdd7d-b563-4009-9c33-a5c64d6ffae9" containerID="d869817a82b9f339ae214ccda8eaf5aa4fa4f2796d726fb166b8fa1a3f7e2eff" exitCode=2
Mar 09 13:41:24 crc kubenswrapper[4764]: I0309 13:41:24.184598 4764 generic.go:334] "Generic (PLEG): container finished" podID="861cdd7d-b563-4009-9c33-a5c64d6ffae9" containerID="2486ace2409cc48f14efc281a2eed3fa9e4956d145642ec1061f64d96ea18a5f" exitCode=0
Mar 09 13:41:24 crc kubenswrapper[4764]: I0309 13:41:24.184627 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"861cdd7d-b563-4009-9c33-a5c64d6ffae9","Type":"ContainerDied","Data":"49c5c69c52f65c2c21bb505679de7692f04c3f794601e412d337315878f9e046"}
Mar 09 13:41:24 crc kubenswrapper[4764]: I0309 13:41:24.184663 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"861cdd7d-b563-4009-9c33-a5c64d6ffae9","Type":"ContainerDied","Data":"d869817a82b9f339ae214ccda8eaf5aa4fa4f2796d726fb166b8fa1a3f7e2eff"}
Mar 09 13:41:24 crc kubenswrapper[4764]: I0309 13:41:24.184675 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"861cdd7d-b563-4009-9c33-a5c64d6ffae9","Type":"ContainerDied","Data":"2486ace2409cc48f14efc281a2eed3fa9e4956d145642ec1061f64d96ea18a5f"}
Mar 09 13:41:24 crc kubenswrapper[4764]: I0309 13:41:24.185662 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-7c6f54974-hws5g" event={"ID":"154490f8-97ab-4703-a96c-16b6d5f7a178","Type":"ContainerStarted","Data":"f9d6f5454e96631579c882ada452e520ff7962cd96029c1aa57007ee607c3b72"}
Mar 09 13:41:24 crc kubenswrapper[4764]: I0309 13:41:24.216868 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-7dcdc6487b-j8w75"]
Mar 09 13:41:24 crc kubenswrapper[4764]: W0309 13:41:24.222691 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod60a250ee_f349_4bab_b5b4_b402289210a6.slice/crio-60454879be0f224ed5bb74217d2090bdcd6efc318e5799fb2111033ee5eacbe4 WatchSource:0}: Error finding container 60454879be0f224ed5bb74217d2090bdcd6efc318e5799fb2111033ee5eacbe4: Status 404 returned error can't find the container with id 60454879be0f224ed5bb74217d2090bdcd6efc318e5799fb2111033ee5eacbe4
Mar 09 13:41:25 crc kubenswrapper[4764]: I0309 13:41:25.213110 4764 generic.go:334] "Generic (PLEG): container finished" podID="c38cb253-b945-43b3-8dcd-209682d40f11" containerID="83e5b1aa14005ff710f52b631cae367c77a62e68a139a563a732645f67f44497" exitCode=0
Mar 09 13:41:25 crc kubenswrapper[4764]: I0309 13:41:25.213295 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-869f779d85-qsc97" event={"ID":"c38cb253-b945-43b3-8dcd-209682d40f11","Type":"ContainerDied","Data":"83e5b1aa14005ff710f52b631cae367c77a62e68a139a563a732645f67f44497"}
Mar 09 13:41:25 crc kubenswrapper[4764]: I0309 13:41:25.248869 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7dcdc6487b-j8w75" event={"ID":"60a250ee-f349-4bab-b5b4-b402289210a6","Type":"ContainerStarted","Data":"73990548f38f06ad2795c4de799dea38305ca7065b97482d22fcdeed8a8051c5"}
Mar 09 13:41:25 crc kubenswrapper[4764]: I0309 13:41:25.248946 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7dcdc6487b-j8w75" event={"ID":"60a250ee-f349-4bab-b5b4-b402289210a6","Type":"ContainerStarted","Data":"4c95fd726c1b28b4f8ab8a6d5ebf1002f0c9dea4159ff74f568392de3a2ff067"}
Mar 09 13:41:25 crc kubenswrapper[4764]: I0309 13:41:25.248962 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7dcdc6487b-j8w75" event={"ID":"60a250ee-f349-4bab-b5b4-b402289210a6","Type":"ContainerStarted","Data":"60454879be0f224ed5bb74217d2090bdcd6efc318e5799fb2111033ee5eacbe4"}
Mar 09 13:41:25 crc kubenswrapper[4764]: I0309 13:41:25.249779 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-7dcdc6487b-j8w75"
Mar 09 13:41:25 crc kubenswrapper[4764]: I0309 13:41:25.249839 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-7dcdc6487b-j8w75"
Mar 09 13:41:25 crc kubenswrapper[4764]: I0309 13:41:25.283495 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-7dcdc6487b-j8w75" podStartSLOduration=2.283480259 podStartE2EDuration="2.283480259s" podCreationTimestamp="2026-03-09 13:41:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:41:25.281412227 +0000 UTC m=+1240.531584125" watchObservedRunningTime="2026-03-09 13:41:25.283480259 +0000 UTC m=+1240.533652177"
Mar 09 13:41:26 crc kubenswrapper[4764]: I0309 13:41:26.169270 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 09 13:41:26 crc kubenswrapper[4764]: I0309 13:41:26.233764 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/861cdd7d-b563-4009-9c33-a5c64d6ffae9-config-data\") pod \"861cdd7d-b563-4009-9c33-a5c64d6ffae9\" (UID: \"861cdd7d-b563-4009-9c33-a5c64d6ffae9\") "
Mar 09 13:41:26 crc kubenswrapper[4764]: I0309 13:41:26.233916 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mj2kg\" (UniqueName: \"kubernetes.io/projected/861cdd7d-b563-4009-9c33-a5c64d6ffae9-kube-api-access-mj2kg\") pod \"861cdd7d-b563-4009-9c33-a5c64d6ffae9\" (UID: \"861cdd7d-b563-4009-9c33-a5c64d6ffae9\") "
Mar 09 13:41:26 crc kubenswrapper[4764]: I0309 13:41:26.233950 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/861cdd7d-b563-4009-9c33-a5c64d6ffae9-scripts\") pod \"861cdd7d-b563-4009-9c33-a5c64d6ffae9\" (UID: \"861cdd7d-b563-4009-9c33-a5c64d6ffae9\") "
Mar 09 13:41:26 crc kubenswrapper[4764]: I0309 13:41:26.234063 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/861cdd7d-b563-4009-9c33-a5c64d6ffae9-combined-ca-bundle\") pod \"861cdd7d-b563-4009-9c33-a5c64d6ffae9\" (UID: \"861cdd7d-b563-4009-9c33-a5c64d6ffae9\") "
Mar 09 13:41:26 crc kubenswrapper[4764]: I0309 13:41:26.234110 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/861cdd7d-b563-4009-9c33-a5c64d6ffae9-log-httpd\") pod \"861cdd7d-b563-4009-9c33-a5c64d6ffae9\" (UID: \"861cdd7d-b563-4009-9c33-a5c64d6ffae9\") "
Mar 09 13:41:26 crc kubenswrapper[4764]: I0309 13:41:26.234166 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/861cdd7d-b563-4009-9c33-a5c64d6ffae9-sg-core-conf-yaml\") pod \"861cdd7d-b563-4009-9c33-a5c64d6ffae9\" (UID: \"861cdd7d-b563-4009-9c33-a5c64d6ffae9\") "
Mar 09 13:41:26 crc kubenswrapper[4764]: I0309 13:41:26.234222 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/861cdd7d-b563-4009-9c33-a5c64d6ffae9-run-httpd\") pod \"861cdd7d-b563-4009-9c33-a5c64d6ffae9\" (UID: \"861cdd7d-b563-4009-9c33-a5c64d6ffae9\") "
Mar 09 13:41:26 crc kubenswrapper[4764]: I0309 13:41:26.245265 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/861cdd7d-b563-4009-9c33-a5c64d6ffae9-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "861cdd7d-b563-4009-9c33-a5c64d6ffae9" (UID: "861cdd7d-b563-4009-9c33-a5c64d6ffae9"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 09 13:41:26 crc kubenswrapper[4764]: I0309 13:41:26.261707 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/861cdd7d-b563-4009-9c33-a5c64d6ffae9-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "861cdd7d-b563-4009-9c33-a5c64d6ffae9" (UID: "861cdd7d-b563-4009-9c33-a5c64d6ffae9"). InnerVolumeSpecName "run-httpd".
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 13:41:26 crc kubenswrapper[4764]: I0309 13:41:26.273826 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/861cdd7d-b563-4009-9c33-a5c64d6ffae9-kube-api-access-mj2kg" (OuterVolumeSpecName: "kube-api-access-mj2kg") pod "861cdd7d-b563-4009-9c33-a5c64d6ffae9" (UID: "861cdd7d-b563-4009-9c33-a5c64d6ffae9"). InnerVolumeSpecName "kube-api-access-mj2kg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:41:26 crc kubenswrapper[4764]: I0309 13:41:26.324832 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/861cdd7d-b563-4009-9c33-a5c64d6ffae9-scripts" (OuterVolumeSpecName: "scripts") pod "861cdd7d-b563-4009-9c33-a5c64d6ffae9" (UID: "861cdd7d-b563-4009-9c33-a5c64d6ffae9"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:41:26 crc kubenswrapper[4764]: I0309 13:41:26.338028 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mj2kg\" (UniqueName: \"kubernetes.io/projected/861cdd7d-b563-4009-9c33-a5c64d6ffae9-kube-api-access-mj2kg\") on node \"crc\" DevicePath \"\"" Mar 09 13:41:26 crc kubenswrapper[4764]: I0309 13:41:26.338054 4764 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/861cdd7d-b563-4009-9c33-a5c64d6ffae9-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 13:41:26 crc kubenswrapper[4764]: I0309 13:41:26.338068 4764 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/861cdd7d-b563-4009-9c33-a5c64d6ffae9-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 09 13:41:26 crc kubenswrapper[4764]: I0309 13:41:26.338078 4764 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/861cdd7d-b563-4009-9c33-a5c64d6ffae9-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 09 13:41:26 
crc kubenswrapper[4764]: I0309 13:41:26.368988 4764 generic.go:334] "Generic (PLEG): container finished" podID="861cdd7d-b563-4009-9c33-a5c64d6ffae9" containerID="4db2afcb94ef0a26dd49d9961d5346346b289d4dbcccce08d916a8fe7f319ea2" exitCode=0 Mar 09 13:41:26 crc kubenswrapper[4764]: I0309 13:41:26.369941 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 09 13:41:26 crc kubenswrapper[4764]: I0309 13:41:26.370864 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"861cdd7d-b563-4009-9c33-a5c64d6ffae9","Type":"ContainerDied","Data":"4db2afcb94ef0a26dd49d9961d5346346b289d4dbcccce08d916a8fe7f319ea2"} Mar 09 13:41:26 crc kubenswrapper[4764]: I0309 13:41:26.370893 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"861cdd7d-b563-4009-9c33-a5c64d6ffae9","Type":"ContainerDied","Data":"6eab7083503ac11b2f955fc7b67907a3eb60734c7262ad63357465f6782429c0"} Mar 09 13:41:26 crc kubenswrapper[4764]: I0309 13:41:26.370914 4764 scope.go:117] "RemoveContainer" containerID="49c5c69c52f65c2c21bb505679de7692f04c3f794601e412d337315878f9e046" Mar 09 13:41:26 crc kubenswrapper[4764]: I0309 13:41:26.426505 4764 scope.go:117] "RemoveContainer" containerID="d869817a82b9f339ae214ccda8eaf5aa4fa4f2796d726fb166b8fa1a3f7e2eff" Mar 09 13:41:26 crc kubenswrapper[4764]: I0309 13:41:26.432737 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/861cdd7d-b563-4009-9c33-a5c64d6ffae9-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "861cdd7d-b563-4009-9c33-a5c64d6ffae9" (UID: "861cdd7d-b563-4009-9c33-a5c64d6ffae9"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:41:26 crc kubenswrapper[4764]: I0309 13:41:26.440894 4764 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/861cdd7d-b563-4009-9c33-a5c64d6ffae9-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 09 13:41:26 crc kubenswrapper[4764]: I0309 13:41:26.483074 4764 scope.go:117] "RemoveContainer" containerID="4db2afcb94ef0a26dd49d9961d5346346b289d4dbcccce08d916a8fe7f319ea2" Mar 09 13:41:26 crc kubenswrapper[4764]: I0309 13:41:26.509154 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-94887676d-fp9dl"] Mar 09 13:41:26 crc kubenswrapper[4764]: E0309 13:41:26.509680 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="861cdd7d-b563-4009-9c33-a5c64d6ffae9" containerName="ceilometer-central-agent" Mar 09 13:41:26 crc kubenswrapper[4764]: I0309 13:41:26.510146 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="861cdd7d-b563-4009-9c33-a5c64d6ffae9" containerName="ceilometer-central-agent" Mar 09 13:41:26 crc kubenswrapper[4764]: E0309 13:41:26.510176 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="861cdd7d-b563-4009-9c33-a5c64d6ffae9" containerName="ceilometer-notification-agent" Mar 09 13:41:26 crc kubenswrapper[4764]: I0309 13:41:26.510186 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="861cdd7d-b563-4009-9c33-a5c64d6ffae9" containerName="ceilometer-notification-agent" Mar 09 13:41:26 crc kubenswrapper[4764]: E0309 13:41:26.510200 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="861cdd7d-b563-4009-9c33-a5c64d6ffae9" containerName="sg-core" Mar 09 13:41:26 crc kubenswrapper[4764]: I0309 13:41:26.510209 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="861cdd7d-b563-4009-9c33-a5c64d6ffae9" containerName="sg-core" Mar 09 13:41:26 crc kubenswrapper[4764]: E0309 13:41:26.510230 4764 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="861cdd7d-b563-4009-9c33-a5c64d6ffae9" containerName="proxy-httpd" Mar 09 13:41:26 crc kubenswrapper[4764]: I0309 13:41:26.510239 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="861cdd7d-b563-4009-9c33-a5c64d6ffae9" containerName="proxy-httpd" Mar 09 13:41:26 crc kubenswrapper[4764]: I0309 13:41:26.510494 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="861cdd7d-b563-4009-9c33-a5c64d6ffae9" containerName="ceilometer-central-agent" Mar 09 13:41:26 crc kubenswrapper[4764]: I0309 13:41:26.510524 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="861cdd7d-b563-4009-9c33-a5c64d6ffae9" containerName="sg-core" Mar 09 13:41:26 crc kubenswrapper[4764]: I0309 13:41:26.510536 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="861cdd7d-b563-4009-9c33-a5c64d6ffae9" containerName="proxy-httpd" Mar 09 13:41:26 crc kubenswrapper[4764]: I0309 13:41:26.510550 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="861cdd7d-b563-4009-9c33-a5c64d6ffae9" containerName="ceilometer-notification-agent" Mar 09 13:41:26 crc kubenswrapper[4764]: I0309 13:41:26.515821 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-94887676d-fp9dl" Mar 09 13:41:26 crc kubenswrapper[4764]: I0309 13:41:26.521374 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-public-svc" Mar 09 13:41:26 crc kubenswrapper[4764]: I0309 13:41:26.521660 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-internal-svc" Mar 09 13:41:26 crc kubenswrapper[4764]: I0309 13:41:26.536743 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-94887676d-fp9dl"] Mar 09 13:41:26 crc kubenswrapper[4764]: I0309 13:41:26.537691 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/861cdd7d-b563-4009-9c33-a5c64d6ffae9-config-data" (OuterVolumeSpecName: "config-data") pod "861cdd7d-b563-4009-9c33-a5c64d6ffae9" (UID: "861cdd7d-b563-4009-9c33-a5c64d6ffae9"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:41:26 crc kubenswrapper[4764]: I0309 13:41:26.548022 4764 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/861cdd7d-b563-4009-9c33-a5c64d6ffae9-config-data\") on node \"crc\" DevicePath \"\"" Mar 09 13:41:26 crc kubenswrapper[4764]: I0309 13:41:26.578528 4764 scope.go:117] "RemoveContainer" containerID="2486ace2409cc48f14efc281a2eed3fa9e4956d145642ec1061f64d96ea18a5f" Mar 09 13:41:26 crc kubenswrapper[4764]: I0309 13:41:26.581568 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/861cdd7d-b563-4009-9c33-a5c64d6ffae9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "861cdd7d-b563-4009-9c33-a5c64d6ffae9" (UID: "861cdd7d-b563-4009-9c33-a5c64d6ffae9"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:41:26 crc kubenswrapper[4764]: I0309 13:41:26.612820 4764 scope.go:117] "RemoveContainer" containerID="49c5c69c52f65c2c21bb505679de7692f04c3f794601e412d337315878f9e046" Mar 09 13:41:26 crc kubenswrapper[4764]: E0309 13:41:26.613202 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"49c5c69c52f65c2c21bb505679de7692f04c3f794601e412d337315878f9e046\": container with ID starting with 49c5c69c52f65c2c21bb505679de7692f04c3f794601e412d337315878f9e046 not found: ID does not exist" containerID="49c5c69c52f65c2c21bb505679de7692f04c3f794601e412d337315878f9e046" Mar 09 13:41:26 crc kubenswrapper[4764]: I0309 13:41:26.613237 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"49c5c69c52f65c2c21bb505679de7692f04c3f794601e412d337315878f9e046"} err="failed to get container status \"49c5c69c52f65c2c21bb505679de7692f04c3f794601e412d337315878f9e046\": rpc error: code = NotFound desc = could not find container \"49c5c69c52f65c2c21bb505679de7692f04c3f794601e412d337315878f9e046\": container with ID starting with 49c5c69c52f65c2c21bb505679de7692f04c3f794601e412d337315878f9e046 not found: ID does not exist" Mar 09 13:41:26 crc kubenswrapper[4764]: I0309 13:41:26.613270 4764 scope.go:117] "RemoveContainer" containerID="d869817a82b9f339ae214ccda8eaf5aa4fa4f2796d726fb166b8fa1a3f7e2eff" Mar 09 13:41:26 crc kubenswrapper[4764]: E0309 13:41:26.613597 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d869817a82b9f339ae214ccda8eaf5aa4fa4f2796d726fb166b8fa1a3f7e2eff\": container with ID starting with d869817a82b9f339ae214ccda8eaf5aa4fa4f2796d726fb166b8fa1a3f7e2eff not found: ID does not exist" containerID="d869817a82b9f339ae214ccda8eaf5aa4fa4f2796d726fb166b8fa1a3f7e2eff" Mar 09 13:41:26 crc kubenswrapper[4764]: I0309 13:41:26.613663 
4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d869817a82b9f339ae214ccda8eaf5aa4fa4f2796d726fb166b8fa1a3f7e2eff"} err="failed to get container status \"d869817a82b9f339ae214ccda8eaf5aa4fa4f2796d726fb166b8fa1a3f7e2eff\": rpc error: code = NotFound desc = could not find container \"d869817a82b9f339ae214ccda8eaf5aa4fa4f2796d726fb166b8fa1a3f7e2eff\": container with ID starting with d869817a82b9f339ae214ccda8eaf5aa4fa4f2796d726fb166b8fa1a3f7e2eff not found: ID does not exist" Mar 09 13:41:26 crc kubenswrapper[4764]: I0309 13:41:26.613695 4764 scope.go:117] "RemoveContainer" containerID="4db2afcb94ef0a26dd49d9961d5346346b289d4dbcccce08d916a8fe7f319ea2" Mar 09 13:41:26 crc kubenswrapper[4764]: E0309 13:41:26.614276 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4db2afcb94ef0a26dd49d9961d5346346b289d4dbcccce08d916a8fe7f319ea2\": container with ID starting with 4db2afcb94ef0a26dd49d9961d5346346b289d4dbcccce08d916a8fe7f319ea2 not found: ID does not exist" containerID="4db2afcb94ef0a26dd49d9961d5346346b289d4dbcccce08d916a8fe7f319ea2" Mar 09 13:41:26 crc kubenswrapper[4764]: I0309 13:41:26.614304 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4db2afcb94ef0a26dd49d9961d5346346b289d4dbcccce08d916a8fe7f319ea2"} err="failed to get container status \"4db2afcb94ef0a26dd49d9961d5346346b289d4dbcccce08d916a8fe7f319ea2\": rpc error: code = NotFound desc = could not find container \"4db2afcb94ef0a26dd49d9961d5346346b289d4dbcccce08d916a8fe7f319ea2\": container with ID starting with 4db2afcb94ef0a26dd49d9961d5346346b289d4dbcccce08d916a8fe7f319ea2 not found: ID does not exist" Mar 09 13:41:26 crc kubenswrapper[4764]: I0309 13:41:26.614320 4764 scope.go:117] "RemoveContainer" containerID="2486ace2409cc48f14efc281a2eed3fa9e4956d145642ec1061f64d96ea18a5f" Mar 09 13:41:26 crc kubenswrapper[4764]: E0309 
13:41:26.614573 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2486ace2409cc48f14efc281a2eed3fa9e4956d145642ec1061f64d96ea18a5f\": container with ID starting with 2486ace2409cc48f14efc281a2eed3fa9e4956d145642ec1061f64d96ea18a5f not found: ID does not exist" containerID="2486ace2409cc48f14efc281a2eed3fa9e4956d145642ec1061f64d96ea18a5f" Mar 09 13:41:26 crc kubenswrapper[4764]: I0309 13:41:26.614630 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2486ace2409cc48f14efc281a2eed3fa9e4956d145642ec1061f64d96ea18a5f"} err="failed to get container status \"2486ace2409cc48f14efc281a2eed3fa9e4956d145642ec1061f64d96ea18a5f\": rpc error: code = NotFound desc = could not find container \"2486ace2409cc48f14efc281a2eed3fa9e4956d145642ec1061f64d96ea18a5f\": container with ID starting with 2486ace2409cc48f14efc281a2eed3fa9e4956d145642ec1061f64d96ea18a5f not found: ID does not exist" Mar 09 13:41:26 crc kubenswrapper[4764]: I0309 13:41:26.651696 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b8389bcb-fcb2-48b4-a1c2-3ae7427ecc19-internal-tls-certs\") pod \"barbican-api-94887676d-fp9dl\" (UID: \"b8389bcb-fcb2-48b4-a1c2-3ae7427ecc19\") " pod="openstack/barbican-api-94887676d-fp9dl" Mar 09 13:41:26 crc kubenswrapper[4764]: I0309 13:41:26.651926 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b8389bcb-fcb2-48b4-a1c2-3ae7427ecc19-logs\") pod \"barbican-api-94887676d-fp9dl\" (UID: \"b8389bcb-fcb2-48b4-a1c2-3ae7427ecc19\") " pod="openstack/barbican-api-94887676d-fp9dl" Mar 09 13:41:26 crc kubenswrapper[4764]: I0309 13:41:26.652241 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/b8389bcb-fcb2-48b4-a1c2-3ae7427ecc19-config-data\") pod \"barbican-api-94887676d-fp9dl\" (UID: \"b8389bcb-fcb2-48b4-a1c2-3ae7427ecc19\") " pod="openstack/barbican-api-94887676d-fp9dl" Mar 09 13:41:26 crc kubenswrapper[4764]: I0309 13:41:26.652295 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b8389bcb-fcb2-48b4-a1c2-3ae7427ecc19-config-data-custom\") pod \"barbican-api-94887676d-fp9dl\" (UID: \"b8389bcb-fcb2-48b4-a1c2-3ae7427ecc19\") " pod="openstack/barbican-api-94887676d-fp9dl" Mar 09 13:41:26 crc kubenswrapper[4764]: I0309 13:41:26.652369 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b8389bcb-fcb2-48b4-a1c2-3ae7427ecc19-public-tls-certs\") pod \"barbican-api-94887676d-fp9dl\" (UID: \"b8389bcb-fcb2-48b4-a1c2-3ae7427ecc19\") " pod="openstack/barbican-api-94887676d-fp9dl" Mar 09 13:41:26 crc kubenswrapper[4764]: I0309 13:41:26.653615 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pfxg4\" (UniqueName: \"kubernetes.io/projected/b8389bcb-fcb2-48b4-a1c2-3ae7427ecc19-kube-api-access-pfxg4\") pod \"barbican-api-94887676d-fp9dl\" (UID: \"b8389bcb-fcb2-48b4-a1c2-3ae7427ecc19\") " pod="openstack/barbican-api-94887676d-fp9dl" Mar 09 13:41:26 crc kubenswrapper[4764]: I0309 13:41:26.654024 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b8389bcb-fcb2-48b4-a1c2-3ae7427ecc19-combined-ca-bundle\") pod \"barbican-api-94887676d-fp9dl\" (UID: \"b8389bcb-fcb2-48b4-a1c2-3ae7427ecc19\") " pod="openstack/barbican-api-94887676d-fp9dl" Mar 09 13:41:26 crc kubenswrapper[4764]: I0309 13:41:26.654169 4764 reconciler_common.go:293] "Volume detached for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/861cdd7d-b563-4009-9c33-a5c64d6ffae9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 13:41:26 crc kubenswrapper[4764]: I0309 13:41:26.755822 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b8389bcb-fcb2-48b4-a1c2-3ae7427ecc19-logs\") pod \"barbican-api-94887676d-fp9dl\" (UID: \"b8389bcb-fcb2-48b4-a1c2-3ae7427ecc19\") " pod="openstack/barbican-api-94887676d-fp9dl" Mar 09 13:41:26 crc kubenswrapper[4764]: I0309 13:41:26.755936 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b8389bcb-fcb2-48b4-a1c2-3ae7427ecc19-config-data\") pod \"barbican-api-94887676d-fp9dl\" (UID: \"b8389bcb-fcb2-48b4-a1c2-3ae7427ecc19\") " pod="openstack/barbican-api-94887676d-fp9dl" Mar 09 13:41:26 crc kubenswrapper[4764]: I0309 13:41:26.756427 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b8389bcb-fcb2-48b4-a1c2-3ae7427ecc19-logs\") pod \"barbican-api-94887676d-fp9dl\" (UID: \"b8389bcb-fcb2-48b4-a1c2-3ae7427ecc19\") " pod="openstack/barbican-api-94887676d-fp9dl" Mar 09 13:41:26 crc kubenswrapper[4764]: I0309 13:41:26.755970 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b8389bcb-fcb2-48b4-a1c2-3ae7427ecc19-config-data-custom\") pod \"barbican-api-94887676d-fp9dl\" (UID: \"b8389bcb-fcb2-48b4-a1c2-3ae7427ecc19\") " pod="openstack/barbican-api-94887676d-fp9dl" Mar 09 13:41:26 crc kubenswrapper[4764]: I0309 13:41:26.757107 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b8389bcb-fcb2-48b4-a1c2-3ae7427ecc19-public-tls-certs\") pod \"barbican-api-94887676d-fp9dl\" (UID: \"b8389bcb-fcb2-48b4-a1c2-3ae7427ecc19\") " 
pod="openstack/barbican-api-94887676d-fp9dl" Mar 09 13:41:26 crc kubenswrapper[4764]: I0309 13:41:26.757625 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pfxg4\" (UniqueName: \"kubernetes.io/projected/b8389bcb-fcb2-48b4-a1c2-3ae7427ecc19-kube-api-access-pfxg4\") pod \"barbican-api-94887676d-fp9dl\" (UID: \"b8389bcb-fcb2-48b4-a1c2-3ae7427ecc19\") " pod="openstack/barbican-api-94887676d-fp9dl" Mar 09 13:41:26 crc kubenswrapper[4764]: I0309 13:41:26.758379 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b8389bcb-fcb2-48b4-a1c2-3ae7427ecc19-combined-ca-bundle\") pod \"barbican-api-94887676d-fp9dl\" (UID: \"b8389bcb-fcb2-48b4-a1c2-3ae7427ecc19\") " pod="openstack/barbican-api-94887676d-fp9dl" Mar 09 13:41:26 crc kubenswrapper[4764]: I0309 13:41:26.759101 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b8389bcb-fcb2-48b4-a1c2-3ae7427ecc19-internal-tls-certs\") pod \"barbican-api-94887676d-fp9dl\" (UID: \"b8389bcb-fcb2-48b4-a1c2-3ae7427ecc19\") " pod="openstack/barbican-api-94887676d-fp9dl" Mar 09 13:41:26 crc kubenswrapper[4764]: I0309 13:41:26.760135 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b8389bcb-fcb2-48b4-a1c2-3ae7427ecc19-config-data-custom\") pod \"barbican-api-94887676d-fp9dl\" (UID: \"b8389bcb-fcb2-48b4-a1c2-3ae7427ecc19\") " pod="openstack/barbican-api-94887676d-fp9dl" Mar 09 13:41:26 crc kubenswrapper[4764]: I0309 13:41:26.760846 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b8389bcb-fcb2-48b4-a1c2-3ae7427ecc19-config-data\") pod \"barbican-api-94887676d-fp9dl\" (UID: \"b8389bcb-fcb2-48b4-a1c2-3ae7427ecc19\") " pod="openstack/barbican-api-94887676d-fp9dl" Mar 
09 13:41:26 crc kubenswrapper[4764]: I0309 13:41:26.762534 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b8389bcb-fcb2-48b4-a1c2-3ae7427ecc19-internal-tls-certs\") pod \"barbican-api-94887676d-fp9dl\" (UID: \"b8389bcb-fcb2-48b4-a1c2-3ae7427ecc19\") " pod="openstack/barbican-api-94887676d-fp9dl" Mar 09 13:41:26 crc kubenswrapper[4764]: I0309 13:41:26.764428 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b8389bcb-fcb2-48b4-a1c2-3ae7427ecc19-public-tls-certs\") pod \"barbican-api-94887676d-fp9dl\" (UID: \"b8389bcb-fcb2-48b4-a1c2-3ae7427ecc19\") " pod="openstack/barbican-api-94887676d-fp9dl" Mar 09 13:41:26 crc kubenswrapper[4764]: I0309 13:41:26.767821 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b8389bcb-fcb2-48b4-a1c2-3ae7427ecc19-combined-ca-bundle\") pod \"barbican-api-94887676d-fp9dl\" (UID: \"b8389bcb-fcb2-48b4-a1c2-3ae7427ecc19\") " pod="openstack/barbican-api-94887676d-fp9dl" Mar 09 13:41:26 crc kubenswrapper[4764]: I0309 13:41:26.779905 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pfxg4\" (UniqueName: \"kubernetes.io/projected/b8389bcb-fcb2-48b4-a1c2-3ae7427ecc19-kube-api-access-pfxg4\") pod \"barbican-api-94887676d-fp9dl\" (UID: \"b8389bcb-fcb2-48b4-a1c2-3ae7427ecc19\") " pod="openstack/barbican-api-94887676d-fp9dl" Mar 09 13:41:26 crc kubenswrapper[4764]: I0309 13:41:26.854091 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-94887676d-fp9dl" Mar 09 13:41:26 crc kubenswrapper[4764]: I0309 13:41:26.894503 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 09 13:41:26 crc kubenswrapper[4764]: I0309 13:41:26.904123 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 09 13:41:26 crc kubenswrapper[4764]: I0309 13:41:26.924210 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 09 13:41:26 crc kubenswrapper[4764]: I0309 13:41:26.929499 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 09 13:41:26 crc kubenswrapper[4764]: I0309 13:41:26.938386 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 09 13:41:26 crc kubenswrapper[4764]: I0309 13:41:26.941190 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 09 13:41:26 crc kubenswrapper[4764]: I0309 13:41:26.967448 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 09 13:41:27 crc kubenswrapper[4764]: I0309 13:41:27.066701 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/df5caebe-ad05-48b0-bbce-ecb2ec29e7c3-scripts\") pod \"ceilometer-0\" (UID: \"df5caebe-ad05-48b0-bbce-ecb2ec29e7c3\") " pod="openstack/ceilometer-0" Mar 09 13:41:27 crc kubenswrapper[4764]: I0309 13:41:27.067225 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/df5caebe-ad05-48b0-bbce-ecb2ec29e7c3-run-httpd\") pod \"ceilometer-0\" (UID: \"df5caebe-ad05-48b0-bbce-ecb2ec29e7c3\") " pod="openstack/ceilometer-0" Mar 09 13:41:27 crc kubenswrapper[4764]: I0309 13:41:27.067263 4764 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/df5caebe-ad05-48b0-bbce-ecb2ec29e7c3-config-data\") pod \"ceilometer-0\" (UID: \"df5caebe-ad05-48b0-bbce-ecb2ec29e7c3\") " pod="openstack/ceilometer-0" Mar 09 13:41:27 crc kubenswrapper[4764]: I0309 13:41:27.067313 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/df5caebe-ad05-48b0-bbce-ecb2ec29e7c3-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"df5caebe-ad05-48b0-bbce-ecb2ec29e7c3\") " pod="openstack/ceilometer-0" Mar 09 13:41:27 crc kubenswrapper[4764]: I0309 13:41:27.067356 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/df5caebe-ad05-48b0-bbce-ecb2ec29e7c3-log-httpd\") pod \"ceilometer-0\" (UID: \"df5caebe-ad05-48b0-bbce-ecb2ec29e7c3\") " pod="openstack/ceilometer-0" Mar 09 13:41:27 crc kubenswrapper[4764]: I0309 13:41:27.067381 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qpxx4\" (UniqueName: \"kubernetes.io/projected/df5caebe-ad05-48b0-bbce-ecb2ec29e7c3-kube-api-access-qpxx4\") pod \"ceilometer-0\" (UID: \"df5caebe-ad05-48b0-bbce-ecb2ec29e7c3\") " pod="openstack/ceilometer-0" Mar 09 13:41:27 crc kubenswrapper[4764]: I0309 13:41:27.067420 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df5caebe-ad05-48b0-bbce-ecb2ec29e7c3-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"df5caebe-ad05-48b0-bbce-ecb2ec29e7c3\") " pod="openstack/ceilometer-0" Mar 09 13:41:27 crc kubenswrapper[4764]: I0309 13:41:27.169756 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/df5caebe-ad05-48b0-bbce-ecb2ec29e7c3-scripts\") pod \"ceilometer-0\" (UID: \"df5caebe-ad05-48b0-bbce-ecb2ec29e7c3\") " pod="openstack/ceilometer-0" Mar 09 13:41:27 crc kubenswrapper[4764]: I0309 13:41:27.169874 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/df5caebe-ad05-48b0-bbce-ecb2ec29e7c3-run-httpd\") pod \"ceilometer-0\" (UID: \"df5caebe-ad05-48b0-bbce-ecb2ec29e7c3\") " pod="openstack/ceilometer-0" Mar 09 13:41:27 crc kubenswrapper[4764]: I0309 13:41:27.169908 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/df5caebe-ad05-48b0-bbce-ecb2ec29e7c3-config-data\") pod \"ceilometer-0\" (UID: \"df5caebe-ad05-48b0-bbce-ecb2ec29e7c3\") " pod="openstack/ceilometer-0" Mar 09 13:41:27 crc kubenswrapper[4764]: I0309 13:41:27.169964 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/df5caebe-ad05-48b0-bbce-ecb2ec29e7c3-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"df5caebe-ad05-48b0-bbce-ecb2ec29e7c3\") " pod="openstack/ceilometer-0" Mar 09 13:41:27 crc kubenswrapper[4764]: I0309 13:41:27.170003 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/df5caebe-ad05-48b0-bbce-ecb2ec29e7c3-log-httpd\") pod \"ceilometer-0\" (UID: \"df5caebe-ad05-48b0-bbce-ecb2ec29e7c3\") " pod="openstack/ceilometer-0" Mar 09 13:41:27 crc kubenswrapper[4764]: I0309 13:41:27.170023 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qpxx4\" (UniqueName: \"kubernetes.io/projected/df5caebe-ad05-48b0-bbce-ecb2ec29e7c3-kube-api-access-qpxx4\") pod \"ceilometer-0\" (UID: \"df5caebe-ad05-48b0-bbce-ecb2ec29e7c3\") " pod="openstack/ceilometer-0" Mar 09 13:41:27 crc kubenswrapper[4764]: I0309 
13:41:27.170066 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df5caebe-ad05-48b0-bbce-ecb2ec29e7c3-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"df5caebe-ad05-48b0-bbce-ecb2ec29e7c3\") " pod="openstack/ceilometer-0" Mar 09 13:41:27 crc kubenswrapper[4764]: I0309 13:41:27.170499 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/df5caebe-ad05-48b0-bbce-ecb2ec29e7c3-run-httpd\") pod \"ceilometer-0\" (UID: \"df5caebe-ad05-48b0-bbce-ecb2ec29e7c3\") " pod="openstack/ceilometer-0" Mar 09 13:41:27 crc kubenswrapper[4764]: I0309 13:41:27.170836 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/df5caebe-ad05-48b0-bbce-ecb2ec29e7c3-log-httpd\") pod \"ceilometer-0\" (UID: \"df5caebe-ad05-48b0-bbce-ecb2ec29e7c3\") " pod="openstack/ceilometer-0" Mar 09 13:41:27 crc kubenswrapper[4764]: I0309 13:41:27.179671 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/df5caebe-ad05-48b0-bbce-ecb2ec29e7c3-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"df5caebe-ad05-48b0-bbce-ecb2ec29e7c3\") " pod="openstack/ceilometer-0" Mar 09 13:41:27 crc kubenswrapper[4764]: I0309 13:41:27.179887 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/df5caebe-ad05-48b0-bbce-ecb2ec29e7c3-config-data\") pod \"ceilometer-0\" (UID: \"df5caebe-ad05-48b0-bbce-ecb2ec29e7c3\") " pod="openstack/ceilometer-0" Mar 09 13:41:27 crc kubenswrapper[4764]: I0309 13:41:27.180030 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/df5caebe-ad05-48b0-bbce-ecb2ec29e7c3-scripts\") pod \"ceilometer-0\" (UID: \"df5caebe-ad05-48b0-bbce-ecb2ec29e7c3\") " 
pod="openstack/ceilometer-0" Mar 09 13:41:27 crc kubenswrapper[4764]: I0309 13:41:27.183600 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df5caebe-ad05-48b0-bbce-ecb2ec29e7c3-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"df5caebe-ad05-48b0-bbce-ecb2ec29e7c3\") " pod="openstack/ceilometer-0" Mar 09 13:41:27 crc kubenswrapper[4764]: I0309 13:41:27.192326 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qpxx4\" (UniqueName: \"kubernetes.io/projected/df5caebe-ad05-48b0-bbce-ecb2ec29e7c3-kube-api-access-qpxx4\") pod \"ceilometer-0\" (UID: \"df5caebe-ad05-48b0-bbce-ecb2ec29e7c3\") " pod="openstack/ceilometer-0" Mar 09 13:41:27 crc kubenswrapper[4764]: I0309 13:41:27.313110 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 09 13:41:27 crc kubenswrapper[4764]: I0309 13:41:27.371093 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-94887676d-fp9dl"] Mar 09 13:41:27 crc kubenswrapper[4764]: W0309 13:41:27.385957 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb8389bcb_fcb2_48b4_a1c2_3ae7427ecc19.slice/crio-d669121abd84d11c4a6a8316b10e175a9237b52ac50e78e7464435c0f73518ce WatchSource:0}: Error finding container d669121abd84d11c4a6a8316b10e175a9237b52ac50e78e7464435c0f73518ce: Status 404 returned error can't find the container with id d669121abd84d11c4a6a8316b10e175a9237b52ac50e78e7464435c0f73518ce Mar 09 13:41:27 crc kubenswrapper[4764]: I0309 13:41:27.392984 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-7c6f54974-hws5g" event={"ID":"154490f8-97ab-4703-a96c-16b6d5f7a178","Type":"ContainerStarted","Data":"d4b3b72c9386bd5a9ede799e410a50128df2c3496a09b743dba392f2cc5e257f"} Mar 09 13:41:27 crc kubenswrapper[4764]: I0309 
13:41:27.393060 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-7c6f54974-hws5g" event={"ID":"154490f8-97ab-4703-a96c-16b6d5f7a178","Type":"ContainerStarted","Data":"4ae3d0da6781eabbb5adafca215ad12ad0c3d525f95ef600a036122539dea3c9"} Mar 09 13:41:27 crc kubenswrapper[4764]: I0309 13:41:27.404845 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-869f779d85-qsc97" event={"ID":"c38cb253-b945-43b3-8dcd-209682d40f11","Type":"ContainerStarted","Data":"15010747611fb0fe00d92531331c8086dfb085efb07034db0f644ee0d2a65525"} Mar 09 13:41:27 crc kubenswrapper[4764]: I0309 13:41:27.405779 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-869f779d85-qsc97" Mar 09 13:41:27 crc kubenswrapper[4764]: I0309 13:41:27.419509 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-7c6f54974-hws5g" podStartSLOduration=3.247329244 podStartE2EDuration="5.419486129s" podCreationTimestamp="2026-03-09 13:41:22 +0000 UTC" firstStartedPulling="2026-03-09 13:41:23.927972964 +0000 UTC m=+1239.178144872" lastFinishedPulling="2026-03-09 13:41:26.100129849 +0000 UTC m=+1241.350301757" observedRunningTime="2026-03-09 13:41:27.411769585 +0000 UTC m=+1242.661941503" watchObservedRunningTime="2026-03-09 13:41:27.419486129 +0000 UTC m=+1242.669658037" Mar 09 13:41:27 crc kubenswrapper[4764]: I0309 13:41:27.439231 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-789c56cf69-2dj2c" event={"ID":"a18071d3-1164-4080-9095-919bb5349bb8","Type":"ContainerStarted","Data":"4672c16d453e1e274b6da68e97c12c8549633d1027aae998ed8d544a8dc9eae4"} Mar 09 13:41:27 crc kubenswrapper[4764]: I0309 13:41:27.439309 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-789c56cf69-2dj2c" 
event={"ID":"a18071d3-1164-4080-9095-919bb5349bb8","Type":"ContainerStarted","Data":"6db9a396567078aa433f28579d87de04a68fe1f9ac000331309e7871a4e32d55"} Mar 09 13:41:27 crc kubenswrapper[4764]: I0309 13:41:27.451291 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-869f779d85-qsc97" podStartSLOduration=4.451262735 podStartE2EDuration="4.451262735s" podCreationTimestamp="2026-03-09 13:41:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:41:27.439190312 +0000 UTC m=+1242.689362230" watchObservedRunningTime="2026-03-09 13:41:27.451262735 +0000 UTC m=+1242.701434633" Mar 09 13:41:27 crc kubenswrapper[4764]: I0309 13:41:27.474480 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-789c56cf69-2dj2c" podStartSLOduration=3.372097561 podStartE2EDuration="5.474448176s" podCreationTimestamp="2026-03-09 13:41:22 +0000 UTC" firstStartedPulling="2026-03-09 13:41:24.002529123 +0000 UTC m=+1239.252701031" lastFinishedPulling="2026-03-09 13:41:26.104879738 +0000 UTC m=+1241.355051646" observedRunningTime="2026-03-09 13:41:27.468926818 +0000 UTC m=+1242.719098746" watchObservedRunningTime="2026-03-09 13:41:27.474448176 +0000 UTC m=+1242.724620074" Mar 09 13:41:27 crc kubenswrapper[4764]: I0309 13:41:27.577517 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="861cdd7d-b563-4009-9c33-a5c64d6ffae9" path="/var/lib/kubelet/pods/861cdd7d-b563-4009-9c33-a5c64d6ffae9/volumes" Mar 09 13:41:27 crc kubenswrapper[4764]: I0309 13:41:27.823193 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 09 13:41:27 crc kubenswrapper[4764]: W0309 13:41:27.840581 4764 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddf5caebe_ad05_48b0_bbce_ecb2ec29e7c3.slice/crio-a9199722fbed2e4a6e2a95c99d206366b5d4467326eba28ca7c133400c787e70 WatchSource:0}: Error finding container a9199722fbed2e4a6e2a95c99d206366b5d4467326eba28ca7c133400c787e70: Status 404 returned error can't find the container with id a9199722fbed2e4a6e2a95c99d206366b5d4467326eba28ca7c133400c787e70 Mar 09 13:41:28 crc kubenswrapper[4764]: I0309 13:41:28.451119 4764 generic.go:334] "Generic (PLEG): container finished" podID="74146b7d-9780-4d2d-9454-853296f88955" containerID="1a80c7f5de2d815f10d1b147b52b1c923bf4e3266278afd841a98b5bc66a8ad6" exitCode=0 Mar 09 13:41:28 crc kubenswrapper[4764]: I0309 13:41:28.451205 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-dp5x6" event={"ID":"74146b7d-9780-4d2d-9454-853296f88955","Type":"ContainerDied","Data":"1a80c7f5de2d815f10d1b147b52b1c923bf4e3266278afd841a98b5bc66a8ad6"} Mar 09 13:41:28 crc kubenswrapper[4764]: I0309 13:41:28.452333 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"df5caebe-ad05-48b0-bbce-ecb2ec29e7c3","Type":"ContainerStarted","Data":"a9199722fbed2e4a6e2a95c99d206366b5d4467326eba28ca7c133400c787e70"} Mar 09 13:41:28 crc kubenswrapper[4764]: I0309 13:41:28.453949 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-94887676d-fp9dl" event={"ID":"b8389bcb-fcb2-48b4-a1c2-3ae7427ecc19","Type":"ContainerStarted","Data":"c1ae48c7a6db67e1b6ac1ce4507bf46720f7c1f1d15e14d2c397114aeca82582"} Mar 09 13:41:28 crc kubenswrapper[4764]: I0309 13:41:28.453996 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-94887676d-fp9dl" event={"ID":"b8389bcb-fcb2-48b4-a1c2-3ae7427ecc19","Type":"ContainerStarted","Data":"0afbaf2d7a7ce41452a514f65b61e8e6312ab49e880061ea621f68858d4cab47"} Mar 09 13:41:28 crc kubenswrapper[4764]: I0309 13:41:28.454008 4764 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/barbican-api-94887676d-fp9dl" event={"ID":"b8389bcb-fcb2-48b4-a1c2-3ae7427ecc19","Type":"ContainerStarted","Data":"d669121abd84d11c4a6a8316b10e175a9237b52ac50e78e7464435c0f73518ce"} Mar 09 13:41:28 crc kubenswrapper[4764]: I0309 13:41:28.503373 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-94887676d-fp9dl" podStartSLOduration=2.503352495 podStartE2EDuration="2.503352495s" podCreationTimestamp="2026-03-09 13:41:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:41:28.493764295 +0000 UTC m=+1243.743936223" watchObservedRunningTime="2026-03-09 13:41:28.503352495 +0000 UTC m=+1243.753524403" Mar 09 13:41:29 crc kubenswrapper[4764]: I0309 13:41:29.464211 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"df5caebe-ad05-48b0-bbce-ecb2ec29e7c3","Type":"ContainerStarted","Data":"d86adcecb94ef745afb424352583b984c64f486d6d2dfabedad259db0634d772"} Mar 09 13:41:29 crc kubenswrapper[4764]: I0309 13:41:29.464755 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-94887676d-fp9dl" Mar 09 13:41:29 crc kubenswrapper[4764]: I0309 13:41:29.464776 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-94887676d-fp9dl" Mar 09 13:41:29 crc kubenswrapper[4764]: I0309 13:41:29.868389 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-dp5x6" Mar 09 13:41:29 crc kubenswrapper[4764]: I0309 13:41:29.953574 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/74146b7d-9780-4d2d-9454-853296f88955-scripts\") pod \"74146b7d-9780-4d2d-9454-853296f88955\" (UID: \"74146b7d-9780-4d2d-9454-853296f88955\") " Mar 09 13:41:29 crc kubenswrapper[4764]: I0309 13:41:29.953709 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/74146b7d-9780-4d2d-9454-853296f88955-config-data\") pod \"74146b7d-9780-4d2d-9454-853296f88955\" (UID: \"74146b7d-9780-4d2d-9454-853296f88955\") " Mar 09 13:41:29 crc kubenswrapper[4764]: I0309 13:41:29.953789 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/74146b7d-9780-4d2d-9454-853296f88955-etc-machine-id\") pod \"74146b7d-9780-4d2d-9454-853296f88955\" (UID: \"74146b7d-9780-4d2d-9454-853296f88955\") " Mar 09 13:41:29 crc kubenswrapper[4764]: I0309 13:41:29.953874 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-khssq\" (UniqueName: \"kubernetes.io/projected/74146b7d-9780-4d2d-9454-853296f88955-kube-api-access-khssq\") pod \"74146b7d-9780-4d2d-9454-853296f88955\" (UID: \"74146b7d-9780-4d2d-9454-853296f88955\") " Mar 09 13:41:29 crc kubenswrapper[4764]: I0309 13:41:29.954036 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/74146b7d-9780-4d2d-9454-853296f88955-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "74146b7d-9780-4d2d-9454-853296f88955" (UID: "74146b7d-9780-4d2d-9454-853296f88955"). InnerVolumeSpecName "etc-machine-id". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 09 13:41:29 crc kubenswrapper[4764]: I0309 13:41:29.953901 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/74146b7d-9780-4d2d-9454-853296f88955-db-sync-config-data\") pod \"74146b7d-9780-4d2d-9454-853296f88955\" (UID: \"74146b7d-9780-4d2d-9454-853296f88955\") " Mar 09 13:41:29 crc kubenswrapper[4764]: I0309 13:41:29.954696 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74146b7d-9780-4d2d-9454-853296f88955-combined-ca-bundle\") pod \"74146b7d-9780-4d2d-9454-853296f88955\" (UID: \"74146b7d-9780-4d2d-9454-853296f88955\") " Mar 09 13:41:29 crc kubenswrapper[4764]: I0309 13:41:29.955222 4764 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/74146b7d-9780-4d2d-9454-853296f88955-etc-machine-id\") on node \"crc\" DevicePath \"\"" Mar 09 13:41:29 crc kubenswrapper[4764]: I0309 13:41:29.969389 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/74146b7d-9780-4d2d-9454-853296f88955-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "74146b7d-9780-4d2d-9454-853296f88955" (UID: "74146b7d-9780-4d2d-9454-853296f88955"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:41:29 crc kubenswrapper[4764]: I0309 13:41:29.969726 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/74146b7d-9780-4d2d-9454-853296f88955-kube-api-access-khssq" (OuterVolumeSpecName: "kube-api-access-khssq") pod "74146b7d-9780-4d2d-9454-853296f88955" (UID: "74146b7d-9780-4d2d-9454-853296f88955"). InnerVolumeSpecName "kube-api-access-khssq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:41:29 crc kubenswrapper[4764]: I0309 13:41:29.995926 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/74146b7d-9780-4d2d-9454-853296f88955-scripts" (OuterVolumeSpecName: "scripts") pod "74146b7d-9780-4d2d-9454-853296f88955" (UID: "74146b7d-9780-4d2d-9454-853296f88955"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:41:30 crc kubenswrapper[4764]: I0309 13:41:30.056912 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/74146b7d-9780-4d2d-9454-853296f88955-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "74146b7d-9780-4d2d-9454-853296f88955" (UID: "74146b7d-9780-4d2d-9454-853296f88955"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:41:30 crc kubenswrapper[4764]: I0309 13:41:30.056953 4764 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/74146b7d-9780-4d2d-9454-853296f88955-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 13:41:30 crc kubenswrapper[4764]: I0309 13:41:30.057503 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-khssq\" (UniqueName: \"kubernetes.io/projected/74146b7d-9780-4d2d-9454-853296f88955-kube-api-access-khssq\") on node \"crc\" DevicePath \"\"" Mar 09 13:41:30 crc kubenswrapper[4764]: I0309 13:41:30.057524 4764 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/74146b7d-9780-4d2d-9454-853296f88955-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Mar 09 13:41:30 crc kubenswrapper[4764]: I0309 13:41:30.159199 4764 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74146b7d-9780-4d2d-9454-853296f88955-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" 
Mar 09 13:41:30 crc kubenswrapper[4764]: I0309 13:41:30.168768 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/74146b7d-9780-4d2d-9454-853296f88955-config-data" (OuterVolumeSpecName: "config-data") pod "74146b7d-9780-4d2d-9454-853296f88955" (UID: "74146b7d-9780-4d2d-9454-853296f88955"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:41:30 crc kubenswrapper[4764]: I0309 13:41:30.261447 4764 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/74146b7d-9780-4d2d-9454-853296f88955-config-data\") on node \"crc\" DevicePath \"\"" Mar 09 13:41:30 crc kubenswrapper[4764]: I0309 13:41:30.483797 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-dp5x6" event={"ID":"74146b7d-9780-4d2d-9454-853296f88955","Type":"ContainerDied","Data":"eedf3d5fa18d75dee38a49a615e9d2f0a831d8c6f13a33063d424dd9831c9a81"} Mar 09 13:41:30 crc kubenswrapper[4764]: I0309 13:41:30.485768 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="eedf3d5fa18d75dee38a49a615e9d2f0a831d8c6f13a33063d424dd9831c9a81" Mar 09 13:41:30 crc kubenswrapper[4764]: I0309 13:41:30.485053 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-dp5x6" Mar 09 13:41:30 crc kubenswrapper[4764]: I0309 13:41:30.489017 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"df5caebe-ad05-48b0-bbce-ecb2ec29e7c3","Type":"ContainerStarted","Data":"4fe2306e68853bfa3180ad2e5295480a2103723960a9860736e8ad880fc1db61"} Mar 09 13:41:30 crc kubenswrapper[4764]: I0309 13:41:30.489104 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"df5caebe-ad05-48b0-bbce-ecb2ec29e7c3","Type":"ContainerStarted","Data":"1e8bef222d882b21a7ec25aed9429545abfc2c7a862b52718f01f100ae12b17d"} Mar 09 13:41:30 crc kubenswrapper[4764]: I0309 13:41:30.794947 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Mar 09 13:41:30 crc kubenswrapper[4764]: E0309 13:41:30.795906 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="74146b7d-9780-4d2d-9454-853296f88955" containerName="cinder-db-sync" Mar 09 13:41:30 crc kubenswrapper[4764]: I0309 13:41:30.795926 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="74146b7d-9780-4d2d-9454-853296f88955" containerName="cinder-db-sync" Mar 09 13:41:30 crc kubenswrapper[4764]: I0309 13:41:30.796118 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="74146b7d-9780-4d2d-9454-853296f88955" containerName="cinder-db-sync" Mar 09 13:41:30 crc kubenswrapper[4764]: I0309 13:41:30.797946 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 09 13:41:30 crc kubenswrapper[4764]: I0309 13:41:30.805043 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Mar 09 13:41:30 crc kubenswrapper[4764]: I0309 13:41:30.805829 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Mar 09 13:41:30 crc kubenswrapper[4764]: I0309 13:41:30.807520 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Mar 09 13:41:30 crc kubenswrapper[4764]: I0309 13:41:30.807774 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-4vt4n" Mar 09 13:41:30 crc kubenswrapper[4764]: I0309 13:41:30.808391 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 09 13:41:30 crc kubenswrapper[4764]: I0309 13:41:30.878284 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6afeb6a7-a0a0-40de-90ee-97b497663798-config-data\") pod \"cinder-scheduler-0\" (UID: \"6afeb6a7-a0a0-40de-90ee-97b497663798\") " pod="openstack/cinder-scheduler-0" Mar 09 13:41:30 crc kubenswrapper[4764]: I0309 13:41:30.883901 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6afeb6a7-a0a0-40de-90ee-97b497663798-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"6afeb6a7-a0a0-40de-90ee-97b497663798\") " pod="openstack/cinder-scheduler-0" Mar 09 13:41:30 crc kubenswrapper[4764]: I0309 13:41:30.884252 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5rsb9\" (UniqueName: \"kubernetes.io/projected/6afeb6a7-a0a0-40de-90ee-97b497663798-kube-api-access-5rsb9\") pod \"cinder-scheduler-0\" (UID: 
\"6afeb6a7-a0a0-40de-90ee-97b497663798\") " pod="openstack/cinder-scheduler-0" Mar 09 13:41:30 crc kubenswrapper[4764]: I0309 13:41:30.884451 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6afeb6a7-a0a0-40de-90ee-97b497663798-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"6afeb6a7-a0a0-40de-90ee-97b497663798\") " pod="openstack/cinder-scheduler-0" Mar 09 13:41:30 crc kubenswrapper[4764]: I0309 13:41:30.884618 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6afeb6a7-a0a0-40de-90ee-97b497663798-scripts\") pod \"cinder-scheduler-0\" (UID: \"6afeb6a7-a0a0-40de-90ee-97b497663798\") " pod="openstack/cinder-scheduler-0" Mar 09 13:41:30 crc kubenswrapper[4764]: I0309 13:41:30.885139 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6afeb6a7-a0a0-40de-90ee-97b497663798-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"6afeb6a7-a0a0-40de-90ee-97b497663798\") " pod="openstack/cinder-scheduler-0" Mar 09 13:41:30 crc kubenswrapper[4764]: I0309 13:41:30.960959 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-869f779d85-qsc97"] Mar 09 13:41:30 crc kubenswrapper[4764]: I0309 13:41:30.977208 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-869f779d85-qsc97" podUID="c38cb253-b945-43b3-8dcd-209682d40f11" containerName="dnsmasq-dns" containerID="cri-o://15010747611fb0fe00d92531331c8086dfb085efb07034db0f644ee0d2a65525" gracePeriod=10 Mar 09 13:41:30 crc kubenswrapper[4764]: I0309 13:41:30.988877 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5rsb9\" (UniqueName: 
\"kubernetes.io/projected/6afeb6a7-a0a0-40de-90ee-97b497663798-kube-api-access-5rsb9\") pod \"cinder-scheduler-0\" (UID: \"6afeb6a7-a0a0-40de-90ee-97b497663798\") " pod="openstack/cinder-scheduler-0" Mar 09 13:41:30 crc kubenswrapper[4764]: I0309 13:41:30.988961 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6afeb6a7-a0a0-40de-90ee-97b497663798-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"6afeb6a7-a0a0-40de-90ee-97b497663798\") " pod="openstack/cinder-scheduler-0" Mar 09 13:41:30 crc kubenswrapper[4764]: I0309 13:41:30.988992 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6afeb6a7-a0a0-40de-90ee-97b497663798-scripts\") pod \"cinder-scheduler-0\" (UID: \"6afeb6a7-a0a0-40de-90ee-97b497663798\") " pod="openstack/cinder-scheduler-0" Mar 09 13:41:30 crc kubenswrapper[4764]: I0309 13:41:30.989080 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6afeb6a7-a0a0-40de-90ee-97b497663798-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"6afeb6a7-a0a0-40de-90ee-97b497663798\") " pod="openstack/cinder-scheduler-0" Mar 09 13:41:30 crc kubenswrapper[4764]: I0309 13:41:30.989122 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6afeb6a7-a0a0-40de-90ee-97b497663798-config-data\") pod \"cinder-scheduler-0\" (UID: \"6afeb6a7-a0a0-40de-90ee-97b497663798\") " pod="openstack/cinder-scheduler-0" Mar 09 13:41:30 crc kubenswrapper[4764]: I0309 13:41:30.989147 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6afeb6a7-a0a0-40de-90ee-97b497663798-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"6afeb6a7-a0a0-40de-90ee-97b497663798\") " 
pod="openstack/cinder-scheduler-0" Mar 09 13:41:30 crc kubenswrapper[4764]: I0309 13:41:30.991033 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6afeb6a7-a0a0-40de-90ee-97b497663798-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"6afeb6a7-a0a0-40de-90ee-97b497663798\") " pod="openstack/cinder-scheduler-0" Mar 09 13:41:30 crc kubenswrapper[4764]: I0309 13:41:30.999404 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6afeb6a7-a0a0-40de-90ee-97b497663798-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"6afeb6a7-a0a0-40de-90ee-97b497663798\") " pod="openstack/cinder-scheduler-0" Mar 09 13:41:31 crc kubenswrapper[4764]: I0309 13:41:31.000203 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6afeb6a7-a0a0-40de-90ee-97b497663798-scripts\") pod \"cinder-scheduler-0\" (UID: \"6afeb6a7-a0a0-40de-90ee-97b497663798\") " pod="openstack/cinder-scheduler-0" Mar 09 13:41:31 crc kubenswrapper[4764]: I0309 13:41:31.009385 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6afeb6a7-a0a0-40de-90ee-97b497663798-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"6afeb6a7-a0a0-40de-90ee-97b497663798\") " pod="openstack/cinder-scheduler-0" Mar 09 13:41:31 crc kubenswrapper[4764]: I0309 13:41:31.014792 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6afeb6a7-a0a0-40de-90ee-97b497663798-config-data\") pod \"cinder-scheduler-0\" (UID: \"6afeb6a7-a0a0-40de-90ee-97b497663798\") " pod="openstack/cinder-scheduler-0" Mar 09 13:41:31 crc kubenswrapper[4764]: I0309 13:41:31.014898 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-58db5546cc-d2v25"] Mar 09 
13:41:31 crc kubenswrapper[4764]: I0309 13:41:31.016791 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-58db5546cc-d2v25" Mar 09 13:41:31 crc kubenswrapper[4764]: I0309 13:41:31.023818 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-58db5546cc-d2v25"] Mar 09 13:41:31 crc kubenswrapper[4764]: I0309 13:41:31.034706 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5rsb9\" (UniqueName: \"kubernetes.io/projected/6afeb6a7-a0a0-40de-90ee-97b497663798-kube-api-access-5rsb9\") pod \"cinder-scheduler-0\" (UID: \"6afeb6a7-a0a0-40de-90ee-97b497663798\") " pod="openstack/cinder-scheduler-0" Mar 09 13:41:31 crc kubenswrapper[4764]: I0309 13:41:31.101013 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4e886001-842a-4f97-b4c3-d088d80e6a45-ovsdbserver-sb\") pod \"dnsmasq-dns-58db5546cc-d2v25\" (UID: \"4e886001-842a-4f97-b4c3-d088d80e6a45\") " pod="openstack/dnsmasq-dns-58db5546cc-d2v25" Mar 09 13:41:31 crc kubenswrapper[4764]: I0309 13:41:31.101129 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zcpch\" (UniqueName: \"kubernetes.io/projected/4e886001-842a-4f97-b4c3-d088d80e6a45-kube-api-access-zcpch\") pod \"dnsmasq-dns-58db5546cc-d2v25\" (UID: \"4e886001-842a-4f97-b4c3-d088d80e6a45\") " pod="openstack/dnsmasq-dns-58db5546cc-d2v25" Mar 09 13:41:31 crc kubenswrapper[4764]: I0309 13:41:31.101193 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4e886001-842a-4f97-b4c3-d088d80e6a45-ovsdbserver-nb\") pod \"dnsmasq-dns-58db5546cc-d2v25\" (UID: \"4e886001-842a-4f97-b4c3-d088d80e6a45\") " pod="openstack/dnsmasq-dns-58db5546cc-d2v25" Mar 09 13:41:31 crc kubenswrapper[4764]: 
I0309 13:41:31.101247 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4e886001-842a-4f97-b4c3-d088d80e6a45-config\") pod \"dnsmasq-dns-58db5546cc-d2v25\" (UID: \"4e886001-842a-4f97-b4c3-d088d80e6a45\") " pod="openstack/dnsmasq-dns-58db5546cc-d2v25" Mar 09 13:41:31 crc kubenswrapper[4764]: I0309 13:41:31.101288 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4e886001-842a-4f97-b4c3-d088d80e6a45-dns-svc\") pod \"dnsmasq-dns-58db5546cc-d2v25\" (UID: \"4e886001-842a-4f97-b4c3-d088d80e6a45\") " pod="openstack/dnsmasq-dns-58db5546cc-d2v25" Mar 09 13:41:31 crc kubenswrapper[4764]: I0309 13:41:31.195809 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 09 13:41:31 crc kubenswrapper[4764]: I0309 13:41:31.203395 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4e886001-842a-4f97-b4c3-d088d80e6a45-ovsdbserver-nb\") pod \"dnsmasq-dns-58db5546cc-d2v25\" (UID: \"4e886001-842a-4f97-b4c3-d088d80e6a45\") " pod="openstack/dnsmasq-dns-58db5546cc-d2v25" Mar 09 13:41:31 crc kubenswrapper[4764]: I0309 13:41:31.203512 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4e886001-842a-4f97-b4c3-d088d80e6a45-config\") pod \"dnsmasq-dns-58db5546cc-d2v25\" (UID: \"4e886001-842a-4f97-b4c3-d088d80e6a45\") " pod="openstack/dnsmasq-dns-58db5546cc-d2v25" Mar 09 13:41:31 crc kubenswrapper[4764]: I0309 13:41:31.203566 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4e886001-842a-4f97-b4c3-d088d80e6a45-dns-svc\") pod \"dnsmasq-dns-58db5546cc-d2v25\" (UID: 
\"4e886001-842a-4f97-b4c3-d088d80e6a45\") " pod="openstack/dnsmasq-dns-58db5546cc-d2v25" Mar 09 13:41:31 crc kubenswrapper[4764]: I0309 13:41:31.203610 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4e886001-842a-4f97-b4c3-d088d80e6a45-ovsdbserver-sb\") pod \"dnsmasq-dns-58db5546cc-d2v25\" (UID: \"4e886001-842a-4f97-b4c3-d088d80e6a45\") " pod="openstack/dnsmasq-dns-58db5546cc-d2v25" Mar 09 13:41:31 crc kubenswrapper[4764]: I0309 13:41:31.203689 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zcpch\" (UniqueName: \"kubernetes.io/projected/4e886001-842a-4f97-b4c3-d088d80e6a45-kube-api-access-zcpch\") pod \"dnsmasq-dns-58db5546cc-d2v25\" (UID: \"4e886001-842a-4f97-b4c3-d088d80e6a45\") " pod="openstack/dnsmasq-dns-58db5546cc-d2v25" Mar 09 13:41:31 crc kubenswrapper[4764]: I0309 13:41:31.204789 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4e886001-842a-4f97-b4c3-d088d80e6a45-dns-svc\") pod \"dnsmasq-dns-58db5546cc-d2v25\" (UID: \"4e886001-842a-4f97-b4c3-d088d80e6a45\") " pod="openstack/dnsmasq-dns-58db5546cc-d2v25" Mar 09 13:41:31 crc kubenswrapper[4764]: I0309 13:41:31.204841 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4e886001-842a-4f97-b4c3-d088d80e6a45-ovsdbserver-nb\") pod \"dnsmasq-dns-58db5546cc-d2v25\" (UID: \"4e886001-842a-4f97-b4c3-d088d80e6a45\") " pod="openstack/dnsmasq-dns-58db5546cc-d2v25" Mar 09 13:41:31 crc kubenswrapper[4764]: I0309 13:41:31.206179 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4e886001-842a-4f97-b4c3-d088d80e6a45-config\") pod \"dnsmasq-dns-58db5546cc-d2v25\" (UID: \"4e886001-842a-4f97-b4c3-d088d80e6a45\") " pod="openstack/dnsmasq-dns-58db5546cc-d2v25" Mar 
09 13:41:31 crc kubenswrapper[4764]: I0309 13:41:31.206331 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4e886001-842a-4f97-b4c3-d088d80e6a45-ovsdbserver-sb\") pod \"dnsmasq-dns-58db5546cc-d2v25\" (UID: \"4e886001-842a-4f97-b4c3-d088d80e6a45\") " pod="openstack/dnsmasq-dns-58db5546cc-d2v25" Mar 09 13:41:31 crc kubenswrapper[4764]: I0309 13:41:31.240466 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zcpch\" (UniqueName: \"kubernetes.io/projected/4e886001-842a-4f97-b4c3-d088d80e6a45-kube-api-access-zcpch\") pod \"dnsmasq-dns-58db5546cc-d2v25\" (UID: \"4e886001-842a-4f97-b4c3-d088d80e6a45\") " pod="openstack/dnsmasq-dns-58db5546cc-d2v25" Mar 09 13:41:31 crc kubenswrapper[4764]: I0309 13:41:31.253286 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Mar 09 13:41:31 crc kubenswrapper[4764]: I0309 13:41:31.254831 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Mar 09 13:41:31 crc kubenswrapper[4764]: I0309 13:41:31.277090 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Mar 09 13:41:31 crc kubenswrapper[4764]: I0309 13:41:31.305622 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8c43805e-424a-4820-924b-314b3e2f0a84-scripts\") pod \"cinder-api-0\" (UID: \"8c43805e-424a-4820-924b-314b3e2f0a84\") " pod="openstack/cinder-api-0" Mar 09 13:41:31 crc kubenswrapper[4764]: I0309 13:41:31.305734 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8c43805e-424a-4820-924b-314b3e2f0a84-logs\") pod \"cinder-api-0\" (UID: \"8c43805e-424a-4820-924b-314b3e2f0a84\") " pod="openstack/cinder-api-0" Mar 09 13:41:31 crc kubenswrapper[4764]: I0309 13:41:31.306158 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/8c43805e-424a-4820-924b-314b3e2f0a84-etc-machine-id\") pod \"cinder-api-0\" (UID: \"8c43805e-424a-4820-924b-314b3e2f0a84\") " pod="openstack/cinder-api-0" Mar 09 13:41:31 crc kubenswrapper[4764]: I0309 13:41:31.306358 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8c43805e-424a-4820-924b-314b3e2f0a84-config-data-custom\") pod \"cinder-api-0\" (UID: \"8c43805e-424a-4820-924b-314b3e2f0a84\") " pod="openstack/cinder-api-0" Mar 09 13:41:31 crc kubenswrapper[4764]: I0309 13:41:31.306440 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zz6zt\" (UniqueName: \"kubernetes.io/projected/8c43805e-424a-4820-924b-314b3e2f0a84-kube-api-access-zz6zt\") pod 
\"cinder-api-0\" (UID: \"8c43805e-424a-4820-924b-314b3e2f0a84\") " pod="openstack/cinder-api-0" Mar 09 13:41:31 crc kubenswrapper[4764]: I0309 13:41:31.306544 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c43805e-424a-4820-924b-314b3e2f0a84-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"8c43805e-424a-4820-924b-314b3e2f0a84\") " pod="openstack/cinder-api-0" Mar 09 13:41:31 crc kubenswrapper[4764]: I0309 13:41:31.306574 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8c43805e-424a-4820-924b-314b3e2f0a84-config-data\") pod \"cinder-api-0\" (UID: \"8c43805e-424a-4820-924b-314b3e2f0a84\") " pod="openstack/cinder-api-0" Mar 09 13:41:31 crc kubenswrapper[4764]: I0309 13:41:31.314997 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Mar 09 13:41:31 crc kubenswrapper[4764]: I0309 13:41:31.414347 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/8c43805e-424a-4820-924b-314b3e2f0a84-etc-machine-id\") pod \"cinder-api-0\" (UID: \"8c43805e-424a-4820-924b-314b3e2f0a84\") " pod="openstack/cinder-api-0" Mar 09 13:41:31 crc kubenswrapper[4764]: I0309 13:41:31.414435 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/8c43805e-424a-4820-924b-314b3e2f0a84-etc-machine-id\") pod \"cinder-api-0\" (UID: \"8c43805e-424a-4820-924b-314b3e2f0a84\") " pod="openstack/cinder-api-0" Mar 09 13:41:31 crc kubenswrapper[4764]: I0309 13:41:31.415541 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8c43805e-424a-4820-924b-314b3e2f0a84-config-data-custom\") pod \"cinder-api-0\" (UID: 
\"8c43805e-424a-4820-924b-314b3e2f0a84\") " pod="openstack/cinder-api-0" Mar 09 13:41:31 crc kubenswrapper[4764]: I0309 13:41:31.415574 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zz6zt\" (UniqueName: \"kubernetes.io/projected/8c43805e-424a-4820-924b-314b3e2f0a84-kube-api-access-zz6zt\") pod \"cinder-api-0\" (UID: \"8c43805e-424a-4820-924b-314b3e2f0a84\") " pod="openstack/cinder-api-0" Mar 09 13:41:31 crc kubenswrapper[4764]: I0309 13:41:31.415607 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c43805e-424a-4820-924b-314b3e2f0a84-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"8c43805e-424a-4820-924b-314b3e2f0a84\") " pod="openstack/cinder-api-0" Mar 09 13:41:31 crc kubenswrapper[4764]: I0309 13:41:31.415627 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8c43805e-424a-4820-924b-314b3e2f0a84-config-data\") pod \"cinder-api-0\" (UID: \"8c43805e-424a-4820-924b-314b3e2f0a84\") " pod="openstack/cinder-api-0" Mar 09 13:41:31 crc kubenswrapper[4764]: I0309 13:41:31.415689 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8c43805e-424a-4820-924b-314b3e2f0a84-scripts\") pod \"cinder-api-0\" (UID: \"8c43805e-424a-4820-924b-314b3e2f0a84\") " pod="openstack/cinder-api-0" Mar 09 13:41:31 crc kubenswrapper[4764]: I0309 13:41:31.415737 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8c43805e-424a-4820-924b-314b3e2f0a84-logs\") pod \"cinder-api-0\" (UID: \"8c43805e-424a-4820-924b-314b3e2f0a84\") " pod="openstack/cinder-api-0" Mar 09 13:41:31 crc kubenswrapper[4764]: I0309 13:41:31.416237 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/8c43805e-424a-4820-924b-314b3e2f0a84-logs\") pod \"cinder-api-0\" (UID: \"8c43805e-424a-4820-924b-314b3e2f0a84\") " pod="openstack/cinder-api-0" Mar 09 13:41:31 crc kubenswrapper[4764]: I0309 13:41:31.422494 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c43805e-424a-4820-924b-314b3e2f0a84-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"8c43805e-424a-4820-924b-314b3e2f0a84\") " pod="openstack/cinder-api-0" Mar 09 13:41:31 crc kubenswrapper[4764]: I0309 13:41:31.423255 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8c43805e-424a-4820-924b-314b3e2f0a84-config-data-custom\") pod \"cinder-api-0\" (UID: \"8c43805e-424a-4820-924b-314b3e2f0a84\") " pod="openstack/cinder-api-0" Mar 09 13:41:31 crc kubenswrapper[4764]: I0309 13:41:31.424291 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8c43805e-424a-4820-924b-314b3e2f0a84-scripts\") pod \"cinder-api-0\" (UID: \"8c43805e-424a-4820-924b-314b3e2f0a84\") " pod="openstack/cinder-api-0" Mar 09 13:41:31 crc kubenswrapper[4764]: I0309 13:41:31.430929 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8c43805e-424a-4820-924b-314b3e2f0a84-config-data\") pod \"cinder-api-0\" (UID: \"8c43805e-424a-4820-924b-314b3e2f0a84\") " pod="openstack/cinder-api-0" Mar 09 13:41:31 crc kubenswrapper[4764]: I0309 13:41:31.462343 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zz6zt\" (UniqueName: \"kubernetes.io/projected/8c43805e-424a-4820-924b-314b3e2f0a84-kube-api-access-zz6zt\") pod \"cinder-api-0\" (UID: \"8c43805e-424a-4820-924b-314b3e2f0a84\") " pod="openstack/cinder-api-0" Mar 09 13:41:31 crc kubenswrapper[4764]: I0309 13:41:31.505322 4764 generic.go:334] 
"Generic (PLEG): container finished" podID="c38cb253-b945-43b3-8dcd-209682d40f11" containerID="15010747611fb0fe00d92531331c8086dfb085efb07034db0f644ee0d2a65525" exitCode=0 Mar 09 13:41:31 crc kubenswrapper[4764]: I0309 13:41:31.505368 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-869f779d85-qsc97" event={"ID":"c38cb253-b945-43b3-8dcd-209682d40f11","Type":"ContainerDied","Data":"15010747611fb0fe00d92531331c8086dfb085efb07034db0f644ee0d2a65525"} Mar 09 13:41:31 crc kubenswrapper[4764]: I0309 13:41:31.530462 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-58db5546cc-d2v25" Mar 09 13:41:31 crc kubenswrapper[4764]: I0309 13:41:31.620106 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Mar 09 13:41:31 crc kubenswrapper[4764]: I0309 13:41:31.702586 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-869f779d85-qsc97" Mar 09 13:41:31 crc kubenswrapper[4764]: I0309 13:41:31.734880 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c38cb253-b945-43b3-8dcd-209682d40f11-ovsdbserver-nb\") pod \"c38cb253-b945-43b3-8dcd-209682d40f11\" (UID: \"c38cb253-b945-43b3-8dcd-209682d40f11\") " Mar 09 13:41:31 crc kubenswrapper[4764]: I0309 13:41:31.735042 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-489r4\" (UniqueName: \"kubernetes.io/projected/c38cb253-b945-43b3-8dcd-209682d40f11-kube-api-access-489r4\") pod \"c38cb253-b945-43b3-8dcd-209682d40f11\" (UID: \"c38cb253-b945-43b3-8dcd-209682d40f11\") " Mar 09 13:41:31 crc kubenswrapper[4764]: I0309 13:41:31.735147 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c38cb253-b945-43b3-8dcd-209682d40f11-config\") pod 
\"c38cb253-b945-43b3-8dcd-209682d40f11\" (UID: \"c38cb253-b945-43b3-8dcd-209682d40f11\") " Mar 09 13:41:31 crc kubenswrapper[4764]: I0309 13:41:31.735254 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c38cb253-b945-43b3-8dcd-209682d40f11-dns-svc\") pod \"c38cb253-b945-43b3-8dcd-209682d40f11\" (UID: \"c38cb253-b945-43b3-8dcd-209682d40f11\") " Mar 09 13:41:31 crc kubenswrapper[4764]: I0309 13:41:31.735374 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c38cb253-b945-43b3-8dcd-209682d40f11-ovsdbserver-sb\") pod \"c38cb253-b945-43b3-8dcd-209682d40f11\" (UID: \"c38cb253-b945-43b3-8dcd-209682d40f11\") " Mar 09 13:41:31 crc kubenswrapper[4764]: I0309 13:41:31.761322 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c38cb253-b945-43b3-8dcd-209682d40f11-kube-api-access-489r4" (OuterVolumeSpecName: "kube-api-access-489r4") pod "c38cb253-b945-43b3-8dcd-209682d40f11" (UID: "c38cb253-b945-43b3-8dcd-209682d40f11"). InnerVolumeSpecName "kube-api-access-489r4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:41:31 crc kubenswrapper[4764]: I0309 13:41:31.817103 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c38cb253-b945-43b3-8dcd-209682d40f11-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "c38cb253-b945-43b3-8dcd-209682d40f11" (UID: "c38cb253-b945-43b3-8dcd-209682d40f11"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:41:31 crc kubenswrapper[4764]: I0309 13:41:31.833093 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c38cb253-b945-43b3-8dcd-209682d40f11-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "c38cb253-b945-43b3-8dcd-209682d40f11" (UID: "c38cb253-b945-43b3-8dcd-209682d40f11"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:41:31 crc kubenswrapper[4764]: I0309 13:41:31.840946 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-489r4\" (UniqueName: \"kubernetes.io/projected/c38cb253-b945-43b3-8dcd-209682d40f11-kube-api-access-489r4\") on node \"crc\" DevicePath \"\"" Mar 09 13:41:31 crc kubenswrapper[4764]: I0309 13:41:31.840993 4764 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c38cb253-b945-43b3-8dcd-209682d40f11-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 09 13:41:31 crc kubenswrapper[4764]: I0309 13:41:31.841003 4764 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c38cb253-b945-43b3-8dcd-209682d40f11-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 09 13:41:31 crc kubenswrapper[4764]: I0309 13:41:31.877162 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c38cb253-b945-43b3-8dcd-209682d40f11-config" (OuterVolumeSpecName: "config") pod "c38cb253-b945-43b3-8dcd-209682d40f11" (UID: "c38cb253-b945-43b3-8dcd-209682d40f11"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:41:31 crc kubenswrapper[4764]: I0309 13:41:31.906180 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c38cb253-b945-43b3-8dcd-209682d40f11-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "c38cb253-b945-43b3-8dcd-209682d40f11" (UID: "c38cb253-b945-43b3-8dcd-209682d40f11"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:41:31 crc kubenswrapper[4764]: I0309 13:41:31.944846 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 09 13:41:31 crc kubenswrapper[4764]: I0309 13:41:31.944906 4764 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c38cb253-b945-43b3-8dcd-209682d40f11-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 09 13:41:31 crc kubenswrapper[4764]: I0309 13:41:31.944938 4764 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c38cb253-b945-43b3-8dcd-209682d40f11-config\") on node \"crc\" DevicePath \"\"" Mar 09 13:41:31 crc kubenswrapper[4764]: W0309 13:41:31.958413 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6afeb6a7_a0a0_40de_90ee_97b497663798.slice/crio-ca431b92c1671db841cf4cedf0d25ffe1db3212b3b5515413a5c717be63cab9f WatchSource:0}: Error finding container ca431b92c1671db841cf4cedf0d25ffe1db3212b3b5515413a5c717be63cab9f: Status 404 returned error can't find the container with id ca431b92c1671db841cf4cedf0d25ffe1db3212b3b5515413a5c717be63cab9f Mar 09 13:41:32 crc kubenswrapper[4764]: I0309 13:41:32.239557 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-58db5546cc-d2v25"] Mar 09 13:41:32 crc kubenswrapper[4764]: I0309 13:41:32.373123 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/cinder-api-0"] Mar 09 13:41:32 crc kubenswrapper[4764]: I0309 13:41:32.579354 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-869f779d85-qsc97" event={"ID":"c38cb253-b945-43b3-8dcd-209682d40f11","Type":"ContainerDied","Data":"744dd94a1b4869c7dc37123bb898cb66f67f114bb38f59e8f906fc2e985184ea"} Mar 09 13:41:32 crc kubenswrapper[4764]: I0309 13:41:32.579372 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-869f779d85-qsc97" Mar 09 13:41:32 crc kubenswrapper[4764]: I0309 13:41:32.579799 4764 scope.go:117] "RemoveContainer" containerID="15010747611fb0fe00d92531331c8086dfb085efb07034db0f644ee0d2a65525" Mar 09 13:41:32 crc kubenswrapper[4764]: I0309 13:41:32.594895 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"6afeb6a7-a0a0-40de-90ee-97b497663798","Type":"ContainerStarted","Data":"ca431b92c1671db841cf4cedf0d25ffe1db3212b3b5515413a5c717be63cab9f"} Mar 09 13:41:32 crc kubenswrapper[4764]: I0309 13:41:32.647155 4764 generic.go:334] "Generic (PLEG): container finished" podID="4e886001-842a-4f97-b4c3-d088d80e6a45" containerID="d406e10da6d16d2f9b6ec89cf139706a3d0d52a75646e6e8156525cfc445bd01" exitCode=0 Mar 09 13:41:32 crc kubenswrapper[4764]: I0309 13:41:32.647320 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58db5546cc-d2v25" event={"ID":"4e886001-842a-4f97-b4c3-d088d80e6a45","Type":"ContainerDied","Data":"d406e10da6d16d2f9b6ec89cf139706a3d0d52a75646e6e8156525cfc445bd01"} Mar 09 13:41:32 crc kubenswrapper[4764]: I0309 13:41:32.647364 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58db5546cc-d2v25" event={"ID":"4e886001-842a-4f97-b4c3-d088d80e6a45","Type":"ContainerStarted","Data":"f8318b8e268cec9ccfcf591135ec8e9761aa9bf10f09e2ff5ebd0b76bbd7c843"} Mar 09 13:41:32 crc kubenswrapper[4764]: I0309 13:41:32.699816 4764 kubelet.go:2437] "SyncLoop 
DELETE" source="api" pods=["openstack/dnsmasq-dns-869f779d85-qsc97"] Mar 09 13:41:32 crc kubenswrapper[4764]: I0309 13:41:32.704073 4764 scope.go:117] "RemoveContainer" containerID="83e5b1aa14005ff710f52b631cae367c77a62e68a139a563a732645f67f44497" Mar 09 13:41:32 crc kubenswrapper[4764]: I0309 13:41:32.760169 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"8c43805e-424a-4820-924b-314b3e2f0a84","Type":"ContainerStarted","Data":"ecb76bfd718406ffe153b8cac9fb315762aaa65f4ab170d3da19c41066826c56"} Mar 09 13:41:32 crc kubenswrapper[4764]: I0309 13:41:32.796743 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-869f779d85-qsc97"] Mar 09 13:41:33 crc kubenswrapper[4764]: I0309 13:41:33.580416 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c38cb253-b945-43b3-8dcd-209682d40f11" path="/var/lib/kubelet/pods/c38cb253-b945-43b3-8dcd-209682d40f11/volumes" Mar 09 13:41:33 crc kubenswrapper[4764]: I0309 13:41:33.587374 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Mar 09 13:41:33 crc kubenswrapper[4764]: I0309 13:41:33.829767 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58db5546cc-d2v25" event={"ID":"4e886001-842a-4f97-b4c3-d088d80e6a45","Type":"ContainerStarted","Data":"f733fa97c901fa8349b84558eaddb5374df152c045dfc4a2b694279f5d3fa434"} Mar 09 13:41:33 crc kubenswrapper[4764]: I0309 13:41:33.829940 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-58db5546cc-d2v25" Mar 09 13:41:33 crc kubenswrapper[4764]: I0309 13:41:33.832336 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"8c43805e-424a-4820-924b-314b3e2f0a84","Type":"ContainerStarted","Data":"ddff3fcd534d4f115f4a054a32d5b0f2fe7f9d7b7685de4b4a6fc84a82659da4"} Mar 09 13:41:33 crc kubenswrapper[4764]: I0309 13:41:33.836764 4764 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"df5caebe-ad05-48b0-bbce-ecb2ec29e7c3","Type":"ContainerStarted","Data":"8e00a9b64763a83494a83ac4067c14a67cca49b0a2f0a325e393f9400b39892b"} Mar 09 13:41:33 crc kubenswrapper[4764]: I0309 13:41:33.837390 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 09 13:41:33 crc kubenswrapper[4764]: I0309 13:41:33.862272 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-58db5546cc-d2v25" podStartSLOduration=3.862246486 podStartE2EDuration="3.862246486s" podCreationTimestamp="2026-03-09 13:41:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:41:33.846282616 +0000 UTC m=+1249.096454534" watchObservedRunningTime="2026-03-09 13:41:33.862246486 +0000 UTC m=+1249.112418394" Mar 09 13:41:34 crc kubenswrapper[4764]: I0309 13:41:34.856835 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"6afeb6a7-a0a0-40de-90ee-97b497663798","Type":"ContainerStarted","Data":"3048f7a722fe7b1924f522a8fa95862bc27ca27611f2c6a75277ad4a82b3a7d9"} Mar 09 13:41:34 crc kubenswrapper[4764]: I0309 13:41:34.862744 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"8c43805e-424a-4820-924b-314b3e2f0a84","Type":"ContainerStarted","Data":"e3bfc05e3d43a60e9bf0152b630c3da61e89fcd4bab24c5c69069ee06ec1458b"} Mar 09 13:41:34 crc kubenswrapper[4764]: I0309 13:41:34.862790 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="8c43805e-424a-4820-924b-314b3e2f0a84" containerName="cinder-api-log" containerID="cri-o://ddff3fcd534d4f115f4a054a32d5b0f2fe7f9d7b7685de4b4a6fc84a82659da4" gracePeriod=30 Mar 09 13:41:34 crc kubenswrapper[4764]: I0309 13:41:34.862900 4764 kuberuntime_container.go:808] "Killing 
container with a grace period" pod="openstack/cinder-api-0" podUID="8c43805e-424a-4820-924b-314b3e2f0a84" containerName="cinder-api" containerID="cri-o://e3bfc05e3d43a60e9bf0152b630c3da61e89fcd4bab24c5c69069ee06ec1458b" gracePeriod=30 Mar 09 13:41:34 crc kubenswrapper[4764]: I0309 13:41:34.862919 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Mar 09 13:41:34 crc kubenswrapper[4764]: I0309 13:41:34.898367 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=3.898327675 podStartE2EDuration="3.898327675s" podCreationTimestamp="2026-03-09 13:41:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:41:34.883822712 +0000 UTC m=+1250.133994620" watchObservedRunningTime="2026-03-09 13:41:34.898327675 +0000 UTC m=+1250.148499583" Mar 09 13:41:34 crc kubenswrapper[4764]: I0309 13:41:34.907199 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=4.3282932689999996 podStartE2EDuration="8.907166577s" podCreationTimestamp="2026-03-09 13:41:26 +0000 UTC" firstStartedPulling="2026-03-09 13:41:27.84361533 +0000 UTC m=+1243.093787238" lastFinishedPulling="2026-03-09 13:41:32.422488628 +0000 UTC m=+1247.672660546" observedRunningTime="2026-03-09 13:41:33.886214346 +0000 UTC m=+1249.136386254" watchObservedRunningTime="2026-03-09 13:41:34.907166577 +0000 UTC m=+1250.157338485" Mar 09 13:41:35 crc kubenswrapper[4764]: I0309 13:41:35.300365 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-6b74bc6bc6-vsxl5" Mar 09 13:41:35 crc kubenswrapper[4764]: I0309 13:41:35.646086 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-6c97985d69-khcvg"] Mar 09 13:41:35 crc kubenswrapper[4764]: I0309 13:41:35.655214 4764 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-6c97985d69-khcvg" podUID="5d65fa53-be02-4d11-b300-5cb4629c03da" containerName="neutron-api" containerID="cri-o://af86c5d555559ee78e3558628ecf704c4c51493754120be1f860b79edb606896" gracePeriod=30 Mar 09 13:41:35 crc kubenswrapper[4764]: I0309 13:41:35.656818 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-6c97985d69-khcvg" podUID="5d65fa53-be02-4d11-b300-5cb4629c03da" containerName="neutron-httpd" containerID="cri-o://cff13fa571b5cb0d7d8cf860c4a1efdb55518757df6a7508bbd7dca053bc0e44" gracePeriod=30 Mar 09 13:41:35 crc kubenswrapper[4764]: I0309 13:41:35.728465 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-6b7bfdfd5-56dnz"] Mar 09 13:41:35 crc kubenswrapper[4764]: E0309 13:41:35.728953 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c38cb253-b945-43b3-8dcd-209682d40f11" containerName="dnsmasq-dns" Mar 09 13:41:35 crc kubenswrapper[4764]: I0309 13:41:35.728971 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="c38cb253-b945-43b3-8dcd-209682d40f11" containerName="dnsmasq-dns" Mar 09 13:41:35 crc kubenswrapper[4764]: E0309 13:41:35.728985 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c38cb253-b945-43b3-8dcd-209682d40f11" containerName="init" Mar 09 13:41:35 crc kubenswrapper[4764]: I0309 13:41:35.728992 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="c38cb253-b945-43b3-8dcd-209682d40f11" containerName="init" Mar 09 13:41:35 crc kubenswrapper[4764]: I0309 13:41:35.729194 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="c38cb253-b945-43b3-8dcd-209682d40f11" containerName="dnsmasq-dns" Mar 09 13:41:35 crc kubenswrapper[4764]: I0309 13:41:35.730288 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-6b7bfdfd5-56dnz" Mar 09 13:41:35 crc kubenswrapper[4764]: I0309 13:41:35.766458 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-6b7bfdfd5-56dnz"] Mar 09 13:41:35 crc kubenswrapper[4764]: I0309 13:41:35.781778 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/fd7dadfc-b8e4-479f-8880-4ffeec051d30-httpd-config\") pod \"neutron-6b7bfdfd5-56dnz\" (UID: \"fd7dadfc-b8e4-479f-8880-4ffeec051d30\") " pod="openstack/neutron-6b7bfdfd5-56dnz" Mar 09 13:41:35 crc kubenswrapper[4764]: I0309 13:41:35.781832 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/fd7dadfc-b8e4-479f-8880-4ffeec051d30-config\") pod \"neutron-6b7bfdfd5-56dnz\" (UID: \"fd7dadfc-b8e4-479f-8880-4ffeec051d30\") " pod="openstack/neutron-6b7bfdfd5-56dnz" Mar 09 13:41:35 crc kubenswrapper[4764]: I0309 13:41:35.781870 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fd7dadfc-b8e4-479f-8880-4ffeec051d30-internal-tls-certs\") pod \"neutron-6b7bfdfd5-56dnz\" (UID: \"fd7dadfc-b8e4-479f-8880-4ffeec051d30\") " pod="openstack/neutron-6b7bfdfd5-56dnz" Mar 09 13:41:35 crc kubenswrapper[4764]: I0309 13:41:35.781921 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h8f9s\" (UniqueName: \"kubernetes.io/projected/fd7dadfc-b8e4-479f-8880-4ffeec051d30-kube-api-access-h8f9s\") pod \"neutron-6b7bfdfd5-56dnz\" (UID: \"fd7dadfc-b8e4-479f-8880-4ffeec051d30\") " pod="openstack/neutron-6b7bfdfd5-56dnz" Mar 09 13:41:35 crc kubenswrapper[4764]: I0309 13:41:35.781951 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/fd7dadfc-b8e4-479f-8880-4ffeec051d30-ovndb-tls-certs\") pod \"neutron-6b7bfdfd5-56dnz\" (UID: \"fd7dadfc-b8e4-479f-8880-4ffeec051d30\") " pod="openstack/neutron-6b7bfdfd5-56dnz" Mar 09 13:41:35 crc kubenswrapper[4764]: I0309 13:41:35.782002 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fd7dadfc-b8e4-479f-8880-4ffeec051d30-public-tls-certs\") pod \"neutron-6b7bfdfd5-56dnz\" (UID: \"fd7dadfc-b8e4-479f-8880-4ffeec051d30\") " pod="openstack/neutron-6b7bfdfd5-56dnz" Mar 09 13:41:35 crc kubenswrapper[4764]: I0309 13:41:35.782022 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fd7dadfc-b8e4-479f-8880-4ffeec051d30-combined-ca-bundle\") pod \"neutron-6b7bfdfd5-56dnz\" (UID: \"fd7dadfc-b8e4-479f-8880-4ffeec051d30\") " pod="openstack/neutron-6b7bfdfd5-56dnz" Mar 09 13:41:35 crc kubenswrapper[4764]: I0309 13:41:35.911534 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fd7dadfc-b8e4-479f-8880-4ffeec051d30-public-tls-certs\") pod \"neutron-6b7bfdfd5-56dnz\" (UID: \"fd7dadfc-b8e4-479f-8880-4ffeec051d30\") " pod="openstack/neutron-6b7bfdfd5-56dnz" Mar 09 13:41:35 crc kubenswrapper[4764]: I0309 13:41:35.911609 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fd7dadfc-b8e4-479f-8880-4ffeec051d30-combined-ca-bundle\") pod \"neutron-6b7bfdfd5-56dnz\" (UID: \"fd7dadfc-b8e4-479f-8880-4ffeec051d30\") " pod="openstack/neutron-6b7bfdfd5-56dnz" Mar 09 13:41:35 crc kubenswrapper[4764]: I0309 13:41:35.911732 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: 
\"kubernetes.io/secret/fd7dadfc-b8e4-479f-8880-4ffeec051d30-httpd-config\") pod \"neutron-6b7bfdfd5-56dnz\" (UID: \"fd7dadfc-b8e4-479f-8880-4ffeec051d30\") " pod="openstack/neutron-6b7bfdfd5-56dnz" Mar 09 13:41:35 crc kubenswrapper[4764]: I0309 13:41:35.911794 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/fd7dadfc-b8e4-479f-8880-4ffeec051d30-config\") pod \"neutron-6b7bfdfd5-56dnz\" (UID: \"fd7dadfc-b8e4-479f-8880-4ffeec051d30\") " pod="openstack/neutron-6b7bfdfd5-56dnz" Mar 09 13:41:35 crc kubenswrapper[4764]: I0309 13:41:35.911857 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fd7dadfc-b8e4-479f-8880-4ffeec051d30-internal-tls-certs\") pod \"neutron-6b7bfdfd5-56dnz\" (UID: \"fd7dadfc-b8e4-479f-8880-4ffeec051d30\") " pod="openstack/neutron-6b7bfdfd5-56dnz" Mar 09 13:41:35 crc kubenswrapper[4764]: I0309 13:41:35.911938 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h8f9s\" (UniqueName: \"kubernetes.io/projected/fd7dadfc-b8e4-479f-8880-4ffeec051d30-kube-api-access-h8f9s\") pod \"neutron-6b7bfdfd5-56dnz\" (UID: \"fd7dadfc-b8e4-479f-8880-4ffeec051d30\") " pod="openstack/neutron-6b7bfdfd5-56dnz" Mar 09 13:41:35 crc kubenswrapper[4764]: I0309 13:41:35.912000 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/fd7dadfc-b8e4-479f-8880-4ffeec051d30-ovndb-tls-certs\") pod \"neutron-6b7bfdfd5-56dnz\" (UID: \"fd7dadfc-b8e4-479f-8880-4ffeec051d30\") " pod="openstack/neutron-6b7bfdfd5-56dnz" Mar 09 13:41:35 crc kubenswrapper[4764]: I0309 13:41:35.920630 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/fd7dadfc-b8e4-479f-8880-4ffeec051d30-ovndb-tls-certs\") pod \"neutron-6b7bfdfd5-56dnz\" (UID: 
\"fd7dadfc-b8e4-479f-8880-4ffeec051d30\") " pod="openstack/neutron-6b7bfdfd5-56dnz" Mar 09 13:41:35 crc kubenswrapper[4764]: I0309 13:41:35.923991 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/fd7dadfc-b8e4-479f-8880-4ffeec051d30-httpd-config\") pod \"neutron-6b7bfdfd5-56dnz\" (UID: \"fd7dadfc-b8e4-479f-8880-4ffeec051d30\") " pod="openstack/neutron-6b7bfdfd5-56dnz" Mar 09 13:41:35 crc kubenswrapper[4764]: I0309 13:41:35.927160 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/fd7dadfc-b8e4-479f-8880-4ffeec051d30-config\") pod \"neutron-6b7bfdfd5-56dnz\" (UID: \"fd7dadfc-b8e4-479f-8880-4ffeec051d30\") " pod="openstack/neutron-6b7bfdfd5-56dnz" Mar 09 13:41:35 crc kubenswrapper[4764]: I0309 13:41:35.941338 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-6c97985d69-khcvg" Mar 09 13:41:35 crc kubenswrapper[4764]: I0309 13:41:35.968266 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fd7dadfc-b8e4-479f-8880-4ffeec051d30-internal-tls-certs\") pod \"neutron-6b7bfdfd5-56dnz\" (UID: \"fd7dadfc-b8e4-479f-8880-4ffeec051d30\") " pod="openstack/neutron-6b7bfdfd5-56dnz" Mar 09 13:41:35 crc kubenswrapper[4764]: I0309 13:41:35.973855 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fd7dadfc-b8e4-479f-8880-4ffeec051d30-public-tls-certs\") pod \"neutron-6b7bfdfd5-56dnz\" (UID: \"fd7dadfc-b8e4-479f-8880-4ffeec051d30\") " pod="openstack/neutron-6b7bfdfd5-56dnz" Mar 09 13:41:35 crc kubenswrapper[4764]: I0309 13:41:35.979382 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h8f9s\" (UniqueName: \"kubernetes.io/projected/fd7dadfc-b8e4-479f-8880-4ffeec051d30-kube-api-access-h8f9s\") pod 
\"neutron-6b7bfdfd5-56dnz\" (UID: \"fd7dadfc-b8e4-479f-8880-4ffeec051d30\") " pod="openstack/neutron-6b7bfdfd5-56dnz" Mar 09 13:41:35 crc kubenswrapper[4764]: I0309 13:41:35.981584 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fd7dadfc-b8e4-479f-8880-4ffeec051d30-combined-ca-bundle\") pod \"neutron-6b7bfdfd5-56dnz\" (UID: \"fd7dadfc-b8e4-479f-8880-4ffeec051d30\") " pod="openstack/neutron-6b7bfdfd5-56dnz" Mar 09 13:41:35 crc kubenswrapper[4764]: I0309 13:41:35.985344 4764 generic.go:334] "Generic (PLEG): container finished" podID="8c43805e-424a-4820-924b-314b3e2f0a84" containerID="e3bfc05e3d43a60e9bf0152b630c3da61e89fcd4bab24c5c69069ee06ec1458b" exitCode=0 Mar 09 13:41:35 crc kubenswrapper[4764]: I0309 13:41:35.985369 4764 generic.go:334] "Generic (PLEG): container finished" podID="8c43805e-424a-4820-924b-314b3e2f0a84" containerID="ddff3fcd534d4f115f4a054a32d5b0f2fe7f9d7b7685de4b4a6fc84a82659da4" exitCode=143 Mar 09 13:41:35 crc kubenswrapper[4764]: I0309 13:41:35.985476 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"8c43805e-424a-4820-924b-314b3e2f0a84","Type":"ContainerDied","Data":"e3bfc05e3d43a60e9bf0152b630c3da61e89fcd4bab24c5c69069ee06ec1458b"} Mar 09 13:41:35 crc kubenswrapper[4764]: I0309 13:41:35.985509 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"8c43805e-424a-4820-924b-314b3e2f0a84","Type":"ContainerDied","Data":"ddff3fcd534d4f115f4a054a32d5b0f2fe7f9d7b7685de4b4a6fc84a82659da4"} Mar 09 13:41:36 crc kubenswrapper[4764]: I0309 13:41:36.035888 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"6afeb6a7-a0a0-40de-90ee-97b497663798","Type":"ContainerStarted","Data":"dd358903bd4bede8bf029bbf11c70e3e47fe50f447bead2b369061bf66fe7262"} Mar 09 13:41:36 crc kubenswrapper[4764]: I0309 13:41:36.089109 4764 util.go:30] "No sandbox 
for pod can be found. Need to start a new one" pod="openstack/neutron-6b7bfdfd5-56dnz" Mar 09 13:41:36 crc kubenswrapper[4764]: I0309 13:41:36.122871 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=4.988840153 podStartE2EDuration="6.122847117s" podCreationTimestamp="2026-03-09 13:41:30 +0000 UTC" firstStartedPulling="2026-03-09 13:41:31.962365485 +0000 UTC m=+1247.212537393" lastFinishedPulling="2026-03-09 13:41:33.096372449 +0000 UTC m=+1248.346544357" observedRunningTime="2026-03-09 13:41:36.107267197 +0000 UTC m=+1251.357439105" watchObservedRunningTime="2026-03-09 13:41:36.122847117 +0000 UTC m=+1251.373019025" Mar 09 13:41:36 crc kubenswrapper[4764]: I0309 13:41:36.198009 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Mar 09 13:41:36 crc kubenswrapper[4764]: I0309 13:41:36.447086 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Mar 09 13:41:36 crc kubenswrapper[4764]: I0309 13:41:36.518276 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-7dcdc6487b-j8w75" Mar 09 13:41:36 crc kubenswrapper[4764]: I0309 13:41:36.570622 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8c43805e-424a-4820-924b-314b3e2f0a84-scripts\") pod \"8c43805e-424a-4820-924b-314b3e2f0a84\" (UID: \"8c43805e-424a-4820-924b-314b3e2f0a84\") " Mar 09 13:41:36 crc kubenswrapper[4764]: I0309 13:41:36.570722 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/8c43805e-424a-4820-924b-314b3e2f0a84-etc-machine-id\") pod \"8c43805e-424a-4820-924b-314b3e2f0a84\" (UID: \"8c43805e-424a-4820-924b-314b3e2f0a84\") " Mar 09 13:41:36 crc kubenswrapper[4764]: I0309 13:41:36.570750 4764 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8c43805e-424a-4820-924b-314b3e2f0a84-config-data\") pod \"8c43805e-424a-4820-924b-314b3e2f0a84\" (UID: \"8c43805e-424a-4820-924b-314b3e2f0a84\") " Mar 09 13:41:36 crc kubenswrapper[4764]: I0309 13:41:36.570840 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8c43805e-424a-4820-924b-314b3e2f0a84-config-data-custom\") pod \"8c43805e-424a-4820-924b-314b3e2f0a84\" (UID: \"8c43805e-424a-4820-924b-314b3e2f0a84\") " Mar 09 13:41:36 crc kubenswrapper[4764]: I0309 13:41:36.570906 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8c43805e-424a-4820-924b-314b3e2f0a84-logs\") pod \"8c43805e-424a-4820-924b-314b3e2f0a84\" (UID: \"8c43805e-424a-4820-924b-314b3e2f0a84\") " Mar 09 13:41:36 crc kubenswrapper[4764]: I0309 13:41:36.571186 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c43805e-424a-4820-924b-314b3e2f0a84-combined-ca-bundle\") pod \"8c43805e-424a-4820-924b-314b3e2f0a84\" (UID: \"8c43805e-424a-4820-924b-314b3e2f0a84\") " Mar 09 13:41:36 crc kubenswrapper[4764]: I0309 13:41:36.571231 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zz6zt\" (UniqueName: \"kubernetes.io/projected/8c43805e-424a-4820-924b-314b3e2f0a84-kube-api-access-zz6zt\") pod \"8c43805e-424a-4820-924b-314b3e2f0a84\" (UID: \"8c43805e-424a-4820-924b-314b3e2f0a84\") " Mar 09 13:41:36 crc kubenswrapper[4764]: I0309 13:41:36.574871 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8c43805e-424a-4820-924b-314b3e2f0a84-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "8c43805e-424a-4820-924b-314b3e2f0a84" 
(UID: "8c43805e-424a-4820-924b-314b3e2f0a84"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 09 13:41:36 crc kubenswrapper[4764]: I0309 13:41:36.577851 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8c43805e-424a-4820-924b-314b3e2f0a84-logs" (OuterVolumeSpecName: "logs") pod "8c43805e-424a-4820-924b-314b3e2f0a84" (UID: "8c43805e-424a-4820-924b-314b3e2f0a84"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 13:41:36 crc kubenswrapper[4764]: I0309 13:41:36.580158 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8c43805e-424a-4820-924b-314b3e2f0a84-kube-api-access-zz6zt" (OuterVolumeSpecName: "kube-api-access-zz6zt") pod "8c43805e-424a-4820-924b-314b3e2f0a84" (UID: "8c43805e-424a-4820-924b-314b3e2f0a84"). InnerVolumeSpecName "kube-api-access-zz6zt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:41:36 crc kubenswrapper[4764]: I0309 13:41:36.586918 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8c43805e-424a-4820-924b-314b3e2f0a84-scripts" (OuterVolumeSpecName: "scripts") pod "8c43805e-424a-4820-924b-314b3e2f0a84" (UID: "8c43805e-424a-4820-924b-314b3e2f0a84"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:41:36 crc kubenswrapper[4764]: I0309 13:41:36.590920 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8c43805e-424a-4820-924b-314b3e2f0a84-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "8c43805e-424a-4820-924b-314b3e2f0a84" (UID: "8c43805e-424a-4820-924b-314b3e2f0a84"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:41:36 crc kubenswrapper[4764]: I0309 13:41:36.641257 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8c43805e-424a-4820-924b-314b3e2f0a84-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8c43805e-424a-4820-924b-314b3e2f0a84" (UID: "8c43805e-424a-4820-924b-314b3e2f0a84"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:41:36 crc kubenswrapper[4764]: I0309 13:41:36.643254 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8c43805e-424a-4820-924b-314b3e2f0a84-config-data" (OuterVolumeSpecName: "config-data") pod "8c43805e-424a-4820-924b-314b3e2f0a84" (UID: "8c43805e-424a-4820-924b-314b3e2f0a84"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:41:36 crc kubenswrapper[4764]: I0309 13:41:36.676140 4764 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c43805e-424a-4820-924b-314b3e2f0a84-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 13:41:36 crc kubenswrapper[4764]: I0309 13:41:36.676171 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zz6zt\" (UniqueName: \"kubernetes.io/projected/8c43805e-424a-4820-924b-314b3e2f0a84-kube-api-access-zz6zt\") on node \"crc\" DevicePath \"\"" Mar 09 13:41:36 crc kubenswrapper[4764]: I0309 13:41:36.676183 4764 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8c43805e-424a-4820-924b-314b3e2f0a84-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 13:41:36 crc kubenswrapper[4764]: I0309 13:41:36.676192 4764 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/8c43805e-424a-4820-924b-314b3e2f0a84-etc-machine-id\") on node \"crc\" DevicePath 
\"\"" Mar 09 13:41:36 crc kubenswrapper[4764]: I0309 13:41:36.676202 4764 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8c43805e-424a-4820-924b-314b3e2f0a84-config-data\") on node \"crc\" DevicePath \"\"" Mar 09 13:41:36 crc kubenswrapper[4764]: I0309 13:41:36.676213 4764 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8c43805e-424a-4820-924b-314b3e2f0a84-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 09 13:41:36 crc kubenswrapper[4764]: I0309 13:41:36.676222 4764 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8c43805e-424a-4820-924b-314b3e2f0a84-logs\") on node \"crc\" DevicePath \"\"" Mar 09 13:41:36 crc kubenswrapper[4764]: I0309 13:41:36.783442 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-7dcdc6487b-j8w75" Mar 09 13:41:37 crc kubenswrapper[4764]: I0309 13:41:37.057200 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Mar 09 13:41:37 crc kubenswrapper[4764]: I0309 13:41:37.057731 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"8c43805e-424a-4820-924b-314b3e2f0a84","Type":"ContainerDied","Data":"ecb76bfd718406ffe153b8cac9fb315762aaa65f4ab170d3da19c41066826c56"} Mar 09 13:41:37 crc kubenswrapper[4764]: I0309 13:41:37.057824 4764 scope.go:117] "RemoveContainer" containerID="e3bfc05e3d43a60e9bf0152b630c3da61e89fcd4bab24c5c69069ee06ec1458b" Mar 09 13:41:37 crc kubenswrapper[4764]: I0309 13:41:37.122952 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Mar 09 13:41:37 crc kubenswrapper[4764]: I0309 13:41:37.137730 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Mar 09 13:41:37 crc kubenswrapper[4764]: I0309 13:41:37.152013 4764 scope.go:117] "RemoveContainer" containerID="ddff3fcd534d4f115f4a054a32d5b0f2fe7f9d7b7685de4b4a6fc84a82659da4" Mar 09 13:41:37 crc kubenswrapper[4764]: I0309 13:41:37.156901 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Mar 09 13:41:37 crc kubenswrapper[4764]: E0309 13:41:37.157412 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8c43805e-424a-4820-924b-314b3e2f0a84" containerName="cinder-api" Mar 09 13:41:37 crc kubenswrapper[4764]: I0309 13:41:37.157429 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="8c43805e-424a-4820-924b-314b3e2f0a84" containerName="cinder-api" Mar 09 13:41:37 crc kubenswrapper[4764]: E0309 13:41:37.157453 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8c43805e-424a-4820-924b-314b3e2f0a84" containerName="cinder-api-log" Mar 09 13:41:37 crc kubenswrapper[4764]: I0309 13:41:37.157459 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="8c43805e-424a-4820-924b-314b3e2f0a84" containerName="cinder-api-log" Mar 09 13:41:37 crc kubenswrapper[4764]: I0309 13:41:37.157684 4764 
memory_manager.go:354] "RemoveStaleState removing state" podUID="8c43805e-424a-4820-924b-314b3e2f0a84" containerName="cinder-api" Mar 09 13:41:37 crc kubenswrapper[4764]: I0309 13:41:37.157737 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="8c43805e-424a-4820-924b-314b3e2f0a84" containerName="cinder-api-log" Mar 09 13:41:37 crc kubenswrapper[4764]: I0309 13:41:37.160546 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Mar 09 13:41:37 crc kubenswrapper[4764]: I0309 13:41:37.165323 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc" Mar 09 13:41:37 crc kubenswrapper[4764]: I0309 13:41:37.165880 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc" Mar 09 13:41:37 crc kubenswrapper[4764]: I0309 13:41:37.166049 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Mar 09 13:41:37 crc kubenswrapper[4764]: I0309 13:41:37.186249 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Mar 09 13:41:37 crc kubenswrapper[4764]: I0309 13:41:37.298991 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9cf43ab7-e625-4ffa-9af4-9f810a43d270-config-data-custom\") pod \"cinder-api-0\" (UID: \"9cf43ab7-e625-4ffa-9af4-9f810a43d270\") " pod="openstack/cinder-api-0" Mar 09 13:41:37 crc kubenswrapper[4764]: I0309 13:41:37.299066 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rfrtf\" (UniqueName: \"kubernetes.io/projected/9cf43ab7-e625-4ffa-9af4-9f810a43d270-kube-api-access-rfrtf\") pod \"cinder-api-0\" (UID: \"9cf43ab7-e625-4ffa-9af4-9f810a43d270\") " pod="openstack/cinder-api-0" Mar 09 13:41:37 crc kubenswrapper[4764]: I0309 13:41:37.299137 4764 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9cf43ab7-e625-4ffa-9af4-9f810a43d270-public-tls-certs\") pod \"cinder-api-0\" (UID: \"9cf43ab7-e625-4ffa-9af4-9f810a43d270\") " pod="openstack/cinder-api-0" Mar 09 13:41:37 crc kubenswrapper[4764]: I0309 13:41:37.299185 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9cf43ab7-e625-4ffa-9af4-9f810a43d270-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"9cf43ab7-e625-4ffa-9af4-9f810a43d270\") " pod="openstack/cinder-api-0" Mar 09 13:41:37 crc kubenswrapper[4764]: I0309 13:41:37.299212 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9cf43ab7-e625-4ffa-9af4-9f810a43d270-config-data\") pod \"cinder-api-0\" (UID: \"9cf43ab7-e625-4ffa-9af4-9f810a43d270\") " pod="openstack/cinder-api-0" Mar 09 13:41:37 crc kubenswrapper[4764]: I0309 13:41:37.299431 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9cf43ab7-e625-4ffa-9af4-9f810a43d270-scripts\") pod \"cinder-api-0\" (UID: \"9cf43ab7-e625-4ffa-9af4-9f810a43d270\") " pod="openstack/cinder-api-0" Mar 09 13:41:37 crc kubenswrapper[4764]: I0309 13:41:37.299512 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9cf43ab7-e625-4ffa-9af4-9f810a43d270-etc-machine-id\") pod \"cinder-api-0\" (UID: \"9cf43ab7-e625-4ffa-9af4-9f810a43d270\") " pod="openstack/cinder-api-0" Mar 09 13:41:37 crc kubenswrapper[4764]: I0309 13:41:37.299559 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/9cf43ab7-e625-4ffa-9af4-9f810a43d270-logs\") pod \"cinder-api-0\" (UID: \"9cf43ab7-e625-4ffa-9af4-9f810a43d270\") " pod="openstack/cinder-api-0" Mar 09 13:41:37 crc kubenswrapper[4764]: I0309 13:41:37.299693 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9cf43ab7-e625-4ffa-9af4-9f810a43d270-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"9cf43ab7-e625-4ffa-9af4-9f810a43d270\") " pod="openstack/cinder-api-0" Mar 09 13:41:37 crc kubenswrapper[4764]: I0309 13:41:37.347913 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-6b7bfdfd5-56dnz"] Mar 09 13:41:37 crc kubenswrapper[4764]: I0309 13:41:37.401416 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9cf43ab7-e625-4ffa-9af4-9f810a43d270-etc-machine-id\") pod \"cinder-api-0\" (UID: \"9cf43ab7-e625-4ffa-9af4-9f810a43d270\") " pod="openstack/cinder-api-0" Mar 09 13:41:37 crc kubenswrapper[4764]: I0309 13:41:37.401924 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9cf43ab7-e625-4ffa-9af4-9f810a43d270-logs\") pod \"cinder-api-0\" (UID: \"9cf43ab7-e625-4ffa-9af4-9f810a43d270\") " pod="openstack/cinder-api-0" Mar 09 13:41:37 crc kubenswrapper[4764]: I0309 13:41:37.401966 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9cf43ab7-e625-4ffa-9af4-9f810a43d270-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"9cf43ab7-e625-4ffa-9af4-9f810a43d270\") " pod="openstack/cinder-api-0" Mar 09 13:41:37 crc kubenswrapper[4764]: I0309 13:41:37.402060 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/9cf43ab7-e625-4ffa-9af4-9f810a43d270-config-data-custom\") pod \"cinder-api-0\" (UID: \"9cf43ab7-e625-4ffa-9af4-9f810a43d270\") " pod="openstack/cinder-api-0" Mar 09 13:41:37 crc kubenswrapper[4764]: I0309 13:41:37.402092 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rfrtf\" (UniqueName: \"kubernetes.io/projected/9cf43ab7-e625-4ffa-9af4-9f810a43d270-kube-api-access-rfrtf\") pod \"cinder-api-0\" (UID: \"9cf43ab7-e625-4ffa-9af4-9f810a43d270\") " pod="openstack/cinder-api-0" Mar 09 13:41:37 crc kubenswrapper[4764]: I0309 13:41:37.402160 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9cf43ab7-e625-4ffa-9af4-9f810a43d270-public-tls-certs\") pod \"cinder-api-0\" (UID: \"9cf43ab7-e625-4ffa-9af4-9f810a43d270\") " pod="openstack/cinder-api-0" Mar 09 13:41:37 crc kubenswrapper[4764]: I0309 13:41:37.402280 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9cf43ab7-e625-4ffa-9af4-9f810a43d270-config-data\") pod \"cinder-api-0\" (UID: \"9cf43ab7-e625-4ffa-9af4-9f810a43d270\") " pod="openstack/cinder-api-0" Mar 09 13:41:37 crc kubenswrapper[4764]: I0309 13:41:37.402303 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9cf43ab7-e625-4ffa-9af4-9f810a43d270-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"9cf43ab7-e625-4ffa-9af4-9f810a43d270\") " pod="openstack/cinder-api-0" Mar 09 13:41:37 crc kubenswrapper[4764]: I0309 13:41:37.402329 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9cf43ab7-e625-4ffa-9af4-9f810a43d270-scripts\") pod \"cinder-api-0\" (UID: \"9cf43ab7-e625-4ffa-9af4-9f810a43d270\") " pod="openstack/cinder-api-0" Mar 09 13:41:37 crc 
kubenswrapper[4764]: I0309 13:41:37.401801 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9cf43ab7-e625-4ffa-9af4-9f810a43d270-etc-machine-id\") pod \"cinder-api-0\" (UID: \"9cf43ab7-e625-4ffa-9af4-9f810a43d270\") " pod="openstack/cinder-api-0" Mar 09 13:41:37 crc kubenswrapper[4764]: I0309 13:41:37.403087 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9cf43ab7-e625-4ffa-9af4-9f810a43d270-logs\") pod \"cinder-api-0\" (UID: \"9cf43ab7-e625-4ffa-9af4-9f810a43d270\") " pod="openstack/cinder-api-0" Mar 09 13:41:37 crc kubenswrapper[4764]: I0309 13:41:37.408527 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9cf43ab7-e625-4ffa-9af4-9f810a43d270-scripts\") pod \"cinder-api-0\" (UID: \"9cf43ab7-e625-4ffa-9af4-9f810a43d270\") " pod="openstack/cinder-api-0" Mar 09 13:41:37 crc kubenswrapper[4764]: I0309 13:41:37.415178 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9cf43ab7-e625-4ffa-9af4-9f810a43d270-public-tls-certs\") pod \"cinder-api-0\" (UID: \"9cf43ab7-e625-4ffa-9af4-9f810a43d270\") " pod="openstack/cinder-api-0" Mar 09 13:41:37 crc kubenswrapper[4764]: I0309 13:41:37.415466 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9cf43ab7-e625-4ffa-9af4-9f810a43d270-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"9cf43ab7-e625-4ffa-9af4-9f810a43d270\") " pod="openstack/cinder-api-0" Mar 09 13:41:37 crc kubenswrapper[4764]: I0309 13:41:37.415771 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9cf43ab7-e625-4ffa-9af4-9f810a43d270-config-data-custom\") pod \"cinder-api-0\" (UID: 
\"9cf43ab7-e625-4ffa-9af4-9f810a43d270\") " pod="openstack/cinder-api-0" Mar 09 13:41:37 crc kubenswrapper[4764]: I0309 13:41:37.430974 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9cf43ab7-e625-4ffa-9af4-9f810a43d270-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"9cf43ab7-e625-4ffa-9af4-9f810a43d270\") " pod="openstack/cinder-api-0" Mar 09 13:41:37 crc kubenswrapper[4764]: I0309 13:41:37.431069 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9cf43ab7-e625-4ffa-9af4-9f810a43d270-config-data\") pod \"cinder-api-0\" (UID: \"9cf43ab7-e625-4ffa-9af4-9f810a43d270\") " pod="openstack/cinder-api-0" Mar 09 13:41:37 crc kubenswrapper[4764]: I0309 13:41:37.438206 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rfrtf\" (UniqueName: \"kubernetes.io/projected/9cf43ab7-e625-4ffa-9af4-9f810a43d270-kube-api-access-rfrtf\") pod \"cinder-api-0\" (UID: \"9cf43ab7-e625-4ffa-9af4-9f810a43d270\") " pod="openstack/cinder-api-0" Mar 09 13:41:37 crc kubenswrapper[4764]: I0309 13:41:37.541199 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Mar 09 13:41:37 crc kubenswrapper[4764]: I0309 13:41:37.552373 4764 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/neutron-6c97985d69-khcvg" podUID="5d65fa53-be02-4d11-b300-5cb4629c03da" containerName="neutron-httpd" probeResult="failure" output="Get \"https://10.217.0.146:9696/\": dial tcp 10.217.0.146:9696: connect: connection refused" Mar 09 13:41:37 crc kubenswrapper[4764]: I0309 13:41:37.583853 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8c43805e-424a-4820-924b-314b3e2f0a84" path="/var/lib/kubelet/pods/8c43805e-424a-4820-924b-314b3e2f0a84/volumes" Mar 09 13:41:37 crc kubenswrapper[4764]: I0309 13:41:37.726950 4764 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/barbican-api-7dcdc6487b-j8w75" podUID="60a250ee-f349-4bab-b5b4-b402289210a6" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.153:9311/healthcheck\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 09 13:41:38 crc kubenswrapper[4764]: I0309 13:41:38.096930 4764 generic.go:334] "Generic (PLEG): container finished" podID="5d65fa53-be02-4d11-b300-5cb4629c03da" containerID="cff13fa571b5cb0d7d8cf860c4a1efdb55518757df6a7508bbd7dca053bc0e44" exitCode=0 Mar 09 13:41:38 crc kubenswrapper[4764]: I0309 13:41:38.097446 4764 generic.go:334] "Generic (PLEG): container finished" podID="5d65fa53-be02-4d11-b300-5cb4629c03da" containerID="af86c5d555559ee78e3558628ecf704c4c51493754120be1f860b79edb606896" exitCode=0 Mar 09 13:41:38 crc kubenswrapper[4764]: I0309 13:41:38.097594 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6c97985d69-khcvg" event={"ID":"5d65fa53-be02-4d11-b300-5cb4629c03da","Type":"ContainerDied","Data":"cff13fa571b5cb0d7d8cf860c4a1efdb55518757df6a7508bbd7dca053bc0e44"} Mar 09 13:41:38 crc kubenswrapper[4764]: I0309 13:41:38.097658 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/neutron-6c97985d69-khcvg" event={"ID":"5d65fa53-be02-4d11-b300-5cb4629c03da","Type":"ContainerDied","Data":"af86c5d555559ee78e3558628ecf704c4c51493754120be1f860b79edb606896"} Mar 09 13:41:38 crc kubenswrapper[4764]: I0309 13:41:38.111084 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6b7bfdfd5-56dnz" event={"ID":"fd7dadfc-b8e4-479f-8880-4ffeec051d30","Type":"ContainerStarted","Data":"6f8188cfb291dea06d23e38b0677f28aab694b1f88c37dae466403569d1a0201"} Mar 09 13:41:38 crc kubenswrapper[4764]: I0309 13:41:38.111153 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6b7bfdfd5-56dnz" event={"ID":"fd7dadfc-b8e4-479f-8880-4ffeec051d30","Type":"ContainerStarted","Data":"1529a25d88fc576b021e03beb3031e5bc9556086c62d711281315ccd609d8726"} Mar 09 13:41:38 crc kubenswrapper[4764]: I0309 13:41:38.511385 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Mar 09 13:41:38 crc kubenswrapper[4764]: W0309 13:41:38.523789 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9cf43ab7_e625_4ffa_9af4_9f810a43d270.slice/crio-16e6a56dec1e899bdf9119ed1f909410e74b18a5610ad6a7a75a8cc5bd666b78 WatchSource:0}: Error finding container 16e6a56dec1e899bdf9119ed1f909410e74b18a5610ad6a7a75a8cc5bd666b78: Status 404 returned error can't find the container with id 16e6a56dec1e899bdf9119ed1f909410e74b18a5610ad6a7a75a8cc5bd666b78 Mar 09 13:41:38 crc kubenswrapper[4764]: I0309 13:41:38.656857 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-6c97985d69-khcvg"
Mar 09 13:41:38 crc kubenswrapper[4764]: I0309 13:41:38.745442 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5d65fa53-be02-4d11-b300-5cb4629c03da-public-tls-certs\") pod \"5d65fa53-be02-4d11-b300-5cb4629c03da\" (UID: \"5d65fa53-be02-4d11-b300-5cb4629c03da\") "
Mar 09 13:41:38 crc kubenswrapper[4764]: I0309 13:41:38.761173 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bnnlq\" (UniqueName: \"kubernetes.io/projected/5d65fa53-be02-4d11-b300-5cb4629c03da-kube-api-access-bnnlq\") pod \"5d65fa53-be02-4d11-b300-5cb4629c03da\" (UID: \"5d65fa53-be02-4d11-b300-5cb4629c03da\") "
Mar 09 13:41:38 crc kubenswrapper[4764]: I0309 13:41:38.761281 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5d65fa53-be02-4d11-b300-5cb4629c03da-internal-tls-certs\") pod \"5d65fa53-be02-4d11-b300-5cb4629c03da\" (UID: \"5d65fa53-be02-4d11-b300-5cb4629c03da\") "
Mar 09 13:41:38 crc kubenswrapper[4764]: I0309 13:41:38.761382 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/5d65fa53-be02-4d11-b300-5cb4629c03da-ovndb-tls-certs\") pod \"5d65fa53-be02-4d11-b300-5cb4629c03da\" (UID: \"5d65fa53-be02-4d11-b300-5cb4629c03da\") "
Mar 09 13:41:38 crc kubenswrapper[4764]: I0309 13:41:38.761458 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/5d65fa53-be02-4d11-b300-5cb4629c03da-config\") pod \"5d65fa53-be02-4d11-b300-5cb4629c03da\" (UID: \"5d65fa53-be02-4d11-b300-5cb4629c03da\") "
Mar 09 13:41:38 crc kubenswrapper[4764]: I0309 13:41:38.761608 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/5d65fa53-be02-4d11-b300-5cb4629c03da-httpd-config\") pod \"5d65fa53-be02-4d11-b300-5cb4629c03da\" (UID: \"5d65fa53-be02-4d11-b300-5cb4629c03da\") "
Mar 09 13:41:38 crc kubenswrapper[4764]: I0309 13:41:38.761716 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d65fa53-be02-4d11-b300-5cb4629c03da-combined-ca-bundle\") pod \"5d65fa53-be02-4d11-b300-5cb4629c03da\" (UID: \"5d65fa53-be02-4d11-b300-5cb4629c03da\") "
Mar 09 13:41:38 crc kubenswrapper[4764]: I0309 13:41:38.784148 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5d65fa53-be02-4d11-b300-5cb4629c03da-kube-api-access-bnnlq" (OuterVolumeSpecName: "kube-api-access-bnnlq") pod "5d65fa53-be02-4d11-b300-5cb4629c03da" (UID: "5d65fa53-be02-4d11-b300-5cb4629c03da"). InnerVolumeSpecName "kube-api-access-bnnlq". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 13:41:38 crc kubenswrapper[4764]: I0309 13:41:38.798840 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5d65fa53-be02-4d11-b300-5cb4629c03da-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "5d65fa53-be02-4d11-b300-5cb4629c03da" (UID: "5d65fa53-be02-4d11-b300-5cb4629c03da"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 13:41:38 crc kubenswrapper[4764]: I0309 13:41:38.845049 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5d65fa53-be02-4d11-b300-5cb4629c03da-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "5d65fa53-be02-4d11-b300-5cb4629c03da" (UID: "5d65fa53-be02-4d11-b300-5cb4629c03da"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 13:41:38 crc kubenswrapper[4764]: I0309 13:41:38.859410 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5d65fa53-be02-4d11-b300-5cb4629c03da-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5d65fa53-be02-4d11-b300-5cb4629c03da" (UID: "5d65fa53-be02-4d11-b300-5cb4629c03da"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 13:41:38 crc kubenswrapper[4764]: I0309 13:41:38.860844 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5d65fa53-be02-4d11-b300-5cb4629c03da-config" (OuterVolumeSpecName: "config") pod "5d65fa53-be02-4d11-b300-5cb4629c03da" (UID: "5d65fa53-be02-4d11-b300-5cb4629c03da"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 13:41:38 crc kubenswrapper[4764]: I0309 13:41:38.863956 4764 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/5d65fa53-be02-4d11-b300-5cb4629c03da-httpd-config\") on node \"crc\" DevicePath \"\""
Mar 09 13:41:38 crc kubenswrapper[4764]: I0309 13:41:38.863987 4764 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d65fa53-be02-4d11-b300-5cb4629c03da-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 09 13:41:38 crc kubenswrapper[4764]: I0309 13:41:38.863998 4764 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5d65fa53-be02-4d11-b300-5cb4629c03da-public-tls-certs\") on node \"crc\" DevicePath \"\""
Mar 09 13:41:38 crc kubenswrapper[4764]: I0309 13:41:38.864007 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bnnlq\" (UniqueName: \"kubernetes.io/projected/5d65fa53-be02-4d11-b300-5cb4629c03da-kube-api-access-bnnlq\") on node \"crc\" DevicePath \"\""
Mar 09 13:41:38 crc kubenswrapper[4764]: I0309 13:41:38.864016 4764 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/5d65fa53-be02-4d11-b300-5cb4629c03da-config\") on node \"crc\" DevicePath \"\""
Mar 09 13:41:38 crc kubenswrapper[4764]: I0309 13:41:38.864583 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5d65fa53-be02-4d11-b300-5cb4629c03da-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "5d65fa53-be02-4d11-b300-5cb4629c03da" (UID: "5d65fa53-be02-4d11-b300-5cb4629c03da"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 13:41:38 crc kubenswrapper[4764]: I0309 13:41:38.888824 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5d65fa53-be02-4d11-b300-5cb4629c03da-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "5d65fa53-be02-4d11-b300-5cb4629c03da" (UID: "5d65fa53-be02-4d11-b300-5cb4629c03da"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 13:41:38 crc kubenswrapper[4764]: I0309 13:41:38.970375 4764 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5d65fa53-be02-4d11-b300-5cb4629c03da-internal-tls-certs\") on node \"crc\" DevicePath \"\""
Mar 09 13:41:38 crc kubenswrapper[4764]: I0309 13:41:38.974707 4764 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/5d65fa53-be02-4d11-b300-5cb4629c03da-ovndb-tls-certs\") on node \"crc\" DevicePath \"\""
Mar 09 13:41:39 crc kubenswrapper[4764]: I0309 13:41:39.124456 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6c97985d69-khcvg" event={"ID":"5d65fa53-be02-4d11-b300-5cb4629c03da","Type":"ContainerDied","Data":"63c3a5316fc5ec3fc301ebe753725e716ab876289ed6944f416ebd4b78894abb"}
Mar 09 13:41:39 crc kubenswrapper[4764]: I0309 13:41:39.124525 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-6c97985d69-khcvg"
Mar 09 13:41:39 crc kubenswrapper[4764]: I0309 13:41:39.124538 4764 scope.go:117] "RemoveContainer" containerID="cff13fa571b5cb0d7d8cf860c4a1efdb55518757df6a7508bbd7dca053bc0e44"
Mar 09 13:41:39 crc kubenswrapper[4764]: I0309 13:41:39.141431 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"9cf43ab7-e625-4ffa-9af4-9f810a43d270","Type":"ContainerStarted","Data":"16e6a56dec1e899bdf9119ed1f909410e74b18a5610ad6a7a75a8cc5bd666b78"}
Mar 09 13:41:39 crc kubenswrapper[4764]: I0309 13:41:39.154813 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6b7bfdfd5-56dnz" event={"ID":"fd7dadfc-b8e4-479f-8880-4ffeec051d30","Type":"ContainerStarted","Data":"4d891998d26f933839c81257749a0591a2f852656838ddfe2f49f0007eedf1e7"}
Mar 09 13:41:39 crc kubenswrapper[4764]: I0309 13:41:39.155816 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-6b7bfdfd5-56dnz"
Mar 09 13:41:39 crc kubenswrapper[4764]: I0309 13:41:39.163572 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-6c97985d69-khcvg"]
Mar 09 13:41:39 crc kubenswrapper[4764]: I0309 13:41:39.174590 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-6c97985d69-khcvg"]
Mar 09 13:41:39 crc kubenswrapper[4764]: I0309 13:41:39.192863 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-6b7bfdfd5-56dnz" podStartSLOduration=4.192837026 podStartE2EDuration="4.192837026s" podCreationTimestamp="2026-03-09 13:41:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:41:39.192457256 +0000 UTC m=+1254.442629164" watchObservedRunningTime="2026-03-09 13:41:39.192837026 +0000 UTC m=+1254.443008934"
Mar 09 13:41:39 crc kubenswrapper[4764]: I0309 13:41:39.281960 4764 scope.go:117] "RemoveContainer" containerID="af86c5d555559ee78e3558628ecf704c4c51493754120be1f860b79edb606896"
Mar 09 13:41:39 crc kubenswrapper[4764]: I0309 13:41:39.513529 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-94887676d-fp9dl"
Mar 09 13:41:39 crc kubenswrapper[4764]: I0309 13:41:39.586328 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5d65fa53-be02-4d11-b300-5cb4629c03da" path="/var/lib/kubelet/pods/5d65fa53-be02-4d11-b300-5cb4629c03da/volumes"
Mar 09 13:41:40 crc kubenswrapper[4764]: I0309 13:41:40.128903 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-94887676d-fp9dl"
Mar 09 13:41:40 crc kubenswrapper[4764]: I0309 13:41:40.173686 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"9cf43ab7-e625-4ffa-9af4-9f810a43d270","Type":"ContainerStarted","Data":"e31a2bbf58badcb3152b918a495470f759eaa31d486f175e453e447ec70b971f"}
Mar 09 13:41:40 crc kubenswrapper[4764]: I0309 13:41:40.173744 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"9cf43ab7-e625-4ffa-9af4-9f810a43d270","Type":"ContainerStarted","Data":"5ca851280bdd40db8d723ce3b940914a960b436b472d7f96ef6e9363f1f7d55a"}
Mar 09 13:41:40 crc kubenswrapper[4764]: I0309 13:41:40.173854 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0"
Mar 09 13:41:40 crc kubenswrapper[4764]: I0309 13:41:40.202924 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=3.202905274 podStartE2EDuration="3.202905274s" podCreationTimestamp="2026-03-09 13:41:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:41:40.193314303 +0000 UTC m=+1255.443486231" watchObservedRunningTime="2026-03-09 13:41:40.202905274 +0000 UTC m=+1255.453077182"
Mar 09 13:41:40 crc kubenswrapper[4764]: I0309 13:41:40.230813 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-7dcdc6487b-j8w75"]
Mar 09 13:41:40 crc kubenswrapper[4764]: I0309 13:41:40.231073 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-7dcdc6487b-j8w75" podUID="60a250ee-f349-4bab-b5b4-b402289210a6" containerName="barbican-api-log" containerID="cri-o://4c95fd726c1b28b4f8ab8a6d5ebf1002f0c9dea4159ff74f568392de3a2ff067" gracePeriod=30
Mar 09 13:41:40 crc kubenswrapper[4764]: I0309 13:41:40.231213 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-7dcdc6487b-j8w75" podUID="60a250ee-f349-4bab-b5b4-b402289210a6" containerName="barbican-api" containerID="cri-o://73990548f38f06ad2795c4de799dea38305ca7065b97482d22fcdeed8a8051c5" gracePeriod=30
Mar 09 13:41:41 crc kubenswrapper[4764]: I0309 13:41:41.189532 4764 generic.go:334] "Generic (PLEG): container finished" podID="60a250ee-f349-4bab-b5b4-b402289210a6" containerID="4c95fd726c1b28b4f8ab8a6d5ebf1002f0c9dea4159ff74f568392de3a2ff067" exitCode=143
Mar 09 13:41:41 crc kubenswrapper[4764]: I0309 13:41:41.189612 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7dcdc6487b-j8w75" event={"ID":"60a250ee-f349-4bab-b5b4-b402289210a6","Type":"ContainerDied","Data":"4c95fd726c1b28b4f8ab8a6d5ebf1002f0c9dea4159ff74f568392de3a2ff067"}
Mar 09 13:41:41 crc kubenswrapper[4764]: I0309 13:41:41.466000 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0"
Mar 09 13:41:41 crc kubenswrapper[4764]: I0309 13:41:41.516655 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"]
Mar 09 13:41:41 crc kubenswrapper[4764]: I0309 13:41:41.532689 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-58db5546cc-d2v25"
Mar 09 13:41:41 crc kubenswrapper[4764]: I0309 13:41:41.669670 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5f66db59b9-sh5rn"]
Mar 09 13:41:41 crc kubenswrapper[4764]: I0309 13:41:41.669949 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5f66db59b9-sh5rn" podUID="5b2b268a-adc9-46ca-908a-d30ab8543059" containerName="dnsmasq-dns" containerID="cri-o://20dab72734e5903795d478bac44970ac57f0690f8f454dee9dae223fbd3e92ac" gracePeriod=10
Mar 09 13:41:41 crc kubenswrapper[4764]: I0309 13:41:41.719850 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-586d68b4fd-xj4tk"
Mar 09 13:41:41 crc kubenswrapper[4764]: I0309 13:41:41.958551 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-586d68b4fd-xj4tk"
Mar 09 13:41:42 crc kubenswrapper[4764]: I0309 13:41:42.205726 4764 generic.go:334] "Generic (PLEG): container finished" podID="5b2b268a-adc9-46ca-908a-d30ab8543059" containerID="20dab72734e5903795d478bac44970ac57f0690f8f454dee9dae223fbd3e92ac" exitCode=0
Mar 09 13:41:42 crc kubenswrapper[4764]: I0309 13:41:42.206065 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="6afeb6a7-a0a0-40de-90ee-97b497663798" containerName="cinder-scheduler" containerID="cri-o://3048f7a722fe7b1924f522a8fa95862bc27ca27611f2c6a75277ad4a82b3a7d9" gracePeriod=30
Mar 09 13:41:42 crc kubenswrapper[4764]: I0309 13:41:42.206500 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f66db59b9-sh5rn" event={"ID":"5b2b268a-adc9-46ca-908a-d30ab8543059","Type":"ContainerDied","Data":"20dab72734e5903795d478bac44970ac57f0690f8f454dee9dae223fbd3e92ac"}
Mar 09 13:41:42 crc kubenswrapper[4764]: I0309 13:41:42.206532 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f66db59b9-sh5rn" event={"ID":"5b2b268a-adc9-46ca-908a-d30ab8543059","Type":"ContainerDied","Data":"d82d6336b352da3e8f6c37dd12f8ae1d4a1f5e3d8a23342826dc108286471929"}
Mar 09 13:41:42 crc kubenswrapper[4764]: I0309 13:41:42.206544 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d82d6336b352da3e8f6c37dd12f8ae1d4a1f5e3d8a23342826dc108286471929"
Mar 09 13:41:42 crc kubenswrapper[4764]: I0309 13:41:42.208019 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="6afeb6a7-a0a0-40de-90ee-97b497663798" containerName="probe" containerID="cri-o://dd358903bd4bede8bf029bbf11c70e3e47fe50f447bead2b369061bf66fe7262" gracePeriod=30
Mar 09 13:41:42 crc kubenswrapper[4764]: I0309 13:41:42.279310 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5f66db59b9-sh5rn"
Mar 09 13:41:42 crc kubenswrapper[4764]: I0309 13:41:42.382823 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5b2b268a-adc9-46ca-908a-d30ab8543059-ovsdbserver-sb\") pod \"5b2b268a-adc9-46ca-908a-d30ab8543059\" (UID: \"5b2b268a-adc9-46ca-908a-d30ab8543059\") "
Mar 09 13:41:42 crc kubenswrapper[4764]: I0309 13:41:42.383190 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5b2b268a-adc9-46ca-908a-d30ab8543059-ovsdbserver-nb\") pod \"5b2b268a-adc9-46ca-908a-d30ab8543059\" (UID: \"5b2b268a-adc9-46ca-908a-d30ab8543059\") "
Mar 09 13:41:42 crc kubenswrapper[4764]: I0309 13:41:42.383215 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5b2b268a-adc9-46ca-908a-d30ab8543059-config\") pod \"5b2b268a-adc9-46ca-908a-d30ab8543059\" (UID: \"5b2b268a-adc9-46ca-908a-d30ab8543059\") "
Mar 09 13:41:42 crc kubenswrapper[4764]: I0309 13:41:42.383468 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f92p7\" (UniqueName: \"kubernetes.io/projected/5b2b268a-adc9-46ca-908a-d30ab8543059-kube-api-access-f92p7\") pod \"5b2b268a-adc9-46ca-908a-d30ab8543059\" (UID: \"5b2b268a-adc9-46ca-908a-d30ab8543059\") "
Mar 09 13:41:42 crc kubenswrapper[4764]: I0309 13:41:42.383492 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5b2b268a-adc9-46ca-908a-d30ab8543059-dns-svc\") pod \"5b2b268a-adc9-46ca-908a-d30ab8543059\" (UID: \"5b2b268a-adc9-46ca-908a-d30ab8543059\") "
Mar 09 13:41:42 crc kubenswrapper[4764]: I0309 13:41:42.408049 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b2b268a-adc9-46ca-908a-d30ab8543059-kube-api-access-f92p7" (OuterVolumeSpecName: "kube-api-access-f92p7") pod "5b2b268a-adc9-46ca-908a-d30ab8543059" (UID: "5b2b268a-adc9-46ca-908a-d30ab8543059"). InnerVolumeSpecName "kube-api-access-f92p7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 13:41:42 crc kubenswrapper[4764]: I0309 13:41:42.438387 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5b2b268a-adc9-46ca-908a-d30ab8543059-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "5b2b268a-adc9-46ca-908a-d30ab8543059" (UID: "5b2b268a-adc9-46ca-908a-d30ab8543059"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 13:41:42 crc kubenswrapper[4764]: I0309 13:41:42.438570 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5b2b268a-adc9-46ca-908a-d30ab8543059-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "5b2b268a-adc9-46ca-908a-d30ab8543059" (UID: "5b2b268a-adc9-46ca-908a-d30ab8543059"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 13:41:42 crc kubenswrapper[4764]: I0309 13:41:42.444743 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5b2b268a-adc9-46ca-908a-d30ab8543059-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "5b2b268a-adc9-46ca-908a-d30ab8543059" (UID: "5b2b268a-adc9-46ca-908a-d30ab8543059"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 13:41:42 crc kubenswrapper[4764]: I0309 13:41:42.465491 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5b2b268a-adc9-46ca-908a-d30ab8543059-config" (OuterVolumeSpecName: "config") pod "5b2b268a-adc9-46ca-908a-d30ab8543059" (UID: "5b2b268a-adc9-46ca-908a-d30ab8543059"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 13:41:42 crc kubenswrapper[4764]: I0309 13:41:42.485770 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f92p7\" (UniqueName: \"kubernetes.io/projected/5b2b268a-adc9-46ca-908a-d30ab8543059-kube-api-access-f92p7\") on node \"crc\" DevicePath \"\""
Mar 09 13:41:42 crc kubenswrapper[4764]: I0309 13:41:42.485805 4764 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5b2b268a-adc9-46ca-908a-d30ab8543059-dns-svc\") on node \"crc\" DevicePath \"\""
Mar 09 13:41:42 crc kubenswrapper[4764]: I0309 13:41:42.485817 4764 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5b2b268a-adc9-46ca-908a-d30ab8543059-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Mar 09 13:41:42 crc kubenswrapper[4764]: I0309 13:41:42.485826 4764 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5b2b268a-adc9-46ca-908a-d30ab8543059-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Mar 09 13:41:42 crc kubenswrapper[4764]: I0309 13:41:42.485834 4764 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5b2b268a-adc9-46ca-908a-d30ab8543059-config\") on node \"crc\" DevicePath \"\""
Mar 09 13:41:43 crc kubenswrapper[4764]: I0309 13:41:43.214148 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5f66db59b9-sh5rn"
Mar 09 13:41:43 crc kubenswrapper[4764]: I0309 13:41:43.255957 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5f66db59b9-sh5rn"]
Mar 09 13:41:43 crc kubenswrapper[4764]: I0309 13:41:43.263271 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5f66db59b9-sh5rn"]
Mar 09 13:41:43 crc kubenswrapper[4764]: I0309 13:41:43.576106 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b2b268a-adc9-46ca-908a-d30ab8543059" path="/var/lib/kubelet/pods/5b2b268a-adc9-46ca-908a-d30ab8543059/volumes"
Mar 09 13:41:43 crc kubenswrapper[4764]: I0309 13:41:43.686449 4764 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-7dcdc6487b-j8w75" podUID="60a250ee-f349-4bab-b5b4-b402289210a6" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.153:9311/healthcheck\": dial tcp 10.217.0.153:9311: connect: connection refused"
Mar 09 13:41:43 crc kubenswrapper[4764]: I0309 13:41:43.692005 4764 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-7dcdc6487b-j8w75" podUID="60a250ee-f349-4bab-b5b4-b402289210a6" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.153:9311/healthcheck\": dial tcp 10.217.0.153:9311: connect: connection refused"
Mar 09 13:41:44 crc kubenswrapper[4764]: I0309 13:41:44.077541 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-7dcdc6487b-j8w75"
Mar 09 13:41:44 crc kubenswrapper[4764]: I0309 13:41:44.188406 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/60a250ee-f349-4bab-b5b4-b402289210a6-config-data\") pod \"60a250ee-f349-4bab-b5b4-b402289210a6\" (UID: \"60a250ee-f349-4bab-b5b4-b402289210a6\") "
Mar 09 13:41:44 crc kubenswrapper[4764]: I0309 13:41:44.188631 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/60a250ee-f349-4bab-b5b4-b402289210a6-logs\") pod \"60a250ee-f349-4bab-b5b4-b402289210a6\" (UID: \"60a250ee-f349-4bab-b5b4-b402289210a6\") "
Mar 09 13:41:44 crc kubenswrapper[4764]: I0309 13:41:44.188789 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60a250ee-f349-4bab-b5b4-b402289210a6-combined-ca-bundle\") pod \"60a250ee-f349-4bab-b5b4-b402289210a6\" (UID: \"60a250ee-f349-4bab-b5b4-b402289210a6\") "
Mar 09 13:41:44 crc kubenswrapper[4764]: I0309 13:41:44.188861 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gstx5\" (UniqueName: \"kubernetes.io/projected/60a250ee-f349-4bab-b5b4-b402289210a6-kube-api-access-gstx5\") pod \"60a250ee-f349-4bab-b5b4-b402289210a6\" (UID: \"60a250ee-f349-4bab-b5b4-b402289210a6\") "
Mar 09 13:41:44 crc kubenswrapper[4764]: I0309 13:41:44.188992 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/60a250ee-f349-4bab-b5b4-b402289210a6-config-data-custom\") pod \"60a250ee-f349-4bab-b5b4-b402289210a6\" (UID: \"60a250ee-f349-4bab-b5b4-b402289210a6\") "
Mar 09 13:41:44 crc kubenswrapper[4764]: I0309 13:41:44.190131 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/60a250ee-f349-4bab-b5b4-b402289210a6-logs" (OuterVolumeSpecName: "logs") pod "60a250ee-f349-4bab-b5b4-b402289210a6" (UID: "60a250ee-f349-4bab-b5b4-b402289210a6"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 09 13:41:44 crc kubenswrapper[4764]: I0309 13:41:44.198420 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/60a250ee-f349-4bab-b5b4-b402289210a6-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "60a250ee-f349-4bab-b5b4-b402289210a6" (UID: "60a250ee-f349-4bab-b5b4-b402289210a6"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 13:41:44 crc kubenswrapper[4764]: I0309 13:41:44.198731 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/60a250ee-f349-4bab-b5b4-b402289210a6-kube-api-access-gstx5" (OuterVolumeSpecName: "kube-api-access-gstx5") pod "60a250ee-f349-4bab-b5b4-b402289210a6" (UID: "60a250ee-f349-4bab-b5b4-b402289210a6"). InnerVolumeSpecName "kube-api-access-gstx5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 13:41:44 crc kubenswrapper[4764]: I0309 13:41:44.227497 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-f85c59cb-gm4df"
Mar 09 13:41:44 crc kubenswrapper[4764]: I0309 13:41:44.241881 4764 generic.go:334] "Generic (PLEG): container finished" podID="60a250ee-f349-4bab-b5b4-b402289210a6" containerID="73990548f38f06ad2795c4de799dea38305ca7065b97482d22fcdeed8a8051c5" exitCode=0
Mar 09 13:41:44 crc kubenswrapper[4764]: I0309 13:41:44.242042 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7dcdc6487b-j8w75" event={"ID":"60a250ee-f349-4bab-b5b4-b402289210a6","Type":"ContainerDied","Data":"73990548f38f06ad2795c4de799dea38305ca7065b97482d22fcdeed8a8051c5"}
Mar 09 13:41:44 crc kubenswrapper[4764]: I0309 13:41:44.242126 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7dcdc6487b-j8w75" event={"ID":"60a250ee-f349-4bab-b5b4-b402289210a6","Type":"ContainerDied","Data":"60454879be0f224ed5bb74217d2090bdcd6efc318e5799fb2111033ee5eacbe4"}
Mar 09 13:41:44 crc kubenswrapper[4764]: I0309 13:41:44.242193 4764 scope.go:117] "RemoveContainer" containerID="73990548f38f06ad2795c4de799dea38305ca7065b97482d22fcdeed8a8051c5"
Mar 09 13:41:44 crc kubenswrapper[4764]: I0309 13:41:44.242389 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-7dcdc6487b-j8w75"
Mar 09 13:41:44 crc kubenswrapper[4764]: I0309 13:41:44.246869 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/60a250ee-f349-4bab-b5b4-b402289210a6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "60a250ee-f349-4bab-b5b4-b402289210a6" (UID: "60a250ee-f349-4bab-b5b4-b402289210a6"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 13:41:44 crc kubenswrapper[4764]: I0309 13:41:44.259042 4764 generic.go:334] "Generic (PLEG): container finished" podID="6afeb6a7-a0a0-40de-90ee-97b497663798" containerID="dd358903bd4bede8bf029bbf11c70e3e47fe50f447bead2b369061bf66fe7262" exitCode=0
Mar 09 13:41:44 crc kubenswrapper[4764]: I0309 13:41:44.259082 4764 generic.go:334] "Generic (PLEG): container finished" podID="6afeb6a7-a0a0-40de-90ee-97b497663798" containerID="3048f7a722fe7b1924f522a8fa95862bc27ca27611f2c6a75277ad4a82b3a7d9" exitCode=0
Mar 09 13:41:44 crc kubenswrapper[4764]: I0309 13:41:44.259110 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"6afeb6a7-a0a0-40de-90ee-97b497663798","Type":"ContainerDied","Data":"dd358903bd4bede8bf029bbf11c70e3e47fe50f447bead2b369061bf66fe7262"}
Mar 09 13:41:44 crc kubenswrapper[4764]: I0309 13:41:44.259147 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"6afeb6a7-a0a0-40de-90ee-97b497663798","Type":"ContainerDied","Data":"3048f7a722fe7b1924f522a8fa95862bc27ca27611f2c6a75277ad4a82b3a7d9"}
Mar 09 13:41:44 crc kubenswrapper[4764]: I0309 13:41:44.296685 4764 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/60a250ee-f349-4bab-b5b4-b402289210a6-logs\") on node \"crc\" DevicePath \"\""
Mar 09 13:41:44 crc kubenswrapper[4764]: I0309 13:41:44.296726 4764 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60a250ee-f349-4bab-b5b4-b402289210a6-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 09 13:41:44 crc kubenswrapper[4764]: I0309 13:41:44.296833 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gstx5\" (UniqueName: \"kubernetes.io/projected/60a250ee-f349-4bab-b5b4-b402289210a6-kube-api-access-gstx5\") on node \"crc\" DevicePath \"\""
Mar 09 13:41:44 crc kubenswrapper[4764]: I0309 13:41:44.296850 4764 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/60a250ee-f349-4bab-b5b4-b402289210a6-config-data-custom\") on node \"crc\" DevicePath \"\""
Mar 09 13:41:44 crc kubenswrapper[4764]: I0309 13:41:44.329010 4764 scope.go:117] "RemoveContainer" containerID="4c95fd726c1b28b4f8ab8a6d5ebf1002f0c9dea4159ff74f568392de3a2ff067"
Mar 09 13:41:44 crc kubenswrapper[4764]: I0309 13:41:44.350070 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/60a250ee-f349-4bab-b5b4-b402289210a6-config-data" (OuterVolumeSpecName: "config-data") pod "60a250ee-f349-4bab-b5b4-b402289210a6" (UID: "60a250ee-f349-4bab-b5b4-b402289210a6"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 13:41:44 crc kubenswrapper[4764]: I0309 13:41:44.377868 4764 scope.go:117] "RemoveContainer" containerID="73990548f38f06ad2795c4de799dea38305ca7065b97482d22fcdeed8a8051c5"
Mar 09 13:41:44 crc kubenswrapper[4764]: E0309 13:41:44.389795 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"73990548f38f06ad2795c4de799dea38305ca7065b97482d22fcdeed8a8051c5\": container with ID starting with 73990548f38f06ad2795c4de799dea38305ca7065b97482d22fcdeed8a8051c5 not found: ID does not exist" containerID="73990548f38f06ad2795c4de799dea38305ca7065b97482d22fcdeed8a8051c5"
Mar 09 13:41:44 crc kubenswrapper[4764]: I0309 13:41:44.389851 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"73990548f38f06ad2795c4de799dea38305ca7065b97482d22fcdeed8a8051c5"} err="failed to get container status \"73990548f38f06ad2795c4de799dea38305ca7065b97482d22fcdeed8a8051c5\": rpc error: code = NotFound desc = could not find container \"73990548f38f06ad2795c4de799dea38305ca7065b97482d22fcdeed8a8051c5\": container with ID starting with 73990548f38f06ad2795c4de799dea38305ca7065b97482d22fcdeed8a8051c5 not found: ID does not exist"
Mar 09 13:41:44 crc kubenswrapper[4764]: I0309 13:41:44.389880 4764 scope.go:117] "RemoveContainer" containerID="4c95fd726c1b28b4f8ab8a6d5ebf1002f0c9dea4159ff74f568392de3a2ff067"
Mar 09 13:41:44 crc kubenswrapper[4764]: E0309 13:41:44.390320 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4c95fd726c1b28b4f8ab8a6d5ebf1002f0c9dea4159ff74f568392de3a2ff067\": container with ID starting with 4c95fd726c1b28b4f8ab8a6d5ebf1002f0c9dea4159ff74f568392de3a2ff067 not found: ID does not exist" containerID="4c95fd726c1b28b4f8ab8a6d5ebf1002f0c9dea4159ff74f568392de3a2ff067"
Mar 09 13:41:44 crc kubenswrapper[4764]: I0309 13:41:44.390345 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4c95fd726c1b28b4f8ab8a6d5ebf1002f0c9dea4159ff74f568392de3a2ff067"} err="failed to get container status \"4c95fd726c1b28b4f8ab8a6d5ebf1002f0c9dea4159ff74f568392de3a2ff067\": rpc error: code = NotFound desc = could not find container \"4c95fd726c1b28b4f8ab8a6d5ebf1002f0c9dea4159ff74f568392de3a2ff067\": container with ID starting with 4c95fd726c1b28b4f8ab8a6d5ebf1002f0c9dea4159ff74f568392de3a2ff067 not found: ID does not exist"
Mar 09 13:41:44 crc kubenswrapper[4764]: I0309 13:41:44.398624 4764 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/60a250ee-f349-4bab-b5b4-b402289210a6-config-data\") on node \"crc\" DevicePath \"\""
Mar 09 13:41:44 crc kubenswrapper[4764]: I0309 13:41:44.433077 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-759c9c64fb-nwls6"
Mar 09 13:41:44 crc kubenswrapper[4764]: I0309 13:41:44.512394 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0"
Mar 09 13:41:44 crc kubenswrapper[4764]: I0309 13:41:44.589657 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-7dcdc6487b-j8w75"]
Mar 09 13:41:44 crc kubenswrapper[4764]: I0309 13:41:44.605536 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-7dcdc6487b-j8w75"]
Mar 09 13:41:44 crc kubenswrapper[4764]: I0309 13:41:44.705485 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6afeb6a7-a0a0-40de-90ee-97b497663798-combined-ca-bundle\") pod \"6afeb6a7-a0a0-40de-90ee-97b497663798\" (UID: \"6afeb6a7-a0a0-40de-90ee-97b497663798\") "
Mar 09 13:41:44 crc kubenswrapper[4764]: I0309 13:41:44.705758 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6afeb6a7-a0a0-40de-90ee-97b497663798-scripts\") pod \"6afeb6a7-a0a0-40de-90ee-97b497663798\" (UID: \"6afeb6a7-a0a0-40de-90ee-97b497663798\") "
Mar 09 13:41:44 crc kubenswrapper[4764]: I0309 13:41:44.705799 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6afeb6a7-a0a0-40de-90ee-97b497663798-etc-machine-id\") pod \"6afeb6a7-a0a0-40de-90ee-97b497663798\" (UID: \"6afeb6a7-a0a0-40de-90ee-97b497663798\") "
Mar 09 13:41:44 crc kubenswrapper[4764]: I0309 13:41:44.705938 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5rsb9\" (UniqueName: \"kubernetes.io/projected/6afeb6a7-a0a0-40de-90ee-97b497663798-kube-api-access-5rsb9\") pod \"6afeb6a7-a0a0-40de-90ee-97b497663798\" (UID: \"6afeb6a7-a0a0-40de-90ee-97b497663798\") "
Mar 09 13:41:44 crc kubenswrapper[4764]: I0309 13:41:44.706073 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6afeb6a7-a0a0-40de-90ee-97b497663798-config-data-custom\") pod \"6afeb6a7-a0a0-40de-90ee-97b497663798\" (UID: \"6afeb6a7-a0a0-40de-90ee-97b497663798\") "
Mar 09 13:41:44 crc kubenswrapper[4764]: I0309 13:41:44.706098 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6afeb6a7-a0a0-40de-90ee-97b497663798-config-data\") pod \"6afeb6a7-a0a0-40de-90ee-97b497663798\" (UID: \"6afeb6a7-a0a0-40de-90ee-97b497663798\") "
Mar 09 13:41:44 crc kubenswrapper[4764]: I0309 13:41:44.706338 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6afeb6a7-a0a0-40de-90ee-97b497663798-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "6afeb6a7-a0a0-40de-90ee-97b497663798" (UID: "6afeb6a7-a0a0-40de-90ee-97b497663798"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 09 13:41:44 crc kubenswrapper[4764]: I0309 13:41:44.706736 4764 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6afeb6a7-a0a0-40de-90ee-97b497663798-etc-machine-id\") on node \"crc\" DevicePath \"\""
Mar 09 13:41:44 crc kubenswrapper[4764]: I0309 13:41:44.712726 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6afeb6a7-a0a0-40de-90ee-97b497663798-scripts" (OuterVolumeSpecName: "scripts") pod "6afeb6a7-a0a0-40de-90ee-97b497663798" (UID: "6afeb6a7-a0a0-40de-90ee-97b497663798"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 13:41:44 crc kubenswrapper[4764]: I0309 13:41:44.714559 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6afeb6a7-a0a0-40de-90ee-97b497663798-kube-api-access-5rsb9" (OuterVolumeSpecName: "kube-api-access-5rsb9") pod "6afeb6a7-a0a0-40de-90ee-97b497663798" (UID: "6afeb6a7-a0a0-40de-90ee-97b497663798"). InnerVolumeSpecName "kube-api-access-5rsb9". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 13:41:44 crc kubenswrapper[4764]: I0309 13:41:44.715823 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6afeb6a7-a0a0-40de-90ee-97b497663798-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "6afeb6a7-a0a0-40de-90ee-97b497663798" (UID: "6afeb6a7-a0a0-40de-90ee-97b497663798"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 13:41:44 crc kubenswrapper[4764]: I0309 13:41:44.771824 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6afeb6a7-a0a0-40de-90ee-97b497663798-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6afeb6a7-a0a0-40de-90ee-97b497663798" (UID: "6afeb6a7-a0a0-40de-90ee-97b497663798"). InnerVolumeSpecName "combined-ca-bundle".
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:41:44 crc kubenswrapper[4764]: I0309 13:41:44.774479 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-f85c59cb-gm4df" Mar 09 13:41:44 crc kubenswrapper[4764]: I0309 13:41:44.812560 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5rsb9\" (UniqueName: \"kubernetes.io/projected/6afeb6a7-a0a0-40de-90ee-97b497663798-kube-api-access-5rsb9\") on node \"crc\" DevicePath \"\"" Mar 09 13:41:44 crc kubenswrapper[4764]: I0309 13:41:44.812598 4764 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6afeb6a7-a0a0-40de-90ee-97b497663798-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 09 13:41:44 crc kubenswrapper[4764]: I0309 13:41:44.812607 4764 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6afeb6a7-a0a0-40de-90ee-97b497663798-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 13:41:44 crc kubenswrapper[4764]: I0309 13:41:44.812616 4764 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6afeb6a7-a0a0-40de-90ee-97b497663798-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 13:41:44 crc kubenswrapper[4764]: I0309 13:41:44.833608 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6afeb6a7-a0a0-40de-90ee-97b497663798-config-data" (OuterVolumeSpecName: "config-data") pod "6afeb6a7-a0a0-40de-90ee-97b497663798" (UID: "6afeb6a7-a0a0-40de-90ee-97b497663798"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:41:44 crc kubenswrapper[4764]: I0309 13:41:44.870298 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-586d68b4fd-xj4tk"] Mar 09 13:41:44 crc kubenswrapper[4764]: I0309 13:41:44.870570 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-586d68b4fd-xj4tk" podUID="e89507a7-04a1-444b-b38a-40b001ec079a" containerName="placement-log" containerID="cri-o://846d1c4f6ed0185cda05e320aba446c1f9b2558f2a5da71ef147be109e27b726" gracePeriod=30 Mar 09 13:41:44 crc kubenswrapper[4764]: I0309 13:41:44.870771 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-586d68b4fd-xj4tk" podUID="e89507a7-04a1-444b-b38a-40b001ec079a" containerName="placement-api" containerID="cri-o://6939747e3ae2dc764ad9ad287190d81cffed4e3b8fef6031a7e72a9769a88873" gracePeriod=30 Mar 09 13:41:44 crc kubenswrapper[4764]: I0309 13:41:44.915097 4764 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6afeb6a7-a0a0-40de-90ee-97b497663798-config-data\") on node \"crc\" DevicePath \"\"" Mar 09 13:41:45 crc kubenswrapper[4764]: I0309 13:41:45.271231 4764 generic.go:334] "Generic (PLEG): container finished" podID="e89507a7-04a1-444b-b38a-40b001ec079a" containerID="846d1c4f6ed0185cda05e320aba446c1f9b2558f2a5da71ef147be109e27b726" exitCode=143 Mar 09 13:41:45 crc kubenswrapper[4764]: I0309 13:41:45.271314 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-586d68b4fd-xj4tk" event={"ID":"e89507a7-04a1-444b-b38a-40b001ec079a","Type":"ContainerDied","Data":"846d1c4f6ed0185cda05e320aba446c1f9b2558f2a5da71ef147be109e27b726"} Mar 09 13:41:45 crc kubenswrapper[4764]: I0309 13:41:45.277609 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" 
event={"ID":"6afeb6a7-a0a0-40de-90ee-97b497663798","Type":"ContainerDied","Data":"ca431b92c1671db841cf4cedf0d25ffe1db3212b3b5515413a5c717be63cab9f"} Mar 09 13:41:45 crc kubenswrapper[4764]: I0309 13:41:45.277768 4764 scope.go:117] "RemoveContainer" containerID="dd358903bd4bede8bf029bbf11c70e3e47fe50f447bead2b369061bf66fe7262" Mar 09 13:41:45 crc kubenswrapper[4764]: I0309 13:41:45.277777 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 09 13:41:45 crc kubenswrapper[4764]: I0309 13:41:45.306666 4764 scope.go:117] "RemoveContainer" containerID="3048f7a722fe7b1924f522a8fa95862bc27ca27611f2c6a75277ad4a82b3a7d9" Mar 09 13:41:45 crc kubenswrapper[4764]: I0309 13:41:45.315449 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 09 13:41:45 crc kubenswrapper[4764]: I0309 13:41:45.344761 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 09 13:41:45 crc kubenswrapper[4764]: I0309 13:41:45.352552 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Mar 09 13:41:45 crc kubenswrapper[4764]: E0309 13:41:45.353092 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5d65fa53-be02-4d11-b300-5cb4629c03da" containerName="neutron-httpd" Mar 09 13:41:45 crc kubenswrapper[4764]: I0309 13:41:45.353117 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d65fa53-be02-4d11-b300-5cb4629c03da" containerName="neutron-httpd" Mar 09 13:41:45 crc kubenswrapper[4764]: E0309 13:41:45.353131 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5b2b268a-adc9-46ca-908a-d30ab8543059" containerName="init" Mar 09 13:41:45 crc kubenswrapper[4764]: I0309 13:41:45.353138 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="5b2b268a-adc9-46ca-908a-d30ab8543059" containerName="init" Mar 09 13:41:45 crc kubenswrapper[4764]: E0309 13:41:45.353151 4764 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6afeb6a7-a0a0-40de-90ee-97b497663798" containerName="cinder-scheduler" Mar 09 13:41:45 crc kubenswrapper[4764]: I0309 13:41:45.353158 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="6afeb6a7-a0a0-40de-90ee-97b497663798" containerName="cinder-scheduler" Mar 09 13:41:45 crc kubenswrapper[4764]: E0309 13:41:45.353173 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6afeb6a7-a0a0-40de-90ee-97b497663798" containerName="probe" Mar 09 13:41:45 crc kubenswrapper[4764]: I0309 13:41:45.353179 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="6afeb6a7-a0a0-40de-90ee-97b497663798" containerName="probe" Mar 09 13:41:45 crc kubenswrapper[4764]: E0309 13:41:45.353189 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5b2b268a-adc9-46ca-908a-d30ab8543059" containerName="dnsmasq-dns" Mar 09 13:41:45 crc kubenswrapper[4764]: I0309 13:41:45.353195 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="5b2b268a-adc9-46ca-908a-d30ab8543059" containerName="dnsmasq-dns" Mar 09 13:41:45 crc kubenswrapper[4764]: E0309 13:41:45.353205 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5d65fa53-be02-4d11-b300-5cb4629c03da" containerName="neutron-api" Mar 09 13:41:45 crc kubenswrapper[4764]: I0309 13:41:45.353211 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d65fa53-be02-4d11-b300-5cb4629c03da" containerName="neutron-api" Mar 09 13:41:45 crc kubenswrapper[4764]: E0309 13:41:45.353220 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="60a250ee-f349-4bab-b5b4-b402289210a6" containerName="barbican-api" Mar 09 13:41:45 crc kubenswrapper[4764]: I0309 13:41:45.353226 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="60a250ee-f349-4bab-b5b4-b402289210a6" containerName="barbican-api" Mar 09 13:41:45 crc kubenswrapper[4764]: E0309 13:41:45.353245 4764 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="60a250ee-f349-4bab-b5b4-b402289210a6" containerName="barbican-api-log" Mar 09 13:41:45 crc kubenswrapper[4764]: I0309 13:41:45.353252 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="60a250ee-f349-4bab-b5b4-b402289210a6" containerName="barbican-api-log" Mar 09 13:41:45 crc kubenswrapper[4764]: I0309 13:41:45.353450 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="5b2b268a-adc9-46ca-908a-d30ab8543059" containerName="dnsmasq-dns" Mar 09 13:41:45 crc kubenswrapper[4764]: I0309 13:41:45.353475 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="5d65fa53-be02-4d11-b300-5cb4629c03da" containerName="neutron-httpd" Mar 09 13:41:45 crc kubenswrapper[4764]: I0309 13:41:45.353486 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="60a250ee-f349-4bab-b5b4-b402289210a6" containerName="barbican-api" Mar 09 13:41:45 crc kubenswrapper[4764]: I0309 13:41:45.353510 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="5d65fa53-be02-4d11-b300-5cb4629c03da" containerName="neutron-api" Mar 09 13:41:45 crc kubenswrapper[4764]: I0309 13:41:45.353527 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="60a250ee-f349-4bab-b5b4-b402289210a6" containerName="barbican-api-log" Mar 09 13:41:45 crc kubenswrapper[4764]: I0309 13:41:45.353535 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="6afeb6a7-a0a0-40de-90ee-97b497663798" containerName="cinder-scheduler" Mar 09 13:41:45 crc kubenswrapper[4764]: I0309 13:41:45.353543 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="6afeb6a7-a0a0-40de-90ee-97b497663798" containerName="probe" Mar 09 13:41:45 crc kubenswrapper[4764]: I0309 13:41:45.354523 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 09 13:41:45 crc kubenswrapper[4764]: I0309 13:41:45.358771 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 09 13:41:45 crc kubenswrapper[4764]: I0309 13:41:45.413061 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Mar 09 13:41:45 crc kubenswrapper[4764]: I0309 13:41:45.527568 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/05d58314-31c8-4b6a-8c8c-1dc211d9f424-config-data\") pod \"cinder-scheduler-0\" (UID: \"05d58314-31c8-4b6a-8c8c-1dc211d9f424\") " pod="openstack/cinder-scheduler-0" Mar 09 13:41:45 crc kubenswrapper[4764]: I0309 13:41:45.527634 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/05d58314-31c8-4b6a-8c8c-1dc211d9f424-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"05d58314-31c8-4b6a-8c8c-1dc211d9f424\") " pod="openstack/cinder-scheduler-0" Mar 09 13:41:45 crc kubenswrapper[4764]: I0309 13:41:45.527694 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/05d58314-31c8-4b6a-8c8c-1dc211d9f424-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"05d58314-31c8-4b6a-8c8c-1dc211d9f424\") " pod="openstack/cinder-scheduler-0" Mar 09 13:41:45 crc kubenswrapper[4764]: I0309 13:41:45.527720 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/05d58314-31c8-4b6a-8c8c-1dc211d9f424-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"05d58314-31c8-4b6a-8c8c-1dc211d9f424\") " pod="openstack/cinder-scheduler-0" Mar 09 13:41:45 crc kubenswrapper[4764]: I0309 
13:41:45.528124 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qtrq2\" (UniqueName: \"kubernetes.io/projected/05d58314-31c8-4b6a-8c8c-1dc211d9f424-kube-api-access-qtrq2\") pod \"cinder-scheduler-0\" (UID: \"05d58314-31c8-4b6a-8c8c-1dc211d9f424\") " pod="openstack/cinder-scheduler-0" Mar 09 13:41:45 crc kubenswrapper[4764]: I0309 13:41:45.528479 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/05d58314-31c8-4b6a-8c8c-1dc211d9f424-scripts\") pod \"cinder-scheduler-0\" (UID: \"05d58314-31c8-4b6a-8c8c-1dc211d9f424\") " pod="openstack/cinder-scheduler-0" Mar 09 13:41:45 crc kubenswrapper[4764]: I0309 13:41:45.579840 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="60a250ee-f349-4bab-b5b4-b402289210a6" path="/var/lib/kubelet/pods/60a250ee-f349-4bab-b5b4-b402289210a6/volumes" Mar 09 13:41:45 crc kubenswrapper[4764]: I0309 13:41:45.580481 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6afeb6a7-a0a0-40de-90ee-97b497663798" path="/var/lib/kubelet/pods/6afeb6a7-a0a0-40de-90ee-97b497663798/volumes" Mar 09 13:41:45 crc kubenswrapper[4764]: I0309 13:41:45.631000 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/05d58314-31c8-4b6a-8c8c-1dc211d9f424-config-data\") pod \"cinder-scheduler-0\" (UID: \"05d58314-31c8-4b6a-8c8c-1dc211d9f424\") " pod="openstack/cinder-scheduler-0" Mar 09 13:41:45 crc kubenswrapper[4764]: I0309 13:41:45.631810 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/05d58314-31c8-4b6a-8c8c-1dc211d9f424-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"05d58314-31c8-4b6a-8c8c-1dc211d9f424\") " pod="openstack/cinder-scheduler-0" Mar 09 13:41:45 crc kubenswrapper[4764]: 
I0309 13:41:45.631870 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/05d58314-31c8-4b6a-8c8c-1dc211d9f424-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"05d58314-31c8-4b6a-8c8c-1dc211d9f424\") " pod="openstack/cinder-scheduler-0" Mar 09 13:41:45 crc kubenswrapper[4764]: I0309 13:41:45.631901 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/05d58314-31c8-4b6a-8c8c-1dc211d9f424-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"05d58314-31c8-4b6a-8c8c-1dc211d9f424\") " pod="openstack/cinder-scheduler-0" Mar 09 13:41:45 crc kubenswrapper[4764]: I0309 13:41:45.631958 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qtrq2\" (UniqueName: \"kubernetes.io/projected/05d58314-31c8-4b6a-8c8c-1dc211d9f424-kube-api-access-qtrq2\") pod \"cinder-scheduler-0\" (UID: \"05d58314-31c8-4b6a-8c8c-1dc211d9f424\") " pod="openstack/cinder-scheduler-0" Mar 09 13:41:45 crc kubenswrapper[4764]: I0309 13:41:45.632027 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/05d58314-31c8-4b6a-8c8c-1dc211d9f424-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"05d58314-31c8-4b6a-8c8c-1dc211d9f424\") " pod="openstack/cinder-scheduler-0" Mar 09 13:41:45 crc kubenswrapper[4764]: I0309 13:41:45.632040 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/05d58314-31c8-4b6a-8c8c-1dc211d9f424-scripts\") pod \"cinder-scheduler-0\" (UID: \"05d58314-31c8-4b6a-8c8c-1dc211d9f424\") " pod="openstack/cinder-scheduler-0" Mar 09 13:41:45 crc kubenswrapper[4764]: I0309 13:41:45.636153 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Mar 09 13:41:45 crc 
kubenswrapper[4764]: I0309 13:41:45.638898 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/05d58314-31c8-4b6a-8c8c-1dc211d9f424-scripts\") pod \"cinder-scheduler-0\" (UID: \"05d58314-31c8-4b6a-8c8c-1dc211d9f424\") " pod="openstack/cinder-scheduler-0" Mar 09 13:41:45 crc kubenswrapper[4764]: I0309 13:41:45.639178 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/05d58314-31c8-4b6a-8c8c-1dc211d9f424-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"05d58314-31c8-4b6a-8c8c-1dc211d9f424\") " pod="openstack/cinder-scheduler-0" Mar 09 13:41:45 crc kubenswrapper[4764]: I0309 13:41:45.639413 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/05d58314-31c8-4b6a-8c8c-1dc211d9f424-config-data\") pod \"cinder-scheduler-0\" (UID: \"05d58314-31c8-4b6a-8c8c-1dc211d9f424\") " pod="openstack/cinder-scheduler-0" Mar 09 13:41:45 crc kubenswrapper[4764]: I0309 13:41:45.650219 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/05d58314-31c8-4b6a-8c8c-1dc211d9f424-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"05d58314-31c8-4b6a-8c8c-1dc211d9f424\") " pod="openstack/cinder-scheduler-0" Mar 09 13:41:45 crc kubenswrapper[4764]: I0309 13:41:45.650684 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qtrq2\" (UniqueName: \"kubernetes.io/projected/05d58314-31c8-4b6a-8c8c-1dc211d9f424-kube-api-access-qtrq2\") pod \"cinder-scheduler-0\" (UID: \"05d58314-31c8-4b6a-8c8c-1dc211d9f424\") " pod="openstack/cinder-scheduler-0" Mar 09 13:41:45 crc kubenswrapper[4764]: I0309 13:41:45.722101 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 09 13:41:46 crc kubenswrapper[4764]: I0309 13:41:46.323940 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 09 13:41:47 crc kubenswrapper[4764]: I0309 13:41:47.314699 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"05d58314-31c8-4b6a-8c8c-1dc211d9f424","Type":"ContainerStarted","Data":"280763f18ffca699716abeb32eb578440349294752b3cb7f97f829f5b450fa17"} Mar 09 13:41:47 crc kubenswrapper[4764]: I0309 13:41:47.315345 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"05d58314-31c8-4b6a-8c8c-1dc211d9f424","Type":"ContainerStarted","Data":"15ea221d2905458f73538d3f2a43c8a79b9c820ae64821655ca5ba44c22d5dbd"} Mar 09 13:41:48 crc kubenswrapper[4764]: I0309 13:41:48.327005 4764 generic.go:334] "Generic (PLEG): container finished" podID="e89507a7-04a1-444b-b38a-40b001ec079a" containerID="6939747e3ae2dc764ad9ad287190d81cffed4e3b8fef6031a7e72a9769a88873" exitCode=0 Mar 09 13:41:48 crc kubenswrapper[4764]: I0309 13:41:48.327469 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-586d68b4fd-xj4tk" event={"ID":"e89507a7-04a1-444b-b38a-40b001ec079a","Type":"ContainerDied","Data":"6939747e3ae2dc764ad9ad287190d81cffed4e3b8fef6031a7e72a9769a88873"} Mar 09 13:41:48 crc kubenswrapper[4764]: I0309 13:41:48.334056 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"05d58314-31c8-4b6a-8c8c-1dc211d9f424","Type":"ContainerStarted","Data":"749a569e9b8139758a9781531307bf7e1c326dcee90021ec2c57d32ca51f6804"} Mar 09 13:41:48 crc kubenswrapper[4764]: I0309 13:41:48.542281 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-586d68b4fd-xj4tk" Mar 09 13:41:48 crc kubenswrapper[4764]: I0309 13:41:48.579624 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=3.579592025 podStartE2EDuration="3.579592025s" podCreationTimestamp="2026-03-09 13:41:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:41:48.360923184 +0000 UTC m=+1263.611095092" watchObservedRunningTime="2026-03-09 13:41:48.579592025 +0000 UTC m=+1263.829763933" Mar 09 13:41:48 crc kubenswrapper[4764]: I0309 13:41:48.629748 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e89507a7-04a1-444b-b38a-40b001ec079a-combined-ca-bundle\") pod \"e89507a7-04a1-444b-b38a-40b001ec079a\" (UID: \"e89507a7-04a1-444b-b38a-40b001ec079a\") " Mar 09 13:41:48 crc kubenswrapper[4764]: I0309 13:41:48.629831 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e89507a7-04a1-444b-b38a-40b001ec079a-scripts\") pod \"e89507a7-04a1-444b-b38a-40b001ec079a\" (UID: \"e89507a7-04a1-444b-b38a-40b001ec079a\") " Mar 09 13:41:48 crc kubenswrapper[4764]: I0309 13:41:48.629877 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e89507a7-04a1-444b-b38a-40b001ec079a-public-tls-certs\") pod \"e89507a7-04a1-444b-b38a-40b001ec079a\" (UID: \"e89507a7-04a1-444b-b38a-40b001ec079a\") " Mar 09 13:41:48 crc kubenswrapper[4764]: I0309 13:41:48.629950 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j8mnp\" (UniqueName: \"kubernetes.io/projected/e89507a7-04a1-444b-b38a-40b001ec079a-kube-api-access-j8mnp\") pod \"e89507a7-04a1-444b-b38a-40b001ec079a\" 
(UID: \"e89507a7-04a1-444b-b38a-40b001ec079a\") " Mar 09 13:41:48 crc kubenswrapper[4764]: I0309 13:41:48.630007 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e89507a7-04a1-444b-b38a-40b001ec079a-logs\") pod \"e89507a7-04a1-444b-b38a-40b001ec079a\" (UID: \"e89507a7-04a1-444b-b38a-40b001ec079a\") " Mar 09 13:41:48 crc kubenswrapper[4764]: I0309 13:41:48.630091 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e89507a7-04a1-444b-b38a-40b001ec079a-config-data\") pod \"e89507a7-04a1-444b-b38a-40b001ec079a\" (UID: \"e89507a7-04a1-444b-b38a-40b001ec079a\") " Mar 09 13:41:48 crc kubenswrapper[4764]: I0309 13:41:48.630118 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e89507a7-04a1-444b-b38a-40b001ec079a-internal-tls-certs\") pod \"e89507a7-04a1-444b-b38a-40b001ec079a\" (UID: \"e89507a7-04a1-444b-b38a-40b001ec079a\") " Mar 09 13:41:48 crc kubenswrapper[4764]: I0309 13:41:48.634175 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e89507a7-04a1-444b-b38a-40b001ec079a-logs" (OuterVolumeSpecName: "logs") pod "e89507a7-04a1-444b-b38a-40b001ec079a" (UID: "e89507a7-04a1-444b-b38a-40b001ec079a"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 13:41:48 crc kubenswrapper[4764]: I0309 13:41:48.659278 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e89507a7-04a1-444b-b38a-40b001ec079a-kube-api-access-j8mnp" (OuterVolumeSpecName: "kube-api-access-j8mnp") pod "e89507a7-04a1-444b-b38a-40b001ec079a" (UID: "e89507a7-04a1-444b-b38a-40b001ec079a"). InnerVolumeSpecName "kube-api-access-j8mnp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:41:48 crc kubenswrapper[4764]: I0309 13:41:48.659501 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e89507a7-04a1-444b-b38a-40b001ec079a-scripts" (OuterVolumeSpecName: "scripts") pod "e89507a7-04a1-444b-b38a-40b001ec079a" (UID: "e89507a7-04a1-444b-b38a-40b001ec079a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:41:48 crc kubenswrapper[4764]: I0309 13:41:48.733170 4764 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e89507a7-04a1-444b-b38a-40b001ec079a-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 13:41:48 crc kubenswrapper[4764]: I0309 13:41:48.733217 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j8mnp\" (UniqueName: \"kubernetes.io/projected/e89507a7-04a1-444b-b38a-40b001ec079a-kube-api-access-j8mnp\") on node \"crc\" DevicePath \"\"" Mar 09 13:41:48 crc kubenswrapper[4764]: I0309 13:41:48.733232 4764 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e89507a7-04a1-444b-b38a-40b001ec079a-logs\") on node \"crc\" DevicePath \"\"" Mar 09 13:41:48 crc kubenswrapper[4764]: I0309 13:41:48.756042 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e89507a7-04a1-444b-b38a-40b001ec079a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e89507a7-04a1-444b-b38a-40b001ec079a" (UID: "e89507a7-04a1-444b-b38a-40b001ec079a"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:41:48 crc kubenswrapper[4764]: I0309 13:41:48.763827 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e89507a7-04a1-444b-b38a-40b001ec079a-config-data" (OuterVolumeSpecName: "config-data") pod "e89507a7-04a1-444b-b38a-40b001ec079a" (UID: "e89507a7-04a1-444b-b38a-40b001ec079a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:41:48 crc kubenswrapper[4764]: I0309 13:41:48.799547 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e89507a7-04a1-444b-b38a-40b001ec079a-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "e89507a7-04a1-444b-b38a-40b001ec079a" (UID: "e89507a7-04a1-444b-b38a-40b001ec079a"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:41:48 crc kubenswrapper[4764]: I0309 13:41:48.803844 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e89507a7-04a1-444b-b38a-40b001ec079a-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "e89507a7-04a1-444b-b38a-40b001ec079a" (UID: "e89507a7-04a1-444b-b38a-40b001ec079a"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 13:41:48 crc kubenswrapper[4764]: I0309 13:41:48.835212 4764 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e89507a7-04a1-444b-b38a-40b001ec079a-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 09 13:41:48 crc kubenswrapper[4764]: I0309 13:41:48.835262 4764 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e89507a7-04a1-444b-b38a-40b001ec079a-public-tls-certs\") on node \"crc\" DevicePath \"\""
Mar 09 13:41:48 crc kubenswrapper[4764]: I0309 13:41:48.835277 4764 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e89507a7-04a1-444b-b38a-40b001ec079a-config-data\") on node \"crc\" DevicePath \"\""
Mar 09 13:41:48 crc kubenswrapper[4764]: I0309 13:41:48.835289 4764 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e89507a7-04a1-444b-b38a-40b001ec079a-internal-tls-certs\") on node \"crc\" DevicePath \"\""
Mar 09 13:41:49 crc kubenswrapper[4764]: I0309 13:41:49.270376 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"]
Mar 09 13:41:49 crc kubenswrapper[4764]: E0309 13:41:49.271310 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e89507a7-04a1-444b-b38a-40b001ec079a" containerName="placement-log"
Mar 09 13:41:49 crc kubenswrapper[4764]: I0309 13:41:49.271332 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="e89507a7-04a1-444b-b38a-40b001ec079a" containerName="placement-log"
Mar 09 13:41:49 crc kubenswrapper[4764]: E0309 13:41:49.271347 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e89507a7-04a1-444b-b38a-40b001ec079a" containerName="placement-api"
Mar 09 13:41:49 crc kubenswrapper[4764]: I0309 13:41:49.271356 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="e89507a7-04a1-444b-b38a-40b001ec079a" containerName="placement-api"
Mar 09 13:41:49 crc kubenswrapper[4764]: I0309 13:41:49.271593 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="e89507a7-04a1-444b-b38a-40b001ec079a" containerName="placement-api"
Mar 09 13:41:49 crc kubenswrapper[4764]: I0309 13:41:49.271630 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="e89507a7-04a1-444b-b38a-40b001ec079a" containerName="placement-log"
Mar 09 13:41:49 crc kubenswrapper[4764]: I0309 13:41:49.272356 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient"
Mar 09 13:41:49 crc kubenswrapper[4764]: I0309 13:41:49.278771 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config"
Mar 09 13:41:49 crc kubenswrapper[4764]: I0309 13:41:49.278770 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret"
Mar 09 13:41:49 crc kubenswrapper[4764]: I0309 13:41:49.279154 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-p9c7c"
Mar 09 13:41:49 crc kubenswrapper[4764]: I0309 13:41:49.294451 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"]
Mar 09 13:41:49 crc kubenswrapper[4764]: I0309 13:41:49.356033 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/d82ed357-9f4c-478b-b893-ab6ff10fc83c-openstack-config-secret\") pod \"openstackclient\" (UID: \"d82ed357-9f4c-478b-b893-ab6ff10fc83c\") " pod="openstack/openstackclient"
Mar 09 13:41:49 crc kubenswrapper[4764]: I0309 13:41:49.356098 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/d82ed357-9f4c-478b-b893-ab6ff10fc83c-openstack-config\") pod \"openstackclient\" (UID: \"d82ed357-9f4c-478b-b893-ab6ff10fc83c\") " pod="openstack/openstackclient"
Mar 09 13:41:49 crc kubenswrapper[4764]: I0309 13:41:49.356169 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d82ed357-9f4c-478b-b893-ab6ff10fc83c-combined-ca-bundle\") pod \"openstackclient\" (UID: \"d82ed357-9f4c-478b-b893-ab6ff10fc83c\") " pod="openstack/openstackclient"
Mar 09 13:41:49 crc kubenswrapper[4764]: I0309 13:41:49.356281 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v458d\" (UniqueName: \"kubernetes.io/projected/d82ed357-9f4c-478b-b893-ab6ff10fc83c-kube-api-access-v458d\") pod \"openstackclient\" (UID: \"d82ed357-9f4c-478b-b893-ab6ff10fc83c\") " pod="openstack/openstackclient"
Mar 09 13:41:49 crc kubenswrapper[4764]: I0309 13:41:49.384746 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-586d68b4fd-xj4tk"
Mar 09 13:41:49 crc kubenswrapper[4764]: I0309 13:41:49.384739 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-586d68b4fd-xj4tk" event={"ID":"e89507a7-04a1-444b-b38a-40b001ec079a","Type":"ContainerDied","Data":"521edccf936d07531fd9f618513cede620e92cebf6edb91c3eb8e6d1d0940b40"}
Mar 09 13:41:49 crc kubenswrapper[4764]: I0309 13:41:49.384850 4764 scope.go:117] "RemoveContainer" containerID="6939747e3ae2dc764ad9ad287190d81cffed4e3b8fef6031a7e72a9769a88873"
Mar 09 13:41:49 crc kubenswrapper[4764]: I0309 13:41:49.457854 4764 scope.go:117] "RemoveContainer" containerID="846d1c4f6ed0185cda05e320aba446c1f9b2558f2a5da71ef147be109e27b726"
Mar 09 13:41:49 crc kubenswrapper[4764]: I0309 13:41:49.460328 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/d82ed357-9f4c-478b-b893-ab6ff10fc83c-openstack-config-secret\") pod \"openstackclient\" (UID: \"d82ed357-9f4c-478b-b893-ab6ff10fc83c\") " pod="openstack/openstackclient"
Mar 09 13:41:49 crc kubenswrapper[4764]: I0309 13:41:49.460392 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/d82ed357-9f4c-478b-b893-ab6ff10fc83c-openstack-config\") pod \"openstackclient\" (UID: \"d82ed357-9f4c-478b-b893-ab6ff10fc83c\") " pod="openstack/openstackclient"
Mar 09 13:41:49 crc kubenswrapper[4764]: I0309 13:41:49.460454 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d82ed357-9f4c-478b-b893-ab6ff10fc83c-combined-ca-bundle\") pod \"openstackclient\" (UID: \"d82ed357-9f4c-478b-b893-ab6ff10fc83c\") " pod="openstack/openstackclient"
Mar 09 13:41:49 crc kubenswrapper[4764]: I0309 13:41:49.460527 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v458d\" (UniqueName: \"kubernetes.io/projected/d82ed357-9f4c-478b-b893-ab6ff10fc83c-kube-api-access-v458d\") pod \"openstackclient\" (UID: \"d82ed357-9f4c-478b-b893-ab6ff10fc83c\") " pod="openstack/openstackclient"
Mar 09 13:41:49 crc kubenswrapper[4764]: I0309 13:41:49.463548 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/d82ed357-9f4c-478b-b893-ab6ff10fc83c-openstack-config\") pod \"openstackclient\" (UID: \"d82ed357-9f4c-478b-b893-ab6ff10fc83c\") " pod="openstack/openstackclient"
Mar 09 13:41:49 crc kubenswrapper[4764]: I0309 13:41:49.483699 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-586d68b4fd-xj4tk"]
Mar 09 13:41:49 crc kubenswrapper[4764]: I0309 13:41:49.485684 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/d82ed357-9f4c-478b-b893-ab6ff10fc83c-openstack-config-secret\") pod \"openstackclient\" (UID: \"d82ed357-9f4c-478b-b893-ab6ff10fc83c\") " pod="openstack/openstackclient"
Mar 09 13:41:49 crc kubenswrapper[4764]: I0309 13:41:49.489303 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d82ed357-9f4c-478b-b893-ab6ff10fc83c-combined-ca-bundle\") pod \"openstackclient\" (UID: \"d82ed357-9f4c-478b-b893-ab6ff10fc83c\") " pod="openstack/openstackclient"
Mar 09 13:41:49 crc kubenswrapper[4764]: I0309 13:41:49.490699 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v458d\" (UniqueName: \"kubernetes.io/projected/d82ed357-9f4c-478b-b893-ab6ff10fc83c-kube-api-access-v458d\") pod \"openstackclient\" (UID: \"d82ed357-9f4c-478b-b893-ab6ff10fc83c\") " pod="openstack/openstackclient"
Mar 09 13:41:49 crc kubenswrapper[4764]: I0309 13:41:49.518580 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-586d68b4fd-xj4tk"]
Mar 09 13:41:49 crc kubenswrapper[4764]: I0309 13:41:49.591899 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient"
Mar 09 13:41:49 crc kubenswrapper[4764]: I0309 13:41:49.604457 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e89507a7-04a1-444b-b38a-40b001ec079a" path="/var/lib/kubelet/pods/e89507a7-04a1-444b-b38a-40b001ec079a/volumes"
Mar 09 13:41:50 crc kubenswrapper[4764]: I0309 13:41:50.121946 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"]
Mar 09 13:41:50 crc kubenswrapper[4764]: W0309 13:41:50.124296 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd82ed357_9f4c_478b_b893_ab6ff10fc83c.slice/crio-b37961ab13817fdc9aaa7d3640d627039046c5872124cc4eb3d499e3dd38bec3 WatchSource:0}: Error finding container b37961ab13817fdc9aaa7d3640d627039046c5872124cc4eb3d499e3dd38bec3: Status 404 returned error can't find the container with id b37961ab13817fdc9aaa7d3640d627039046c5872124cc4eb3d499e3dd38bec3
Mar 09 13:41:50 crc kubenswrapper[4764]: I0309 13:41:50.321328 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0"
Mar 09 13:41:50 crc kubenswrapper[4764]: I0309 13:41:50.398486 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"d82ed357-9f4c-478b-b893-ab6ff10fc83c","Type":"ContainerStarted","Data":"b37961ab13817fdc9aaa7d3640d627039046c5872124cc4eb3d499e3dd38bec3"}
Mar 09 13:41:50 crc kubenswrapper[4764]: I0309 13:41:50.722785 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0"
Mar 09 13:41:55 crc kubenswrapper[4764]: I0309 13:41:55.999041 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0"
Mar 09 13:41:57 crc kubenswrapper[4764]: I0309 13:41:57.319333 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0"
Mar 09 13:41:58 crc kubenswrapper[4764]: I0309 13:41:58.348795 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Mar 09 13:41:58 crc kubenswrapper[4764]: I0309 13:41:58.349233 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="df5caebe-ad05-48b0-bbce-ecb2ec29e7c3" containerName="proxy-httpd" containerID="cri-o://8e00a9b64763a83494a83ac4067c14a67cca49b0a2f0a325e393f9400b39892b" gracePeriod=30
Mar 09 13:41:58 crc kubenswrapper[4764]: I0309 13:41:58.349306 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="df5caebe-ad05-48b0-bbce-ecb2ec29e7c3" containerName="ceilometer-notification-agent" containerID="cri-o://1e8bef222d882b21a7ec25aed9429545abfc2c7a862b52718f01f100ae12b17d" gracePeriod=30
Mar 09 13:41:58 crc kubenswrapper[4764]: I0309 13:41:58.349398 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="df5caebe-ad05-48b0-bbce-ecb2ec29e7c3" containerName="sg-core" containerID="cri-o://4fe2306e68853bfa3180ad2e5295480a2103723960a9860736e8ad880fc1db61" gracePeriod=30
Mar 09 13:41:58 crc kubenswrapper[4764]: I0309 13:41:58.349506 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="df5caebe-ad05-48b0-bbce-ecb2ec29e7c3" containerName="ceilometer-central-agent" containerID="cri-o://d86adcecb94ef745afb424352583b984c64f486d6d2dfabedad259db0634d772" gracePeriod=30
Mar 09 13:41:58 crc kubenswrapper[4764]: I0309 13:41:58.494114 4764 generic.go:334] "Generic (PLEG): container finished" podID="df5caebe-ad05-48b0-bbce-ecb2ec29e7c3" containerID="4fe2306e68853bfa3180ad2e5295480a2103723960a9860736e8ad880fc1db61" exitCode=2
Mar 09 13:41:58 crc kubenswrapper[4764]: I0309 13:41:58.494208 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"df5caebe-ad05-48b0-bbce-ecb2ec29e7c3","Type":"ContainerDied","Data":"4fe2306e68853bfa3180ad2e5295480a2103723960a9860736e8ad880fc1db61"}
Mar 09 13:41:59 crc kubenswrapper[4764]: I0309 13:41:59.529667 4764 generic.go:334] "Generic (PLEG): container finished" podID="df5caebe-ad05-48b0-bbce-ecb2ec29e7c3" containerID="8e00a9b64763a83494a83ac4067c14a67cca49b0a2f0a325e393f9400b39892b" exitCode=0
Mar 09 13:41:59 crc kubenswrapper[4764]: I0309 13:41:59.529710 4764 generic.go:334] "Generic (PLEG): container finished" podID="df5caebe-ad05-48b0-bbce-ecb2ec29e7c3" containerID="1e8bef222d882b21a7ec25aed9429545abfc2c7a862b52718f01f100ae12b17d" exitCode=0
Mar 09 13:41:59 crc kubenswrapper[4764]: I0309 13:41:59.529723 4764 generic.go:334] "Generic (PLEG): container finished" podID="df5caebe-ad05-48b0-bbce-ecb2ec29e7c3" containerID="d86adcecb94ef745afb424352583b984c64f486d6d2dfabedad259db0634d772" exitCode=0
Mar 09 13:41:59 crc kubenswrapper[4764]: I0309 13:41:59.529750 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"df5caebe-ad05-48b0-bbce-ecb2ec29e7c3","Type":"ContainerDied","Data":"8e00a9b64763a83494a83ac4067c14a67cca49b0a2f0a325e393f9400b39892b"}
Mar 09 13:41:59 crc kubenswrapper[4764]: I0309 13:41:59.529788 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"df5caebe-ad05-48b0-bbce-ecb2ec29e7c3","Type":"ContainerDied","Data":"1e8bef222d882b21a7ec25aed9429545abfc2c7a862b52718f01f100ae12b17d"}
Mar 09 13:41:59 crc kubenswrapper[4764]: I0309 13:41:59.529805 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"df5caebe-ad05-48b0-bbce-ecb2ec29e7c3","Type":"ContainerDied","Data":"d86adcecb94ef745afb424352583b984c64f486d6d2dfabedad259db0634d772"}
Mar 09 13:42:00 crc kubenswrapper[4764]: I0309 13:42:00.145717 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29551062-wl7gf"]
Mar 09 13:42:00 crc kubenswrapper[4764]: I0309 13:42:00.155011 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551062-wl7gf"
Mar 09 13:42:00 crc kubenswrapper[4764]: I0309 13:42:00.160299 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 09 13:42:00 crc kubenswrapper[4764]: I0309 13:42:00.160321 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 09 13:42:00 crc kubenswrapper[4764]: I0309 13:42:00.160563 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-4s7mr"
Mar 09 13:42:00 crc kubenswrapper[4764]: I0309 13:42:00.164111 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551062-wl7gf"]
Mar 09 13:42:00 crc kubenswrapper[4764]: I0309 13:42:00.258147 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7mcfc\" (UniqueName: \"kubernetes.io/projected/61f41fcc-bb74-4c90-a6af-bfcd168ef2cb-kube-api-access-7mcfc\") pod \"auto-csr-approver-29551062-wl7gf\" (UID: \"61f41fcc-bb74-4c90-a6af-bfcd168ef2cb\") " pod="openshift-infra/auto-csr-approver-29551062-wl7gf"
Mar 09 13:42:00 crc kubenswrapper[4764]: I0309 13:42:00.360600 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7mcfc\" (UniqueName: \"kubernetes.io/projected/61f41fcc-bb74-4c90-a6af-bfcd168ef2cb-kube-api-access-7mcfc\") pod \"auto-csr-approver-29551062-wl7gf\" (UID: \"61f41fcc-bb74-4c90-a6af-bfcd168ef2cb\") " pod="openshift-infra/auto-csr-approver-29551062-wl7gf"
Mar 09 13:42:00 crc kubenswrapper[4764]: I0309 13:42:00.399042 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7mcfc\" (UniqueName: \"kubernetes.io/projected/61f41fcc-bb74-4c90-a6af-bfcd168ef2cb-kube-api-access-7mcfc\") pod \"auto-csr-approver-29551062-wl7gf\" (UID: \"61f41fcc-bb74-4c90-a6af-bfcd168ef2cb\") " pod="openshift-infra/auto-csr-approver-29551062-wl7gf"
Mar 09 13:42:00 crc kubenswrapper[4764]: I0309 13:42:00.478311 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551062-wl7gf"
Mar 09 13:42:00 crc kubenswrapper[4764]: I0309 13:42:00.729782 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 09 13:42:00 crc kubenswrapper[4764]: I0309 13:42:00.769300 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/df5caebe-ad05-48b0-bbce-ecb2ec29e7c3-scripts\") pod \"df5caebe-ad05-48b0-bbce-ecb2ec29e7c3\" (UID: \"df5caebe-ad05-48b0-bbce-ecb2ec29e7c3\") "
Mar 09 13:42:00 crc kubenswrapper[4764]: I0309 13:42:00.769362 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/df5caebe-ad05-48b0-bbce-ecb2ec29e7c3-config-data\") pod \"df5caebe-ad05-48b0-bbce-ecb2ec29e7c3\" (UID: \"df5caebe-ad05-48b0-bbce-ecb2ec29e7c3\") "
Mar 09 13:42:00 crc kubenswrapper[4764]: I0309 13:42:00.769466 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qpxx4\" (UniqueName: \"kubernetes.io/projected/df5caebe-ad05-48b0-bbce-ecb2ec29e7c3-kube-api-access-qpxx4\") pod \"df5caebe-ad05-48b0-bbce-ecb2ec29e7c3\" (UID: \"df5caebe-ad05-48b0-bbce-ecb2ec29e7c3\") "
Mar 09 13:42:00 crc kubenswrapper[4764]: I0309 13:42:00.769512 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/df5caebe-ad05-48b0-bbce-ecb2ec29e7c3-run-httpd\") pod \"df5caebe-ad05-48b0-bbce-ecb2ec29e7c3\" (UID: \"df5caebe-ad05-48b0-bbce-ecb2ec29e7c3\") "
Mar 09 13:42:00 crc kubenswrapper[4764]: I0309 13:42:00.769551 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/df5caebe-ad05-48b0-bbce-ecb2ec29e7c3-log-httpd\") pod \"df5caebe-ad05-48b0-bbce-ecb2ec29e7c3\" (UID: \"df5caebe-ad05-48b0-bbce-ecb2ec29e7c3\") "
Mar 09 13:42:00 crc kubenswrapper[4764]: I0309 13:42:00.769718 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/df5caebe-ad05-48b0-bbce-ecb2ec29e7c3-sg-core-conf-yaml\") pod \"df5caebe-ad05-48b0-bbce-ecb2ec29e7c3\" (UID: \"df5caebe-ad05-48b0-bbce-ecb2ec29e7c3\") "
Mar 09 13:42:00 crc kubenswrapper[4764]: I0309 13:42:00.769742 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df5caebe-ad05-48b0-bbce-ecb2ec29e7c3-combined-ca-bundle\") pod \"df5caebe-ad05-48b0-bbce-ecb2ec29e7c3\" (UID: \"df5caebe-ad05-48b0-bbce-ecb2ec29e7c3\") "
Mar 09 13:42:00 crc kubenswrapper[4764]: I0309 13:42:00.772531 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/df5caebe-ad05-48b0-bbce-ecb2ec29e7c3-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "df5caebe-ad05-48b0-bbce-ecb2ec29e7c3" (UID: "df5caebe-ad05-48b0-bbce-ecb2ec29e7c3"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 09 13:42:00 crc kubenswrapper[4764]: I0309 13:42:00.773668 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/df5caebe-ad05-48b0-bbce-ecb2ec29e7c3-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "df5caebe-ad05-48b0-bbce-ecb2ec29e7c3" (UID: "df5caebe-ad05-48b0-bbce-ecb2ec29e7c3"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 09 13:42:00 crc kubenswrapper[4764]: I0309 13:42:00.776664 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/df5caebe-ad05-48b0-bbce-ecb2ec29e7c3-kube-api-access-qpxx4" (OuterVolumeSpecName: "kube-api-access-qpxx4") pod "df5caebe-ad05-48b0-bbce-ecb2ec29e7c3" (UID: "df5caebe-ad05-48b0-bbce-ecb2ec29e7c3"). InnerVolumeSpecName "kube-api-access-qpxx4". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 13:42:00 crc kubenswrapper[4764]: I0309 13:42:00.777061 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/df5caebe-ad05-48b0-bbce-ecb2ec29e7c3-scripts" (OuterVolumeSpecName: "scripts") pod "df5caebe-ad05-48b0-bbce-ecb2ec29e7c3" (UID: "df5caebe-ad05-48b0-bbce-ecb2ec29e7c3"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 13:42:00 crc kubenswrapper[4764]: I0309 13:42:00.802120 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/df5caebe-ad05-48b0-bbce-ecb2ec29e7c3-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "df5caebe-ad05-48b0-bbce-ecb2ec29e7c3" (UID: "df5caebe-ad05-48b0-bbce-ecb2ec29e7c3"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 13:42:00 crc kubenswrapper[4764]: I0309 13:42:00.870719 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/df5caebe-ad05-48b0-bbce-ecb2ec29e7c3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "df5caebe-ad05-48b0-bbce-ecb2ec29e7c3" (UID: "df5caebe-ad05-48b0-bbce-ecb2ec29e7c3"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 13:42:00 crc kubenswrapper[4764]: I0309 13:42:00.871950 4764 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/df5caebe-ad05-48b0-bbce-ecb2ec29e7c3-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\""
Mar 09 13:42:00 crc kubenswrapper[4764]: I0309 13:42:00.871976 4764 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df5caebe-ad05-48b0-bbce-ecb2ec29e7c3-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 09 13:42:00 crc kubenswrapper[4764]: I0309 13:42:00.871988 4764 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/df5caebe-ad05-48b0-bbce-ecb2ec29e7c3-scripts\") on node \"crc\" DevicePath \"\""
Mar 09 13:42:00 crc kubenswrapper[4764]: I0309 13:42:00.872000 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qpxx4\" (UniqueName: \"kubernetes.io/projected/df5caebe-ad05-48b0-bbce-ecb2ec29e7c3-kube-api-access-qpxx4\") on node \"crc\" DevicePath \"\""
Mar 09 13:42:00 crc kubenswrapper[4764]: I0309 13:42:00.872011 4764 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/df5caebe-ad05-48b0-bbce-ecb2ec29e7c3-run-httpd\") on node \"crc\" DevicePath \"\""
Mar 09 13:42:00 crc kubenswrapper[4764]: I0309 13:42:00.872019 4764 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/df5caebe-ad05-48b0-bbce-ecb2ec29e7c3-log-httpd\") on node \"crc\" DevicePath \"\""
Mar 09 13:42:00 crc kubenswrapper[4764]: I0309 13:42:00.910534 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/df5caebe-ad05-48b0-bbce-ecb2ec29e7c3-config-data" (OuterVolumeSpecName: "config-data") pod "df5caebe-ad05-48b0-bbce-ecb2ec29e7c3" (UID: "df5caebe-ad05-48b0-bbce-ecb2ec29e7c3"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 13:42:00 crc kubenswrapper[4764]: I0309 13:42:00.973487 4764 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/df5caebe-ad05-48b0-bbce-ecb2ec29e7c3-config-data\") on node \"crc\" DevicePath \"\""
Mar 09 13:42:01 crc kubenswrapper[4764]: I0309 13:42:01.036168 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551062-wl7gf"]
Mar 09 13:42:01 crc kubenswrapper[4764]: W0309 13:42:01.042992 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod61f41fcc_bb74_4c90_a6af_bfcd168ef2cb.slice/crio-eb63554a3776e0f69aa0f087de4b9957a83350b60020c393086a1cd637f2cf8d WatchSource:0}: Error finding container eb63554a3776e0f69aa0f087de4b9957a83350b60020c393086a1cd637f2cf8d: Status 404 returned error can't find the container with id eb63554a3776e0f69aa0f087de4b9957a83350b60020c393086a1cd637f2cf8d
Mar 09 13:42:01 crc kubenswrapper[4764]: I0309 13:42:01.549810 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"d82ed357-9f4c-478b-b893-ab6ff10fc83c","Type":"ContainerStarted","Data":"37dc3e9342a9a8a18b3bbe6377f05fd5914b7ef1ca66c80903cc720041e16747"}
Mar 09 13:42:01 crc kubenswrapper[4764]: I0309 13:42:01.551357 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551062-wl7gf" event={"ID":"61f41fcc-bb74-4c90-a6af-bfcd168ef2cb","Type":"ContainerStarted","Data":"eb63554a3776e0f69aa0f087de4b9957a83350b60020c393086a1cd637f2cf8d"}
Mar 09 13:42:01 crc kubenswrapper[4764]: I0309 13:42:01.556066 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"df5caebe-ad05-48b0-bbce-ecb2ec29e7c3","Type":"ContainerDied","Data":"a9199722fbed2e4a6e2a95c99d206366b5d4467326eba28ca7c133400c787e70"}
Mar 09 13:42:01 crc kubenswrapper[4764]: I0309 13:42:01.556127 4764 scope.go:117] "RemoveContainer" containerID="8e00a9b64763a83494a83ac4067c14a67cca49b0a2f0a325e393f9400b39892b"
Mar 09 13:42:01 crc kubenswrapper[4764]: I0309 13:42:01.556333 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 09 13:42:01 crc kubenswrapper[4764]: I0309 13:42:01.593033 4764 scope.go:117] "RemoveContainer" containerID="4fe2306e68853bfa3180ad2e5295480a2103723960a9860736e8ad880fc1db61"
Mar 09 13:42:01 crc kubenswrapper[4764]: I0309 13:42:01.593508 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=2.411367263 podStartE2EDuration="12.593477667s" podCreationTimestamp="2026-03-09 13:41:49 +0000 UTC" firstStartedPulling="2026-03-09 13:41:50.126950229 +0000 UTC m=+1265.377122137" lastFinishedPulling="2026-03-09 13:42:00.309060643 +0000 UTC m=+1275.559232541" observedRunningTime="2026-03-09 13:42:01.56807716 +0000 UTC m=+1276.818249078" watchObservedRunningTime="2026-03-09 13:42:01.593477667 +0000 UTC m=+1276.843649575"
Mar 09 13:42:01 crc kubenswrapper[4764]: I0309 13:42:01.645056 4764 scope.go:117] "RemoveContainer" containerID="1e8bef222d882b21a7ec25aed9429545abfc2c7a862b52718f01f100ae12b17d"
Mar 09 13:42:01 crc kubenswrapper[4764]: I0309 13:42:01.646293 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Mar 09 13:42:01 crc kubenswrapper[4764]: I0309 13:42:01.661920 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"]
Mar 09 13:42:01 crc kubenswrapper[4764]: I0309 13:42:01.673202 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Mar 09 13:42:01 crc kubenswrapper[4764]: E0309 13:42:01.673627 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="df5caebe-ad05-48b0-bbce-ecb2ec29e7c3" containerName="ceilometer-central-agent"
Mar 09 13:42:01 crc kubenswrapper[4764]: I0309 13:42:01.673664 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="df5caebe-ad05-48b0-bbce-ecb2ec29e7c3" containerName="ceilometer-central-agent"
Mar 09 13:42:01 crc kubenswrapper[4764]: E0309 13:42:01.673693 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="df5caebe-ad05-48b0-bbce-ecb2ec29e7c3" containerName="sg-core"
Mar 09 13:42:01 crc kubenswrapper[4764]: I0309 13:42:01.673699 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="df5caebe-ad05-48b0-bbce-ecb2ec29e7c3" containerName="sg-core"
Mar 09 13:42:01 crc kubenswrapper[4764]: E0309 13:42:01.673707 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="df5caebe-ad05-48b0-bbce-ecb2ec29e7c3" containerName="proxy-httpd"
Mar 09 13:42:01 crc kubenswrapper[4764]: I0309 13:42:01.673713 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="df5caebe-ad05-48b0-bbce-ecb2ec29e7c3" containerName="proxy-httpd"
Mar 09 13:42:01 crc kubenswrapper[4764]: E0309 13:42:01.673723 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="df5caebe-ad05-48b0-bbce-ecb2ec29e7c3" containerName="ceilometer-notification-agent"
Mar 09 13:42:01 crc kubenswrapper[4764]: I0309 13:42:01.673729 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="df5caebe-ad05-48b0-bbce-ecb2ec29e7c3" containerName="ceilometer-notification-agent"
Mar 09 13:42:01 crc kubenswrapper[4764]: I0309 13:42:01.673880 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="df5caebe-ad05-48b0-bbce-ecb2ec29e7c3" containerName="ceilometer-notification-agent"
Mar 09 13:42:01 crc kubenswrapper[4764]: I0309 13:42:01.673897 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="df5caebe-ad05-48b0-bbce-ecb2ec29e7c3" containerName="ceilometer-central-agent"
Mar 09 13:42:01 crc kubenswrapper[4764]: I0309 13:42:01.673909 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="df5caebe-ad05-48b0-bbce-ecb2ec29e7c3" containerName="proxy-httpd"
Mar 09 13:42:01 crc kubenswrapper[4764]: I0309 13:42:01.673920 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="df5caebe-ad05-48b0-bbce-ecb2ec29e7c3" containerName="sg-core"
Mar 09 13:42:01 crc kubenswrapper[4764]: I0309 13:42:01.675595 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 09 13:42:01 crc kubenswrapper[4764]: I0309 13:42:01.682029 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Mar 09 13:42:01 crc kubenswrapper[4764]: I0309 13:42:01.688662 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Mar 09 13:42:01 crc kubenswrapper[4764]: I0309 13:42:01.688692 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Mar 09 13:42:01 crc kubenswrapper[4764]: I0309 13:42:01.703137 4764 scope.go:117] "RemoveContainer" containerID="d86adcecb94ef745afb424352583b984c64f486d6d2dfabedad259db0634d772"
Mar 09 13:42:01 crc kubenswrapper[4764]: I0309 13:42:01.709827 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4cafd43e-a12e-46ee-8108-8e33d10c47ee-log-httpd\") pod \"ceilometer-0\" (UID: \"4cafd43e-a12e-46ee-8108-8e33d10c47ee\") " pod="openstack/ceilometer-0"
Mar 09 13:42:01 crc kubenswrapper[4764]: I0309 13:42:01.709871 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4cafd43e-a12e-46ee-8108-8e33d10c47ee-scripts\") pod \"ceilometer-0\" (UID: \"4cafd43e-a12e-46ee-8108-8e33d10c47ee\") " pod="openstack/ceilometer-0"
Mar 09 13:42:01 crc kubenswrapper[4764]: I0309 13:42:01.709900 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4cafd43e-a12e-46ee-8108-8e33d10c47ee-config-data\") pod \"ceilometer-0\" (UID: \"4cafd43e-a12e-46ee-8108-8e33d10c47ee\") " pod="openstack/ceilometer-0"
Mar 09 13:42:01 crc kubenswrapper[4764]: I0309 13:42:01.709917 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4cafd43e-a12e-46ee-8108-8e33d10c47ee-run-httpd\") pod \"ceilometer-0\" (UID: \"4cafd43e-a12e-46ee-8108-8e33d10c47ee\") " pod="openstack/ceilometer-0"
Mar 09 13:42:01 crc kubenswrapper[4764]: I0309 13:42:01.709951 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4cafd43e-a12e-46ee-8108-8e33d10c47ee-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"4cafd43e-a12e-46ee-8108-8e33d10c47ee\") " pod="openstack/ceilometer-0"
Mar 09 13:42:01 crc kubenswrapper[4764]: I0309 13:42:01.709991 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4cafd43e-a12e-46ee-8108-8e33d10c47ee-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"4cafd43e-a12e-46ee-8108-8e33d10c47ee\") " pod="openstack/ceilometer-0"
Mar 09 13:42:01 crc kubenswrapper[4764]: I0309 13:42:01.710052 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x677c\" (UniqueName: \"kubernetes.io/projected/4cafd43e-a12e-46ee-8108-8e33d10c47ee-kube-api-access-x677c\") pod \"ceilometer-0\" (UID: \"4cafd43e-a12e-46ee-8108-8e33d10c47ee\") " pod="openstack/ceilometer-0"
Mar 09 13:42:01 crc kubenswrapper[4764]: I0309 13:42:01.812554 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4cafd43e-a12e-46ee-8108-8e33d10c47ee-log-httpd\") pod \"ceilometer-0\" (UID: \"4cafd43e-a12e-46ee-8108-8e33d10c47ee\") " pod="openstack/ceilometer-0"
Mar 09 13:42:01 crc kubenswrapper[4764]: I0309 13:42:01.812660 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4cafd43e-a12e-46ee-8108-8e33d10c47ee-scripts\") pod \"ceilometer-0\" (UID: \"4cafd43e-a12e-46ee-8108-8e33d10c47ee\") " pod="openstack/ceilometer-0"
Mar 09 13:42:01 crc kubenswrapper[4764]: I0309 13:42:01.812722 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4cafd43e-a12e-46ee-8108-8e33d10c47ee-config-data\") pod \"ceilometer-0\" (UID: \"4cafd43e-a12e-46ee-8108-8e33d10c47ee\") " pod="openstack/ceilometer-0"
Mar 09 13:42:01 crc kubenswrapper[4764]: I0309 13:42:01.812757 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4cafd43e-a12e-46ee-8108-8e33d10c47ee-run-httpd\") pod \"ceilometer-0\" (UID: \"4cafd43e-a12e-46ee-8108-8e33d10c47ee\") " pod="openstack/ceilometer-0"
Mar 09 13:42:01 crc kubenswrapper[4764]: I0309 13:42:01.813292 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4cafd43e-a12e-46ee-8108-8e33d10c47ee-log-httpd\") pod \"ceilometer-0\" (UID: \"4cafd43e-a12e-46ee-8108-8e33d10c47ee\") " pod="openstack/ceilometer-0"
Mar 09 13:42:01 crc kubenswrapper[4764]: I0309 13:42:01.813353 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4cafd43e-a12e-46ee-8108-8e33d10c47ee-run-httpd\") pod \"ceilometer-0\" (UID: \"4cafd43e-a12e-46ee-8108-8e33d10c47ee\") " pod="openstack/ceilometer-0"
Mar 09 13:42:01 crc kubenswrapper[4764]: I0309 13:42:01.813413 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4cafd43e-a12e-46ee-8108-8e33d10c47ee-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"4cafd43e-a12e-46ee-8108-8e33d10c47ee\") " pod="openstack/ceilometer-0"
Mar 09 13:42:01 crc kubenswrapper[4764]: I0309 13:42:01.813554 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4cafd43e-a12e-46ee-8108-8e33d10c47ee-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"4cafd43e-a12e-46ee-8108-8e33d10c47ee\") " pod="openstack/ceilometer-0"
Mar 09 13:42:01 crc kubenswrapper[4764]: I0309 13:42:01.813638 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x677c\" (UniqueName: \"kubernetes.io/projected/4cafd43e-a12e-46ee-8108-8e33d10c47ee-kube-api-access-x677c\") pod \"ceilometer-0\" (UID: \"4cafd43e-a12e-46ee-8108-8e33d10c47ee\") " pod="openstack/ceilometer-0"
Mar 09 13:42:01 crc kubenswrapper[4764]: I0309 13:42:01.819777 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4cafd43e-a12e-46ee-8108-8e33d10c47ee-scripts\") pod \"ceilometer-0\" (UID: \"4cafd43e-a12e-46ee-8108-8e33d10c47ee\") " pod="openstack/ceilometer-0"
Mar 09 13:42:01 crc kubenswrapper[4764]: I0309 13:42:01.824052 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4cafd43e-a12e-46ee-8108-8e33d10c47ee-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"4cafd43e-a12e-46ee-8108-8e33d10c47ee\") " pod="openstack/ceilometer-0"
Mar 09 13:42:01 crc kubenswrapper[4764]: I0309 13:42:01.824993 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4cafd43e-a12e-46ee-8108-8e33d10c47ee-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"4cafd43e-a12e-46ee-8108-8e33d10c47ee\") " pod="openstack/ceilometer-0"
Mar 09 13:42:01 crc kubenswrapper[4764]: I0309
13:42:01.831416 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4cafd43e-a12e-46ee-8108-8e33d10c47ee-config-data\") pod \"ceilometer-0\" (UID: \"4cafd43e-a12e-46ee-8108-8e33d10c47ee\") " pod="openstack/ceilometer-0" Mar 09 13:42:01 crc kubenswrapper[4764]: I0309 13:42:01.838105 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x677c\" (UniqueName: \"kubernetes.io/projected/4cafd43e-a12e-46ee-8108-8e33d10c47ee-kube-api-access-x677c\") pod \"ceilometer-0\" (UID: \"4cafd43e-a12e-46ee-8108-8e33d10c47ee\") " pod="openstack/ceilometer-0" Mar 09 13:42:01 crc kubenswrapper[4764]: I0309 13:42:01.933640 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 09 13:42:01 crc kubenswrapper[4764]: I0309 13:42:01.934724 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 09 13:42:01 crc kubenswrapper[4764]: I0309 13:42:01.967956 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 09 13:42:01 crc kubenswrapper[4764]: I0309 13:42:01.968230 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="ce660994-4427-4d54-b83c-9c9ec7f64a9d" containerName="kube-state-metrics" containerID="cri-o://9847def4972a082de5de5ce66af1aea1d5c2f1147d6cb0e73ffadb728f7879a5" gracePeriod=30 Mar 09 13:42:02 crc kubenswrapper[4764]: I0309 13:42:02.505271 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 09 13:42:02 crc kubenswrapper[4764]: W0309 13:42:02.511833 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4cafd43e_a12e_46ee_8108_8e33d10c47ee.slice/crio-5953a3b710032064c75479dd87c9f1404c8431fcc42a83d5bac920c5b52ac282 WatchSource:0}: Error finding container 
5953a3b710032064c75479dd87c9f1404c8431fcc42a83d5bac920c5b52ac282: Status 404 returned error can't find the container with id 5953a3b710032064c75479dd87c9f1404c8431fcc42a83d5bac920c5b52ac282 Mar 09 13:42:02 crc kubenswrapper[4764]: I0309 13:42:02.572452 4764 generic.go:334] "Generic (PLEG): container finished" podID="ce660994-4427-4d54-b83c-9c9ec7f64a9d" containerID="9847def4972a082de5de5ce66af1aea1d5c2f1147d6cb0e73ffadb728f7879a5" exitCode=2 Mar 09 13:42:02 crc kubenswrapper[4764]: I0309 13:42:02.572550 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"ce660994-4427-4d54-b83c-9c9ec7f64a9d","Type":"ContainerDied","Data":"9847def4972a082de5de5ce66af1aea1d5c2f1147d6cb0e73ffadb728f7879a5"} Mar 09 13:42:02 crc kubenswrapper[4764]: I0309 13:42:02.572603 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"ce660994-4427-4d54-b83c-9c9ec7f64a9d","Type":"ContainerDied","Data":"b8a8570b31f5838a5c44b32bc16eef1004cf79cfc6a6ee8f31255abe6b100221"} Mar 09 13:42:02 crc kubenswrapper[4764]: I0309 13:42:02.572618 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b8a8570b31f5838a5c44b32bc16eef1004cf79cfc6a6ee8f31255abe6b100221" Mar 09 13:42:02 crc kubenswrapper[4764]: I0309 13:42:02.576494 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551062-wl7gf" event={"ID":"61f41fcc-bb74-4c90-a6af-bfcd168ef2cb","Type":"ContainerStarted","Data":"9f52cd378552f4424230b7014d94477e188c2eb48ce843f7df141a1298245bd6"} Mar 09 13:42:02 crc kubenswrapper[4764]: I0309 13:42:02.582969 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4cafd43e-a12e-46ee-8108-8e33d10c47ee","Type":"ContainerStarted","Data":"5953a3b710032064c75479dd87c9f1404c8431fcc42a83d5bac920c5b52ac282"} Mar 09 13:42:02 crc kubenswrapper[4764]: I0309 13:42:02.583511 4764 util.go:48] "No ready sandbox for 
pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 09 13:42:02 crc kubenswrapper[4764]: I0309 13:42:02.604065 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29551062-wl7gf" podStartSLOduration=1.64226788 podStartE2EDuration="2.604040657s" podCreationTimestamp="2026-03-09 13:42:00 +0000 UTC" firstStartedPulling="2026-03-09 13:42:01.045187184 +0000 UTC m=+1276.295359092" lastFinishedPulling="2026-03-09 13:42:02.006959971 +0000 UTC m=+1277.257131869" observedRunningTime="2026-03-09 13:42:02.597181515 +0000 UTC m=+1277.847353423" watchObservedRunningTime="2026-03-09 13:42:02.604040657 +0000 UTC m=+1277.854212565" Mar 09 13:42:02 crc kubenswrapper[4764]: I0309 13:42:02.632224 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gphmn\" (UniqueName: \"kubernetes.io/projected/ce660994-4427-4d54-b83c-9c9ec7f64a9d-kube-api-access-gphmn\") pod \"ce660994-4427-4d54-b83c-9c9ec7f64a9d\" (UID: \"ce660994-4427-4d54-b83c-9c9ec7f64a9d\") " Mar 09 13:42:02 crc kubenswrapper[4764]: I0309 13:42:02.643909 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ce660994-4427-4d54-b83c-9c9ec7f64a9d-kube-api-access-gphmn" (OuterVolumeSpecName: "kube-api-access-gphmn") pod "ce660994-4427-4d54-b83c-9c9ec7f64a9d" (UID: "ce660994-4427-4d54-b83c-9c9ec7f64a9d"). InnerVolumeSpecName "kube-api-access-gphmn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:42:02 crc kubenswrapper[4764]: I0309 13:42:02.735285 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gphmn\" (UniqueName: \"kubernetes.io/projected/ce660994-4427-4d54-b83c-9c9ec7f64a9d-kube-api-access-gphmn\") on node \"crc\" DevicePath \"\"" Mar 09 13:42:03 crc kubenswrapper[4764]: I0309 13:42:03.573394 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="df5caebe-ad05-48b0-bbce-ecb2ec29e7c3" path="/var/lib/kubelet/pods/df5caebe-ad05-48b0-bbce-ecb2ec29e7c3/volumes" Mar 09 13:42:03 crc kubenswrapper[4764]: I0309 13:42:03.608670 4764 generic.go:334] "Generic (PLEG): container finished" podID="61f41fcc-bb74-4c90-a6af-bfcd168ef2cb" containerID="9f52cd378552f4424230b7014d94477e188c2eb48ce843f7df141a1298245bd6" exitCode=0 Mar 09 13:42:03 crc kubenswrapper[4764]: I0309 13:42:03.609000 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551062-wl7gf" event={"ID":"61f41fcc-bb74-4c90-a6af-bfcd168ef2cb","Type":"ContainerDied","Data":"9f52cd378552f4424230b7014d94477e188c2eb48ce843f7df141a1298245bd6"} Mar 09 13:42:03 crc kubenswrapper[4764]: I0309 13:42:03.613472 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4cafd43e-a12e-46ee-8108-8e33d10c47ee","Type":"ContainerStarted","Data":"8584bf911d27e4b234249d90cbfea6653767b26c85f2f4fa5eeca9b1ef3d93a2"} Mar 09 13:42:03 crc kubenswrapper[4764]: I0309 13:42:03.613493 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 09 13:42:03 crc kubenswrapper[4764]: I0309 13:42:03.654474 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 09 13:42:03 crc kubenswrapper[4764]: I0309 13:42:03.668684 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 09 13:42:03 crc kubenswrapper[4764]: I0309 13:42:03.725897 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Mar 09 13:42:03 crc kubenswrapper[4764]: E0309 13:42:03.727388 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ce660994-4427-4d54-b83c-9c9ec7f64a9d" containerName="kube-state-metrics" Mar 09 13:42:03 crc kubenswrapper[4764]: I0309 13:42:03.727407 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce660994-4427-4d54-b83c-9c9ec7f64a9d" containerName="kube-state-metrics" Mar 09 13:42:03 crc kubenswrapper[4764]: I0309 13:42:03.727834 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="ce660994-4427-4d54-b83c-9c9ec7f64a9d" containerName="kube-state-metrics" Mar 09 13:42:03 crc kubenswrapper[4764]: I0309 13:42:03.728801 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 09 13:42:03 crc kubenswrapper[4764]: I0309 13:42:03.738377 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"kube-state-metrics-tls-config" Mar 09 13:42:03 crc kubenswrapper[4764]: I0309 13:42:03.738729 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-kube-state-metrics-svc" Mar 09 13:42:03 crc kubenswrapper[4764]: I0309 13:42:03.754524 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 09 13:42:03 crc kubenswrapper[4764]: I0309 13:42:03.764840 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/179736ec-4215-4ad8-9800-a186978a767f-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"179736ec-4215-4ad8-9800-a186978a767f\") " pod="openstack/kube-state-metrics-0" Mar 09 13:42:03 crc kubenswrapper[4764]: I0309 13:42:03.765726 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/179736ec-4215-4ad8-9800-a186978a767f-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"179736ec-4215-4ad8-9800-a186978a767f\") " pod="openstack/kube-state-metrics-0" Mar 09 13:42:03 crc kubenswrapper[4764]: I0309 13:42:03.766187 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bgtww\" (UniqueName: \"kubernetes.io/projected/179736ec-4215-4ad8-9800-a186978a767f-kube-api-access-bgtww\") pod \"kube-state-metrics-0\" (UID: \"179736ec-4215-4ad8-9800-a186978a767f\") " pod="openstack/kube-state-metrics-0" Mar 09 13:42:03 crc kubenswrapper[4764]: I0309 13:42:03.766278 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/179736ec-4215-4ad8-9800-a186978a767f-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"179736ec-4215-4ad8-9800-a186978a767f\") " pod="openstack/kube-state-metrics-0" Mar 09 13:42:03 crc kubenswrapper[4764]: I0309 13:42:03.868081 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/179736ec-4215-4ad8-9800-a186978a767f-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"179736ec-4215-4ad8-9800-a186978a767f\") " pod="openstack/kube-state-metrics-0" Mar 09 13:42:03 crc kubenswrapper[4764]: I0309 13:42:03.868204 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/179736ec-4215-4ad8-9800-a186978a767f-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"179736ec-4215-4ad8-9800-a186978a767f\") " pod="openstack/kube-state-metrics-0" Mar 09 13:42:03 crc kubenswrapper[4764]: I0309 13:42:03.868304 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/179736ec-4215-4ad8-9800-a186978a767f-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"179736ec-4215-4ad8-9800-a186978a767f\") " pod="openstack/kube-state-metrics-0" Mar 09 13:42:03 crc kubenswrapper[4764]: I0309 13:42:03.868346 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bgtww\" (UniqueName: \"kubernetes.io/projected/179736ec-4215-4ad8-9800-a186978a767f-kube-api-access-bgtww\") pod \"kube-state-metrics-0\" (UID: \"179736ec-4215-4ad8-9800-a186978a767f\") " pod="openstack/kube-state-metrics-0" Mar 09 13:42:03 crc kubenswrapper[4764]: I0309 13:42:03.873712 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: 
\"kubernetes.io/secret/179736ec-4215-4ad8-9800-a186978a767f-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"179736ec-4215-4ad8-9800-a186978a767f\") " pod="openstack/kube-state-metrics-0" Mar 09 13:42:03 crc kubenswrapper[4764]: I0309 13:42:03.874443 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/179736ec-4215-4ad8-9800-a186978a767f-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"179736ec-4215-4ad8-9800-a186978a767f\") " pod="openstack/kube-state-metrics-0" Mar 09 13:42:03 crc kubenswrapper[4764]: I0309 13:42:03.874458 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/179736ec-4215-4ad8-9800-a186978a767f-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"179736ec-4215-4ad8-9800-a186978a767f\") " pod="openstack/kube-state-metrics-0" Mar 09 13:42:03 crc kubenswrapper[4764]: I0309 13:42:03.888511 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bgtww\" (UniqueName: \"kubernetes.io/projected/179736ec-4215-4ad8-9800-a186978a767f-kube-api-access-bgtww\") pod \"kube-state-metrics-0\" (UID: \"179736ec-4215-4ad8-9800-a186978a767f\") " pod="openstack/kube-state-metrics-0" Mar 09 13:42:03 crc kubenswrapper[4764]: I0309 13:42:03.974693 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 09 13:42:04 crc kubenswrapper[4764]: I0309 13:42:04.430290 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 09 13:42:04 crc kubenswrapper[4764]: W0309 13:42:04.438024 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod179736ec_4215_4ad8_9800_a186978a767f.slice/crio-bea433135fb4cb0d388ba0f7dce87d81fbded59498da059f958f5ba30c0c4844 WatchSource:0}: Error finding container bea433135fb4cb0d388ba0f7dce87d81fbded59498da059f958f5ba30c0c4844: Status 404 returned error can't find the container with id bea433135fb4cb0d388ba0f7dce87d81fbded59498da059f958f5ba30c0c4844 Mar 09 13:42:04 crc kubenswrapper[4764]: I0309 13:42:04.671912 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"179736ec-4215-4ad8-9800-a186978a767f","Type":"ContainerStarted","Data":"bea433135fb4cb0d388ba0f7dce87d81fbded59498da059f958f5ba30c0c4844"} Mar 09 13:42:04 crc kubenswrapper[4764]: I0309 13:42:04.694769 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4cafd43e-a12e-46ee-8108-8e33d10c47ee","Type":"ContainerStarted","Data":"77c79186eac3b625857450d160ad883bb9d71ca93dc8e2e59f473a386200af76"} Mar 09 13:42:05 crc kubenswrapper[4764]: I0309 13:42:05.063206 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551062-wl7gf" Mar 09 13:42:05 crc kubenswrapper[4764]: I0309 13:42:05.101879 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7mcfc\" (UniqueName: \"kubernetes.io/projected/61f41fcc-bb74-4c90-a6af-bfcd168ef2cb-kube-api-access-7mcfc\") pod \"61f41fcc-bb74-4c90-a6af-bfcd168ef2cb\" (UID: \"61f41fcc-bb74-4c90-a6af-bfcd168ef2cb\") " Mar 09 13:42:05 crc kubenswrapper[4764]: I0309 13:42:05.108923 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/61f41fcc-bb74-4c90-a6af-bfcd168ef2cb-kube-api-access-7mcfc" (OuterVolumeSpecName: "kube-api-access-7mcfc") pod "61f41fcc-bb74-4c90-a6af-bfcd168ef2cb" (UID: "61f41fcc-bb74-4c90-a6af-bfcd168ef2cb"). InnerVolumeSpecName "kube-api-access-7mcfc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:42:05 crc kubenswrapper[4764]: I0309 13:42:05.204401 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7mcfc\" (UniqueName: \"kubernetes.io/projected/61f41fcc-bb74-4c90-a6af-bfcd168ef2cb-kube-api-access-7mcfc\") on node \"crc\" DevicePath \"\"" Mar 09 13:42:05 crc kubenswrapper[4764]: I0309 13:42:05.580163 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ce660994-4427-4d54-b83c-9c9ec7f64a9d" path="/var/lib/kubelet/pods/ce660994-4427-4d54-b83c-9c9ec7f64a9d/volumes" Mar 09 13:42:05 crc kubenswrapper[4764]: I0309 13:42:05.677178 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29551056-chr2q"] Mar 09 13:42:05 crc kubenswrapper[4764]: I0309 13:42:05.685552 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29551056-chr2q"] Mar 09 13:42:05 crc kubenswrapper[4764]: I0309 13:42:05.709076 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"4cafd43e-a12e-46ee-8108-8e33d10c47ee","Type":"ContainerStarted","Data":"ef59562971287c90872fa0c6501fb93b73e7422eb42b03a1776c5d4727a706c9"} Mar 09 13:42:05 crc kubenswrapper[4764]: I0309 13:42:05.717930 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551062-wl7gf" Mar 09 13:42:05 crc kubenswrapper[4764]: I0309 13:42:05.717923 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551062-wl7gf" event={"ID":"61f41fcc-bb74-4c90-a6af-bfcd168ef2cb","Type":"ContainerDied","Data":"eb63554a3776e0f69aa0f087de4b9957a83350b60020c393086a1cd637f2cf8d"} Mar 09 13:42:05 crc kubenswrapper[4764]: I0309 13:42:05.718138 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="eb63554a3776e0f69aa0f087de4b9957a83350b60020c393086a1cd637f2cf8d" Mar 09 13:42:05 crc kubenswrapper[4764]: I0309 13:42:05.726463 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"179736ec-4215-4ad8-9800-a186978a767f","Type":"ContainerStarted","Data":"081d3f8cf76e67f42fde0999650f46ab737b778f55da60e9d9086f5add2b2601"} Mar 09 13:42:05 crc kubenswrapper[4764]: I0309 13:42:05.726725 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Mar 09 13:42:05 crc kubenswrapper[4764]: I0309 13:42:05.756596 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=2.346859035 podStartE2EDuration="2.756563224s" podCreationTimestamp="2026-03-09 13:42:03 +0000 UTC" firstStartedPulling="2026-03-09 13:42:04.442127688 +0000 UTC m=+1279.692299596" lastFinishedPulling="2026-03-09 13:42:04.851831877 +0000 UTC m=+1280.102003785" observedRunningTime="2026-03-09 13:42:05.746385429 +0000 UTC m=+1280.996557357" watchObservedRunningTime="2026-03-09 13:42:05.756563224 +0000 UTC m=+1281.006735172" Mar 09 
13:42:06 crc kubenswrapper[4764]: I0309 13:42:06.085600 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-tchwl"] Mar 09 13:42:06 crc kubenswrapper[4764]: E0309 13:42:06.086420 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="61f41fcc-bb74-4c90-a6af-bfcd168ef2cb" containerName="oc" Mar 09 13:42:06 crc kubenswrapper[4764]: I0309 13:42:06.086496 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="61f41fcc-bb74-4c90-a6af-bfcd168ef2cb" containerName="oc" Mar 09 13:42:06 crc kubenswrapper[4764]: I0309 13:42:06.086771 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="61f41fcc-bb74-4c90-a6af-bfcd168ef2cb" containerName="oc" Mar 09 13:42:06 crc kubenswrapper[4764]: I0309 13:42:06.087606 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-tchwl" Mar 09 13:42:06 crc kubenswrapper[4764]: I0309 13:42:06.110662 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-tchwl"] Mar 09 13:42:06 crc kubenswrapper[4764]: I0309 13:42:06.127584 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dwkjb\" (UniqueName: \"kubernetes.io/projected/66003ca3-e579-4dab-b714-b5b2baa26bad-kube-api-access-dwkjb\") pod \"nova-api-db-create-tchwl\" (UID: \"66003ca3-e579-4dab-b714-b5b2baa26bad\") " pod="openstack/nova-api-db-create-tchwl" Mar 09 13:42:06 crc kubenswrapper[4764]: I0309 13:42:06.127681 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/66003ca3-e579-4dab-b714-b5b2baa26bad-operator-scripts\") pod \"nova-api-db-create-tchwl\" (UID: \"66003ca3-e579-4dab-b714-b5b2baa26bad\") " pod="openstack/nova-api-db-create-tchwl" Mar 09 13:42:06 crc kubenswrapper[4764]: I0309 13:42:06.147809 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openstack/neutron-6b7bfdfd5-56dnz" Mar 09 13:42:06 crc kubenswrapper[4764]: I0309 13:42:06.180341 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-mnqg7"] Mar 09 13:42:06 crc kubenswrapper[4764]: I0309 13:42:06.181747 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-mnqg7" Mar 09 13:42:06 crc kubenswrapper[4764]: I0309 13:42:06.229244 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dwkjb\" (UniqueName: \"kubernetes.io/projected/66003ca3-e579-4dab-b714-b5b2baa26bad-kube-api-access-dwkjb\") pod \"nova-api-db-create-tchwl\" (UID: \"66003ca3-e579-4dab-b714-b5b2baa26bad\") " pod="openstack/nova-api-db-create-tchwl" Mar 09 13:42:06 crc kubenswrapper[4764]: I0309 13:42:06.229352 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/66003ca3-e579-4dab-b714-b5b2baa26bad-operator-scripts\") pod \"nova-api-db-create-tchwl\" (UID: \"66003ca3-e579-4dab-b714-b5b2baa26bad\") " pod="openstack/nova-api-db-create-tchwl" Mar 09 13:42:06 crc kubenswrapper[4764]: I0309 13:42:06.229559 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a75ea85a-1e66-4e8d-92d7-6f9b766abfda-operator-scripts\") pod \"nova-cell0-db-create-mnqg7\" (UID: \"a75ea85a-1e66-4e8d-92d7-6f9b766abfda\") " pod="openstack/nova-cell0-db-create-mnqg7" Mar 09 13:42:06 crc kubenswrapper[4764]: I0309 13:42:06.229609 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t2ffg\" (UniqueName: \"kubernetes.io/projected/a75ea85a-1e66-4e8d-92d7-6f9b766abfda-kube-api-access-t2ffg\") pod \"nova-cell0-db-create-mnqg7\" (UID: \"a75ea85a-1e66-4e8d-92d7-6f9b766abfda\") " 
pod="openstack/nova-cell0-db-create-mnqg7" Mar 09 13:42:06 crc kubenswrapper[4764]: I0309 13:42:06.232450 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/66003ca3-e579-4dab-b714-b5b2baa26bad-operator-scripts\") pod \"nova-api-db-create-tchwl\" (UID: \"66003ca3-e579-4dab-b714-b5b2baa26bad\") " pod="openstack/nova-api-db-create-tchwl" Mar 09 13:42:06 crc kubenswrapper[4764]: I0309 13:42:06.261048 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-mnqg7"] Mar 09 13:42:06 crc kubenswrapper[4764]: I0309 13:42:06.286429 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dwkjb\" (UniqueName: \"kubernetes.io/projected/66003ca3-e579-4dab-b714-b5b2baa26bad-kube-api-access-dwkjb\") pod \"nova-api-db-create-tchwl\" (UID: \"66003ca3-e579-4dab-b714-b5b2baa26bad\") " pod="openstack/nova-api-db-create-tchwl" Mar 09 13:42:06 crc kubenswrapper[4764]: I0309 13:42:06.341433 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a75ea85a-1e66-4e8d-92d7-6f9b766abfda-operator-scripts\") pod \"nova-cell0-db-create-mnqg7\" (UID: \"a75ea85a-1e66-4e8d-92d7-6f9b766abfda\") " pod="openstack/nova-cell0-db-create-mnqg7" Mar 09 13:42:06 crc kubenswrapper[4764]: I0309 13:42:06.341954 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t2ffg\" (UniqueName: \"kubernetes.io/projected/a75ea85a-1e66-4e8d-92d7-6f9b766abfda-kube-api-access-t2ffg\") pod \"nova-cell0-db-create-mnqg7\" (UID: \"a75ea85a-1e66-4e8d-92d7-6f9b766abfda\") " pod="openstack/nova-cell0-db-create-mnqg7" Mar 09 13:42:06 crc kubenswrapper[4764]: I0309 13:42:06.343323 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/a75ea85a-1e66-4e8d-92d7-6f9b766abfda-operator-scripts\") pod \"nova-cell0-db-create-mnqg7\" (UID: \"a75ea85a-1e66-4e8d-92d7-6f9b766abfda\") " pod="openstack/nova-cell0-db-create-mnqg7" Mar 09 13:42:06 crc kubenswrapper[4764]: I0309 13:42:06.367901 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t2ffg\" (UniqueName: \"kubernetes.io/projected/a75ea85a-1e66-4e8d-92d7-6f9b766abfda-kube-api-access-t2ffg\") pod \"nova-cell0-db-create-mnqg7\" (UID: \"a75ea85a-1e66-4e8d-92d7-6f9b766abfda\") " pod="openstack/nova-cell0-db-create-mnqg7" Mar 09 13:42:06 crc kubenswrapper[4764]: I0309 13:42:06.381516 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-6b74bc6bc6-vsxl5"] Mar 09 13:42:06 crc kubenswrapper[4764]: I0309 13:42:06.382282 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-6b74bc6bc6-vsxl5" podUID="50610296-d076-4c9f-ac34-a976202ce135" containerName="neutron-httpd" containerID="cri-o://40fc71094cc1ee70bba232001951743f253a32a95993837575a49f9b6e385df9" gracePeriod=30 Mar 09 13:42:06 crc kubenswrapper[4764]: I0309 13:42:06.382519 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-6b74bc6bc6-vsxl5" podUID="50610296-d076-4c9f-ac34-a976202ce135" containerName="neutron-api" containerID="cri-o://508370eb92a6887d3b1ecf26300791f931d8f328026a6caf81b9a1ce10c18bed" gracePeriod=30 Mar 09 13:42:06 crc kubenswrapper[4764]: I0309 13:42:06.406539 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-tchwl" Mar 09 13:42:06 crc kubenswrapper[4764]: I0309 13:42:06.407353 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-78eb-account-create-update-2dqgt"] Mar 09 13:42:06 crc kubenswrapper[4764]: I0309 13:42:06.409241 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-78eb-account-create-update-2dqgt" Mar 09 13:42:06 crc kubenswrapper[4764]: I0309 13:42:06.417856 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-tbf9j"] Mar 09 13:42:06 crc kubenswrapper[4764]: I0309 13:42:06.418738 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Mar 09 13:42:06 crc kubenswrapper[4764]: I0309 13:42:06.420313 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-tbf9j" Mar 09 13:42:06 crc kubenswrapper[4764]: I0309 13:42:06.439074 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-78eb-account-create-update-2dqgt"] Mar 09 13:42:06 crc kubenswrapper[4764]: I0309 13:42:06.468149 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-tbf9j"] Mar 09 13:42:06 crc kubenswrapper[4764]: I0309 13:42:06.503971 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-mnqg7" Mar 09 13:42:06 crc kubenswrapper[4764]: I0309 13:42:06.517046 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cc98-account-create-update-sjfqm"] Mar 09 13:42:06 crc kubenswrapper[4764]: I0309 13:42:06.522073 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cc98-account-create-update-sjfqm" Mar 09 13:42:06 crc kubenswrapper[4764]: I0309 13:42:06.524890 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Mar 09 13:42:06 crc kubenswrapper[4764]: I0309 13:42:06.525974 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cc98-account-create-update-sjfqm"] Mar 09 13:42:06 crc kubenswrapper[4764]: I0309 13:42:06.550110 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8d97l\" (UniqueName: \"kubernetes.io/projected/8fa35355-06e1-403f-9691-92398769ac09-kube-api-access-8d97l\") pod \"nova-cell1-db-create-tbf9j\" (UID: \"8fa35355-06e1-403f-9691-92398769ac09\") " pod="openstack/nova-cell1-db-create-tbf9j" Mar 09 13:42:06 crc kubenswrapper[4764]: I0309 13:42:06.550187 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r55f9\" (UniqueName: \"kubernetes.io/projected/b5daba6a-a01a-4400-aa87-01f9efd3abd8-kube-api-access-r55f9\") pod \"nova-api-78eb-account-create-update-2dqgt\" (UID: \"b5daba6a-a01a-4400-aa87-01f9efd3abd8\") " pod="openstack/nova-api-78eb-account-create-update-2dqgt" Mar 09 13:42:06 crc kubenswrapper[4764]: I0309 13:42:06.550262 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b5daba6a-a01a-4400-aa87-01f9efd3abd8-operator-scripts\") pod \"nova-api-78eb-account-create-update-2dqgt\" (UID: \"b5daba6a-a01a-4400-aa87-01f9efd3abd8\") " pod="openstack/nova-api-78eb-account-create-update-2dqgt" Mar 09 13:42:06 crc kubenswrapper[4764]: I0309 13:42:06.550298 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/8fa35355-06e1-403f-9691-92398769ac09-operator-scripts\") pod \"nova-cell1-db-create-tbf9j\" (UID: \"8fa35355-06e1-403f-9691-92398769ac09\") " pod="openstack/nova-cell1-db-create-tbf9j" Mar 09 13:42:06 crc kubenswrapper[4764]: I0309 13:42:06.654146 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8d97l\" (UniqueName: \"kubernetes.io/projected/8fa35355-06e1-403f-9691-92398769ac09-kube-api-access-8d97l\") pod \"nova-cell1-db-create-tbf9j\" (UID: \"8fa35355-06e1-403f-9691-92398769ac09\") " pod="openstack/nova-cell1-db-create-tbf9j" Mar 09 13:42:06 crc kubenswrapper[4764]: I0309 13:42:06.654215 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r55f9\" (UniqueName: \"kubernetes.io/projected/b5daba6a-a01a-4400-aa87-01f9efd3abd8-kube-api-access-r55f9\") pod \"nova-api-78eb-account-create-update-2dqgt\" (UID: \"b5daba6a-a01a-4400-aa87-01f9efd3abd8\") " pod="openstack/nova-api-78eb-account-create-update-2dqgt" Mar 09 13:42:06 crc kubenswrapper[4764]: I0309 13:42:06.654252 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b5daba6a-a01a-4400-aa87-01f9efd3abd8-operator-scripts\") pod \"nova-api-78eb-account-create-update-2dqgt\" (UID: \"b5daba6a-a01a-4400-aa87-01f9efd3abd8\") " pod="openstack/nova-api-78eb-account-create-update-2dqgt" Mar 09 13:42:06 crc kubenswrapper[4764]: I0309 13:42:06.654294 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8fa35355-06e1-403f-9691-92398769ac09-operator-scripts\") pod \"nova-cell1-db-create-tbf9j\" (UID: \"8fa35355-06e1-403f-9691-92398769ac09\") " pod="openstack/nova-cell1-db-create-tbf9j" Mar 09 13:42:06 crc kubenswrapper[4764]: I0309 13:42:06.654373 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"kube-api-access-jwv2d\" (UniqueName: \"kubernetes.io/projected/5d6e8b3d-dc6c-4b1a-946f-1392d1a5222c-kube-api-access-jwv2d\") pod \"nova-cell0-cc98-account-create-update-sjfqm\" (UID: \"5d6e8b3d-dc6c-4b1a-946f-1392d1a5222c\") " pod="openstack/nova-cell0-cc98-account-create-update-sjfqm" Mar 09 13:42:06 crc kubenswrapper[4764]: I0309 13:42:06.654416 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5d6e8b3d-dc6c-4b1a-946f-1392d1a5222c-operator-scripts\") pod \"nova-cell0-cc98-account-create-update-sjfqm\" (UID: \"5d6e8b3d-dc6c-4b1a-946f-1392d1a5222c\") " pod="openstack/nova-cell0-cc98-account-create-update-sjfqm" Mar 09 13:42:06 crc kubenswrapper[4764]: I0309 13:42:06.656965 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8fa35355-06e1-403f-9691-92398769ac09-operator-scripts\") pod \"nova-cell1-db-create-tbf9j\" (UID: \"8fa35355-06e1-403f-9691-92398769ac09\") " pod="openstack/nova-cell1-db-create-tbf9j" Mar 09 13:42:06 crc kubenswrapper[4764]: I0309 13:42:06.657637 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b5daba6a-a01a-4400-aa87-01f9efd3abd8-operator-scripts\") pod \"nova-api-78eb-account-create-update-2dqgt\" (UID: \"b5daba6a-a01a-4400-aa87-01f9efd3abd8\") " pod="openstack/nova-api-78eb-account-create-update-2dqgt" Mar 09 13:42:06 crc kubenswrapper[4764]: I0309 13:42:06.689960 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r55f9\" (UniqueName: \"kubernetes.io/projected/b5daba6a-a01a-4400-aa87-01f9efd3abd8-kube-api-access-r55f9\") pod \"nova-api-78eb-account-create-update-2dqgt\" (UID: \"b5daba6a-a01a-4400-aa87-01f9efd3abd8\") " pod="openstack/nova-api-78eb-account-create-update-2dqgt" Mar 09 13:42:06 crc kubenswrapper[4764]: 
I0309 13:42:06.693199 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8d97l\" (UniqueName: \"kubernetes.io/projected/8fa35355-06e1-403f-9691-92398769ac09-kube-api-access-8d97l\") pod \"nova-cell1-db-create-tbf9j\" (UID: \"8fa35355-06e1-403f-9691-92398769ac09\") " pod="openstack/nova-cell1-db-create-tbf9j" Mar 09 13:42:06 crc kubenswrapper[4764]: I0309 13:42:06.703081 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-f7d8-account-create-update-rp748"] Mar 09 13:42:06 crc kubenswrapper[4764]: I0309 13:42:06.711799 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-f7d8-account-create-update-rp748"] Mar 09 13:42:06 crc kubenswrapper[4764]: I0309 13:42:06.711987 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-f7d8-account-create-update-rp748" Mar 09 13:42:06 crc kubenswrapper[4764]: I0309 13:42:06.716037 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Mar 09 13:42:06 crc kubenswrapper[4764]: I0309 13:42:06.742495 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-78eb-account-create-update-2dqgt" Mar 09 13:42:06 crc kubenswrapper[4764]: I0309 13:42:06.756173 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3db8af07-1310-4cd5-be07-3fd062fe89a7-operator-scripts\") pod \"nova-cell1-f7d8-account-create-update-rp748\" (UID: \"3db8af07-1310-4cd5-be07-3fd062fe89a7\") " pod="openstack/nova-cell1-f7d8-account-create-update-rp748" Mar 09 13:42:06 crc kubenswrapper[4764]: I0309 13:42:06.756254 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7jk6c\" (UniqueName: \"kubernetes.io/projected/3db8af07-1310-4cd5-be07-3fd062fe89a7-kube-api-access-7jk6c\") pod \"nova-cell1-f7d8-account-create-update-rp748\" (UID: \"3db8af07-1310-4cd5-be07-3fd062fe89a7\") " pod="openstack/nova-cell1-f7d8-account-create-update-rp748" Mar 09 13:42:06 crc kubenswrapper[4764]: I0309 13:42:06.756311 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jwv2d\" (UniqueName: \"kubernetes.io/projected/5d6e8b3d-dc6c-4b1a-946f-1392d1a5222c-kube-api-access-jwv2d\") pod \"nova-cell0-cc98-account-create-update-sjfqm\" (UID: \"5d6e8b3d-dc6c-4b1a-946f-1392d1a5222c\") " pod="openstack/nova-cell0-cc98-account-create-update-sjfqm" Mar 09 13:42:06 crc kubenswrapper[4764]: I0309 13:42:06.756340 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5d6e8b3d-dc6c-4b1a-946f-1392d1a5222c-operator-scripts\") pod \"nova-cell0-cc98-account-create-update-sjfqm\" (UID: \"5d6e8b3d-dc6c-4b1a-946f-1392d1a5222c\") " pod="openstack/nova-cell0-cc98-account-create-update-sjfqm" Mar 09 13:42:06 crc kubenswrapper[4764]: I0309 13:42:06.757146 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" 
(UniqueName: \"kubernetes.io/configmap/5d6e8b3d-dc6c-4b1a-946f-1392d1a5222c-operator-scripts\") pod \"nova-cell0-cc98-account-create-update-sjfqm\" (UID: \"5d6e8b3d-dc6c-4b1a-946f-1392d1a5222c\") " pod="openstack/nova-cell0-cc98-account-create-update-sjfqm" Mar 09 13:42:06 crc kubenswrapper[4764]: I0309 13:42:06.771457 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-tbf9j" Mar 09 13:42:06 crc kubenswrapper[4764]: I0309 13:42:06.780470 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jwv2d\" (UniqueName: \"kubernetes.io/projected/5d6e8b3d-dc6c-4b1a-946f-1392d1a5222c-kube-api-access-jwv2d\") pod \"nova-cell0-cc98-account-create-update-sjfqm\" (UID: \"5d6e8b3d-dc6c-4b1a-946f-1392d1a5222c\") " pod="openstack/nova-cell0-cc98-account-create-update-sjfqm" Mar 09 13:42:06 crc kubenswrapper[4764]: I0309 13:42:06.861844 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3db8af07-1310-4cd5-be07-3fd062fe89a7-operator-scripts\") pod \"nova-cell1-f7d8-account-create-update-rp748\" (UID: \"3db8af07-1310-4cd5-be07-3fd062fe89a7\") " pod="openstack/nova-cell1-f7d8-account-create-update-rp748" Mar 09 13:42:06 crc kubenswrapper[4764]: I0309 13:42:06.861945 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7jk6c\" (UniqueName: \"kubernetes.io/projected/3db8af07-1310-4cd5-be07-3fd062fe89a7-kube-api-access-7jk6c\") pod \"nova-cell1-f7d8-account-create-update-rp748\" (UID: \"3db8af07-1310-4cd5-be07-3fd062fe89a7\") " pod="openstack/nova-cell1-f7d8-account-create-update-rp748" Mar 09 13:42:06 crc kubenswrapper[4764]: I0309 13:42:06.865728 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3db8af07-1310-4cd5-be07-3fd062fe89a7-operator-scripts\") pod 
\"nova-cell1-f7d8-account-create-update-rp748\" (UID: \"3db8af07-1310-4cd5-be07-3fd062fe89a7\") " pod="openstack/nova-cell1-f7d8-account-create-update-rp748" Mar 09 13:42:06 crc kubenswrapper[4764]: I0309 13:42:06.887815 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7jk6c\" (UniqueName: \"kubernetes.io/projected/3db8af07-1310-4cd5-be07-3fd062fe89a7-kube-api-access-7jk6c\") pod \"nova-cell1-f7d8-account-create-update-rp748\" (UID: \"3db8af07-1310-4cd5-be07-3fd062fe89a7\") " pod="openstack/nova-cell1-f7d8-account-create-update-rp748" Mar 09 13:42:06 crc kubenswrapper[4764]: I0309 13:42:06.899435 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cc98-account-create-update-sjfqm" Mar 09 13:42:07 crc kubenswrapper[4764]: I0309 13:42:07.043235 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-f7d8-account-create-update-rp748" Mar 09 13:42:07 crc kubenswrapper[4764]: I0309 13:42:07.098043 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-tchwl"] Mar 09 13:42:07 crc kubenswrapper[4764]: W0309 13:42:07.106090 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod66003ca3_e579_4dab_b714_b5b2baa26bad.slice/crio-1b4ee63c3be90245829c57ff18357c6790f696fc4922e58488a0b15d231d7670 WatchSource:0}: Error finding container 1b4ee63c3be90245829c57ff18357c6790f696fc4922e58488a0b15d231d7670: Status 404 returned error can't find the container with id 1b4ee63c3be90245829c57ff18357c6790f696fc4922e58488a0b15d231d7670 Mar 09 13:42:07 crc kubenswrapper[4764]: I0309 13:42:07.183576 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-mnqg7"] Mar 09 13:42:07 crc kubenswrapper[4764]: I0309 13:42:07.357285 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/nova-cell1-db-create-tbf9j"] Mar 09 13:42:07 crc kubenswrapper[4764]: I0309 13:42:07.389479 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-78eb-account-create-update-2dqgt"] Mar 09 13:42:07 crc kubenswrapper[4764]: I0309 13:42:07.578791 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b5ceebdd-e9ad-472a-8806-f5b441ced89a" path="/var/lib/kubelet/pods/b5ceebdd-e9ad-472a-8806-f5b441ced89a/volumes" Mar 09 13:42:07 crc kubenswrapper[4764]: I0309 13:42:07.667949 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cc98-account-create-update-sjfqm"] Mar 09 13:42:07 crc kubenswrapper[4764]: I0309 13:42:07.754470 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-f7d8-account-create-update-rp748"] Mar 09 13:42:07 crc kubenswrapper[4764]: W0309 13:42:07.767782 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3db8af07_1310_4cd5_be07_3fd062fe89a7.slice/crio-460f2f7700725f04372e42b1bbec425c0f8c5db8d2dfaf33891c7041869dfce9 WatchSource:0}: Error finding container 460f2f7700725f04372e42b1bbec425c0f8c5db8d2dfaf33891c7041869dfce9: Status 404 returned error can't find the container with id 460f2f7700725f04372e42b1bbec425c0f8c5db8d2dfaf33891c7041869dfce9 Mar 09 13:42:07 crc kubenswrapper[4764]: I0309 13:42:07.768757 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-78eb-account-create-update-2dqgt" event={"ID":"b5daba6a-a01a-4400-aa87-01f9efd3abd8","Type":"ContainerStarted","Data":"68e1ea9724cff446a48722f5d3beb35f95107e4f92a2482a35467b3ee65b6d69"} Mar 09 13:42:07 crc kubenswrapper[4764]: I0309 13:42:07.768824 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-78eb-account-create-update-2dqgt" 
event={"ID":"b5daba6a-a01a-4400-aa87-01f9efd3abd8","Type":"ContainerStarted","Data":"8b9629cb75484588167c111d3c7dbfcae1e4f0827b9d156b8c69932b90498b31"} Mar 09 13:42:07 crc kubenswrapper[4764]: I0309 13:42:07.771941 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cc98-account-create-update-sjfqm" event={"ID":"5d6e8b3d-dc6c-4b1a-946f-1392d1a5222c","Type":"ContainerStarted","Data":"d57d5ab16c67bc105085ecfbab0c639605f83f267c2a597f6e63924c995e5ba1"} Mar 09 13:42:07 crc kubenswrapper[4764]: I0309 13:42:07.792516 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-tbf9j" event={"ID":"8fa35355-06e1-403f-9691-92398769ac09","Type":"ContainerStarted","Data":"ea1b24536350a8debd8d3a73db534419bac8b767469b0a62084cf6812d4fbd33"} Mar 09 13:42:07 crc kubenswrapper[4764]: I0309 13:42:07.792578 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-tbf9j" event={"ID":"8fa35355-06e1-403f-9691-92398769ac09","Type":"ContainerStarted","Data":"7802efe10177c9d393bd5a30dc7dd873fbe98ff3606d74449c6a46a5501b117c"} Mar 09 13:42:07 crc kubenswrapper[4764]: I0309 13:42:07.801616 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-78eb-account-create-update-2dqgt" podStartSLOduration=1.801591232 podStartE2EDuration="1.801591232s" podCreationTimestamp="2026-03-09 13:42:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:42:07.793339565 +0000 UTC m=+1283.043511483" watchObservedRunningTime="2026-03-09 13:42:07.801591232 +0000 UTC m=+1283.051763140" Mar 09 13:42:07 crc kubenswrapper[4764]: I0309 13:42:07.809198 4764 generic.go:334] "Generic (PLEG): container finished" podID="50610296-d076-4c9f-ac34-a976202ce135" containerID="40fc71094cc1ee70bba232001951743f253a32a95993837575a49f9b6e385df9" exitCode=0 Mar 09 13:42:07 crc kubenswrapper[4764]: I0309 
13:42:07.809352 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6b74bc6bc6-vsxl5" event={"ID":"50610296-d076-4c9f-ac34-a976202ce135","Type":"ContainerDied","Data":"40fc71094cc1ee70bba232001951743f253a32a95993837575a49f9b6e385df9"} Mar 09 13:42:07 crc kubenswrapper[4764]: I0309 13:42:07.830135 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-tchwl" event={"ID":"66003ca3-e579-4dab-b714-b5b2baa26bad","Type":"ContainerStarted","Data":"5ac07a8dfc2c0d6d538f5fde06f5c7806d78d08f4964c253a85bccef288caf3e"} Mar 09 13:42:07 crc kubenswrapper[4764]: I0309 13:42:07.830203 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-tchwl" event={"ID":"66003ca3-e579-4dab-b714-b5b2baa26bad","Type":"ContainerStarted","Data":"1b4ee63c3be90245829c57ff18357c6790f696fc4922e58488a0b15d231d7670"} Mar 09 13:42:07 crc kubenswrapper[4764]: I0309 13:42:07.838258 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-db-create-tbf9j" podStartSLOduration=1.838234111 podStartE2EDuration="1.838234111s" podCreationTimestamp="2026-03-09 13:42:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:42:07.832461046 +0000 UTC m=+1283.082632954" watchObservedRunningTime="2026-03-09 13:42:07.838234111 +0000 UTC m=+1283.088406039" Mar 09 13:42:07 crc kubenswrapper[4764]: I0309 13:42:07.845267 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4cafd43e-a12e-46ee-8108-8e33d10c47ee","Type":"ContainerStarted","Data":"5e72510b160d74d50bc0d91ad305ff42b401476dfb0ab2426d6d04bc04d5d679"} Mar 09 13:42:07 crc kubenswrapper[4764]: I0309 13:42:07.845624 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="4cafd43e-a12e-46ee-8108-8e33d10c47ee" 
containerName="ceilometer-central-agent" containerID="cri-o://8584bf911d27e4b234249d90cbfea6653767b26c85f2f4fa5eeca9b1ef3d93a2" gracePeriod=30 Mar 09 13:42:07 crc kubenswrapper[4764]: I0309 13:42:07.845701 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 09 13:42:07 crc kubenswrapper[4764]: I0309 13:42:07.845799 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="4cafd43e-a12e-46ee-8108-8e33d10c47ee" containerName="proxy-httpd" containerID="cri-o://5e72510b160d74d50bc0d91ad305ff42b401476dfb0ab2426d6d04bc04d5d679" gracePeriod=30 Mar 09 13:42:07 crc kubenswrapper[4764]: I0309 13:42:07.845874 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="4cafd43e-a12e-46ee-8108-8e33d10c47ee" containerName="sg-core" containerID="cri-o://ef59562971287c90872fa0c6501fb93b73e7422eb42b03a1776c5d4727a706c9" gracePeriod=30 Mar 09 13:42:07 crc kubenswrapper[4764]: I0309 13:42:07.845933 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="4cafd43e-a12e-46ee-8108-8e33d10c47ee" containerName="ceilometer-notification-agent" containerID="cri-o://77c79186eac3b625857450d160ad883bb9d71ca93dc8e2e59f473a386200af76" gracePeriod=30 Mar 09 13:42:07 crc kubenswrapper[4764]: I0309 13:42:07.853811 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-mnqg7" event={"ID":"a75ea85a-1e66-4e8d-92d7-6f9b766abfda","Type":"ContainerStarted","Data":"17c99031bb7ad39e3a6bf2e953eeaec9098edceedb7c2ac4d40bb3b9857101a1"} Mar 09 13:42:07 crc kubenswrapper[4764]: I0309 13:42:07.853875 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-mnqg7" event={"ID":"a75ea85a-1e66-4e8d-92d7-6f9b766abfda","Type":"ContainerStarted","Data":"ab77b0a3ad0dcdcd518820673440ab1d67aaea30117cd3647b19e8df2313926c"} Mar 09 13:42:07 crc 
kubenswrapper[4764]: I0309 13:42:07.944809 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.59244328 podStartE2EDuration="6.944779831s" podCreationTimestamp="2026-03-09 13:42:01 +0000 UTC" firstStartedPulling="2026-03-09 13:42:02.516159824 +0000 UTC m=+1277.766331732" lastFinishedPulling="2026-03-09 13:42:06.868496375 +0000 UTC m=+1282.118668283" observedRunningTime="2026-03-09 13:42:07.887892045 +0000 UTC m=+1283.138063953" watchObservedRunningTime="2026-03-09 13:42:07.944779831 +0000 UTC m=+1283.194951749" Mar 09 13:42:07 crc kubenswrapper[4764]: I0309 13:42:07.957939 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-db-create-mnqg7" podStartSLOduration=1.95788655 podStartE2EDuration="1.95788655s" podCreationTimestamp="2026-03-09 13:42:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:42:07.91040232 +0000 UTC m=+1283.160574228" watchObservedRunningTime="2026-03-09 13:42:07.95788655 +0000 UTC m=+1283.208058468" Mar 09 13:42:08 crc kubenswrapper[4764]: I0309 13:42:08.878795 4764 generic.go:334] "Generic (PLEG): container finished" podID="b5daba6a-a01a-4400-aa87-01f9efd3abd8" containerID="68e1ea9724cff446a48722f5d3beb35f95107e4f92a2482a35467b3ee65b6d69" exitCode=0 Mar 09 13:42:08 crc kubenswrapper[4764]: I0309 13:42:08.879246 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-78eb-account-create-update-2dqgt" event={"ID":"b5daba6a-a01a-4400-aa87-01f9efd3abd8","Type":"ContainerDied","Data":"68e1ea9724cff446a48722f5d3beb35f95107e4f92a2482a35467b3ee65b6d69"} Mar 09 13:42:08 crc kubenswrapper[4764]: I0309 13:42:08.891239 4764 generic.go:334] "Generic (PLEG): container finished" podID="5d6e8b3d-dc6c-4b1a-946f-1392d1a5222c" containerID="2c7f017bce7c92c14d6be609c4899f3333d712b1c6dc036ede4a982ad2477f70" exitCode=0 Mar 09 
13:42:08 crc kubenswrapper[4764]: I0309 13:42:08.891362 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cc98-account-create-update-sjfqm" event={"ID":"5d6e8b3d-dc6c-4b1a-946f-1392d1a5222c","Type":"ContainerDied","Data":"2c7f017bce7c92c14d6be609c4899f3333d712b1c6dc036ede4a982ad2477f70"} Mar 09 13:42:08 crc kubenswrapper[4764]: I0309 13:42:08.903482 4764 generic.go:334] "Generic (PLEG): container finished" podID="3db8af07-1310-4cd5-be07-3fd062fe89a7" containerID="d053ca844a6ffecdf233cee0796c2188368f4f6b2a5097474b6a14b438549f4b" exitCode=0 Mar 09 13:42:08 crc kubenswrapper[4764]: I0309 13:42:08.903618 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-f7d8-account-create-update-rp748" event={"ID":"3db8af07-1310-4cd5-be07-3fd062fe89a7","Type":"ContainerDied","Data":"d053ca844a6ffecdf233cee0796c2188368f4f6b2a5097474b6a14b438549f4b"} Mar 09 13:42:08 crc kubenswrapper[4764]: I0309 13:42:08.903691 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-f7d8-account-create-update-rp748" event={"ID":"3db8af07-1310-4cd5-be07-3fd062fe89a7","Type":"ContainerStarted","Data":"460f2f7700725f04372e42b1bbec425c0f8c5db8d2dfaf33891c7041869dfce9"} Mar 09 13:42:08 crc kubenswrapper[4764]: I0309 13:42:08.915454 4764 generic.go:334] "Generic (PLEG): container finished" podID="8fa35355-06e1-403f-9691-92398769ac09" containerID="ea1b24536350a8debd8d3a73db534419bac8b767469b0a62084cf6812d4fbd33" exitCode=0 Mar 09 13:42:08 crc kubenswrapper[4764]: I0309 13:42:08.915534 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-tbf9j" event={"ID":"8fa35355-06e1-403f-9691-92398769ac09","Type":"ContainerDied","Data":"ea1b24536350a8debd8d3a73db534419bac8b767469b0a62084cf6812d4fbd33"} Mar 09 13:42:08 crc kubenswrapper[4764]: I0309 13:42:08.922458 4764 generic.go:334] "Generic (PLEG): container finished" podID="66003ca3-e579-4dab-b714-b5b2baa26bad" 
containerID="5ac07a8dfc2c0d6d538f5fde06f5c7806d78d08f4964c253a85bccef288caf3e" exitCode=0 Mar 09 13:42:08 crc kubenswrapper[4764]: I0309 13:42:08.922656 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-tchwl" event={"ID":"66003ca3-e579-4dab-b714-b5b2baa26bad","Type":"ContainerDied","Data":"5ac07a8dfc2c0d6d538f5fde06f5c7806d78d08f4964c253a85bccef288caf3e"} Mar 09 13:42:08 crc kubenswrapper[4764]: I0309 13:42:08.948608 4764 generic.go:334] "Generic (PLEG): container finished" podID="4cafd43e-a12e-46ee-8108-8e33d10c47ee" containerID="5e72510b160d74d50bc0d91ad305ff42b401476dfb0ab2426d6d04bc04d5d679" exitCode=0 Mar 09 13:42:08 crc kubenswrapper[4764]: I0309 13:42:08.948663 4764 generic.go:334] "Generic (PLEG): container finished" podID="4cafd43e-a12e-46ee-8108-8e33d10c47ee" containerID="ef59562971287c90872fa0c6501fb93b73e7422eb42b03a1776c5d4727a706c9" exitCode=2 Mar 09 13:42:08 crc kubenswrapper[4764]: I0309 13:42:08.948673 4764 generic.go:334] "Generic (PLEG): container finished" podID="4cafd43e-a12e-46ee-8108-8e33d10c47ee" containerID="77c79186eac3b625857450d160ad883bb9d71ca93dc8e2e59f473a386200af76" exitCode=0 Mar 09 13:42:08 crc kubenswrapper[4764]: I0309 13:42:08.948749 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4cafd43e-a12e-46ee-8108-8e33d10c47ee","Type":"ContainerDied","Data":"5e72510b160d74d50bc0d91ad305ff42b401476dfb0ab2426d6d04bc04d5d679"} Mar 09 13:42:08 crc kubenswrapper[4764]: I0309 13:42:08.948783 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4cafd43e-a12e-46ee-8108-8e33d10c47ee","Type":"ContainerDied","Data":"ef59562971287c90872fa0c6501fb93b73e7422eb42b03a1776c5d4727a706c9"} Mar 09 13:42:08 crc kubenswrapper[4764]: I0309 13:42:08.948798 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"4cafd43e-a12e-46ee-8108-8e33d10c47ee","Type":"ContainerDied","Data":"77c79186eac3b625857450d160ad883bb9d71ca93dc8e2e59f473a386200af76"} Mar 09 13:42:08 crc kubenswrapper[4764]: I0309 13:42:08.969457 4764 generic.go:334] "Generic (PLEG): container finished" podID="a75ea85a-1e66-4e8d-92d7-6f9b766abfda" containerID="17c99031bb7ad39e3a6bf2e953eeaec9098edceedb7c2ac4d40bb3b9857101a1" exitCode=0 Mar 09 13:42:08 crc kubenswrapper[4764]: I0309 13:42:08.969600 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-mnqg7" event={"ID":"a75ea85a-1e66-4e8d-92d7-6f9b766abfda","Type":"ContainerDied","Data":"17c99031bb7ad39e3a6bf2e953eeaec9098edceedb7c2ac4d40bb3b9857101a1"} Mar 09 13:42:09 crc kubenswrapper[4764]: I0309 13:42:09.328141 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-tchwl" Mar 09 13:42:09 crc kubenswrapper[4764]: I0309 13:42:09.434839 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dwkjb\" (UniqueName: \"kubernetes.io/projected/66003ca3-e579-4dab-b714-b5b2baa26bad-kube-api-access-dwkjb\") pod \"66003ca3-e579-4dab-b714-b5b2baa26bad\" (UID: \"66003ca3-e579-4dab-b714-b5b2baa26bad\") " Mar 09 13:42:09 crc kubenswrapper[4764]: I0309 13:42:09.434990 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/66003ca3-e579-4dab-b714-b5b2baa26bad-operator-scripts\") pod \"66003ca3-e579-4dab-b714-b5b2baa26bad\" (UID: \"66003ca3-e579-4dab-b714-b5b2baa26bad\") " Mar 09 13:42:09 crc kubenswrapper[4764]: I0309 13:42:09.435438 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/66003ca3-e579-4dab-b714-b5b2baa26bad-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "66003ca3-e579-4dab-b714-b5b2baa26bad" (UID: "66003ca3-e579-4dab-b714-b5b2baa26bad"). 
InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:42:09 crc kubenswrapper[4764]: I0309 13:42:09.441558 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/66003ca3-e579-4dab-b714-b5b2baa26bad-kube-api-access-dwkjb" (OuterVolumeSpecName: "kube-api-access-dwkjb") pod "66003ca3-e579-4dab-b714-b5b2baa26bad" (UID: "66003ca3-e579-4dab-b714-b5b2baa26bad"). InnerVolumeSpecName "kube-api-access-dwkjb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:42:09 crc kubenswrapper[4764]: I0309 13:42:09.537764 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dwkjb\" (UniqueName: \"kubernetes.io/projected/66003ca3-e579-4dab-b714-b5b2baa26bad-kube-api-access-dwkjb\") on node \"crc\" DevicePath \"\"" Mar 09 13:42:09 crc kubenswrapper[4764]: I0309 13:42:09.538347 4764 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/66003ca3-e579-4dab-b714-b5b2baa26bad-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 13:42:09 crc kubenswrapper[4764]: I0309 13:42:09.979928 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-tchwl" event={"ID":"66003ca3-e579-4dab-b714-b5b2baa26bad","Type":"ContainerDied","Data":"1b4ee63c3be90245829c57ff18357c6790f696fc4922e58488a0b15d231d7670"} Mar 09 13:42:09 crc kubenswrapper[4764]: I0309 13:42:09.979986 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1b4ee63c3be90245829c57ff18357c6790f696fc4922e58488a0b15d231d7670" Mar 09 13:42:09 crc kubenswrapper[4764]: I0309 13:42:09.980195 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-tchwl" Mar 09 13:42:10 crc kubenswrapper[4764]: I0309 13:42:10.458429 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-f7d8-account-create-update-rp748" Mar 09 13:42:10 crc kubenswrapper[4764]: I0309 13:42:10.560069 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7jk6c\" (UniqueName: \"kubernetes.io/projected/3db8af07-1310-4cd5-be07-3fd062fe89a7-kube-api-access-7jk6c\") pod \"3db8af07-1310-4cd5-be07-3fd062fe89a7\" (UID: \"3db8af07-1310-4cd5-be07-3fd062fe89a7\") " Mar 09 13:42:10 crc kubenswrapper[4764]: I0309 13:42:10.560134 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3db8af07-1310-4cd5-be07-3fd062fe89a7-operator-scripts\") pod \"3db8af07-1310-4cd5-be07-3fd062fe89a7\" (UID: \"3db8af07-1310-4cd5-be07-3fd062fe89a7\") " Mar 09 13:42:10 crc kubenswrapper[4764]: I0309 13:42:10.561113 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3db8af07-1310-4cd5-be07-3fd062fe89a7-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "3db8af07-1310-4cd5-be07-3fd062fe89a7" (UID: "3db8af07-1310-4cd5-be07-3fd062fe89a7"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:42:10 crc kubenswrapper[4764]: I0309 13:42:10.567571 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3db8af07-1310-4cd5-be07-3fd062fe89a7-kube-api-access-7jk6c" (OuterVolumeSpecName: "kube-api-access-7jk6c") pod "3db8af07-1310-4cd5-be07-3fd062fe89a7" (UID: "3db8af07-1310-4cd5-be07-3fd062fe89a7"). InnerVolumeSpecName "kube-api-access-7jk6c". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:42:10 crc kubenswrapper[4764]: I0309 13:42:10.669156 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7jk6c\" (UniqueName: \"kubernetes.io/projected/3db8af07-1310-4cd5-be07-3fd062fe89a7-kube-api-access-7jk6c\") on node \"crc\" DevicePath \"\"" Mar 09 13:42:10 crc kubenswrapper[4764]: I0309 13:42:10.669498 4764 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3db8af07-1310-4cd5-be07-3fd062fe89a7-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 13:42:10 crc kubenswrapper[4764]: I0309 13:42:10.678533 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-78eb-account-create-update-2dqgt" Mar 09 13:42:10 crc kubenswrapper[4764]: I0309 13:42:10.687756 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-mnqg7" Mar 09 13:42:10 crc kubenswrapper[4764]: I0309 13:42:10.696160 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-tbf9j" Mar 09 13:42:10 crc kubenswrapper[4764]: I0309 13:42:10.712176 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cc98-account-create-update-sjfqm" Mar 09 13:42:10 crc kubenswrapper[4764]: I0309 13:42:10.770553 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b5daba6a-a01a-4400-aa87-01f9efd3abd8-operator-scripts\") pod \"b5daba6a-a01a-4400-aa87-01f9efd3abd8\" (UID: \"b5daba6a-a01a-4400-aa87-01f9efd3abd8\") " Mar 09 13:42:10 crc kubenswrapper[4764]: I0309 13:42:10.770674 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5d6e8b3d-dc6c-4b1a-946f-1392d1a5222c-operator-scripts\") pod \"5d6e8b3d-dc6c-4b1a-946f-1392d1a5222c\" (UID: \"5d6e8b3d-dc6c-4b1a-946f-1392d1a5222c\") " Mar 09 13:42:10 crc kubenswrapper[4764]: I0309 13:42:10.770717 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r55f9\" (UniqueName: \"kubernetes.io/projected/b5daba6a-a01a-4400-aa87-01f9efd3abd8-kube-api-access-r55f9\") pod \"b5daba6a-a01a-4400-aa87-01f9efd3abd8\" (UID: \"b5daba6a-a01a-4400-aa87-01f9efd3abd8\") " Mar 09 13:42:10 crc kubenswrapper[4764]: I0309 13:42:10.770869 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8fa35355-06e1-403f-9691-92398769ac09-operator-scripts\") pod \"8fa35355-06e1-403f-9691-92398769ac09\" (UID: \"8fa35355-06e1-403f-9691-92398769ac09\") " Mar 09 13:42:10 crc kubenswrapper[4764]: I0309 13:42:10.770923 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t2ffg\" (UniqueName: \"kubernetes.io/projected/a75ea85a-1e66-4e8d-92d7-6f9b766abfda-kube-api-access-t2ffg\") pod \"a75ea85a-1e66-4e8d-92d7-6f9b766abfda\" (UID: \"a75ea85a-1e66-4e8d-92d7-6f9b766abfda\") " Mar 09 13:42:10 crc kubenswrapper[4764]: I0309 13:42:10.770985 4764 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-jwv2d\" (UniqueName: \"kubernetes.io/projected/5d6e8b3d-dc6c-4b1a-946f-1392d1a5222c-kube-api-access-jwv2d\") pod \"5d6e8b3d-dc6c-4b1a-946f-1392d1a5222c\" (UID: \"5d6e8b3d-dc6c-4b1a-946f-1392d1a5222c\") " Mar 09 13:42:10 crc kubenswrapper[4764]: I0309 13:42:10.771040 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a75ea85a-1e66-4e8d-92d7-6f9b766abfda-operator-scripts\") pod \"a75ea85a-1e66-4e8d-92d7-6f9b766abfda\" (UID: \"a75ea85a-1e66-4e8d-92d7-6f9b766abfda\") " Mar 09 13:42:10 crc kubenswrapper[4764]: I0309 13:42:10.771128 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8d97l\" (UniqueName: \"kubernetes.io/projected/8fa35355-06e1-403f-9691-92398769ac09-kube-api-access-8d97l\") pod \"8fa35355-06e1-403f-9691-92398769ac09\" (UID: \"8fa35355-06e1-403f-9691-92398769ac09\") " Mar 09 13:42:10 crc kubenswrapper[4764]: I0309 13:42:10.771219 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b5daba6a-a01a-4400-aa87-01f9efd3abd8-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "b5daba6a-a01a-4400-aa87-01f9efd3abd8" (UID: "b5daba6a-a01a-4400-aa87-01f9efd3abd8"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:42:10 crc kubenswrapper[4764]: I0309 13:42:10.771231 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5d6e8b3d-dc6c-4b1a-946f-1392d1a5222c-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "5d6e8b3d-dc6c-4b1a-946f-1392d1a5222c" (UID: "5d6e8b3d-dc6c-4b1a-946f-1392d1a5222c"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:42:10 crc kubenswrapper[4764]: I0309 13:42:10.771292 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8fa35355-06e1-403f-9691-92398769ac09-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "8fa35355-06e1-403f-9691-92398769ac09" (UID: "8fa35355-06e1-403f-9691-92398769ac09"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:42:10 crc kubenswrapper[4764]: I0309 13:42:10.771720 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a75ea85a-1e66-4e8d-92d7-6f9b766abfda-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "a75ea85a-1e66-4e8d-92d7-6f9b766abfda" (UID: "a75ea85a-1e66-4e8d-92d7-6f9b766abfda"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:42:10 crc kubenswrapper[4764]: I0309 13:42:10.771758 4764 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b5daba6a-a01a-4400-aa87-01f9efd3abd8-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 13:42:10 crc kubenswrapper[4764]: I0309 13:42:10.771776 4764 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5d6e8b3d-dc6c-4b1a-946f-1392d1a5222c-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 13:42:10 crc kubenswrapper[4764]: I0309 13:42:10.771790 4764 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8fa35355-06e1-403f-9691-92398769ac09-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 13:42:10 crc kubenswrapper[4764]: I0309 13:42:10.776853 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5d6e8b3d-dc6c-4b1a-946f-1392d1a5222c-kube-api-access-jwv2d" 
(OuterVolumeSpecName: "kube-api-access-jwv2d") pod "5d6e8b3d-dc6c-4b1a-946f-1392d1a5222c" (UID: "5d6e8b3d-dc6c-4b1a-946f-1392d1a5222c"). InnerVolumeSpecName "kube-api-access-jwv2d". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:42:10 crc kubenswrapper[4764]: I0309 13:42:10.783094 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8fa35355-06e1-403f-9691-92398769ac09-kube-api-access-8d97l" (OuterVolumeSpecName: "kube-api-access-8d97l") pod "8fa35355-06e1-403f-9691-92398769ac09" (UID: "8fa35355-06e1-403f-9691-92398769ac09"). InnerVolumeSpecName "kube-api-access-8d97l". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:42:10 crc kubenswrapper[4764]: I0309 13:42:10.783449 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a75ea85a-1e66-4e8d-92d7-6f9b766abfda-kube-api-access-t2ffg" (OuterVolumeSpecName: "kube-api-access-t2ffg") pod "a75ea85a-1e66-4e8d-92d7-6f9b766abfda" (UID: "a75ea85a-1e66-4e8d-92d7-6f9b766abfda"). InnerVolumeSpecName "kube-api-access-t2ffg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:42:10 crc kubenswrapper[4764]: I0309 13:42:10.786945 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b5daba6a-a01a-4400-aa87-01f9efd3abd8-kube-api-access-r55f9" (OuterVolumeSpecName: "kube-api-access-r55f9") pod "b5daba6a-a01a-4400-aa87-01f9efd3abd8" (UID: "b5daba6a-a01a-4400-aa87-01f9efd3abd8"). InnerVolumeSpecName "kube-api-access-r55f9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:42:10 crc kubenswrapper[4764]: I0309 13:42:10.873836 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r55f9\" (UniqueName: \"kubernetes.io/projected/b5daba6a-a01a-4400-aa87-01f9efd3abd8-kube-api-access-r55f9\") on node \"crc\" DevicePath \"\"" Mar 09 13:42:10 crc kubenswrapper[4764]: I0309 13:42:10.873868 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t2ffg\" (UniqueName: \"kubernetes.io/projected/a75ea85a-1e66-4e8d-92d7-6f9b766abfda-kube-api-access-t2ffg\") on node \"crc\" DevicePath \"\"" Mar 09 13:42:10 crc kubenswrapper[4764]: I0309 13:42:10.873877 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jwv2d\" (UniqueName: \"kubernetes.io/projected/5d6e8b3d-dc6c-4b1a-946f-1392d1a5222c-kube-api-access-jwv2d\") on node \"crc\" DevicePath \"\"" Mar 09 13:42:10 crc kubenswrapper[4764]: I0309 13:42:10.873888 4764 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a75ea85a-1e66-4e8d-92d7-6f9b766abfda-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 13:42:10 crc kubenswrapper[4764]: I0309 13:42:10.873899 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8d97l\" (UniqueName: \"kubernetes.io/projected/8fa35355-06e1-403f-9691-92398769ac09-kube-api-access-8d97l\") on node \"crc\" DevicePath \"\"" Mar 09 13:42:11 crc kubenswrapper[4764]: I0309 13:42:11.017958 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-78eb-account-create-update-2dqgt" event={"ID":"b5daba6a-a01a-4400-aa87-01f9efd3abd8","Type":"ContainerDied","Data":"8b9629cb75484588167c111d3c7dbfcae1e4f0827b9d156b8c69932b90498b31"} Mar 09 13:42:11 crc kubenswrapper[4764]: I0309 13:42:11.018006 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8b9629cb75484588167c111d3c7dbfcae1e4f0827b9d156b8c69932b90498b31" 
Mar 09 13:42:11 crc kubenswrapper[4764]: I0309 13:42:11.018085 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-78eb-account-create-update-2dqgt" Mar 09 13:42:11 crc kubenswrapper[4764]: I0309 13:42:11.024464 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cc98-account-create-update-sjfqm" Mar 09 13:42:11 crc kubenswrapper[4764]: I0309 13:42:11.024546 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cc98-account-create-update-sjfqm" event={"ID":"5d6e8b3d-dc6c-4b1a-946f-1392d1a5222c","Type":"ContainerDied","Data":"d57d5ab16c67bc105085ecfbab0c639605f83f267c2a597f6e63924c995e5ba1"} Mar 09 13:42:11 crc kubenswrapper[4764]: I0309 13:42:11.024605 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d57d5ab16c67bc105085ecfbab0c639605f83f267c2a597f6e63924c995e5ba1" Mar 09 13:42:11 crc kubenswrapper[4764]: I0309 13:42:11.026304 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-f7d8-account-create-update-rp748" event={"ID":"3db8af07-1310-4cd5-be07-3fd062fe89a7","Type":"ContainerDied","Data":"460f2f7700725f04372e42b1bbec425c0f8c5db8d2dfaf33891c7041869dfce9"} Mar 09 13:42:11 crc kubenswrapper[4764]: I0309 13:42:11.026356 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="460f2f7700725f04372e42b1bbec425c0f8c5db8d2dfaf33891c7041869dfce9" Mar 09 13:42:11 crc kubenswrapper[4764]: I0309 13:42:11.026430 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-f7d8-account-create-update-rp748" Mar 09 13:42:11 crc kubenswrapper[4764]: I0309 13:42:11.034103 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-tbf9j" event={"ID":"8fa35355-06e1-403f-9691-92398769ac09","Type":"ContainerDied","Data":"7802efe10177c9d393bd5a30dc7dd873fbe98ff3606d74449c6a46a5501b117c"} Mar 09 13:42:11 crc kubenswrapper[4764]: I0309 13:42:11.034157 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7802efe10177c9d393bd5a30dc7dd873fbe98ff3606d74449c6a46a5501b117c" Mar 09 13:42:11 crc kubenswrapper[4764]: I0309 13:42:11.034274 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-tbf9j" Mar 09 13:42:11 crc kubenswrapper[4764]: I0309 13:42:11.038114 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-mnqg7" event={"ID":"a75ea85a-1e66-4e8d-92d7-6f9b766abfda","Type":"ContainerDied","Data":"ab77b0a3ad0dcdcd518820673440ab1d67aaea30117cd3647b19e8df2313926c"} Mar 09 13:42:11 crc kubenswrapper[4764]: I0309 13:42:11.038163 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ab77b0a3ad0dcdcd518820673440ab1d67aaea30117cd3647b19e8df2313926c" Mar 09 13:42:11 crc kubenswrapper[4764]: I0309 13:42:11.038259 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-mnqg7" Mar 09 13:42:11 crc kubenswrapper[4764]: I0309 13:42:11.712470 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-6b74bc6bc6-vsxl5" Mar 09 13:42:11 crc kubenswrapper[4764]: I0309 13:42:11.790923 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/50610296-d076-4c9f-ac34-a976202ce135-config\") pod \"50610296-d076-4c9f-ac34-a976202ce135\" (UID: \"50610296-d076-4c9f-ac34-a976202ce135\") " Mar 09 13:42:11 crc kubenswrapper[4764]: I0309 13:42:11.791001 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/50610296-d076-4c9f-ac34-a976202ce135-ovndb-tls-certs\") pod \"50610296-d076-4c9f-ac34-a976202ce135\" (UID: \"50610296-d076-4c9f-ac34-a976202ce135\") " Mar 09 13:42:11 crc kubenswrapper[4764]: I0309 13:42:11.791185 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/50610296-d076-4c9f-ac34-a976202ce135-httpd-config\") pod \"50610296-d076-4c9f-ac34-a976202ce135\" (UID: \"50610296-d076-4c9f-ac34-a976202ce135\") " Mar 09 13:42:11 crc kubenswrapper[4764]: I0309 13:42:11.791241 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xnqrq\" (UniqueName: \"kubernetes.io/projected/50610296-d076-4c9f-ac34-a976202ce135-kube-api-access-xnqrq\") pod \"50610296-d076-4c9f-ac34-a976202ce135\" (UID: \"50610296-d076-4c9f-ac34-a976202ce135\") " Mar 09 13:42:11 crc kubenswrapper[4764]: I0309 13:42:11.791352 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/50610296-d076-4c9f-ac34-a976202ce135-combined-ca-bundle\") pod \"50610296-d076-4c9f-ac34-a976202ce135\" (UID: \"50610296-d076-4c9f-ac34-a976202ce135\") " Mar 09 13:42:11 crc kubenswrapper[4764]: I0309 13:42:11.812013 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/50610296-d076-4c9f-ac34-a976202ce135-kube-api-access-xnqrq" (OuterVolumeSpecName: "kube-api-access-xnqrq") pod "50610296-d076-4c9f-ac34-a976202ce135" (UID: "50610296-d076-4c9f-ac34-a976202ce135"). InnerVolumeSpecName "kube-api-access-xnqrq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:42:11 crc kubenswrapper[4764]: I0309 13:42:11.812017 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/50610296-d076-4c9f-ac34-a976202ce135-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "50610296-d076-4c9f-ac34-a976202ce135" (UID: "50610296-d076-4c9f-ac34-a976202ce135"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:42:11 crc kubenswrapper[4764]: I0309 13:42:11.864632 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/50610296-d076-4c9f-ac34-a976202ce135-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "50610296-d076-4c9f-ac34-a976202ce135" (UID: "50610296-d076-4c9f-ac34-a976202ce135"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:42:11 crc kubenswrapper[4764]: I0309 13:42:11.873229 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/50610296-d076-4c9f-ac34-a976202ce135-config" (OuterVolumeSpecName: "config") pod "50610296-d076-4c9f-ac34-a976202ce135" (UID: "50610296-d076-4c9f-ac34-a976202ce135"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:42:11 crc kubenswrapper[4764]: I0309 13:42:11.879844 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/50610296-d076-4c9f-ac34-a976202ce135-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "50610296-d076-4c9f-ac34-a976202ce135" (UID: "50610296-d076-4c9f-ac34-a976202ce135"). InnerVolumeSpecName "ovndb-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:42:11 crc kubenswrapper[4764]: I0309 13:42:11.896221 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xnqrq\" (UniqueName: \"kubernetes.io/projected/50610296-d076-4c9f-ac34-a976202ce135-kube-api-access-xnqrq\") on node \"crc\" DevicePath \"\"" Mar 09 13:42:11 crc kubenswrapper[4764]: I0309 13:42:11.896266 4764 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/50610296-d076-4c9f-ac34-a976202ce135-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 13:42:11 crc kubenswrapper[4764]: I0309 13:42:11.896278 4764 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/50610296-d076-4c9f-ac34-a976202ce135-config\") on node \"crc\" DevicePath \"\"" Mar 09 13:42:11 crc kubenswrapper[4764]: I0309 13:42:11.896288 4764 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/50610296-d076-4c9f-ac34-a976202ce135-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 09 13:42:11 crc kubenswrapper[4764]: I0309 13:42:11.896301 4764 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/50610296-d076-4c9f-ac34-a976202ce135-httpd-config\") on node \"crc\" DevicePath \"\"" Mar 09 13:42:12 crc kubenswrapper[4764]: I0309 13:42:12.048630 4764 generic.go:334] "Generic (PLEG): container finished" podID="50610296-d076-4c9f-ac34-a976202ce135" containerID="508370eb92a6887d3b1ecf26300791f931d8f328026a6caf81b9a1ce10c18bed" exitCode=0 Mar 09 13:42:12 crc kubenswrapper[4764]: I0309 13:42:12.048697 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6b74bc6bc6-vsxl5" event={"ID":"50610296-d076-4c9f-ac34-a976202ce135","Type":"ContainerDied","Data":"508370eb92a6887d3b1ecf26300791f931d8f328026a6caf81b9a1ce10c18bed"} Mar 09 13:42:12 crc kubenswrapper[4764]: I0309 
13:42:12.048724 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-6b74bc6bc6-vsxl5" Mar 09 13:42:12 crc kubenswrapper[4764]: I0309 13:42:12.048732 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6b74bc6bc6-vsxl5" event={"ID":"50610296-d076-4c9f-ac34-a976202ce135","Type":"ContainerDied","Data":"bd4c896bc38b604cb19726769c37db30c4145f3642057a166913e3d7cfd24c8f"} Mar 09 13:42:12 crc kubenswrapper[4764]: I0309 13:42:12.048752 4764 scope.go:117] "RemoveContainer" containerID="40fc71094cc1ee70bba232001951743f253a32a95993837575a49f9b6e385df9" Mar 09 13:42:12 crc kubenswrapper[4764]: I0309 13:42:12.078168 4764 scope.go:117] "RemoveContainer" containerID="508370eb92a6887d3b1ecf26300791f931d8f328026a6caf81b9a1ce10c18bed" Mar 09 13:42:12 crc kubenswrapper[4764]: I0309 13:42:12.093092 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-6b74bc6bc6-vsxl5"] Mar 09 13:42:12 crc kubenswrapper[4764]: I0309 13:42:12.104285 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-6b74bc6bc6-vsxl5"] Mar 09 13:42:12 crc kubenswrapper[4764]: I0309 13:42:12.112377 4764 scope.go:117] "RemoveContainer" containerID="40fc71094cc1ee70bba232001951743f253a32a95993837575a49f9b6e385df9" Mar 09 13:42:12 crc kubenswrapper[4764]: E0309 13:42:12.113048 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"40fc71094cc1ee70bba232001951743f253a32a95993837575a49f9b6e385df9\": container with ID starting with 40fc71094cc1ee70bba232001951743f253a32a95993837575a49f9b6e385df9 not found: ID does not exist" containerID="40fc71094cc1ee70bba232001951743f253a32a95993837575a49f9b6e385df9" Mar 09 13:42:12 crc kubenswrapper[4764]: I0309 13:42:12.113099 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"40fc71094cc1ee70bba232001951743f253a32a95993837575a49f9b6e385df9"} 
err="failed to get container status \"40fc71094cc1ee70bba232001951743f253a32a95993837575a49f9b6e385df9\": rpc error: code = NotFound desc = could not find container \"40fc71094cc1ee70bba232001951743f253a32a95993837575a49f9b6e385df9\": container with ID starting with 40fc71094cc1ee70bba232001951743f253a32a95993837575a49f9b6e385df9 not found: ID does not exist" Mar 09 13:42:12 crc kubenswrapper[4764]: I0309 13:42:12.113126 4764 scope.go:117] "RemoveContainer" containerID="508370eb92a6887d3b1ecf26300791f931d8f328026a6caf81b9a1ce10c18bed" Mar 09 13:42:12 crc kubenswrapper[4764]: E0309 13:42:12.113542 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"508370eb92a6887d3b1ecf26300791f931d8f328026a6caf81b9a1ce10c18bed\": container with ID starting with 508370eb92a6887d3b1ecf26300791f931d8f328026a6caf81b9a1ce10c18bed not found: ID does not exist" containerID="508370eb92a6887d3b1ecf26300791f931d8f328026a6caf81b9a1ce10c18bed" Mar 09 13:42:12 crc kubenswrapper[4764]: I0309 13:42:12.113586 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"508370eb92a6887d3b1ecf26300791f931d8f328026a6caf81b9a1ce10c18bed"} err="failed to get container status \"508370eb92a6887d3b1ecf26300791f931d8f328026a6caf81b9a1ce10c18bed\": rpc error: code = NotFound desc = could not find container \"508370eb92a6887d3b1ecf26300791f931d8f328026a6caf81b9a1ce10c18bed\": container with ID starting with 508370eb92a6887d3b1ecf26300791f931d8f328026a6caf81b9a1ce10c18bed not found: ID does not exist" Mar 09 13:42:13 crc kubenswrapper[4764]: I0309 13:42:13.376920 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 09 13:42:13 crc kubenswrapper[4764]: I0309 13:42:13.426563 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4cafd43e-a12e-46ee-8108-8e33d10c47ee-log-httpd\") pod \"4cafd43e-a12e-46ee-8108-8e33d10c47ee\" (UID: \"4cafd43e-a12e-46ee-8108-8e33d10c47ee\") " Mar 09 13:42:13 crc kubenswrapper[4764]: I0309 13:42:13.427099 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4cafd43e-a12e-46ee-8108-8e33d10c47ee-combined-ca-bundle\") pod \"4cafd43e-a12e-46ee-8108-8e33d10c47ee\" (UID: \"4cafd43e-a12e-46ee-8108-8e33d10c47ee\") " Mar 09 13:42:13 crc kubenswrapper[4764]: I0309 13:42:13.427178 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4cafd43e-a12e-46ee-8108-8e33d10c47ee-sg-core-conf-yaml\") pod \"4cafd43e-a12e-46ee-8108-8e33d10c47ee\" (UID: \"4cafd43e-a12e-46ee-8108-8e33d10c47ee\") " Mar 09 13:42:13 crc kubenswrapper[4764]: I0309 13:42:13.427213 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4cafd43e-a12e-46ee-8108-8e33d10c47ee-run-httpd\") pod \"4cafd43e-a12e-46ee-8108-8e33d10c47ee\" (UID: \"4cafd43e-a12e-46ee-8108-8e33d10c47ee\") " Mar 09 13:42:13 crc kubenswrapper[4764]: I0309 13:42:13.427276 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4cafd43e-a12e-46ee-8108-8e33d10c47ee-config-data\") pod \"4cafd43e-a12e-46ee-8108-8e33d10c47ee\" (UID: \"4cafd43e-a12e-46ee-8108-8e33d10c47ee\") " Mar 09 13:42:13 crc kubenswrapper[4764]: I0309 13:42:13.427300 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/4cafd43e-a12e-46ee-8108-8e33d10c47ee-scripts\") pod \"4cafd43e-a12e-46ee-8108-8e33d10c47ee\" (UID: \"4cafd43e-a12e-46ee-8108-8e33d10c47ee\") " Mar 09 13:42:13 crc kubenswrapper[4764]: I0309 13:42:13.427387 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x677c\" (UniqueName: \"kubernetes.io/projected/4cafd43e-a12e-46ee-8108-8e33d10c47ee-kube-api-access-x677c\") pod \"4cafd43e-a12e-46ee-8108-8e33d10c47ee\" (UID: \"4cafd43e-a12e-46ee-8108-8e33d10c47ee\") " Mar 09 13:42:13 crc kubenswrapper[4764]: I0309 13:42:13.432502 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4cafd43e-a12e-46ee-8108-8e33d10c47ee-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "4cafd43e-a12e-46ee-8108-8e33d10c47ee" (UID: "4cafd43e-a12e-46ee-8108-8e33d10c47ee"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 13:42:13 crc kubenswrapper[4764]: I0309 13:42:13.435593 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4cafd43e-a12e-46ee-8108-8e33d10c47ee-kube-api-access-x677c" (OuterVolumeSpecName: "kube-api-access-x677c") pod "4cafd43e-a12e-46ee-8108-8e33d10c47ee" (UID: "4cafd43e-a12e-46ee-8108-8e33d10c47ee"). InnerVolumeSpecName "kube-api-access-x677c". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:42:13 crc kubenswrapper[4764]: I0309 13:42:13.435929 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4cafd43e-a12e-46ee-8108-8e33d10c47ee-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "4cafd43e-a12e-46ee-8108-8e33d10c47ee" (UID: "4cafd43e-a12e-46ee-8108-8e33d10c47ee"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 13:42:13 crc kubenswrapper[4764]: I0309 13:42:13.442869 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4cafd43e-a12e-46ee-8108-8e33d10c47ee-scripts" (OuterVolumeSpecName: "scripts") pod "4cafd43e-a12e-46ee-8108-8e33d10c47ee" (UID: "4cafd43e-a12e-46ee-8108-8e33d10c47ee"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:42:13 crc kubenswrapper[4764]: I0309 13:42:13.461707 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4cafd43e-a12e-46ee-8108-8e33d10c47ee-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "4cafd43e-a12e-46ee-8108-8e33d10c47ee" (UID: "4cafd43e-a12e-46ee-8108-8e33d10c47ee"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:42:13 crc kubenswrapper[4764]: I0309 13:42:13.529093 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4cafd43e-a12e-46ee-8108-8e33d10c47ee-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4cafd43e-a12e-46ee-8108-8e33d10c47ee" (UID: "4cafd43e-a12e-46ee-8108-8e33d10c47ee"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:42:13 crc kubenswrapper[4764]: I0309 13:42:13.529634 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x677c\" (UniqueName: \"kubernetes.io/projected/4cafd43e-a12e-46ee-8108-8e33d10c47ee-kube-api-access-x677c\") on node \"crc\" DevicePath \"\"" Mar 09 13:42:13 crc kubenswrapper[4764]: I0309 13:42:13.529688 4764 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4cafd43e-a12e-46ee-8108-8e33d10c47ee-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 09 13:42:13 crc kubenswrapper[4764]: I0309 13:42:13.529701 4764 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4cafd43e-a12e-46ee-8108-8e33d10c47ee-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 13:42:13 crc kubenswrapper[4764]: I0309 13:42:13.529713 4764 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4cafd43e-a12e-46ee-8108-8e33d10c47ee-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 09 13:42:13 crc kubenswrapper[4764]: I0309 13:42:13.529723 4764 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4cafd43e-a12e-46ee-8108-8e33d10c47ee-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 09 13:42:13 crc kubenswrapper[4764]: I0309 13:42:13.529733 4764 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4cafd43e-a12e-46ee-8108-8e33d10c47ee-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 13:42:13 crc kubenswrapper[4764]: I0309 13:42:13.536955 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4cafd43e-a12e-46ee-8108-8e33d10c47ee-config-data" (OuterVolumeSpecName: "config-data") pod "4cafd43e-a12e-46ee-8108-8e33d10c47ee" (UID: "4cafd43e-a12e-46ee-8108-8e33d10c47ee"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:42:13 crc kubenswrapper[4764]: I0309 13:42:13.574229 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="50610296-d076-4c9f-ac34-a976202ce135" path="/var/lib/kubelet/pods/50610296-d076-4c9f-ac34-a976202ce135/volumes" Mar 09 13:42:13 crc kubenswrapper[4764]: I0309 13:42:13.631227 4764 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4cafd43e-a12e-46ee-8108-8e33d10c47ee-config-data\") on node \"crc\" DevicePath \"\"" Mar 09 13:42:13 crc kubenswrapper[4764]: I0309 13:42:13.992148 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Mar 09 13:42:14 crc kubenswrapper[4764]: I0309 13:42:14.078854 4764 generic.go:334] "Generic (PLEG): container finished" podID="4cafd43e-a12e-46ee-8108-8e33d10c47ee" containerID="8584bf911d27e4b234249d90cbfea6653767b26c85f2f4fa5eeca9b1ef3d93a2" exitCode=0 Mar 09 13:42:14 crc kubenswrapper[4764]: I0309 13:42:14.078924 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4cafd43e-a12e-46ee-8108-8e33d10c47ee","Type":"ContainerDied","Data":"8584bf911d27e4b234249d90cbfea6653767b26c85f2f4fa5eeca9b1ef3d93a2"} Mar 09 13:42:14 crc kubenswrapper[4764]: I0309 13:42:14.078968 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4cafd43e-a12e-46ee-8108-8e33d10c47ee","Type":"ContainerDied","Data":"5953a3b710032064c75479dd87c9f1404c8431fcc42a83d5bac920c5b52ac282"} Mar 09 13:42:14 crc kubenswrapper[4764]: I0309 13:42:14.079020 4764 scope.go:117] "RemoveContainer" containerID="5e72510b160d74d50bc0d91ad305ff42b401476dfb0ab2426d6d04bc04d5d679" Mar 09 13:42:14 crc kubenswrapper[4764]: I0309 13:42:14.079128 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 09 13:42:14 crc kubenswrapper[4764]: I0309 13:42:14.128741 4764 scope.go:117] "RemoveContainer" containerID="ef59562971287c90872fa0c6501fb93b73e7422eb42b03a1776c5d4727a706c9" Mar 09 13:42:14 crc kubenswrapper[4764]: I0309 13:42:14.130313 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 09 13:42:14 crc kubenswrapper[4764]: I0309 13:42:14.150838 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 09 13:42:14 crc kubenswrapper[4764]: I0309 13:42:14.159822 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 09 13:42:14 crc kubenswrapper[4764]: E0309 13:42:14.160308 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b5daba6a-a01a-4400-aa87-01f9efd3abd8" containerName="mariadb-account-create-update" Mar 09 13:42:14 crc kubenswrapper[4764]: I0309 13:42:14.160330 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="b5daba6a-a01a-4400-aa87-01f9efd3abd8" containerName="mariadb-account-create-update" Mar 09 13:42:14 crc kubenswrapper[4764]: E0309 13:42:14.160344 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a75ea85a-1e66-4e8d-92d7-6f9b766abfda" containerName="mariadb-database-create" Mar 09 13:42:14 crc kubenswrapper[4764]: I0309 13:42:14.160351 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="a75ea85a-1e66-4e8d-92d7-6f9b766abfda" containerName="mariadb-database-create" Mar 09 13:42:14 crc kubenswrapper[4764]: E0309 13:42:14.160363 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4cafd43e-a12e-46ee-8108-8e33d10c47ee" containerName="sg-core" Mar 09 13:42:14 crc kubenswrapper[4764]: I0309 13:42:14.160370 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="4cafd43e-a12e-46ee-8108-8e33d10c47ee" containerName="sg-core" Mar 09 13:42:14 crc kubenswrapper[4764]: E0309 13:42:14.160386 4764 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="5d6e8b3d-dc6c-4b1a-946f-1392d1a5222c" containerName="mariadb-account-create-update" Mar 09 13:42:14 crc kubenswrapper[4764]: I0309 13:42:14.160392 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d6e8b3d-dc6c-4b1a-946f-1392d1a5222c" containerName="mariadb-account-create-update" Mar 09 13:42:14 crc kubenswrapper[4764]: E0309 13:42:14.160403 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="50610296-d076-4c9f-ac34-a976202ce135" containerName="neutron-httpd" Mar 09 13:42:14 crc kubenswrapper[4764]: I0309 13:42:14.160410 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="50610296-d076-4c9f-ac34-a976202ce135" containerName="neutron-httpd" Mar 09 13:42:14 crc kubenswrapper[4764]: E0309 13:42:14.160420 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="66003ca3-e579-4dab-b714-b5b2baa26bad" containerName="mariadb-database-create" Mar 09 13:42:14 crc kubenswrapper[4764]: I0309 13:42:14.160427 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="66003ca3-e579-4dab-b714-b5b2baa26bad" containerName="mariadb-database-create" Mar 09 13:42:14 crc kubenswrapper[4764]: E0309 13:42:14.160442 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4cafd43e-a12e-46ee-8108-8e33d10c47ee" containerName="proxy-httpd" Mar 09 13:42:14 crc kubenswrapper[4764]: I0309 13:42:14.160449 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="4cafd43e-a12e-46ee-8108-8e33d10c47ee" containerName="proxy-httpd" Mar 09 13:42:14 crc kubenswrapper[4764]: E0309 13:42:14.160463 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4cafd43e-a12e-46ee-8108-8e33d10c47ee" containerName="ceilometer-central-agent" Mar 09 13:42:14 crc kubenswrapper[4764]: I0309 13:42:14.160468 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="4cafd43e-a12e-46ee-8108-8e33d10c47ee" containerName="ceilometer-central-agent" Mar 09 13:42:14 crc kubenswrapper[4764]: E0309 13:42:14.160479 4764 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4cafd43e-a12e-46ee-8108-8e33d10c47ee" containerName="ceilometer-notification-agent" Mar 09 13:42:14 crc kubenswrapper[4764]: I0309 13:42:14.160485 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="4cafd43e-a12e-46ee-8108-8e33d10c47ee" containerName="ceilometer-notification-agent" Mar 09 13:42:14 crc kubenswrapper[4764]: E0309 13:42:14.160495 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="50610296-d076-4c9f-ac34-a976202ce135" containerName="neutron-api" Mar 09 13:42:14 crc kubenswrapper[4764]: I0309 13:42:14.160502 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="50610296-d076-4c9f-ac34-a976202ce135" containerName="neutron-api" Mar 09 13:42:14 crc kubenswrapper[4764]: E0309 13:42:14.160508 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3db8af07-1310-4cd5-be07-3fd062fe89a7" containerName="mariadb-account-create-update" Mar 09 13:42:14 crc kubenswrapper[4764]: I0309 13:42:14.160516 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="3db8af07-1310-4cd5-be07-3fd062fe89a7" containerName="mariadb-account-create-update" Mar 09 13:42:14 crc kubenswrapper[4764]: E0309 13:42:14.160525 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8fa35355-06e1-403f-9691-92398769ac09" containerName="mariadb-database-create" Mar 09 13:42:14 crc kubenswrapper[4764]: I0309 13:42:14.160531 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="8fa35355-06e1-403f-9691-92398769ac09" containerName="mariadb-database-create" Mar 09 13:42:14 crc kubenswrapper[4764]: I0309 13:42:14.160767 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="4cafd43e-a12e-46ee-8108-8e33d10c47ee" containerName="ceilometer-central-agent" Mar 09 13:42:14 crc kubenswrapper[4764]: I0309 13:42:14.160781 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="50610296-d076-4c9f-ac34-a976202ce135" containerName="neutron-httpd" 
Mar 09 13:42:14 crc kubenswrapper[4764]: I0309 13:42:14.160795 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="66003ca3-e579-4dab-b714-b5b2baa26bad" containerName="mariadb-database-create" Mar 09 13:42:14 crc kubenswrapper[4764]: I0309 13:42:14.160810 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="4cafd43e-a12e-46ee-8108-8e33d10c47ee" containerName="sg-core" Mar 09 13:42:14 crc kubenswrapper[4764]: I0309 13:42:14.160824 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="a75ea85a-1e66-4e8d-92d7-6f9b766abfda" containerName="mariadb-database-create" Mar 09 13:42:14 crc kubenswrapper[4764]: I0309 13:42:14.160832 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="4cafd43e-a12e-46ee-8108-8e33d10c47ee" containerName="proxy-httpd" Mar 09 13:42:14 crc kubenswrapper[4764]: I0309 13:42:14.160841 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="3db8af07-1310-4cd5-be07-3fd062fe89a7" containerName="mariadb-account-create-update" Mar 09 13:42:14 crc kubenswrapper[4764]: I0309 13:42:14.160852 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="b5daba6a-a01a-4400-aa87-01f9efd3abd8" containerName="mariadb-account-create-update" Mar 09 13:42:14 crc kubenswrapper[4764]: I0309 13:42:14.160864 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="4cafd43e-a12e-46ee-8108-8e33d10c47ee" containerName="ceilometer-notification-agent" Mar 09 13:42:14 crc kubenswrapper[4764]: I0309 13:42:14.160874 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="50610296-d076-4c9f-ac34-a976202ce135" containerName="neutron-api" Mar 09 13:42:14 crc kubenswrapper[4764]: I0309 13:42:14.160886 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="8fa35355-06e1-403f-9691-92398769ac09" containerName="mariadb-database-create" Mar 09 13:42:14 crc kubenswrapper[4764]: I0309 13:42:14.160898 4764 memory_manager.go:354] "RemoveStaleState removing 
state" podUID="5d6e8b3d-dc6c-4b1a-946f-1392d1a5222c" containerName="mariadb-account-create-update" Mar 09 13:42:14 crc kubenswrapper[4764]: I0309 13:42:14.162462 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 09 13:42:14 crc kubenswrapper[4764]: I0309 13:42:14.167105 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 09 13:42:14 crc kubenswrapper[4764]: I0309 13:42:14.167442 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 09 13:42:14 crc kubenswrapper[4764]: I0309 13:42:14.167467 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Mar 09 13:42:14 crc kubenswrapper[4764]: I0309 13:42:14.169775 4764 scope.go:117] "RemoveContainer" containerID="77c79186eac3b625857450d160ad883bb9d71ca93dc8e2e59f473a386200af76" Mar 09 13:42:14 crc kubenswrapper[4764]: I0309 13:42:14.173112 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 09 13:42:14 crc kubenswrapper[4764]: I0309 13:42:14.199322 4764 scope.go:117] "RemoveContainer" containerID="8584bf911d27e4b234249d90cbfea6653767b26c85f2f4fa5eeca9b1ef3d93a2" Mar 09 13:42:14 crc kubenswrapper[4764]: I0309 13:42:14.216947 4764 scope.go:117] "RemoveContainer" containerID="5e72510b160d74d50bc0d91ad305ff42b401476dfb0ab2426d6d04bc04d5d679" Mar 09 13:42:14 crc kubenswrapper[4764]: E0309 13:42:14.217403 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5e72510b160d74d50bc0d91ad305ff42b401476dfb0ab2426d6d04bc04d5d679\": container with ID starting with 5e72510b160d74d50bc0d91ad305ff42b401476dfb0ab2426d6d04bc04d5d679 not found: ID does not exist" containerID="5e72510b160d74d50bc0d91ad305ff42b401476dfb0ab2426d6d04bc04d5d679" Mar 09 13:42:14 crc kubenswrapper[4764]: I0309 13:42:14.217436 4764 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5e72510b160d74d50bc0d91ad305ff42b401476dfb0ab2426d6d04bc04d5d679"} err="failed to get container status \"5e72510b160d74d50bc0d91ad305ff42b401476dfb0ab2426d6d04bc04d5d679\": rpc error: code = NotFound desc = could not find container \"5e72510b160d74d50bc0d91ad305ff42b401476dfb0ab2426d6d04bc04d5d679\": container with ID starting with 5e72510b160d74d50bc0d91ad305ff42b401476dfb0ab2426d6d04bc04d5d679 not found: ID does not exist" Mar 09 13:42:14 crc kubenswrapper[4764]: I0309 13:42:14.217458 4764 scope.go:117] "RemoveContainer" containerID="ef59562971287c90872fa0c6501fb93b73e7422eb42b03a1776c5d4727a706c9" Mar 09 13:42:14 crc kubenswrapper[4764]: E0309 13:42:14.217677 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ef59562971287c90872fa0c6501fb93b73e7422eb42b03a1776c5d4727a706c9\": container with ID starting with ef59562971287c90872fa0c6501fb93b73e7422eb42b03a1776c5d4727a706c9 not found: ID does not exist" containerID="ef59562971287c90872fa0c6501fb93b73e7422eb42b03a1776c5d4727a706c9" Mar 09 13:42:14 crc kubenswrapper[4764]: I0309 13:42:14.217697 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ef59562971287c90872fa0c6501fb93b73e7422eb42b03a1776c5d4727a706c9"} err="failed to get container status \"ef59562971287c90872fa0c6501fb93b73e7422eb42b03a1776c5d4727a706c9\": rpc error: code = NotFound desc = could not find container \"ef59562971287c90872fa0c6501fb93b73e7422eb42b03a1776c5d4727a706c9\": container with ID starting with ef59562971287c90872fa0c6501fb93b73e7422eb42b03a1776c5d4727a706c9 not found: ID does not exist" Mar 09 13:42:14 crc kubenswrapper[4764]: I0309 13:42:14.217709 4764 scope.go:117] "RemoveContainer" containerID="77c79186eac3b625857450d160ad883bb9d71ca93dc8e2e59f473a386200af76" Mar 09 13:42:14 crc kubenswrapper[4764]: E0309 
13:42:14.217939 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"77c79186eac3b625857450d160ad883bb9d71ca93dc8e2e59f473a386200af76\": container with ID starting with 77c79186eac3b625857450d160ad883bb9d71ca93dc8e2e59f473a386200af76 not found: ID does not exist" containerID="77c79186eac3b625857450d160ad883bb9d71ca93dc8e2e59f473a386200af76" Mar 09 13:42:14 crc kubenswrapper[4764]: I0309 13:42:14.217961 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"77c79186eac3b625857450d160ad883bb9d71ca93dc8e2e59f473a386200af76"} err="failed to get container status \"77c79186eac3b625857450d160ad883bb9d71ca93dc8e2e59f473a386200af76\": rpc error: code = NotFound desc = could not find container \"77c79186eac3b625857450d160ad883bb9d71ca93dc8e2e59f473a386200af76\": container with ID starting with 77c79186eac3b625857450d160ad883bb9d71ca93dc8e2e59f473a386200af76 not found: ID does not exist" Mar 09 13:42:14 crc kubenswrapper[4764]: I0309 13:42:14.217973 4764 scope.go:117] "RemoveContainer" containerID="8584bf911d27e4b234249d90cbfea6653767b26c85f2f4fa5eeca9b1ef3d93a2" Mar 09 13:42:14 crc kubenswrapper[4764]: E0309 13:42:14.218187 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8584bf911d27e4b234249d90cbfea6653767b26c85f2f4fa5eeca9b1ef3d93a2\": container with ID starting with 8584bf911d27e4b234249d90cbfea6653767b26c85f2f4fa5eeca9b1ef3d93a2 not found: ID does not exist" containerID="8584bf911d27e4b234249d90cbfea6653767b26c85f2f4fa5eeca9b1ef3d93a2" Mar 09 13:42:14 crc kubenswrapper[4764]: I0309 13:42:14.218206 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8584bf911d27e4b234249d90cbfea6653767b26c85f2f4fa5eeca9b1ef3d93a2"} err="failed to get container status \"8584bf911d27e4b234249d90cbfea6653767b26c85f2f4fa5eeca9b1ef3d93a2\": rpc 
error: code = NotFound desc = could not find container \"8584bf911d27e4b234249d90cbfea6653767b26c85f2f4fa5eeca9b1ef3d93a2\": container with ID starting with 8584bf911d27e4b234249d90cbfea6653767b26c85f2f4fa5eeca9b1ef3d93a2 not found: ID does not exist" Mar 09 13:42:14 crc kubenswrapper[4764]: I0309 13:42:14.247704 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e78a4ead-5459-49a9-89f6-5e21ac1baa3c-scripts\") pod \"ceilometer-0\" (UID: \"e78a4ead-5459-49a9-89f6-5e21ac1baa3c\") " pod="openstack/ceilometer-0" Mar 09 13:42:14 crc kubenswrapper[4764]: I0309 13:42:14.247900 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/e78a4ead-5459-49a9-89f6-5e21ac1baa3c-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"e78a4ead-5459-49a9-89f6-5e21ac1baa3c\") " pod="openstack/ceilometer-0" Mar 09 13:42:14 crc kubenswrapper[4764]: I0309 13:42:14.248078 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e78a4ead-5459-49a9-89f6-5e21ac1baa3c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e78a4ead-5459-49a9-89f6-5e21ac1baa3c\") " pod="openstack/ceilometer-0" Mar 09 13:42:14 crc kubenswrapper[4764]: I0309 13:42:14.248437 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e78a4ead-5459-49a9-89f6-5e21ac1baa3c-log-httpd\") pod \"ceilometer-0\" (UID: \"e78a4ead-5459-49a9-89f6-5e21ac1baa3c\") " pod="openstack/ceilometer-0" Mar 09 13:42:14 crc kubenswrapper[4764]: I0309 13:42:14.248556 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/e78a4ead-5459-49a9-89f6-5e21ac1baa3c-config-data\") pod \"ceilometer-0\" (UID: \"e78a4ead-5459-49a9-89f6-5e21ac1baa3c\") " pod="openstack/ceilometer-0" Mar 09 13:42:14 crc kubenswrapper[4764]: I0309 13:42:14.248637 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e78a4ead-5459-49a9-89f6-5e21ac1baa3c-run-httpd\") pod \"ceilometer-0\" (UID: \"e78a4ead-5459-49a9-89f6-5e21ac1baa3c\") " pod="openstack/ceilometer-0" Mar 09 13:42:14 crc kubenswrapper[4764]: I0309 13:42:14.248869 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4m8ll\" (UniqueName: \"kubernetes.io/projected/e78a4ead-5459-49a9-89f6-5e21ac1baa3c-kube-api-access-4m8ll\") pod \"ceilometer-0\" (UID: \"e78a4ead-5459-49a9-89f6-5e21ac1baa3c\") " pod="openstack/ceilometer-0" Mar 09 13:42:14 crc kubenswrapper[4764]: I0309 13:42:14.248981 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e78a4ead-5459-49a9-89f6-5e21ac1baa3c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e78a4ead-5459-49a9-89f6-5e21ac1baa3c\") " pod="openstack/ceilometer-0" Mar 09 13:42:14 crc kubenswrapper[4764]: I0309 13:42:14.350607 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e78a4ead-5459-49a9-89f6-5e21ac1baa3c-log-httpd\") pod \"ceilometer-0\" (UID: \"e78a4ead-5459-49a9-89f6-5e21ac1baa3c\") " pod="openstack/ceilometer-0" Mar 09 13:42:14 crc kubenswrapper[4764]: I0309 13:42:14.350688 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e78a4ead-5459-49a9-89f6-5e21ac1baa3c-config-data\") pod \"ceilometer-0\" (UID: \"e78a4ead-5459-49a9-89f6-5e21ac1baa3c\") " 
pod="openstack/ceilometer-0" Mar 09 13:42:14 crc kubenswrapper[4764]: I0309 13:42:14.350713 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e78a4ead-5459-49a9-89f6-5e21ac1baa3c-run-httpd\") pod \"ceilometer-0\" (UID: \"e78a4ead-5459-49a9-89f6-5e21ac1baa3c\") " pod="openstack/ceilometer-0" Mar 09 13:42:14 crc kubenswrapper[4764]: I0309 13:42:14.350766 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4m8ll\" (UniqueName: \"kubernetes.io/projected/e78a4ead-5459-49a9-89f6-5e21ac1baa3c-kube-api-access-4m8ll\") pod \"ceilometer-0\" (UID: \"e78a4ead-5459-49a9-89f6-5e21ac1baa3c\") " pod="openstack/ceilometer-0" Mar 09 13:42:14 crc kubenswrapper[4764]: I0309 13:42:14.350803 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e78a4ead-5459-49a9-89f6-5e21ac1baa3c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e78a4ead-5459-49a9-89f6-5e21ac1baa3c\") " pod="openstack/ceilometer-0" Mar 09 13:42:14 crc kubenswrapper[4764]: I0309 13:42:14.350837 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e78a4ead-5459-49a9-89f6-5e21ac1baa3c-scripts\") pod \"ceilometer-0\" (UID: \"e78a4ead-5459-49a9-89f6-5e21ac1baa3c\") " pod="openstack/ceilometer-0" Mar 09 13:42:14 crc kubenswrapper[4764]: I0309 13:42:14.350865 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/e78a4ead-5459-49a9-89f6-5e21ac1baa3c-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"e78a4ead-5459-49a9-89f6-5e21ac1baa3c\") " pod="openstack/ceilometer-0" Mar 09 13:42:14 crc kubenswrapper[4764]: I0309 13:42:14.350897 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/e78a4ead-5459-49a9-89f6-5e21ac1baa3c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e78a4ead-5459-49a9-89f6-5e21ac1baa3c\") " pod="openstack/ceilometer-0" Mar 09 13:42:14 crc kubenswrapper[4764]: I0309 13:42:14.351745 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e78a4ead-5459-49a9-89f6-5e21ac1baa3c-run-httpd\") pod \"ceilometer-0\" (UID: \"e78a4ead-5459-49a9-89f6-5e21ac1baa3c\") " pod="openstack/ceilometer-0" Mar 09 13:42:14 crc kubenswrapper[4764]: I0309 13:42:14.352282 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e78a4ead-5459-49a9-89f6-5e21ac1baa3c-log-httpd\") pod \"ceilometer-0\" (UID: \"e78a4ead-5459-49a9-89f6-5e21ac1baa3c\") " pod="openstack/ceilometer-0" Mar 09 13:42:14 crc kubenswrapper[4764]: I0309 13:42:14.355930 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/e78a4ead-5459-49a9-89f6-5e21ac1baa3c-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"e78a4ead-5459-49a9-89f6-5e21ac1baa3c\") " pod="openstack/ceilometer-0" Mar 09 13:42:14 crc kubenswrapper[4764]: I0309 13:42:14.356502 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e78a4ead-5459-49a9-89f6-5e21ac1baa3c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e78a4ead-5459-49a9-89f6-5e21ac1baa3c\") " pod="openstack/ceilometer-0" Mar 09 13:42:14 crc kubenswrapper[4764]: I0309 13:42:14.357543 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e78a4ead-5459-49a9-89f6-5e21ac1baa3c-scripts\") pod \"ceilometer-0\" (UID: \"e78a4ead-5459-49a9-89f6-5e21ac1baa3c\") " pod="openstack/ceilometer-0" Mar 09 13:42:14 crc kubenswrapper[4764]: I0309 13:42:14.365502 4764 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e78a4ead-5459-49a9-89f6-5e21ac1baa3c-config-data\") pod \"ceilometer-0\" (UID: \"e78a4ead-5459-49a9-89f6-5e21ac1baa3c\") " pod="openstack/ceilometer-0" Mar 09 13:42:14 crc kubenswrapper[4764]: I0309 13:42:14.371008 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4m8ll\" (UniqueName: \"kubernetes.io/projected/e78a4ead-5459-49a9-89f6-5e21ac1baa3c-kube-api-access-4m8ll\") pod \"ceilometer-0\" (UID: \"e78a4ead-5459-49a9-89f6-5e21ac1baa3c\") " pod="openstack/ceilometer-0" Mar 09 13:42:14 crc kubenswrapper[4764]: I0309 13:42:14.377879 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e78a4ead-5459-49a9-89f6-5e21ac1baa3c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e78a4ead-5459-49a9-89f6-5e21ac1baa3c\") " pod="openstack/ceilometer-0" Mar 09 13:42:14 crc kubenswrapper[4764]: I0309 13:42:14.481417 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 09 13:42:14 crc kubenswrapper[4764]: I0309 13:42:14.935920 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 09 13:42:15 crc kubenswrapper[4764]: I0309 13:42:15.089864 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e78a4ead-5459-49a9-89f6-5e21ac1baa3c","Type":"ContainerStarted","Data":"79e7c65f033121e5c29021bfaba325c95dd7684d4ddbd0796fab12986b9aed27"} Mar 09 13:42:15 crc kubenswrapper[4764]: I0309 13:42:15.584076 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4cafd43e-a12e-46ee-8108-8e33d10c47ee" path="/var/lib/kubelet/pods/4cafd43e-a12e-46ee-8108-8e33d10c47ee/volumes" Mar 09 13:42:16 crc kubenswrapper[4764]: I0309 13:42:16.119420 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e78a4ead-5459-49a9-89f6-5e21ac1baa3c","Type":"ContainerStarted","Data":"bb420468c16a3231754e16ddfa5b5dad7573dcd0fff5049f6f34e1d807166b01"} Mar 09 13:42:16 crc kubenswrapper[4764]: I0309 13:42:16.883938 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-kkcml"] Mar 09 13:42:16 crc kubenswrapper[4764]: I0309 13:42:16.885864 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-kkcml" Mar 09 13:42:16 crc kubenswrapper[4764]: I0309 13:42:16.889124 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts" Mar 09 13:42:16 crc kubenswrapper[4764]: I0309 13:42:16.889370 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Mar 09 13:42:16 crc kubenswrapper[4764]: I0309 13:42:16.897546 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-v8mzr" Mar 09 13:42:16 crc kubenswrapper[4764]: I0309 13:42:16.908871 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-kkcml"] Mar 09 13:42:17 crc kubenswrapper[4764]: I0309 13:42:17.013064 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c0a40476-ff1d-443d-846f-a54cd956aaa3-scripts\") pod \"nova-cell0-conductor-db-sync-kkcml\" (UID: \"c0a40476-ff1d-443d-846f-a54cd956aaa3\") " pod="openstack/nova-cell0-conductor-db-sync-kkcml" Mar 09 13:42:17 crc kubenswrapper[4764]: I0309 13:42:17.013420 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c0a40476-ff1d-443d-846f-a54cd956aaa3-config-data\") pod \"nova-cell0-conductor-db-sync-kkcml\" (UID: \"c0a40476-ff1d-443d-846f-a54cd956aaa3\") " pod="openstack/nova-cell0-conductor-db-sync-kkcml" Mar 09 13:42:17 crc kubenswrapper[4764]: I0309 13:42:17.013452 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-grsz2\" (UniqueName: \"kubernetes.io/projected/c0a40476-ff1d-443d-846f-a54cd956aaa3-kube-api-access-grsz2\") pod \"nova-cell0-conductor-db-sync-kkcml\" (UID: \"c0a40476-ff1d-443d-846f-a54cd956aaa3\") " 
pod="openstack/nova-cell0-conductor-db-sync-kkcml" Mar 09 13:42:17 crc kubenswrapper[4764]: I0309 13:42:17.013533 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c0a40476-ff1d-443d-846f-a54cd956aaa3-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-kkcml\" (UID: \"c0a40476-ff1d-443d-846f-a54cd956aaa3\") " pod="openstack/nova-cell0-conductor-db-sync-kkcml" Mar 09 13:42:17 crc kubenswrapper[4764]: I0309 13:42:17.115750 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-grsz2\" (UniqueName: \"kubernetes.io/projected/c0a40476-ff1d-443d-846f-a54cd956aaa3-kube-api-access-grsz2\") pod \"nova-cell0-conductor-db-sync-kkcml\" (UID: \"c0a40476-ff1d-443d-846f-a54cd956aaa3\") " pod="openstack/nova-cell0-conductor-db-sync-kkcml" Mar 09 13:42:17 crc kubenswrapper[4764]: I0309 13:42:17.115911 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c0a40476-ff1d-443d-846f-a54cd956aaa3-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-kkcml\" (UID: \"c0a40476-ff1d-443d-846f-a54cd956aaa3\") " pod="openstack/nova-cell0-conductor-db-sync-kkcml" Mar 09 13:42:17 crc kubenswrapper[4764]: I0309 13:42:17.115985 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c0a40476-ff1d-443d-846f-a54cd956aaa3-scripts\") pod \"nova-cell0-conductor-db-sync-kkcml\" (UID: \"c0a40476-ff1d-443d-846f-a54cd956aaa3\") " pod="openstack/nova-cell0-conductor-db-sync-kkcml" Mar 09 13:42:17 crc kubenswrapper[4764]: I0309 13:42:17.116037 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c0a40476-ff1d-443d-846f-a54cd956aaa3-config-data\") pod \"nova-cell0-conductor-db-sync-kkcml\" (UID: 
\"c0a40476-ff1d-443d-846f-a54cd956aaa3\") " pod="openstack/nova-cell0-conductor-db-sync-kkcml" Mar 09 13:42:17 crc kubenswrapper[4764]: I0309 13:42:17.123397 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c0a40476-ff1d-443d-846f-a54cd956aaa3-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-kkcml\" (UID: \"c0a40476-ff1d-443d-846f-a54cd956aaa3\") " pod="openstack/nova-cell0-conductor-db-sync-kkcml" Mar 09 13:42:17 crc kubenswrapper[4764]: I0309 13:42:17.125525 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c0a40476-ff1d-443d-846f-a54cd956aaa3-scripts\") pod \"nova-cell0-conductor-db-sync-kkcml\" (UID: \"c0a40476-ff1d-443d-846f-a54cd956aaa3\") " pod="openstack/nova-cell0-conductor-db-sync-kkcml" Mar 09 13:42:17 crc kubenswrapper[4764]: I0309 13:42:17.125808 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c0a40476-ff1d-443d-846f-a54cd956aaa3-config-data\") pod \"nova-cell0-conductor-db-sync-kkcml\" (UID: \"c0a40476-ff1d-443d-846f-a54cd956aaa3\") " pod="openstack/nova-cell0-conductor-db-sync-kkcml" Mar 09 13:42:17 crc kubenswrapper[4764]: I0309 13:42:17.133496 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e78a4ead-5459-49a9-89f6-5e21ac1baa3c","Type":"ContainerStarted","Data":"b214631e329516c4f150fb9c3cadae477dc2a4812e6916faa36fc494a5d835b8"} Mar 09 13:42:17 crc kubenswrapper[4764]: I0309 13:42:17.133534 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e78a4ead-5459-49a9-89f6-5e21ac1baa3c","Type":"ContainerStarted","Data":"1c0d0d684bf2cb708c8f8559ea9b7bec1b048fb2e4835892b259f454ffd824ea"} Mar 09 13:42:17 crc kubenswrapper[4764]: I0309 13:42:17.140439 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-grsz2\" (UniqueName: \"kubernetes.io/projected/c0a40476-ff1d-443d-846f-a54cd956aaa3-kube-api-access-grsz2\") pod \"nova-cell0-conductor-db-sync-kkcml\" (UID: \"c0a40476-ff1d-443d-846f-a54cd956aaa3\") " pod="openstack/nova-cell0-conductor-db-sync-kkcml" Mar 09 13:42:17 crc kubenswrapper[4764]: I0309 13:42:17.206809 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-kkcml" Mar 09 13:42:17 crc kubenswrapper[4764]: I0309 13:42:17.688636 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-kkcml"] Mar 09 13:42:18 crc kubenswrapper[4764]: I0309 13:42:18.161620 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-kkcml" event={"ID":"c0a40476-ff1d-443d-846f-a54cd956aaa3","Type":"ContainerStarted","Data":"185f2547ce3fbe2885edcb4f82761dd8558e20b523bd1229f310adc3829d04d5"} Mar 09 13:42:19 crc kubenswrapper[4764]: I0309 13:42:19.175620 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e78a4ead-5459-49a9-89f6-5e21ac1baa3c","Type":"ContainerStarted","Data":"bed8beb8204aa72aa2bde54753848668402830e464bf5f8c233fd5f8d9ccc674"} Mar 09 13:42:19 crc kubenswrapper[4764]: I0309 13:42:19.176157 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 09 13:42:19 crc kubenswrapper[4764]: I0309 13:42:19.208709 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.250911549 podStartE2EDuration="5.20868431s" podCreationTimestamp="2026-03-09 13:42:14 +0000 UTC" firstStartedPulling="2026-03-09 13:42:14.949379241 +0000 UTC m=+1290.199551159" lastFinishedPulling="2026-03-09 13:42:18.907152012 +0000 UTC m=+1294.157323920" observedRunningTime="2026-03-09 13:42:19.199311555 +0000 UTC m=+1294.449483473" watchObservedRunningTime="2026-03-09 13:42:19.20868431 +0000 
UTC m=+1294.458856218"
Mar 09 13:42:26 crc kubenswrapper[4764]: I0309 13:42:26.265164 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-kkcml" event={"ID":"c0a40476-ff1d-443d-846f-a54cd956aaa3","Type":"ContainerStarted","Data":"01f8a44cba8ccff51deb3a73f62a17d6b208f1f8fb17e65f80892f3b0a90b431"}
Mar 09 13:42:26 crc kubenswrapper[4764]: I0309 13:42:26.288017 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-kkcml" podStartSLOduration=2.04921348 podStartE2EDuration="10.287999013s" podCreationTimestamp="2026-03-09 13:42:16 +0000 UTC" firstStartedPulling="2026-03-09 13:42:17.711550105 +0000 UTC m=+1292.961722013" lastFinishedPulling="2026-03-09 13:42:25.950335638 +0000 UTC m=+1301.200507546" observedRunningTime="2026-03-09 13:42:26.281768947 +0000 UTC m=+1301.531940855" watchObservedRunningTime="2026-03-09 13:42:26.287999013 +0000 UTC m=+1301.538170921"
Mar 09 13:42:30 crc kubenswrapper[4764]: I0309 13:42:30.998886 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Mar 09 13:42:31 crc kubenswrapper[4764]: I0309 13:42:31.000203 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e78a4ead-5459-49a9-89f6-5e21ac1baa3c" containerName="sg-core" containerID="cri-o://b214631e329516c4f150fb9c3cadae477dc2a4812e6916faa36fc494a5d835b8" gracePeriod=30
Mar 09 13:42:31 crc kubenswrapper[4764]: I0309 13:42:31.000192 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e78a4ead-5459-49a9-89f6-5e21ac1baa3c" containerName="ceilometer-central-agent" containerID="cri-o://bb420468c16a3231754e16ddfa5b5dad7573dcd0fff5049f6f34e1d807166b01" gracePeriod=30
Mar 09 13:42:31 crc kubenswrapper[4764]: I0309 13:42:31.000408 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e78a4ead-5459-49a9-89f6-5e21ac1baa3c" containerName="proxy-httpd" containerID="cri-o://bed8beb8204aa72aa2bde54753848668402830e464bf5f8c233fd5f8d9ccc674" gracePeriod=30
Mar 09 13:42:31 crc kubenswrapper[4764]: I0309 13:42:31.000441 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e78a4ead-5459-49a9-89f6-5e21ac1baa3c" containerName="ceilometer-notification-agent" containerID="cri-o://1c0d0d684bf2cb708c8f8559ea9b7bec1b048fb2e4835892b259f454ffd824ea" gracePeriod=30
Mar 09 13:42:31 crc kubenswrapper[4764]: I0309 13:42:31.015361 4764 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="e78a4ead-5459-49a9-89f6-5e21ac1baa3c" containerName="proxy-httpd" probeResult="failure" output="Get \"https://10.217.0.172:3000/\": EOF"
Mar 09 13:42:31 crc kubenswrapper[4764]: I0309 13:42:31.317929 4764 generic.go:334] "Generic (PLEG): container finished" podID="e78a4ead-5459-49a9-89f6-5e21ac1baa3c" containerID="bed8beb8204aa72aa2bde54753848668402830e464bf5f8c233fd5f8d9ccc674" exitCode=0
Mar 09 13:42:31 crc kubenswrapper[4764]: I0309 13:42:31.317973 4764 generic.go:334] "Generic (PLEG): container finished" podID="e78a4ead-5459-49a9-89f6-5e21ac1baa3c" containerID="b214631e329516c4f150fb9c3cadae477dc2a4812e6916faa36fc494a5d835b8" exitCode=2
Mar 09 13:42:31 crc kubenswrapper[4764]: I0309 13:42:31.318004 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e78a4ead-5459-49a9-89f6-5e21ac1baa3c","Type":"ContainerDied","Data":"bed8beb8204aa72aa2bde54753848668402830e464bf5f8c233fd5f8d9ccc674"}
Mar 09 13:42:31 crc kubenswrapper[4764]: I0309 13:42:31.318069 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e78a4ead-5459-49a9-89f6-5e21ac1baa3c","Type":"ContainerDied","Data":"b214631e329516c4f150fb9c3cadae477dc2a4812e6916faa36fc494a5d835b8"}
Mar 09 13:42:32 crc kubenswrapper[4764]: I0309 13:42:32.342005 4764 generic.go:334] "Generic (PLEG): container finished" podID="e78a4ead-5459-49a9-89f6-5e21ac1baa3c" containerID="1c0d0d684bf2cb708c8f8559ea9b7bec1b048fb2e4835892b259f454ffd824ea" exitCode=0
Mar 09 13:42:32 crc kubenswrapper[4764]: I0309 13:42:32.342421 4764 generic.go:334] "Generic (PLEG): container finished" podID="e78a4ead-5459-49a9-89f6-5e21ac1baa3c" containerID="bb420468c16a3231754e16ddfa5b5dad7573dcd0fff5049f6f34e1d807166b01" exitCode=0
Mar 09 13:42:32 crc kubenswrapper[4764]: I0309 13:42:32.342210 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e78a4ead-5459-49a9-89f6-5e21ac1baa3c","Type":"ContainerDied","Data":"1c0d0d684bf2cb708c8f8559ea9b7bec1b048fb2e4835892b259f454ffd824ea"}
Mar 09 13:42:32 crc kubenswrapper[4764]: I0309 13:42:32.342473 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e78a4ead-5459-49a9-89f6-5e21ac1baa3c","Type":"ContainerDied","Data":"bb420468c16a3231754e16ddfa5b5dad7573dcd0fff5049f6f34e1d807166b01"}
Mar 09 13:42:32 crc kubenswrapper[4764]: I0309 13:42:32.700340 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 09 13:42:32 crc kubenswrapper[4764]: I0309 13:42:32.844119 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e78a4ead-5459-49a9-89f6-5e21ac1baa3c-scripts\") pod \"e78a4ead-5459-49a9-89f6-5e21ac1baa3c\" (UID: \"e78a4ead-5459-49a9-89f6-5e21ac1baa3c\") "
Mar 09 13:42:32 crc kubenswrapper[4764]: I0309 13:42:32.844219 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e78a4ead-5459-49a9-89f6-5e21ac1baa3c-sg-core-conf-yaml\") pod \"e78a4ead-5459-49a9-89f6-5e21ac1baa3c\" (UID: \"e78a4ead-5459-49a9-89f6-5e21ac1baa3c\") "
Mar 09 13:42:32 crc kubenswrapper[4764]: I0309 13:42:32.844259 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e78a4ead-5459-49a9-89f6-5e21ac1baa3c-combined-ca-bundle\") pod \"e78a4ead-5459-49a9-89f6-5e21ac1baa3c\" (UID: \"e78a4ead-5459-49a9-89f6-5e21ac1baa3c\") "
Mar 09 13:42:32 crc kubenswrapper[4764]: I0309 13:42:32.844326 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e78a4ead-5459-49a9-89f6-5e21ac1baa3c-run-httpd\") pod \"e78a4ead-5459-49a9-89f6-5e21ac1baa3c\" (UID: \"e78a4ead-5459-49a9-89f6-5e21ac1baa3c\") "
Mar 09 13:42:32 crc kubenswrapper[4764]: I0309 13:42:32.844402 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/e78a4ead-5459-49a9-89f6-5e21ac1baa3c-ceilometer-tls-certs\") pod \"e78a4ead-5459-49a9-89f6-5e21ac1baa3c\" (UID: \"e78a4ead-5459-49a9-89f6-5e21ac1baa3c\") "
Mar 09 13:42:32 crc kubenswrapper[4764]: I0309 13:42:32.844573 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e78a4ead-5459-49a9-89f6-5e21ac1baa3c-log-httpd\") pod \"e78a4ead-5459-49a9-89f6-5e21ac1baa3c\" (UID: \"e78a4ead-5459-49a9-89f6-5e21ac1baa3c\") "
Mar 09 13:42:32 crc kubenswrapper[4764]: I0309 13:42:32.844661 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4m8ll\" (UniqueName: \"kubernetes.io/projected/e78a4ead-5459-49a9-89f6-5e21ac1baa3c-kube-api-access-4m8ll\") pod \"e78a4ead-5459-49a9-89f6-5e21ac1baa3c\" (UID: \"e78a4ead-5459-49a9-89f6-5e21ac1baa3c\") "
Mar 09 13:42:32 crc kubenswrapper[4764]: I0309 13:42:32.844731 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e78a4ead-5459-49a9-89f6-5e21ac1baa3c-config-data\") pod \"e78a4ead-5459-49a9-89f6-5e21ac1baa3c\" (UID: \"e78a4ead-5459-49a9-89f6-5e21ac1baa3c\") "
Mar 09 13:42:32 crc kubenswrapper[4764]: I0309 13:42:32.845570 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e78a4ead-5459-49a9-89f6-5e21ac1baa3c-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "e78a4ead-5459-49a9-89f6-5e21ac1baa3c" (UID: "e78a4ead-5459-49a9-89f6-5e21ac1baa3c"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 09 13:42:32 crc kubenswrapper[4764]: I0309 13:42:32.847432 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e78a4ead-5459-49a9-89f6-5e21ac1baa3c-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "e78a4ead-5459-49a9-89f6-5e21ac1baa3c" (UID: "e78a4ead-5459-49a9-89f6-5e21ac1baa3c"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 09 13:42:32 crc kubenswrapper[4764]: I0309 13:42:32.948237 4764 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e78a4ead-5459-49a9-89f6-5e21ac1baa3c-run-httpd\") on node \"crc\" DevicePath \"\""
Mar 09 13:42:32 crc kubenswrapper[4764]: I0309 13:42:32.948280 4764 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e78a4ead-5459-49a9-89f6-5e21ac1baa3c-log-httpd\") on node \"crc\" DevicePath \"\""
Mar 09 13:42:33 crc kubenswrapper[4764]: I0309 13:42:33.220240 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e78a4ead-5459-49a9-89f6-5e21ac1baa3c-scripts" (OuterVolumeSpecName: "scripts") pod "e78a4ead-5459-49a9-89f6-5e21ac1baa3c" (UID: "e78a4ead-5459-49a9-89f6-5e21ac1baa3c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 13:42:33 crc kubenswrapper[4764]: I0309 13:42:33.220271 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e78a4ead-5459-49a9-89f6-5e21ac1baa3c-kube-api-access-4m8ll" (OuterVolumeSpecName: "kube-api-access-4m8ll") pod "e78a4ead-5459-49a9-89f6-5e21ac1baa3c" (UID: "e78a4ead-5459-49a9-89f6-5e21ac1baa3c"). InnerVolumeSpecName "kube-api-access-4m8ll". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 13:42:33 crc kubenswrapper[4764]: I0309 13:42:33.237569 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e78a4ead-5459-49a9-89f6-5e21ac1baa3c-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "e78a4ead-5459-49a9-89f6-5e21ac1baa3c" (UID: "e78a4ead-5459-49a9-89f6-5e21ac1baa3c"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 13:42:33 crc kubenswrapper[4764]: I0309 13:42:33.258508 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4m8ll\" (UniqueName: \"kubernetes.io/projected/e78a4ead-5459-49a9-89f6-5e21ac1baa3c-kube-api-access-4m8ll\") on node \"crc\" DevicePath \"\""
Mar 09 13:42:33 crc kubenswrapper[4764]: I0309 13:42:33.258551 4764 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e78a4ead-5459-49a9-89f6-5e21ac1baa3c-scripts\") on node \"crc\" DevicePath \"\""
Mar 09 13:42:33 crc kubenswrapper[4764]: I0309 13:42:33.258567 4764 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e78a4ead-5459-49a9-89f6-5e21ac1baa3c-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\""
Mar 09 13:42:33 crc kubenswrapper[4764]: I0309 13:42:33.259244 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e78a4ead-5459-49a9-89f6-5e21ac1baa3c-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "e78a4ead-5459-49a9-89f6-5e21ac1baa3c" (UID: "e78a4ead-5459-49a9-89f6-5e21ac1baa3c"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 13:42:33 crc kubenswrapper[4764]: I0309 13:42:33.304903 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e78a4ead-5459-49a9-89f6-5e21ac1baa3c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e78a4ead-5459-49a9-89f6-5e21ac1baa3c" (UID: "e78a4ead-5459-49a9-89f6-5e21ac1baa3c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 13:42:33 crc kubenswrapper[4764]: I0309 13:42:33.305382 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e78a4ead-5459-49a9-89f6-5e21ac1baa3c-config-data" (OuterVolumeSpecName: "config-data") pod "e78a4ead-5459-49a9-89f6-5e21ac1baa3c" (UID: "e78a4ead-5459-49a9-89f6-5e21ac1baa3c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 13:42:33 crc kubenswrapper[4764]: I0309 13:42:33.357254 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e78a4ead-5459-49a9-89f6-5e21ac1baa3c","Type":"ContainerDied","Data":"79e7c65f033121e5c29021bfaba325c95dd7684d4ddbd0796fab12986b9aed27"}
Mar 09 13:42:33 crc kubenswrapper[4764]: I0309 13:42:33.357318 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 09 13:42:33 crc kubenswrapper[4764]: I0309 13:42:33.357318 4764 scope.go:117] "RemoveContainer" containerID="bed8beb8204aa72aa2bde54753848668402830e464bf5f8c233fd5f8d9ccc674"
Mar 09 13:42:33 crc kubenswrapper[4764]: I0309 13:42:33.361300 4764 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e78a4ead-5459-49a9-89f6-5e21ac1baa3c-config-data\") on node \"crc\" DevicePath \"\""
Mar 09 13:42:33 crc kubenswrapper[4764]: I0309 13:42:33.361328 4764 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e78a4ead-5459-49a9-89f6-5e21ac1baa3c-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 09 13:42:33 crc kubenswrapper[4764]: I0309 13:42:33.361342 4764 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/e78a4ead-5459-49a9-89f6-5e21ac1baa3c-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\""
Mar 09 13:42:33 crc kubenswrapper[4764]: I0309 13:42:33.413484 4764 scope.go:117] "RemoveContainer" containerID="b214631e329516c4f150fb9c3cadae477dc2a4812e6916faa36fc494a5d835b8"
Mar 09 13:42:33 crc kubenswrapper[4764]: I0309 13:42:33.427085 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Mar 09 13:42:33 crc kubenswrapper[4764]: I0309 13:42:33.449901 4764 scope.go:117] "RemoveContainer" containerID="1c0d0d684bf2cb708c8f8559ea9b7bec1b048fb2e4835892b259f454ffd824ea"
Mar 09 13:42:33 crc kubenswrapper[4764]: I0309 13:42:33.483167 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"]
Mar 09 13:42:33 crc kubenswrapper[4764]: I0309 13:42:33.497720 4764 scope.go:117] "RemoveContainer" containerID="bb420468c16a3231754e16ddfa5b5dad7573dcd0fff5049f6f34e1d807166b01"
Mar 09 13:42:33 crc kubenswrapper[4764]: I0309 13:42:33.510866 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Mar 09 13:42:33 crc kubenswrapper[4764]: E0309 13:42:33.511471 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e78a4ead-5459-49a9-89f6-5e21ac1baa3c" containerName="ceilometer-central-agent"
Mar 09 13:42:33 crc kubenswrapper[4764]: I0309 13:42:33.511513 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="e78a4ead-5459-49a9-89f6-5e21ac1baa3c" containerName="ceilometer-central-agent"
Mar 09 13:42:33 crc kubenswrapper[4764]: E0309 13:42:33.511526 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e78a4ead-5459-49a9-89f6-5e21ac1baa3c" containerName="sg-core"
Mar 09 13:42:33 crc kubenswrapper[4764]: I0309 13:42:33.511533 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="e78a4ead-5459-49a9-89f6-5e21ac1baa3c" containerName="sg-core"
Mar 09 13:42:33 crc kubenswrapper[4764]: E0309 13:42:33.511551 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e78a4ead-5459-49a9-89f6-5e21ac1baa3c" containerName="proxy-httpd"
Mar 09 13:42:33 crc kubenswrapper[4764]: I0309 13:42:33.511557 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="e78a4ead-5459-49a9-89f6-5e21ac1baa3c" containerName="proxy-httpd"
Mar 09 13:42:33 crc kubenswrapper[4764]: E0309 13:42:33.511567 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e78a4ead-5459-49a9-89f6-5e21ac1baa3c" containerName="ceilometer-notification-agent"
Mar 09 13:42:33 crc kubenswrapper[4764]: I0309 13:42:33.511572 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="e78a4ead-5459-49a9-89f6-5e21ac1baa3c" containerName="ceilometer-notification-agent"
Mar 09 13:42:33 crc kubenswrapper[4764]: I0309 13:42:33.511779 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="e78a4ead-5459-49a9-89f6-5e21ac1baa3c" containerName="ceilometer-notification-agent"
Mar 09 13:42:33 crc kubenswrapper[4764]: I0309 13:42:33.511807 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="e78a4ead-5459-49a9-89f6-5e21ac1baa3c" containerName="sg-core"
Mar 09 13:42:33 crc kubenswrapper[4764]: I0309 13:42:33.511821 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="e78a4ead-5459-49a9-89f6-5e21ac1baa3c" containerName="proxy-httpd"
Mar 09 13:42:33 crc kubenswrapper[4764]: I0309 13:42:33.511836 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="e78a4ead-5459-49a9-89f6-5e21ac1baa3c" containerName="ceilometer-central-agent"
Mar 09 13:42:33 crc kubenswrapper[4764]: I0309 13:42:33.513785 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 09 13:42:33 crc kubenswrapper[4764]: I0309 13:42:33.516928 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Mar 09 13:42:33 crc kubenswrapper[4764]: I0309 13:42:33.517839 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc"
Mar 09 13:42:33 crc kubenswrapper[4764]: I0309 13:42:33.519554 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Mar 09 13:42:33 crc kubenswrapper[4764]: I0309 13:42:33.525590 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Mar 09 13:42:33 crc kubenswrapper[4764]: I0309 13:42:33.572301 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e78a4ead-5459-49a9-89f6-5e21ac1baa3c" path="/var/lib/kubelet/pods/e78a4ead-5459-49a9-89f6-5e21ac1baa3c/volumes"
Mar 09 13:42:33 crc kubenswrapper[4764]: I0309 13:42:33.594342 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/85e5a081-f0bc-457a-a3f0-9f9152f942c3-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"85e5a081-f0bc-457a-a3f0-9f9152f942c3\") " pod="openstack/ceilometer-0"
Mar 09 13:42:33 crc kubenswrapper[4764]: I0309 13:42:33.594424 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/85e5a081-f0bc-457a-a3f0-9f9152f942c3-config-data\") pod \"ceilometer-0\" (UID: \"85e5a081-f0bc-457a-a3f0-9f9152f942c3\") " pod="openstack/ceilometer-0"
Mar 09 13:42:33 crc kubenswrapper[4764]: I0309 13:42:33.594449 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/85e5a081-f0bc-457a-a3f0-9f9152f942c3-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"85e5a081-f0bc-457a-a3f0-9f9152f942c3\") " pod="openstack/ceilometer-0"
Mar 09 13:42:33 crc kubenswrapper[4764]: I0309 13:42:33.594471 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k9xwx\" (UniqueName: \"kubernetes.io/projected/85e5a081-f0bc-457a-a3f0-9f9152f942c3-kube-api-access-k9xwx\") pod \"ceilometer-0\" (UID: \"85e5a081-f0bc-457a-a3f0-9f9152f942c3\") " pod="openstack/ceilometer-0"
Mar 09 13:42:33 crc kubenswrapper[4764]: I0309 13:42:33.594494 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/85e5a081-f0bc-457a-a3f0-9f9152f942c3-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"85e5a081-f0bc-457a-a3f0-9f9152f942c3\") " pod="openstack/ceilometer-0"
Mar 09 13:42:33 crc kubenswrapper[4764]: I0309 13:42:33.594519 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/85e5a081-f0bc-457a-a3f0-9f9152f942c3-run-httpd\") pod \"ceilometer-0\" (UID: \"85e5a081-f0bc-457a-a3f0-9f9152f942c3\") " pod="openstack/ceilometer-0"
Mar 09 13:42:33 crc kubenswrapper[4764]: I0309 13:42:33.594543 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/85e5a081-f0bc-457a-a3f0-9f9152f942c3-scripts\") pod \"ceilometer-0\" (UID: \"85e5a081-f0bc-457a-a3f0-9f9152f942c3\") " pod="openstack/ceilometer-0"
Mar 09 13:42:33 crc kubenswrapper[4764]: I0309 13:42:33.594565 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/85e5a081-f0bc-457a-a3f0-9f9152f942c3-log-httpd\") pod \"ceilometer-0\" (UID: \"85e5a081-f0bc-457a-a3f0-9f9152f942c3\") " pod="openstack/ceilometer-0"
Mar 09 13:42:33 crc kubenswrapper[4764]: I0309 13:42:33.695505 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/85e5a081-f0bc-457a-a3f0-9f9152f942c3-run-httpd\") pod \"ceilometer-0\" (UID: \"85e5a081-f0bc-457a-a3f0-9f9152f942c3\") " pod="openstack/ceilometer-0"
Mar 09 13:42:33 crc kubenswrapper[4764]: I0309 13:42:33.695571 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/85e5a081-f0bc-457a-a3f0-9f9152f942c3-scripts\") pod \"ceilometer-0\" (UID: \"85e5a081-f0bc-457a-a3f0-9f9152f942c3\") " pod="openstack/ceilometer-0"
Mar 09 13:42:33 crc kubenswrapper[4764]: I0309 13:42:33.695593 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/85e5a081-f0bc-457a-a3f0-9f9152f942c3-log-httpd\") pod \"ceilometer-0\" (UID: \"85e5a081-f0bc-457a-a3f0-9f9152f942c3\") " pod="openstack/ceilometer-0"
Mar 09 13:42:33 crc kubenswrapper[4764]: I0309 13:42:33.695705 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/85e5a081-f0bc-457a-a3f0-9f9152f942c3-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"85e5a081-f0bc-457a-a3f0-9f9152f942c3\") " pod="openstack/ceilometer-0"
Mar 09 13:42:33 crc kubenswrapper[4764]: I0309 13:42:33.695758 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/85e5a081-f0bc-457a-a3f0-9f9152f942c3-config-data\") pod \"ceilometer-0\" (UID: \"85e5a081-f0bc-457a-a3f0-9f9152f942c3\") " pod="openstack/ceilometer-0"
Mar 09 13:42:33 crc kubenswrapper[4764]: I0309 13:42:33.695780 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/85e5a081-f0bc-457a-a3f0-9f9152f942c3-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"85e5a081-f0bc-457a-a3f0-9f9152f942c3\") " pod="openstack/ceilometer-0"
Mar 09 13:42:33 crc kubenswrapper[4764]: I0309 13:42:33.695811 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k9xwx\" (UniqueName: \"kubernetes.io/projected/85e5a081-f0bc-457a-a3f0-9f9152f942c3-kube-api-access-k9xwx\") pod \"ceilometer-0\" (UID: \"85e5a081-f0bc-457a-a3f0-9f9152f942c3\") " pod="openstack/ceilometer-0"
Mar 09 13:42:33 crc kubenswrapper[4764]: I0309 13:42:33.695836 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/85e5a081-f0bc-457a-a3f0-9f9152f942c3-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"85e5a081-f0bc-457a-a3f0-9f9152f942c3\") " pod="openstack/ceilometer-0"
Mar 09 13:42:33 crc kubenswrapper[4764]: I0309 13:42:33.697445 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/85e5a081-f0bc-457a-a3f0-9f9152f942c3-log-httpd\") pod \"ceilometer-0\" (UID: \"85e5a081-f0bc-457a-a3f0-9f9152f942c3\") " pod="openstack/ceilometer-0"
Mar 09 13:42:33 crc kubenswrapper[4764]: I0309 13:42:33.697983 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/85e5a081-f0bc-457a-a3f0-9f9152f942c3-run-httpd\") pod \"ceilometer-0\" (UID: \"85e5a081-f0bc-457a-a3f0-9f9152f942c3\") " pod="openstack/ceilometer-0"
Mar 09 13:42:33 crc kubenswrapper[4764]: I0309 13:42:33.701479 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/85e5a081-f0bc-457a-a3f0-9f9152f942c3-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"85e5a081-f0bc-457a-a3f0-9f9152f942c3\") " pod="openstack/ceilometer-0"
Mar 09 13:42:33 crc kubenswrapper[4764]: I0309 13:42:33.701753 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/85e5a081-f0bc-457a-a3f0-9f9152f942c3-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"85e5a081-f0bc-457a-a3f0-9f9152f942c3\") " pod="openstack/ceilometer-0"
Mar 09 13:42:33 crc kubenswrapper[4764]: I0309 13:42:33.701846 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/85e5a081-f0bc-457a-a3f0-9f9152f942c3-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"85e5a081-f0bc-457a-a3f0-9f9152f942c3\") " pod="openstack/ceilometer-0"
Mar 09 13:42:33 crc kubenswrapper[4764]: I0309 13:42:33.701924 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/85e5a081-f0bc-457a-a3f0-9f9152f942c3-scripts\") pod \"ceilometer-0\" (UID: \"85e5a081-f0bc-457a-a3f0-9f9152f942c3\") " pod="openstack/ceilometer-0"
Mar 09 13:42:33 crc kubenswrapper[4764]: I0309 13:42:33.718824 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/85e5a081-f0bc-457a-a3f0-9f9152f942c3-config-data\") pod \"ceilometer-0\" (UID: \"85e5a081-f0bc-457a-a3f0-9f9152f942c3\") " pod="openstack/ceilometer-0"
Mar 09 13:42:33 crc kubenswrapper[4764]: I0309 13:42:33.729460 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k9xwx\" (UniqueName: \"kubernetes.io/projected/85e5a081-f0bc-457a-a3f0-9f9152f942c3-kube-api-access-k9xwx\") pod \"ceilometer-0\" (UID: \"85e5a081-f0bc-457a-a3f0-9f9152f942c3\") " pod="openstack/ceilometer-0"
Mar 09 13:42:33 crc kubenswrapper[4764]: I0309 13:42:33.841211 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 09 13:42:34 crc kubenswrapper[4764]: I0309 13:42:34.324617 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Mar 09 13:42:34 crc kubenswrapper[4764]: I0309 13:42:34.370023 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"85e5a081-f0bc-457a-a3f0-9f9152f942c3","Type":"ContainerStarted","Data":"d2d07148bb1bfe75a3a625df2d14aba687a70d2b7d04873d08b4da416d307e9c"}
Mar 09 13:42:35 crc kubenswrapper[4764]: I0309 13:42:35.385946 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"85e5a081-f0bc-457a-a3f0-9f9152f942c3","Type":"ContainerStarted","Data":"83e456ab335c632ce0189226c9d2e7f0e9ecfe5a406e7edda24caa25bafb3539"}
Mar 09 13:42:36 crc kubenswrapper[4764]: I0309 13:42:36.399415 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"85e5a081-f0bc-457a-a3f0-9f9152f942c3","Type":"ContainerStarted","Data":"14dd461ed43101072708519038bb785a1ba517ec81427d8fb63817a3eca45624"}
Mar 09 13:42:37 crc kubenswrapper[4764]: I0309 13:42:37.410993 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"85e5a081-f0bc-457a-a3f0-9f9152f942c3","Type":"ContainerStarted","Data":"b944ec589887c7cb6534662085f3e5e58d2f1dce03f0aa12b1ed3e01a649c84e"}
Mar 09 13:42:37 crc kubenswrapper[4764]: I0309 13:42:37.412941 4764 generic.go:334] "Generic (PLEG): container finished" podID="c0a40476-ff1d-443d-846f-a54cd956aaa3" containerID="01f8a44cba8ccff51deb3a73f62a17d6b208f1f8fb17e65f80892f3b0a90b431" exitCode=0
Mar 09 13:42:37 crc kubenswrapper[4764]: I0309 13:42:37.412997 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-kkcml" event={"ID":"c0a40476-ff1d-443d-846f-a54cd956aaa3","Type":"ContainerDied","Data":"01f8a44cba8ccff51deb3a73f62a17d6b208f1f8fb17e65f80892f3b0a90b431"}
Mar 09 13:42:38 crc kubenswrapper[4764]: I0309 13:42:38.831462 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-kkcml"
Mar 09 13:42:39 crc kubenswrapper[4764]: I0309 13:42:39.000561 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c0a40476-ff1d-443d-846f-a54cd956aaa3-config-data\") pod \"c0a40476-ff1d-443d-846f-a54cd956aaa3\" (UID: \"c0a40476-ff1d-443d-846f-a54cd956aaa3\") "
Mar 09 13:42:39 crc kubenswrapper[4764]: I0309 13:42:39.000843 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c0a40476-ff1d-443d-846f-a54cd956aaa3-scripts\") pod \"c0a40476-ff1d-443d-846f-a54cd956aaa3\" (UID: \"c0a40476-ff1d-443d-846f-a54cd956aaa3\") "
Mar 09 13:42:39 crc kubenswrapper[4764]: I0309 13:42:39.000884 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-grsz2\" (UniqueName: \"kubernetes.io/projected/c0a40476-ff1d-443d-846f-a54cd956aaa3-kube-api-access-grsz2\") pod \"c0a40476-ff1d-443d-846f-a54cd956aaa3\" (UID: \"c0a40476-ff1d-443d-846f-a54cd956aaa3\") "
Mar 09 13:42:39 crc kubenswrapper[4764]: I0309 13:42:39.000912 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c0a40476-ff1d-443d-846f-a54cd956aaa3-combined-ca-bundle\") pod \"c0a40476-ff1d-443d-846f-a54cd956aaa3\" (UID: \"c0a40476-ff1d-443d-846f-a54cd956aaa3\") "
Mar 09 13:42:39 crc kubenswrapper[4764]: I0309 13:42:39.008503 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c0a40476-ff1d-443d-846f-a54cd956aaa3-kube-api-access-grsz2" (OuterVolumeSpecName: "kube-api-access-grsz2") pod "c0a40476-ff1d-443d-846f-a54cd956aaa3" (UID: "c0a40476-ff1d-443d-846f-a54cd956aaa3"). InnerVolumeSpecName "kube-api-access-grsz2". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 13:42:39 crc kubenswrapper[4764]: I0309 13:42:39.022091 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c0a40476-ff1d-443d-846f-a54cd956aaa3-scripts" (OuterVolumeSpecName: "scripts") pod "c0a40476-ff1d-443d-846f-a54cd956aaa3" (UID: "c0a40476-ff1d-443d-846f-a54cd956aaa3"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 13:42:39 crc kubenswrapper[4764]: I0309 13:42:39.034018 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c0a40476-ff1d-443d-846f-a54cd956aaa3-config-data" (OuterVolumeSpecName: "config-data") pod "c0a40476-ff1d-443d-846f-a54cd956aaa3" (UID: "c0a40476-ff1d-443d-846f-a54cd956aaa3"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 13:42:39 crc kubenswrapper[4764]: I0309 13:42:39.047816 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c0a40476-ff1d-443d-846f-a54cd956aaa3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c0a40476-ff1d-443d-846f-a54cd956aaa3" (UID: "c0a40476-ff1d-443d-846f-a54cd956aaa3"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 13:42:39 crc kubenswrapper[4764]: I0309 13:42:39.103850 4764 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c0a40476-ff1d-443d-846f-a54cd956aaa3-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 09 13:42:39 crc kubenswrapper[4764]: I0309 13:42:39.103902 4764 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c0a40476-ff1d-443d-846f-a54cd956aaa3-config-data\") on node \"crc\" DevicePath \"\""
Mar 09 13:42:39 crc kubenswrapper[4764]: I0309 13:42:39.103912 4764 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c0a40476-ff1d-443d-846f-a54cd956aaa3-scripts\") on node \"crc\" DevicePath \"\""
Mar 09 13:42:39 crc kubenswrapper[4764]: I0309 13:42:39.103921 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-grsz2\" (UniqueName: \"kubernetes.io/projected/c0a40476-ff1d-443d-846f-a54cd956aaa3-kube-api-access-grsz2\") on node \"crc\" DevicePath \"\""
Mar 09 13:42:39 crc kubenswrapper[4764]: I0309 13:42:39.436171 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-kkcml" event={"ID":"c0a40476-ff1d-443d-846f-a54cd956aaa3","Type":"ContainerDied","Data":"185f2547ce3fbe2885edcb4f82761dd8558e20b523bd1229f310adc3829d04d5"}
Mar 09 13:42:39 crc kubenswrapper[4764]: I0309 13:42:39.436713 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="185f2547ce3fbe2885edcb4f82761dd8558e20b523bd1229f310adc3829d04d5"
Mar 09 13:42:39 crc kubenswrapper[4764]: I0309 13:42:39.436253 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-kkcml"
Mar 09 13:42:39 crc kubenswrapper[4764]: I0309 13:42:39.577676 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"]
Mar 09 13:42:39 crc kubenswrapper[4764]: E0309 13:42:39.578684 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c0a40476-ff1d-443d-846f-a54cd956aaa3" containerName="nova-cell0-conductor-db-sync"
Mar 09 13:42:39 crc kubenswrapper[4764]: I0309 13:42:39.578713 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="c0a40476-ff1d-443d-846f-a54cd956aaa3" containerName="nova-cell0-conductor-db-sync"
Mar 09 13:42:39 crc kubenswrapper[4764]: I0309 13:42:39.578982 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="c0a40476-ff1d-443d-846f-a54cd956aaa3" containerName="nova-cell0-conductor-db-sync"
Mar 09 13:42:39 crc kubenswrapper[4764]: I0309 13:42:39.579964 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0"
Mar 09 13:42:39 crc kubenswrapper[4764]: I0309 13:42:39.584963 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-v8mzr"
Mar 09 13:42:39 crc kubenswrapper[4764]: I0309 13:42:39.585418 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data"
Mar 09 13:42:39 crc kubenswrapper[4764]: I0309 13:42:39.590692 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"]
Mar 09 13:42:39 crc kubenswrapper[4764]: I0309 13:42:39.719805 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/64bc45ce-7cc3-4d3a-97d7-9e73bfcb4fe9-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"64bc45ce-7cc3-4d3a-97d7-9e73bfcb4fe9\") " pod="openstack/nova-cell0-conductor-0"
Mar 09 13:42:39 crc kubenswrapper[4764]: I0309 13:42:39.719884 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/64bc45ce-7cc3-4d3a-97d7-9e73bfcb4fe9-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"64bc45ce-7cc3-4d3a-97d7-9e73bfcb4fe9\") " pod="openstack/nova-cell0-conductor-0"
Mar 09 13:42:39 crc kubenswrapper[4764]: I0309 13:42:39.719978 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2xg49\" (UniqueName: \"kubernetes.io/projected/64bc45ce-7cc3-4d3a-97d7-9e73bfcb4fe9-kube-api-access-2xg49\") pod \"nova-cell0-conductor-0\" (UID: \"64bc45ce-7cc3-4d3a-97d7-9e73bfcb4fe9\") " pod="openstack/nova-cell0-conductor-0"
Mar 09 13:42:39 crc kubenswrapper[4764]: I0309 13:42:39.822535 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/64bc45ce-7cc3-4d3a-97d7-9e73bfcb4fe9-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"64bc45ce-7cc3-4d3a-97d7-9e73bfcb4fe9\") " pod="openstack/nova-cell0-conductor-0"
Mar 09 13:42:39 crc kubenswrapper[4764]: I0309 13:42:39.822588 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/64bc45ce-7cc3-4d3a-97d7-9e73bfcb4fe9-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"64bc45ce-7cc3-4d3a-97d7-9e73bfcb4fe9\") " pod="openstack/nova-cell0-conductor-0"
Mar 09 13:42:39 crc kubenswrapper[4764]: I0309 13:42:39.822619 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2xg49\" (UniqueName: \"kubernetes.io/projected/64bc45ce-7cc3-4d3a-97d7-9e73bfcb4fe9-kube-api-access-2xg49\") pod \"nova-cell0-conductor-0\" (UID: \"64bc45ce-7cc3-4d3a-97d7-9e73bfcb4fe9\") " pod="openstack/nova-cell0-conductor-0"
Mar 09 13:42:39 crc kubenswrapper[4764]: I0309 13:42:39.827377 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/64bc45ce-7cc3-4d3a-97d7-9e73bfcb4fe9-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"64bc45ce-7cc3-4d3a-97d7-9e73bfcb4fe9\") " pod="openstack/nova-cell0-conductor-0"
Mar 09 13:42:39 crc kubenswrapper[4764]: I0309 13:42:39.837540 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/64bc45ce-7cc3-4d3a-97d7-9e73bfcb4fe9-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"64bc45ce-7cc3-4d3a-97d7-9e73bfcb4fe9\") " pod="openstack/nova-cell0-conductor-0"
Mar 09 13:42:39 crc kubenswrapper[4764]: I0309 13:42:39.847573 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2xg49\" (UniqueName: \"kubernetes.io/projected/64bc45ce-7cc3-4d3a-97d7-9e73bfcb4fe9-kube-api-access-2xg49\") pod \"nova-cell0-conductor-0\" (UID: \"64bc45ce-7cc3-4d3a-97d7-9e73bfcb4fe9\") " pod="openstack/nova-cell0-conductor-0"
Mar 09 13:42:39 crc kubenswrapper[4764]: I0309 13:42:39.899984 4764 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Mar 09 13:42:40 crc kubenswrapper[4764]: I0309 13:42:40.406316 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 09 13:42:40 crc kubenswrapper[4764]: W0309 13:42:40.410120 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod64bc45ce_7cc3_4d3a_97d7_9e73bfcb4fe9.slice/crio-0c7f63de0584bfaa1552958048369043e4a41b160fc9a684200d3911215a7010 WatchSource:0}: Error finding container 0c7f63de0584bfaa1552958048369043e4a41b160fc9a684200d3911215a7010: Status 404 returned error can't find the container with id 0c7f63de0584bfaa1552958048369043e4a41b160fc9a684200d3911215a7010 Mar 09 13:42:40 crc kubenswrapper[4764]: I0309 13:42:40.461011 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"85e5a081-f0bc-457a-a3f0-9f9152f942c3","Type":"ContainerStarted","Data":"6967373dce85bc726e8932709b98fc503ab51918ee1216795cbe6f5192aa2e00"} Mar 09 13:42:40 crc kubenswrapper[4764]: I0309 13:42:40.462659 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 09 13:42:40 crc kubenswrapper[4764]: I0309 13:42:40.463249 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"64bc45ce-7cc3-4d3a-97d7-9e73bfcb4fe9","Type":"ContainerStarted","Data":"0c7f63de0584bfaa1552958048369043e4a41b160fc9a684200d3911215a7010"} Mar 09 13:42:40 crc kubenswrapper[4764]: I0309 13:42:40.493267 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.180573523 podStartE2EDuration="7.493246005s" podCreationTimestamp="2026-03-09 13:42:33 +0000 UTC" firstStartedPulling="2026-03-09 13:42:34.32984102 +0000 UTC m=+1309.580012928" lastFinishedPulling="2026-03-09 13:42:39.642513502 +0000 UTC m=+1314.892685410" 
observedRunningTime="2026-03-09 13:42:40.483812583 +0000 UTC m=+1315.733984511" watchObservedRunningTime="2026-03-09 13:42:40.493246005 +0000 UTC m=+1315.743417913" Mar 09 13:42:41 crc kubenswrapper[4764]: I0309 13:42:41.484189 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"64bc45ce-7cc3-4d3a-97d7-9e73bfcb4fe9","Type":"ContainerStarted","Data":"c310a7e59e0c7b01e3eb31eec7e4b0fdfc9a11ffb7a2e9e3c10feffb9eb0b6c4"} Mar 09 13:42:41 crc kubenswrapper[4764]: I0309 13:42:41.484596 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Mar 09 13:42:49 crc kubenswrapper[4764]: I0309 13:42:49.936008 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Mar 09 13:42:49 crc kubenswrapper[4764]: I0309 13:42:49.961002 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=10.960971863 podStartE2EDuration="10.960971863s" podCreationTimestamp="2026-03-09 13:42:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:42:41.508632075 +0000 UTC m=+1316.758803983" watchObservedRunningTime="2026-03-09 13:42:49.960971863 +0000 UTC m=+1325.211143801" Mar 09 13:42:50 crc kubenswrapper[4764]: I0309 13:42:50.591005 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-5k46p"] Mar 09 13:42:50 crc kubenswrapper[4764]: I0309 13:42:50.593031 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-5k46p" Mar 09 13:42:50 crc kubenswrapper[4764]: I0309 13:42:50.597128 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data" Mar 09 13:42:50 crc kubenswrapper[4764]: I0309 13:42:50.597283 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts" Mar 09 13:42:50 crc kubenswrapper[4764]: I0309 13:42:50.614253 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-5k46p"] Mar 09 13:42:50 crc kubenswrapper[4764]: I0309 13:42:50.695356 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sb6mc\" (UniqueName: \"kubernetes.io/projected/9f09c604-028e-4965-aef8-6005ae365be9-kube-api-access-sb6mc\") pod \"nova-cell0-cell-mapping-5k46p\" (UID: \"9f09c604-028e-4965-aef8-6005ae365be9\") " pod="openstack/nova-cell0-cell-mapping-5k46p" Mar 09 13:42:50 crc kubenswrapper[4764]: I0309 13:42:50.695518 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9f09c604-028e-4965-aef8-6005ae365be9-config-data\") pod \"nova-cell0-cell-mapping-5k46p\" (UID: \"9f09c604-028e-4965-aef8-6005ae365be9\") " pod="openstack/nova-cell0-cell-mapping-5k46p" Mar 09 13:42:50 crc kubenswrapper[4764]: I0309 13:42:50.696042 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9f09c604-028e-4965-aef8-6005ae365be9-scripts\") pod \"nova-cell0-cell-mapping-5k46p\" (UID: \"9f09c604-028e-4965-aef8-6005ae365be9\") " pod="openstack/nova-cell0-cell-mapping-5k46p" Mar 09 13:42:50 crc kubenswrapper[4764]: I0309 13:42:50.696271 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/9f09c604-028e-4965-aef8-6005ae365be9-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-5k46p\" (UID: \"9f09c604-028e-4965-aef8-6005ae365be9\") " pod="openstack/nova-cell0-cell-mapping-5k46p" Mar 09 13:42:50 crc kubenswrapper[4764]: I0309 13:42:50.785285 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Mar 09 13:42:50 crc kubenswrapper[4764]: I0309 13:42:50.786985 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 09 13:42:50 crc kubenswrapper[4764]: I0309 13:42:50.794218 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Mar 09 13:42:50 crc kubenswrapper[4764]: I0309 13:42:50.797815 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sb6mc\" (UniqueName: \"kubernetes.io/projected/9f09c604-028e-4965-aef8-6005ae365be9-kube-api-access-sb6mc\") pod \"nova-cell0-cell-mapping-5k46p\" (UID: \"9f09c604-028e-4965-aef8-6005ae365be9\") " pod="openstack/nova-cell0-cell-mapping-5k46p" Mar 09 13:42:50 crc kubenswrapper[4764]: I0309 13:42:50.797898 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9f09c604-028e-4965-aef8-6005ae365be9-config-data\") pod \"nova-cell0-cell-mapping-5k46p\" (UID: \"9f09c604-028e-4965-aef8-6005ae365be9\") " pod="openstack/nova-cell0-cell-mapping-5k46p" Mar 09 13:42:50 crc kubenswrapper[4764]: I0309 13:42:50.797967 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9f09c604-028e-4965-aef8-6005ae365be9-scripts\") pod \"nova-cell0-cell-mapping-5k46p\" (UID: \"9f09c604-028e-4965-aef8-6005ae365be9\") " pod="openstack/nova-cell0-cell-mapping-5k46p" Mar 09 13:42:50 crc kubenswrapper[4764]: I0309 13:42:50.798008 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f09c604-028e-4965-aef8-6005ae365be9-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-5k46p\" (UID: \"9f09c604-028e-4965-aef8-6005ae365be9\") " pod="openstack/nova-cell0-cell-mapping-5k46p" Mar 09 13:42:50 crc kubenswrapper[4764]: I0309 13:42:50.807916 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9f09c604-028e-4965-aef8-6005ae365be9-config-data\") pod \"nova-cell0-cell-mapping-5k46p\" (UID: \"9f09c604-028e-4965-aef8-6005ae365be9\") " pod="openstack/nova-cell0-cell-mapping-5k46p" Mar 09 13:42:50 crc kubenswrapper[4764]: I0309 13:42:50.808522 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9f09c604-028e-4965-aef8-6005ae365be9-scripts\") pod \"nova-cell0-cell-mapping-5k46p\" (UID: \"9f09c604-028e-4965-aef8-6005ae365be9\") " pod="openstack/nova-cell0-cell-mapping-5k46p" Mar 09 13:42:50 crc kubenswrapper[4764]: I0309 13:42:50.822151 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f09c604-028e-4965-aef8-6005ae365be9-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-5k46p\" (UID: \"9f09c604-028e-4965-aef8-6005ae365be9\") " pod="openstack/nova-cell0-cell-mapping-5k46p" Mar 09 13:42:50 crc kubenswrapper[4764]: I0309 13:42:50.845007 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 09 13:42:50 crc kubenswrapper[4764]: I0309 13:42:50.885320 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sb6mc\" (UniqueName: \"kubernetes.io/projected/9f09c604-028e-4965-aef8-6005ae365be9-kube-api-access-sb6mc\") pod \"nova-cell0-cell-mapping-5k46p\" (UID: \"9f09c604-028e-4965-aef8-6005ae365be9\") " pod="openstack/nova-cell0-cell-mapping-5k46p" Mar 09 13:42:50 crc kubenswrapper[4764]: I0309 13:42:50.903105 4764 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7cf59ae6-37a7-49a9-846d-e7815a57bda6-logs\") pod \"nova-api-0\" (UID: \"7cf59ae6-37a7-49a9-846d-e7815a57bda6\") " pod="openstack/nova-api-0" Mar 09 13:42:50 crc kubenswrapper[4764]: I0309 13:42:50.903153 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nkp7s\" (UniqueName: \"kubernetes.io/projected/7cf59ae6-37a7-49a9-846d-e7815a57bda6-kube-api-access-nkp7s\") pod \"nova-api-0\" (UID: \"7cf59ae6-37a7-49a9-846d-e7815a57bda6\") " pod="openstack/nova-api-0" Mar 09 13:42:50 crc kubenswrapper[4764]: I0309 13:42:50.903196 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7cf59ae6-37a7-49a9-846d-e7815a57bda6-config-data\") pod \"nova-api-0\" (UID: \"7cf59ae6-37a7-49a9-846d-e7815a57bda6\") " pod="openstack/nova-api-0" Mar 09 13:42:50 crc kubenswrapper[4764]: I0309 13:42:50.903266 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7cf59ae6-37a7-49a9-846d-e7815a57bda6-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"7cf59ae6-37a7-49a9-846d-e7815a57bda6\") " pod="openstack/nova-api-0" Mar 09 13:42:50 crc kubenswrapper[4764]: I0309 13:42:50.916370 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-5k46p" Mar 09 13:42:50 crc kubenswrapper[4764]: I0309 13:42:50.949777 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Mar 09 13:42:50 crc kubenswrapper[4764]: I0309 13:42:50.951423 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 09 13:42:50 crc kubenswrapper[4764]: I0309 13:42:50.976713 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Mar 09 13:42:50 crc kubenswrapper[4764]: I0309 13:42:50.986577 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 09 13:42:51 crc kubenswrapper[4764]: I0309 13:42:51.009377 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h84hj\" (UniqueName: \"kubernetes.io/projected/bd0fc8c3-6a60-4629-8b8d-8dd8b471f959-kube-api-access-h84hj\") pod \"nova-scheduler-0\" (UID: \"bd0fc8c3-6a60-4629-8b8d-8dd8b471f959\") " pod="openstack/nova-scheduler-0" Mar 09 13:42:51 crc kubenswrapper[4764]: I0309 13:42:51.009449 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7cf59ae6-37a7-49a9-846d-e7815a57bda6-logs\") pod \"nova-api-0\" (UID: \"7cf59ae6-37a7-49a9-846d-e7815a57bda6\") " pod="openstack/nova-api-0" Mar 09 13:42:51 crc kubenswrapper[4764]: I0309 13:42:51.009471 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nkp7s\" (UniqueName: \"kubernetes.io/projected/7cf59ae6-37a7-49a9-846d-e7815a57bda6-kube-api-access-nkp7s\") pod \"nova-api-0\" (UID: \"7cf59ae6-37a7-49a9-846d-e7815a57bda6\") " pod="openstack/nova-api-0" Mar 09 13:42:51 crc kubenswrapper[4764]: I0309 13:42:51.009512 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7cf59ae6-37a7-49a9-846d-e7815a57bda6-config-data\") pod \"nova-api-0\" (UID: \"7cf59ae6-37a7-49a9-846d-e7815a57bda6\") " pod="openstack/nova-api-0" Mar 09 13:42:51 crc kubenswrapper[4764]: I0309 13:42:51.009538 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/bd0fc8c3-6a60-4629-8b8d-8dd8b471f959-config-data\") pod \"nova-scheduler-0\" (UID: \"bd0fc8c3-6a60-4629-8b8d-8dd8b471f959\") " pod="openstack/nova-scheduler-0" Mar 09 13:42:51 crc kubenswrapper[4764]: I0309 13:42:51.009591 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7cf59ae6-37a7-49a9-846d-e7815a57bda6-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"7cf59ae6-37a7-49a9-846d-e7815a57bda6\") " pod="openstack/nova-api-0" Mar 09 13:42:51 crc kubenswrapper[4764]: I0309 13:42:51.009657 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd0fc8c3-6a60-4629-8b8d-8dd8b471f959-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"bd0fc8c3-6a60-4629-8b8d-8dd8b471f959\") " pod="openstack/nova-scheduler-0" Mar 09 13:42:51 crc kubenswrapper[4764]: I0309 13:42:51.010422 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7cf59ae6-37a7-49a9-846d-e7815a57bda6-logs\") pod \"nova-api-0\" (UID: \"7cf59ae6-37a7-49a9-846d-e7815a57bda6\") " pod="openstack/nova-api-0" Mar 09 13:42:51 crc kubenswrapper[4764]: I0309 13:42:51.035384 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7cf59ae6-37a7-49a9-846d-e7815a57bda6-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"7cf59ae6-37a7-49a9-846d-e7815a57bda6\") " pod="openstack/nova-api-0" Mar 09 13:42:51 crc kubenswrapper[4764]: I0309 13:42:51.040535 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7cf59ae6-37a7-49a9-846d-e7815a57bda6-config-data\") pod \"nova-api-0\" (UID: \"7cf59ae6-37a7-49a9-846d-e7815a57bda6\") " pod="openstack/nova-api-0" Mar 09 13:42:51 crc 
kubenswrapper[4764]: I0309 13:42:51.058285 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nkp7s\" (UniqueName: \"kubernetes.io/projected/7cf59ae6-37a7-49a9-846d-e7815a57bda6-kube-api-access-nkp7s\") pod \"nova-api-0\" (UID: \"7cf59ae6-37a7-49a9-846d-e7815a57bda6\") " pod="openstack/nova-api-0" Mar 09 13:42:51 crc kubenswrapper[4764]: I0309 13:42:51.093673 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Mar 09 13:42:51 crc kubenswrapper[4764]: I0309 13:42:51.099966 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 09 13:42:51 crc kubenswrapper[4764]: I0309 13:42:51.106315 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 09 13:42:51 crc kubenswrapper[4764]: I0309 13:42:51.111430 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h84hj\" (UniqueName: \"kubernetes.io/projected/bd0fc8c3-6a60-4629-8b8d-8dd8b471f959-kube-api-access-h84hj\") pod \"nova-scheduler-0\" (UID: \"bd0fc8c3-6a60-4629-8b8d-8dd8b471f959\") " pod="openstack/nova-scheduler-0" Mar 09 13:42:51 crc kubenswrapper[4764]: I0309 13:42:51.111781 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bd0fc8c3-6a60-4629-8b8d-8dd8b471f959-config-data\") pod \"nova-scheduler-0\" (UID: \"bd0fc8c3-6a60-4629-8b8d-8dd8b471f959\") " pod="openstack/nova-scheduler-0" Mar 09 13:42:51 crc kubenswrapper[4764]: I0309 13:42:51.111939 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd0fc8c3-6a60-4629-8b8d-8dd8b471f959-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"bd0fc8c3-6a60-4629-8b8d-8dd8b471f959\") " pod="openstack/nova-scheduler-0" Mar 09 13:42:51 crc kubenswrapper[4764]: I0309 13:42:51.117185 4764 
reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Mar 09 13:42:51 crc kubenswrapper[4764]: I0309 13:42:51.126508 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bd0fc8c3-6a60-4629-8b8d-8dd8b471f959-config-data\") pod \"nova-scheduler-0\" (UID: \"bd0fc8c3-6a60-4629-8b8d-8dd8b471f959\") " pod="openstack/nova-scheduler-0" Mar 09 13:42:51 crc kubenswrapper[4764]: I0309 13:42:51.127444 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd0fc8c3-6a60-4629-8b8d-8dd8b471f959-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"bd0fc8c3-6a60-4629-8b8d-8dd8b471f959\") " pod="openstack/nova-scheduler-0" Mar 09 13:42:51 crc kubenswrapper[4764]: I0309 13:42:51.171291 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h84hj\" (UniqueName: \"kubernetes.io/projected/bd0fc8c3-6a60-4629-8b8d-8dd8b471f959-kube-api-access-h84hj\") pod \"nova-scheduler-0\" (UID: \"bd0fc8c3-6a60-4629-8b8d-8dd8b471f959\") " pod="openstack/nova-scheduler-0" Mar 09 13:42:51 crc kubenswrapper[4764]: I0309 13:42:51.185752 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 09 13:42:51 crc kubenswrapper[4764]: I0309 13:42:51.273809 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a4a57a2f-3a75-4d6a-9fd1-046f26fb32d2-config-data\") pod \"nova-metadata-0\" (UID: \"a4a57a2f-3a75-4d6a-9fd1-046f26fb32d2\") " pod="openstack/nova-metadata-0" Mar 09 13:42:51 crc kubenswrapper[4764]: I0309 13:42:51.273911 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a4a57a2f-3a75-4d6a-9fd1-046f26fb32d2-logs\") pod \"nova-metadata-0\" (UID: 
\"a4a57a2f-3a75-4d6a-9fd1-046f26fb32d2\") " pod="openstack/nova-metadata-0" Mar 09 13:42:51 crc kubenswrapper[4764]: I0309 13:42:51.274027 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4a57a2f-3a75-4d6a-9fd1-046f26fb32d2-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"a4a57a2f-3a75-4d6a-9fd1-046f26fb32d2\") " pod="openstack/nova-metadata-0" Mar 09 13:42:51 crc kubenswrapper[4764]: I0309 13:42:51.274175 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nnhnz\" (UniqueName: \"kubernetes.io/projected/a4a57a2f-3a75-4d6a-9fd1-046f26fb32d2-kube-api-access-nnhnz\") pod \"nova-metadata-0\" (UID: \"a4a57a2f-3a75-4d6a-9fd1-046f26fb32d2\") " pod="openstack/nova-metadata-0" Mar 09 13:42:51 crc kubenswrapper[4764]: I0309 13:42:51.395386 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nnhnz\" (UniqueName: \"kubernetes.io/projected/a4a57a2f-3a75-4d6a-9fd1-046f26fb32d2-kube-api-access-nnhnz\") pod \"nova-metadata-0\" (UID: \"a4a57a2f-3a75-4d6a-9fd1-046f26fb32d2\") " pod="openstack/nova-metadata-0" Mar 09 13:42:51 crc kubenswrapper[4764]: I0309 13:42:51.395558 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a4a57a2f-3a75-4d6a-9fd1-046f26fb32d2-config-data\") pod \"nova-metadata-0\" (UID: \"a4a57a2f-3a75-4d6a-9fd1-046f26fb32d2\") " pod="openstack/nova-metadata-0" Mar 09 13:42:51 crc kubenswrapper[4764]: I0309 13:42:51.395611 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a4a57a2f-3a75-4d6a-9fd1-046f26fb32d2-logs\") pod \"nova-metadata-0\" (UID: \"a4a57a2f-3a75-4d6a-9fd1-046f26fb32d2\") " pod="openstack/nova-metadata-0" Mar 09 13:42:51 crc kubenswrapper[4764]: I0309 13:42:51.395679 
4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4a57a2f-3a75-4d6a-9fd1-046f26fb32d2-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"a4a57a2f-3a75-4d6a-9fd1-046f26fb32d2\") " pod="openstack/nova-metadata-0" Mar 09 13:42:51 crc kubenswrapper[4764]: I0309 13:42:51.404181 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a4a57a2f-3a75-4d6a-9fd1-046f26fb32d2-config-data\") pod \"nova-metadata-0\" (UID: \"a4a57a2f-3a75-4d6a-9fd1-046f26fb32d2\") " pod="openstack/nova-metadata-0" Mar 09 13:42:51 crc kubenswrapper[4764]: I0309 13:42:51.404777 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 09 13:42:51 crc kubenswrapper[4764]: I0309 13:42:51.407249 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a4a57a2f-3a75-4d6a-9fd1-046f26fb32d2-logs\") pod \"nova-metadata-0\" (UID: \"a4a57a2f-3a75-4d6a-9fd1-046f26fb32d2\") " pod="openstack/nova-metadata-0" Mar 09 13:42:51 crc kubenswrapper[4764]: I0309 13:42:51.408306 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 09 13:42:51 crc kubenswrapper[4764]: I0309 13:42:51.409715 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4a57a2f-3a75-4d6a-9fd1-046f26fb32d2-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"a4a57a2f-3a75-4d6a-9fd1-046f26fb32d2\") " pod="openstack/nova-metadata-0" Mar 09 13:42:51 crc kubenswrapper[4764]: I0309 13:42:51.422872 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 09 13:42:51 crc kubenswrapper[4764]: I0309 13:42:51.442200 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Mar 09 13:42:51 crc kubenswrapper[4764]: I0309 13:42:51.443997 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nnhnz\" (UniqueName: \"kubernetes.io/projected/a4a57a2f-3a75-4d6a-9fd1-046f26fb32d2-kube-api-access-nnhnz\") pod \"nova-metadata-0\" (UID: \"a4a57a2f-3a75-4d6a-9fd1-046f26fb32d2\") " pod="openstack/nova-metadata-0" Mar 09 13:42:51 crc kubenswrapper[4764]: I0309 13:42:51.474001 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 09 13:42:51 crc kubenswrapper[4764]: I0309 13:42:51.506757 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 09 13:42:51 crc kubenswrapper[4764]: I0309 13:42:51.507374 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ntmvb\" (UniqueName: \"kubernetes.io/projected/971aa9eb-f331-425d-bf49-d626f3552480-kube-api-access-ntmvb\") pod \"nova-cell1-novncproxy-0\" (UID: \"971aa9eb-f331-425d-bf49-d626f3552480\") " pod="openstack/nova-cell1-novncproxy-0" Mar 09 13:42:51 crc kubenswrapper[4764]: I0309 13:42:51.507425 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/971aa9eb-f331-425d-bf49-d626f3552480-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"971aa9eb-f331-425d-bf49-d626f3552480\") " pod="openstack/nova-cell1-novncproxy-0" Mar 09 13:42:51 crc kubenswrapper[4764]: I0309 13:42:51.507604 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/971aa9eb-f331-425d-bf49-d626f3552480-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"971aa9eb-f331-425d-bf49-d626f3552480\") " pod="openstack/nova-cell1-novncproxy-0" Mar 09 13:42:51 crc kubenswrapper[4764]: I0309 13:42:51.534692 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-8b8cf6657-gsgp2"] Mar 09 13:42:51 crc kubenswrapper[4764]: I0309 13:42:51.537758 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8b8cf6657-gsgp2" Mar 09 13:42:51 crc kubenswrapper[4764]: I0309 13:42:51.556422 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8b8cf6657-gsgp2"] Mar 09 13:42:51 crc kubenswrapper[4764]: I0309 13:42:51.610869 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/dd806e2d-2675-448c-96f5-2440c2e243f2-ovsdbserver-nb\") pod \"dnsmasq-dns-8b8cf6657-gsgp2\" (UID: \"dd806e2d-2675-448c-96f5-2440c2e243f2\") " pod="openstack/dnsmasq-dns-8b8cf6657-gsgp2" Mar 09 13:42:51 crc kubenswrapper[4764]: I0309 13:42:51.610972 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ntmvb\" (UniqueName: \"kubernetes.io/projected/971aa9eb-f331-425d-bf49-d626f3552480-kube-api-access-ntmvb\") pod \"nova-cell1-novncproxy-0\" (UID: \"971aa9eb-f331-425d-bf49-d626f3552480\") " pod="openstack/nova-cell1-novncproxy-0" Mar 09 13:42:51 crc kubenswrapper[4764]: I0309 13:42:51.611005 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/971aa9eb-f331-425d-bf49-d626f3552480-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"971aa9eb-f331-425d-bf49-d626f3552480\") " pod="openstack/nova-cell1-novncproxy-0" Mar 09 13:42:51 crc kubenswrapper[4764]: I0309 13:42:51.611031 4764 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dd806e2d-2675-448c-96f5-2440c2e243f2-config\") pod \"dnsmasq-dns-8b8cf6657-gsgp2\" (UID: \"dd806e2d-2675-448c-96f5-2440c2e243f2\") " pod="openstack/dnsmasq-dns-8b8cf6657-gsgp2" Mar 09 13:42:51 crc kubenswrapper[4764]: I0309 13:42:51.611125 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4xl8t\" (UniqueName: \"kubernetes.io/projected/dd806e2d-2675-448c-96f5-2440c2e243f2-kube-api-access-4xl8t\") pod \"dnsmasq-dns-8b8cf6657-gsgp2\" (UID: \"dd806e2d-2675-448c-96f5-2440c2e243f2\") " pod="openstack/dnsmasq-dns-8b8cf6657-gsgp2" Mar 09 13:42:51 crc kubenswrapper[4764]: I0309 13:42:51.611168 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/dd806e2d-2675-448c-96f5-2440c2e243f2-dns-svc\") pod \"dnsmasq-dns-8b8cf6657-gsgp2\" (UID: \"dd806e2d-2675-448c-96f5-2440c2e243f2\") " pod="openstack/dnsmasq-dns-8b8cf6657-gsgp2" Mar 09 13:42:51 crc kubenswrapper[4764]: I0309 13:42:51.611211 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/971aa9eb-f331-425d-bf49-d626f3552480-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"971aa9eb-f331-425d-bf49-d626f3552480\") " pod="openstack/nova-cell1-novncproxy-0" Mar 09 13:42:51 crc kubenswrapper[4764]: I0309 13:42:51.611279 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/dd806e2d-2675-448c-96f5-2440c2e243f2-ovsdbserver-sb\") pod \"dnsmasq-dns-8b8cf6657-gsgp2\" (UID: \"dd806e2d-2675-448c-96f5-2440c2e243f2\") " pod="openstack/dnsmasq-dns-8b8cf6657-gsgp2" Mar 09 13:42:51 crc kubenswrapper[4764]: I0309 13:42:51.627174 4764 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/971aa9eb-f331-425d-bf49-d626f3552480-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"971aa9eb-f331-425d-bf49-d626f3552480\") " pod="openstack/nova-cell1-novncproxy-0" Mar 09 13:42:51 crc kubenswrapper[4764]: I0309 13:42:51.627692 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/971aa9eb-f331-425d-bf49-d626f3552480-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"971aa9eb-f331-425d-bf49-d626f3552480\") " pod="openstack/nova-cell1-novncproxy-0" Mar 09 13:42:51 crc kubenswrapper[4764]: I0309 13:42:51.635808 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ntmvb\" (UniqueName: \"kubernetes.io/projected/971aa9eb-f331-425d-bf49-d626f3552480-kube-api-access-ntmvb\") pod \"nova-cell1-novncproxy-0\" (UID: \"971aa9eb-f331-425d-bf49-d626f3552480\") " pod="openstack/nova-cell1-novncproxy-0" Mar 09 13:42:51 crc kubenswrapper[4764]: I0309 13:42:51.713570 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/dd806e2d-2675-448c-96f5-2440c2e243f2-ovsdbserver-sb\") pod \"dnsmasq-dns-8b8cf6657-gsgp2\" (UID: \"dd806e2d-2675-448c-96f5-2440c2e243f2\") " pod="openstack/dnsmasq-dns-8b8cf6657-gsgp2" Mar 09 13:42:51 crc kubenswrapper[4764]: I0309 13:42:51.713763 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/dd806e2d-2675-448c-96f5-2440c2e243f2-ovsdbserver-nb\") pod \"dnsmasq-dns-8b8cf6657-gsgp2\" (UID: \"dd806e2d-2675-448c-96f5-2440c2e243f2\") " pod="openstack/dnsmasq-dns-8b8cf6657-gsgp2" Mar 09 13:42:51 crc kubenswrapper[4764]: I0309 13:42:51.713822 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/dd806e2d-2675-448c-96f5-2440c2e243f2-config\") pod \"dnsmasq-dns-8b8cf6657-gsgp2\" (UID: \"dd806e2d-2675-448c-96f5-2440c2e243f2\") " pod="openstack/dnsmasq-dns-8b8cf6657-gsgp2" Mar 09 13:42:51 crc kubenswrapper[4764]: I0309 13:42:51.713916 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4xl8t\" (UniqueName: \"kubernetes.io/projected/dd806e2d-2675-448c-96f5-2440c2e243f2-kube-api-access-4xl8t\") pod \"dnsmasq-dns-8b8cf6657-gsgp2\" (UID: \"dd806e2d-2675-448c-96f5-2440c2e243f2\") " pod="openstack/dnsmasq-dns-8b8cf6657-gsgp2" Mar 09 13:42:51 crc kubenswrapper[4764]: I0309 13:42:51.713954 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/dd806e2d-2675-448c-96f5-2440c2e243f2-dns-svc\") pod \"dnsmasq-dns-8b8cf6657-gsgp2\" (UID: \"dd806e2d-2675-448c-96f5-2440c2e243f2\") " pod="openstack/dnsmasq-dns-8b8cf6657-gsgp2" Mar 09 13:42:51 crc kubenswrapper[4764]: I0309 13:42:51.715228 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/dd806e2d-2675-448c-96f5-2440c2e243f2-dns-svc\") pod \"dnsmasq-dns-8b8cf6657-gsgp2\" (UID: \"dd806e2d-2675-448c-96f5-2440c2e243f2\") " pod="openstack/dnsmasq-dns-8b8cf6657-gsgp2" Mar 09 13:42:51 crc kubenswrapper[4764]: I0309 13:42:51.715386 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dd806e2d-2675-448c-96f5-2440c2e243f2-config\") pod \"dnsmasq-dns-8b8cf6657-gsgp2\" (UID: \"dd806e2d-2675-448c-96f5-2440c2e243f2\") " pod="openstack/dnsmasq-dns-8b8cf6657-gsgp2" Mar 09 13:42:51 crc kubenswrapper[4764]: I0309 13:42:51.716142 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/dd806e2d-2675-448c-96f5-2440c2e243f2-ovsdbserver-nb\") pod \"dnsmasq-dns-8b8cf6657-gsgp2\" (UID: 
\"dd806e2d-2675-448c-96f5-2440c2e243f2\") " pod="openstack/dnsmasq-dns-8b8cf6657-gsgp2" Mar 09 13:42:51 crc kubenswrapper[4764]: I0309 13:42:51.717034 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/dd806e2d-2675-448c-96f5-2440c2e243f2-ovsdbserver-sb\") pod \"dnsmasq-dns-8b8cf6657-gsgp2\" (UID: \"dd806e2d-2675-448c-96f5-2440c2e243f2\") " pod="openstack/dnsmasq-dns-8b8cf6657-gsgp2" Mar 09 13:42:51 crc kubenswrapper[4764]: I0309 13:42:51.755264 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4xl8t\" (UniqueName: \"kubernetes.io/projected/dd806e2d-2675-448c-96f5-2440c2e243f2-kube-api-access-4xl8t\") pod \"dnsmasq-dns-8b8cf6657-gsgp2\" (UID: \"dd806e2d-2675-448c-96f5-2440c2e243f2\") " pod="openstack/dnsmasq-dns-8b8cf6657-gsgp2" Mar 09 13:42:51 crc kubenswrapper[4764]: I0309 13:42:51.759625 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 09 13:42:51 crc kubenswrapper[4764]: I0309 13:42:51.868876 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-8b8cf6657-gsgp2" Mar 09 13:42:52 crc kubenswrapper[4764]: I0309 13:42:52.002155 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 09 13:42:52 crc kubenswrapper[4764]: I0309 13:42:52.058414 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-5k46p"] Mar 09 13:42:52 crc kubenswrapper[4764]: I0309 13:42:52.189227 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 09 13:42:52 crc kubenswrapper[4764]: W0309 13:42:52.200512 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbd0fc8c3_6a60_4629_8b8d_8dd8b471f959.slice/crio-6920f7803183ef0808ee3ca6aca18179666a5ad88d1d672e13be3328cbb65468 WatchSource:0}: Error finding container 6920f7803183ef0808ee3ca6aca18179666a5ad88d1d672e13be3328cbb65468: Status 404 returned error can't find the container with id 6920f7803183ef0808ee3ca6aca18179666a5ad88d1d672e13be3328cbb65468 Mar 09 13:42:52 crc kubenswrapper[4764]: I0309 13:42:52.223783 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-nlxjx"] Mar 09 13:42:52 crc kubenswrapper[4764]: I0309 13:42:52.226853 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-nlxjx" Mar 09 13:42:52 crc kubenswrapper[4764]: I0309 13:42:52.237287 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Mar 09 13:42:52 crc kubenswrapper[4764]: I0309 13:42:52.237343 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts" Mar 09 13:42:52 crc kubenswrapper[4764]: I0309 13:42:52.259992 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-nlxjx"] Mar 09 13:42:52 crc kubenswrapper[4764]: W0309 13:42:52.333081 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda4a57a2f_3a75_4d6a_9fd1_046f26fb32d2.slice/crio-fd0fea3ed4c6e561e840066a42f896e2884b69902ca33275962f893fe59f7641 WatchSource:0}: Error finding container fd0fea3ed4c6e561e840066a42f896e2884b69902ca33275962f893fe59f7641: Status 404 returned error can't find the container with id fd0fea3ed4c6e561e840066a42f896e2884b69902ca33275962f893fe59f7641 Mar 09 13:42:52 crc kubenswrapper[4764]: I0309 13:42:52.342425 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b60c99da-3ae5-4340-bcb0-870731679c16-config-data\") pod \"nova-cell1-conductor-db-sync-nlxjx\" (UID: \"b60c99da-3ae5-4340-bcb0-870731679c16\") " pod="openstack/nova-cell1-conductor-db-sync-nlxjx" Mar 09 13:42:52 crc kubenswrapper[4764]: I0309 13:42:52.342528 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b60c99da-3ae5-4340-bcb0-870731679c16-scripts\") pod \"nova-cell1-conductor-db-sync-nlxjx\" (UID: \"b60c99da-3ae5-4340-bcb0-870731679c16\") " pod="openstack/nova-cell1-conductor-db-sync-nlxjx" Mar 09 13:42:52 crc kubenswrapper[4764]: I0309 
13:42:52.342838 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b60c99da-3ae5-4340-bcb0-870731679c16-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-nlxjx\" (UID: \"b60c99da-3ae5-4340-bcb0-870731679c16\") " pod="openstack/nova-cell1-conductor-db-sync-nlxjx" Mar 09 13:42:52 crc kubenswrapper[4764]: I0309 13:42:52.342958 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xlrlv\" (UniqueName: \"kubernetes.io/projected/b60c99da-3ae5-4340-bcb0-870731679c16-kube-api-access-xlrlv\") pod \"nova-cell1-conductor-db-sync-nlxjx\" (UID: \"b60c99da-3ae5-4340-bcb0-870731679c16\") " pod="openstack/nova-cell1-conductor-db-sync-nlxjx" Mar 09 13:42:52 crc kubenswrapper[4764]: I0309 13:42:52.345930 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 09 13:42:52 crc kubenswrapper[4764]: I0309 13:42:52.424971 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 09 13:42:52 crc kubenswrapper[4764]: I0309 13:42:52.445037 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b60c99da-3ae5-4340-bcb0-870731679c16-config-data\") pod \"nova-cell1-conductor-db-sync-nlxjx\" (UID: \"b60c99da-3ae5-4340-bcb0-870731679c16\") " pod="openstack/nova-cell1-conductor-db-sync-nlxjx" Mar 09 13:42:52 crc kubenswrapper[4764]: I0309 13:42:52.445093 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b60c99da-3ae5-4340-bcb0-870731679c16-scripts\") pod \"nova-cell1-conductor-db-sync-nlxjx\" (UID: \"b60c99da-3ae5-4340-bcb0-870731679c16\") " pod="openstack/nova-cell1-conductor-db-sync-nlxjx" Mar 09 13:42:52 crc kubenswrapper[4764]: I0309 13:42:52.445167 4764 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b60c99da-3ae5-4340-bcb0-870731679c16-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-nlxjx\" (UID: \"b60c99da-3ae5-4340-bcb0-870731679c16\") " pod="openstack/nova-cell1-conductor-db-sync-nlxjx" Mar 09 13:42:52 crc kubenswrapper[4764]: I0309 13:42:52.445201 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xlrlv\" (UniqueName: \"kubernetes.io/projected/b60c99da-3ae5-4340-bcb0-870731679c16-kube-api-access-xlrlv\") pod \"nova-cell1-conductor-db-sync-nlxjx\" (UID: \"b60c99da-3ae5-4340-bcb0-870731679c16\") " pod="openstack/nova-cell1-conductor-db-sync-nlxjx" Mar 09 13:42:52 crc kubenswrapper[4764]: I0309 13:42:52.452596 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b60c99da-3ae5-4340-bcb0-870731679c16-scripts\") pod \"nova-cell1-conductor-db-sync-nlxjx\" (UID: \"b60c99da-3ae5-4340-bcb0-870731679c16\") " pod="openstack/nova-cell1-conductor-db-sync-nlxjx" Mar 09 13:42:52 crc kubenswrapper[4764]: I0309 13:42:52.452800 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b60c99da-3ae5-4340-bcb0-870731679c16-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-nlxjx\" (UID: \"b60c99da-3ae5-4340-bcb0-870731679c16\") " pod="openstack/nova-cell1-conductor-db-sync-nlxjx" Mar 09 13:42:52 crc kubenswrapper[4764]: I0309 13:42:52.452853 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b60c99da-3ae5-4340-bcb0-870731679c16-config-data\") pod \"nova-cell1-conductor-db-sync-nlxjx\" (UID: \"b60c99da-3ae5-4340-bcb0-870731679c16\") " pod="openstack/nova-cell1-conductor-db-sync-nlxjx" Mar 09 13:42:52 crc kubenswrapper[4764]: I0309 13:42:52.460325 4764 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-xlrlv\" (UniqueName: \"kubernetes.io/projected/b60c99da-3ae5-4340-bcb0-870731679c16-kube-api-access-xlrlv\") pod \"nova-cell1-conductor-db-sync-nlxjx\" (UID: \"b60c99da-3ae5-4340-bcb0-870731679c16\") " pod="openstack/nova-cell1-conductor-db-sync-nlxjx" Mar 09 13:42:52 crc kubenswrapper[4764]: I0309 13:42:52.569172 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-nlxjx" Mar 09 13:42:52 crc kubenswrapper[4764]: I0309 13:42:52.594258 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8b8cf6657-gsgp2"] Mar 09 13:42:52 crc kubenswrapper[4764]: W0309 13:42:52.602236 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddd806e2d_2675_448c_96f5_2440c2e243f2.slice/crio-725b7755875ebd6a6866a71e5813a9d3f7def7b5fe8e24fd807f8fb4b2f4627a WatchSource:0}: Error finding container 725b7755875ebd6a6866a71e5813a9d3f7def7b5fe8e24fd807f8fb4b2f4627a: Status 404 returned error can't find the container with id 725b7755875ebd6a6866a71e5813a9d3f7def7b5fe8e24fd807f8fb4b2f4627a Mar 09 13:42:52 crc kubenswrapper[4764]: I0309 13:42:52.603862 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"971aa9eb-f331-425d-bf49-d626f3552480","Type":"ContainerStarted","Data":"bd5e9cec9ddcd16266d8f6864eddb8fb5fac434d1ee835511cb22e846c128223"} Mar 09 13:42:52 crc kubenswrapper[4764]: I0309 13:42:52.604890 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"a4a57a2f-3a75-4d6a-9fd1-046f26fb32d2","Type":"ContainerStarted","Data":"fd0fea3ed4c6e561e840066a42f896e2884b69902ca33275962f893fe59f7641"} Mar 09 13:42:52 crc kubenswrapper[4764]: I0309 13:42:52.606011 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-5k46p" 
event={"ID":"9f09c604-028e-4965-aef8-6005ae365be9","Type":"ContainerStarted","Data":"d7855c57065ea118e0a7a66a2f52d85558ba845a318ccc1f11c06fba1afae771"} Mar 09 13:42:52 crc kubenswrapper[4764]: I0309 13:42:52.606039 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-5k46p" event={"ID":"9f09c604-028e-4965-aef8-6005ae365be9","Type":"ContainerStarted","Data":"1b7c3f7b353ab36332a603793952cde39e961f03595f37d7c599b56a31606280"} Mar 09 13:42:52 crc kubenswrapper[4764]: I0309 13:42:52.609634 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"7cf59ae6-37a7-49a9-846d-e7815a57bda6","Type":"ContainerStarted","Data":"572746371ca77f76f3f1aebe0923635e0af71fe2c66089e7070ed987afb57c36"} Mar 09 13:42:52 crc kubenswrapper[4764]: I0309 13:42:52.610591 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"bd0fc8c3-6a60-4629-8b8d-8dd8b471f959","Type":"ContainerStarted","Data":"6920f7803183ef0808ee3ca6aca18179666a5ad88d1d672e13be3328cbb65468"} Mar 09 13:42:53 crc kubenswrapper[4764]: I0309 13:42:53.175202 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-5k46p" podStartSLOduration=3.175179388 podStartE2EDuration="3.175179388s" podCreationTimestamp="2026-03-09 13:42:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:42:52.628902863 +0000 UTC m=+1327.879074771" watchObservedRunningTime="2026-03-09 13:42:53.175179388 +0000 UTC m=+1328.425351296" Mar 09 13:42:53 crc kubenswrapper[4764]: I0309 13:42:53.176241 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-nlxjx"] Mar 09 13:42:53 crc kubenswrapper[4764]: I0309 13:42:53.633105 4764 generic.go:334] "Generic (PLEG): container finished" podID="dd806e2d-2675-448c-96f5-2440c2e243f2" 
containerID="a48379ed6aa8003713adc5f3726b706529f649e8177a5d07e91f56612288af49" exitCode=0 Mar 09 13:42:53 crc kubenswrapper[4764]: I0309 13:42:53.633307 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8b8cf6657-gsgp2" event={"ID":"dd806e2d-2675-448c-96f5-2440c2e243f2","Type":"ContainerDied","Data":"a48379ed6aa8003713adc5f3726b706529f649e8177a5d07e91f56612288af49"} Mar 09 13:42:53 crc kubenswrapper[4764]: I0309 13:42:53.633679 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8b8cf6657-gsgp2" event={"ID":"dd806e2d-2675-448c-96f5-2440c2e243f2","Type":"ContainerStarted","Data":"725b7755875ebd6a6866a71e5813a9d3f7def7b5fe8e24fd807f8fb4b2f4627a"} Mar 09 13:42:53 crc kubenswrapper[4764]: I0309 13:42:53.642187 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-nlxjx" event={"ID":"b60c99da-3ae5-4340-bcb0-870731679c16","Type":"ContainerStarted","Data":"83d6da43ad98ba05734f9db9d33ae5993158ca7ef62b6b91c4677e94546cef00"} Mar 09 13:42:53 crc kubenswrapper[4764]: I0309 13:42:53.642270 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-nlxjx" event={"ID":"b60c99da-3ae5-4340-bcb0-870731679c16","Type":"ContainerStarted","Data":"aaf11202ef0774e446d6c1ce96fa33d0ce40eb52f56f28a40e2337ab3da0d2c0"} Mar 09 13:42:53 crc kubenswrapper[4764]: I0309 13:42:53.729326 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-nlxjx" podStartSLOduration=1.729293192 podStartE2EDuration="1.729293192s" podCreationTimestamp="2026-03-09 13:42:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:42:53.698613663 +0000 UTC m=+1328.948785571" watchObservedRunningTime="2026-03-09 13:42:53.729293192 +0000 UTC m=+1328.979465100" Mar 09 13:42:54 crc kubenswrapper[4764]: I0309 13:42:54.854917 
4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 09 13:42:54 crc kubenswrapper[4764]: I0309 13:42:54.867118 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 09 13:42:56 crc kubenswrapper[4764]: I0309 13:42:56.690577 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"bd0fc8c3-6a60-4629-8b8d-8dd8b471f959","Type":"ContainerStarted","Data":"9b4ad77f575f902209b59697f790a142bfdc593a1dbbfea9490ebe9d7bdb60c7"} Mar 09 13:42:56 crc kubenswrapper[4764]: I0309 13:42:56.693046 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"971aa9eb-f331-425d-bf49-d626f3552480","Type":"ContainerStarted","Data":"fb0bb019badceace7ef69226556c5b5a38430ea28c0d0fd7f30e7e3614d9a9fb"} Mar 09 13:42:56 crc kubenswrapper[4764]: I0309 13:42:56.693199 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="971aa9eb-f331-425d-bf49-d626f3552480" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://fb0bb019badceace7ef69226556c5b5a38430ea28c0d0fd7f30e7e3614d9a9fb" gracePeriod=30 Mar 09 13:42:56 crc kubenswrapper[4764]: I0309 13:42:56.696349 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"a4a57a2f-3a75-4d6a-9fd1-046f26fb32d2","Type":"ContainerStarted","Data":"81d1407cfee85c86ab81cdf9ec86e68979d06f5a586c4a54b9f7e6a537e80948"} Mar 09 13:42:56 crc kubenswrapper[4764]: I0309 13:42:56.699361 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8b8cf6657-gsgp2" event={"ID":"dd806e2d-2675-448c-96f5-2440c2e243f2","Type":"ContainerStarted","Data":"d735f180db949f6ebbd416610517dc4e197003ab27c3b1031338a802031ce748"} Mar 09 13:42:56 crc kubenswrapper[4764]: I0309 13:42:56.699473 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/dnsmasq-dns-8b8cf6657-gsgp2" Mar 09 13:42:56 crc kubenswrapper[4764]: I0309 13:42:56.707841 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"7cf59ae6-37a7-49a9-846d-e7815a57bda6","Type":"ContainerStarted","Data":"c32ce8455281ec944af9437b6a70a622f18333dc492467a4ab2c17a62209e49d"} Mar 09 13:42:56 crc kubenswrapper[4764]: I0309 13:42:56.718494 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.752068961 podStartE2EDuration="6.718465179s" podCreationTimestamp="2026-03-09 13:42:50 +0000 UTC" firstStartedPulling="2026-03-09 13:42:52.210895553 +0000 UTC m=+1327.461067461" lastFinishedPulling="2026-03-09 13:42:56.177291771 +0000 UTC m=+1331.427463679" observedRunningTime="2026-03-09 13:42:56.712161131 +0000 UTC m=+1331.962333059" watchObservedRunningTime="2026-03-09 13:42:56.718465179 +0000 UTC m=+1331.968637087" Mar 09 13:42:56 crc kubenswrapper[4764]: I0309 13:42:56.742691 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-8b8cf6657-gsgp2" podStartSLOduration=5.742669756 podStartE2EDuration="5.742669756s" podCreationTimestamp="2026-03-09 13:42:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:42:56.738839453 +0000 UTC m=+1331.989011361" watchObservedRunningTime="2026-03-09 13:42:56.742669756 +0000 UTC m=+1331.992841664" Mar 09 13:42:56 crc kubenswrapper[4764]: I0309 13:42:56.760440 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Mar 09 13:42:56 crc kubenswrapper[4764]: I0309 13:42:56.770119 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.009900846 podStartE2EDuration="5.770097758s" podCreationTimestamp="2026-03-09 13:42:51 +0000 UTC" 
firstStartedPulling="2026-03-09 13:42:52.422365449 +0000 UTC m=+1327.672537357" lastFinishedPulling="2026-03-09 13:42:56.182562361 +0000 UTC m=+1331.432734269" observedRunningTime="2026-03-09 13:42:56.760331077 +0000 UTC m=+1332.010502995" watchObservedRunningTime="2026-03-09 13:42:56.770097758 +0000 UTC m=+1332.020269656" Mar 09 13:42:57 crc kubenswrapper[4764]: I0309 13:42:57.720415 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"a4a57a2f-3a75-4d6a-9fd1-046f26fb32d2","Type":"ContainerStarted","Data":"eb86255e700a4d6357cce0dc3140544f102a32bacc79bd2992bea6bc6385fe75"} Mar 09 13:42:57 crc kubenswrapper[4764]: I0309 13:42:57.720597 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="a4a57a2f-3a75-4d6a-9fd1-046f26fb32d2" containerName="nova-metadata-log" containerID="cri-o://81d1407cfee85c86ab81cdf9ec86e68979d06f5a586c4a54b9f7e6a537e80948" gracePeriod=30 Mar 09 13:42:57 crc kubenswrapper[4764]: I0309 13:42:57.720705 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="a4a57a2f-3a75-4d6a-9fd1-046f26fb32d2" containerName="nova-metadata-metadata" containerID="cri-o://eb86255e700a4d6357cce0dc3140544f102a32bacc79bd2992bea6bc6385fe75" gracePeriod=30 Mar 09 13:42:57 crc kubenswrapper[4764]: I0309 13:42:57.724256 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"7cf59ae6-37a7-49a9-846d-e7815a57bda6","Type":"ContainerStarted","Data":"d605216502245286a1e6a03c9b52f7f75118161d38b5ee337edb5e76def5a35a"} Mar 09 13:42:57 crc kubenswrapper[4764]: I0309 13:42:57.766758 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=3.925768507 podStartE2EDuration="7.766738726s" podCreationTimestamp="2026-03-09 13:42:50 +0000 UTC" firstStartedPulling="2026-03-09 13:42:52.335708025 +0000 UTC 
m=+1327.585879933" lastFinishedPulling="2026-03-09 13:42:56.176678244 +0000 UTC m=+1331.426850152" observedRunningTime="2026-03-09 13:42:57.746682671 +0000 UTC m=+1332.996854579" watchObservedRunningTime="2026-03-09 13:42:57.766738726 +0000 UTC m=+1333.016910634" Mar 09 13:42:58 crc kubenswrapper[4764]: I0309 13:42:58.736663 4764 generic.go:334] "Generic (PLEG): container finished" podID="a4a57a2f-3a75-4d6a-9fd1-046f26fb32d2" containerID="eb86255e700a4d6357cce0dc3140544f102a32bacc79bd2992bea6bc6385fe75" exitCode=0 Mar 09 13:42:58 crc kubenswrapper[4764]: I0309 13:42:58.738038 4764 generic.go:334] "Generic (PLEG): container finished" podID="a4a57a2f-3a75-4d6a-9fd1-046f26fb32d2" containerID="81d1407cfee85c86ab81cdf9ec86e68979d06f5a586c4a54b9f7e6a537e80948" exitCode=143 Mar 09 13:42:58 crc kubenswrapper[4764]: I0309 13:42:58.736842 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"a4a57a2f-3a75-4d6a-9fd1-046f26fb32d2","Type":"ContainerDied","Data":"eb86255e700a4d6357cce0dc3140544f102a32bacc79bd2992bea6bc6385fe75"} Mar 09 13:42:58 crc kubenswrapper[4764]: I0309 13:42:58.738219 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"a4a57a2f-3a75-4d6a-9fd1-046f26fb32d2","Type":"ContainerDied","Data":"81d1407cfee85c86ab81cdf9ec86e68979d06f5a586c4a54b9f7e6a537e80948"} Mar 09 13:42:58 crc kubenswrapper[4764]: I0309 13:42:58.738237 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"a4a57a2f-3a75-4d6a-9fd1-046f26fb32d2","Type":"ContainerDied","Data":"fd0fea3ed4c6e561e840066a42f896e2884b69902ca33275962f893fe59f7641"} Mar 09 13:42:58 crc kubenswrapper[4764]: I0309 13:42:58.738250 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fd0fea3ed4c6e561e840066a42f896e2884b69902ca33275962f893fe59f7641" Mar 09 13:42:58 crc kubenswrapper[4764]: I0309 13:42:58.803613 4764 util.go:48] "No ready sandbox for pod 
can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 09 13:42:58 crc kubenswrapper[4764]: I0309 13:42:58.837971 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=4.676291276 podStartE2EDuration="8.837948977s" podCreationTimestamp="2026-03-09 13:42:50 +0000 UTC" firstStartedPulling="2026-03-09 13:42:52.013551814 +0000 UTC m=+1327.263723732" lastFinishedPulling="2026-03-09 13:42:56.175209525 +0000 UTC m=+1331.425381433" observedRunningTime="2026-03-09 13:42:57.779691352 +0000 UTC m=+1333.029863280" watchObservedRunningTime="2026-03-09 13:42:58.837948977 +0000 UTC m=+1334.088120875" Mar 09 13:42:58 crc kubenswrapper[4764]: I0309 13:42:58.859520 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nnhnz\" (UniqueName: \"kubernetes.io/projected/a4a57a2f-3a75-4d6a-9fd1-046f26fb32d2-kube-api-access-nnhnz\") pod \"a4a57a2f-3a75-4d6a-9fd1-046f26fb32d2\" (UID: \"a4a57a2f-3a75-4d6a-9fd1-046f26fb32d2\") " Mar 09 13:42:58 crc kubenswrapper[4764]: I0309 13:42:58.859624 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a4a57a2f-3a75-4d6a-9fd1-046f26fb32d2-logs\") pod \"a4a57a2f-3a75-4d6a-9fd1-046f26fb32d2\" (UID: \"a4a57a2f-3a75-4d6a-9fd1-046f26fb32d2\") " Mar 09 13:42:58 crc kubenswrapper[4764]: I0309 13:42:58.859778 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a4a57a2f-3a75-4d6a-9fd1-046f26fb32d2-config-data\") pod \"a4a57a2f-3a75-4d6a-9fd1-046f26fb32d2\" (UID: \"a4a57a2f-3a75-4d6a-9fd1-046f26fb32d2\") " Mar 09 13:42:58 crc kubenswrapper[4764]: I0309 13:42:58.859949 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4a57a2f-3a75-4d6a-9fd1-046f26fb32d2-combined-ca-bundle\") pod 
\"a4a57a2f-3a75-4d6a-9fd1-046f26fb32d2\" (UID: \"a4a57a2f-3a75-4d6a-9fd1-046f26fb32d2\") " Mar 09 13:42:58 crc kubenswrapper[4764]: I0309 13:42:58.860934 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a4a57a2f-3a75-4d6a-9fd1-046f26fb32d2-logs" (OuterVolumeSpecName: "logs") pod "a4a57a2f-3a75-4d6a-9fd1-046f26fb32d2" (UID: "a4a57a2f-3a75-4d6a-9fd1-046f26fb32d2"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 13:42:58 crc kubenswrapper[4764]: I0309 13:42:58.868781 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a4a57a2f-3a75-4d6a-9fd1-046f26fb32d2-kube-api-access-nnhnz" (OuterVolumeSpecName: "kube-api-access-nnhnz") pod "a4a57a2f-3a75-4d6a-9fd1-046f26fb32d2" (UID: "a4a57a2f-3a75-4d6a-9fd1-046f26fb32d2"). InnerVolumeSpecName "kube-api-access-nnhnz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:42:58 crc kubenswrapper[4764]: I0309 13:42:58.899688 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a4a57a2f-3a75-4d6a-9fd1-046f26fb32d2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a4a57a2f-3a75-4d6a-9fd1-046f26fb32d2" (UID: "a4a57a2f-3a75-4d6a-9fd1-046f26fb32d2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:42:58 crc kubenswrapper[4764]: I0309 13:42:58.902568 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a4a57a2f-3a75-4d6a-9fd1-046f26fb32d2-config-data" (OuterVolumeSpecName: "config-data") pod "a4a57a2f-3a75-4d6a-9fd1-046f26fb32d2" (UID: "a4a57a2f-3a75-4d6a-9fd1-046f26fb32d2"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:42:58 crc kubenswrapper[4764]: I0309 13:42:58.962383 4764 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a4a57a2f-3a75-4d6a-9fd1-046f26fb32d2-config-data\") on node \"crc\" DevicePath \"\"" Mar 09 13:42:58 crc kubenswrapper[4764]: I0309 13:42:58.962436 4764 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4a57a2f-3a75-4d6a-9fd1-046f26fb32d2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 13:42:58 crc kubenswrapper[4764]: I0309 13:42:58.962455 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nnhnz\" (UniqueName: \"kubernetes.io/projected/a4a57a2f-3a75-4d6a-9fd1-046f26fb32d2-kube-api-access-nnhnz\") on node \"crc\" DevicePath \"\"" Mar 09 13:42:58 crc kubenswrapper[4764]: I0309 13:42:58.962469 4764 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a4a57a2f-3a75-4d6a-9fd1-046f26fb32d2-logs\") on node \"crc\" DevicePath \"\"" Mar 09 13:42:59 crc kubenswrapper[4764]: I0309 13:42:59.764679 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 09 13:42:59 crc kubenswrapper[4764]: I0309 13:42:59.814466 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 09 13:42:59 crc kubenswrapper[4764]: I0309 13:42:59.841066 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Mar 09 13:42:59 crc kubenswrapper[4764]: I0309 13:42:59.853112 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Mar 09 13:42:59 crc kubenswrapper[4764]: E0309 13:42:59.853924 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a4a57a2f-3a75-4d6a-9fd1-046f26fb32d2" containerName="nova-metadata-metadata" Mar 09 13:42:59 crc kubenswrapper[4764]: I0309 13:42:59.854051 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4a57a2f-3a75-4d6a-9fd1-046f26fb32d2" containerName="nova-metadata-metadata" Mar 09 13:42:59 crc kubenswrapper[4764]: E0309 13:42:59.854213 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a4a57a2f-3a75-4d6a-9fd1-046f26fb32d2" containerName="nova-metadata-log" Mar 09 13:42:59 crc kubenswrapper[4764]: I0309 13:42:59.854330 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4a57a2f-3a75-4d6a-9fd1-046f26fb32d2" containerName="nova-metadata-log" Mar 09 13:42:59 crc kubenswrapper[4764]: I0309 13:42:59.854749 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="a4a57a2f-3a75-4d6a-9fd1-046f26fb32d2" containerName="nova-metadata-metadata" Mar 09 13:42:59 crc kubenswrapper[4764]: I0309 13:42:59.854857 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="a4a57a2f-3a75-4d6a-9fd1-046f26fb32d2" containerName="nova-metadata-log" Mar 09 13:42:59 crc kubenswrapper[4764]: I0309 13:42:59.856489 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 09 13:42:59 crc kubenswrapper[4764]: I0309 13:42:59.863394 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Mar 09 13:42:59 crc kubenswrapper[4764]: I0309 13:42:59.865055 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Mar 09 13:42:59 crc kubenswrapper[4764]: I0309 13:42:59.869290 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 09 13:42:59 crc kubenswrapper[4764]: I0309 13:42:59.994434 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/cd9d1249-acf4-4cf5-a350-d4669d003a62-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"cd9d1249-acf4-4cf5-a350-d4669d003a62\") " pod="openstack/nova-metadata-0" Mar 09 13:42:59 crc kubenswrapper[4764]: I0309 13:42:59.994951 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-24nvs\" (UniqueName: \"kubernetes.io/projected/cd9d1249-acf4-4cf5-a350-d4669d003a62-kube-api-access-24nvs\") pod \"nova-metadata-0\" (UID: \"cd9d1249-acf4-4cf5-a350-d4669d003a62\") " pod="openstack/nova-metadata-0" Mar 09 13:42:59 crc kubenswrapper[4764]: I0309 13:42:59.994991 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cd9d1249-acf4-4cf5-a350-d4669d003a62-logs\") pod \"nova-metadata-0\" (UID: \"cd9d1249-acf4-4cf5-a350-d4669d003a62\") " pod="openstack/nova-metadata-0" Mar 09 13:42:59 crc kubenswrapper[4764]: I0309 13:42:59.995046 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cd9d1249-acf4-4cf5-a350-d4669d003a62-combined-ca-bundle\") pod 
\"nova-metadata-0\" (UID: \"cd9d1249-acf4-4cf5-a350-d4669d003a62\") " pod="openstack/nova-metadata-0" Mar 09 13:42:59 crc kubenswrapper[4764]: I0309 13:42:59.995207 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cd9d1249-acf4-4cf5-a350-d4669d003a62-config-data\") pod \"nova-metadata-0\" (UID: \"cd9d1249-acf4-4cf5-a350-d4669d003a62\") " pod="openstack/nova-metadata-0" Mar 09 13:43:00 crc kubenswrapper[4764]: I0309 13:43:00.097580 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/cd9d1249-acf4-4cf5-a350-d4669d003a62-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"cd9d1249-acf4-4cf5-a350-d4669d003a62\") " pod="openstack/nova-metadata-0" Mar 09 13:43:00 crc kubenswrapper[4764]: I0309 13:43:00.097698 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-24nvs\" (UniqueName: \"kubernetes.io/projected/cd9d1249-acf4-4cf5-a350-d4669d003a62-kube-api-access-24nvs\") pod \"nova-metadata-0\" (UID: \"cd9d1249-acf4-4cf5-a350-d4669d003a62\") " pod="openstack/nova-metadata-0" Mar 09 13:43:00 crc kubenswrapper[4764]: I0309 13:43:00.097743 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cd9d1249-acf4-4cf5-a350-d4669d003a62-logs\") pod \"nova-metadata-0\" (UID: \"cd9d1249-acf4-4cf5-a350-d4669d003a62\") " pod="openstack/nova-metadata-0" Mar 09 13:43:00 crc kubenswrapper[4764]: I0309 13:43:00.097802 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cd9d1249-acf4-4cf5-a350-d4669d003a62-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"cd9d1249-acf4-4cf5-a350-d4669d003a62\") " pod="openstack/nova-metadata-0" Mar 09 13:43:00 crc kubenswrapper[4764]: I0309 
13:43:00.097835 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cd9d1249-acf4-4cf5-a350-d4669d003a62-config-data\") pod \"nova-metadata-0\" (UID: \"cd9d1249-acf4-4cf5-a350-d4669d003a62\") " pod="openstack/nova-metadata-0" Mar 09 13:43:00 crc kubenswrapper[4764]: I0309 13:43:00.098576 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cd9d1249-acf4-4cf5-a350-d4669d003a62-logs\") pod \"nova-metadata-0\" (UID: \"cd9d1249-acf4-4cf5-a350-d4669d003a62\") " pod="openstack/nova-metadata-0" Mar 09 13:43:00 crc kubenswrapper[4764]: I0309 13:43:00.102406 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/cd9d1249-acf4-4cf5-a350-d4669d003a62-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"cd9d1249-acf4-4cf5-a350-d4669d003a62\") " pod="openstack/nova-metadata-0" Mar 09 13:43:00 crc kubenswrapper[4764]: I0309 13:43:00.103061 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cd9d1249-acf4-4cf5-a350-d4669d003a62-config-data\") pod \"nova-metadata-0\" (UID: \"cd9d1249-acf4-4cf5-a350-d4669d003a62\") " pod="openstack/nova-metadata-0" Mar 09 13:43:00 crc kubenswrapper[4764]: I0309 13:43:00.103947 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cd9d1249-acf4-4cf5-a350-d4669d003a62-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"cd9d1249-acf4-4cf5-a350-d4669d003a62\") " pod="openstack/nova-metadata-0" Mar 09 13:43:00 crc kubenswrapper[4764]: I0309 13:43:00.123149 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-24nvs\" (UniqueName: \"kubernetes.io/projected/cd9d1249-acf4-4cf5-a350-d4669d003a62-kube-api-access-24nvs\") pod 
\"nova-metadata-0\" (UID: \"cd9d1249-acf4-4cf5-a350-d4669d003a62\") " pod="openstack/nova-metadata-0" Mar 09 13:43:00 crc kubenswrapper[4764]: I0309 13:43:00.182408 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 09 13:43:00 crc kubenswrapper[4764]: I0309 13:43:00.691329 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 09 13:43:00 crc kubenswrapper[4764]: W0309 13:43:00.701992 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcd9d1249_acf4_4cf5_a350_d4669d003a62.slice/crio-ed67ba834fc7606ce45fe1f977a16e9be6d0e085710cf0629ecabbea579cc2e4 WatchSource:0}: Error finding container ed67ba834fc7606ce45fe1f977a16e9be6d0e085710cf0629ecabbea579cc2e4: Status 404 returned error can't find the container with id ed67ba834fc7606ce45fe1f977a16e9be6d0e085710cf0629ecabbea579cc2e4 Mar 09 13:43:00 crc kubenswrapper[4764]: I0309 13:43:00.775252 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"cd9d1249-acf4-4cf5-a350-d4669d003a62","Type":"ContainerStarted","Data":"ed67ba834fc7606ce45fe1f977a16e9be6d0e085710cf0629ecabbea579cc2e4"} Mar 09 13:43:01 crc kubenswrapper[4764]: I0309 13:43:01.114085 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 09 13:43:01 crc kubenswrapper[4764]: I0309 13:43:01.114621 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 09 13:43:01 crc kubenswrapper[4764]: I0309 13:43:01.410070 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Mar 09 13:43:01 crc kubenswrapper[4764]: I0309 13:43:01.412246 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Mar 09 13:43:01 crc kubenswrapper[4764]: I0309 13:43:01.446187 
4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Mar 09 13:43:01 crc kubenswrapper[4764]: I0309 13:43:01.580008 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a4a57a2f-3a75-4d6a-9fd1-046f26fb32d2" path="/var/lib/kubelet/pods/a4a57a2f-3a75-4d6a-9fd1-046f26fb32d2/volumes" Mar 09 13:43:01 crc kubenswrapper[4764]: I0309 13:43:01.787745 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"cd9d1249-acf4-4cf5-a350-d4669d003a62","Type":"ContainerStarted","Data":"f25bf6e9a5305bba01f06fcad8a45aed9a3a13bbe0f4d9055f3324ca8a56f788"} Mar 09 13:43:01 crc kubenswrapper[4764]: I0309 13:43:01.787834 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"cd9d1249-acf4-4cf5-a350-d4669d003a62","Type":"ContainerStarted","Data":"b3430f3076fe3ab1c512978e7bbb98434aa249b306d65493cf9204eac67f6af3"} Mar 09 13:43:01 crc kubenswrapper[4764]: I0309 13:43:01.806806 4764 generic.go:334] "Generic (PLEG): container finished" podID="b60c99da-3ae5-4340-bcb0-870731679c16" containerID="83d6da43ad98ba05734f9db9d33ae5993158ca7ef62b6b91c4677e94546cef00" exitCode=0 Mar 09 13:43:01 crc kubenswrapper[4764]: I0309 13:43:01.808550 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-nlxjx" event={"ID":"b60c99da-3ae5-4340-bcb0-870731679c16","Type":"ContainerDied","Data":"83d6da43ad98ba05734f9db9d33ae5993158ca7ef62b6b91c4677e94546cef00"} Mar 09 13:43:01 crc kubenswrapper[4764]: I0309 13:43:01.820901 4764 generic.go:334] "Generic (PLEG): container finished" podID="9f09c604-028e-4965-aef8-6005ae365be9" containerID="d7855c57065ea118e0a7a66a2f52d85558ba845a318ccc1f11c06fba1afae771" exitCode=0 Mar 09 13:43:01 crc kubenswrapper[4764]: I0309 13:43:01.822057 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-5k46p" 
event={"ID":"9f09c604-028e-4965-aef8-6005ae365be9","Type":"ContainerDied","Data":"d7855c57065ea118e0a7a66a2f52d85558ba845a318ccc1f11c06fba1afae771"} Mar 09 13:43:01 crc kubenswrapper[4764]: I0309 13:43:01.824026 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.82399874 podStartE2EDuration="2.82399874s" podCreationTimestamp="2026-03-09 13:42:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:43:01.817454516 +0000 UTC m=+1337.067626444" watchObservedRunningTime="2026-03-09 13:43:01.82399874 +0000 UTC m=+1337.074170648" Mar 09 13:43:01 crc kubenswrapper[4764]: I0309 13:43:01.875959 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-8b8cf6657-gsgp2" Mar 09 13:43:01 crc kubenswrapper[4764]: I0309 13:43:01.884991 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Mar 09 13:43:02 crc kubenswrapper[4764]: I0309 13:43:02.001175 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-58db5546cc-d2v25"] Mar 09 13:43:02 crc kubenswrapper[4764]: I0309 13:43:02.001575 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-58db5546cc-d2v25" podUID="4e886001-842a-4f97-b4c3-d088d80e6a45" containerName="dnsmasq-dns" containerID="cri-o://f733fa97c901fa8349b84558eaddb5374df152c045dfc4a2b694279f5d3fa434" gracePeriod=10 Mar 09 13:43:02 crc kubenswrapper[4764]: I0309 13:43:02.202000 4764 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="7cf59ae6-37a7-49a9-846d-e7815a57bda6" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.177:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 09 13:43:02 crc kubenswrapper[4764]: I0309 
13:43:02.202292 4764 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="7cf59ae6-37a7-49a9-846d-e7815a57bda6" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.177:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 09 13:43:02 crc kubenswrapper[4764]: I0309 13:43:02.596895 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-58db5546cc-d2v25" Mar 09 13:43:02 crc kubenswrapper[4764]: I0309 13:43:02.664352 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4e886001-842a-4f97-b4c3-d088d80e6a45-ovsdbserver-nb\") pod \"4e886001-842a-4f97-b4c3-d088d80e6a45\" (UID: \"4e886001-842a-4f97-b4c3-d088d80e6a45\") " Mar 09 13:43:02 crc kubenswrapper[4764]: I0309 13:43:02.664447 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4e886001-842a-4f97-b4c3-d088d80e6a45-dns-svc\") pod \"4e886001-842a-4f97-b4c3-d088d80e6a45\" (UID: \"4e886001-842a-4f97-b4c3-d088d80e6a45\") " Mar 09 13:43:02 crc kubenswrapper[4764]: I0309 13:43:02.664525 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zcpch\" (UniqueName: \"kubernetes.io/projected/4e886001-842a-4f97-b4c3-d088d80e6a45-kube-api-access-zcpch\") pod \"4e886001-842a-4f97-b4c3-d088d80e6a45\" (UID: \"4e886001-842a-4f97-b4c3-d088d80e6a45\") " Mar 09 13:43:02 crc kubenswrapper[4764]: I0309 13:43:02.664614 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4e886001-842a-4f97-b4c3-d088d80e6a45-config\") pod \"4e886001-842a-4f97-b4c3-d088d80e6a45\" (UID: \"4e886001-842a-4f97-b4c3-d088d80e6a45\") " Mar 09 13:43:02 crc kubenswrapper[4764]: I0309 13:43:02.664853 4764 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4e886001-842a-4f97-b4c3-d088d80e6a45-ovsdbserver-sb\") pod \"4e886001-842a-4f97-b4c3-d088d80e6a45\" (UID: \"4e886001-842a-4f97-b4c3-d088d80e6a45\") " Mar 09 13:43:02 crc kubenswrapper[4764]: I0309 13:43:02.678571 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4e886001-842a-4f97-b4c3-d088d80e6a45-kube-api-access-zcpch" (OuterVolumeSpecName: "kube-api-access-zcpch") pod "4e886001-842a-4f97-b4c3-d088d80e6a45" (UID: "4e886001-842a-4f97-b4c3-d088d80e6a45"). InnerVolumeSpecName "kube-api-access-zcpch". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:43:02 crc kubenswrapper[4764]: I0309 13:43:02.731635 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4e886001-842a-4f97-b4c3-d088d80e6a45-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "4e886001-842a-4f97-b4c3-d088d80e6a45" (UID: "4e886001-842a-4f97-b4c3-d088d80e6a45"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:43:02 crc kubenswrapper[4764]: I0309 13:43:02.768994 4764 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4e886001-842a-4f97-b4c3-d088d80e6a45-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 09 13:43:02 crc kubenswrapper[4764]: I0309 13:43:02.769219 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zcpch\" (UniqueName: \"kubernetes.io/projected/4e886001-842a-4f97-b4c3-d088d80e6a45-kube-api-access-zcpch\") on node \"crc\" DevicePath \"\"" Mar 09 13:43:02 crc kubenswrapper[4764]: I0309 13:43:02.776888 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4e886001-842a-4f97-b4c3-d088d80e6a45-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "4e886001-842a-4f97-b4c3-d088d80e6a45" (UID: "4e886001-842a-4f97-b4c3-d088d80e6a45"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:43:02 crc kubenswrapper[4764]: I0309 13:43:02.776920 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4e886001-842a-4f97-b4c3-d088d80e6a45-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "4e886001-842a-4f97-b4c3-d088d80e6a45" (UID: "4e886001-842a-4f97-b4c3-d088d80e6a45"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:43:02 crc kubenswrapper[4764]: I0309 13:43:02.780265 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4e886001-842a-4f97-b4c3-d088d80e6a45-config" (OuterVolumeSpecName: "config") pod "4e886001-842a-4f97-b4c3-d088d80e6a45" (UID: "4e886001-842a-4f97-b4c3-d088d80e6a45"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:43:02 crc kubenswrapper[4764]: I0309 13:43:02.834536 4764 generic.go:334] "Generic (PLEG): container finished" podID="4e886001-842a-4f97-b4c3-d088d80e6a45" containerID="f733fa97c901fa8349b84558eaddb5374df152c045dfc4a2b694279f5d3fa434" exitCode=0 Mar 09 13:43:02 crc kubenswrapper[4764]: I0309 13:43:02.834633 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-58db5546cc-d2v25" Mar 09 13:43:02 crc kubenswrapper[4764]: I0309 13:43:02.834779 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58db5546cc-d2v25" event={"ID":"4e886001-842a-4f97-b4c3-d088d80e6a45","Type":"ContainerDied","Data":"f733fa97c901fa8349b84558eaddb5374df152c045dfc4a2b694279f5d3fa434"} Mar 09 13:43:02 crc kubenswrapper[4764]: I0309 13:43:02.834890 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58db5546cc-d2v25" event={"ID":"4e886001-842a-4f97-b4c3-d088d80e6a45","Type":"ContainerDied","Data":"f8318b8e268cec9ccfcf591135ec8e9761aa9bf10f09e2ff5ebd0b76bbd7c843"} Mar 09 13:43:02 crc kubenswrapper[4764]: I0309 13:43:02.834936 4764 scope.go:117] "RemoveContainer" containerID="f733fa97c901fa8349b84558eaddb5374df152c045dfc4a2b694279f5d3fa434" Mar 09 13:43:02 crc kubenswrapper[4764]: I0309 13:43:02.871345 4764 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4e886001-842a-4f97-b4c3-d088d80e6a45-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 09 13:43:02 crc kubenswrapper[4764]: I0309 13:43:02.871387 4764 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4e886001-842a-4f97-b4c3-d088d80e6a45-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 09 13:43:02 crc kubenswrapper[4764]: I0309 13:43:02.871398 4764 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/4e886001-842a-4f97-b4c3-d088d80e6a45-config\") on node \"crc\" DevicePath \"\"" Mar 09 13:43:02 crc kubenswrapper[4764]: I0309 13:43:02.884453 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-58db5546cc-d2v25"] Mar 09 13:43:02 crc kubenswrapper[4764]: I0309 13:43:02.894160 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-58db5546cc-d2v25"] Mar 09 13:43:02 crc kubenswrapper[4764]: I0309 13:43:02.898009 4764 scope.go:117] "RemoveContainer" containerID="d406e10da6d16d2f9b6ec89cf139706a3d0d52a75646e6e8156525cfc445bd01" Mar 09 13:43:02 crc kubenswrapper[4764]: I0309 13:43:02.933978 4764 scope.go:117] "RemoveContainer" containerID="f733fa97c901fa8349b84558eaddb5374df152c045dfc4a2b694279f5d3fa434" Mar 09 13:43:02 crc kubenswrapper[4764]: E0309 13:43:02.935326 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f733fa97c901fa8349b84558eaddb5374df152c045dfc4a2b694279f5d3fa434\": container with ID starting with f733fa97c901fa8349b84558eaddb5374df152c045dfc4a2b694279f5d3fa434 not found: ID does not exist" containerID="f733fa97c901fa8349b84558eaddb5374df152c045dfc4a2b694279f5d3fa434" Mar 09 13:43:02 crc kubenswrapper[4764]: I0309 13:43:02.935364 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f733fa97c901fa8349b84558eaddb5374df152c045dfc4a2b694279f5d3fa434"} err="failed to get container status \"f733fa97c901fa8349b84558eaddb5374df152c045dfc4a2b694279f5d3fa434\": rpc error: code = NotFound desc = could not find container \"f733fa97c901fa8349b84558eaddb5374df152c045dfc4a2b694279f5d3fa434\": container with ID starting with f733fa97c901fa8349b84558eaddb5374df152c045dfc4a2b694279f5d3fa434 not found: ID does not exist" Mar 09 13:43:02 crc kubenswrapper[4764]: I0309 13:43:02.935388 4764 scope.go:117] "RemoveContainer" 
containerID="d406e10da6d16d2f9b6ec89cf139706a3d0d52a75646e6e8156525cfc445bd01" Mar 09 13:43:02 crc kubenswrapper[4764]: E0309 13:43:02.945323 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d406e10da6d16d2f9b6ec89cf139706a3d0d52a75646e6e8156525cfc445bd01\": container with ID starting with d406e10da6d16d2f9b6ec89cf139706a3d0d52a75646e6e8156525cfc445bd01 not found: ID does not exist" containerID="d406e10da6d16d2f9b6ec89cf139706a3d0d52a75646e6e8156525cfc445bd01" Mar 09 13:43:02 crc kubenswrapper[4764]: I0309 13:43:02.945363 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d406e10da6d16d2f9b6ec89cf139706a3d0d52a75646e6e8156525cfc445bd01"} err="failed to get container status \"d406e10da6d16d2f9b6ec89cf139706a3d0d52a75646e6e8156525cfc445bd01\": rpc error: code = NotFound desc = could not find container \"d406e10da6d16d2f9b6ec89cf139706a3d0d52a75646e6e8156525cfc445bd01\": container with ID starting with d406e10da6d16d2f9b6ec89cf139706a3d0d52a75646e6e8156525cfc445bd01 not found: ID does not exist" Mar 09 13:43:03 crc kubenswrapper[4764]: I0309 13:43:03.372591 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-5k46p" Mar 09 13:43:03 crc kubenswrapper[4764]: I0309 13:43:03.380000 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-nlxjx" Mar 09 13:43:03 crc kubenswrapper[4764]: I0309 13:43:03.493430 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b60c99da-3ae5-4340-bcb0-870731679c16-scripts\") pod \"b60c99da-3ae5-4340-bcb0-870731679c16\" (UID: \"b60c99da-3ae5-4340-bcb0-870731679c16\") " Mar 09 13:43:03 crc kubenswrapper[4764]: I0309 13:43:03.493605 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b60c99da-3ae5-4340-bcb0-870731679c16-combined-ca-bundle\") pod \"b60c99da-3ae5-4340-bcb0-870731679c16\" (UID: \"b60c99da-3ae5-4340-bcb0-870731679c16\") " Mar 09 13:43:03 crc kubenswrapper[4764]: I0309 13:43:03.494836 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6mc\" (UniqueName: \"kubernetes.io/projected/9f09c604-028e-4965-aef8-6005ae365be9-kube-api-access-sb6mc\") pod \"9f09c604-028e-4965-aef8-6005ae365be9\" (UID: \"9f09c604-028e-4965-aef8-6005ae365be9\") " Mar 09 13:43:03 crc kubenswrapper[4764]: I0309 13:43:03.494952 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9f09c604-028e-4965-aef8-6005ae365be9-scripts\") pod \"9f09c604-028e-4965-aef8-6005ae365be9\" (UID: \"9f09c604-028e-4965-aef8-6005ae365be9\") " Mar 09 13:43:03 crc kubenswrapper[4764]: I0309 13:43:03.494987 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9f09c604-028e-4965-aef8-6005ae365be9-config-data\") pod \"9f09c604-028e-4965-aef8-6005ae365be9\" (UID: \"9f09c604-028e-4965-aef8-6005ae365be9\") " Mar 09 13:43:03 crc kubenswrapper[4764]: I0309 13:43:03.495017 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xlrlv\" 
(UniqueName: \"kubernetes.io/projected/b60c99da-3ae5-4340-bcb0-870731679c16-kube-api-access-xlrlv\") pod \"b60c99da-3ae5-4340-bcb0-870731679c16\" (UID: \"b60c99da-3ae5-4340-bcb0-870731679c16\") " Mar 09 13:43:03 crc kubenswrapper[4764]: I0309 13:43:03.495044 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f09c604-028e-4965-aef8-6005ae365be9-combined-ca-bundle\") pod \"9f09c604-028e-4965-aef8-6005ae365be9\" (UID: \"9f09c604-028e-4965-aef8-6005ae365be9\") " Mar 09 13:43:03 crc kubenswrapper[4764]: I0309 13:43:03.495095 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b60c99da-3ae5-4340-bcb0-870731679c16-config-data\") pod \"b60c99da-3ae5-4340-bcb0-870731679c16\" (UID: \"b60c99da-3ae5-4340-bcb0-870731679c16\") " Mar 09 13:43:03 crc kubenswrapper[4764]: I0309 13:43:03.499346 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b60c99da-3ae5-4340-bcb0-870731679c16-scripts" (OuterVolumeSpecName: "scripts") pod "b60c99da-3ae5-4340-bcb0-870731679c16" (UID: "b60c99da-3ae5-4340-bcb0-870731679c16"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:43:03 crc kubenswrapper[4764]: I0309 13:43:03.507451 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9f09c604-028e-4965-aef8-6005ae365be9-kube-api-access-sb6mc" (OuterVolumeSpecName: "kube-api-access-sb6mc") pod "9f09c604-028e-4965-aef8-6005ae365be9" (UID: "9f09c604-028e-4965-aef8-6005ae365be9"). InnerVolumeSpecName "kube-api-access-sb6mc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:43:03 crc kubenswrapper[4764]: I0309 13:43:03.507725 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b60c99da-3ae5-4340-bcb0-870731679c16-kube-api-access-xlrlv" (OuterVolumeSpecName: "kube-api-access-xlrlv") pod "b60c99da-3ae5-4340-bcb0-870731679c16" (UID: "b60c99da-3ae5-4340-bcb0-870731679c16"). InnerVolumeSpecName "kube-api-access-xlrlv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:43:03 crc kubenswrapper[4764]: I0309 13:43:03.513777 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9f09c604-028e-4965-aef8-6005ae365be9-scripts" (OuterVolumeSpecName: "scripts") pod "9f09c604-028e-4965-aef8-6005ae365be9" (UID: "9f09c604-028e-4965-aef8-6005ae365be9"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:43:03 crc kubenswrapper[4764]: I0309 13:43:03.537254 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b60c99da-3ae5-4340-bcb0-870731679c16-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b60c99da-3ae5-4340-bcb0-870731679c16" (UID: "b60c99da-3ae5-4340-bcb0-870731679c16"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:43:03 crc kubenswrapper[4764]: I0309 13:43:03.539895 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9f09c604-028e-4965-aef8-6005ae365be9-config-data" (OuterVolumeSpecName: "config-data") pod "9f09c604-028e-4965-aef8-6005ae365be9" (UID: "9f09c604-028e-4965-aef8-6005ae365be9"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:43:03 crc kubenswrapper[4764]: I0309 13:43:03.548860 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9f09c604-028e-4965-aef8-6005ae365be9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9f09c604-028e-4965-aef8-6005ae365be9" (UID: "9f09c604-028e-4965-aef8-6005ae365be9"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:43:03 crc kubenswrapper[4764]: I0309 13:43:03.555859 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b60c99da-3ae5-4340-bcb0-870731679c16-config-data" (OuterVolumeSpecName: "config-data") pod "b60c99da-3ae5-4340-bcb0-870731679c16" (UID: "b60c99da-3ae5-4340-bcb0-870731679c16"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:43:03 crc kubenswrapper[4764]: I0309 13:43:03.571489 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4e886001-842a-4f97-b4c3-d088d80e6a45" path="/var/lib/kubelet/pods/4e886001-842a-4f97-b4c3-d088d80e6a45/volumes" Mar 09 13:43:03 crc kubenswrapper[4764]: I0309 13:43:03.598517 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6mc\" (UniqueName: \"kubernetes.io/projected/9f09c604-028e-4965-aef8-6005ae365be9-kube-api-access-sb6mc\") on node \"crc\" DevicePath \"\"" Mar 09 13:43:03 crc kubenswrapper[4764]: I0309 13:43:03.598554 4764 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9f09c604-028e-4965-aef8-6005ae365be9-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 13:43:03 crc kubenswrapper[4764]: I0309 13:43:03.598564 4764 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9f09c604-028e-4965-aef8-6005ae365be9-config-data\") on node \"crc\" DevicePath \"\"" Mar 09 13:43:03 crc 
kubenswrapper[4764]: I0309 13:43:03.598601 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xlrlv\" (UniqueName: \"kubernetes.io/projected/b60c99da-3ae5-4340-bcb0-870731679c16-kube-api-access-xlrlv\") on node \"crc\" DevicePath \"\"" Mar 09 13:43:03 crc kubenswrapper[4764]: I0309 13:43:03.598609 4764 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f09c604-028e-4965-aef8-6005ae365be9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 13:43:03 crc kubenswrapper[4764]: I0309 13:43:03.598618 4764 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b60c99da-3ae5-4340-bcb0-870731679c16-config-data\") on node \"crc\" DevicePath \"\"" Mar 09 13:43:03 crc kubenswrapper[4764]: I0309 13:43:03.598626 4764 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b60c99da-3ae5-4340-bcb0-870731679c16-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 13:43:03 crc kubenswrapper[4764]: I0309 13:43:03.598638 4764 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b60c99da-3ae5-4340-bcb0-870731679c16-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 13:43:03 crc kubenswrapper[4764]: I0309 13:43:03.845829 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-nlxjx" event={"ID":"b60c99da-3ae5-4340-bcb0-870731679c16","Type":"ContainerDied","Data":"aaf11202ef0774e446d6c1ce96fa33d0ce40eb52f56f28a40e2337ab3da0d2c0"} Mar 09 13:43:03 crc kubenswrapper[4764]: I0309 13:43:03.845885 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="aaf11202ef0774e446d6c1ce96fa33d0ce40eb52f56f28a40e2337ab3da0d2c0" Mar 09 13:43:03 crc kubenswrapper[4764]: I0309 13:43:03.845970 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-nlxjx" Mar 09 13:43:03 crc kubenswrapper[4764]: I0309 13:43:03.849032 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-5k46p" event={"ID":"9f09c604-028e-4965-aef8-6005ae365be9","Type":"ContainerDied","Data":"1b7c3f7b353ab36332a603793952cde39e961f03595f37d7c599b56a31606280"} Mar 09 13:43:03 crc kubenswrapper[4764]: I0309 13:43:03.849065 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1b7c3f7b353ab36332a603793952cde39e961f03595f37d7c599b56a31606280" Mar 09 13:43:03 crc kubenswrapper[4764]: I0309 13:43:03.849116 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-5k46p" Mar 09 13:43:03 crc kubenswrapper[4764]: I0309 13:43:03.863454 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Mar 09 13:43:03 crc kubenswrapper[4764]: I0309 13:43:03.993489 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 09 13:43:03 crc kubenswrapper[4764]: E0309 13:43:03.994441 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f09c604-028e-4965-aef8-6005ae365be9" containerName="nova-manage" Mar 09 13:43:03 crc kubenswrapper[4764]: I0309 13:43:03.994526 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f09c604-028e-4965-aef8-6005ae365be9" containerName="nova-manage" Mar 09 13:43:03 crc kubenswrapper[4764]: E0309 13:43:03.994623 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4e886001-842a-4f97-b4c3-d088d80e6a45" containerName="init" Mar 09 13:43:03 crc kubenswrapper[4764]: I0309 13:43:03.994708 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e886001-842a-4f97-b4c3-d088d80e6a45" containerName="init" Mar 09 13:43:03 crc kubenswrapper[4764]: E0309 13:43:03.994811 4764 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="4e886001-842a-4f97-b4c3-d088d80e6a45" containerName="dnsmasq-dns" Mar 09 13:43:03 crc kubenswrapper[4764]: I0309 13:43:03.994893 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e886001-842a-4f97-b4c3-d088d80e6a45" containerName="dnsmasq-dns" Mar 09 13:43:03 crc kubenswrapper[4764]: E0309 13:43:03.994973 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b60c99da-3ae5-4340-bcb0-870731679c16" containerName="nova-cell1-conductor-db-sync" Mar 09 13:43:03 crc kubenswrapper[4764]: I0309 13:43:03.995051 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="b60c99da-3ae5-4340-bcb0-870731679c16" containerName="nova-cell1-conductor-db-sync" Mar 09 13:43:03 crc kubenswrapper[4764]: I0309 13:43:03.995343 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="4e886001-842a-4f97-b4c3-d088d80e6a45" containerName="dnsmasq-dns" Mar 09 13:43:03 crc kubenswrapper[4764]: I0309 13:43:03.995431 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="9f09c604-028e-4965-aef8-6005ae365be9" containerName="nova-manage" Mar 09 13:43:03 crc kubenswrapper[4764]: I0309 13:43:03.995516 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="b60c99da-3ae5-4340-bcb0-870731679c16" containerName="nova-cell1-conductor-db-sync" Mar 09 13:43:03 crc kubenswrapper[4764]: I0309 13:43:03.996674 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Mar 09 13:43:04 crc kubenswrapper[4764]: I0309 13:43:04.001344 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Mar 09 13:43:04 crc kubenswrapper[4764]: I0309 13:43:04.020998 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 09 13:43:04 crc kubenswrapper[4764]: I0309 13:43:04.109666 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wbl7j\" (UniqueName: \"kubernetes.io/projected/959eb23f-c4b4-4f35-b284-38212848a084-kube-api-access-wbl7j\") pod \"nova-cell1-conductor-0\" (UID: \"959eb23f-c4b4-4f35-b284-38212848a084\") " pod="openstack/nova-cell1-conductor-0" Mar 09 13:43:04 crc kubenswrapper[4764]: I0309 13:43:04.109734 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/959eb23f-c4b4-4f35-b284-38212848a084-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"959eb23f-c4b4-4f35-b284-38212848a084\") " pod="openstack/nova-cell1-conductor-0" Mar 09 13:43:04 crc kubenswrapper[4764]: I0309 13:43:04.109819 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/959eb23f-c4b4-4f35-b284-38212848a084-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"959eb23f-c4b4-4f35-b284-38212848a084\") " pod="openstack/nova-cell1-conductor-0" Mar 09 13:43:04 crc kubenswrapper[4764]: I0309 13:43:04.194767 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 09 13:43:04 crc kubenswrapper[4764]: I0309 13:43:04.195156 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="7cf59ae6-37a7-49a9-846d-e7815a57bda6" containerName="nova-api-log" 
containerID="cri-o://c32ce8455281ec944af9437b6a70a622f18333dc492467a4ab2c17a62209e49d" gracePeriod=30 Mar 09 13:43:04 crc kubenswrapper[4764]: I0309 13:43:04.195405 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="7cf59ae6-37a7-49a9-846d-e7815a57bda6" containerName="nova-api-api" containerID="cri-o://d605216502245286a1e6a03c9b52f7f75118161d38b5ee337edb5e76def5a35a" gracePeriod=30 Mar 09 13:43:04 crc kubenswrapper[4764]: I0309 13:43:04.211975 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wbl7j\" (UniqueName: \"kubernetes.io/projected/959eb23f-c4b4-4f35-b284-38212848a084-kube-api-access-wbl7j\") pod \"nova-cell1-conductor-0\" (UID: \"959eb23f-c4b4-4f35-b284-38212848a084\") " pod="openstack/nova-cell1-conductor-0" Mar 09 13:43:04 crc kubenswrapper[4764]: I0309 13:43:04.212058 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/959eb23f-c4b4-4f35-b284-38212848a084-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"959eb23f-c4b4-4f35-b284-38212848a084\") " pod="openstack/nova-cell1-conductor-0" Mar 09 13:43:04 crc kubenswrapper[4764]: I0309 13:43:04.212135 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/959eb23f-c4b4-4f35-b284-38212848a084-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"959eb23f-c4b4-4f35-b284-38212848a084\") " pod="openstack/nova-cell1-conductor-0" Mar 09 13:43:04 crc kubenswrapper[4764]: I0309 13:43:04.219553 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/959eb23f-c4b4-4f35-b284-38212848a084-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"959eb23f-c4b4-4f35-b284-38212848a084\") " pod="openstack/nova-cell1-conductor-0" Mar 09 13:43:04 crc kubenswrapper[4764]: I0309 
13:43:04.220791 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 09 13:43:04 crc kubenswrapper[4764]: I0309 13:43:04.237357 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/959eb23f-c4b4-4f35-b284-38212848a084-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"959eb23f-c4b4-4f35-b284-38212848a084\") " pod="openstack/nova-cell1-conductor-0" Mar 09 13:43:04 crc kubenswrapper[4764]: I0309 13:43:04.237576 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 09 13:43:04 crc kubenswrapper[4764]: I0309 13:43:04.237958 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="cd9d1249-acf4-4cf5-a350-d4669d003a62" containerName="nova-metadata-log" containerID="cri-o://b3430f3076fe3ab1c512978e7bbb98434aa249b306d65493cf9204eac67f6af3" gracePeriod=30 Mar 09 13:43:04 crc kubenswrapper[4764]: I0309 13:43:04.238235 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="cd9d1249-acf4-4cf5-a350-d4669d003a62" containerName="nova-metadata-metadata" containerID="cri-o://f25bf6e9a5305bba01f06fcad8a45aed9a3a13bbe0f4d9055f3324ca8a56f788" gracePeriod=30 Mar 09 13:43:04 crc kubenswrapper[4764]: I0309 13:43:04.242868 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wbl7j\" (UniqueName: \"kubernetes.io/projected/959eb23f-c4b4-4f35-b284-38212848a084-kube-api-access-wbl7j\") pod \"nova-cell1-conductor-0\" (UID: \"959eb23f-c4b4-4f35-b284-38212848a084\") " pod="openstack/nova-cell1-conductor-0" Mar 09 13:43:04 crc kubenswrapper[4764]: I0309 13:43:04.327332 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Mar 09 13:43:04 crc kubenswrapper[4764]: I0309 13:43:04.361624 4764 scope.go:117] "RemoveContainer" containerID="b34325cd4e2f6811cadb8554b8a6fb24248546f3c4182376f60b7c3268c9f6c7" Mar 09 13:43:04 crc kubenswrapper[4764]: E0309 13:43:04.641176 4764 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcd9d1249_acf4_4cf5_a350_d4669d003a62.slice/crio-conmon-f25bf6e9a5305bba01f06fcad8a45aed9a3a13bbe0f4d9055f3324ca8a56f788.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcd9d1249_acf4_4cf5_a350_d4669d003a62.slice/crio-conmon-b3430f3076fe3ab1c512978e7bbb98434aa249b306d65493cf9204eac67f6af3.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcd9d1249_acf4_4cf5_a350_d4669d003a62.slice/crio-b3430f3076fe3ab1c512978e7bbb98434aa249b306d65493cf9204eac67f6af3.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcd9d1249_acf4_4cf5_a350_d4669d003a62.slice/crio-f25bf6e9a5305bba01f06fcad8a45aed9a3a13bbe0f4d9055f3324ca8a56f788.scope\": RecentStats: unable to find data in memory cache]" Mar 09 13:43:04 crc kubenswrapper[4764]: I0309 13:43:04.863061 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 09 13:43:04 crc kubenswrapper[4764]: I0309 13:43:04.869686 4764 generic.go:334] "Generic (PLEG): container finished" podID="7cf59ae6-37a7-49a9-846d-e7815a57bda6" containerID="c32ce8455281ec944af9437b6a70a622f18333dc492467a4ab2c17a62209e49d" exitCode=143 Mar 09 13:43:04 crc kubenswrapper[4764]: I0309 13:43:04.869794 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" 
event={"ID":"7cf59ae6-37a7-49a9-846d-e7815a57bda6","Type":"ContainerDied","Data":"c32ce8455281ec944af9437b6a70a622f18333dc492467a4ab2c17a62209e49d"} Mar 09 13:43:04 crc kubenswrapper[4764]: I0309 13:43:04.872518 4764 generic.go:334] "Generic (PLEG): container finished" podID="cd9d1249-acf4-4cf5-a350-d4669d003a62" containerID="f25bf6e9a5305bba01f06fcad8a45aed9a3a13bbe0f4d9055f3324ca8a56f788" exitCode=0 Mar 09 13:43:04 crc kubenswrapper[4764]: I0309 13:43:04.872617 4764 generic.go:334] "Generic (PLEG): container finished" podID="cd9d1249-acf4-4cf5-a350-d4669d003a62" containerID="b3430f3076fe3ab1c512978e7bbb98434aa249b306d65493cf9204eac67f6af3" exitCode=143 Mar 09 13:43:04 crc kubenswrapper[4764]: I0309 13:43:04.872892 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="bd0fc8c3-6a60-4629-8b8d-8dd8b471f959" containerName="nova-scheduler-scheduler" containerID="cri-o://9b4ad77f575f902209b59697f790a142bfdc593a1dbbfea9490ebe9d7bdb60c7" gracePeriod=30 Mar 09 13:43:04 crc kubenswrapper[4764]: W0309 13:43:04.873795 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod959eb23f_c4b4_4f35_b284_38212848a084.slice/crio-7a1dac56a13025152565d05f1934efe1d8bdc00ab66969ded2d836713d40b181 WatchSource:0}: Error finding container 7a1dac56a13025152565d05f1934efe1d8bdc00ab66969ded2d836713d40b181: Status 404 returned error can't find the container with id 7a1dac56a13025152565d05f1934efe1d8bdc00ab66969ded2d836713d40b181 Mar 09 13:43:04 crc kubenswrapper[4764]: I0309 13:43:04.872561 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"cd9d1249-acf4-4cf5-a350-d4669d003a62","Type":"ContainerDied","Data":"f25bf6e9a5305bba01f06fcad8a45aed9a3a13bbe0f4d9055f3324ca8a56f788"} Mar 09 13:43:04 crc kubenswrapper[4764]: I0309 13:43:04.874081 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-metadata-0" event={"ID":"cd9d1249-acf4-4cf5-a350-d4669d003a62","Type":"ContainerDied","Data":"b3430f3076fe3ab1c512978e7bbb98434aa249b306d65493cf9204eac67f6af3"} Mar 09 13:43:04 crc kubenswrapper[4764]: I0309 13:43:04.893397 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 09 13:43:05 crc kubenswrapper[4764]: I0309 13:43:05.038843 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/cd9d1249-acf4-4cf5-a350-d4669d003a62-nova-metadata-tls-certs\") pod \"cd9d1249-acf4-4cf5-a350-d4669d003a62\" (UID: \"cd9d1249-acf4-4cf5-a350-d4669d003a62\") " Mar 09 13:43:05 crc kubenswrapper[4764]: I0309 13:43:05.038911 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cd9d1249-acf4-4cf5-a350-d4669d003a62-combined-ca-bundle\") pod \"cd9d1249-acf4-4cf5-a350-d4669d003a62\" (UID: \"cd9d1249-acf4-4cf5-a350-d4669d003a62\") " Mar 09 13:43:05 crc kubenswrapper[4764]: I0309 13:43:05.039030 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cd9d1249-acf4-4cf5-a350-d4669d003a62-logs\") pod \"cd9d1249-acf4-4cf5-a350-d4669d003a62\" (UID: \"cd9d1249-acf4-4cf5-a350-d4669d003a62\") " Mar 09 13:43:05 crc kubenswrapper[4764]: I0309 13:43:05.039235 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-24nvs\" (UniqueName: \"kubernetes.io/projected/cd9d1249-acf4-4cf5-a350-d4669d003a62-kube-api-access-24nvs\") pod \"cd9d1249-acf4-4cf5-a350-d4669d003a62\" (UID: \"cd9d1249-acf4-4cf5-a350-d4669d003a62\") " Mar 09 13:43:05 crc kubenswrapper[4764]: I0309 13:43:05.039277 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/cd9d1249-acf4-4cf5-a350-d4669d003a62-config-data\") pod \"cd9d1249-acf4-4cf5-a350-d4669d003a62\" (UID: \"cd9d1249-acf4-4cf5-a350-d4669d003a62\") " Mar 09 13:43:05 crc kubenswrapper[4764]: I0309 13:43:05.040038 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cd9d1249-acf4-4cf5-a350-d4669d003a62-logs" (OuterVolumeSpecName: "logs") pod "cd9d1249-acf4-4cf5-a350-d4669d003a62" (UID: "cd9d1249-acf4-4cf5-a350-d4669d003a62"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 13:43:05 crc kubenswrapper[4764]: I0309 13:43:05.050182 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd9d1249-acf4-4cf5-a350-d4669d003a62-kube-api-access-24nvs" (OuterVolumeSpecName: "kube-api-access-24nvs") pod "cd9d1249-acf4-4cf5-a350-d4669d003a62" (UID: "cd9d1249-acf4-4cf5-a350-d4669d003a62"). InnerVolumeSpecName "kube-api-access-24nvs". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:43:05 crc kubenswrapper[4764]: I0309 13:43:05.089779 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cd9d1249-acf4-4cf5-a350-d4669d003a62-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cd9d1249-acf4-4cf5-a350-d4669d003a62" (UID: "cd9d1249-acf4-4cf5-a350-d4669d003a62"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:43:05 crc kubenswrapper[4764]: I0309 13:43:05.092890 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cd9d1249-acf4-4cf5-a350-d4669d003a62-config-data" (OuterVolumeSpecName: "config-data") pod "cd9d1249-acf4-4cf5-a350-d4669d003a62" (UID: "cd9d1249-acf4-4cf5-a350-d4669d003a62"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:43:05 crc kubenswrapper[4764]: I0309 13:43:05.114065 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cd9d1249-acf4-4cf5-a350-d4669d003a62-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "cd9d1249-acf4-4cf5-a350-d4669d003a62" (UID: "cd9d1249-acf4-4cf5-a350-d4669d003a62"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:43:05 crc kubenswrapper[4764]: I0309 13:43:05.142436 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-24nvs\" (UniqueName: \"kubernetes.io/projected/cd9d1249-acf4-4cf5-a350-d4669d003a62-kube-api-access-24nvs\") on node \"crc\" DevicePath \"\"" Mar 09 13:43:05 crc kubenswrapper[4764]: I0309 13:43:05.142492 4764 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cd9d1249-acf4-4cf5-a350-d4669d003a62-config-data\") on node \"crc\" DevicePath \"\"" Mar 09 13:43:05 crc kubenswrapper[4764]: I0309 13:43:05.142504 4764 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/cd9d1249-acf4-4cf5-a350-d4669d003a62-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 09 13:43:05 crc kubenswrapper[4764]: I0309 13:43:05.142513 4764 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cd9d1249-acf4-4cf5-a350-d4669d003a62-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 13:43:05 crc kubenswrapper[4764]: I0309 13:43:05.142525 4764 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cd9d1249-acf4-4cf5-a350-d4669d003a62-logs\") on node \"crc\" DevicePath \"\"" Mar 09 13:43:05 crc kubenswrapper[4764]: I0309 13:43:05.885628 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-cell1-conductor-0" event={"ID":"959eb23f-c4b4-4f35-b284-38212848a084","Type":"ContainerStarted","Data":"77d74bd2b6a4ce5fd56558be984d8ab6788ecfa94ffff85f54a47633040f07ad"} Mar 09 13:43:05 crc kubenswrapper[4764]: I0309 13:43:05.886059 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Mar 09 13:43:05 crc kubenswrapper[4764]: I0309 13:43:05.886073 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"959eb23f-c4b4-4f35-b284-38212848a084","Type":"ContainerStarted","Data":"7a1dac56a13025152565d05f1934efe1d8bdc00ab66969ded2d836713d40b181"} Mar 09 13:43:05 crc kubenswrapper[4764]: I0309 13:43:05.887826 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"cd9d1249-acf4-4cf5-a350-d4669d003a62","Type":"ContainerDied","Data":"ed67ba834fc7606ce45fe1f977a16e9be6d0e085710cf0629ecabbea579cc2e4"} Mar 09 13:43:05 crc kubenswrapper[4764]: I0309 13:43:05.887917 4764 scope.go:117] "RemoveContainer" containerID="f25bf6e9a5305bba01f06fcad8a45aed9a3a13bbe0f4d9055f3324ca8a56f788" Mar 09 13:43:05 crc kubenswrapper[4764]: I0309 13:43:05.888221 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 09 13:43:05 crc kubenswrapper[4764]: I0309 13:43:05.910524 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.9105020550000003 podStartE2EDuration="2.910502055s" podCreationTimestamp="2026-03-09 13:43:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:43:05.908506482 +0000 UTC m=+1341.158678410" watchObservedRunningTime="2026-03-09 13:43:05.910502055 +0000 UTC m=+1341.160673973" Mar 09 13:43:05 crc kubenswrapper[4764]: I0309 13:43:05.920021 4764 scope.go:117] "RemoveContainer" containerID="b3430f3076fe3ab1c512978e7bbb98434aa249b306d65493cf9204eac67f6af3" Mar 09 13:43:05 crc kubenswrapper[4764]: I0309 13:43:05.936747 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 09 13:43:05 crc kubenswrapper[4764]: I0309 13:43:05.952129 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Mar 09 13:43:05 crc kubenswrapper[4764]: I0309 13:43:05.965305 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Mar 09 13:43:05 crc kubenswrapper[4764]: E0309 13:43:05.965871 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cd9d1249-acf4-4cf5-a350-d4669d003a62" containerName="nova-metadata-log" Mar 09 13:43:05 crc kubenswrapper[4764]: I0309 13:43:05.965893 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="cd9d1249-acf4-4cf5-a350-d4669d003a62" containerName="nova-metadata-log" Mar 09 13:43:05 crc kubenswrapper[4764]: E0309 13:43:05.965920 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cd9d1249-acf4-4cf5-a350-d4669d003a62" containerName="nova-metadata-metadata" Mar 09 13:43:05 crc kubenswrapper[4764]: I0309 13:43:05.965930 4764 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="cd9d1249-acf4-4cf5-a350-d4669d003a62" containerName="nova-metadata-metadata" Mar 09 13:43:05 crc kubenswrapper[4764]: I0309 13:43:05.966104 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="cd9d1249-acf4-4cf5-a350-d4669d003a62" containerName="nova-metadata-metadata" Mar 09 13:43:05 crc kubenswrapper[4764]: I0309 13:43:05.966124 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="cd9d1249-acf4-4cf5-a350-d4669d003a62" containerName="nova-metadata-log" Mar 09 13:43:05 crc kubenswrapper[4764]: I0309 13:43:05.967233 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 09 13:43:05 crc kubenswrapper[4764]: I0309 13:43:05.971466 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Mar 09 13:43:05 crc kubenswrapper[4764]: I0309 13:43:05.971871 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Mar 09 13:43:05 crc kubenswrapper[4764]: I0309 13:43:05.976328 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 09 13:43:06 crc kubenswrapper[4764]: I0309 13:43:06.059545 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8aba6bca-21f2-4e18-90e7-098c8541a4f4-logs\") pod \"nova-metadata-0\" (UID: \"8aba6bca-21f2-4e18-90e7-098c8541a4f4\") " pod="openstack/nova-metadata-0" Mar 09 13:43:06 crc kubenswrapper[4764]: I0309 13:43:06.059685 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/8aba6bca-21f2-4e18-90e7-098c8541a4f4-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"8aba6bca-21f2-4e18-90e7-098c8541a4f4\") " pod="openstack/nova-metadata-0" Mar 09 13:43:06 crc kubenswrapper[4764]: I0309 13:43:06.059730 4764 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8aba6bca-21f2-4e18-90e7-098c8541a4f4-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"8aba6bca-21f2-4e18-90e7-098c8541a4f4\") " pod="openstack/nova-metadata-0" Mar 09 13:43:06 crc kubenswrapper[4764]: I0309 13:43:06.060090 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8aba6bca-21f2-4e18-90e7-098c8541a4f4-config-data\") pod \"nova-metadata-0\" (UID: \"8aba6bca-21f2-4e18-90e7-098c8541a4f4\") " pod="openstack/nova-metadata-0" Mar 09 13:43:06 crc kubenswrapper[4764]: I0309 13:43:06.060483 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h6cx7\" (UniqueName: \"kubernetes.io/projected/8aba6bca-21f2-4e18-90e7-098c8541a4f4-kube-api-access-h6cx7\") pod \"nova-metadata-0\" (UID: \"8aba6bca-21f2-4e18-90e7-098c8541a4f4\") " pod="openstack/nova-metadata-0" Mar 09 13:43:06 crc kubenswrapper[4764]: I0309 13:43:06.162376 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8aba6bca-21f2-4e18-90e7-098c8541a4f4-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"8aba6bca-21f2-4e18-90e7-098c8541a4f4\") " pod="openstack/nova-metadata-0" Mar 09 13:43:06 crc kubenswrapper[4764]: I0309 13:43:06.162485 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8aba6bca-21f2-4e18-90e7-098c8541a4f4-config-data\") pod \"nova-metadata-0\" (UID: \"8aba6bca-21f2-4e18-90e7-098c8541a4f4\") " pod="openstack/nova-metadata-0" Mar 09 13:43:06 crc kubenswrapper[4764]: I0309 13:43:06.162554 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h6cx7\" (UniqueName: 
\"kubernetes.io/projected/8aba6bca-21f2-4e18-90e7-098c8541a4f4-kube-api-access-h6cx7\") pod \"nova-metadata-0\" (UID: \"8aba6bca-21f2-4e18-90e7-098c8541a4f4\") " pod="openstack/nova-metadata-0" Mar 09 13:43:06 crc kubenswrapper[4764]: I0309 13:43:06.162577 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8aba6bca-21f2-4e18-90e7-098c8541a4f4-logs\") pod \"nova-metadata-0\" (UID: \"8aba6bca-21f2-4e18-90e7-098c8541a4f4\") " pod="openstack/nova-metadata-0" Mar 09 13:43:06 crc kubenswrapper[4764]: I0309 13:43:06.162621 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/8aba6bca-21f2-4e18-90e7-098c8541a4f4-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"8aba6bca-21f2-4e18-90e7-098c8541a4f4\") " pod="openstack/nova-metadata-0" Mar 09 13:43:06 crc kubenswrapper[4764]: I0309 13:43:06.163023 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8aba6bca-21f2-4e18-90e7-098c8541a4f4-logs\") pod \"nova-metadata-0\" (UID: \"8aba6bca-21f2-4e18-90e7-098c8541a4f4\") " pod="openstack/nova-metadata-0" Mar 09 13:43:06 crc kubenswrapper[4764]: I0309 13:43:06.168385 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8aba6bca-21f2-4e18-90e7-098c8541a4f4-config-data\") pod \"nova-metadata-0\" (UID: \"8aba6bca-21f2-4e18-90e7-098c8541a4f4\") " pod="openstack/nova-metadata-0" Mar 09 13:43:06 crc kubenswrapper[4764]: I0309 13:43:06.171120 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/8aba6bca-21f2-4e18-90e7-098c8541a4f4-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"8aba6bca-21f2-4e18-90e7-098c8541a4f4\") " pod="openstack/nova-metadata-0" Mar 09 13:43:06 crc 
kubenswrapper[4764]: I0309 13:43:06.181113 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8aba6bca-21f2-4e18-90e7-098c8541a4f4-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"8aba6bca-21f2-4e18-90e7-098c8541a4f4\") " pod="openstack/nova-metadata-0" Mar 09 13:43:06 crc kubenswrapper[4764]: I0309 13:43:06.189249 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h6cx7\" (UniqueName: \"kubernetes.io/projected/8aba6bca-21f2-4e18-90e7-098c8541a4f4-kube-api-access-h6cx7\") pod \"nova-metadata-0\" (UID: \"8aba6bca-21f2-4e18-90e7-098c8541a4f4\") " pod="openstack/nova-metadata-0" Mar 09 13:43:06 crc kubenswrapper[4764]: I0309 13:43:06.308763 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 09 13:43:06 crc kubenswrapper[4764]: E0309 13:43:06.448008 4764 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="9b4ad77f575f902209b59697f790a142bfdc593a1dbbfea9490ebe9d7bdb60c7" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 09 13:43:06 crc kubenswrapper[4764]: E0309 13:43:06.463768 4764 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="9b4ad77f575f902209b59697f790a142bfdc593a1dbbfea9490ebe9d7bdb60c7" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 09 13:43:06 crc kubenswrapper[4764]: E0309 13:43:06.495694 4764 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" 
containerID="9b4ad77f575f902209b59697f790a142bfdc593a1dbbfea9490ebe9d7bdb60c7" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 09 13:43:06 crc kubenswrapper[4764]: E0309 13:43:06.495801 4764 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="bd0fc8c3-6a60-4629-8b8d-8dd8b471f959" containerName="nova-scheduler-scheduler" Mar 09 13:43:06 crc kubenswrapper[4764]: I0309 13:43:06.942588 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 09 13:43:07 crc kubenswrapper[4764]: I0309 13:43:07.573905 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd9d1249-acf4-4cf5-a350-d4669d003a62" path="/var/lib/kubelet/pods/cd9d1249-acf4-4cf5-a350-d4669d003a62/volumes" Mar 09 13:43:07 crc kubenswrapper[4764]: I0309 13:43:07.922731 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"8aba6bca-21f2-4e18-90e7-098c8541a4f4","Type":"ContainerStarted","Data":"43bb793d9ae7aef4ac5c3135ad4636253715b28d8b5d66da3fd63d1b86a1f016"} Mar 09 13:43:07 crc kubenswrapper[4764]: I0309 13:43:07.922784 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"8aba6bca-21f2-4e18-90e7-098c8541a4f4","Type":"ContainerStarted","Data":"ce8b8b59c7d046943c3a2a1f0ccc0e08b931a62a55c138ad63284ef6b7ad9a9a"} Mar 09 13:43:07 crc kubenswrapper[4764]: I0309 13:43:07.922797 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"8aba6bca-21f2-4e18-90e7-098c8541a4f4","Type":"ContainerStarted","Data":"b10db7fc62dc747c8a3073bba39f8052766584cab6d39aec677be986aaca3d56"} Mar 09 13:43:08 crc kubenswrapper[4764]: I0309 13:43:08.942238 4764 generic.go:334] "Generic (PLEG): container finished" podID="7cf59ae6-37a7-49a9-846d-e7815a57bda6" 
containerID="d605216502245286a1e6a03c9b52f7f75118161d38b5ee337edb5e76def5a35a" exitCode=0 Mar 09 13:43:08 crc kubenswrapper[4764]: I0309 13:43:08.942330 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"7cf59ae6-37a7-49a9-846d-e7815a57bda6","Type":"ContainerDied","Data":"d605216502245286a1e6a03c9b52f7f75118161d38b5ee337edb5e76def5a35a"} Mar 09 13:43:08 crc kubenswrapper[4764]: I0309 13:43:08.954084 4764 generic.go:334] "Generic (PLEG): container finished" podID="bd0fc8c3-6a60-4629-8b8d-8dd8b471f959" containerID="9b4ad77f575f902209b59697f790a142bfdc593a1dbbfea9490ebe9d7bdb60c7" exitCode=0 Mar 09 13:43:08 crc kubenswrapper[4764]: I0309 13:43:08.954215 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"bd0fc8c3-6a60-4629-8b8d-8dd8b471f959","Type":"ContainerDied","Data":"9b4ad77f575f902209b59697f790a142bfdc593a1dbbfea9490ebe9d7bdb60c7"} Mar 09 13:43:09 crc kubenswrapper[4764]: I0309 13:43:09.066519 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 09 13:43:09 crc kubenswrapper[4764]: I0309 13:43:09.079067 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 09 13:43:09 crc kubenswrapper[4764]: I0309 13:43:09.112857 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=4.112825983 podStartE2EDuration="4.112825983s" podCreationTimestamp="2026-03-09 13:43:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:43:07.952108694 +0000 UTC m=+1343.202280602" watchObservedRunningTime="2026-03-09 13:43:09.112825983 +0000 UTC m=+1344.362997901" Mar 09 13:43:09 crc kubenswrapper[4764]: I0309 13:43:09.133102 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd0fc8c3-6a60-4629-8b8d-8dd8b471f959-combined-ca-bundle\") pod \"bd0fc8c3-6a60-4629-8b8d-8dd8b471f959\" (UID: \"bd0fc8c3-6a60-4629-8b8d-8dd8b471f959\") " Mar 09 13:43:09 crc kubenswrapper[4764]: I0309 13:43:09.133175 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bd0fc8c3-6a60-4629-8b8d-8dd8b471f959-config-data\") pod \"bd0fc8c3-6a60-4629-8b8d-8dd8b471f959\" (UID: \"bd0fc8c3-6a60-4629-8b8d-8dd8b471f959\") " Mar 09 13:43:09 crc kubenswrapper[4764]: I0309 13:43:09.133304 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h84hj\" (UniqueName: \"kubernetes.io/projected/bd0fc8c3-6a60-4629-8b8d-8dd8b471f959-kube-api-access-h84hj\") pod \"bd0fc8c3-6a60-4629-8b8d-8dd8b471f959\" (UID: \"bd0fc8c3-6a60-4629-8b8d-8dd8b471f959\") " Mar 09 13:43:09 crc kubenswrapper[4764]: I0309 13:43:09.133332 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7cf59ae6-37a7-49a9-846d-e7815a57bda6-logs\") pod \"7cf59ae6-37a7-49a9-846d-e7815a57bda6\" (UID: 
\"7cf59ae6-37a7-49a9-846d-e7815a57bda6\") " Mar 09 13:43:09 crc kubenswrapper[4764]: I0309 13:43:09.133366 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nkp7s\" (UniqueName: \"kubernetes.io/projected/7cf59ae6-37a7-49a9-846d-e7815a57bda6-kube-api-access-nkp7s\") pod \"7cf59ae6-37a7-49a9-846d-e7815a57bda6\" (UID: \"7cf59ae6-37a7-49a9-846d-e7815a57bda6\") " Mar 09 13:43:09 crc kubenswrapper[4764]: I0309 13:43:09.133392 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7cf59ae6-37a7-49a9-846d-e7815a57bda6-combined-ca-bundle\") pod \"7cf59ae6-37a7-49a9-846d-e7815a57bda6\" (UID: \"7cf59ae6-37a7-49a9-846d-e7815a57bda6\") " Mar 09 13:43:09 crc kubenswrapper[4764]: I0309 13:43:09.133422 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7cf59ae6-37a7-49a9-846d-e7815a57bda6-config-data\") pod \"7cf59ae6-37a7-49a9-846d-e7815a57bda6\" (UID: \"7cf59ae6-37a7-49a9-846d-e7815a57bda6\") " Mar 09 13:43:09 crc kubenswrapper[4764]: I0309 13:43:09.134418 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7cf59ae6-37a7-49a9-846d-e7815a57bda6-logs" (OuterVolumeSpecName: "logs") pod "7cf59ae6-37a7-49a9-846d-e7815a57bda6" (UID: "7cf59ae6-37a7-49a9-846d-e7815a57bda6"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 13:43:09 crc kubenswrapper[4764]: I0309 13:43:09.146261 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7cf59ae6-37a7-49a9-846d-e7815a57bda6-kube-api-access-nkp7s" (OuterVolumeSpecName: "kube-api-access-nkp7s") pod "7cf59ae6-37a7-49a9-846d-e7815a57bda6" (UID: "7cf59ae6-37a7-49a9-846d-e7815a57bda6"). InnerVolumeSpecName "kube-api-access-nkp7s". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:43:09 crc kubenswrapper[4764]: I0309 13:43:09.154119 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd0fc8c3-6a60-4629-8b8d-8dd8b471f959-kube-api-access-h84hj" (OuterVolumeSpecName: "kube-api-access-h84hj") pod "bd0fc8c3-6a60-4629-8b8d-8dd8b471f959" (UID: "bd0fc8c3-6a60-4629-8b8d-8dd8b471f959"). InnerVolumeSpecName "kube-api-access-h84hj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:43:09 crc kubenswrapper[4764]: I0309 13:43:09.165184 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7cf59ae6-37a7-49a9-846d-e7815a57bda6-config-data" (OuterVolumeSpecName: "config-data") pod "7cf59ae6-37a7-49a9-846d-e7815a57bda6" (UID: "7cf59ae6-37a7-49a9-846d-e7815a57bda6"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:43:09 crc kubenswrapper[4764]: I0309 13:43:09.169004 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bd0fc8c3-6a60-4629-8b8d-8dd8b471f959-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bd0fc8c3-6a60-4629-8b8d-8dd8b471f959" (UID: "bd0fc8c3-6a60-4629-8b8d-8dd8b471f959"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:43:09 crc kubenswrapper[4764]: I0309 13:43:09.169129 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7cf59ae6-37a7-49a9-846d-e7815a57bda6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7cf59ae6-37a7-49a9-846d-e7815a57bda6" (UID: "7cf59ae6-37a7-49a9-846d-e7815a57bda6"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:43:09 crc kubenswrapper[4764]: I0309 13:43:09.181510 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bd0fc8c3-6a60-4629-8b8d-8dd8b471f959-config-data" (OuterVolumeSpecName: "config-data") pod "bd0fc8c3-6a60-4629-8b8d-8dd8b471f959" (UID: "bd0fc8c3-6a60-4629-8b8d-8dd8b471f959"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:43:09 crc kubenswrapper[4764]: I0309 13:43:09.236010 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h84hj\" (UniqueName: \"kubernetes.io/projected/bd0fc8c3-6a60-4629-8b8d-8dd8b471f959-kube-api-access-h84hj\") on node \"crc\" DevicePath \"\"" Mar 09 13:43:09 crc kubenswrapper[4764]: I0309 13:43:09.236056 4764 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7cf59ae6-37a7-49a9-846d-e7815a57bda6-logs\") on node \"crc\" DevicePath \"\"" Mar 09 13:43:09 crc kubenswrapper[4764]: I0309 13:43:09.236067 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nkp7s\" (UniqueName: \"kubernetes.io/projected/7cf59ae6-37a7-49a9-846d-e7815a57bda6-kube-api-access-nkp7s\") on node \"crc\" DevicePath \"\"" Mar 09 13:43:09 crc kubenswrapper[4764]: I0309 13:43:09.236081 4764 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7cf59ae6-37a7-49a9-846d-e7815a57bda6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 13:43:09 crc kubenswrapper[4764]: I0309 13:43:09.236093 4764 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7cf59ae6-37a7-49a9-846d-e7815a57bda6-config-data\") on node \"crc\" DevicePath \"\"" Mar 09 13:43:09 crc kubenswrapper[4764]: I0309 13:43:09.236106 4764 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/bd0fc8c3-6a60-4629-8b8d-8dd8b471f959-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 13:43:09 crc kubenswrapper[4764]: I0309 13:43:09.236117 4764 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bd0fc8c3-6a60-4629-8b8d-8dd8b471f959-config-data\") on node \"crc\" DevicePath \"\"" Mar 09 13:43:09 crc kubenswrapper[4764]: I0309 13:43:09.967437 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"7cf59ae6-37a7-49a9-846d-e7815a57bda6","Type":"ContainerDied","Data":"572746371ca77f76f3f1aebe0923635e0af71fe2c66089e7070ed987afb57c36"} Mar 09 13:43:09 crc kubenswrapper[4764]: I0309 13:43:09.968051 4764 scope.go:117] "RemoveContainer" containerID="d605216502245286a1e6a03c9b52f7f75118161d38b5ee337edb5e76def5a35a" Mar 09 13:43:09 crc kubenswrapper[4764]: I0309 13:43:09.967465 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 09 13:43:09 crc kubenswrapper[4764]: I0309 13:43:09.972136 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"bd0fc8c3-6a60-4629-8b8d-8dd8b471f959","Type":"ContainerDied","Data":"6920f7803183ef0808ee3ca6aca18179666a5ad88d1d672e13be3328cbb65468"} Mar 09 13:43:09 crc kubenswrapper[4764]: I0309 13:43:09.972310 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 09 13:43:10 crc kubenswrapper[4764]: I0309 13:43:10.001881 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 09 13:43:10 crc kubenswrapper[4764]: I0309 13:43:10.015130 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Mar 09 13:43:10 crc kubenswrapper[4764]: I0309 13:43:10.016686 4764 scope.go:117] "RemoveContainer" containerID="c32ce8455281ec944af9437b6a70a622f18333dc492467a4ab2c17a62209e49d" Mar 09 13:43:10 crc kubenswrapper[4764]: I0309 13:43:10.029818 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 09 13:43:10 crc kubenswrapper[4764]: I0309 13:43:10.041913 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Mar 09 13:43:10 crc kubenswrapper[4764]: I0309 13:43:10.051827 4764 scope.go:117] "RemoveContainer" containerID="9b4ad77f575f902209b59697f790a142bfdc593a1dbbfea9490ebe9d7bdb60c7" Mar 09 13:43:10 crc kubenswrapper[4764]: I0309 13:43:10.056148 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Mar 09 13:43:10 crc kubenswrapper[4764]: E0309 13:43:10.056630 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7cf59ae6-37a7-49a9-846d-e7815a57bda6" containerName="nova-api-api" Mar 09 13:43:10 crc kubenswrapper[4764]: I0309 13:43:10.056661 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="7cf59ae6-37a7-49a9-846d-e7815a57bda6" containerName="nova-api-api" Mar 09 13:43:10 crc kubenswrapper[4764]: E0309 13:43:10.056690 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7cf59ae6-37a7-49a9-846d-e7815a57bda6" containerName="nova-api-log" Mar 09 13:43:10 crc kubenswrapper[4764]: I0309 13:43:10.056697 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="7cf59ae6-37a7-49a9-846d-e7815a57bda6" containerName="nova-api-log" Mar 09 13:43:10 crc kubenswrapper[4764]: E0309 13:43:10.056709 
4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bd0fc8c3-6a60-4629-8b8d-8dd8b471f959" containerName="nova-scheduler-scheduler" Mar 09 13:43:10 crc kubenswrapper[4764]: I0309 13:43:10.056716 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd0fc8c3-6a60-4629-8b8d-8dd8b471f959" containerName="nova-scheduler-scheduler" Mar 09 13:43:10 crc kubenswrapper[4764]: I0309 13:43:10.056913 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="7cf59ae6-37a7-49a9-846d-e7815a57bda6" containerName="nova-api-log" Mar 09 13:43:10 crc kubenswrapper[4764]: I0309 13:43:10.056935 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="bd0fc8c3-6a60-4629-8b8d-8dd8b471f959" containerName="nova-scheduler-scheduler" Mar 09 13:43:10 crc kubenswrapper[4764]: I0309 13:43:10.056948 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="7cf59ae6-37a7-49a9-846d-e7815a57bda6" containerName="nova-api-api" Mar 09 13:43:10 crc kubenswrapper[4764]: I0309 13:43:10.058003 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 09 13:43:10 crc kubenswrapper[4764]: I0309 13:43:10.060788 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Mar 09 13:43:10 crc kubenswrapper[4764]: I0309 13:43:10.064982 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Mar 09 13:43:10 crc kubenswrapper[4764]: I0309 13:43:10.067463 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 09 13:43:10 crc kubenswrapper[4764]: I0309 13:43:10.072110 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Mar 09 13:43:10 crc kubenswrapper[4764]: I0309 13:43:10.075984 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 09 13:43:10 crc kubenswrapper[4764]: I0309 13:43:10.084975 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 09 13:43:10 crc kubenswrapper[4764]: I0309 13:43:10.154913 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qtx9b\" (UniqueName: \"kubernetes.io/projected/e3d70a54-660f-4ef9-bd2a-ed16699d8d66-kube-api-access-qtx9b\") pod \"nova-scheduler-0\" (UID: \"e3d70a54-660f-4ef9-bd2a-ed16699d8d66\") " pod="openstack/nova-scheduler-0" Mar 09 13:43:10 crc kubenswrapper[4764]: I0309 13:43:10.154993 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/15e658d7-b575-4e5a-a0f2-3d1adcc41cc0-logs\") pod \"nova-api-0\" (UID: \"15e658d7-b575-4e5a-a0f2-3d1adcc41cc0\") " pod="openstack/nova-api-0" Mar 09 13:43:10 crc kubenswrapper[4764]: I0309 13:43:10.155573 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/15e658d7-b575-4e5a-a0f2-3d1adcc41cc0-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"15e658d7-b575-4e5a-a0f2-3d1adcc41cc0\") " pod="openstack/nova-api-0" Mar 09 13:43:10 crc kubenswrapper[4764]: I0309 13:43:10.155663 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e3d70a54-660f-4ef9-bd2a-ed16699d8d66-config-data\") pod \"nova-scheduler-0\" (UID: \"e3d70a54-660f-4ef9-bd2a-ed16699d8d66\") " 
pod="openstack/nova-scheduler-0" Mar 09 13:43:10 crc kubenswrapper[4764]: I0309 13:43:10.155734 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/15e658d7-b575-4e5a-a0f2-3d1adcc41cc0-config-data\") pod \"nova-api-0\" (UID: \"15e658d7-b575-4e5a-a0f2-3d1adcc41cc0\") " pod="openstack/nova-api-0" Mar 09 13:43:10 crc kubenswrapper[4764]: I0309 13:43:10.156033 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pfl66\" (UniqueName: \"kubernetes.io/projected/15e658d7-b575-4e5a-a0f2-3d1adcc41cc0-kube-api-access-pfl66\") pod \"nova-api-0\" (UID: \"15e658d7-b575-4e5a-a0f2-3d1adcc41cc0\") " pod="openstack/nova-api-0" Mar 09 13:43:10 crc kubenswrapper[4764]: I0309 13:43:10.156145 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3d70a54-660f-4ef9-bd2a-ed16699d8d66-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"e3d70a54-660f-4ef9-bd2a-ed16699d8d66\") " pod="openstack/nova-scheduler-0" Mar 09 13:43:10 crc kubenswrapper[4764]: I0309 13:43:10.258384 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qtx9b\" (UniqueName: \"kubernetes.io/projected/e3d70a54-660f-4ef9-bd2a-ed16699d8d66-kube-api-access-qtx9b\") pod \"nova-scheduler-0\" (UID: \"e3d70a54-660f-4ef9-bd2a-ed16699d8d66\") " pod="openstack/nova-scheduler-0" Mar 09 13:43:10 crc kubenswrapper[4764]: I0309 13:43:10.258470 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/15e658d7-b575-4e5a-a0f2-3d1adcc41cc0-logs\") pod \"nova-api-0\" (UID: \"15e658d7-b575-4e5a-a0f2-3d1adcc41cc0\") " pod="openstack/nova-api-0" Mar 09 13:43:10 crc kubenswrapper[4764]: I0309 13:43:10.258508 4764 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/15e658d7-b575-4e5a-a0f2-3d1adcc41cc0-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"15e658d7-b575-4e5a-a0f2-3d1adcc41cc0\") " pod="openstack/nova-api-0" Mar 09 13:43:10 crc kubenswrapper[4764]: I0309 13:43:10.258530 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e3d70a54-660f-4ef9-bd2a-ed16699d8d66-config-data\") pod \"nova-scheduler-0\" (UID: \"e3d70a54-660f-4ef9-bd2a-ed16699d8d66\") " pod="openstack/nova-scheduler-0" Mar 09 13:43:10 crc kubenswrapper[4764]: I0309 13:43:10.258556 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/15e658d7-b575-4e5a-a0f2-3d1adcc41cc0-config-data\") pod \"nova-api-0\" (UID: \"15e658d7-b575-4e5a-a0f2-3d1adcc41cc0\") " pod="openstack/nova-api-0" Mar 09 13:43:10 crc kubenswrapper[4764]: I0309 13:43:10.259201 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/15e658d7-b575-4e5a-a0f2-3d1adcc41cc0-logs\") pod \"nova-api-0\" (UID: \"15e658d7-b575-4e5a-a0f2-3d1adcc41cc0\") " pod="openstack/nova-api-0" Mar 09 13:43:10 crc kubenswrapper[4764]: I0309 13:43:10.259335 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pfl66\" (UniqueName: \"kubernetes.io/projected/15e658d7-b575-4e5a-a0f2-3d1adcc41cc0-kube-api-access-pfl66\") pod \"nova-api-0\" (UID: \"15e658d7-b575-4e5a-a0f2-3d1adcc41cc0\") " pod="openstack/nova-api-0" Mar 09 13:43:10 crc kubenswrapper[4764]: I0309 13:43:10.259412 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3d70a54-660f-4ef9-bd2a-ed16699d8d66-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"e3d70a54-660f-4ef9-bd2a-ed16699d8d66\") " 
pod="openstack/nova-scheduler-0" Mar 09 13:43:10 crc kubenswrapper[4764]: I0309 13:43:10.263943 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/15e658d7-b575-4e5a-a0f2-3d1adcc41cc0-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"15e658d7-b575-4e5a-a0f2-3d1adcc41cc0\") " pod="openstack/nova-api-0" Mar 09 13:43:10 crc kubenswrapper[4764]: I0309 13:43:10.264004 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e3d70a54-660f-4ef9-bd2a-ed16699d8d66-config-data\") pod \"nova-scheduler-0\" (UID: \"e3d70a54-660f-4ef9-bd2a-ed16699d8d66\") " pod="openstack/nova-scheduler-0" Mar 09 13:43:10 crc kubenswrapper[4764]: I0309 13:43:10.264567 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/15e658d7-b575-4e5a-a0f2-3d1adcc41cc0-config-data\") pod \"nova-api-0\" (UID: \"15e658d7-b575-4e5a-a0f2-3d1adcc41cc0\") " pod="openstack/nova-api-0" Mar 09 13:43:10 crc kubenswrapper[4764]: I0309 13:43:10.265370 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3d70a54-660f-4ef9-bd2a-ed16699d8d66-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"e3d70a54-660f-4ef9-bd2a-ed16699d8d66\") " pod="openstack/nova-scheduler-0" Mar 09 13:43:10 crc kubenswrapper[4764]: I0309 13:43:10.278597 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qtx9b\" (UniqueName: \"kubernetes.io/projected/e3d70a54-660f-4ef9-bd2a-ed16699d8d66-kube-api-access-qtx9b\") pod \"nova-scheduler-0\" (UID: \"e3d70a54-660f-4ef9-bd2a-ed16699d8d66\") " pod="openstack/nova-scheduler-0" Mar 09 13:43:10 crc kubenswrapper[4764]: I0309 13:43:10.279710 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pfl66\" (UniqueName: 
\"kubernetes.io/projected/15e658d7-b575-4e5a-a0f2-3d1adcc41cc0-kube-api-access-pfl66\") pod \"nova-api-0\" (UID: \"15e658d7-b575-4e5a-a0f2-3d1adcc41cc0\") " pod="openstack/nova-api-0" Mar 09 13:43:10 crc kubenswrapper[4764]: I0309 13:43:10.380915 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 09 13:43:10 crc kubenswrapper[4764]: I0309 13:43:10.393009 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 09 13:43:10 crc kubenswrapper[4764]: I0309 13:43:10.901495 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 09 13:43:10 crc kubenswrapper[4764]: I0309 13:43:10.968774 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 09 13:43:10 crc kubenswrapper[4764]: W0309 13:43:10.969193 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod15e658d7_b575_4e5a_a0f2_3d1adcc41cc0.slice/crio-6553270ae008524ac846200ba9108d557920af3b1024ead659efdad57728b5bc WatchSource:0}: Error finding container 6553270ae008524ac846200ba9108d557920af3b1024ead659efdad57728b5bc: Status 404 returned error can't find the container with id 6553270ae008524ac846200ba9108d557920af3b1024ead659efdad57728b5bc Mar 09 13:43:10 crc kubenswrapper[4764]: I0309 13:43:10.989336 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"e3d70a54-660f-4ef9-bd2a-ed16699d8d66","Type":"ContainerStarted","Data":"737be67408dde868ad9928c7e4b5b5a92634607014e02a50994bdc6c48b356c6"} Mar 09 13:43:10 crc kubenswrapper[4764]: I0309 13:43:10.990610 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"15e658d7-b575-4e5a-a0f2-3d1adcc41cc0","Type":"ContainerStarted","Data":"6553270ae008524ac846200ba9108d557920af3b1024ead659efdad57728b5bc"} Mar 09 13:43:11 crc 
kubenswrapper[4764]: I0309 13:43:11.309540 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 09 13:43:11 crc kubenswrapper[4764]: I0309 13:43:11.309615 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 09 13:43:11 crc kubenswrapper[4764]: I0309 13:43:11.572793 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7cf59ae6-37a7-49a9-846d-e7815a57bda6" path="/var/lib/kubelet/pods/7cf59ae6-37a7-49a9-846d-e7815a57bda6/volumes" Mar 09 13:43:11 crc kubenswrapper[4764]: I0309 13:43:11.573406 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd0fc8c3-6a60-4629-8b8d-8dd8b471f959" path="/var/lib/kubelet/pods/bd0fc8c3-6a60-4629-8b8d-8dd8b471f959/volumes" Mar 09 13:43:12 crc kubenswrapper[4764]: I0309 13:43:12.002527 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"15e658d7-b575-4e5a-a0f2-3d1adcc41cc0","Type":"ContainerStarted","Data":"e8c68c03da7f4747913f3409f5ace642ba6a37cb32da7a694e3af908791475c8"} Mar 09 13:43:12 crc kubenswrapper[4764]: I0309 13:43:12.003087 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"15e658d7-b575-4e5a-a0f2-3d1adcc41cc0","Type":"ContainerStarted","Data":"62f7feefecbbe169538a57f544a5a783ae3f5d36a7e66415eb161a48e16ae91a"} Mar 09 13:43:12 crc kubenswrapper[4764]: I0309 13:43:12.004826 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"e3d70a54-660f-4ef9-bd2a-ed16699d8d66","Type":"ContainerStarted","Data":"dc09c0ba380d56f5c82ec457ccdcea9f947bd8b08307a77c8684608027a7c1ac"} Mar 09 13:43:12 crc kubenswrapper[4764]: I0309 13:43:12.044540 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.044503735 podStartE2EDuration="2.044503735s" podCreationTimestamp="2026-03-09 13:43:10 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:43:12.024426059 +0000 UTC m=+1347.274597967" watchObservedRunningTime="2026-03-09 13:43:12.044503735 +0000 UTC m=+1347.294675663" Mar 09 13:43:12 crc kubenswrapper[4764]: I0309 13:43:12.050091 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.050070493 podStartE2EDuration="2.050070493s" podCreationTimestamp="2026-03-09 13:43:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:43:12.045958844 +0000 UTC m=+1347.296130752" watchObservedRunningTime="2026-03-09 13:43:12.050070493 +0000 UTC m=+1347.300242401" Mar 09 13:43:14 crc kubenswrapper[4764]: I0309 13:43:14.365136 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Mar 09 13:43:15 crc kubenswrapper[4764]: I0309 13:43:15.393782 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Mar 09 13:43:16 crc kubenswrapper[4764]: I0309 13:43:16.309417 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 09 13:43:16 crc kubenswrapper[4764]: I0309 13:43:16.309809 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 09 13:43:17 crc kubenswrapper[4764]: I0309 13:43:17.323860 4764 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="8aba6bca-21f2-4e18-90e7-098c8541a4f4" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.185:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 09 13:43:17 crc kubenswrapper[4764]: I0309 13:43:17.323860 4764 prober.go:107] "Probe failed" 
probeType="Startup" pod="openstack/nova-metadata-0" podUID="8aba6bca-21f2-4e18-90e7-098c8541a4f4" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.185:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 09 13:43:20 crc kubenswrapper[4764]: I0309 13:43:20.381732 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 09 13:43:20 crc kubenswrapper[4764]: I0309 13:43:20.382228 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 09 13:43:20 crc kubenswrapper[4764]: I0309 13:43:20.394083 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Mar 09 13:43:20 crc kubenswrapper[4764]: I0309 13:43:20.437876 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Mar 09 13:43:21 crc kubenswrapper[4764]: I0309 13:43:21.132009 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Mar 09 13:43:21 crc kubenswrapper[4764]: I0309 13:43:21.464999 4764 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="15e658d7-b575-4e5a-a0f2-3d1adcc41cc0" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.186:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 09 13:43:21 crc kubenswrapper[4764]: I0309 13:43:21.465062 4764 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="15e658d7-b575-4e5a-a0f2-3d1adcc41cc0" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.186:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 09 13:43:26 crc kubenswrapper[4764]: I0309 13:43:26.316748 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openstack/nova-metadata-0" Mar 09 13:43:26 crc kubenswrapper[4764]: I0309 13:43:26.320802 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Mar 09 13:43:26 crc kubenswrapper[4764]: I0309 13:43:26.326656 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Mar 09 13:43:26 crc kubenswrapper[4764]: I0309 13:43:26.327346 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Mar 09 13:43:27 crc kubenswrapper[4764]: I0309 13:43:27.091292 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 09 13:43:27 crc kubenswrapper[4764]: I0309 13:43:27.128433 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/971aa9eb-f331-425d-bf49-d626f3552480-combined-ca-bundle\") pod \"971aa9eb-f331-425d-bf49-d626f3552480\" (UID: \"971aa9eb-f331-425d-bf49-d626f3552480\") " Mar 09 13:43:27 crc kubenswrapper[4764]: I0309 13:43:27.128504 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/971aa9eb-f331-425d-bf49-d626f3552480-config-data\") pod \"971aa9eb-f331-425d-bf49-d626f3552480\" (UID: \"971aa9eb-f331-425d-bf49-d626f3552480\") " Mar 09 13:43:27 crc kubenswrapper[4764]: I0309 13:43:27.128716 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ntmvb\" (UniqueName: \"kubernetes.io/projected/971aa9eb-f331-425d-bf49-d626f3552480-kube-api-access-ntmvb\") pod \"971aa9eb-f331-425d-bf49-d626f3552480\" (UID: \"971aa9eb-f331-425d-bf49-d626f3552480\") " Mar 09 13:43:27 crc kubenswrapper[4764]: I0309 13:43:27.136087 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/971aa9eb-f331-425d-bf49-d626f3552480-kube-api-access-ntmvb" (OuterVolumeSpecName: "kube-api-access-ntmvb") pod "971aa9eb-f331-425d-bf49-d626f3552480" (UID: "971aa9eb-f331-425d-bf49-d626f3552480"). InnerVolumeSpecName "kube-api-access-ntmvb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:43:27 crc kubenswrapper[4764]: I0309 13:43:27.156635 4764 generic.go:334] "Generic (PLEG): container finished" podID="971aa9eb-f331-425d-bf49-d626f3552480" containerID="fb0bb019badceace7ef69226556c5b5a38430ea28c0d0fd7f30e7e3614d9a9fb" exitCode=137 Mar 09 13:43:27 crc kubenswrapper[4764]: I0309 13:43:27.157964 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 09 13:43:27 crc kubenswrapper[4764]: I0309 13:43:27.158141 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"971aa9eb-f331-425d-bf49-d626f3552480","Type":"ContainerDied","Data":"fb0bb019badceace7ef69226556c5b5a38430ea28c0d0fd7f30e7e3614d9a9fb"} Mar 09 13:43:27 crc kubenswrapper[4764]: I0309 13:43:27.158185 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"971aa9eb-f331-425d-bf49-d626f3552480","Type":"ContainerDied","Data":"bd5e9cec9ddcd16266d8f6864eddb8fb5fac434d1ee835511cb22e846c128223"} Mar 09 13:43:27 crc kubenswrapper[4764]: I0309 13:43:27.158209 4764 scope.go:117] "RemoveContainer" containerID="fb0bb019badceace7ef69226556c5b5a38430ea28c0d0fd7f30e7e3614d9a9fb" Mar 09 13:43:27 crc kubenswrapper[4764]: I0309 13:43:27.162050 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/971aa9eb-f331-425d-bf49-d626f3552480-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "971aa9eb-f331-425d-bf49-d626f3552480" (UID: "971aa9eb-f331-425d-bf49-d626f3552480"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:43:27 crc kubenswrapper[4764]: I0309 13:43:27.162879 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/971aa9eb-f331-425d-bf49-d626f3552480-config-data" (OuterVolumeSpecName: "config-data") pod "971aa9eb-f331-425d-bf49-d626f3552480" (UID: "971aa9eb-f331-425d-bf49-d626f3552480"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:43:27 crc kubenswrapper[4764]: I0309 13:43:27.230169 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ntmvb\" (UniqueName: \"kubernetes.io/projected/971aa9eb-f331-425d-bf49-d626f3552480-kube-api-access-ntmvb\") on node \"crc\" DevicePath \"\"" Mar 09 13:43:27 crc kubenswrapper[4764]: I0309 13:43:27.230213 4764 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/971aa9eb-f331-425d-bf49-d626f3552480-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 13:43:27 crc kubenswrapper[4764]: I0309 13:43:27.230228 4764 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/971aa9eb-f331-425d-bf49-d626f3552480-config-data\") on node \"crc\" DevicePath \"\"" Mar 09 13:43:27 crc kubenswrapper[4764]: I0309 13:43:27.248129 4764 scope.go:117] "RemoveContainer" containerID="fb0bb019badceace7ef69226556c5b5a38430ea28c0d0fd7f30e7e3614d9a9fb" Mar 09 13:43:27 crc kubenswrapper[4764]: E0309 13:43:27.248809 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fb0bb019badceace7ef69226556c5b5a38430ea28c0d0fd7f30e7e3614d9a9fb\": container with ID starting with fb0bb019badceace7ef69226556c5b5a38430ea28c0d0fd7f30e7e3614d9a9fb not found: ID does not exist" containerID="fb0bb019badceace7ef69226556c5b5a38430ea28c0d0fd7f30e7e3614d9a9fb" Mar 09 13:43:27 crc kubenswrapper[4764]: I0309 
13:43:27.248872 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fb0bb019badceace7ef69226556c5b5a38430ea28c0d0fd7f30e7e3614d9a9fb"} err="failed to get container status \"fb0bb019badceace7ef69226556c5b5a38430ea28c0d0fd7f30e7e3614d9a9fb\": rpc error: code = NotFound desc = could not find container \"fb0bb019badceace7ef69226556c5b5a38430ea28c0d0fd7f30e7e3614d9a9fb\": container with ID starting with fb0bb019badceace7ef69226556c5b5a38430ea28c0d0fd7f30e7e3614d9a9fb not found: ID does not exist" Mar 09 13:43:27 crc kubenswrapper[4764]: I0309 13:43:27.497078 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 09 13:43:27 crc kubenswrapper[4764]: I0309 13:43:27.521800 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 09 13:43:27 crc kubenswrapper[4764]: I0309 13:43:27.537052 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 09 13:43:27 crc kubenswrapper[4764]: E0309 13:43:27.537717 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="971aa9eb-f331-425d-bf49-d626f3552480" containerName="nova-cell1-novncproxy-novncproxy" Mar 09 13:43:27 crc kubenswrapper[4764]: I0309 13:43:27.537746 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="971aa9eb-f331-425d-bf49-d626f3552480" containerName="nova-cell1-novncproxy-novncproxy" Mar 09 13:43:27 crc kubenswrapper[4764]: I0309 13:43:27.537995 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="971aa9eb-f331-425d-bf49-d626f3552480" containerName="nova-cell1-novncproxy-novncproxy" Mar 09 13:43:27 crc kubenswrapper[4764]: I0309 13:43:27.539029 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 09 13:43:27 crc kubenswrapper[4764]: I0309 13:43:27.543403 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Mar 09 13:43:27 crc kubenswrapper[4764]: I0309 13:43:27.543677 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-public-svc" Mar 09 13:43:27 crc kubenswrapper[4764]: I0309 13:43:27.543804 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-vencrypt" Mar 09 13:43:27 crc kubenswrapper[4764]: I0309 13:43:27.547475 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 09 13:43:27 crc kubenswrapper[4764]: I0309 13:43:27.576844 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="971aa9eb-f331-425d-bf49-d626f3552480" path="/var/lib/kubelet/pods/971aa9eb-f331-425d-bf49-d626f3552480/volumes" Mar 09 13:43:27 crc kubenswrapper[4764]: I0309 13:43:27.640270 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6932dd15-578a-4965-bcb9-b506d4e3cd2f-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"6932dd15-578a-4965-bcb9-b506d4e3cd2f\") " pod="openstack/nova-cell1-novncproxy-0" Mar 09 13:43:27 crc kubenswrapper[4764]: I0309 13:43:27.640323 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jknct\" (UniqueName: \"kubernetes.io/projected/6932dd15-578a-4965-bcb9-b506d4e3cd2f-kube-api-access-jknct\") pod \"nova-cell1-novncproxy-0\" (UID: \"6932dd15-578a-4965-bcb9-b506d4e3cd2f\") " pod="openstack/nova-cell1-novncproxy-0" Mar 09 13:43:27 crc kubenswrapper[4764]: I0309 13:43:27.640357 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/6932dd15-578a-4965-bcb9-b506d4e3cd2f-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"6932dd15-578a-4965-bcb9-b506d4e3cd2f\") " pod="openstack/nova-cell1-novncproxy-0" Mar 09 13:43:27 crc kubenswrapper[4764]: I0309 13:43:27.640506 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6932dd15-578a-4965-bcb9-b506d4e3cd2f-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"6932dd15-578a-4965-bcb9-b506d4e3cd2f\") " pod="openstack/nova-cell1-novncproxy-0" Mar 09 13:43:27 crc kubenswrapper[4764]: I0309 13:43:27.640560 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/6932dd15-578a-4965-bcb9-b506d4e3cd2f-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"6932dd15-578a-4965-bcb9-b506d4e3cd2f\") " pod="openstack/nova-cell1-novncproxy-0" Mar 09 13:43:27 crc kubenswrapper[4764]: I0309 13:43:27.741832 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6932dd15-578a-4965-bcb9-b506d4e3cd2f-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"6932dd15-578a-4965-bcb9-b506d4e3cd2f\") " pod="openstack/nova-cell1-novncproxy-0" Mar 09 13:43:27 crc kubenswrapper[4764]: I0309 13:43:27.741900 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/6932dd15-578a-4965-bcb9-b506d4e3cd2f-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"6932dd15-578a-4965-bcb9-b506d4e3cd2f\") " pod="openstack/nova-cell1-novncproxy-0" Mar 09 13:43:27 crc kubenswrapper[4764]: I0309 13:43:27.741929 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/6932dd15-578a-4965-bcb9-b506d4e3cd2f-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"6932dd15-578a-4965-bcb9-b506d4e3cd2f\") " pod="openstack/nova-cell1-novncproxy-0" Mar 09 13:43:27 crc kubenswrapper[4764]: I0309 13:43:27.741954 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jknct\" (UniqueName: \"kubernetes.io/projected/6932dd15-578a-4965-bcb9-b506d4e3cd2f-kube-api-access-jknct\") pod \"nova-cell1-novncproxy-0\" (UID: \"6932dd15-578a-4965-bcb9-b506d4e3cd2f\") " pod="openstack/nova-cell1-novncproxy-0" Mar 09 13:43:27 crc kubenswrapper[4764]: I0309 13:43:27.741974 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/6932dd15-578a-4965-bcb9-b506d4e3cd2f-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"6932dd15-578a-4965-bcb9-b506d4e3cd2f\") " pod="openstack/nova-cell1-novncproxy-0" Mar 09 13:43:27 crc kubenswrapper[4764]: I0309 13:43:27.747046 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/6932dd15-578a-4965-bcb9-b506d4e3cd2f-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"6932dd15-578a-4965-bcb9-b506d4e3cd2f\") " pod="openstack/nova-cell1-novncproxy-0" Mar 09 13:43:27 crc kubenswrapper[4764]: I0309 13:43:27.747432 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6932dd15-578a-4965-bcb9-b506d4e3cd2f-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"6932dd15-578a-4965-bcb9-b506d4e3cd2f\") " pod="openstack/nova-cell1-novncproxy-0" Mar 09 13:43:27 crc kubenswrapper[4764]: I0309 13:43:27.747767 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/6932dd15-578a-4965-bcb9-b506d4e3cd2f-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"6932dd15-578a-4965-bcb9-b506d4e3cd2f\") " pod="openstack/nova-cell1-novncproxy-0" Mar 09 13:43:27 crc kubenswrapper[4764]: I0309 13:43:27.748274 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6932dd15-578a-4965-bcb9-b506d4e3cd2f-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"6932dd15-578a-4965-bcb9-b506d4e3cd2f\") " pod="openstack/nova-cell1-novncproxy-0" Mar 09 13:43:27 crc kubenswrapper[4764]: I0309 13:43:27.764292 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jknct\" (UniqueName: \"kubernetes.io/projected/6932dd15-578a-4965-bcb9-b506d4e3cd2f-kube-api-access-jknct\") pod \"nova-cell1-novncproxy-0\" (UID: \"6932dd15-578a-4965-bcb9-b506d4e3cd2f\") " pod="openstack/nova-cell1-novncproxy-0" Mar 09 13:43:27 crc kubenswrapper[4764]: I0309 13:43:27.859405 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 09 13:43:28 crc kubenswrapper[4764]: I0309 13:43:28.155416 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 09 13:43:28 crc kubenswrapper[4764]: I0309 13:43:28.370456 4764 patch_prober.go:28] interesting pod/machine-config-daemon-xxczl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 09 13:43:28 crc kubenswrapper[4764]: I0309 13:43:28.370577 4764 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 09 13:43:29 crc kubenswrapper[4764]: I0309 13:43:29.186837 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"6932dd15-578a-4965-bcb9-b506d4e3cd2f","Type":"ContainerStarted","Data":"0acc8aaa267e07479eac94d51afefd6535810f193ba4dfbfd67a21ef8822e104"} Mar 09 13:43:29 crc kubenswrapper[4764]: I0309 13:43:29.187485 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"6932dd15-578a-4965-bcb9-b506d4e3cd2f","Type":"ContainerStarted","Data":"3d495470829f66481099a9f4e12b11b8d834944a8778a8e0089111f310a90d03"} Mar 09 13:43:29 crc kubenswrapper[4764]: I0309 13:43:29.222014 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.221980193 podStartE2EDuration="2.221980193s" podCreationTimestamp="2026-03-09 13:43:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-03-09 13:43:29.21213971 +0000 UTC m=+1364.462311648" watchObservedRunningTime="2026-03-09 13:43:29.221980193 +0000 UTC m=+1364.472152111" Mar 09 13:43:30 crc kubenswrapper[4764]: I0309 13:43:30.389730 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Mar 09 13:43:30 crc kubenswrapper[4764]: I0309 13:43:30.390311 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Mar 09 13:43:30 crc kubenswrapper[4764]: I0309 13:43:30.392099 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 09 13:43:30 crc kubenswrapper[4764]: I0309 13:43:30.392149 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 09 13:43:30 crc kubenswrapper[4764]: I0309 13:43:30.399195 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Mar 09 13:43:30 crc kubenswrapper[4764]: I0309 13:43:30.403245 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Mar 09 13:43:30 crc kubenswrapper[4764]: I0309 13:43:30.643041 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-68d4b6d797-bwpkr"] Mar 09 13:43:30 crc kubenswrapper[4764]: I0309 13:43:30.644554 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-68d4b6d797-bwpkr" Mar 09 13:43:30 crc kubenswrapper[4764]: I0309 13:43:30.711308 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-68d4b6d797-bwpkr"] Mar 09 13:43:30 crc kubenswrapper[4764]: I0309 13:43:30.730062 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4ae25624-74af-4de4-8aa1-14ea5dbc7b68-dns-svc\") pod \"dnsmasq-dns-68d4b6d797-bwpkr\" (UID: \"4ae25624-74af-4de4-8aa1-14ea5dbc7b68\") " pod="openstack/dnsmasq-dns-68d4b6d797-bwpkr" Mar 09 13:43:30 crc kubenswrapper[4764]: I0309 13:43:30.730150 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4ae25624-74af-4de4-8aa1-14ea5dbc7b68-config\") pod \"dnsmasq-dns-68d4b6d797-bwpkr\" (UID: \"4ae25624-74af-4de4-8aa1-14ea5dbc7b68\") " pod="openstack/dnsmasq-dns-68d4b6d797-bwpkr" Mar 09 13:43:30 crc kubenswrapper[4764]: I0309 13:43:30.730184 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4ae25624-74af-4de4-8aa1-14ea5dbc7b68-ovsdbserver-sb\") pod \"dnsmasq-dns-68d4b6d797-bwpkr\" (UID: \"4ae25624-74af-4de4-8aa1-14ea5dbc7b68\") " pod="openstack/dnsmasq-dns-68d4b6d797-bwpkr" Mar 09 13:43:30 crc kubenswrapper[4764]: I0309 13:43:30.730312 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4ae25624-74af-4de4-8aa1-14ea5dbc7b68-ovsdbserver-nb\") pod \"dnsmasq-dns-68d4b6d797-bwpkr\" (UID: \"4ae25624-74af-4de4-8aa1-14ea5dbc7b68\") " pod="openstack/dnsmasq-dns-68d4b6d797-bwpkr" Mar 09 13:43:30 crc kubenswrapper[4764]: I0309 13:43:30.730354 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-8jzfc\" (UniqueName: \"kubernetes.io/projected/4ae25624-74af-4de4-8aa1-14ea5dbc7b68-kube-api-access-8jzfc\") pod \"dnsmasq-dns-68d4b6d797-bwpkr\" (UID: \"4ae25624-74af-4de4-8aa1-14ea5dbc7b68\") " pod="openstack/dnsmasq-dns-68d4b6d797-bwpkr" Mar 09 13:43:30 crc kubenswrapper[4764]: I0309 13:43:30.835447 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4ae25624-74af-4de4-8aa1-14ea5dbc7b68-ovsdbserver-nb\") pod \"dnsmasq-dns-68d4b6d797-bwpkr\" (UID: \"4ae25624-74af-4de4-8aa1-14ea5dbc7b68\") " pod="openstack/dnsmasq-dns-68d4b6d797-bwpkr" Mar 09 13:43:30 crc kubenswrapper[4764]: I0309 13:43:30.835538 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8jzfc\" (UniqueName: \"kubernetes.io/projected/4ae25624-74af-4de4-8aa1-14ea5dbc7b68-kube-api-access-8jzfc\") pod \"dnsmasq-dns-68d4b6d797-bwpkr\" (UID: \"4ae25624-74af-4de4-8aa1-14ea5dbc7b68\") " pod="openstack/dnsmasq-dns-68d4b6d797-bwpkr" Mar 09 13:43:30 crc kubenswrapper[4764]: I0309 13:43:30.836981 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4ae25624-74af-4de4-8aa1-14ea5dbc7b68-ovsdbserver-nb\") pod \"dnsmasq-dns-68d4b6d797-bwpkr\" (UID: \"4ae25624-74af-4de4-8aa1-14ea5dbc7b68\") " pod="openstack/dnsmasq-dns-68d4b6d797-bwpkr" Mar 09 13:43:30 crc kubenswrapper[4764]: I0309 13:43:30.837548 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4ae25624-74af-4de4-8aa1-14ea5dbc7b68-dns-svc\") pod \"dnsmasq-dns-68d4b6d797-bwpkr\" (UID: \"4ae25624-74af-4de4-8aa1-14ea5dbc7b68\") " pod="openstack/dnsmasq-dns-68d4b6d797-bwpkr" Mar 09 13:43:30 crc kubenswrapper[4764]: I0309 13:43:30.835657 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/4ae25624-74af-4de4-8aa1-14ea5dbc7b68-dns-svc\") pod \"dnsmasq-dns-68d4b6d797-bwpkr\" (UID: \"4ae25624-74af-4de4-8aa1-14ea5dbc7b68\") " pod="openstack/dnsmasq-dns-68d4b6d797-bwpkr" Mar 09 13:43:30 crc kubenswrapper[4764]: I0309 13:43:30.841047 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4ae25624-74af-4de4-8aa1-14ea5dbc7b68-config\") pod \"dnsmasq-dns-68d4b6d797-bwpkr\" (UID: \"4ae25624-74af-4de4-8aa1-14ea5dbc7b68\") " pod="openstack/dnsmasq-dns-68d4b6d797-bwpkr" Mar 09 13:43:30 crc kubenswrapper[4764]: I0309 13:43:30.841100 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4ae25624-74af-4de4-8aa1-14ea5dbc7b68-ovsdbserver-sb\") pod \"dnsmasq-dns-68d4b6d797-bwpkr\" (UID: \"4ae25624-74af-4de4-8aa1-14ea5dbc7b68\") " pod="openstack/dnsmasq-dns-68d4b6d797-bwpkr" Mar 09 13:43:30 crc kubenswrapper[4764]: I0309 13:43:30.841940 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4ae25624-74af-4de4-8aa1-14ea5dbc7b68-ovsdbserver-sb\") pod \"dnsmasq-dns-68d4b6d797-bwpkr\" (UID: \"4ae25624-74af-4de4-8aa1-14ea5dbc7b68\") " pod="openstack/dnsmasq-dns-68d4b6d797-bwpkr" Mar 09 13:43:30 crc kubenswrapper[4764]: I0309 13:43:30.842521 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4ae25624-74af-4de4-8aa1-14ea5dbc7b68-config\") pod \"dnsmasq-dns-68d4b6d797-bwpkr\" (UID: \"4ae25624-74af-4de4-8aa1-14ea5dbc7b68\") " pod="openstack/dnsmasq-dns-68d4b6d797-bwpkr" Mar 09 13:43:30 crc kubenswrapper[4764]: I0309 13:43:30.871350 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8jzfc\" (UniqueName: \"kubernetes.io/projected/4ae25624-74af-4de4-8aa1-14ea5dbc7b68-kube-api-access-8jzfc\") pod 
\"dnsmasq-dns-68d4b6d797-bwpkr\" (UID: \"4ae25624-74af-4de4-8aa1-14ea5dbc7b68\") " pod="openstack/dnsmasq-dns-68d4b6d797-bwpkr" Mar 09 13:43:30 crc kubenswrapper[4764]: I0309 13:43:30.972052 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-68d4b6d797-bwpkr" Mar 09 13:43:31 crc kubenswrapper[4764]: I0309 13:43:31.575925 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-68d4b6d797-bwpkr"] Mar 09 13:43:32 crc kubenswrapper[4764]: I0309 13:43:32.237290 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-68d4b6d797-bwpkr" event={"ID":"4ae25624-74af-4de4-8aa1-14ea5dbc7b68","Type":"ContainerStarted","Data":"3ac89812f31874fa083542c67d0593b58a24f3774ac9cde06937d2e9c1a94aaf"} Mar 09 13:43:32 crc kubenswrapper[4764]: I0309 13:43:32.859919 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Mar 09 13:43:33 crc kubenswrapper[4764]: I0309 13:43:33.247924 4764 generic.go:334] "Generic (PLEG): container finished" podID="4ae25624-74af-4de4-8aa1-14ea5dbc7b68" containerID="1417bd2805b59981136a1861cd39d3dca99a51d7e7919e8b5c01839da7f91123" exitCode=0 Mar 09 13:43:33 crc kubenswrapper[4764]: I0309 13:43:33.247972 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-68d4b6d797-bwpkr" event={"ID":"4ae25624-74af-4de4-8aa1-14ea5dbc7b68","Type":"ContainerDied","Data":"1417bd2805b59981136a1861cd39d3dca99a51d7e7919e8b5c01839da7f91123"} Mar 09 13:43:33 crc kubenswrapper[4764]: I0309 13:43:33.519801 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 09 13:43:33 crc kubenswrapper[4764]: I0309 13:43:33.520721 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="85e5a081-f0bc-457a-a3f0-9f9152f942c3" containerName="ceilometer-central-agent" 
containerID="cri-o://83e456ab335c632ce0189226c9d2e7f0e9ecfe5a406e7edda24caa25bafb3539" gracePeriod=30 Mar 09 13:43:33 crc kubenswrapper[4764]: I0309 13:43:33.520804 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="85e5a081-f0bc-457a-a3f0-9f9152f942c3" containerName="ceilometer-notification-agent" containerID="cri-o://14dd461ed43101072708519038bb785a1ba517ec81427d8fb63817a3eca45624" gracePeriod=30 Mar 09 13:43:33 crc kubenswrapper[4764]: I0309 13:43:33.520804 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="85e5a081-f0bc-457a-a3f0-9f9152f942c3" containerName="sg-core" containerID="cri-o://b944ec589887c7cb6534662085f3e5e58d2f1dce03f0aa12b1ed3e01a649c84e" gracePeriod=30 Mar 09 13:43:33 crc kubenswrapper[4764]: I0309 13:43:33.520835 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="85e5a081-f0bc-457a-a3f0-9f9152f942c3" containerName="proxy-httpd" containerID="cri-o://6967373dce85bc726e8932709b98fc503ab51918ee1216795cbe6f5192aa2e00" gracePeriod=30 Mar 09 13:43:33 crc kubenswrapper[4764]: I0309 13:43:33.647424 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 09 13:43:33 crc kubenswrapper[4764]: I0309 13:43:33.647782 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="15e658d7-b575-4e5a-a0f2-3d1adcc41cc0" containerName="nova-api-log" containerID="cri-o://62f7feefecbbe169538a57f544a5a783ae3f5d36a7e66415eb161a48e16ae91a" gracePeriod=30 Mar 09 13:43:33 crc kubenswrapper[4764]: I0309 13:43:33.647991 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="15e658d7-b575-4e5a-a0f2-3d1adcc41cc0" containerName="nova-api-api" containerID="cri-o://e8c68c03da7f4747913f3409f5ace642ba6a37cb32da7a694e3af908791475c8" gracePeriod=30 Mar 09 13:43:33 crc 
kubenswrapper[4764]: I0309 13:43:33.845156 4764 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="85e5a081-f0bc-457a-a3f0-9f9152f942c3" containerName="proxy-httpd" probeResult="failure" output="Get \"https://10.217.0.174:3000/\": dial tcp 10.217.0.174:3000: connect: connection refused" Mar 09 13:43:34 crc kubenswrapper[4764]: I0309 13:43:34.260561 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-68d4b6d797-bwpkr" event={"ID":"4ae25624-74af-4de4-8aa1-14ea5dbc7b68","Type":"ContainerStarted","Data":"37fa28134b3929cb4f988c4f648598adc00dd97c039b66d6a61153de431009b4"} Mar 09 13:43:34 crc kubenswrapper[4764]: I0309 13:43:34.262186 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-68d4b6d797-bwpkr" Mar 09 13:43:34 crc kubenswrapper[4764]: I0309 13:43:34.265571 4764 generic.go:334] "Generic (PLEG): container finished" podID="85e5a081-f0bc-457a-a3f0-9f9152f942c3" containerID="6967373dce85bc726e8932709b98fc503ab51918ee1216795cbe6f5192aa2e00" exitCode=0 Mar 09 13:43:34 crc kubenswrapper[4764]: I0309 13:43:34.265595 4764 generic.go:334] "Generic (PLEG): container finished" podID="85e5a081-f0bc-457a-a3f0-9f9152f942c3" containerID="b944ec589887c7cb6534662085f3e5e58d2f1dce03f0aa12b1ed3e01a649c84e" exitCode=2 Mar 09 13:43:34 crc kubenswrapper[4764]: I0309 13:43:34.265604 4764 generic.go:334] "Generic (PLEG): container finished" podID="85e5a081-f0bc-457a-a3f0-9f9152f942c3" containerID="14dd461ed43101072708519038bb785a1ba517ec81427d8fb63817a3eca45624" exitCode=0 Mar 09 13:43:34 crc kubenswrapper[4764]: I0309 13:43:34.265613 4764 generic.go:334] "Generic (PLEG): container finished" podID="85e5a081-f0bc-457a-a3f0-9f9152f942c3" containerID="83e456ab335c632ce0189226c9d2e7f0e9ecfe5a406e7edda24caa25bafb3539" exitCode=0 Mar 09 13:43:34 crc kubenswrapper[4764]: I0309 13:43:34.265666 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"85e5a081-f0bc-457a-a3f0-9f9152f942c3","Type":"ContainerDied","Data":"6967373dce85bc726e8932709b98fc503ab51918ee1216795cbe6f5192aa2e00"} Mar 09 13:43:34 crc kubenswrapper[4764]: I0309 13:43:34.265685 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"85e5a081-f0bc-457a-a3f0-9f9152f942c3","Type":"ContainerDied","Data":"b944ec589887c7cb6534662085f3e5e58d2f1dce03f0aa12b1ed3e01a649c84e"} Mar 09 13:43:34 crc kubenswrapper[4764]: I0309 13:43:34.265697 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"85e5a081-f0bc-457a-a3f0-9f9152f942c3","Type":"ContainerDied","Data":"14dd461ed43101072708519038bb785a1ba517ec81427d8fb63817a3eca45624"} Mar 09 13:43:34 crc kubenswrapper[4764]: I0309 13:43:34.265707 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"85e5a081-f0bc-457a-a3f0-9f9152f942c3","Type":"ContainerDied","Data":"83e456ab335c632ce0189226c9d2e7f0e9ecfe5a406e7edda24caa25bafb3539"} Mar 09 13:43:34 crc kubenswrapper[4764]: I0309 13:43:34.268112 4764 generic.go:334] "Generic (PLEG): container finished" podID="15e658d7-b575-4e5a-a0f2-3d1adcc41cc0" containerID="62f7feefecbbe169538a57f544a5a783ae3f5d36a7e66415eb161a48e16ae91a" exitCode=143 Mar 09 13:43:34 crc kubenswrapper[4764]: I0309 13:43:34.268172 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"15e658d7-b575-4e5a-a0f2-3d1adcc41cc0","Type":"ContainerDied","Data":"62f7feefecbbe169538a57f544a5a783ae3f5d36a7e66415eb161a48e16ae91a"} Mar 09 13:43:34 crc kubenswrapper[4764]: I0309 13:43:34.285754 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-68d4b6d797-bwpkr" podStartSLOduration=4.2857334080000005 podStartE2EDuration="4.285733408s" podCreationTimestamp="2026-03-09 13:43:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 
UTC" observedRunningTime="2026-03-09 13:43:34.284388193 +0000 UTC m=+1369.534560101" watchObservedRunningTime="2026-03-09 13:43:34.285733408 +0000 UTC m=+1369.535905316" Mar 09 13:43:34 crc kubenswrapper[4764]: I0309 13:43:34.451837 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 09 13:43:34 crc kubenswrapper[4764]: I0309 13:43:34.539486 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/85e5a081-f0bc-457a-a3f0-9f9152f942c3-config-data\") pod \"85e5a081-f0bc-457a-a3f0-9f9152f942c3\" (UID: \"85e5a081-f0bc-457a-a3f0-9f9152f942c3\") " Mar 09 13:43:34 crc kubenswrapper[4764]: I0309 13:43:34.539675 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/85e5a081-f0bc-457a-a3f0-9f9152f942c3-scripts\") pod \"85e5a081-f0bc-457a-a3f0-9f9152f942c3\" (UID: \"85e5a081-f0bc-457a-a3f0-9f9152f942c3\") " Mar 09 13:43:34 crc kubenswrapper[4764]: I0309 13:43:34.539733 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/85e5a081-f0bc-457a-a3f0-9f9152f942c3-ceilometer-tls-certs\") pod \"85e5a081-f0bc-457a-a3f0-9f9152f942c3\" (UID: \"85e5a081-f0bc-457a-a3f0-9f9152f942c3\") " Mar 09 13:43:34 crc kubenswrapper[4764]: I0309 13:43:34.539769 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k9xwx\" (UniqueName: \"kubernetes.io/projected/85e5a081-f0bc-457a-a3f0-9f9152f942c3-kube-api-access-k9xwx\") pod \"85e5a081-f0bc-457a-a3f0-9f9152f942c3\" (UID: \"85e5a081-f0bc-457a-a3f0-9f9152f942c3\") " Mar 09 13:43:34 crc kubenswrapper[4764]: I0309 13:43:34.539806 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/85e5a081-f0bc-457a-a3f0-9f9152f942c3-run-httpd\") pod \"85e5a081-f0bc-457a-a3f0-9f9152f942c3\" (UID: \"85e5a081-f0bc-457a-a3f0-9f9152f942c3\") " Mar 09 13:43:34 crc kubenswrapper[4764]: I0309 13:43:34.539850 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/85e5a081-f0bc-457a-a3f0-9f9152f942c3-combined-ca-bundle\") pod \"85e5a081-f0bc-457a-a3f0-9f9152f942c3\" (UID: \"85e5a081-f0bc-457a-a3f0-9f9152f942c3\") " Mar 09 13:43:34 crc kubenswrapper[4764]: I0309 13:43:34.539876 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/85e5a081-f0bc-457a-a3f0-9f9152f942c3-sg-core-conf-yaml\") pod \"85e5a081-f0bc-457a-a3f0-9f9152f942c3\" (UID: \"85e5a081-f0bc-457a-a3f0-9f9152f942c3\") " Mar 09 13:43:34 crc kubenswrapper[4764]: I0309 13:43:34.539938 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/85e5a081-f0bc-457a-a3f0-9f9152f942c3-log-httpd\") pod \"85e5a081-f0bc-457a-a3f0-9f9152f942c3\" (UID: \"85e5a081-f0bc-457a-a3f0-9f9152f942c3\") " Mar 09 13:43:34 crc kubenswrapper[4764]: I0309 13:43:34.540224 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/85e5a081-f0bc-457a-a3f0-9f9152f942c3-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "85e5a081-f0bc-457a-a3f0-9f9152f942c3" (UID: "85e5a081-f0bc-457a-a3f0-9f9152f942c3"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 13:43:34 crc kubenswrapper[4764]: I0309 13:43:34.540705 4764 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/85e5a081-f0bc-457a-a3f0-9f9152f942c3-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 09 13:43:34 crc kubenswrapper[4764]: I0309 13:43:34.541016 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/85e5a081-f0bc-457a-a3f0-9f9152f942c3-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "85e5a081-f0bc-457a-a3f0-9f9152f942c3" (UID: "85e5a081-f0bc-457a-a3f0-9f9152f942c3"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 13:43:34 crc kubenswrapper[4764]: I0309 13:43:34.562161 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/85e5a081-f0bc-457a-a3f0-9f9152f942c3-kube-api-access-k9xwx" (OuterVolumeSpecName: "kube-api-access-k9xwx") pod "85e5a081-f0bc-457a-a3f0-9f9152f942c3" (UID: "85e5a081-f0bc-457a-a3f0-9f9152f942c3"). InnerVolumeSpecName "kube-api-access-k9xwx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:43:34 crc kubenswrapper[4764]: I0309 13:43:34.570190 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/85e5a081-f0bc-457a-a3f0-9f9152f942c3-scripts" (OuterVolumeSpecName: "scripts") pod "85e5a081-f0bc-457a-a3f0-9f9152f942c3" (UID: "85e5a081-f0bc-457a-a3f0-9f9152f942c3"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:43:34 crc kubenswrapper[4764]: I0309 13:43:34.599680 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/85e5a081-f0bc-457a-a3f0-9f9152f942c3-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "85e5a081-f0bc-457a-a3f0-9f9152f942c3" (UID: "85e5a081-f0bc-457a-a3f0-9f9152f942c3"). 
InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:43:34 crc kubenswrapper[4764]: I0309 13:43:34.625919 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/85e5a081-f0bc-457a-a3f0-9f9152f942c3-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "85e5a081-f0bc-457a-a3f0-9f9152f942c3" (UID: "85e5a081-f0bc-457a-a3f0-9f9152f942c3"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:43:34 crc kubenswrapper[4764]: I0309 13:43:34.643739 4764 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/85e5a081-f0bc-457a-a3f0-9f9152f942c3-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 13:43:34 crc kubenswrapper[4764]: I0309 13:43:34.643779 4764 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/85e5a081-f0bc-457a-a3f0-9f9152f942c3-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 09 13:43:34 crc kubenswrapper[4764]: I0309 13:43:34.643794 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k9xwx\" (UniqueName: \"kubernetes.io/projected/85e5a081-f0bc-457a-a3f0-9f9152f942c3-kube-api-access-k9xwx\") on node \"crc\" DevicePath \"\"" Mar 09 13:43:34 crc kubenswrapper[4764]: I0309 13:43:34.643804 4764 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/85e5a081-f0bc-457a-a3f0-9f9152f942c3-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 09 13:43:34 crc kubenswrapper[4764]: I0309 13:43:34.643820 4764 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/85e5a081-f0bc-457a-a3f0-9f9152f942c3-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 09 13:43:34 crc kubenswrapper[4764]: I0309 13:43:34.660392 4764 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/85e5a081-f0bc-457a-a3f0-9f9152f942c3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "85e5a081-f0bc-457a-a3f0-9f9152f942c3" (UID: "85e5a081-f0bc-457a-a3f0-9f9152f942c3"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:43:34 crc kubenswrapper[4764]: I0309 13:43:34.670223 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/85e5a081-f0bc-457a-a3f0-9f9152f942c3-config-data" (OuterVolumeSpecName: "config-data") pod "85e5a081-f0bc-457a-a3f0-9f9152f942c3" (UID: "85e5a081-f0bc-457a-a3f0-9f9152f942c3"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:43:34 crc kubenswrapper[4764]: I0309 13:43:34.745366 4764 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/85e5a081-f0bc-457a-a3f0-9f9152f942c3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 13:43:34 crc kubenswrapper[4764]: I0309 13:43:34.745404 4764 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/85e5a081-f0bc-457a-a3f0-9f9152f942c3-config-data\") on node \"crc\" DevicePath \"\"" Mar 09 13:43:35 crc kubenswrapper[4764]: I0309 13:43:35.281991 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"85e5a081-f0bc-457a-a3f0-9f9152f942c3","Type":"ContainerDied","Data":"d2d07148bb1bfe75a3a625df2d14aba687a70d2b7d04873d08b4da416d307e9c"} Mar 09 13:43:35 crc kubenswrapper[4764]: I0309 13:43:35.282097 4764 scope.go:117] "RemoveContainer" containerID="6967373dce85bc726e8932709b98fc503ab51918ee1216795cbe6f5192aa2e00" Mar 09 13:43:35 crc kubenswrapper[4764]: I0309 13:43:35.282155 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 09 13:43:35 crc kubenswrapper[4764]: I0309 13:43:35.345507 4764 scope.go:117] "RemoveContainer" containerID="b944ec589887c7cb6534662085f3e5e58d2f1dce03f0aa12b1ed3e01a649c84e" Mar 09 13:43:35 crc kubenswrapper[4764]: I0309 13:43:35.347622 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 09 13:43:35 crc kubenswrapper[4764]: I0309 13:43:35.355970 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 09 13:43:35 crc kubenswrapper[4764]: I0309 13:43:35.376437 4764 scope.go:117] "RemoveContainer" containerID="14dd461ed43101072708519038bb785a1ba517ec81427d8fb63817a3eca45624" Mar 09 13:43:35 crc kubenswrapper[4764]: I0309 13:43:35.379722 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 09 13:43:35 crc kubenswrapper[4764]: E0309 13:43:35.380181 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="85e5a081-f0bc-457a-a3f0-9f9152f942c3" containerName="ceilometer-notification-agent" Mar 09 13:43:35 crc kubenswrapper[4764]: I0309 13:43:35.380198 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="85e5a081-f0bc-457a-a3f0-9f9152f942c3" containerName="ceilometer-notification-agent" Mar 09 13:43:35 crc kubenswrapper[4764]: E0309 13:43:35.380222 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="85e5a081-f0bc-457a-a3f0-9f9152f942c3" containerName="ceilometer-central-agent" Mar 09 13:43:35 crc kubenswrapper[4764]: I0309 13:43:35.380229 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="85e5a081-f0bc-457a-a3f0-9f9152f942c3" containerName="ceilometer-central-agent" Mar 09 13:43:35 crc kubenswrapper[4764]: E0309 13:43:35.380259 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="85e5a081-f0bc-457a-a3f0-9f9152f942c3" containerName="proxy-httpd" Mar 09 13:43:35 crc kubenswrapper[4764]: I0309 13:43:35.380267 4764 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="85e5a081-f0bc-457a-a3f0-9f9152f942c3" containerName="proxy-httpd" Mar 09 13:43:35 crc kubenswrapper[4764]: E0309 13:43:35.380279 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="85e5a081-f0bc-457a-a3f0-9f9152f942c3" containerName="sg-core" Mar 09 13:43:35 crc kubenswrapper[4764]: I0309 13:43:35.380288 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="85e5a081-f0bc-457a-a3f0-9f9152f942c3" containerName="sg-core" Mar 09 13:43:35 crc kubenswrapper[4764]: I0309 13:43:35.380461 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="85e5a081-f0bc-457a-a3f0-9f9152f942c3" containerName="proxy-httpd" Mar 09 13:43:35 crc kubenswrapper[4764]: I0309 13:43:35.380480 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="85e5a081-f0bc-457a-a3f0-9f9152f942c3" containerName="ceilometer-notification-agent" Mar 09 13:43:35 crc kubenswrapper[4764]: I0309 13:43:35.380501 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="85e5a081-f0bc-457a-a3f0-9f9152f942c3" containerName="sg-core" Mar 09 13:43:35 crc kubenswrapper[4764]: I0309 13:43:35.380509 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="85e5a081-f0bc-457a-a3f0-9f9152f942c3" containerName="ceilometer-central-agent" Mar 09 13:43:35 crc kubenswrapper[4764]: I0309 13:43:35.383808 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 09 13:43:35 crc kubenswrapper[4764]: I0309 13:43:35.391476 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Mar 09 13:43:35 crc kubenswrapper[4764]: I0309 13:43:35.392138 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 09 13:43:35 crc kubenswrapper[4764]: I0309 13:43:35.392676 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 09 13:43:35 crc kubenswrapper[4764]: I0309 13:43:35.398057 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 09 13:43:35 crc kubenswrapper[4764]: I0309 13:43:35.417508 4764 scope.go:117] "RemoveContainer" containerID="83e456ab335c632ce0189226c9d2e7f0e9ecfe5a406e7edda24caa25bafb3539" Mar 09 13:43:35 crc kubenswrapper[4764]: E0309 13:43:35.426128 4764 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod85e5a081_f0bc_457a_a3f0_9f9152f942c3.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod85e5a081_f0bc_457a_a3f0_9f9152f942c3.slice/crio-d2d07148bb1bfe75a3a625df2d14aba687a70d2b7d04873d08b4da416d307e9c\": RecentStats: unable to find data in memory cache]" Mar 09 13:43:35 crc kubenswrapper[4764]: I0309 13:43:35.459195 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/2897dc65-e596-414b-b73e-172b0042b6cd-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"2897dc65-e596-414b-b73e-172b0042b6cd\") " pod="openstack/ceilometer-0" Mar 09 13:43:35 crc kubenswrapper[4764]: I0309 13:43:35.459653 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"kube-api-access-dkrrk\" (UniqueName: \"kubernetes.io/projected/2897dc65-e596-414b-b73e-172b0042b6cd-kube-api-access-dkrrk\") pod \"ceilometer-0\" (UID: \"2897dc65-e596-414b-b73e-172b0042b6cd\") " pod="openstack/ceilometer-0" Mar 09 13:43:35 crc kubenswrapper[4764]: I0309 13:43:35.459686 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2897dc65-e596-414b-b73e-172b0042b6cd-config-data\") pod \"ceilometer-0\" (UID: \"2897dc65-e596-414b-b73e-172b0042b6cd\") " pod="openstack/ceilometer-0" Mar 09 13:43:35 crc kubenswrapper[4764]: I0309 13:43:35.459758 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2897dc65-e596-414b-b73e-172b0042b6cd-scripts\") pod \"ceilometer-0\" (UID: \"2897dc65-e596-414b-b73e-172b0042b6cd\") " pod="openstack/ceilometer-0" Mar 09 13:43:35 crc kubenswrapper[4764]: I0309 13:43:35.459837 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2897dc65-e596-414b-b73e-172b0042b6cd-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"2897dc65-e596-414b-b73e-172b0042b6cd\") " pod="openstack/ceilometer-0" Mar 09 13:43:35 crc kubenswrapper[4764]: I0309 13:43:35.459873 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2897dc65-e596-414b-b73e-172b0042b6cd-run-httpd\") pod \"ceilometer-0\" (UID: \"2897dc65-e596-414b-b73e-172b0042b6cd\") " pod="openstack/ceilometer-0" Mar 09 13:43:35 crc kubenswrapper[4764]: I0309 13:43:35.459890 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2897dc65-e596-414b-b73e-172b0042b6cd-combined-ca-bundle\") pod 
\"ceilometer-0\" (UID: \"2897dc65-e596-414b-b73e-172b0042b6cd\") " pod="openstack/ceilometer-0" Mar 09 13:43:35 crc kubenswrapper[4764]: I0309 13:43:35.459953 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2897dc65-e596-414b-b73e-172b0042b6cd-log-httpd\") pod \"ceilometer-0\" (UID: \"2897dc65-e596-414b-b73e-172b0042b6cd\") " pod="openstack/ceilometer-0" Mar 09 13:43:35 crc kubenswrapper[4764]: I0309 13:43:35.561480 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2897dc65-e596-414b-b73e-172b0042b6cd-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"2897dc65-e596-414b-b73e-172b0042b6cd\") " pod="openstack/ceilometer-0" Mar 09 13:43:35 crc kubenswrapper[4764]: I0309 13:43:35.561531 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2897dc65-e596-414b-b73e-172b0042b6cd-run-httpd\") pod \"ceilometer-0\" (UID: \"2897dc65-e596-414b-b73e-172b0042b6cd\") " pod="openstack/ceilometer-0" Mar 09 13:43:35 crc kubenswrapper[4764]: I0309 13:43:35.561579 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2897dc65-e596-414b-b73e-172b0042b6cd-log-httpd\") pod \"ceilometer-0\" (UID: \"2897dc65-e596-414b-b73e-172b0042b6cd\") " pod="openstack/ceilometer-0" Mar 09 13:43:35 crc kubenswrapper[4764]: I0309 13:43:35.561683 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/2897dc65-e596-414b-b73e-172b0042b6cd-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"2897dc65-e596-414b-b73e-172b0042b6cd\") " pod="openstack/ceilometer-0" Mar 09 13:43:35 crc kubenswrapper[4764]: I0309 13:43:35.561738 4764 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-dkrrk\" (UniqueName: \"kubernetes.io/projected/2897dc65-e596-414b-b73e-172b0042b6cd-kube-api-access-dkrrk\") pod \"ceilometer-0\" (UID: \"2897dc65-e596-414b-b73e-172b0042b6cd\") " pod="openstack/ceilometer-0" Mar 09 13:43:35 crc kubenswrapper[4764]: I0309 13:43:35.561769 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2897dc65-e596-414b-b73e-172b0042b6cd-config-data\") pod \"ceilometer-0\" (UID: \"2897dc65-e596-414b-b73e-172b0042b6cd\") " pod="openstack/ceilometer-0" Mar 09 13:43:35 crc kubenswrapper[4764]: I0309 13:43:35.561842 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2897dc65-e596-414b-b73e-172b0042b6cd-scripts\") pod \"ceilometer-0\" (UID: \"2897dc65-e596-414b-b73e-172b0042b6cd\") " pod="openstack/ceilometer-0" Mar 09 13:43:35 crc kubenswrapper[4764]: I0309 13:43:35.561898 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2897dc65-e596-414b-b73e-172b0042b6cd-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"2897dc65-e596-414b-b73e-172b0042b6cd\") " pod="openstack/ceilometer-0" Mar 09 13:43:35 crc kubenswrapper[4764]: I0309 13:43:35.563398 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2897dc65-e596-414b-b73e-172b0042b6cd-run-httpd\") pod \"ceilometer-0\" (UID: \"2897dc65-e596-414b-b73e-172b0042b6cd\") " pod="openstack/ceilometer-0" Mar 09 13:43:35 crc kubenswrapper[4764]: I0309 13:43:35.563389 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2897dc65-e596-414b-b73e-172b0042b6cd-log-httpd\") pod \"ceilometer-0\" (UID: \"2897dc65-e596-414b-b73e-172b0042b6cd\") " 
pod="openstack/ceilometer-0" Mar 09 13:43:35 crc kubenswrapper[4764]: I0309 13:43:35.566084 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2897dc65-e596-414b-b73e-172b0042b6cd-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"2897dc65-e596-414b-b73e-172b0042b6cd\") " pod="openstack/ceilometer-0" Mar 09 13:43:35 crc kubenswrapper[4764]: I0309 13:43:35.567401 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2897dc65-e596-414b-b73e-172b0042b6cd-config-data\") pod \"ceilometer-0\" (UID: \"2897dc65-e596-414b-b73e-172b0042b6cd\") " pod="openstack/ceilometer-0" Mar 09 13:43:35 crc kubenswrapper[4764]: I0309 13:43:35.569259 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2897dc65-e596-414b-b73e-172b0042b6cd-scripts\") pod \"ceilometer-0\" (UID: \"2897dc65-e596-414b-b73e-172b0042b6cd\") " pod="openstack/ceilometer-0" Mar 09 13:43:35 crc kubenswrapper[4764]: I0309 13:43:35.577562 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="85e5a081-f0bc-457a-a3f0-9f9152f942c3" path="/var/lib/kubelet/pods/85e5a081-f0bc-457a-a3f0-9f9152f942c3/volumes" Mar 09 13:43:35 crc kubenswrapper[4764]: I0309 13:43:35.583958 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/2897dc65-e596-414b-b73e-172b0042b6cd-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"2897dc65-e596-414b-b73e-172b0042b6cd\") " pod="openstack/ceilometer-0" Mar 09 13:43:35 crc kubenswrapper[4764]: I0309 13:43:35.584409 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2897dc65-e596-414b-b73e-172b0042b6cd-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"2897dc65-e596-414b-b73e-172b0042b6cd\") " 
pod="openstack/ceilometer-0" Mar 09 13:43:35 crc kubenswrapper[4764]: I0309 13:43:35.602401 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dkrrk\" (UniqueName: \"kubernetes.io/projected/2897dc65-e596-414b-b73e-172b0042b6cd-kube-api-access-dkrrk\") pod \"ceilometer-0\" (UID: \"2897dc65-e596-414b-b73e-172b0042b6cd\") " pod="openstack/ceilometer-0" Mar 09 13:43:35 crc kubenswrapper[4764]: I0309 13:43:35.714474 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 09 13:43:35 crc kubenswrapper[4764]: I0309 13:43:35.817588 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 09 13:43:36 crc kubenswrapper[4764]: I0309 13:43:36.230348 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 09 13:43:36 crc kubenswrapper[4764]: W0309 13:43:36.234867 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2897dc65_e596_414b_b73e_172b0042b6cd.slice/crio-515481ed93c967d04868f361f4da1be6b976ca58590cd914664e6651afcae34c WatchSource:0}: Error finding container 515481ed93c967d04868f361f4da1be6b976ca58590cd914664e6651afcae34c: Status 404 returned error can't find the container with id 515481ed93c967d04868f361f4da1be6b976ca58590cd914664e6651afcae34c Mar 09 13:43:36 crc kubenswrapper[4764]: I0309 13:43:36.299155 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2897dc65-e596-414b-b73e-172b0042b6cd","Type":"ContainerStarted","Data":"515481ed93c967d04868f361f4da1be6b976ca58590cd914664e6651afcae34c"} Mar 09 13:43:37 crc kubenswrapper[4764]: I0309 13:43:37.219389 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 09 13:43:37 crc kubenswrapper[4764]: I0309 13:43:37.307221 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/15e658d7-b575-4e5a-a0f2-3d1adcc41cc0-combined-ca-bundle\") pod \"15e658d7-b575-4e5a-a0f2-3d1adcc41cc0\" (UID: \"15e658d7-b575-4e5a-a0f2-3d1adcc41cc0\") " Mar 09 13:43:37 crc kubenswrapper[4764]: I0309 13:43:37.307305 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pfl66\" (UniqueName: \"kubernetes.io/projected/15e658d7-b575-4e5a-a0f2-3d1adcc41cc0-kube-api-access-pfl66\") pod \"15e658d7-b575-4e5a-a0f2-3d1adcc41cc0\" (UID: \"15e658d7-b575-4e5a-a0f2-3d1adcc41cc0\") " Mar 09 13:43:37 crc kubenswrapper[4764]: I0309 13:43:37.307338 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/15e658d7-b575-4e5a-a0f2-3d1adcc41cc0-logs\") pod \"15e658d7-b575-4e5a-a0f2-3d1adcc41cc0\" (UID: \"15e658d7-b575-4e5a-a0f2-3d1adcc41cc0\") " Mar 09 13:43:37 crc kubenswrapper[4764]: I0309 13:43:37.308448 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/15e658d7-b575-4e5a-a0f2-3d1adcc41cc0-logs" (OuterVolumeSpecName: "logs") pod "15e658d7-b575-4e5a-a0f2-3d1adcc41cc0" (UID: "15e658d7-b575-4e5a-a0f2-3d1adcc41cc0"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 13:43:37 crc kubenswrapper[4764]: I0309 13:43:37.312101 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/15e658d7-b575-4e5a-a0f2-3d1adcc41cc0-config-data\") pod \"15e658d7-b575-4e5a-a0f2-3d1adcc41cc0\" (UID: \"15e658d7-b575-4e5a-a0f2-3d1adcc41cc0\") " Mar 09 13:43:37 crc kubenswrapper[4764]: I0309 13:43:37.314078 4764 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/15e658d7-b575-4e5a-a0f2-3d1adcc41cc0-logs\") on node \"crc\" DevicePath \"\"" Mar 09 13:43:37 crc kubenswrapper[4764]: I0309 13:43:37.332981 4764 generic.go:334] "Generic (PLEG): container finished" podID="15e658d7-b575-4e5a-a0f2-3d1adcc41cc0" containerID="e8c68c03da7f4747913f3409f5ace642ba6a37cb32da7a694e3af908791475c8" exitCode=0 Mar 09 13:43:37 crc kubenswrapper[4764]: I0309 13:43:37.333059 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"15e658d7-b575-4e5a-a0f2-3d1adcc41cc0","Type":"ContainerDied","Data":"e8c68c03da7f4747913f3409f5ace642ba6a37cb32da7a694e3af908791475c8"} Mar 09 13:43:37 crc kubenswrapper[4764]: I0309 13:43:37.333093 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"15e658d7-b575-4e5a-a0f2-3d1adcc41cc0","Type":"ContainerDied","Data":"6553270ae008524ac846200ba9108d557920af3b1024ead659efdad57728b5bc"} Mar 09 13:43:37 crc kubenswrapper[4764]: I0309 13:43:37.333116 4764 scope.go:117] "RemoveContainer" containerID="e8c68c03da7f4747913f3409f5ace642ba6a37cb32da7a694e3af908791475c8" Mar 09 13:43:37 crc kubenswrapper[4764]: I0309 13:43:37.333281 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 09 13:43:37 crc kubenswrapper[4764]: I0309 13:43:37.341264 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/15e658d7-b575-4e5a-a0f2-3d1adcc41cc0-kube-api-access-pfl66" (OuterVolumeSpecName: "kube-api-access-pfl66") pod "15e658d7-b575-4e5a-a0f2-3d1adcc41cc0" (UID: "15e658d7-b575-4e5a-a0f2-3d1adcc41cc0"). InnerVolumeSpecName "kube-api-access-pfl66". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:43:37 crc kubenswrapper[4764]: I0309 13:43:37.350298 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2897dc65-e596-414b-b73e-172b0042b6cd","Type":"ContainerStarted","Data":"6aba9d2ac154d93bccd8c03a964b0ce43be5d47c4542ac63eafc38ff3e190148"} Mar 09 13:43:37 crc kubenswrapper[4764]: I0309 13:43:37.359108 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/15e658d7-b575-4e5a-a0f2-3d1adcc41cc0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "15e658d7-b575-4e5a-a0f2-3d1adcc41cc0" (UID: "15e658d7-b575-4e5a-a0f2-3d1adcc41cc0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:43:37 crc kubenswrapper[4764]: I0309 13:43:37.383093 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/15e658d7-b575-4e5a-a0f2-3d1adcc41cc0-config-data" (OuterVolumeSpecName: "config-data") pod "15e658d7-b575-4e5a-a0f2-3d1adcc41cc0" (UID: "15e658d7-b575-4e5a-a0f2-3d1adcc41cc0"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:43:37 crc kubenswrapper[4764]: I0309 13:43:37.419108 4764 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/15e658d7-b575-4e5a-a0f2-3d1adcc41cc0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 13:43:37 crc kubenswrapper[4764]: I0309 13:43:37.419161 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pfl66\" (UniqueName: \"kubernetes.io/projected/15e658d7-b575-4e5a-a0f2-3d1adcc41cc0-kube-api-access-pfl66\") on node \"crc\" DevicePath \"\"" Mar 09 13:43:37 crc kubenswrapper[4764]: I0309 13:43:37.419174 4764 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/15e658d7-b575-4e5a-a0f2-3d1adcc41cc0-config-data\") on node \"crc\" DevicePath \"\"" Mar 09 13:43:37 crc kubenswrapper[4764]: I0309 13:43:37.455566 4764 scope.go:117] "RemoveContainer" containerID="62f7feefecbbe169538a57f544a5a783ae3f5d36a7e66415eb161a48e16ae91a" Mar 09 13:43:37 crc kubenswrapper[4764]: I0309 13:43:37.480146 4764 scope.go:117] "RemoveContainer" containerID="e8c68c03da7f4747913f3409f5ace642ba6a37cb32da7a694e3af908791475c8" Mar 09 13:43:37 crc kubenswrapper[4764]: E0309 13:43:37.488179 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e8c68c03da7f4747913f3409f5ace642ba6a37cb32da7a694e3af908791475c8\": container with ID starting with e8c68c03da7f4747913f3409f5ace642ba6a37cb32da7a694e3af908791475c8 not found: ID does not exist" containerID="e8c68c03da7f4747913f3409f5ace642ba6a37cb32da7a694e3af908791475c8" Mar 09 13:43:37 crc kubenswrapper[4764]: I0309 13:43:37.488273 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e8c68c03da7f4747913f3409f5ace642ba6a37cb32da7a694e3af908791475c8"} err="failed to get container status 
\"e8c68c03da7f4747913f3409f5ace642ba6a37cb32da7a694e3af908791475c8\": rpc error: code = NotFound desc = could not find container \"e8c68c03da7f4747913f3409f5ace642ba6a37cb32da7a694e3af908791475c8\": container with ID starting with e8c68c03da7f4747913f3409f5ace642ba6a37cb32da7a694e3af908791475c8 not found: ID does not exist" Mar 09 13:43:37 crc kubenswrapper[4764]: I0309 13:43:37.488315 4764 scope.go:117] "RemoveContainer" containerID="62f7feefecbbe169538a57f544a5a783ae3f5d36a7e66415eb161a48e16ae91a" Mar 09 13:43:37 crc kubenswrapper[4764]: E0309 13:43:37.489150 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"62f7feefecbbe169538a57f544a5a783ae3f5d36a7e66415eb161a48e16ae91a\": container with ID starting with 62f7feefecbbe169538a57f544a5a783ae3f5d36a7e66415eb161a48e16ae91a not found: ID does not exist" containerID="62f7feefecbbe169538a57f544a5a783ae3f5d36a7e66415eb161a48e16ae91a" Mar 09 13:43:37 crc kubenswrapper[4764]: I0309 13:43:37.489175 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"62f7feefecbbe169538a57f544a5a783ae3f5d36a7e66415eb161a48e16ae91a"} err="failed to get container status \"62f7feefecbbe169538a57f544a5a783ae3f5d36a7e66415eb161a48e16ae91a\": rpc error: code = NotFound desc = could not find container \"62f7feefecbbe169538a57f544a5a783ae3f5d36a7e66415eb161a48e16ae91a\": container with ID starting with 62f7feefecbbe169538a57f544a5a783ae3f5d36a7e66415eb161a48e16ae91a not found: ID does not exist" Mar 09 13:43:37 crc kubenswrapper[4764]: I0309 13:43:37.666566 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 09 13:43:37 crc kubenswrapper[4764]: I0309 13:43:37.683874 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Mar 09 13:43:37 crc kubenswrapper[4764]: I0309 13:43:37.690334 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] 
Mar 09 13:43:37 crc kubenswrapper[4764]: E0309 13:43:37.691158 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="15e658d7-b575-4e5a-a0f2-3d1adcc41cc0" containerName="nova-api-api" Mar 09 13:43:37 crc kubenswrapper[4764]: I0309 13:43:37.691178 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="15e658d7-b575-4e5a-a0f2-3d1adcc41cc0" containerName="nova-api-api" Mar 09 13:43:37 crc kubenswrapper[4764]: E0309 13:43:37.691210 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="15e658d7-b575-4e5a-a0f2-3d1adcc41cc0" containerName="nova-api-log" Mar 09 13:43:37 crc kubenswrapper[4764]: I0309 13:43:37.691217 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="15e658d7-b575-4e5a-a0f2-3d1adcc41cc0" containerName="nova-api-log" Mar 09 13:43:37 crc kubenswrapper[4764]: I0309 13:43:37.691393 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="15e658d7-b575-4e5a-a0f2-3d1adcc41cc0" containerName="nova-api-log" Mar 09 13:43:37 crc kubenswrapper[4764]: I0309 13:43:37.691410 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="15e658d7-b575-4e5a-a0f2-3d1adcc41cc0" containerName="nova-api-api" Mar 09 13:43:37 crc kubenswrapper[4764]: I0309 13:43:37.692361 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 09 13:43:37 crc kubenswrapper[4764]: I0309 13:43:37.697266 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Mar 09 13:43:37 crc kubenswrapper[4764]: I0309 13:43:37.697542 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Mar 09 13:43:37 crc kubenswrapper[4764]: I0309 13:43:37.697736 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Mar 09 13:43:37 crc kubenswrapper[4764]: I0309 13:43:37.706361 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 09 13:43:37 crc kubenswrapper[4764]: I0309 13:43:37.833522 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7wznv\" (UniqueName: \"kubernetes.io/projected/cf561cd8-b441-4efe-8f37-9c925d1f7aa9-kube-api-access-7wznv\") pod \"nova-api-0\" (UID: \"cf561cd8-b441-4efe-8f37-9c925d1f7aa9\") " pod="openstack/nova-api-0" Mar 09 13:43:37 crc kubenswrapper[4764]: I0309 13:43:37.833637 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/cf561cd8-b441-4efe-8f37-9c925d1f7aa9-internal-tls-certs\") pod \"nova-api-0\" (UID: \"cf561cd8-b441-4efe-8f37-9c925d1f7aa9\") " pod="openstack/nova-api-0" Mar 09 13:43:37 crc kubenswrapper[4764]: I0309 13:43:37.833719 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cf561cd8-b441-4efe-8f37-9c925d1f7aa9-logs\") pod \"nova-api-0\" (UID: \"cf561cd8-b441-4efe-8f37-9c925d1f7aa9\") " pod="openstack/nova-api-0" Mar 09 13:43:37 crc kubenswrapper[4764]: I0309 13:43:37.833848 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/cf561cd8-b441-4efe-8f37-9c925d1f7aa9-config-data\") pod \"nova-api-0\" (UID: \"cf561cd8-b441-4efe-8f37-9c925d1f7aa9\") " pod="openstack/nova-api-0" Mar 09 13:43:37 crc kubenswrapper[4764]: I0309 13:43:37.833881 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf561cd8-b441-4efe-8f37-9c925d1f7aa9-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"cf561cd8-b441-4efe-8f37-9c925d1f7aa9\") " pod="openstack/nova-api-0" Mar 09 13:43:37 crc kubenswrapper[4764]: I0309 13:43:37.833955 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/cf561cd8-b441-4efe-8f37-9c925d1f7aa9-public-tls-certs\") pod \"nova-api-0\" (UID: \"cf561cd8-b441-4efe-8f37-9c925d1f7aa9\") " pod="openstack/nova-api-0" Mar 09 13:43:37 crc kubenswrapper[4764]: I0309 13:43:37.860311 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Mar 09 13:43:37 crc kubenswrapper[4764]: I0309 13:43:37.882450 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Mar 09 13:43:37 crc kubenswrapper[4764]: I0309 13:43:37.936501 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7wznv\" (UniqueName: \"kubernetes.io/projected/cf561cd8-b441-4efe-8f37-9c925d1f7aa9-kube-api-access-7wznv\") pod \"nova-api-0\" (UID: \"cf561cd8-b441-4efe-8f37-9c925d1f7aa9\") " pod="openstack/nova-api-0" Mar 09 13:43:37 crc kubenswrapper[4764]: I0309 13:43:37.936626 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/cf561cd8-b441-4efe-8f37-9c925d1f7aa9-internal-tls-certs\") pod \"nova-api-0\" (UID: \"cf561cd8-b441-4efe-8f37-9c925d1f7aa9\") " 
pod="openstack/nova-api-0" Mar 09 13:43:37 crc kubenswrapper[4764]: I0309 13:43:37.936722 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cf561cd8-b441-4efe-8f37-9c925d1f7aa9-logs\") pod \"nova-api-0\" (UID: \"cf561cd8-b441-4efe-8f37-9c925d1f7aa9\") " pod="openstack/nova-api-0" Mar 09 13:43:37 crc kubenswrapper[4764]: I0309 13:43:37.936929 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cf561cd8-b441-4efe-8f37-9c925d1f7aa9-config-data\") pod \"nova-api-0\" (UID: \"cf561cd8-b441-4efe-8f37-9c925d1f7aa9\") " pod="openstack/nova-api-0" Mar 09 13:43:37 crc kubenswrapper[4764]: I0309 13:43:37.936962 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf561cd8-b441-4efe-8f37-9c925d1f7aa9-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"cf561cd8-b441-4efe-8f37-9c925d1f7aa9\") " pod="openstack/nova-api-0" Mar 09 13:43:37 crc kubenswrapper[4764]: I0309 13:43:37.937065 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/cf561cd8-b441-4efe-8f37-9c925d1f7aa9-public-tls-certs\") pod \"nova-api-0\" (UID: \"cf561cd8-b441-4efe-8f37-9c925d1f7aa9\") " pod="openstack/nova-api-0" Mar 09 13:43:37 crc kubenswrapper[4764]: I0309 13:43:37.937353 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cf561cd8-b441-4efe-8f37-9c925d1f7aa9-logs\") pod \"nova-api-0\" (UID: \"cf561cd8-b441-4efe-8f37-9c925d1f7aa9\") " pod="openstack/nova-api-0" Mar 09 13:43:37 crc kubenswrapper[4764]: I0309 13:43:37.942067 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/cf561cd8-b441-4efe-8f37-9c925d1f7aa9-public-tls-certs\") pod 
\"nova-api-0\" (UID: \"cf561cd8-b441-4efe-8f37-9c925d1f7aa9\") " pod="openstack/nova-api-0" Mar 09 13:43:37 crc kubenswrapper[4764]: I0309 13:43:37.942264 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/cf561cd8-b441-4efe-8f37-9c925d1f7aa9-internal-tls-certs\") pod \"nova-api-0\" (UID: \"cf561cd8-b441-4efe-8f37-9c925d1f7aa9\") " pod="openstack/nova-api-0" Mar 09 13:43:37 crc kubenswrapper[4764]: I0309 13:43:37.943520 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf561cd8-b441-4efe-8f37-9c925d1f7aa9-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"cf561cd8-b441-4efe-8f37-9c925d1f7aa9\") " pod="openstack/nova-api-0" Mar 09 13:43:37 crc kubenswrapper[4764]: I0309 13:43:37.948126 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cf561cd8-b441-4efe-8f37-9c925d1f7aa9-config-data\") pod \"nova-api-0\" (UID: \"cf561cd8-b441-4efe-8f37-9c925d1f7aa9\") " pod="openstack/nova-api-0" Mar 09 13:43:37 crc kubenswrapper[4764]: I0309 13:43:37.959125 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7wznv\" (UniqueName: \"kubernetes.io/projected/cf561cd8-b441-4efe-8f37-9c925d1f7aa9-kube-api-access-7wznv\") pod \"nova-api-0\" (UID: \"cf561cd8-b441-4efe-8f37-9c925d1f7aa9\") " pod="openstack/nova-api-0" Mar 09 13:43:38 crc kubenswrapper[4764]: I0309 13:43:38.043862 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 09 13:43:38 crc kubenswrapper[4764]: I0309 13:43:38.374073 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2897dc65-e596-414b-b73e-172b0042b6cd","Type":"ContainerStarted","Data":"b1056172a7b512fb661d17bf402edffaf8b9a666bfde122f2c185a7c23090848"} Mar 09 13:43:38 crc kubenswrapper[4764]: I0309 13:43:38.405514 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Mar 09 13:43:38 crc kubenswrapper[4764]: I0309 13:43:38.619939 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-q8h4m"] Mar 09 13:43:38 crc kubenswrapper[4764]: I0309 13:43:38.621425 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-q8h4m" Mar 09 13:43:38 crc kubenswrapper[4764]: I0309 13:43:38.626195 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data" Mar 09 13:43:38 crc kubenswrapper[4764]: I0309 13:43:38.632200 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts" Mar 09 13:43:38 crc kubenswrapper[4764]: I0309 13:43:38.636351 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-q8h4m"] Mar 09 13:43:38 crc kubenswrapper[4764]: I0309 13:43:38.730085 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 09 13:43:38 crc kubenswrapper[4764]: I0309 13:43:38.776445 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d98526d5-8eaa-44a7-a25d-662a4fc8758b-config-data\") pod \"nova-cell1-cell-mapping-q8h4m\" (UID: \"d98526d5-8eaa-44a7-a25d-662a4fc8758b\") " pod="openstack/nova-cell1-cell-mapping-q8h4m" Mar 09 13:43:38 crc kubenswrapper[4764]: I0309 13:43:38.776546 4764 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d98526d5-8eaa-44a7-a25d-662a4fc8758b-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-q8h4m\" (UID: \"d98526d5-8eaa-44a7-a25d-662a4fc8758b\") " pod="openstack/nova-cell1-cell-mapping-q8h4m" Mar 09 13:43:38 crc kubenswrapper[4764]: I0309 13:43:38.776587 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d98526d5-8eaa-44a7-a25d-662a4fc8758b-scripts\") pod \"nova-cell1-cell-mapping-q8h4m\" (UID: \"d98526d5-8eaa-44a7-a25d-662a4fc8758b\") " pod="openstack/nova-cell1-cell-mapping-q8h4m" Mar 09 13:43:38 crc kubenswrapper[4764]: I0309 13:43:38.776659 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vrhwm\" (UniqueName: \"kubernetes.io/projected/d98526d5-8eaa-44a7-a25d-662a4fc8758b-kube-api-access-vrhwm\") pod \"nova-cell1-cell-mapping-q8h4m\" (UID: \"d98526d5-8eaa-44a7-a25d-662a4fc8758b\") " pod="openstack/nova-cell1-cell-mapping-q8h4m" Mar 09 13:43:38 crc kubenswrapper[4764]: I0309 13:43:38.878791 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d98526d5-8eaa-44a7-a25d-662a4fc8758b-config-data\") pod \"nova-cell1-cell-mapping-q8h4m\" (UID: \"d98526d5-8eaa-44a7-a25d-662a4fc8758b\") " pod="openstack/nova-cell1-cell-mapping-q8h4m" Mar 09 13:43:38 crc kubenswrapper[4764]: I0309 13:43:38.878879 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d98526d5-8eaa-44a7-a25d-662a4fc8758b-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-q8h4m\" (UID: \"d98526d5-8eaa-44a7-a25d-662a4fc8758b\") " pod="openstack/nova-cell1-cell-mapping-q8h4m" Mar 09 13:43:38 crc kubenswrapper[4764]: I0309 
13:43:38.878916 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d98526d5-8eaa-44a7-a25d-662a4fc8758b-scripts\") pod \"nova-cell1-cell-mapping-q8h4m\" (UID: \"d98526d5-8eaa-44a7-a25d-662a4fc8758b\") " pod="openstack/nova-cell1-cell-mapping-q8h4m" Mar 09 13:43:38 crc kubenswrapper[4764]: I0309 13:43:38.878987 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vrhwm\" (UniqueName: \"kubernetes.io/projected/d98526d5-8eaa-44a7-a25d-662a4fc8758b-kube-api-access-vrhwm\") pod \"nova-cell1-cell-mapping-q8h4m\" (UID: \"d98526d5-8eaa-44a7-a25d-662a4fc8758b\") " pod="openstack/nova-cell1-cell-mapping-q8h4m" Mar 09 13:43:38 crc kubenswrapper[4764]: I0309 13:43:38.885891 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d98526d5-8eaa-44a7-a25d-662a4fc8758b-config-data\") pod \"nova-cell1-cell-mapping-q8h4m\" (UID: \"d98526d5-8eaa-44a7-a25d-662a4fc8758b\") " pod="openstack/nova-cell1-cell-mapping-q8h4m" Mar 09 13:43:38 crc kubenswrapper[4764]: I0309 13:43:38.886257 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d98526d5-8eaa-44a7-a25d-662a4fc8758b-scripts\") pod \"nova-cell1-cell-mapping-q8h4m\" (UID: \"d98526d5-8eaa-44a7-a25d-662a4fc8758b\") " pod="openstack/nova-cell1-cell-mapping-q8h4m" Mar 09 13:43:38 crc kubenswrapper[4764]: I0309 13:43:38.886900 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d98526d5-8eaa-44a7-a25d-662a4fc8758b-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-q8h4m\" (UID: \"d98526d5-8eaa-44a7-a25d-662a4fc8758b\") " pod="openstack/nova-cell1-cell-mapping-q8h4m" Mar 09 13:43:38 crc kubenswrapper[4764]: I0309 13:43:38.900202 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-vrhwm\" (UniqueName: \"kubernetes.io/projected/d98526d5-8eaa-44a7-a25d-662a4fc8758b-kube-api-access-vrhwm\") pod \"nova-cell1-cell-mapping-q8h4m\" (UID: \"d98526d5-8eaa-44a7-a25d-662a4fc8758b\") " pod="openstack/nova-cell1-cell-mapping-q8h4m" Mar 09 13:43:38 crc kubenswrapper[4764]: I0309 13:43:38.957128 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-q8h4m" Mar 09 13:43:39 crc kubenswrapper[4764]: I0309 13:43:39.439727 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2897dc65-e596-414b-b73e-172b0042b6cd","Type":"ContainerStarted","Data":"550d0daaa1bf282375c8ca306a313e035fcd02974a76611ae4861ffff324f8a9"} Mar 09 13:43:39 crc kubenswrapper[4764]: I0309 13:43:39.441742 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"cf561cd8-b441-4efe-8f37-9c925d1f7aa9","Type":"ContainerStarted","Data":"8800be21f6e5e98b3173a95394f9eca30f4eb9aa26729be4bf34f42eed9c86fc"} Mar 09 13:43:39 crc kubenswrapper[4764]: I0309 13:43:39.441843 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"cf561cd8-b441-4efe-8f37-9c925d1f7aa9","Type":"ContainerStarted","Data":"8364f048c74ee582bf3a1b604c7622e3c5a53940323ec1e7e02d222016811233"} Mar 09 13:43:39 crc kubenswrapper[4764]: I0309 13:43:39.441862 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"cf561cd8-b441-4efe-8f37-9c925d1f7aa9","Type":"ContainerStarted","Data":"fa48d3e5b6386cde541f520160503cb71e6ebccb4522e2395f1e89b01ebb551e"} Mar 09 13:43:39 crc kubenswrapper[4764]: I0309 13:43:39.471311 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.4712800169999998 podStartE2EDuration="2.471280017s" podCreationTimestamp="2026-03-09 13:43:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:43:39.463771086 +0000 UTC m=+1374.713943014" watchObservedRunningTime="2026-03-09 13:43:39.471280017 +0000 UTC m=+1374.721451925" Mar 09 13:43:39 crc kubenswrapper[4764]: I0309 13:43:39.502522 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-q8h4m"] Mar 09 13:43:39 crc kubenswrapper[4764]: I0309 13:43:39.573899 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="15e658d7-b575-4e5a-a0f2-3d1adcc41cc0" path="/var/lib/kubelet/pods/15e658d7-b575-4e5a-a0f2-3d1adcc41cc0/volumes" Mar 09 13:43:40 crc kubenswrapper[4764]: I0309 13:43:40.454627 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-q8h4m" event={"ID":"d98526d5-8eaa-44a7-a25d-662a4fc8758b","Type":"ContainerStarted","Data":"858a961159a6d1cec6dd22f3429517714ebd1b6a16fd98b28235cb60fc1a94ee"} Mar 09 13:43:40 crc kubenswrapper[4764]: I0309 13:43:40.455154 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-q8h4m" event={"ID":"d98526d5-8eaa-44a7-a25d-662a4fc8758b","Type":"ContainerStarted","Data":"0ad74f9523811644a338c930f4e1d9354e736d92d5564d6b6688e12a4bedac66"} Mar 09 13:43:40 crc kubenswrapper[4764]: I0309 13:43:40.479197 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-q8h4m" podStartSLOduration=2.4791733049999998 podStartE2EDuration="2.479173305s" podCreationTimestamp="2026-03-09 13:43:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:43:40.477564343 +0000 UTC m=+1375.727736251" watchObservedRunningTime="2026-03-09 13:43:40.479173305 +0000 UTC m=+1375.729345213" Mar 09 13:43:40 crc kubenswrapper[4764]: I0309 13:43:40.974090 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack/dnsmasq-dns-68d4b6d797-bwpkr" Mar 09 13:43:41 crc kubenswrapper[4764]: I0309 13:43:41.060484 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8b8cf6657-gsgp2"] Mar 09 13:43:41 crc kubenswrapper[4764]: I0309 13:43:41.062002 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-8b8cf6657-gsgp2" podUID="dd806e2d-2675-448c-96f5-2440c2e243f2" containerName="dnsmasq-dns" containerID="cri-o://d735f180db949f6ebbd416610517dc4e197003ab27c3b1031338a802031ce748" gracePeriod=10 Mar 09 13:43:41 crc kubenswrapper[4764]: I0309 13:43:41.469223 4764 generic.go:334] "Generic (PLEG): container finished" podID="dd806e2d-2675-448c-96f5-2440c2e243f2" containerID="d735f180db949f6ebbd416610517dc4e197003ab27c3b1031338a802031ce748" exitCode=0 Mar 09 13:43:41 crc kubenswrapper[4764]: I0309 13:43:41.469303 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8b8cf6657-gsgp2" event={"ID":"dd806e2d-2675-448c-96f5-2440c2e243f2","Type":"ContainerDied","Data":"d735f180db949f6ebbd416610517dc4e197003ab27c3b1031338a802031ce748"} Mar 09 13:43:41 crc kubenswrapper[4764]: I0309 13:43:41.473939 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="2897dc65-e596-414b-b73e-172b0042b6cd" containerName="ceilometer-central-agent" containerID="cri-o://6aba9d2ac154d93bccd8c03a964b0ce43be5d47c4542ac63eafc38ff3e190148" gracePeriod=30 Mar 09 13:43:41 crc kubenswrapper[4764]: I0309 13:43:41.474114 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2897dc65-e596-414b-b73e-172b0042b6cd","Type":"ContainerStarted","Data":"b7616cfa2d6717183764d5b731f3e4c011f99816e53a80c47d3844cc65c6452d"} Mar 09 13:43:41 crc kubenswrapper[4764]: I0309 13:43:41.474174 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 09 13:43:41 crc kubenswrapper[4764]: I0309 
13:43:41.474661 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="2897dc65-e596-414b-b73e-172b0042b6cd" containerName="proxy-httpd" containerID="cri-o://b7616cfa2d6717183764d5b731f3e4c011f99816e53a80c47d3844cc65c6452d" gracePeriod=30 Mar 09 13:43:41 crc kubenswrapper[4764]: I0309 13:43:41.474715 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="2897dc65-e596-414b-b73e-172b0042b6cd" containerName="sg-core" containerID="cri-o://550d0daaa1bf282375c8ca306a313e035fcd02974a76611ae4861ffff324f8a9" gracePeriod=30 Mar 09 13:43:41 crc kubenswrapper[4764]: I0309 13:43:41.474776 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="2897dc65-e596-414b-b73e-172b0042b6cd" containerName="ceilometer-notification-agent" containerID="cri-o://b1056172a7b512fb661d17bf402edffaf8b9a666bfde122f2c185a7c23090848" gracePeriod=30 Mar 09 13:43:41 crc kubenswrapper[4764]: I0309 13:43:41.519611 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.019439885 podStartE2EDuration="6.519579023s" podCreationTimestamp="2026-03-09 13:43:35 +0000 UTC" firstStartedPulling="2026-03-09 13:43:36.237448537 +0000 UTC m=+1371.487620445" lastFinishedPulling="2026-03-09 13:43:40.737587665 +0000 UTC m=+1375.987759583" observedRunningTime="2026-03-09 13:43:41.5067273 +0000 UTC m=+1376.756899228" watchObservedRunningTime="2026-03-09 13:43:41.519579023 +0000 UTC m=+1376.769750931" Mar 09 13:43:41 crc kubenswrapper[4764]: I0309 13:43:41.571507 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-8b8cf6657-gsgp2" Mar 09 13:43:41 crc kubenswrapper[4764]: I0309 13:43:41.575965 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dd806e2d-2675-448c-96f5-2440c2e243f2-config\") pod \"dd806e2d-2675-448c-96f5-2440c2e243f2\" (UID: \"dd806e2d-2675-448c-96f5-2440c2e243f2\") " Mar 09 13:43:41 crc kubenswrapper[4764]: I0309 13:43:41.576433 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/dd806e2d-2675-448c-96f5-2440c2e243f2-ovsdbserver-sb\") pod \"dd806e2d-2675-448c-96f5-2440c2e243f2\" (UID: \"dd806e2d-2675-448c-96f5-2440c2e243f2\") " Mar 09 13:43:41 crc kubenswrapper[4764]: I0309 13:43:41.576529 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/dd806e2d-2675-448c-96f5-2440c2e243f2-ovsdbserver-nb\") pod \"dd806e2d-2675-448c-96f5-2440c2e243f2\" (UID: \"dd806e2d-2675-448c-96f5-2440c2e243f2\") " Mar 09 13:43:41 crc kubenswrapper[4764]: I0309 13:43:41.576760 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4xl8t\" (UniqueName: \"kubernetes.io/projected/dd806e2d-2675-448c-96f5-2440c2e243f2-kube-api-access-4xl8t\") pod \"dd806e2d-2675-448c-96f5-2440c2e243f2\" (UID: \"dd806e2d-2675-448c-96f5-2440c2e243f2\") " Mar 09 13:43:41 crc kubenswrapper[4764]: I0309 13:43:41.576899 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/dd806e2d-2675-448c-96f5-2440c2e243f2-dns-svc\") pod \"dd806e2d-2675-448c-96f5-2440c2e243f2\" (UID: \"dd806e2d-2675-448c-96f5-2440c2e243f2\") " Mar 09 13:43:41 crc kubenswrapper[4764]: I0309 13:43:41.583585 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/dd806e2d-2675-448c-96f5-2440c2e243f2-kube-api-access-4xl8t" (OuterVolumeSpecName: "kube-api-access-4xl8t") pod "dd806e2d-2675-448c-96f5-2440c2e243f2" (UID: "dd806e2d-2675-448c-96f5-2440c2e243f2"). InnerVolumeSpecName "kube-api-access-4xl8t". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:43:41 crc kubenswrapper[4764]: I0309 13:43:41.659249 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dd806e2d-2675-448c-96f5-2440c2e243f2-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "dd806e2d-2675-448c-96f5-2440c2e243f2" (UID: "dd806e2d-2675-448c-96f5-2440c2e243f2"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:43:41 crc kubenswrapper[4764]: I0309 13:43:41.663187 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dd806e2d-2675-448c-96f5-2440c2e243f2-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "dd806e2d-2675-448c-96f5-2440c2e243f2" (UID: "dd806e2d-2675-448c-96f5-2440c2e243f2"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:43:41 crc kubenswrapper[4764]: I0309 13:43:41.666139 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dd806e2d-2675-448c-96f5-2440c2e243f2-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "dd806e2d-2675-448c-96f5-2440c2e243f2" (UID: "dd806e2d-2675-448c-96f5-2440c2e243f2"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:43:41 crc kubenswrapper[4764]: I0309 13:43:41.668628 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dd806e2d-2675-448c-96f5-2440c2e243f2-config" (OuterVolumeSpecName: "config") pod "dd806e2d-2675-448c-96f5-2440c2e243f2" (UID: "dd806e2d-2675-448c-96f5-2440c2e243f2"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:43:41 crc kubenswrapper[4764]: I0309 13:43:41.680727 4764 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/dd806e2d-2675-448c-96f5-2440c2e243f2-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 09 13:43:41 crc kubenswrapper[4764]: I0309 13:43:41.680776 4764 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/dd806e2d-2675-448c-96f5-2440c2e243f2-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 09 13:43:41 crc kubenswrapper[4764]: I0309 13:43:41.680788 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4xl8t\" (UniqueName: \"kubernetes.io/projected/dd806e2d-2675-448c-96f5-2440c2e243f2-kube-api-access-4xl8t\") on node \"crc\" DevicePath \"\"" Mar 09 13:43:41 crc kubenswrapper[4764]: I0309 13:43:41.680804 4764 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/dd806e2d-2675-448c-96f5-2440c2e243f2-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 09 13:43:41 crc kubenswrapper[4764]: I0309 13:43:41.680819 4764 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dd806e2d-2675-448c-96f5-2440c2e243f2-config\") on node \"crc\" DevicePath \"\"" Mar 09 13:43:42 crc kubenswrapper[4764]: I0309 13:43:42.491953 4764 generic.go:334] "Generic (PLEG): container finished" podID="2897dc65-e596-414b-b73e-172b0042b6cd" containerID="b7616cfa2d6717183764d5b731f3e4c011f99816e53a80c47d3844cc65c6452d" exitCode=0 Mar 09 13:43:42 crc kubenswrapper[4764]: I0309 13:43:42.492002 4764 generic.go:334] "Generic (PLEG): container finished" podID="2897dc65-e596-414b-b73e-172b0042b6cd" containerID="550d0daaa1bf282375c8ca306a313e035fcd02974a76611ae4861ffff324f8a9" exitCode=2 Mar 09 13:43:42 crc kubenswrapper[4764]: I0309 13:43:42.492009 4764 generic.go:334] "Generic (PLEG): 
container finished" podID="2897dc65-e596-414b-b73e-172b0042b6cd" containerID="b1056172a7b512fb661d17bf402edffaf8b9a666bfde122f2c185a7c23090848" exitCode=0 Mar 09 13:43:42 crc kubenswrapper[4764]: I0309 13:43:42.493135 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2897dc65-e596-414b-b73e-172b0042b6cd","Type":"ContainerDied","Data":"b7616cfa2d6717183764d5b731f3e4c011f99816e53a80c47d3844cc65c6452d"} Mar 09 13:43:42 crc kubenswrapper[4764]: I0309 13:43:42.493298 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2897dc65-e596-414b-b73e-172b0042b6cd","Type":"ContainerDied","Data":"550d0daaa1bf282375c8ca306a313e035fcd02974a76611ae4861ffff324f8a9"} Mar 09 13:43:42 crc kubenswrapper[4764]: I0309 13:43:42.493426 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2897dc65-e596-414b-b73e-172b0042b6cd","Type":"ContainerDied","Data":"b1056172a7b512fb661d17bf402edffaf8b9a666bfde122f2c185a7c23090848"} Mar 09 13:43:42 crc kubenswrapper[4764]: I0309 13:43:42.497280 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8b8cf6657-gsgp2" event={"ID":"dd806e2d-2675-448c-96f5-2440c2e243f2","Type":"ContainerDied","Data":"725b7755875ebd6a6866a71e5813a9d3f7def7b5fe8e24fd807f8fb4b2f4627a"} Mar 09 13:43:42 crc kubenswrapper[4764]: I0309 13:43:42.497362 4764 scope.go:117] "RemoveContainer" containerID="d735f180db949f6ebbd416610517dc4e197003ab27c3b1031338a802031ce748" Mar 09 13:43:42 crc kubenswrapper[4764]: I0309 13:43:42.497587 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-8b8cf6657-gsgp2" Mar 09 13:43:42 crc kubenswrapper[4764]: I0309 13:43:42.533872 4764 scope.go:117] "RemoveContainer" containerID="a48379ed6aa8003713adc5f3726b706529f649e8177a5d07e91f56612288af49" Mar 09 13:43:42 crc kubenswrapper[4764]: I0309 13:43:42.553303 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8b8cf6657-gsgp2"] Mar 09 13:43:42 crc kubenswrapper[4764]: I0309 13:43:42.566337 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-8b8cf6657-gsgp2"] Mar 09 13:43:43 crc kubenswrapper[4764]: I0309 13:43:43.571795 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dd806e2d-2675-448c-96f5-2440c2e243f2" path="/var/lib/kubelet/pods/dd806e2d-2675-448c-96f5-2440c2e243f2/volumes" Mar 09 13:43:44 crc kubenswrapper[4764]: I0309 13:43:44.362966 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 09 13:43:44 crc kubenswrapper[4764]: I0309 13:43:44.439318 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2897dc65-e596-414b-b73e-172b0042b6cd-config-data\") pod \"2897dc65-e596-414b-b73e-172b0042b6cd\" (UID: \"2897dc65-e596-414b-b73e-172b0042b6cd\") " Mar 09 13:43:44 crc kubenswrapper[4764]: I0309 13:43:44.439800 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2897dc65-e596-414b-b73e-172b0042b6cd-sg-core-conf-yaml\") pod \"2897dc65-e596-414b-b73e-172b0042b6cd\" (UID: \"2897dc65-e596-414b-b73e-172b0042b6cd\") " Mar 09 13:43:44 crc kubenswrapper[4764]: I0309 13:43:44.439835 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2897dc65-e596-414b-b73e-172b0042b6cd-scripts\") pod \"2897dc65-e596-414b-b73e-172b0042b6cd\" (UID: 
\"2897dc65-e596-414b-b73e-172b0042b6cd\") " Mar 09 13:43:44 crc kubenswrapper[4764]: I0309 13:43:44.439878 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/2897dc65-e596-414b-b73e-172b0042b6cd-ceilometer-tls-certs\") pod \"2897dc65-e596-414b-b73e-172b0042b6cd\" (UID: \"2897dc65-e596-414b-b73e-172b0042b6cd\") " Mar 09 13:43:44 crc kubenswrapper[4764]: I0309 13:43:44.439935 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2897dc65-e596-414b-b73e-172b0042b6cd-combined-ca-bundle\") pod \"2897dc65-e596-414b-b73e-172b0042b6cd\" (UID: \"2897dc65-e596-414b-b73e-172b0042b6cd\") " Mar 09 13:43:44 crc kubenswrapper[4764]: I0309 13:43:44.439983 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2897dc65-e596-414b-b73e-172b0042b6cd-log-httpd\") pod \"2897dc65-e596-414b-b73e-172b0042b6cd\" (UID: \"2897dc65-e596-414b-b73e-172b0042b6cd\") " Mar 09 13:43:44 crc kubenswrapper[4764]: I0309 13:43:44.440115 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dkrrk\" (UniqueName: \"kubernetes.io/projected/2897dc65-e596-414b-b73e-172b0042b6cd-kube-api-access-dkrrk\") pod \"2897dc65-e596-414b-b73e-172b0042b6cd\" (UID: \"2897dc65-e596-414b-b73e-172b0042b6cd\") " Mar 09 13:43:44 crc kubenswrapper[4764]: I0309 13:43:44.440149 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2897dc65-e596-414b-b73e-172b0042b6cd-run-httpd\") pod \"2897dc65-e596-414b-b73e-172b0042b6cd\" (UID: \"2897dc65-e596-414b-b73e-172b0042b6cd\") " Mar 09 13:43:44 crc kubenswrapper[4764]: I0309 13:43:44.441562 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/2897dc65-e596-414b-b73e-172b0042b6cd-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "2897dc65-e596-414b-b73e-172b0042b6cd" (UID: "2897dc65-e596-414b-b73e-172b0042b6cd"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 13:43:44 crc kubenswrapper[4764]: I0309 13:43:44.441920 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2897dc65-e596-414b-b73e-172b0042b6cd-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "2897dc65-e596-414b-b73e-172b0042b6cd" (UID: "2897dc65-e596-414b-b73e-172b0042b6cd"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 13:43:44 crc kubenswrapper[4764]: I0309 13:43:44.447571 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2897dc65-e596-414b-b73e-172b0042b6cd-kube-api-access-dkrrk" (OuterVolumeSpecName: "kube-api-access-dkrrk") pod "2897dc65-e596-414b-b73e-172b0042b6cd" (UID: "2897dc65-e596-414b-b73e-172b0042b6cd"). InnerVolumeSpecName "kube-api-access-dkrrk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:43:44 crc kubenswrapper[4764]: I0309 13:43:44.462222 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2897dc65-e596-414b-b73e-172b0042b6cd-scripts" (OuterVolumeSpecName: "scripts") pod "2897dc65-e596-414b-b73e-172b0042b6cd" (UID: "2897dc65-e596-414b-b73e-172b0042b6cd"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:43:44 crc kubenswrapper[4764]: I0309 13:43:44.482144 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2897dc65-e596-414b-b73e-172b0042b6cd-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "2897dc65-e596-414b-b73e-172b0042b6cd" (UID: "2897dc65-e596-414b-b73e-172b0042b6cd"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:43:44 crc kubenswrapper[4764]: I0309 13:43:44.518601 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2897dc65-e596-414b-b73e-172b0042b6cd-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "2897dc65-e596-414b-b73e-172b0042b6cd" (UID: "2897dc65-e596-414b-b73e-172b0042b6cd"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:43:44 crc kubenswrapper[4764]: I0309 13:43:44.531163 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 09 13:43:44 crc kubenswrapper[4764]: I0309 13:43:44.530501 4764 generic.go:334] "Generic (PLEG): container finished" podID="2897dc65-e596-414b-b73e-172b0042b6cd" containerID="6aba9d2ac154d93bccd8c03a964b0ce43be5d47c4542ac63eafc38ff3e190148" exitCode=0 Mar 09 13:43:44 crc kubenswrapper[4764]: I0309 13:43:44.531549 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2897dc65-e596-414b-b73e-172b0042b6cd","Type":"ContainerDied","Data":"6aba9d2ac154d93bccd8c03a964b0ce43be5d47c4542ac63eafc38ff3e190148"} Mar 09 13:43:44 crc kubenswrapper[4764]: I0309 13:43:44.531594 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2897dc65-e596-414b-b73e-172b0042b6cd","Type":"ContainerDied","Data":"515481ed93c967d04868f361f4da1be6b976ca58590cd914664e6651afcae34c"} Mar 09 13:43:44 crc kubenswrapper[4764]: I0309 13:43:44.531619 4764 scope.go:117] "RemoveContainer" containerID="b7616cfa2d6717183764d5b731f3e4c011f99816e53a80c47d3844cc65c6452d" Mar 09 13:43:44 crc kubenswrapper[4764]: I0309 13:43:44.535851 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2897dc65-e596-414b-b73e-172b0042b6cd-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod 
"2897dc65-e596-414b-b73e-172b0042b6cd" (UID: "2897dc65-e596-414b-b73e-172b0042b6cd"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:43:44 crc kubenswrapper[4764]: I0309 13:43:44.542889 4764 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2897dc65-e596-414b-b73e-172b0042b6cd-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 13:43:44 crc kubenswrapper[4764]: I0309 13:43:44.542942 4764 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2897dc65-e596-414b-b73e-172b0042b6cd-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 09 13:43:44 crc kubenswrapper[4764]: I0309 13:43:44.542956 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dkrrk\" (UniqueName: \"kubernetes.io/projected/2897dc65-e596-414b-b73e-172b0042b6cd-kube-api-access-dkrrk\") on node \"crc\" DevicePath \"\"" Mar 09 13:43:44 crc kubenswrapper[4764]: I0309 13:43:44.542971 4764 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2897dc65-e596-414b-b73e-172b0042b6cd-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 09 13:43:44 crc kubenswrapper[4764]: I0309 13:43:44.542983 4764 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2897dc65-e596-414b-b73e-172b0042b6cd-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 09 13:43:44 crc kubenswrapper[4764]: I0309 13:43:44.542993 4764 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2897dc65-e596-414b-b73e-172b0042b6cd-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 13:43:44 crc kubenswrapper[4764]: I0309 13:43:44.543003 4764 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/2897dc65-e596-414b-b73e-172b0042b6cd-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 09 13:43:44 crc kubenswrapper[4764]: I0309 13:43:44.562602 4764 scope.go:117] "RemoveContainer" containerID="550d0daaa1bf282375c8ca306a313e035fcd02974a76611ae4861ffff324f8a9" Mar 09 13:43:44 crc kubenswrapper[4764]: I0309 13:43:44.574780 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2897dc65-e596-414b-b73e-172b0042b6cd-config-data" (OuterVolumeSpecName: "config-data") pod "2897dc65-e596-414b-b73e-172b0042b6cd" (UID: "2897dc65-e596-414b-b73e-172b0042b6cd"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:43:44 crc kubenswrapper[4764]: I0309 13:43:44.585308 4764 scope.go:117] "RemoveContainer" containerID="b1056172a7b512fb661d17bf402edffaf8b9a666bfde122f2c185a7c23090848" Mar 09 13:43:44 crc kubenswrapper[4764]: I0309 13:43:44.607219 4764 scope.go:117] "RemoveContainer" containerID="6aba9d2ac154d93bccd8c03a964b0ce43be5d47c4542ac63eafc38ff3e190148" Mar 09 13:43:44 crc kubenswrapper[4764]: I0309 13:43:44.642806 4764 scope.go:117] "RemoveContainer" containerID="b7616cfa2d6717183764d5b731f3e4c011f99816e53a80c47d3844cc65c6452d" Mar 09 13:43:44 crc kubenswrapper[4764]: E0309 13:43:44.643542 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b7616cfa2d6717183764d5b731f3e4c011f99816e53a80c47d3844cc65c6452d\": container with ID starting with b7616cfa2d6717183764d5b731f3e4c011f99816e53a80c47d3844cc65c6452d not found: ID does not exist" containerID="b7616cfa2d6717183764d5b731f3e4c011f99816e53a80c47d3844cc65c6452d" Mar 09 13:43:44 crc kubenswrapper[4764]: I0309 13:43:44.643604 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b7616cfa2d6717183764d5b731f3e4c011f99816e53a80c47d3844cc65c6452d"} err="failed to get container status 
\"b7616cfa2d6717183764d5b731f3e4c011f99816e53a80c47d3844cc65c6452d\": rpc error: code = NotFound desc = could not find container \"b7616cfa2d6717183764d5b731f3e4c011f99816e53a80c47d3844cc65c6452d\": container with ID starting with b7616cfa2d6717183764d5b731f3e4c011f99816e53a80c47d3844cc65c6452d not found: ID does not exist" Mar 09 13:43:44 crc kubenswrapper[4764]: I0309 13:43:44.643664 4764 scope.go:117] "RemoveContainer" containerID="550d0daaa1bf282375c8ca306a313e035fcd02974a76611ae4861ffff324f8a9" Mar 09 13:43:44 crc kubenswrapper[4764]: I0309 13:43:44.644386 4764 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2897dc65-e596-414b-b73e-172b0042b6cd-config-data\") on node \"crc\" DevicePath \"\"" Mar 09 13:43:44 crc kubenswrapper[4764]: E0309 13:43:44.644405 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"550d0daaa1bf282375c8ca306a313e035fcd02974a76611ae4861ffff324f8a9\": container with ID starting with 550d0daaa1bf282375c8ca306a313e035fcd02974a76611ae4861ffff324f8a9 not found: ID does not exist" containerID="550d0daaa1bf282375c8ca306a313e035fcd02974a76611ae4861ffff324f8a9" Mar 09 13:43:44 crc kubenswrapper[4764]: I0309 13:43:44.644443 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"550d0daaa1bf282375c8ca306a313e035fcd02974a76611ae4861ffff324f8a9"} err="failed to get container status \"550d0daaa1bf282375c8ca306a313e035fcd02974a76611ae4861ffff324f8a9\": rpc error: code = NotFound desc = could not find container \"550d0daaa1bf282375c8ca306a313e035fcd02974a76611ae4861ffff324f8a9\": container with ID starting with 550d0daaa1bf282375c8ca306a313e035fcd02974a76611ae4861ffff324f8a9 not found: ID does not exist" Mar 09 13:43:44 crc kubenswrapper[4764]: I0309 13:43:44.644480 4764 scope.go:117] "RemoveContainer" 
containerID="b1056172a7b512fb661d17bf402edffaf8b9a666bfde122f2c185a7c23090848" Mar 09 13:43:44 crc kubenswrapper[4764]: E0309 13:43:44.645345 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b1056172a7b512fb661d17bf402edffaf8b9a666bfde122f2c185a7c23090848\": container with ID starting with b1056172a7b512fb661d17bf402edffaf8b9a666bfde122f2c185a7c23090848 not found: ID does not exist" containerID="b1056172a7b512fb661d17bf402edffaf8b9a666bfde122f2c185a7c23090848" Mar 09 13:43:44 crc kubenswrapper[4764]: I0309 13:43:44.645387 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b1056172a7b512fb661d17bf402edffaf8b9a666bfde122f2c185a7c23090848"} err="failed to get container status \"b1056172a7b512fb661d17bf402edffaf8b9a666bfde122f2c185a7c23090848\": rpc error: code = NotFound desc = could not find container \"b1056172a7b512fb661d17bf402edffaf8b9a666bfde122f2c185a7c23090848\": container with ID starting with b1056172a7b512fb661d17bf402edffaf8b9a666bfde122f2c185a7c23090848 not found: ID does not exist" Mar 09 13:43:44 crc kubenswrapper[4764]: I0309 13:43:44.645412 4764 scope.go:117] "RemoveContainer" containerID="6aba9d2ac154d93bccd8c03a964b0ce43be5d47c4542ac63eafc38ff3e190148" Mar 09 13:43:44 crc kubenswrapper[4764]: E0309 13:43:44.645873 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6aba9d2ac154d93bccd8c03a964b0ce43be5d47c4542ac63eafc38ff3e190148\": container with ID starting with 6aba9d2ac154d93bccd8c03a964b0ce43be5d47c4542ac63eafc38ff3e190148 not found: ID does not exist" containerID="6aba9d2ac154d93bccd8c03a964b0ce43be5d47c4542ac63eafc38ff3e190148" Mar 09 13:43:44 crc kubenswrapper[4764]: I0309 13:43:44.645905 4764 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"6aba9d2ac154d93bccd8c03a964b0ce43be5d47c4542ac63eafc38ff3e190148"} err="failed to get container status \"6aba9d2ac154d93bccd8c03a964b0ce43be5d47c4542ac63eafc38ff3e190148\": rpc error: code = NotFound desc = could not find container \"6aba9d2ac154d93bccd8c03a964b0ce43be5d47c4542ac63eafc38ff3e190148\": container with ID starting with 6aba9d2ac154d93bccd8c03a964b0ce43be5d47c4542ac63eafc38ff3e190148 not found: ID does not exist" Mar 09 13:43:44 crc kubenswrapper[4764]: I0309 13:43:44.910700 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 09 13:43:44 crc kubenswrapper[4764]: I0309 13:43:44.926154 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 09 13:43:44 crc kubenswrapper[4764]: I0309 13:43:44.939033 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 09 13:43:44 crc kubenswrapper[4764]: E0309 13:43:44.940051 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2897dc65-e596-414b-b73e-172b0042b6cd" containerName="sg-core" Mar 09 13:43:44 crc kubenswrapper[4764]: I0309 13:43:44.940083 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="2897dc65-e596-414b-b73e-172b0042b6cd" containerName="sg-core" Mar 09 13:43:44 crc kubenswrapper[4764]: E0309 13:43:44.940117 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2897dc65-e596-414b-b73e-172b0042b6cd" containerName="ceilometer-central-agent" Mar 09 13:43:44 crc kubenswrapper[4764]: I0309 13:43:44.940134 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="2897dc65-e596-414b-b73e-172b0042b6cd" containerName="ceilometer-central-agent" Mar 09 13:43:44 crc kubenswrapper[4764]: E0309 13:43:44.940166 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd806e2d-2675-448c-96f5-2440c2e243f2" containerName="init" Mar 09 13:43:44 crc kubenswrapper[4764]: I0309 13:43:44.940182 4764 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="dd806e2d-2675-448c-96f5-2440c2e243f2" containerName="init" Mar 09 13:43:44 crc kubenswrapper[4764]: E0309 13:43:44.940211 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2897dc65-e596-414b-b73e-172b0042b6cd" containerName="ceilometer-notification-agent" Mar 09 13:43:44 crc kubenswrapper[4764]: I0309 13:43:44.940229 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="2897dc65-e596-414b-b73e-172b0042b6cd" containerName="ceilometer-notification-agent" Mar 09 13:43:44 crc kubenswrapper[4764]: E0309 13:43:44.940269 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd806e2d-2675-448c-96f5-2440c2e243f2" containerName="dnsmasq-dns" Mar 09 13:43:44 crc kubenswrapper[4764]: I0309 13:43:44.940288 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd806e2d-2675-448c-96f5-2440c2e243f2" containerName="dnsmasq-dns" Mar 09 13:43:44 crc kubenswrapper[4764]: E0309 13:43:44.940314 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2897dc65-e596-414b-b73e-172b0042b6cd" containerName="proxy-httpd" Mar 09 13:43:44 crc kubenswrapper[4764]: I0309 13:43:44.940329 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="2897dc65-e596-414b-b73e-172b0042b6cd" containerName="proxy-httpd" Mar 09 13:43:44 crc kubenswrapper[4764]: I0309 13:43:44.940838 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="2897dc65-e596-414b-b73e-172b0042b6cd" containerName="sg-core" Mar 09 13:43:44 crc kubenswrapper[4764]: I0309 13:43:44.940880 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="2897dc65-e596-414b-b73e-172b0042b6cd" containerName="ceilometer-central-agent" Mar 09 13:43:44 crc kubenswrapper[4764]: I0309 13:43:44.940904 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="dd806e2d-2675-448c-96f5-2440c2e243f2" containerName="dnsmasq-dns" Mar 09 13:43:44 crc kubenswrapper[4764]: I0309 13:43:44.940932 4764 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="2897dc65-e596-414b-b73e-172b0042b6cd" containerName="ceilometer-notification-agent" Mar 09 13:43:44 crc kubenswrapper[4764]: I0309 13:43:44.940956 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="2897dc65-e596-414b-b73e-172b0042b6cd" containerName="proxy-httpd" Mar 09 13:43:44 crc kubenswrapper[4764]: I0309 13:43:44.945433 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 09 13:43:44 crc kubenswrapper[4764]: I0309 13:43:44.951513 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 09 13:43:44 crc kubenswrapper[4764]: I0309 13:43:44.954949 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 09 13:43:44 crc kubenswrapper[4764]: I0309 13:43:44.955004 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 09 13:43:44 crc kubenswrapper[4764]: I0309 13:43:44.955156 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Mar 09 13:43:45 crc kubenswrapper[4764]: I0309 13:43:45.055236 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7220ea42-daf2-4c41-85c9-0d2bda6d24eb-config-data\") pod \"ceilometer-0\" (UID: \"7220ea42-daf2-4c41-85c9-0d2bda6d24eb\") " pod="openstack/ceilometer-0" Mar 09 13:43:45 crc kubenswrapper[4764]: I0309 13:43:45.055362 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7220ea42-daf2-4c41-85c9-0d2bda6d24eb-log-httpd\") pod \"ceilometer-0\" (UID: \"7220ea42-daf2-4c41-85c9-0d2bda6d24eb\") " pod="openstack/ceilometer-0" Mar 09 13:43:45 crc kubenswrapper[4764]: I0309 13:43:45.055405 4764 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7220ea42-daf2-4c41-85c9-0d2bda6d24eb-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"7220ea42-daf2-4c41-85c9-0d2bda6d24eb\") " pod="openstack/ceilometer-0" Mar 09 13:43:45 crc kubenswrapper[4764]: I0309 13:43:45.055441 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7220ea42-daf2-4c41-85c9-0d2bda6d24eb-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"7220ea42-daf2-4c41-85c9-0d2bda6d24eb\") " pod="openstack/ceilometer-0" Mar 09 13:43:45 crc kubenswrapper[4764]: I0309 13:43:45.055491 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7220ea42-daf2-4c41-85c9-0d2bda6d24eb-scripts\") pod \"ceilometer-0\" (UID: \"7220ea42-daf2-4c41-85c9-0d2bda6d24eb\") " pod="openstack/ceilometer-0" Mar 09 13:43:45 crc kubenswrapper[4764]: I0309 13:43:45.055526 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7220ea42-daf2-4c41-85c9-0d2bda6d24eb-run-httpd\") pod \"ceilometer-0\" (UID: \"7220ea42-daf2-4c41-85c9-0d2bda6d24eb\") " pod="openstack/ceilometer-0" Mar 09 13:43:45 crc kubenswrapper[4764]: I0309 13:43:45.055574 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kxjf8\" (UniqueName: \"kubernetes.io/projected/7220ea42-daf2-4c41-85c9-0d2bda6d24eb-kube-api-access-kxjf8\") pod \"ceilometer-0\" (UID: \"7220ea42-daf2-4c41-85c9-0d2bda6d24eb\") " pod="openstack/ceilometer-0" Mar 09 13:43:45 crc kubenswrapper[4764]: I0309 13:43:45.055598 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/7220ea42-daf2-4c41-85c9-0d2bda6d24eb-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"7220ea42-daf2-4c41-85c9-0d2bda6d24eb\") " pod="openstack/ceilometer-0" Mar 09 13:43:45 crc kubenswrapper[4764]: I0309 13:43:45.157854 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kxjf8\" (UniqueName: \"kubernetes.io/projected/7220ea42-daf2-4c41-85c9-0d2bda6d24eb-kube-api-access-kxjf8\") pod \"ceilometer-0\" (UID: \"7220ea42-daf2-4c41-85c9-0d2bda6d24eb\") " pod="openstack/ceilometer-0" Mar 09 13:43:45 crc kubenswrapper[4764]: I0309 13:43:45.157922 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/7220ea42-daf2-4c41-85c9-0d2bda6d24eb-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"7220ea42-daf2-4c41-85c9-0d2bda6d24eb\") " pod="openstack/ceilometer-0" Mar 09 13:43:45 crc kubenswrapper[4764]: I0309 13:43:45.158012 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7220ea42-daf2-4c41-85c9-0d2bda6d24eb-config-data\") pod \"ceilometer-0\" (UID: \"7220ea42-daf2-4c41-85c9-0d2bda6d24eb\") " pod="openstack/ceilometer-0" Mar 09 13:43:45 crc kubenswrapper[4764]: I0309 13:43:45.158073 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7220ea42-daf2-4c41-85c9-0d2bda6d24eb-log-httpd\") pod \"ceilometer-0\" (UID: \"7220ea42-daf2-4c41-85c9-0d2bda6d24eb\") " pod="openstack/ceilometer-0" Mar 09 13:43:45 crc kubenswrapper[4764]: I0309 13:43:45.158103 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7220ea42-daf2-4c41-85c9-0d2bda6d24eb-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"7220ea42-daf2-4c41-85c9-0d2bda6d24eb\") " pod="openstack/ceilometer-0" Mar 09 13:43:45 crc 
kubenswrapper[4764]: I0309 13:43:45.158138 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7220ea42-daf2-4c41-85c9-0d2bda6d24eb-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"7220ea42-daf2-4c41-85c9-0d2bda6d24eb\") " pod="openstack/ceilometer-0" Mar 09 13:43:45 crc kubenswrapper[4764]: I0309 13:43:45.158182 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7220ea42-daf2-4c41-85c9-0d2bda6d24eb-scripts\") pod \"ceilometer-0\" (UID: \"7220ea42-daf2-4c41-85c9-0d2bda6d24eb\") " pod="openstack/ceilometer-0" Mar 09 13:43:45 crc kubenswrapper[4764]: I0309 13:43:45.158224 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7220ea42-daf2-4c41-85c9-0d2bda6d24eb-run-httpd\") pod \"ceilometer-0\" (UID: \"7220ea42-daf2-4c41-85c9-0d2bda6d24eb\") " pod="openstack/ceilometer-0" Mar 09 13:43:45 crc kubenswrapper[4764]: I0309 13:43:45.158856 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7220ea42-daf2-4c41-85c9-0d2bda6d24eb-run-httpd\") pod \"ceilometer-0\" (UID: \"7220ea42-daf2-4c41-85c9-0d2bda6d24eb\") " pod="openstack/ceilometer-0" Mar 09 13:43:45 crc kubenswrapper[4764]: I0309 13:43:45.158910 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7220ea42-daf2-4c41-85c9-0d2bda6d24eb-log-httpd\") pod \"ceilometer-0\" (UID: \"7220ea42-daf2-4c41-85c9-0d2bda6d24eb\") " pod="openstack/ceilometer-0" Mar 09 13:43:45 crc kubenswrapper[4764]: I0309 13:43:45.163151 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/7220ea42-daf2-4c41-85c9-0d2bda6d24eb-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: 
\"7220ea42-daf2-4c41-85c9-0d2bda6d24eb\") " pod="openstack/ceilometer-0" Mar 09 13:43:45 crc kubenswrapper[4764]: I0309 13:43:45.163207 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7220ea42-daf2-4c41-85c9-0d2bda6d24eb-scripts\") pod \"ceilometer-0\" (UID: \"7220ea42-daf2-4c41-85c9-0d2bda6d24eb\") " pod="openstack/ceilometer-0" Mar 09 13:43:45 crc kubenswrapper[4764]: I0309 13:43:45.164594 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7220ea42-daf2-4c41-85c9-0d2bda6d24eb-config-data\") pod \"ceilometer-0\" (UID: \"7220ea42-daf2-4c41-85c9-0d2bda6d24eb\") " pod="openstack/ceilometer-0" Mar 09 13:43:45 crc kubenswrapper[4764]: I0309 13:43:45.165402 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7220ea42-daf2-4c41-85c9-0d2bda6d24eb-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"7220ea42-daf2-4c41-85c9-0d2bda6d24eb\") " pod="openstack/ceilometer-0" Mar 09 13:43:45 crc kubenswrapper[4764]: I0309 13:43:45.173568 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7220ea42-daf2-4c41-85c9-0d2bda6d24eb-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"7220ea42-daf2-4c41-85c9-0d2bda6d24eb\") " pod="openstack/ceilometer-0" Mar 09 13:43:45 crc kubenswrapper[4764]: I0309 13:43:45.176636 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kxjf8\" (UniqueName: \"kubernetes.io/projected/7220ea42-daf2-4c41-85c9-0d2bda6d24eb-kube-api-access-kxjf8\") pod \"ceilometer-0\" (UID: \"7220ea42-daf2-4c41-85c9-0d2bda6d24eb\") " pod="openstack/ceilometer-0" Mar 09 13:43:45 crc kubenswrapper[4764]: I0309 13:43:45.274334 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 09 13:43:45 crc kubenswrapper[4764]: I0309 13:43:45.580573 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2897dc65-e596-414b-b73e-172b0042b6cd" path="/var/lib/kubelet/pods/2897dc65-e596-414b-b73e-172b0042b6cd/volumes" Mar 09 13:43:45 crc kubenswrapper[4764]: E0309 13:43:45.710289 4764 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd98526d5_8eaa_44a7_a25d_662a4fc8758b.slice/crio-conmon-858a961159a6d1cec6dd22f3429517714ebd1b6a16fd98b28235cb60fc1a94ee.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd98526d5_8eaa_44a7_a25d_662a4fc8758b.slice/crio-858a961159a6d1cec6dd22f3429517714ebd1b6a16fd98b28235cb60fc1a94ee.scope\": RecentStats: unable to find data in memory cache]" Mar 09 13:43:45 crc kubenswrapper[4764]: I0309 13:43:45.753451 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 09 13:43:46 crc kubenswrapper[4764]: I0309 13:43:46.559132 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7220ea42-daf2-4c41-85c9-0d2bda6d24eb","Type":"ContainerStarted","Data":"ff9bf0b1ebb34e175d58582b99ddb3f1097e979301d7cd95e80e6e75808835d6"} Mar 09 13:43:46 crc kubenswrapper[4764]: I0309 13:43:46.559572 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7220ea42-daf2-4c41-85c9-0d2bda6d24eb","Type":"ContainerStarted","Data":"83415e6b4960e541c9fc0ec3cd4865cce73b704e20a342572bf182a8978c8bc9"} Mar 09 13:43:46 crc kubenswrapper[4764]: I0309 13:43:46.561975 4764 generic.go:334] "Generic (PLEG): container finished" podID="d98526d5-8eaa-44a7-a25d-662a4fc8758b" containerID="858a961159a6d1cec6dd22f3429517714ebd1b6a16fd98b28235cb60fc1a94ee" exitCode=0 Mar 09 13:43:46 crc 
kubenswrapper[4764]: I0309 13:43:46.562009 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-q8h4m" event={"ID":"d98526d5-8eaa-44a7-a25d-662a4fc8758b","Type":"ContainerDied","Data":"858a961159a6d1cec6dd22f3429517714ebd1b6a16fd98b28235cb60fc1a94ee"} Mar 09 13:43:47 crc kubenswrapper[4764]: I0309 13:43:47.592968 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7220ea42-daf2-4c41-85c9-0d2bda6d24eb","Type":"ContainerStarted","Data":"cb0403e3d98f6b11f541bb3ed05951b3aa36eed0ed52f4863be171ce5eaf63e2"} Mar 09 13:43:47 crc kubenswrapper[4764]: I0309 13:43:47.972542 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-q8h4m" Mar 09 13:43:48 crc kubenswrapper[4764]: I0309 13:43:48.044549 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 09 13:43:48 crc kubenswrapper[4764]: I0309 13:43:48.044608 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 09 13:43:48 crc kubenswrapper[4764]: I0309 13:43:48.136161 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d98526d5-8eaa-44a7-a25d-662a4fc8758b-scripts\") pod \"d98526d5-8eaa-44a7-a25d-662a4fc8758b\" (UID: \"d98526d5-8eaa-44a7-a25d-662a4fc8758b\") " Mar 09 13:43:48 crc kubenswrapper[4764]: I0309 13:43:48.136255 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d98526d5-8eaa-44a7-a25d-662a4fc8758b-combined-ca-bundle\") pod \"d98526d5-8eaa-44a7-a25d-662a4fc8758b\" (UID: \"d98526d5-8eaa-44a7-a25d-662a4fc8758b\") " Mar 09 13:43:48 crc kubenswrapper[4764]: I0309 13:43:48.136375 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vrhwm\" (UniqueName: 
\"kubernetes.io/projected/d98526d5-8eaa-44a7-a25d-662a4fc8758b-kube-api-access-vrhwm\") pod \"d98526d5-8eaa-44a7-a25d-662a4fc8758b\" (UID: \"d98526d5-8eaa-44a7-a25d-662a4fc8758b\") " Mar 09 13:43:48 crc kubenswrapper[4764]: I0309 13:43:48.136681 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d98526d5-8eaa-44a7-a25d-662a4fc8758b-config-data\") pod \"d98526d5-8eaa-44a7-a25d-662a4fc8758b\" (UID: \"d98526d5-8eaa-44a7-a25d-662a4fc8758b\") " Mar 09 13:43:48 crc kubenswrapper[4764]: I0309 13:43:48.144428 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d98526d5-8eaa-44a7-a25d-662a4fc8758b-scripts" (OuterVolumeSpecName: "scripts") pod "d98526d5-8eaa-44a7-a25d-662a4fc8758b" (UID: "d98526d5-8eaa-44a7-a25d-662a4fc8758b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:43:48 crc kubenswrapper[4764]: I0309 13:43:48.144739 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d98526d5-8eaa-44a7-a25d-662a4fc8758b-kube-api-access-vrhwm" (OuterVolumeSpecName: "kube-api-access-vrhwm") pod "d98526d5-8eaa-44a7-a25d-662a4fc8758b" (UID: "d98526d5-8eaa-44a7-a25d-662a4fc8758b"). InnerVolumeSpecName "kube-api-access-vrhwm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:43:48 crc kubenswrapper[4764]: I0309 13:43:48.167517 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d98526d5-8eaa-44a7-a25d-662a4fc8758b-config-data" (OuterVolumeSpecName: "config-data") pod "d98526d5-8eaa-44a7-a25d-662a4fc8758b" (UID: "d98526d5-8eaa-44a7-a25d-662a4fc8758b"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:43:48 crc kubenswrapper[4764]: I0309 13:43:48.174776 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d98526d5-8eaa-44a7-a25d-662a4fc8758b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d98526d5-8eaa-44a7-a25d-662a4fc8758b" (UID: "d98526d5-8eaa-44a7-a25d-662a4fc8758b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:43:48 crc kubenswrapper[4764]: I0309 13:43:48.240355 4764 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d98526d5-8eaa-44a7-a25d-662a4fc8758b-config-data\") on node \"crc\" DevicePath \"\"" Mar 09 13:43:48 crc kubenswrapper[4764]: I0309 13:43:48.240412 4764 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d98526d5-8eaa-44a7-a25d-662a4fc8758b-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 13:43:48 crc kubenswrapper[4764]: I0309 13:43:48.240426 4764 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d98526d5-8eaa-44a7-a25d-662a4fc8758b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 13:43:48 crc kubenswrapper[4764]: I0309 13:43:48.240445 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vrhwm\" (UniqueName: \"kubernetes.io/projected/d98526d5-8eaa-44a7-a25d-662a4fc8758b-kube-api-access-vrhwm\") on node \"crc\" DevicePath \"\"" Mar 09 13:43:48 crc kubenswrapper[4764]: I0309 13:43:48.629693 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7220ea42-daf2-4c41-85c9-0d2bda6d24eb","Type":"ContainerStarted","Data":"c44f22885081c03e82b6817f6378545ad23fd40c3fd48004324856781c5d89d8"} Mar 09 13:43:48 crc kubenswrapper[4764]: I0309 13:43:48.634412 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-cell1-cell-mapping-q8h4m" event={"ID":"d98526d5-8eaa-44a7-a25d-662a4fc8758b","Type":"ContainerDied","Data":"0ad74f9523811644a338c930f4e1d9354e736d92d5564d6b6688e12a4bedac66"} Mar 09 13:43:48 crc kubenswrapper[4764]: I0309 13:43:48.634463 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0ad74f9523811644a338c930f4e1d9354e736d92d5564d6b6688e12a4bedac66" Mar 09 13:43:48 crc kubenswrapper[4764]: I0309 13:43:48.634532 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-q8h4m" Mar 09 13:43:48 crc kubenswrapper[4764]: I0309 13:43:48.775471 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 09 13:43:48 crc kubenswrapper[4764]: I0309 13:43:48.776192 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="cf561cd8-b441-4efe-8f37-9c925d1f7aa9" containerName="nova-api-log" containerID="cri-o://8364f048c74ee582bf3a1b604c7622e3c5a53940323ec1e7e02d222016811233" gracePeriod=30 Mar 09 13:43:48 crc kubenswrapper[4764]: I0309 13:43:48.776229 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="cf561cd8-b441-4efe-8f37-9c925d1f7aa9" containerName="nova-api-api" containerID="cri-o://8800be21f6e5e98b3173a95394f9eca30f4eb9aa26729be4bf34f42eed9c86fc" gracePeriod=30 Mar 09 13:43:48 crc kubenswrapper[4764]: I0309 13:43:48.786270 4764 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="cf561cd8-b441-4efe-8f37-9c925d1f7aa9" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.191:8774/\": EOF" Mar 09 13:43:48 crc kubenswrapper[4764]: I0309 13:43:48.790878 4764 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="cf561cd8-b441-4efe-8f37-9c925d1f7aa9" containerName="nova-api-log" probeResult="failure" output="Get 
\"https://10.217.0.191:8774/\": EOF"
Mar 09 13:43:48 crc kubenswrapper[4764]: I0309 13:43:48.798682 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"]
Mar 09 13:43:48 crc kubenswrapper[4764]: I0309 13:43:48.799077 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="e3d70a54-660f-4ef9-bd2a-ed16699d8d66" containerName="nova-scheduler-scheduler" containerID="cri-o://dc09c0ba380d56f5c82ec457ccdcea9f947bd8b08307a77c8684608027a7c1ac" gracePeriod=30
Mar 09 13:43:48 crc kubenswrapper[4764]: I0309 13:43:48.849921 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
Mar 09 13:43:48 crc kubenswrapper[4764]: I0309 13:43:48.850592 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="8aba6bca-21f2-4e18-90e7-098c8541a4f4" containerName="nova-metadata-log" containerID="cri-o://ce8b8b59c7d046943c3a2a1f0ccc0e08b931a62a55c138ad63284ef6b7ad9a9a" gracePeriod=30
Mar 09 13:43:48 crc kubenswrapper[4764]: I0309 13:43:48.850875 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="8aba6bca-21f2-4e18-90e7-098c8541a4f4" containerName="nova-metadata-metadata" containerID="cri-o://43bb793d9ae7aef4ac5c3135ad4636253715b28d8b5d66da3fd63d1b86a1f016" gracePeriod=30
Mar 09 13:43:49 crc kubenswrapper[4764]: I0309 13:43:49.655852 4764 generic.go:334] "Generic (PLEG): container finished" podID="8aba6bca-21f2-4e18-90e7-098c8541a4f4" containerID="ce8b8b59c7d046943c3a2a1f0ccc0e08b931a62a55c138ad63284ef6b7ad9a9a" exitCode=143
Mar 09 13:43:49 crc kubenswrapper[4764]: I0309 13:43:49.656467 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"8aba6bca-21f2-4e18-90e7-098c8541a4f4","Type":"ContainerDied","Data":"ce8b8b59c7d046943c3a2a1f0ccc0e08b931a62a55c138ad63284ef6b7ad9a9a"}
Mar 09 13:43:49 crc kubenswrapper[4764]: I0309 13:43:49.672400 4764 generic.go:334] "Generic (PLEG): container finished" podID="cf561cd8-b441-4efe-8f37-9c925d1f7aa9" containerID="8364f048c74ee582bf3a1b604c7622e3c5a53940323ec1e7e02d222016811233" exitCode=143
Mar 09 13:43:49 crc kubenswrapper[4764]: I0309 13:43:49.672467 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"cf561cd8-b441-4efe-8f37-9c925d1f7aa9","Type":"ContainerDied","Data":"8364f048c74ee582bf3a1b604c7622e3c5a53940323ec1e7e02d222016811233"}
Mar 09 13:43:49 crc kubenswrapper[4764]: I0309 13:43:49.692223 4764 generic.go:334] "Generic (PLEG): container finished" podID="e3d70a54-660f-4ef9-bd2a-ed16699d8d66" containerID="dc09c0ba380d56f5c82ec457ccdcea9f947bd8b08307a77c8684608027a7c1ac" exitCode=0
Mar 09 13:43:49 crc kubenswrapper[4764]: I0309 13:43:49.692269 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"e3d70a54-660f-4ef9-bd2a-ed16699d8d66","Type":"ContainerDied","Data":"dc09c0ba380d56f5c82ec457ccdcea9f947bd8b08307a77c8684608027a7c1ac"}
Mar 09 13:43:49 crc kubenswrapper[4764]: I0309 13:43:49.871391 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Mar 09 13:43:49 crc kubenswrapper[4764]: I0309 13:43:49.992498 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qtx9b\" (UniqueName: \"kubernetes.io/projected/e3d70a54-660f-4ef9-bd2a-ed16699d8d66-kube-api-access-qtx9b\") pod \"e3d70a54-660f-4ef9-bd2a-ed16699d8d66\" (UID: \"e3d70a54-660f-4ef9-bd2a-ed16699d8d66\") "
Mar 09 13:43:49 crc kubenswrapper[4764]: I0309 13:43:49.993172 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e3d70a54-660f-4ef9-bd2a-ed16699d8d66-config-data\") pod \"e3d70a54-660f-4ef9-bd2a-ed16699d8d66\" (UID: \"e3d70a54-660f-4ef9-bd2a-ed16699d8d66\") "
Mar 09 13:43:49 crc kubenswrapper[4764]: I0309 13:43:49.993461 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3d70a54-660f-4ef9-bd2a-ed16699d8d66-combined-ca-bundle\") pod \"e3d70a54-660f-4ef9-bd2a-ed16699d8d66\" (UID: \"e3d70a54-660f-4ef9-bd2a-ed16699d8d66\") "
Mar 09 13:43:50 crc kubenswrapper[4764]: I0309 13:43:50.003795 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e3d70a54-660f-4ef9-bd2a-ed16699d8d66-kube-api-access-qtx9b" (OuterVolumeSpecName: "kube-api-access-qtx9b") pod "e3d70a54-660f-4ef9-bd2a-ed16699d8d66" (UID: "e3d70a54-660f-4ef9-bd2a-ed16699d8d66"). InnerVolumeSpecName "kube-api-access-qtx9b". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 13:43:50 crc kubenswrapper[4764]: I0309 13:43:50.033823 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e3d70a54-660f-4ef9-bd2a-ed16699d8d66-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e3d70a54-660f-4ef9-bd2a-ed16699d8d66" (UID: "e3d70a54-660f-4ef9-bd2a-ed16699d8d66"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 13:43:50 crc kubenswrapper[4764]: I0309 13:43:50.036352 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e3d70a54-660f-4ef9-bd2a-ed16699d8d66-config-data" (OuterVolumeSpecName: "config-data") pod "e3d70a54-660f-4ef9-bd2a-ed16699d8d66" (UID: "e3d70a54-660f-4ef9-bd2a-ed16699d8d66"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 13:43:50 crc kubenswrapper[4764]: I0309 13:43:50.096508 4764 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e3d70a54-660f-4ef9-bd2a-ed16699d8d66-config-data\") on node \"crc\" DevicePath \"\""
Mar 09 13:43:50 crc kubenswrapper[4764]: I0309 13:43:50.096546 4764 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3d70a54-660f-4ef9-bd2a-ed16699d8d66-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 09 13:43:50 crc kubenswrapper[4764]: I0309 13:43:50.096560 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qtx9b\" (UniqueName: \"kubernetes.io/projected/e3d70a54-660f-4ef9-bd2a-ed16699d8d66-kube-api-access-qtx9b\") on node \"crc\" DevicePath \"\""
Mar 09 13:43:50 crc kubenswrapper[4764]: I0309 13:43:50.711804 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7220ea42-daf2-4c41-85c9-0d2bda6d24eb","Type":"ContainerStarted","Data":"5ee7ad3de1f9928c6b59a6676d88b6e84ba3c1fa18e74865387fac5e3b0fa60c"}
Mar 09 13:43:50 crc kubenswrapper[4764]: I0309 13:43:50.713503 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0"
Mar 09 13:43:50 crc kubenswrapper[4764]: I0309 13:43:50.718846 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"e3d70a54-660f-4ef9-bd2a-ed16699d8d66","Type":"ContainerDied","Data":"737be67408dde868ad9928c7e4b5b5a92634607014e02a50994bdc6c48b356c6"}
Mar 09 13:43:50 crc kubenswrapper[4764]: I0309 13:43:50.718899 4764 scope.go:117] "RemoveContainer" containerID="dc09c0ba380d56f5c82ec457ccdcea9f947bd8b08307a77c8684608027a7c1ac"
Mar 09 13:43:50 crc kubenswrapper[4764]: I0309 13:43:50.719038 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Mar 09 13:43:50 crc kubenswrapper[4764]: I0309 13:43:50.812080 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=3.016131235 podStartE2EDuration="6.811946719s" podCreationTimestamp="2026-03-09 13:43:44 +0000 UTC" firstStartedPulling="2026-03-09 13:43:45.761364894 +0000 UTC m=+1381.011536802" lastFinishedPulling="2026-03-09 13:43:49.557180378 +0000 UTC m=+1384.807352286" observedRunningTime="2026-03-09 13:43:50.755615974 +0000 UTC m=+1386.005787882" watchObservedRunningTime="2026-03-09 13:43:50.811946719 +0000 UTC m=+1386.062118637"
Mar 09 13:43:50 crc kubenswrapper[4764]: I0309 13:43:50.827706 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"]
Mar 09 13:43:50 crc kubenswrapper[4764]: I0309 13:43:50.847900 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"]
Mar 09 13:43:50 crc kubenswrapper[4764]: I0309 13:43:50.861530 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"]
Mar 09 13:43:50 crc kubenswrapper[4764]: E0309 13:43:50.862277 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d98526d5-8eaa-44a7-a25d-662a4fc8758b" containerName="nova-manage"
Mar 09 13:43:50 crc kubenswrapper[4764]: I0309 13:43:50.862302 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="d98526d5-8eaa-44a7-a25d-662a4fc8758b" containerName="nova-manage"
Mar 09 13:43:50 crc kubenswrapper[4764]: E0309 13:43:50.862355 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e3d70a54-660f-4ef9-bd2a-ed16699d8d66" containerName="nova-scheduler-scheduler"
Mar 09 13:43:50 crc kubenswrapper[4764]: I0309 13:43:50.862366 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3d70a54-660f-4ef9-bd2a-ed16699d8d66" containerName="nova-scheduler-scheduler"
Mar 09 13:43:50 crc kubenswrapper[4764]: I0309 13:43:50.862621 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="e3d70a54-660f-4ef9-bd2a-ed16699d8d66" containerName="nova-scheduler-scheduler"
Mar 09 13:43:50 crc kubenswrapper[4764]: I0309 13:43:50.862672 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="d98526d5-8eaa-44a7-a25d-662a4fc8758b" containerName="nova-manage"
Mar 09 13:43:50 crc kubenswrapper[4764]: I0309 13:43:50.863804 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Mar 09 13:43:50 crc kubenswrapper[4764]: I0309 13:43:50.868469 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data"
Mar 09 13:43:50 crc kubenswrapper[4764]: I0309 13:43:50.887704 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"]
Mar 09 13:43:50 crc kubenswrapper[4764]: I0309 13:43:50.930412 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7d26ba33-e370-4bc8-bb15-b727c0c9c97f-config-data\") pod \"nova-scheduler-0\" (UID: \"7d26ba33-e370-4bc8-bb15-b727c0c9c97f\") " pod="openstack/nova-scheduler-0"
Mar 09 13:43:50 crc kubenswrapper[4764]: I0309 13:43:50.930819 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jp9k4\" (UniqueName: \"kubernetes.io/projected/7d26ba33-e370-4bc8-bb15-b727c0c9c97f-kube-api-access-jp9k4\") pod \"nova-scheduler-0\" (UID: \"7d26ba33-e370-4bc8-bb15-b727c0c9c97f\") " pod="openstack/nova-scheduler-0"
Mar 09 13:43:50 crc kubenswrapper[4764]: I0309 13:43:50.931037 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7d26ba33-e370-4bc8-bb15-b727c0c9c97f-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"7d26ba33-e370-4bc8-bb15-b727c0c9c97f\") " pod="openstack/nova-scheduler-0"
Mar 09 13:43:51 crc kubenswrapper[4764]: I0309 13:43:51.033508 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jp9k4\" (UniqueName: \"kubernetes.io/projected/7d26ba33-e370-4bc8-bb15-b727c0c9c97f-kube-api-access-jp9k4\") pod \"nova-scheduler-0\" (UID: \"7d26ba33-e370-4bc8-bb15-b727c0c9c97f\") " pod="openstack/nova-scheduler-0"
Mar 09 13:43:51 crc kubenswrapper[4764]: I0309 13:43:51.033625 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7d26ba33-e370-4bc8-bb15-b727c0c9c97f-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"7d26ba33-e370-4bc8-bb15-b727c0c9c97f\") " pod="openstack/nova-scheduler-0"
Mar 09 13:43:51 crc kubenswrapper[4764]: I0309 13:43:51.033755 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7d26ba33-e370-4bc8-bb15-b727c0c9c97f-config-data\") pod \"nova-scheduler-0\" (UID: \"7d26ba33-e370-4bc8-bb15-b727c0c9c97f\") " pod="openstack/nova-scheduler-0"
Mar 09 13:43:51 crc kubenswrapper[4764]: I0309 13:43:51.041893 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7d26ba33-e370-4bc8-bb15-b727c0c9c97f-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"7d26ba33-e370-4bc8-bb15-b727c0c9c97f\") " pod="openstack/nova-scheduler-0"
Mar 09 13:43:51 crc kubenswrapper[4764]: I0309 13:43:51.044036 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7d26ba33-e370-4bc8-bb15-b727c0c9c97f-config-data\") pod \"nova-scheduler-0\" (UID: \"7d26ba33-e370-4bc8-bb15-b727c0c9c97f\") " pod="openstack/nova-scheduler-0"
Mar 09 13:43:51 crc kubenswrapper[4764]: I0309 13:43:51.054929 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jp9k4\" (UniqueName: \"kubernetes.io/projected/7d26ba33-e370-4bc8-bb15-b727c0c9c97f-kube-api-access-jp9k4\") pod \"nova-scheduler-0\" (UID: \"7d26ba33-e370-4bc8-bb15-b727c0c9c97f\") " pod="openstack/nova-scheduler-0"
Mar 09 13:43:51 crc kubenswrapper[4764]: I0309 13:43:51.191024 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Mar 09 13:43:51 crc kubenswrapper[4764]: I0309 13:43:51.587284 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e3d70a54-660f-4ef9-bd2a-ed16699d8d66" path="/var/lib/kubelet/pods/e3d70a54-660f-4ef9-bd2a-ed16699d8d66/volumes"
Mar 09 13:43:51 crc kubenswrapper[4764]: I0309 13:43:51.767630 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"]
Mar 09 13:43:51 crc kubenswrapper[4764]: W0309 13:43:51.772276 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7d26ba33_e370_4bc8_bb15_b727c0c9c97f.slice/crio-e6917764081c089c5262d61abdf003690e988a0af3f0e46d443245865ac76c2d WatchSource:0}: Error finding container e6917764081c089c5262d61abdf003690e988a0af3f0e46d443245865ac76c2d: Status 404 returned error can't find the container with id e6917764081c089c5262d61abdf003690e988a0af3f0e46d443245865ac76c2d
Mar 09 13:43:52 crc kubenswrapper[4764]: I0309 13:43:52.021265 4764 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="8aba6bca-21f2-4e18-90e7-098c8541a4f4" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.185:8775/\": read tcp 10.217.0.2:51062->10.217.0.185:8775: read: connection reset by peer"
Mar 09 13:43:52 crc kubenswrapper[4764]: I0309 13:43:52.021442 4764 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="8aba6bca-21f2-4e18-90e7-098c8541a4f4" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.185:8775/\": read tcp 10.217.0.2:51066->10.217.0.185:8775: read: connection reset by peer"
Mar 09 13:43:52 crc kubenswrapper[4764]: I0309 13:43:52.509667 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Mar 09 13:43:52 crc kubenswrapper[4764]: I0309 13:43:52.674510 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8aba6bca-21f2-4e18-90e7-098c8541a4f4-combined-ca-bundle\") pod \"8aba6bca-21f2-4e18-90e7-098c8541a4f4\" (UID: \"8aba6bca-21f2-4e18-90e7-098c8541a4f4\") "
Mar 09 13:43:52 crc kubenswrapper[4764]: I0309 13:43:52.674782 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8aba6bca-21f2-4e18-90e7-098c8541a4f4-config-data\") pod \"8aba6bca-21f2-4e18-90e7-098c8541a4f4\" (UID: \"8aba6bca-21f2-4e18-90e7-098c8541a4f4\") "
Mar 09 13:43:52 crc kubenswrapper[4764]: I0309 13:43:52.674818 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8aba6bca-21f2-4e18-90e7-098c8541a4f4-logs\") pod \"8aba6bca-21f2-4e18-90e7-098c8541a4f4\" (UID: \"8aba6bca-21f2-4e18-90e7-098c8541a4f4\") "
Mar 09 13:43:52 crc kubenswrapper[4764]: I0309 13:43:52.674894 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h6cx7\" (UniqueName: \"kubernetes.io/projected/8aba6bca-21f2-4e18-90e7-098c8541a4f4-kube-api-access-h6cx7\") pod \"8aba6bca-21f2-4e18-90e7-098c8541a4f4\" (UID: \"8aba6bca-21f2-4e18-90e7-098c8541a4f4\") "
Mar 09 13:43:52 crc kubenswrapper[4764]: I0309 13:43:52.674967 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/8aba6bca-21f2-4e18-90e7-098c8541a4f4-nova-metadata-tls-certs\") pod \"8aba6bca-21f2-4e18-90e7-098c8541a4f4\" (UID: \"8aba6bca-21f2-4e18-90e7-098c8541a4f4\") "
Mar 09 13:43:52 crc kubenswrapper[4764]: I0309 13:43:52.677421 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8aba6bca-21f2-4e18-90e7-098c8541a4f4-logs" (OuterVolumeSpecName: "logs") pod "8aba6bca-21f2-4e18-90e7-098c8541a4f4" (UID: "8aba6bca-21f2-4e18-90e7-098c8541a4f4"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 09 13:43:52 crc kubenswrapper[4764]: I0309 13:43:52.683165 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8aba6bca-21f2-4e18-90e7-098c8541a4f4-kube-api-access-h6cx7" (OuterVolumeSpecName: "kube-api-access-h6cx7") pod "8aba6bca-21f2-4e18-90e7-098c8541a4f4" (UID: "8aba6bca-21f2-4e18-90e7-098c8541a4f4"). InnerVolumeSpecName "kube-api-access-h6cx7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 13:43:52 crc kubenswrapper[4764]: I0309 13:43:52.711422 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8aba6bca-21f2-4e18-90e7-098c8541a4f4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8aba6bca-21f2-4e18-90e7-098c8541a4f4" (UID: "8aba6bca-21f2-4e18-90e7-098c8541a4f4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 13:43:52 crc kubenswrapper[4764]: I0309 13:43:52.711824 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8aba6bca-21f2-4e18-90e7-098c8541a4f4-config-data" (OuterVolumeSpecName: "config-data") pod "8aba6bca-21f2-4e18-90e7-098c8541a4f4" (UID: "8aba6bca-21f2-4e18-90e7-098c8541a4f4"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 13:43:52 crc kubenswrapper[4764]: I0309 13:43:52.737689 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8aba6bca-21f2-4e18-90e7-098c8541a4f4-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "8aba6bca-21f2-4e18-90e7-098c8541a4f4" (UID: "8aba6bca-21f2-4e18-90e7-098c8541a4f4"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 13:43:52 crc kubenswrapper[4764]: I0309 13:43:52.756983 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"7d26ba33-e370-4bc8-bb15-b727c0c9c97f","Type":"ContainerStarted","Data":"d1a93c5bd15cd6153ac10b5f15ea2af29110e60ec1c31734a03ea8c5b7b054ec"}
Mar 09 13:43:52 crc kubenswrapper[4764]: I0309 13:43:52.757091 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"7d26ba33-e370-4bc8-bb15-b727c0c9c97f","Type":"ContainerStarted","Data":"e6917764081c089c5262d61abdf003690e988a0af3f0e46d443245865ac76c2d"}
Mar 09 13:43:52 crc kubenswrapper[4764]: I0309 13:43:52.766329 4764 generic.go:334] "Generic (PLEG): container finished" podID="8aba6bca-21f2-4e18-90e7-098c8541a4f4" containerID="43bb793d9ae7aef4ac5c3135ad4636253715b28d8b5d66da3fd63d1b86a1f016" exitCode=0
Mar 09 13:43:52 crc kubenswrapper[4764]: I0309 13:43:52.767141 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"8aba6bca-21f2-4e18-90e7-098c8541a4f4","Type":"ContainerDied","Data":"43bb793d9ae7aef4ac5c3135ad4636253715b28d8b5d66da3fd63d1b86a1f016"}
Mar 09 13:43:52 crc kubenswrapper[4764]: I0309 13:43:52.767215 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"8aba6bca-21f2-4e18-90e7-098c8541a4f4","Type":"ContainerDied","Data":"b10db7fc62dc747c8a3073bba39f8052766584cab6d39aec677be986aaca3d56"}
Mar 09 13:43:52 crc kubenswrapper[4764]: I0309 13:43:52.767236 4764 scope.go:117] "RemoveContainer" containerID="43bb793d9ae7aef4ac5c3135ad4636253715b28d8b5d66da3fd63d1b86a1f016"
Mar 09 13:43:52 crc kubenswrapper[4764]: I0309 13:43:52.767173 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Mar 09 13:43:52 crc kubenswrapper[4764]: I0309 13:43:52.788610 4764 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8aba6bca-21f2-4e18-90e7-098c8541a4f4-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 09 13:43:52 crc kubenswrapper[4764]: I0309 13:43:52.788661 4764 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8aba6bca-21f2-4e18-90e7-098c8541a4f4-config-data\") on node \"crc\" DevicePath \"\""
Mar 09 13:43:52 crc kubenswrapper[4764]: I0309 13:43:52.788672 4764 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8aba6bca-21f2-4e18-90e7-098c8541a4f4-logs\") on node \"crc\" DevicePath \"\""
Mar 09 13:43:52 crc kubenswrapper[4764]: I0309 13:43:52.788682 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h6cx7\" (UniqueName: \"kubernetes.io/projected/8aba6bca-21f2-4e18-90e7-098c8541a4f4-kube-api-access-h6cx7\") on node \"crc\" DevicePath \"\""
Mar 09 13:43:52 crc kubenswrapper[4764]: I0309 13:43:52.788693 4764 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/8aba6bca-21f2-4e18-90e7-098c8541a4f4-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\""
Mar 09 13:43:52 crc kubenswrapper[4764]: I0309 13:43:52.790706 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.7906797279999997 podStartE2EDuration="2.790679728s" podCreationTimestamp="2026-03-09 13:43:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:43:52.780405604 +0000 UTC m=+1388.030577522" watchObservedRunningTime="2026-03-09 13:43:52.790679728 +0000 UTC m=+1388.040851636"
Mar 09 13:43:52 crc kubenswrapper[4764]: I0309 13:43:52.854515 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
Mar 09 13:43:52 crc kubenswrapper[4764]: I0309 13:43:52.865899 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"]
Mar 09 13:43:52 crc kubenswrapper[4764]: I0309 13:43:52.876373 4764 scope.go:117] "RemoveContainer" containerID="ce8b8b59c7d046943c3a2a1f0ccc0e08b931a62a55c138ad63284ef6b7ad9a9a"
Mar 09 13:43:52 crc kubenswrapper[4764]: I0309 13:43:52.892792 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"]
Mar 09 13:43:52 crc kubenswrapper[4764]: E0309 13:43:52.893366 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8aba6bca-21f2-4e18-90e7-098c8541a4f4" containerName="nova-metadata-metadata"
Mar 09 13:43:52 crc kubenswrapper[4764]: I0309 13:43:52.893391 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="8aba6bca-21f2-4e18-90e7-098c8541a4f4" containerName="nova-metadata-metadata"
Mar 09 13:43:52 crc kubenswrapper[4764]: E0309 13:43:52.893415 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8aba6bca-21f2-4e18-90e7-098c8541a4f4" containerName="nova-metadata-log"
Mar 09 13:43:52 crc kubenswrapper[4764]: I0309 13:43:52.893424 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="8aba6bca-21f2-4e18-90e7-098c8541a4f4" containerName="nova-metadata-log"
Mar 09 13:43:52 crc kubenswrapper[4764]: I0309 13:43:52.893622 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="8aba6bca-21f2-4e18-90e7-098c8541a4f4" containerName="nova-metadata-log"
Mar 09 13:43:52 crc kubenswrapper[4764]: I0309 13:43:52.893665 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="8aba6bca-21f2-4e18-90e7-098c8541a4f4" containerName="nova-metadata-metadata"
Mar 09 13:43:52 crc kubenswrapper[4764]: I0309 13:43:52.894845 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Mar 09 13:43:52 crc kubenswrapper[4764]: I0309 13:43:52.903121 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data"
Mar 09 13:43:52 crc kubenswrapper[4764]: I0309 13:43:52.903314 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc"
Mar 09 13:43:52 crc kubenswrapper[4764]: I0309 13:43:52.906045 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Mar 09 13:43:52 crc kubenswrapper[4764]: I0309 13:43:52.926856 4764 scope.go:117] "RemoveContainer" containerID="43bb793d9ae7aef4ac5c3135ad4636253715b28d8b5d66da3fd63d1b86a1f016"
Mar 09 13:43:52 crc kubenswrapper[4764]: E0309 13:43:52.929325 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"43bb793d9ae7aef4ac5c3135ad4636253715b28d8b5d66da3fd63d1b86a1f016\": container with ID starting with 43bb793d9ae7aef4ac5c3135ad4636253715b28d8b5d66da3fd63d1b86a1f016 not found: ID does not exist" containerID="43bb793d9ae7aef4ac5c3135ad4636253715b28d8b5d66da3fd63d1b86a1f016"
Mar 09 13:43:52 crc kubenswrapper[4764]: I0309 13:43:52.934814 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"43bb793d9ae7aef4ac5c3135ad4636253715b28d8b5d66da3fd63d1b86a1f016"} err="failed to get container status \"43bb793d9ae7aef4ac5c3135ad4636253715b28d8b5d66da3fd63d1b86a1f016\": rpc error: code = NotFound desc = could not find container \"43bb793d9ae7aef4ac5c3135ad4636253715b28d8b5d66da3fd63d1b86a1f016\": container with ID starting with 43bb793d9ae7aef4ac5c3135ad4636253715b28d8b5d66da3fd63d1b86a1f016 not found: ID does not exist"
Mar 09 13:43:52 crc kubenswrapper[4764]: I0309 13:43:52.934984 4764 scope.go:117] "RemoveContainer" containerID="ce8b8b59c7d046943c3a2a1f0ccc0e08b931a62a55c138ad63284ef6b7ad9a9a"
Mar 09 13:43:52 crc kubenswrapper[4764]: E0309 13:43:52.941282 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ce8b8b59c7d046943c3a2a1f0ccc0e08b931a62a55c138ad63284ef6b7ad9a9a\": container with ID starting with ce8b8b59c7d046943c3a2a1f0ccc0e08b931a62a55c138ad63284ef6b7ad9a9a not found: ID does not exist" containerID="ce8b8b59c7d046943c3a2a1f0ccc0e08b931a62a55c138ad63284ef6b7ad9a9a"
Mar 09 13:43:52 crc kubenswrapper[4764]: I0309 13:43:52.941335 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ce8b8b59c7d046943c3a2a1f0ccc0e08b931a62a55c138ad63284ef6b7ad9a9a"} err="failed to get container status \"ce8b8b59c7d046943c3a2a1f0ccc0e08b931a62a55c138ad63284ef6b7ad9a9a\": rpc error: code = NotFound desc = could not find container \"ce8b8b59c7d046943c3a2a1f0ccc0e08b931a62a55c138ad63284ef6b7ad9a9a\": container with ID starting with ce8b8b59c7d046943c3a2a1f0ccc0e08b931a62a55c138ad63284ef6b7ad9a9a not found: ID does not exist"
Mar 09 13:43:52 crc kubenswrapper[4764]: I0309 13:43:52.993677 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d9226790-b0dc-460b-8c06-127effde8c19-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"d9226790-b0dc-460b-8c06-127effde8c19\") " pod="openstack/nova-metadata-0"
Mar 09 13:43:52 crc kubenswrapper[4764]: I0309 13:43:52.993738 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d9226790-b0dc-460b-8c06-127effde8c19-logs\") pod \"nova-metadata-0\" (UID: \"d9226790-b0dc-460b-8c06-127effde8c19\") " pod="openstack/nova-metadata-0"
Mar 09 13:43:52 crc kubenswrapper[4764]: I0309 13:43:52.993816 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/d9226790-b0dc-460b-8c06-127effde8c19-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"d9226790-b0dc-460b-8c06-127effde8c19\") " pod="openstack/nova-metadata-0"
Mar 09 13:43:52 crc kubenswrapper[4764]: I0309 13:43:52.993868 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d9226790-b0dc-460b-8c06-127effde8c19-config-data\") pod \"nova-metadata-0\" (UID: \"d9226790-b0dc-460b-8c06-127effde8c19\") " pod="openstack/nova-metadata-0"
Mar 09 13:43:52 crc kubenswrapper[4764]: I0309 13:43:52.993956 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h9pmc\" (UniqueName: \"kubernetes.io/projected/d9226790-b0dc-460b-8c06-127effde8c19-kube-api-access-h9pmc\") pod \"nova-metadata-0\" (UID: \"d9226790-b0dc-460b-8c06-127effde8c19\") " pod="openstack/nova-metadata-0"
Mar 09 13:43:53 crc kubenswrapper[4764]: I0309 13:43:53.096026 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/d9226790-b0dc-460b-8c06-127effde8c19-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"d9226790-b0dc-460b-8c06-127effde8c19\") " pod="openstack/nova-metadata-0"
Mar 09 13:43:53 crc kubenswrapper[4764]: I0309 13:43:53.096111 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d9226790-b0dc-460b-8c06-127effde8c19-config-data\") pod \"nova-metadata-0\" (UID: \"d9226790-b0dc-460b-8c06-127effde8c19\") " pod="openstack/nova-metadata-0"
Mar 09 13:43:53 crc kubenswrapper[4764]: I0309 13:43:53.096150 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h9pmc\" (UniqueName: \"kubernetes.io/projected/d9226790-b0dc-460b-8c06-127effde8c19-kube-api-access-h9pmc\") pod \"nova-metadata-0\" (UID: \"d9226790-b0dc-460b-8c06-127effde8c19\") " pod="openstack/nova-metadata-0"
Mar 09 13:43:53 crc kubenswrapper[4764]: I0309 13:43:53.097538 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d9226790-b0dc-460b-8c06-127effde8c19-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"d9226790-b0dc-460b-8c06-127effde8c19\") " pod="openstack/nova-metadata-0"
Mar 09 13:43:53 crc kubenswrapper[4764]: I0309 13:43:53.097600 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d9226790-b0dc-460b-8c06-127effde8c19-logs\") pod \"nova-metadata-0\" (UID: \"d9226790-b0dc-460b-8c06-127effde8c19\") " pod="openstack/nova-metadata-0"
Mar 09 13:43:53 crc kubenswrapper[4764]: I0309 13:43:53.098139 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d9226790-b0dc-460b-8c06-127effde8c19-logs\") pod \"nova-metadata-0\" (UID: \"d9226790-b0dc-460b-8c06-127effde8c19\") " pod="openstack/nova-metadata-0"
Mar 09 13:43:53 crc kubenswrapper[4764]: I0309 13:43:53.101009 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d9226790-b0dc-460b-8c06-127effde8c19-config-data\") pod \"nova-metadata-0\" (UID: \"d9226790-b0dc-460b-8c06-127effde8c19\") " pod="openstack/nova-metadata-0"
Mar 09 13:43:53 crc kubenswrapper[4764]: I0309 13:43:53.101165 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/d9226790-b0dc-460b-8c06-127effde8c19-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"d9226790-b0dc-460b-8c06-127effde8c19\") " pod="openstack/nova-metadata-0"
Mar 09 13:43:53 crc kubenswrapper[4764]: I0309 13:43:53.102182 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d9226790-b0dc-460b-8c06-127effde8c19-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"d9226790-b0dc-460b-8c06-127effde8c19\") " pod="openstack/nova-metadata-0"
Mar 09 13:43:53 crc kubenswrapper[4764]: I0309 13:43:53.115133 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h9pmc\" (UniqueName: \"kubernetes.io/projected/d9226790-b0dc-460b-8c06-127effde8c19-kube-api-access-h9pmc\") pod \"nova-metadata-0\" (UID: \"d9226790-b0dc-460b-8c06-127effde8c19\") " pod="openstack/nova-metadata-0"
Mar 09 13:43:53 crc kubenswrapper[4764]: I0309 13:43:53.239269 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Mar 09 13:43:53 crc kubenswrapper[4764]: I0309 13:43:53.575976 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8aba6bca-21f2-4e18-90e7-098c8541a4f4" path="/var/lib/kubelet/pods/8aba6bca-21f2-4e18-90e7-098c8541a4f4/volumes"
Mar 09 13:43:53 crc kubenswrapper[4764]: I0309 13:43:53.752073 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Mar 09 13:43:53 crc kubenswrapper[4764]: I0309 13:43:53.790919 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"d9226790-b0dc-460b-8c06-127effde8c19","Type":"ContainerStarted","Data":"57b71ff6dbb4dee3477ca54350bb356543e459e128cffe708fcbd88e2d54a9be"}
Mar 09 13:43:54 crc kubenswrapper[4764]: I0309 13:43:54.803961 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"d9226790-b0dc-460b-8c06-127effde8c19","Type":"ContainerStarted","Data":"fc5d9c90f1b6ba9be2e0c4ec1cfda566e29d87d53d6a66d366c6768440d3a6a6"}
Mar 09 13:43:54 crc kubenswrapper[4764]: I0309 13:43:54.804356 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"d9226790-b0dc-460b-8c06-127effde8c19","Type":"ContainerStarted","Data":"71a060b0e753606350d29e0d158aaee4cade05ca6590264314aa9f5561d7826a"}
Mar 09 13:43:54 crc kubenswrapper[4764]: I0309 13:43:54.826407 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.826388628 podStartE2EDuration="2.826388628s" podCreationTimestamp="2026-03-09 13:43:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:43:54.822251048 +0000 UTC m=+1390.072422956" watchObservedRunningTime="2026-03-09 13:43:54.826388628 +0000 UTC m=+1390.076560536"
Mar 09 13:43:55 crc kubenswrapper[4764]: I0309 13:43:55.760381 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Mar 09 13:43:55 crc kubenswrapper[4764]: I0309 13:43:55.772371 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7wznv\" (UniqueName: \"kubernetes.io/projected/cf561cd8-b441-4efe-8f37-9c925d1f7aa9-kube-api-access-7wznv\") pod \"cf561cd8-b441-4efe-8f37-9c925d1f7aa9\" (UID: \"cf561cd8-b441-4efe-8f37-9c925d1f7aa9\") "
Mar 09 13:43:55 crc kubenswrapper[4764]: I0309 13:43:55.772449 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cf561cd8-b441-4efe-8f37-9c925d1f7aa9-logs\") pod \"cf561cd8-b441-4efe-8f37-9c925d1f7aa9\" (UID: \"cf561cd8-b441-4efe-8f37-9c925d1f7aa9\") "
Mar 09 13:43:55 crc kubenswrapper[4764]: I0309 13:43:55.772581 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf561cd8-b441-4efe-8f37-9c925d1f7aa9-combined-ca-bundle\") pod \"cf561cd8-b441-4efe-8f37-9c925d1f7aa9\" (UID: \"cf561cd8-b441-4efe-8f37-9c925d1f7aa9\") "
Mar 09 13:43:55 crc kubenswrapper[4764]: I0309 13:43:55.772626 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/cf561cd8-b441-4efe-8f37-9c925d1f7aa9-public-tls-certs\") pod \"cf561cd8-b441-4efe-8f37-9c925d1f7aa9\" (UID: \"cf561cd8-b441-4efe-8f37-9c925d1f7aa9\") "
Mar 09 13:43:55 crc kubenswrapper[4764]: I0309 13:43:55.772767 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/cf561cd8-b441-4efe-8f37-9c925d1f7aa9-internal-tls-certs\") pod \"cf561cd8-b441-4efe-8f37-9c925d1f7aa9\" (UID: \"cf561cd8-b441-4efe-8f37-9c925d1f7aa9\") "
Mar 09 13:43:55 crc kubenswrapper[4764]: I0309 13:43:55.772939 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cf561cd8-b441-4efe-8f37-9c925d1f7aa9-config-data\") pod \"cf561cd8-b441-4efe-8f37-9c925d1f7aa9\" (UID: \"cf561cd8-b441-4efe-8f37-9c925d1f7aa9\") "
Mar 09 13:43:55 crc kubenswrapper[4764]: I0309 13:43:55.773442 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cf561cd8-b441-4efe-8f37-9c925d1f7aa9-logs" (OuterVolumeSpecName: "logs") pod "cf561cd8-b441-4efe-8f37-9c925d1f7aa9" (UID: "cf561cd8-b441-4efe-8f37-9c925d1f7aa9"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 09 13:43:55 crc kubenswrapper[4764]: I0309 13:43:55.773604 4764 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cf561cd8-b441-4efe-8f37-9c925d1f7aa9-logs\") on node \"crc\" DevicePath \"\""
Mar 09 13:43:55 crc kubenswrapper[4764]: I0309 13:43:55.780504 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cf561cd8-b441-4efe-8f37-9c925d1f7aa9-kube-api-access-7wznv" (OuterVolumeSpecName: "kube-api-access-7wznv") pod "cf561cd8-b441-4efe-8f37-9c925d1f7aa9" (UID: "cf561cd8-b441-4efe-8f37-9c925d1f7aa9"). InnerVolumeSpecName "kube-api-access-7wznv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 13:43:55 crc kubenswrapper[4764]: I0309 13:43:55.813982 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf561cd8-b441-4efe-8f37-9c925d1f7aa9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cf561cd8-b441-4efe-8f37-9c925d1f7aa9" (UID: "cf561cd8-b441-4efe-8f37-9c925d1f7aa9"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:43:55 crc kubenswrapper[4764]: I0309 13:43:55.838225 4764 generic.go:334] "Generic (PLEG): container finished" podID="cf561cd8-b441-4efe-8f37-9c925d1f7aa9" containerID="8800be21f6e5e98b3173a95394f9eca30f4eb9aa26729be4bf34f42eed9c86fc" exitCode=0 Mar 09 13:43:55 crc kubenswrapper[4764]: I0309 13:43:55.838321 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 09 13:43:55 crc kubenswrapper[4764]: I0309 13:43:55.838387 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"cf561cd8-b441-4efe-8f37-9c925d1f7aa9","Type":"ContainerDied","Data":"8800be21f6e5e98b3173a95394f9eca30f4eb9aa26729be4bf34f42eed9c86fc"} Mar 09 13:43:55 crc kubenswrapper[4764]: I0309 13:43:55.838429 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"cf561cd8-b441-4efe-8f37-9c925d1f7aa9","Type":"ContainerDied","Data":"fa48d3e5b6386cde541f520160503cb71e6ebccb4522e2395f1e89b01ebb551e"} Mar 09 13:43:55 crc kubenswrapper[4764]: I0309 13:43:55.838458 4764 scope.go:117] "RemoveContainer" containerID="8800be21f6e5e98b3173a95394f9eca30f4eb9aa26729be4bf34f42eed9c86fc" Mar 09 13:43:55 crc kubenswrapper[4764]: I0309 13:43:55.859427 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf561cd8-b441-4efe-8f37-9c925d1f7aa9-config-data" (OuterVolumeSpecName: "config-data") pod "cf561cd8-b441-4efe-8f37-9c925d1f7aa9" (UID: "cf561cd8-b441-4efe-8f37-9c925d1f7aa9"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:43:55 crc kubenswrapper[4764]: I0309 13:43:55.864605 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf561cd8-b441-4efe-8f37-9c925d1f7aa9-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "cf561cd8-b441-4efe-8f37-9c925d1f7aa9" (UID: "cf561cd8-b441-4efe-8f37-9c925d1f7aa9"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:43:55 crc kubenswrapper[4764]: I0309 13:43:55.883901 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf561cd8-b441-4efe-8f37-9c925d1f7aa9-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "cf561cd8-b441-4efe-8f37-9c925d1f7aa9" (UID: "cf561cd8-b441-4efe-8f37-9c925d1f7aa9"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:43:55 crc kubenswrapper[4764]: I0309 13:43:55.884234 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/cf561cd8-b441-4efe-8f37-9c925d1f7aa9-public-tls-certs\") pod \"cf561cd8-b441-4efe-8f37-9c925d1f7aa9\" (UID: \"cf561cd8-b441-4efe-8f37-9c925d1f7aa9\") " Mar 09 13:43:55 crc kubenswrapper[4764]: W0309 13:43:55.884798 4764 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/cf561cd8-b441-4efe-8f37-9c925d1f7aa9/volumes/kubernetes.io~secret/public-tls-certs Mar 09 13:43:55 crc kubenswrapper[4764]: I0309 13:43:55.884875 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf561cd8-b441-4efe-8f37-9c925d1f7aa9-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "cf561cd8-b441-4efe-8f37-9c925d1f7aa9" (UID: "cf561cd8-b441-4efe-8f37-9c925d1f7aa9"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:43:55 crc kubenswrapper[4764]: I0309 13:43:55.885147 4764 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf561cd8-b441-4efe-8f37-9c925d1f7aa9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 13:43:55 crc kubenswrapper[4764]: I0309 13:43:55.885173 4764 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/cf561cd8-b441-4efe-8f37-9c925d1f7aa9-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 09 13:43:55 crc kubenswrapper[4764]: I0309 13:43:55.885184 4764 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/cf561cd8-b441-4efe-8f37-9c925d1f7aa9-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 09 13:43:55 crc kubenswrapper[4764]: I0309 13:43:55.885200 4764 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cf561cd8-b441-4efe-8f37-9c925d1f7aa9-config-data\") on node \"crc\" DevicePath \"\"" Mar 09 13:43:55 crc kubenswrapper[4764]: I0309 13:43:55.885212 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7wznv\" (UniqueName: \"kubernetes.io/projected/cf561cd8-b441-4efe-8f37-9c925d1f7aa9-kube-api-access-7wznv\") on node \"crc\" DevicePath \"\"" Mar 09 13:43:55 crc kubenswrapper[4764]: I0309 13:43:55.924149 4764 scope.go:117] "RemoveContainer" containerID="8364f048c74ee582bf3a1b604c7622e3c5a53940323ec1e7e02d222016811233" Mar 09 13:43:55 crc kubenswrapper[4764]: I0309 13:43:55.948209 4764 scope.go:117] "RemoveContainer" containerID="8800be21f6e5e98b3173a95394f9eca30f4eb9aa26729be4bf34f42eed9c86fc" Mar 09 13:43:55 crc kubenswrapper[4764]: E0309 13:43:55.948877 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"8800be21f6e5e98b3173a95394f9eca30f4eb9aa26729be4bf34f42eed9c86fc\": container with ID starting with 8800be21f6e5e98b3173a95394f9eca30f4eb9aa26729be4bf34f42eed9c86fc not found: ID does not exist" containerID="8800be21f6e5e98b3173a95394f9eca30f4eb9aa26729be4bf34f42eed9c86fc" Mar 09 13:43:55 crc kubenswrapper[4764]: I0309 13:43:55.948995 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8800be21f6e5e98b3173a95394f9eca30f4eb9aa26729be4bf34f42eed9c86fc"} err="failed to get container status \"8800be21f6e5e98b3173a95394f9eca30f4eb9aa26729be4bf34f42eed9c86fc\": rpc error: code = NotFound desc = could not find container \"8800be21f6e5e98b3173a95394f9eca30f4eb9aa26729be4bf34f42eed9c86fc\": container with ID starting with 8800be21f6e5e98b3173a95394f9eca30f4eb9aa26729be4bf34f42eed9c86fc not found: ID does not exist" Mar 09 13:43:55 crc kubenswrapper[4764]: I0309 13:43:55.949089 4764 scope.go:117] "RemoveContainer" containerID="8364f048c74ee582bf3a1b604c7622e3c5a53940323ec1e7e02d222016811233" Mar 09 13:43:55 crc kubenswrapper[4764]: E0309 13:43:55.949602 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8364f048c74ee582bf3a1b604c7622e3c5a53940323ec1e7e02d222016811233\": container with ID starting with 8364f048c74ee582bf3a1b604c7622e3c5a53940323ec1e7e02d222016811233 not found: ID does not exist" containerID="8364f048c74ee582bf3a1b604c7622e3c5a53940323ec1e7e02d222016811233" Mar 09 13:43:55 crc kubenswrapper[4764]: I0309 13:43:55.949774 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8364f048c74ee582bf3a1b604c7622e3c5a53940323ec1e7e02d222016811233"} err="failed to get container status \"8364f048c74ee582bf3a1b604c7622e3c5a53940323ec1e7e02d222016811233\": rpc error: code = NotFound desc = could not find container \"8364f048c74ee582bf3a1b604c7622e3c5a53940323ec1e7e02d222016811233\": container with ID 
starting with 8364f048c74ee582bf3a1b604c7622e3c5a53940323ec1e7e02d222016811233 not found: ID does not exist" Mar 09 13:43:56 crc kubenswrapper[4764]: I0309 13:43:56.182724 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 09 13:43:56 crc kubenswrapper[4764]: I0309 13:43:56.191450 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Mar 09 13:43:56 crc kubenswrapper[4764]: I0309 13:43:56.201487 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Mar 09 13:43:56 crc kubenswrapper[4764]: I0309 13:43:56.215025 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Mar 09 13:43:56 crc kubenswrapper[4764]: E0309 13:43:56.215698 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf561cd8-b441-4efe-8f37-9c925d1f7aa9" containerName="nova-api-api" Mar 09 13:43:56 crc kubenswrapper[4764]: I0309 13:43:56.215723 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf561cd8-b441-4efe-8f37-9c925d1f7aa9" containerName="nova-api-api" Mar 09 13:43:56 crc kubenswrapper[4764]: E0309 13:43:56.215773 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf561cd8-b441-4efe-8f37-9c925d1f7aa9" containerName="nova-api-log" Mar 09 13:43:56 crc kubenswrapper[4764]: I0309 13:43:56.215782 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf561cd8-b441-4efe-8f37-9c925d1f7aa9" containerName="nova-api-log" Mar 09 13:43:56 crc kubenswrapper[4764]: I0309 13:43:56.215988 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="cf561cd8-b441-4efe-8f37-9c925d1f7aa9" containerName="nova-api-api" Mar 09 13:43:56 crc kubenswrapper[4764]: I0309 13:43:56.216038 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="cf561cd8-b441-4efe-8f37-9c925d1f7aa9" containerName="nova-api-log" Mar 09 13:43:56 crc kubenswrapper[4764]: I0309 13:43:56.217428 4764 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openstack/nova-api-0" Mar 09 13:43:56 crc kubenswrapper[4764]: I0309 13:43:56.235292 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Mar 09 13:43:56 crc kubenswrapper[4764]: I0309 13:43:56.236542 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Mar 09 13:43:56 crc kubenswrapper[4764]: I0309 13:43:56.238560 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Mar 09 13:43:56 crc kubenswrapper[4764]: I0309 13:43:56.276932 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 09 13:43:56 crc kubenswrapper[4764]: I0309 13:43:56.292935 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ec790643-05dd-4f21-82f8-ad1586087d85-logs\") pod \"nova-api-0\" (UID: \"ec790643-05dd-4f21-82f8-ad1586087d85\") " pod="openstack/nova-api-0" Mar 09 13:43:56 crc kubenswrapper[4764]: I0309 13:43:56.293327 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ec790643-05dd-4f21-82f8-ad1586087d85-public-tls-certs\") pod \"nova-api-0\" (UID: \"ec790643-05dd-4f21-82f8-ad1586087d85\") " pod="openstack/nova-api-0" Mar 09 13:43:56 crc kubenswrapper[4764]: I0309 13:43:56.293450 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ec790643-05dd-4f21-82f8-ad1586087d85-internal-tls-certs\") pod \"nova-api-0\" (UID: \"ec790643-05dd-4f21-82f8-ad1586087d85\") " pod="openstack/nova-api-0" Mar 09 13:43:56 crc kubenswrapper[4764]: I0309 13:43:56.293565 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/ec790643-05dd-4f21-82f8-ad1586087d85-config-data\") pod \"nova-api-0\" (UID: \"ec790643-05dd-4f21-82f8-ad1586087d85\") " pod="openstack/nova-api-0" Mar 09 13:43:56 crc kubenswrapper[4764]: I0309 13:43:56.293708 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec790643-05dd-4f21-82f8-ad1586087d85-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"ec790643-05dd-4f21-82f8-ad1586087d85\") " pod="openstack/nova-api-0" Mar 09 13:43:56 crc kubenswrapper[4764]: I0309 13:43:56.293826 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2r27m\" (UniqueName: \"kubernetes.io/projected/ec790643-05dd-4f21-82f8-ad1586087d85-kube-api-access-2r27m\") pod \"nova-api-0\" (UID: \"ec790643-05dd-4f21-82f8-ad1586087d85\") " pod="openstack/nova-api-0" Mar 09 13:43:56 crc kubenswrapper[4764]: I0309 13:43:56.396135 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ec790643-05dd-4f21-82f8-ad1586087d85-logs\") pod \"nova-api-0\" (UID: \"ec790643-05dd-4f21-82f8-ad1586087d85\") " pod="openstack/nova-api-0" Mar 09 13:43:56 crc kubenswrapper[4764]: I0309 13:43:56.396704 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ec790643-05dd-4f21-82f8-ad1586087d85-public-tls-certs\") pod \"nova-api-0\" (UID: \"ec790643-05dd-4f21-82f8-ad1586087d85\") " pod="openstack/nova-api-0" Mar 09 13:43:56 crc kubenswrapper[4764]: I0309 13:43:56.396759 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ec790643-05dd-4f21-82f8-ad1586087d85-internal-tls-certs\") pod \"nova-api-0\" (UID: \"ec790643-05dd-4f21-82f8-ad1586087d85\") " pod="openstack/nova-api-0" Mar 09 13:43:56 crc 
kubenswrapper[4764]: I0309 13:43:56.396779 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ec790643-05dd-4f21-82f8-ad1586087d85-config-data\") pod \"nova-api-0\" (UID: \"ec790643-05dd-4f21-82f8-ad1586087d85\") " pod="openstack/nova-api-0" Mar 09 13:43:56 crc kubenswrapper[4764]: I0309 13:43:56.396828 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec790643-05dd-4f21-82f8-ad1586087d85-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"ec790643-05dd-4f21-82f8-ad1586087d85\") " pod="openstack/nova-api-0" Mar 09 13:43:56 crc kubenswrapper[4764]: I0309 13:43:56.396858 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2r27m\" (UniqueName: \"kubernetes.io/projected/ec790643-05dd-4f21-82f8-ad1586087d85-kube-api-access-2r27m\") pod \"nova-api-0\" (UID: \"ec790643-05dd-4f21-82f8-ad1586087d85\") " pod="openstack/nova-api-0" Mar 09 13:43:56 crc kubenswrapper[4764]: I0309 13:43:56.397136 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ec790643-05dd-4f21-82f8-ad1586087d85-logs\") pod \"nova-api-0\" (UID: \"ec790643-05dd-4f21-82f8-ad1586087d85\") " pod="openstack/nova-api-0" Mar 09 13:43:56 crc kubenswrapper[4764]: I0309 13:43:56.405153 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ec790643-05dd-4f21-82f8-ad1586087d85-config-data\") pod \"nova-api-0\" (UID: \"ec790643-05dd-4f21-82f8-ad1586087d85\") " pod="openstack/nova-api-0" Mar 09 13:43:56 crc kubenswrapper[4764]: I0309 13:43:56.405951 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ec790643-05dd-4f21-82f8-ad1586087d85-public-tls-certs\") pod \"nova-api-0\" (UID: 
\"ec790643-05dd-4f21-82f8-ad1586087d85\") " pod="openstack/nova-api-0" Mar 09 13:43:56 crc kubenswrapper[4764]: I0309 13:43:56.406104 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec790643-05dd-4f21-82f8-ad1586087d85-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"ec790643-05dd-4f21-82f8-ad1586087d85\") " pod="openstack/nova-api-0" Mar 09 13:43:56 crc kubenswrapper[4764]: I0309 13:43:56.406445 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ec790643-05dd-4f21-82f8-ad1586087d85-internal-tls-certs\") pod \"nova-api-0\" (UID: \"ec790643-05dd-4f21-82f8-ad1586087d85\") " pod="openstack/nova-api-0" Mar 09 13:43:56 crc kubenswrapper[4764]: I0309 13:43:56.413284 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2r27m\" (UniqueName: \"kubernetes.io/projected/ec790643-05dd-4f21-82f8-ad1586087d85-kube-api-access-2r27m\") pod \"nova-api-0\" (UID: \"ec790643-05dd-4f21-82f8-ad1586087d85\") " pod="openstack/nova-api-0" Mar 09 13:43:56 crc kubenswrapper[4764]: I0309 13:43:56.544385 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 09 13:43:57 crc kubenswrapper[4764]: I0309 13:43:57.019926 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 09 13:43:57 crc kubenswrapper[4764]: W0309 13:43:57.025422 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podec790643_05dd_4f21_82f8_ad1586087d85.slice/crio-e5febb85c2e4ff2e87ece06adfc53fc61e86aa12892d3775be37ecfec0c5e2b3 WatchSource:0}: Error finding container e5febb85c2e4ff2e87ece06adfc53fc61e86aa12892d3775be37ecfec0c5e2b3: Status 404 returned error can't find the container with id e5febb85c2e4ff2e87ece06adfc53fc61e86aa12892d3775be37ecfec0c5e2b3 Mar 09 13:43:57 crc kubenswrapper[4764]: I0309 13:43:57.572973 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cf561cd8-b441-4efe-8f37-9c925d1f7aa9" path="/var/lib/kubelet/pods/cf561cd8-b441-4efe-8f37-9c925d1f7aa9/volumes" Mar 09 13:43:57 crc kubenswrapper[4764]: I0309 13:43:57.923418 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ec790643-05dd-4f21-82f8-ad1586087d85","Type":"ContainerStarted","Data":"10414c7ac610cd41780af7da76309dda5c434bf22ccc7520ea9f6e1988bd0034"} Mar 09 13:43:57 crc kubenswrapper[4764]: I0309 13:43:57.923993 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ec790643-05dd-4f21-82f8-ad1586087d85","Type":"ContainerStarted","Data":"882c8c96a125a289675d95cbe0f9722a96ed40479aae5418f155662006868bf0"} Mar 09 13:43:57 crc kubenswrapper[4764]: I0309 13:43:57.924017 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ec790643-05dd-4f21-82f8-ad1586087d85","Type":"ContainerStarted","Data":"e5febb85c2e4ff2e87ece06adfc53fc61e86aa12892d3775be37ecfec0c5e2b3"} Mar 09 13:43:57 crc kubenswrapper[4764]: I0309 13:43:57.959281 4764 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="openstack/nova-api-0" podStartSLOduration=1.959243123 podStartE2EDuration="1.959243123s" podCreationTimestamp="2026-03-09 13:43:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:43:57.95051862 +0000 UTC m=+1393.200690538" watchObservedRunningTime="2026-03-09 13:43:57.959243123 +0000 UTC m=+1393.209415051" Mar 09 13:43:58 crc kubenswrapper[4764]: I0309 13:43:58.239991 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 09 13:43:58 crc kubenswrapper[4764]: I0309 13:43:58.240123 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 09 13:43:58 crc kubenswrapper[4764]: I0309 13:43:58.370852 4764 patch_prober.go:28] interesting pod/machine-config-daemon-xxczl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 09 13:43:58 crc kubenswrapper[4764]: I0309 13:43:58.370954 4764 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 09 13:44:00 crc kubenswrapper[4764]: I0309 13:44:00.145934 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29551064-mbrph"] Mar 09 13:44:00 crc kubenswrapper[4764]: I0309 13:44:00.147896 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551064-mbrph" Mar 09 13:44:00 crc kubenswrapper[4764]: I0309 13:44:00.149887 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 09 13:44:00 crc kubenswrapper[4764]: I0309 13:44:00.150882 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-4s7mr" Mar 09 13:44:00 crc kubenswrapper[4764]: I0309 13:44:00.151092 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 09 13:44:00 crc kubenswrapper[4764]: I0309 13:44:00.160139 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551064-mbrph"] Mar 09 13:44:00 crc kubenswrapper[4764]: I0309 13:44:00.290403 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kzmqh\" (UniqueName: \"kubernetes.io/projected/77179ff3-861b-4aab-b1b2-db4d12041264-kube-api-access-kzmqh\") pod \"auto-csr-approver-29551064-mbrph\" (UID: \"77179ff3-861b-4aab-b1b2-db4d12041264\") " pod="openshift-infra/auto-csr-approver-29551064-mbrph" Mar 09 13:44:00 crc kubenswrapper[4764]: I0309 13:44:00.393094 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kzmqh\" (UniqueName: \"kubernetes.io/projected/77179ff3-861b-4aab-b1b2-db4d12041264-kube-api-access-kzmqh\") pod \"auto-csr-approver-29551064-mbrph\" (UID: \"77179ff3-861b-4aab-b1b2-db4d12041264\") " pod="openshift-infra/auto-csr-approver-29551064-mbrph" Mar 09 13:44:00 crc kubenswrapper[4764]: I0309 13:44:00.414441 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kzmqh\" (UniqueName: \"kubernetes.io/projected/77179ff3-861b-4aab-b1b2-db4d12041264-kube-api-access-kzmqh\") pod \"auto-csr-approver-29551064-mbrph\" (UID: \"77179ff3-861b-4aab-b1b2-db4d12041264\") " 
pod="openshift-infra/auto-csr-approver-29551064-mbrph" Mar 09 13:44:00 crc kubenswrapper[4764]: I0309 13:44:00.475717 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551064-mbrph" Mar 09 13:44:00 crc kubenswrapper[4764]: I0309 13:44:00.960632 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551064-mbrph"] Mar 09 13:44:00 crc kubenswrapper[4764]: W0309 13:44:00.970485 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod77179ff3_861b_4aab_b1b2_db4d12041264.slice/crio-b59e2cc9c2be801b32ba93c61672b061be7f18045f91f26eaaff86ba50201fb0 WatchSource:0}: Error finding container b59e2cc9c2be801b32ba93c61672b061be7f18045f91f26eaaff86ba50201fb0: Status 404 returned error can't find the container with id b59e2cc9c2be801b32ba93c61672b061be7f18045f91f26eaaff86ba50201fb0 Mar 09 13:44:01 crc kubenswrapper[4764]: I0309 13:44:01.191374 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Mar 09 13:44:01 crc kubenswrapper[4764]: I0309 13:44:01.226363 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Mar 09 13:44:01 crc kubenswrapper[4764]: I0309 13:44:01.973300 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551064-mbrph" event={"ID":"77179ff3-861b-4aab-b1b2-db4d12041264","Type":"ContainerStarted","Data":"b59e2cc9c2be801b32ba93c61672b061be7f18045f91f26eaaff86ba50201fb0"} Mar 09 13:44:02 crc kubenswrapper[4764]: I0309 13:44:02.001749 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Mar 09 13:44:02 crc kubenswrapper[4764]: I0309 13:44:02.985574 4764 generic.go:334] "Generic (PLEG): container finished" podID="77179ff3-861b-4aab-b1b2-db4d12041264" 
containerID="6b1dbed6bf6e61f7c12ad4f9dbf8d714fbaa8de7054f1b548bd8d7d1b560e4d6" exitCode=0 Mar 09 13:44:02 crc kubenswrapper[4764]: I0309 13:44:02.985694 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551064-mbrph" event={"ID":"77179ff3-861b-4aab-b1b2-db4d12041264","Type":"ContainerDied","Data":"6b1dbed6bf6e61f7c12ad4f9dbf8d714fbaa8de7054f1b548bd8d7d1b560e4d6"} Mar 09 13:44:03 crc kubenswrapper[4764]: I0309 13:44:03.240055 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 09 13:44:03 crc kubenswrapper[4764]: I0309 13:44:03.240488 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 09 13:44:04 crc kubenswrapper[4764]: I0309 13:44:04.252857 4764 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="d9226790-b0dc-460b-8c06-127effde8c19" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.195:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 09 13:44:04 crc kubenswrapper[4764]: I0309 13:44:04.252857 4764 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="d9226790-b0dc-460b-8c06-127effde8c19" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.195:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 09 13:44:04 crc kubenswrapper[4764]: I0309 13:44:04.363739 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551064-mbrph" Mar 09 13:44:04 crc kubenswrapper[4764]: I0309 13:44:04.497472 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kzmqh\" (UniqueName: \"kubernetes.io/projected/77179ff3-861b-4aab-b1b2-db4d12041264-kube-api-access-kzmqh\") pod \"77179ff3-861b-4aab-b1b2-db4d12041264\" (UID: \"77179ff3-861b-4aab-b1b2-db4d12041264\") " Mar 09 13:44:04 crc kubenswrapper[4764]: I0309 13:44:04.511134 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/77179ff3-861b-4aab-b1b2-db4d12041264-kube-api-access-kzmqh" (OuterVolumeSpecName: "kube-api-access-kzmqh") pod "77179ff3-861b-4aab-b1b2-db4d12041264" (UID: "77179ff3-861b-4aab-b1b2-db4d12041264"). InnerVolumeSpecName "kube-api-access-kzmqh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:44:04 crc kubenswrapper[4764]: I0309 13:44:04.601253 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kzmqh\" (UniqueName: \"kubernetes.io/projected/77179ff3-861b-4aab-b1b2-db4d12041264-kube-api-access-kzmqh\") on node \"crc\" DevicePath \"\"" Mar 09 13:44:05 crc kubenswrapper[4764]: I0309 13:44:05.010772 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551064-mbrph" event={"ID":"77179ff3-861b-4aab-b1b2-db4d12041264","Type":"ContainerDied","Data":"b59e2cc9c2be801b32ba93c61672b061be7f18045f91f26eaaff86ba50201fb0"} Mar 09 13:44:05 crc kubenswrapper[4764]: I0309 13:44:05.010831 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b59e2cc9c2be801b32ba93c61672b061be7f18045f91f26eaaff86ba50201fb0" Mar 09 13:44:05 crc kubenswrapper[4764]: I0309 13:44:05.010915 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551064-mbrph" Mar 09 13:44:05 crc kubenswrapper[4764]: I0309 13:44:05.451465 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29551058-6mlbf"] Mar 09 13:44:05 crc kubenswrapper[4764]: I0309 13:44:05.461489 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29551058-6mlbf"] Mar 09 13:44:05 crc kubenswrapper[4764]: I0309 13:44:05.582234 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="175910d6-eb27-4000-ac8b-9ea49f05bb8b" path="/var/lib/kubelet/pods/175910d6-eb27-4000-ac8b-9ea49f05bb8b/volumes" Mar 09 13:44:06 crc kubenswrapper[4764]: I0309 13:44:06.545123 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 09 13:44:06 crc kubenswrapper[4764]: I0309 13:44:06.545203 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 09 13:44:07 crc kubenswrapper[4764]: I0309 13:44:07.552057 4764 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="ec790643-05dd-4f21-82f8-ad1586087d85" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.196:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 09 13:44:07 crc kubenswrapper[4764]: I0309 13:44:07.560044 4764 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="ec790643-05dd-4f21-82f8-ad1586087d85" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.196:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 09 13:44:13 crc kubenswrapper[4764]: I0309 13:44:13.247550 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Mar 09 13:44:13 crc kubenswrapper[4764]: I0309 13:44:13.249243 4764 kubelet.go:2542] "SyncLoop 
(probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Mar 09 13:44:13 crc kubenswrapper[4764]: I0309 13:44:13.259363 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Mar 09 13:44:14 crc kubenswrapper[4764]: I0309 13:44:14.107878 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Mar 09 13:44:15 crc kubenswrapper[4764]: I0309 13:44:15.287077 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Mar 09 13:44:16 crc kubenswrapper[4764]: I0309 13:44:16.553263 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Mar 09 13:44:16 crc kubenswrapper[4764]: I0309 13:44:16.553700 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Mar 09 13:44:16 crc kubenswrapper[4764]: I0309 13:44:16.554127 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 09 13:44:16 crc kubenswrapper[4764]: I0309 13:44:16.554150 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 09 13:44:16 crc kubenswrapper[4764]: I0309 13:44:16.561267 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Mar 09 13:44:16 crc kubenswrapper[4764]: I0309 13:44:16.561917 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Mar 09 13:44:24 crc kubenswrapper[4764]: I0309 13:44:24.015990 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 09 13:44:24 crc kubenswrapper[4764]: I0309 13:44:24.939921 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 09 13:44:28 crc kubenswrapper[4764]: I0309 13:44:28.370332 4764 patch_prober.go:28] interesting pod/machine-config-daemon-xxczl 
container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 09 13:44:28 crc kubenswrapper[4764]: I0309 13:44:28.371331 4764 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 09 13:44:28 crc kubenswrapper[4764]: I0309 13:44:28.371422 4764 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" Mar 09 13:44:28 crc kubenswrapper[4764]: I0309 13:44:28.372694 4764 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"69810f80271be58962525ba5a5a37ce68d5e172f7c253c440a5a45f3f3beab77"} pod="openshift-machine-config-operator/machine-config-daemon-xxczl" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 09 13:44:28 crc kubenswrapper[4764]: I0309 13:44:28.372764 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922" containerName="machine-config-daemon" containerID="cri-o://69810f80271be58962525ba5a5a37ce68d5e172f7c253c440a5a45f3f3beab77" gracePeriod=600 Mar 09 13:44:29 crc kubenswrapper[4764]: I0309 13:44:29.270955 4764 generic.go:334] "Generic (PLEG): container finished" podID="6bcdd179-43c2-427c-9fac-7155c122e922" containerID="69810f80271be58962525ba5a5a37ce68d5e172f7c253c440a5a45f3f3beab77" exitCode=0 Mar 09 13:44:29 crc kubenswrapper[4764]: I0309 13:44:29.271054 4764 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" event={"ID":"6bcdd179-43c2-427c-9fac-7155c122e922","Type":"ContainerDied","Data":"69810f80271be58962525ba5a5a37ce68d5e172f7c253c440a5a45f3f3beab77"} Mar 09 13:44:29 crc kubenswrapper[4764]: I0309 13:44:29.271508 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" event={"ID":"6bcdd179-43c2-427c-9fac-7155c122e922","Type":"ContainerStarted","Data":"a87b87af769425372dc2009201df4d5e0d5a91fd68dbae94c9002e0b7115dafb"} Mar 09 13:44:29 crc kubenswrapper[4764]: I0309 13:44:29.271545 4764 scope.go:117] "RemoveContainer" containerID="089fede4f6de3963d0297d0b784543ed7a9d38cdefe66f6bbb9aab989dbffbb0" Mar 09 13:44:29 crc kubenswrapper[4764]: I0309 13:44:29.672306 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="507bcef1-e9ef-4eb1-85ce-358209b944bc" containerName="rabbitmq" containerID="cri-o://bed2e6d72dc23d14fbbd197ac01731271bf14e49fb44cf788ecc2cd8bf1b2828" gracePeriod=604795 Mar 09 13:44:29 crc kubenswrapper[4764]: I0309 13:44:29.795792 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="11c8bf9f-a031-4e56-b1d7-49b407eabaf7" containerName="rabbitmq" containerID="cri-o://c051454f6de8558a84ceb371da4edc9fd851c2e829bf18eb804c0c4e657ef0ee" gracePeriod=604796 Mar 09 13:44:34 crc kubenswrapper[4764]: I0309 13:44:34.829402 4764 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-cell1-server-0" podUID="11c8bf9f-a031-4e56-b1d7-49b407eabaf7" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.101:5671: connect: connection refused" Mar 09 13:44:34 crc kubenswrapper[4764]: I0309 13:44:34.912708 4764 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" podUID="507bcef1-e9ef-4eb1-85ce-358209b944bc" 
containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.102:5671: connect: connection refused" Mar 09 13:44:36 crc kubenswrapper[4764]: I0309 13:44:36.286078 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 09 13:44:36 crc kubenswrapper[4764]: I0309 13:44:36.353249 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/507bcef1-e9ef-4eb1-85ce-358209b944bc-pod-info\") pod \"507bcef1-e9ef-4eb1-85ce-358209b944bc\" (UID: \"507bcef1-e9ef-4eb1-85ce-358209b944bc\") " Mar 09 13:44:36 crc kubenswrapper[4764]: I0309 13:44:36.353350 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w75z\" (UniqueName: \"kubernetes.io/projected/507bcef1-e9ef-4eb1-85ce-358209b944bc-kube-api-access-2w75z\") pod \"507bcef1-e9ef-4eb1-85ce-358209b944bc\" (UID: \"507bcef1-e9ef-4eb1-85ce-358209b944bc\") " Mar 09 13:44:36 crc kubenswrapper[4764]: I0309 13:44:36.353391 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/507bcef1-e9ef-4eb1-85ce-358209b944bc-rabbitmq-tls\") pod \"507bcef1-e9ef-4eb1-85ce-358209b944bc\" (UID: \"507bcef1-e9ef-4eb1-85ce-358209b944bc\") " Mar 09 13:44:36 crc kubenswrapper[4764]: I0309 13:44:36.353471 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/507bcef1-e9ef-4eb1-85ce-358209b944bc-config-data\") pod \"507bcef1-e9ef-4eb1-85ce-358209b944bc\" (UID: \"507bcef1-e9ef-4eb1-85ce-358209b944bc\") " Mar 09 13:44:36 crc kubenswrapper[4764]: I0309 13:44:36.353509 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/507bcef1-e9ef-4eb1-85ce-358209b944bc-rabbitmq-erlang-cookie\") pod 
\"507bcef1-e9ef-4eb1-85ce-358209b944bc\" (UID: \"507bcef1-e9ef-4eb1-85ce-358209b944bc\") " Mar 09 13:44:36 crc kubenswrapper[4764]: I0309 13:44:36.353569 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/507bcef1-e9ef-4eb1-85ce-358209b944bc-server-conf\") pod \"507bcef1-e9ef-4eb1-85ce-358209b944bc\" (UID: \"507bcef1-e9ef-4eb1-85ce-358209b944bc\") " Mar 09 13:44:36 crc kubenswrapper[4764]: I0309 13:44:36.353627 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/507bcef1-e9ef-4eb1-85ce-358209b944bc-rabbitmq-plugins\") pod \"507bcef1-e9ef-4eb1-85ce-358209b944bc\" (UID: \"507bcef1-e9ef-4eb1-85ce-358209b944bc\") " Mar 09 13:44:36 crc kubenswrapper[4764]: I0309 13:44:36.353754 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"507bcef1-e9ef-4eb1-85ce-358209b944bc\" (UID: \"507bcef1-e9ef-4eb1-85ce-358209b944bc\") " Mar 09 13:44:36 crc kubenswrapper[4764]: I0309 13:44:36.353783 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/507bcef1-e9ef-4eb1-85ce-358209b944bc-plugins-conf\") pod \"507bcef1-e9ef-4eb1-85ce-358209b944bc\" (UID: \"507bcef1-e9ef-4eb1-85ce-358209b944bc\") " Mar 09 13:44:36 crc kubenswrapper[4764]: I0309 13:44:36.353871 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/507bcef1-e9ef-4eb1-85ce-358209b944bc-rabbitmq-confd\") pod \"507bcef1-e9ef-4eb1-85ce-358209b944bc\" (UID: \"507bcef1-e9ef-4eb1-85ce-358209b944bc\") " Mar 09 13:44:36 crc kubenswrapper[4764]: I0309 13:44:36.353921 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" 
(UniqueName: \"kubernetes.io/secret/507bcef1-e9ef-4eb1-85ce-358209b944bc-erlang-cookie-secret\") pod \"507bcef1-e9ef-4eb1-85ce-358209b944bc\" (UID: \"507bcef1-e9ef-4eb1-85ce-358209b944bc\") " Mar 09 13:44:36 crc kubenswrapper[4764]: I0309 13:44:36.354419 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/507bcef1-e9ef-4eb1-85ce-358209b944bc-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "507bcef1-e9ef-4eb1-85ce-358209b944bc" (UID: "507bcef1-e9ef-4eb1-85ce-358209b944bc"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 13:44:36 crc kubenswrapper[4764]: I0309 13:44:36.354891 4764 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/507bcef1-e9ef-4eb1-85ce-358209b944bc-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Mar 09 13:44:36 crc kubenswrapper[4764]: I0309 13:44:36.355541 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/507bcef1-e9ef-4eb1-85ce-358209b944bc-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "507bcef1-e9ef-4eb1-85ce-358209b944bc" (UID: "507bcef1-e9ef-4eb1-85ce-358209b944bc"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:44:36 crc kubenswrapper[4764]: I0309 13:44:36.355703 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/507bcef1-e9ef-4eb1-85ce-358209b944bc-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "507bcef1-e9ef-4eb1-85ce-358209b944bc" (UID: "507bcef1-e9ef-4eb1-85ce-358209b944bc"). InnerVolumeSpecName "rabbitmq-plugins". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 13:44:36 crc kubenswrapper[4764]: I0309 13:44:36.363715 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/507bcef1-e9ef-4eb1-85ce-358209b944bc-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "507bcef1-e9ef-4eb1-85ce-358209b944bc" (UID: "507bcef1-e9ef-4eb1-85ce-358209b944bc"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:44:36 crc kubenswrapper[4764]: I0309 13:44:36.366229 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/507bcef1-e9ef-4eb1-85ce-358209b944bc-pod-info" (OuterVolumeSpecName: "pod-info") pod "507bcef1-e9ef-4eb1-85ce-358209b944bc" (UID: "507bcef1-e9ef-4eb1-85ce-358209b944bc"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Mar 09 13:44:36 crc kubenswrapper[4764]: I0309 13:44:36.367123 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/507bcef1-e9ef-4eb1-85ce-358209b944bc-kube-api-access-2w75z" (OuterVolumeSpecName: "kube-api-access-2w75z") pod "507bcef1-e9ef-4eb1-85ce-358209b944bc" (UID: "507bcef1-e9ef-4eb1-85ce-358209b944bc"). InnerVolumeSpecName "kube-api-access-2w75z". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:44:36 crc kubenswrapper[4764]: I0309 13:44:36.366049 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage04-crc" (OuterVolumeSpecName: "persistence") pod "507bcef1-e9ef-4eb1-85ce-358209b944bc" (UID: "507bcef1-e9ef-4eb1-85ce-358209b944bc"). InnerVolumeSpecName "local-storage04-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 09 13:44:36 crc kubenswrapper[4764]: I0309 13:44:36.373425 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/507bcef1-e9ef-4eb1-85ce-358209b944bc-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "507bcef1-e9ef-4eb1-85ce-358209b944bc" (UID: "507bcef1-e9ef-4eb1-85ce-358209b944bc"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:44:36 crc kubenswrapper[4764]: I0309 13:44:36.381256 4764 generic.go:334] "Generic (PLEG): container finished" podID="507bcef1-e9ef-4eb1-85ce-358209b944bc" containerID="bed2e6d72dc23d14fbbd197ac01731271bf14e49fb44cf788ecc2cd8bf1b2828" exitCode=0 Mar 09 13:44:36 crc kubenswrapper[4764]: I0309 13:44:36.383697 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"507bcef1-e9ef-4eb1-85ce-358209b944bc","Type":"ContainerDied","Data":"bed2e6d72dc23d14fbbd197ac01731271bf14e49fb44cf788ecc2cd8bf1b2828"} Mar 09 13:44:36 crc kubenswrapper[4764]: I0309 13:44:36.383992 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"507bcef1-e9ef-4eb1-85ce-358209b944bc","Type":"ContainerDied","Data":"ee9809e2cf751402688e9f6828a75759ba83ac17c29d13b65aa1aa2a2afdc207"} Mar 09 13:44:36 crc kubenswrapper[4764]: I0309 13:44:36.384190 4764 scope.go:117] "RemoveContainer" containerID="bed2e6d72dc23d14fbbd197ac01731271bf14e49fb44cf788ecc2cd8bf1b2828" Mar 09 13:44:36 crc kubenswrapper[4764]: I0309 13:44:36.384700 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 09 13:44:36 crc kubenswrapper[4764]: I0309 13:44:36.390964 4764 generic.go:334] "Generic (PLEG): container finished" podID="11c8bf9f-a031-4e56-b1d7-49b407eabaf7" containerID="c051454f6de8558a84ceb371da4edc9fd851c2e829bf18eb804c0c4e657ef0ee" exitCode=0 Mar 09 13:44:36 crc kubenswrapper[4764]: I0309 13:44:36.391029 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"11c8bf9f-a031-4e56-b1d7-49b407eabaf7","Type":"ContainerDied","Data":"c051454f6de8558a84ceb371da4edc9fd851c2e829bf18eb804c0c4e657ef0ee"} Mar 09 13:44:36 crc kubenswrapper[4764]: I0309 13:44:36.408366 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/507bcef1-e9ef-4eb1-85ce-358209b944bc-config-data" (OuterVolumeSpecName: "config-data") pod "507bcef1-e9ef-4eb1-85ce-358209b944bc" (UID: "507bcef1-e9ef-4eb1-85ce-358209b944bc"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:44:36 crc kubenswrapper[4764]: I0309 13:44:36.466179 4764 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/507bcef1-e9ef-4eb1-85ce-358209b944bc-pod-info\") on node \"crc\" DevicePath \"\"" Mar 09 13:44:36 crc kubenswrapper[4764]: I0309 13:44:36.466229 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w75z\" (UniqueName: \"kubernetes.io/projected/507bcef1-e9ef-4eb1-85ce-358209b944bc-kube-api-access-2w75z\") on node \"crc\" DevicePath \"\"" Mar 09 13:44:36 crc kubenswrapper[4764]: I0309 13:44:36.466244 4764 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/507bcef1-e9ef-4eb1-85ce-358209b944bc-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Mar 09 13:44:36 crc kubenswrapper[4764]: I0309 13:44:36.466257 4764 reconciler_common.go:293] "Volume detached for volume \"config-data\" 
(UniqueName: \"kubernetes.io/configmap/507bcef1-e9ef-4eb1-85ce-358209b944bc-config-data\") on node \"crc\" DevicePath \"\"" Mar 09 13:44:36 crc kubenswrapper[4764]: I0309 13:44:36.466272 4764 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/507bcef1-e9ef-4eb1-85ce-358209b944bc-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Mar 09 13:44:36 crc kubenswrapper[4764]: I0309 13:44:36.466303 4764 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" " Mar 09 13:44:36 crc kubenswrapper[4764]: I0309 13:44:36.466311 4764 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/507bcef1-e9ef-4eb1-85ce-358209b944bc-plugins-conf\") on node \"crc\" DevicePath \"\"" Mar 09 13:44:36 crc kubenswrapper[4764]: I0309 13:44:36.466321 4764 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/507bcef1-e9ef-4eb1-85ce-358209b944bc-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Mar 09 13:44:36 crc kubenswrapper[4764]: I0309 13:44:36.475261 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 09 13:44:36 crc kubenswrapper[4764]: I0309 13:44:36.475992 4764 scope.go:117] "RemoveContainer" containerID="f2c108c98ecfd46a140cf2428c8bd7a9e4bb88eab84b2da0f3783d087eb2e601" Mar 09 13:44:36 crc kubenswrapper[4764]: I0309 13:44:36.515611 4764 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage04-crc" (UniqueName: "kubernetes.io/local-volume/local-storage04-crc") on node "crc" Mar 09 13:44:36 crc kubenswrapper[4764]: I0309 13:44:36.522626 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/507bcef1-e9ef-4eb1-85ce-358209b944bc-server-conf" (OuterVolumeSpecName: "server-conf") pod "507bcef1-e9ef-4eb1-85ce-358209b944bc" (UID: "507bcef1-e9ef-4eb1-85ce-358209b944bc"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:44:36 crc kubenswrapper[4764]: I0309 13:44:36.537699 4764 scope.go:117] "RemoveContainer" containerID="bed2e6d72dc23d14fbbd197ac01731271bf14e49fb44cf788ecc2cd8bf1b2828" Mar 09 13:44:36 crc kubenswrapper[4764]: E0309 13:44:36.543161 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bed2e6d72dc23d14fbbd197ac01731271bf14e49fb44cf788ecc2cd8bf1b2828\": container with ID starting with bed2e6d72dc23d14fbbd197ac01731271bf14e49fb44cf788ecc2cd8bf1b2828 not found: ID does not exist" containerID="bed2e6d72dc23d14fbbd197ac01731271bf14e49fb44cf788ecc2cd8bf1b2828" Mar 09 13:44:36 crc kubenswrapper[4764]: I0309 13:44:36.543231 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bed2e6d72dc23d14fbbd197ac01731271bf14e49fb44cf788ecc2cd8bf1b2828"} err="failed to get container status \"bed2e6d72dc23d14fbbd197ac01731271bf14e49fb44cf788ecc2cd8bf1b2828\": rpc error: code = NotFound desc = could not find container 
\"bed2e6d72dc23d14fbbd197ac01731271bf14e49fb44cf788ecc2cd8bf1b2828\": container with ID starting with bed2e6d72dc23d14fbbd197ac01731271bf14e49fb44cf788ecc2cd8bf1b2828 not found: ID does not exist" Mar 09 13:44:36 crc kubenswrapper[4764]: I0309 13:44:36.543265 4764 scope.go:117] "RemoveContainer" containerID="f2c108c98ecfd46a140cf2428c8bd7a9e4bb88eab84b2da0f3783d087eb2e601" Mar 09 13:44:36 crc kubenswrapper[4764]: E0309 13:44:36.551379 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f2c108c98ecfd46a140cf2428c8bd7a9e4bb88eab84b2da0f3783d087eb2e601\": container with ID starting with f2c108c98ecfd46a140cf2428c8bd7a9e4bb88eab84b2da0f3783d087eb2e601 not found: ID does not exist" containerID="f2c108c98ecfd46a140cf2428c8bd7a9e4bb88eab84b2da0f3783d087eb2e601" Mar 09 13:44:36 crc kubenswrapper[4764]: I0309 13:44:36.551470 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f2c108c98ecfd46a140cf2428c8bd7a9e4bb88eab84b2da0f3783d087eb2e601"} err="failed to get container status \"f2c108c98ecfd46a140cf2428c8bd7a9e4bb88eab84b2da0f3783d087eb2e601\": rpc error: code = NotFound desc = could not find container \"f2c108c98ecfd46a140cf2428c8bd7a9e4bb88eab84b2da0f3783d087eb2e601\": container with ID starting with f2c108c98ecfd46a140cf2428c8bd7a9e4bb88eab84b2da0f3783d087eb2e601 not found: ID does not exist" Mar 09 13:44:36 crc kubenswrapper[4764]: I0309 13:44:36.567833 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"11c8bf9f-a031-4e56-b1d7-49b407eabaf7\" (UID: \"11c8bf9f-a031-4e56-b1d7-49b407eabaf7\") " Mar 09 13:44:36 crc kubenswrapper[4764]: I0309 13:44:36.568004 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: 
\"kubernetes.io/empty-dir/11c8bf9f-a031-4e56-b1d7-49b407eabaf7-rabbitmq-plugins\") pod \"11c8bf9f-a031-4e56-b1d7-49b407eabaf7\" (UID: \"11c8bf9f-a031-4e56-b1d7-49b407eabaf7\") " Mar 09 13:44:36 crc kubenswrapper[4764]: I0309 13:44:36.568088 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/11c8bf9f-a031-4e56-b1d7-49b407eabaf7-rabbitmq-erlang-cookie\") pod \"11c8bf9f-a031-4e56-b1d7-49b407eabaf7\" (UID: \"11c8bf9f-a031-4e56-b1d7-49b407eabaf7\") " Mar 09 13:44:36 crc kubenswrapper[4764]: I0309 13:44:36.568170 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/11c8bf9f-a031-4e56-b1d7-49b407eabaf7-rabbitmq-tls\") pod \"11c8bf9f-a031-4e56-b1d7-49b407eabaf7\" (UID: \"11c8bf9f-a031-4e56-b1d7-49b407eabaf7\") " Mar 09 13:44:36 crc kubenswrapper[4764]: I0309 13:44:36.568271 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/11c8bf9f-a031-4e56-b1d7-49b407eabaf7-rabbitmq-confd\") pod \"11c8bf9f-a031-4e56-b1d7-49b407eabaf7\" (UID: \"11c8bf9f-a031-4e56-b1d7-49b407eabaf7\") " Mar 09 13:44:36 crc kubenswrapper[4764]: I0309 13:44:36.568304 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/11c8bf9f-a031-4e56-b1d7-49b407eabaf7-plugins-conf\") pod \"11c8bf9f-a031-4e56-b1d7-49b407eabaf7\" (UID: \"11c8bf9f-a031-4e56-b1d7-49b407eabaf7\") " Mar 09 13:44:36 crc kubenswrapper[4764]: I0309 13:44:36.568368 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/11c8bf9f-a031-4e56-b1d7-49b407eabaf7-config-data\") pod \"11c8bf9f-a031-4e56-b1d7-49b407eabaf7\" (UID: \"11c8bf9f-a031-4e56-b1d7-49b407eabaf7\") " Mar 09 13:44:36 crc kubenswrapper[4764]: I0309 
13:44:36.568464 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/11c8bf9f-a031-4e56-b1d7-49b407eabaf7-erlang-cookie-secret\") pod \"11c8bf9f-a031-4e56-b1d7-49b407eabaf7\" (UID: \"11c8bf9f-a031-4e56-b1d7-49b407eabaf7\") " Mar 09 13:44:36 crc kubenswrapper[4764]: I0309 13:44:36.568505 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/11c8bf9f-a031-4e56-b1d7-49b407eabaf7-pod-info\") pod \"11c8bf9f-a031-4e56-b1d7-49b407eabaf7\" (UID: \"11c8bf9f-a031-4e56-b1d7-49b407eabaf7\") " Mar 09 13:44:36 crc kubenswrapper[4764]: I0309 13:44:36.568561 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/11c8bf9f-a031-4e56-b1d7-49b407eabaf7-server-conf\") pod \"11c8bf9f-a031-4e56-b1d7-49b407eabaf7\" (UID: \"11c8bf9f-a031-4e56-b1d7-49b407eabaf7\") " Mar 09 13:44:36 crc kubenswrapper[4764]: I0309 13:44:36.568586 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4bzh8\" (UniqueName: \"kubernetes.io/projected/11c8bf9f-a031-4e56-b1d7-49b407eabaf7-kube-api-access-4bzh8\") pod \"11c8bf9f-a031-4e56-b1d7-49b407eabaf7\" (UID: \"11c8bf9f-a031-4e56-b1d7-49b407eabaf7\") " Mar 09 13:44:36 crc kubenswrapper[4764]: I0309 13:44:36.569495 4764 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/507bcef1-e9ef-4eb1-85ce-358209b944bc-server-conf\") on node \"crc\" DevicePath \"\"" Mar 09 13:44:36 crc kubenswrapper[4764]: I0309 13:44:36.569518 4764 reconciler_common.go:293] "Volume detached for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" DevicePath \"\"" Mar 09 13:44:36 crc kubenswrapper[4764]: I0309 13:44:36.570466 4764 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/11c8bf9f-a031-4e56-b1d7-49b407eabaf7-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "11c8bf9f-a031-4e56-b1d7-49b407eabaf7" (UID: "11c8bf9f-a031-4e56-b1d7-49b407eabaf7"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 13:44:36 crc kubenswrapper[4764]: I0309 13:44:36.570718 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/507bcef1-e9ef-4eb1-85ce-358209b944bc-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "507bcef1-e9ef-4eb1-85ce-358209b944bc" (UID: "507bcef1-e9ef-4eb1-85ce-358209b944bc"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:44:36 crc kubenswrapper[4764]: I0309 13:44:36.571466 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/11c8bf9f-a031-4e56-b1d7-49b407eabaf7-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "11c8bf9f-a031-4e56-b1d7-49b407eabaf7" (UID: "11c8bf9f-a031-4e56-b1d7-49b407eabaf7"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:44:36 crc kubenswrapper[4764]: I0309 13:44:36.573496 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/11c8bf9f-a031-4e56-b1d7-49b407eabaf7-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "11c8bf9f-a031-4e56-b1d7-49b407eabaf7" (UID: "11c8bf9f-a031-4e56-b1d7-49b407eabaf7"). InnerVolumeSpecName "rabbitmq-erlang-cookie". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 13:44:36 crc kubenswrapper[4764]: I0309 13:44:36.573935 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/11c8bf9f-a031-4e56-b1d7-49b407eabaf7-kube-api-access-4bzh8" (OuterVolumeSpecName: "kube-api-access-4bzh8") pod "11c8bf9f-a031-4e56-b1d7-49b407eabaf7" (UID: "11c8bf9f-a031-4e56-b1d7-49b407eabaf7"). InnerVolumeSpecName "kube-api-access-4bzh8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:44:36 crc kubenswrapper[4764]: I0309 13:44:36.576340 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage01-crc" (OuterVolumeSpecName: "persistence") pod "11c8bf9f-a031-4e56-b1d7-49b407eabaf7" (UID: "11c8bf9f-a031-4e56-b1d7-49b407eabaf7"). InnerVolumeSpecName "local-storage01-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 09 13:44:36 crc kubenswrapper[4764]: I0309 13:44:36.578761 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/11c8bf9f-a031-4e56-b1d7-49b407eabaf7-pod-info" (OuterVolumeSpecName: "pod-info") pod "11c8bf9f-a031-4e56-b1d7-49b407eabaf7" (UID: "11c8bf9f-a031-4e56-b1d7-49b407eabaf7"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Mar 09 13:44:36 crc kubenswrapper[4764]: I0309 13:44:36.579910 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/11c8bf9f-a031-4e56-b1d7-49b407eabaf7-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "11c8bf9f-a031-4e56-b1d7-49b407eabaf7" (UID: "11c8bf9f-a031-4e56-b1d7-49b407eabaf7"). InnerVolumeSpecName "erlang-cookie-secret". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:44:36 crc kubenswrapper[4764]: I0309 13:44:36.581311 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/11c8bf9f-a031-4e56-b1d7-49b407eabaf7-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "11c8bf9f-a031-4e56-b1d7-49b407eabaf7" (UID: "11c8bf9f-a031-4e56-b1d7-49b407eabaf7"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:44:36 crc kubenswrapper[4764]: I0309 13:44:36.622064 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/11c8bf9f-a031-4e56-b1d7-49b407eabaf7-config-data" (OuterVolumeSpecName: "config-data") pod "11c8bf9f-a031-4e56-b1d7-49b407eabaf7" (UID: "11c8bf9f-a031-4e56-b1d7-49b407eabaf7"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:44:36 crc kubenswrapper[4764]: I0309 13:44:36.656068 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/11c8bf9f-a031-4e56-b1d7-49b407eabaf7-server-conf" (OuterVolumeSpecName: "server-conf") pod "11c8bf9f-a031-4e56-b1d7-49b407eabaf7" (UID: "11c8bf9f-a031-4e56-b1d7-49b407eabaf7"). InnerVolumeSpecName "server-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:44:36 crc kubenswrapper[4764]: I0309 13:44:36.671793 4764 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/11c8bf9f-a031-4e56-b1d7-49b407eabaf7-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Mar 09 13:44:36 crc kubenswrapper[4764]: I0309 13:44:36.671840 4764 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/11c8bf9f-a031-4e56-b1d7-49b407eabaf7-plugins-conf\") on node \"crc\" DevicePath \"\"" Mar 09 13:44:36 crc kubenswrapper[4764]: I0309 13:44:36.671850 4764 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/11c8bf9f-a031-4e56-b1d7-49b407eabaf7-config-data\") on node \"crc\" DevicePath \"\"" Mar 09 13:44:36 crc kubenswrapper[4764]: I0309 13:44:36.671859 4764 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/11c8bf9f-a031-4e56-b1d7-49b407eabaf7-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Mar 09 13:44:36 crc kubenswrapper[4764]: I0309 13:44:36.671868 4764 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/11c8bf9f-a031-4e56-b1d7-49b407eabaf7-pod-info\") on node \"crc\" DevicePath \"\"" Mar 09 13:44:36 crc kubenswrapper[4764]: I0309 13:44:36.671881 4764 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/11c8bf9f-a031-4e56-b1d7-49b407eabaf7-server-conf\") on node \"crc\" DevicePath \"\"" Mar 09 13:44:36 crc kubenswrapper[4764]: I0309 13:44:36.671890 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4bzh8\" (UniqueName: \"kubernetes.io/projected/11c8bf9f-a031-4e56-b1d7-49b407eabaf7-kube-api-access-4bzh8\") on node \"crc\" DevicePath \"\"" Mar 09 13:44:36 crc kubenswrapper[4764]: I0309 13:44:36.671939 4764 
reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" " Mar 09 13:44:36 crc kubenswrapper[4764]: I0309 13:44:36.671954 4764 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/11c8bf9f-a031-4e56-b1d7-49b407eabaf7-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Mar 09 13:44:36 crc kubenswrapper[4764]: I0309 13:44:36.671967 4764 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/507bcef1-e9ef-4eb1-85ce-358209b944bc-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Mar 09 13:44:36 crc kubenswrapper[4764]: I0309 13:44:36.671977 4764 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/11c8bf9f-a031-4e56-b1d7-49b407eabaf7-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Mar 09 13:44:36 crc kubenswrapper[4764]: I0309 13:44:36.739956 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/11c8bf9f-a031-4e56-b1d7-49b407eabaf7-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "11c8bf9f-a031-4e56-b1d7-49b407eabaf7" (UID: "11c8bf9f-a031-4e56-b1d7-49b407eabaf7"). InnerVolumeSpecName "rabbitmq-confd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:44:36 crc kubenswrapper[4764]: I0309 13:44:36.743333 4764 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage01-crc" (UniqueName: "kubernetes.io/local-volume/local-storage01-crc") on node "crc" Mar 09 13:44:36 crc kubenswrapper[4764]: I0309 13:44:36.776250 4764 reconciler_common.go:293] "Volume detached for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" DevicePath \"\"" Mar 09 13:44:36 crc kubenswrapper[4764]: I0309 13:44:36.776308 4764 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/11c8bf9f-a031-4e56-b1d7-49b407eabaf7-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Mar 09 13:44:36 crc kubenswrapper[4764]: I0309 13:44:36.778783 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 09 13:44:36 crc kubenswrapper[4764]: I0309 13:44:36.797924 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 09 13:44:36 crc kubenswrapper[4764]: I0309 13:44:36.807977 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Mar 09 13:44:36 crc kubenswrapper[4764]: E0309 13:44:36.808564 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="11c8bf9f-a031-4e56-b1d7-49b407eabaf7" containerName="setup-container" Mar 09 13:44:36 crc kubenswrapper[4764]: I0309 13:44:36.808610 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="11c8bf9f-a031-4e56-b1d7-49b407eabaf7" containerName="setup-container" Mar 09 13:44:36 crc kubenswrapper[4764]: E0309 13:44:36.808627 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="507bcef1-e9ef-4eb1-85ce-358209b944bc" containerName="rabbitmq" Mar 09 13:44:36 crc kubenswrapper[4764]: I0309 13:44:36.808635 4764 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="507bcef1-e9ef-4eb1-85ce-358209b944bc" containerName="rabbitmq" Mar 09 13:44:36 crc kubenswrapper[4764]: E0309 13:44:36.808692 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="77179ff3-861b-4aab-b1b2-db4d12041264" containerName="oc" Mar 09 13:44:36 crc kubenswrapper[4764]: I0309 13:44:36.808704 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="77179ff3-861b-4aab-b1b2-db4d12041264" containerName="oc" Mar 09 13:44:36 crc kubenswrapper[4764]: E0309 13:44:36.808717 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="507bcef1-e9ef-4eb1-85ce-358209b944bc" containerName="setup-container" Mar 09 13:44:36 crc kubenswrapper[4764]: I0309 13:44:36.808724 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="507bcef1-e9ef-4eb1-85ce-358209b944bc" containerName="setup-container" Mar 09 13:44:36 crc kubenswrapper[4764]: E0309 13:44:36.808739 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="11c8bf9f-a031-4e56-b1d7-49b407eabaf7" containerName="rabbitmq" Mar 09 13:44:36 crc kubenswrapper[4764]: I0309 13:44:36.808747 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="11c8bf9f-a031-4e56-b1d7-49b407eabaf7" containerName="rabbitmq" Mar 09 13:44:36 crc kubenswrapper[4764]: I0309 13:44:36.808962 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="77179ff3-861b-4aab-b1b2-db4d12041264" containerName="oc" Mar 09 13:44:36 crc kubenswrapper[4764]: I0309 13:44:36.808989 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="11c8bf9f-a031-4e56-b1d7-49b407eabaf7" containerName="rabbitmq" Mar 09 13:44:36 crc kubenswrapper[4764]: I0309 13:44:36.809006 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="507bcef1-e9ef-4eb1-85ce-358209b944bc" containerName="rabbitmq" Mar 09 13:44:36 crc kubenswrapper[4764]: I0309 13:44:36.821470 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 09 13:44:36 crc kubenswrapper[4764]: I0309 13:44:36.830748 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Mar 09 13:44:36 crc kubenswrapper[4764]: I0309 13:44:36.833735 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Mar 09 13:44:36 crc kubenswrapper[4764]: I0309 13:44:36.833898 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Mar 09 13:44:36 crc kubenswrapper[4764]: I0309 13:44:36.833753 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-8dlbf" Mar 09 13:44:36 crc kubenswrapper[4764]: I0309 13:44:36.834211 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Mar 09 13:44:36 crc kubenswrapper[4764]: I0309 13:44:36.835735 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Mar 09 13:44:36 crc kubenswrapper[4764]: I0309 13:44:36.835968 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Mar 09 13:44:36 crc kubenswrapper[4764]: I0309 13:44:36.839623 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 09 13:44:36 crc kubenswrapper[4764]: I0309 13:44:36.878235 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/b19144b6-cc4c-41d6-ad2e-409c021f657c-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"b19144b6-cc4c-41d6-ad2e-409c021f657c\") " pod="openstack/rabbitmq-server-0" Mar 09 13:44:36 crc kubenswrapper[4764]: I0309 13:44:36.878926 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: 
\"kubernetes.io/downward-api/b19144b6-cc4c-41d6-ad2e-409c021f657c-pod-info\") pod \"rabbitmq-server-0\" (UID: \"b19144b6-cc4c-41d6-ad2e-409c021f657c\") " pod="openstack/rabbitmq-server-0" Mar 09 13:44:36 crc kubenswrapper[4764]: I0309 13:44:36.879066 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-server-0\" (UID: \"b19144b6-cc4c-41d6-ad2e-409c021f657c\") " pod="openstack/rabbitmq-server-0" Mar 09 13:44:36 crc kubenswrapper[4764]: I0309 13:44:36.879185 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/b19144b6-cc4c-41d6-ad2e-409c021f657c-server-conf\") pod \"rabbitmq-server-0\" (UID: \"b19144b6-cc4c-41d6-ad2e-409c021f657c\") " pod="openstack/rabbitmq-server-0" Mar 09 13:44:36 crc kubenswrapper[4764]: I0309 13:44:36.879308 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/b19144b6-cc4c-41d6-ad2e-409c021f657c-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"b19144b6-cc4c-41d6-ad2e-409c021f657c\") " pod="openstack/rabbitmq-server-0" Mar 09 13:44:36 crc kubenswrapper[4764]: I0309 13:44:36.879420 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/b19144b6-cc4c-41d6-ad2e-409c021f657c-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"b19144b6-cc4c-41d6-ad2e-409c021f657c\") " pod="openstack/rabbitmq-server-0" Mar 09 13:44:36 crc kubenswrapper[4764]: I0309 13:44:36.879535 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dbghk\" (UniqueName: 
\"kubernetes.io/projected/b19144b6-cc4c-41d6-ad2e-409c021f657c-kube-api-access-dbghk\") pod \"rabbitmq-server-0\" (UID: \"b19144b6-cc4c-41d6-ad2e-409c021f657c\") " pod="openstack/rabbitmq-server-0" Mar 09 13:44:36 crc kubenswrapper[4764]: I0309 13:44:36.879724 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/b19144b6-cc4c-41d6-ad2e-409c021f657c-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"b19144b6-cc4c-41d6-ad2e-409c021f657c\") " pod="openstack/rabbitmq-server-0" Mar 09 13:44:36 crc kubenswrapper[4764]: I0309 13:44:36.879909 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/b19144b6-cc4c-41d6-ad2e-409c021f657c-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"b19144b6-cc4c-41d6-ad2e-409c021f657c\") " pod="openstack/rabbitmq-server-0" Mar 09 13:44:36 crc kubenswrapper[4764]: I0309 13:44:36.880035 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/b19144b6-cc4c-41d6-ad2e-409c021f657c-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"b19144b6-cc4c-41d6-ad2e-409c021f657c\") " pod="openstack/rabbitmq-server-0" Mar 09 13:44:36 crc kubenswrapper[4764]: I0309 13:44:36.881104 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b19144b6-cc4c-41d6-ad2e-409c021f657c-config-data\") pod \"rabbitmq-server-0\" (UID: \"b19144b6-cc4c-41d6-ad2e-409c021f657c\") " pod="openstack/rabbitmq-server-0" Mar 09 13:44:36 crc kubenswrapper[4764]: I0309 13:44:36.983309 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/b19144b6-cc4c-41d6-ad2e-409c021f657c-plugins-conf\") pod 
\"rabbitmq-server-0\" (UID: \"b19144b6-cc4c-41d6-ad2e-409c021f657c\") " pod="openstack/rabbitmq-server-0" Mar 09 13:44:36 crc kubenswrapper[4764]: I0309 13:44:36.983379 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/b19144b6-cc4c-41d6-ad2e-409c021f657c-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"b19144b6-cc4c-41d6-ad2e-409c021f657c\") " pod="openstack/rabbitmq-server-0" Mar 09 13:44:36 crc kubenswrapper[4764]: I0309 13:44:36.983407 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/b19144b6-cc4c-41d6-ad2e-409c021f657c-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"b19144b6-cc4c-41d6-ad2e-409c021f657c\") " pod="openstack/rabbitmq-server-0" Mar 09 13:44:36 crc kubenswrapper[4764]: I0309 13:44:36.983427 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b19144b6-cc4c-41d6-ad2e-409c021f657c-config-data\") pod \"rabbitmq-server-0\" (UID: \"b19144b6-cc4c-41d6-ad2e-409c021f657c\") " pod="openstack/rabbitmq-server-0" Mar 09 13:44:36 crc kubenswrapper[4764]: I0309 13:44:36.983465 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/b19144b6-cc4c-41d6-ad2e-409c021f657c-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"b19144b6-cc4c-41d6-ad2e-409c021f657c\") " pod="openstack/rabbitmq-server-0" Mar 09 13:44:36 crc kubenswrapper[4764]: I0309 13:44:36.983536 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/b19144b6-cc4c-41d6-ad2e-409c021f657c-pod-info\") pod \"rabbitmq-server-0\" (UID: \"b19144b6-cc4c-41d6-ad2e-409c021f657c\") " pod="openstack/rabbitmq-server-0" Mar 09 13:44:36 crc kubenswrapper[4764]: I0309 13:44:36.983567 
4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-server-0\" (UID: \"b19144b6-cc4c-41d6-ad2e-409c021f657c\") " pod="openstack/rabbitmq-server-0" Mar 09 13:44:36 crc kubenswrapper[4764]: I0309 13:44:36.983592 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/b19144b6-cc4c-41d6-ad2e-409c021f657c-server-conf\") pod \"rabbitmq-server-0\" (UID: \"b19144b6-cc4c-41d6-ad2e-409c021f657c\") " pod="openstack/rabbitmq-server-0" Mar 09 13:44:36 crc kubenswrapper[4764]: I0309 13:44:36.983612 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/b19144b6-cc4c-41d6-ad2e-409c021f657c-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"b19144b6-cc4c-41d6-ad2e-409c021f657c\") " pod="openstack/rabbitmq-server-0" Mar 09 13:44:36 crc kubenswrapper[4764]: I0309 13:44:36.983636 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/b19144b6-cc4c-41d6-ad2e-409c021f657c-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"b19144b6-cc4c-41d6-ad2e-409c021f657c\") " pod="openstack/rabbitmq-server-0" Mar 09 13:44:36 crc kubenswrapper[4764]: I0309 13:44:36.983658 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dbghk\" (UniqueName: \"kubernetes.io/projected/b19144b6-cc4c-41d6-ad2e-409c021f657c-kube-api-access-dbghk\") pod \"rabbitmq-server-0\" (UID: \"b19144b6-cc4c-41d6-ad2e-409c021f657c\") " pod="openstack/rabbitmq-server-0" Mar 09 13:44:36 crc kubenswrapper[4764]: I0309 13:44:36.985261 4764 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-server-0\" (UID: \"b19144b6-cc4c-41d6-ad2e-409c021f657c\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/rabbitmq-server-0" Mar 09 13:44:36 crc kubenswrapper[4764]: I0309 13:44:36.985877 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/b19144b6-cc4c-41d6-ad2e-409c021f657c-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"b19144b6-cc4c-41d6-ad2e-409c021f657c\") " pod="openstack/rabbitmq-server-0" Mar 09 13:44:36 crc kubenswrapper[4764]: I0309 13:44:36.990249 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b19144b6-cc4c-41d6-ad2e-409c021f657c-config-data\") pod \"rabbitmq-server-0\" (UID: \"b19144b6-cc4c-41d6-ad2e-409c021f657c\") " pod="openstack/rabbitmq-server-0" Mar 09 13:44:36 crc kubenswrapper[4764]: I0309 13:44:36.991176 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/b19144b6-cc4c-41d6-ad2e-409c021f657c-pod-info\") pod \"rabbitmq-server-0\" (UID: \"b19144b6-cc4c-41d6-ad2e-409c021f657c\") " pod="openstack/rabbitmq-server-0" Mar 09 13:44:36 crc kubenswrapper[4764]: I0309 13:44:36.994180 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/b19144b6-cc4c-41d6-ad2e-409c021f657c-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"b19144b6-cc4c-41d6-ad2e-409c021f657c\") " pod="openstack/rabbitmq-server-0" Mar 09 13:44:36 crc kubenswrapper[4764]: I0309 13:44:36.994333 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/b19144b6-cc4c-41d6-ad2e-409c021f657c-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"b19144b6-cc4c-41d6-ad2e-409c021f657c\") " 
pod="openstack/rabbitmq-server-0" Mar 09 13:44:36 crc kubenswrapper[4764]: I0309 13:44:36.995062 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/b19144b6-cc4c-41d6-ad2e-409c021f657c-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"b19144b6-cc4c-41d6-ad2e-409c021f657c\") " pod="openstack/rabbitmq-server-0" Mar 09 13:44:36 crc kubenswrapper[4764]: I0309 13:44:36.995303 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/b19144b6-cc4c-41d6-ad2e-409c021f657c-server-conf\") pod \"rabbitmq-server-0\" (UID: \"b19144b6-cc4c-41d6-ad2e-409c021f657c\") " pod="openstack/rabbitmq-server-0" Mar 09 13:44:37 crc kubenswrapper[4764]: I0309 13:44:37.000746 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/b19144b6-cc4c-41d6-ad2e-409c021f657c-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"b19144b6-cc4c-41d6-ad2e-409c021f657c\") " pod="openstack/rabbitmq-server-0" Mar 09 13:44:37 crc kubenswrapper[4764]: I0309 13:44:37.001023 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/b19144b6-cc4c-41d6-ad2e-409c021f657c-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"b19144b6-cc4c-41d6-ad2e-409c021f657c\") " pod="openstack/rabbitmq-server-0" Mar 09 13:44:37 crc kubenswrapper[4764]: I0309 13:44:37.003169 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dbghk\" (UniqueName: \"kubernetes.io/projected/b19144b6-cc4c-41d6-ad2e-409c021f657c-kube-api-access-dbghk\") pod \"rabbitmq-server-0\" (UID: \"b19144b6-cc4c-41d6-ad2e-409c021f657c\") " pod="openstack/rabbitmq-server-0" Mar 09 13:44:37 crc kubenswrapper[4764]: I0309 13:44:37.028101 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" 
(UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-server-0\" (UID: \"b19144b6-cc4c-41d6-ad2e-409c021f657c\") " pod="openstack/rabbitmq-server-0" Mar 09 13:44:37 crc kubenswrapper[4764]: I0309 13:44:37.150973 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 09 13:44:37 crc kubenswrapper[4764]: I0309 13:44:37.412651 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"11c8bf9f-a031-4e56-b1d7-49b407eabaf7","Type":"ContainerDied","Data":"b97c921b54e1f12956d845171f6d90fe64a80d32c024a23960cca4b47667dc15"} Mar 09 13:44:37 crc kubenswrapper[4764]: I0309 13:44:37.413082 4764 scope.go:117] "RemoveContainer" containerID="c051454f6de8558a84ceb371da4edc9fd851c2e829bf18eb804c0c4e657ef0ee" Mar 09 13:44:37 crc kubenswrapper[4764]: I0309 13:44:37.413265 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 09 13:44:37 crc kubenswrapper[4764]: I0309 13:44:37.453487 4764 scope.go:117] "RemoveContainer" containerID="fd0a79b758702c401b2bbc87884a5f0ad37053c07ad4da0c2c69610d9e0509c2" Mar 09 13:44:37 crc kubenswrapper[4764]: I0309 13:44:37.476753 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 09 13:44:37 crc kubenswrapper[4764]: I0309 13:44:37.491380 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 09 13:44:37 crc kubenswrapper[4764]: I0309 13:44:37.510333 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 09 13:44:37 crc kubenswrapper[4764]: I0309 13:44:37.512095 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 09 13:44:37 crc kubenswrapper[4764]: I0309 13:44:37.517674 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Mar 09 13:44:37 crc kubenswrapper[4764]: I0309 13:44:37.517826 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-6m67z" Mar 09 13:44:37 crc kubenswrapper[4764]: I0309 13:44:37.517878 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Mar 09 13:44:37 crc kubenswrapper[4764]: I0309 13:44:37.518005 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Mar 09 13:44:37 crc kubenswrapper[4764]: I0309 13:44:37.518098 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Mar 09 13:44:37 crc kubenswrapper[4764]: I0309 13:44:37.518191 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Mar 09 13:44:37 crc kubenswrapper[4764]: I0309 13:44:37.518225 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Mar 09 13:44:37 crc kubenswrapper[4764]: I0309 13:44:37.526473 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 09 13:44:37 crc kubenswrapper[4764]: I0309 13:44:37.576049 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="11c8bf9f-a031-4e56-b1d7-49b407eabaf7" path="/var/lib/kubelet/pods/11c8bf9f-a031-4e56-b1d7-49b407eabaf7/volumes" Mar 09 13:44:37 crc kubenswrapper[4764]: I0309 13:44:37.577326 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="507bcef1-e9ef-4eb1-85ce-358209b944bc" path="/var/lib/kubelet/pods/507bcef1-e9ef-4eb1-85ce-358209b944bc/volumes" Mar 09 13:44:37 crc kubenswrapper[4764]: I0309 
13:44:37.685844 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 09 13:44:37 crc kubenswrapper[4764]: I0309 13:44:37.703784 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/c5579dd7-5380-4042-8c78-c6837d841d5e-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"c5579dd7-5380-4042-8c78-c6837d841d5e\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 13:44:37 crc kubenswrapper[4764]: I0309 13:44:37.703906 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/c5579dd7-5380-4042-8c78-c6837d841d5e-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"c5579dd7-5380-4042-8c78-c6837d841d5e\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 13:44:37 crc kubenswrapper[4764]: I0309 13:44:37.703995 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c5579dd7-5380-4042-8c78-c6837d841d5e-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"c5579dd7-5380-4042-8c78-c6837d841d5e\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 13:44:37 crc kubenswrapper[4764]: I0309 13:44:37.704067 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/c5579dd7-5380-4042-8c78-c6837d841d5e-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"c5579dd7-5380-4042-8c78-c6837d841d5e\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 13:44:37 crc kubenswrapper[4764]: I0309 13:44:37.704111 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"c5579dd7-5380-4042-8c78-c6837d841d5e\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 13:44:37 crc kubenswrapper[4764]: I0309 13:44:37.704183 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/c5579dd7-5380-4042-8c78-c6837d841d5e-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"c5579dd7-5380-4042-8c78-c6837d841d5e\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 13:44:37 crc kubenswrapper[4764]: I0309 13:44:37.704364 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/c5579dd7-5380-4042-8c78-c6837d841d5e-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"c5579dd7-5380-4042-8c78-c6837d841d5e\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 13:44:37 crc kubenswrapper[4764]: I0309 13:44:37.704413 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/c5579dd7-5380-4042-8c78-c6837d841d5e-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"c5579dd7-5380-4042-8c78-c6837d841d5e\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 13:44:37 crc kubenswrapper[4764]: I0309 13:44:37.704868 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/c5579dd7-5380-4042-8c78-c6837d841d5e-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"c5579dd7-5380-4042-8c78-c6837d841d5e\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 13:44:37 crc kubenswrapper[4764]: I0309 13:44:37.704976 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qwxh2\" (UniqueName: \"kubernetes.io/projected/c5579dd7-5380-4042-8c78-c6837d841d5e-kube-api-access-qwxh2\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"c5579dd7-5380-4042-8c78-c6837d841d5e\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 13:44:37 crc kubenswrapper[4764]: I0309 13:44:37.705069 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/c5579dd7-5380-4042-8c78-c6837d841d5e-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"c5579dd7-5380-4042-8c78-c6837d841d5e\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 13:44:37 crc kubenswrapper[4764]: I0309 13:44:37.807284 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/c5579dd7-5380-4042-8c78-c6837d841d5e-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"c5579dd7-5380-4042-8c78-c6837d841d5e\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 13:44:37 crc kubenswrapper[4764]: I0309 13:44:37.807368 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/c5579dd7-5380-4042-8c78-c6837d841d5e-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"c5579dd7-5380-4042-8c78-c6837d841d5e\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 13:44:37 crc kubenswrapper[4764]: I0309 13:44:37.807412 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c5579dd7-5380-4042-8c78-c6837d841d5e-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"c5579dd7-5380-4042-8c78-c6837d841d5e\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 13:44:37 crc kubenswrapper[4764]: I0309 13:44:37.807430 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/c5579dd7-5380-4042-8c78-c6837d841d5e-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"c5579dd7-5380-4042-8c78-c6837d841d5e\") " 
pod="openstack/rabbitmq-cell1-server-0" Mar 09 13:44:37 crc kubenswrapper[4764]: I0309 13:44:37.807451 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"c5579dd7-5380-4042-8c78-c6837d841d5e\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 13:44:37 crc kubenswrapper[4764]: I0309 13:44:37.807481 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/c5579dd7-5380-4042-8c78-c6837d841d5e-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"c5579dd7-5380-4042-8c78-c6837d841d5e\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 13:44:37 crc kubenswrapper[4764]: I0309 13:44:37.807566 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/c5579dd7-5380-4042-8c78-c6837d841d5e-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"c5579dd7-5380-4042-8c78-c6837d841d5e\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 13:44:37 crc kubenswrapper[4764]: I0309 13:44:37.807588 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/c5579dd7-5380-4042-8c78-c6837d841d5e-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"c5579dd7-5380-4042-8c78-c6837d841d5e\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 13:44:37 crc kubenswrapper[4764]: I0309 13:44:37.807615 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/c5579dd7-5380-4042-8c78-c6837d841d5e-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"c5579dd7-5380-4042-8c78-c6837d841d5e\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 13:44:37 crc kubenswrapper[4764]: I0309 13:44:37.807642 4764 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-qwxh2\" (UniqueName: \"kubernetes.io/projected/c5579dd7-5380-4042-8c78-c6837d841d5e-kube-api-access-qwxh2\") pod \"rabbitmq-cell1-server-0\" (UID: \"c5579dd7-5380-4042-8c78-c6837d841d5e\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 13:44:37 crc kubenswrapper[4764]: I0309 13:44:37.807690 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/c5579dd7-5380-4042-8c78-c6837d841d5e-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"c5579dd7-5380-4042-8c78-c6837d841d5e\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 13:44:37 crc kubenswrapper[4764]: I0309 13:44:37.808062 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/c5579dd7-5380-4042-8c78-c6837d841d5e-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"c5579dd7-5380-4042-8c78-c6837d841d5e\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 13:44:37 crc kubenswrapper[4764]: I0309 13:44:37.808194 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/c5579dd7-5380-4042-8c78-c6837d841d5e-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"c5579dd7-5380-4042-8c78-c6837d841d5e\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 13:44:37 crc kubenswrapper[4764]: I0309 13:44:37.809029 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/c5579dd7-5380-4042-8c78-c6837d841d5e-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"c5579dd7-5380-4042-8c78-c6837d841d5e\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 13:44:37 crc kubenswrapper[4764]: I0309 13:44:37.809201 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/configmap/c5579dd7-5380-4042-8c78-c6837d841d5e-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"c5579dd7-5380-4042-8c78-c6837d841d5e\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 13:44:37 crc kubenswrapper[4764]: I0309 13:44:37.809258 4764 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"c5579dd7-5380-4042-8c78-c6837d841d5e\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/rabbitmq-cell1-server-0" Mar 09 13:44:37 crc kubenswrapper[4764]: I0309 13:44:37.810082 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/c5579dd7-5380-4042-8c78-c6837d841d5e-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"c5579dd7-5380-4042-8c78-c6837d841d5e\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 13:44:37 crc kubenswrapper[4764]: I0309 13:44:37.813723 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/c5579dd7-5380-4042-8c78-c6837d841d5e-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"c5579dd7-5380-4042-8c78-c6837d841d5e\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 13:44:37 crc kubenswrapper[4764]: I0309 13:44:37.813809 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/c5579dd7-5380-4042-8c78-c6837d841d5e-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"c5579dd7-5380-4042-8c78-c6837d841d5e\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 13:44:37 crc kubenswrapper[4764]: I0309 13:44:37.816435 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/c5579dd7-5380-4042-8c78-c6837d841d5e-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"c5579dd7-5380-4042-8c78-c6837d841d5e\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 13:44:37 crc kubenswrapper[4764]: I0309 13:44:37.816661 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/c5579dd7-5380-4042-8c78-c6837d841d5e-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"c5579dd7-5380-4042-8c78-c6837d841d5e\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 13:44:37 crc kubenswrapper[4764]: I0309 13:44:37.833633 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qwxh2\" (UniqueName: \"kubernetes.io/projected/c5579dd7-5380-4042-8c78-c6837d841d5e-kube-api-access-qwxh2\") pod \"rabbitmq-cell1-server-0\" (UID: \"c5579dd7-5380-4042-8c78-c6837d841d5e\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 13:44:37 crc kubenswrapper[4764]: I0309 13:44:37.851547 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"c5579dd7-5380-4042-8c78-c6837d841d5e\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 13:44:37 crc kubenswrapper[4764]: I0309 13:44:37.876296 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-578b8d767c-f8lbq"] Mar 09 13:44:37 crc kubenswrapper[4764]: I0309 13:44:37.878094 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-578b8d767c-f8lbq" Mar 09 13:44:37 crc kubenswrapper[4764]: I0309 13:44:37.880696 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-edpm-ipam" Mar 09 13:44:37 crc kubenswrapper[4764]: I0309 13:44:37.895755 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 09 13:44:37 crc kubenswrapper[4764]: I0309 13:44:37.899989 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-578b8d767c-f8lbq"] Mar 09 13:44:38 crc kubenswrapper[4764]: I0309 13:44:38.013893 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3912c156-63a8-4756-bc55-4e403c3807f8-ovsdbserver-nb\") pod \"dnsmasq-dns-578b8d767c-f8lbq\" (UID: \"3912c156-63a8-4756-bc55-4e403c3807f8\") " pod="openstack/dnsmasq-dns-578b8d767c-f8lbq" Mar 09 13:44:38 crc kubenswrapper[4764]: I0309 13:44:38.014670 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3912c156-63a8-4756-bc55-4e403c3807f8-config\") pod \"dnsmasq-dns-578b8d767c-f8lbq\" (UID: \"3912c156-63a8-4756-bc55-4e403c3807f8\") " pod="openstack/dnsmasq-dns-578b8d767c-f8lbq" Mar 09 13:44:38 crc kubenswrapper[4764]: I0309 13:44:38.014723 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3912c156-63a8-4756-bc55-4e403c3807f8-ovsdbserver-sb\") pod \"dnsmasq-dns-578b8d767c-f8lbq\" (UID: \"3912c156-63a8-4756-bc55-4e403c3807f8\") " pod="openstack/dnsmasq-dns-578b8d767c-f8lbq" Mar 09 13:44:38 crc kubenswrapper[4764]: I0309 13:44:38.015369 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ws2cl\" (UniqueName: \"kubernetes.io/projected/3912c156-63a8-4756-bc55-4e403c3807f8-kube-api-access-ws2cl\") pod \"dnsmasq-dns-578b8d767c-f8lbq\" (UID: \"3912c156-63a8-4756-bc55-4e403c3807f8\") " pod="openstack/dnsmasq-dns-578b8d767c-f8lbq" Mar 09 13:44:38 crc kubenswrapper[4764]: I0309 13:44:38.015570 4764 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/3912c156-63a8-4756-bc55-4e403c3807f8-openstack-edpm-ipam\") pod \"dnsmasq-dns-578b8d767c-f8lbq\" (UID: \"3912c156-63a8-4756-bc55-4e403c3807f8\") " pod="openstack/dnsmasq-dns-578b8d767c-f8lbq" Mar 09 13:44:38 crc kubenswrapper[4764]: I0309 13:44:38.015595 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3912c156-63a8-4756-bc55-4e403c3807f8-dns-svc\") pod \"dnsmasq-dns-578b8d767c-f8lbq\" (UID: \"3912c156-63a8-4756-bc55-4e403c3807f8\") " pod="openstack/dnsmasq-dns-578b8d767c-f8lbq" Mar 09 13:44:38 crc kubenswrapper[4764]: I0309 13:44:38.132051 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/3912c156-63a8-4756-bc55-4e403c3807f8-openstack-edpm-ipam\") pod \"dnsmasq-dns-578b8d767c-f8lbq\" (UID: \"3912c156-63a8-4756-bc55-4e403c3807f8\") " pod="openstack/dnsmasq-dns-578b8d767c-f8lbq" Mar 09 13:44:38 crc kubenswrapper[4764]: I0309 13:44:38.132240 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3912c156-63a8-4756-bc55-4e403c3807f8-dns-svc\") pod \"dnsmasq-dns-578b8d767c-f8lbq\" (UID: \"3912c156-63a8-4756-bc55-4e403c3807f8\") " pod="openstack/dnsmasq-dns-578b8d767c-f8lbq" Mar 09 13:44:38 crc kubenswrapper[4764]: I0309 13:44:38.132358 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3912c156-63a8-4756-bc55-4e403c3807f8-ovsdbserver-nb\") pod \"dnsmasq-dns-578b8d767c-f8lbq\" (UID: \"3912c156-63a8-4756-bc55-4e403c3807f8\") " pod="openstack/dnsmasq-dns-578b8d767c-f8lbq" Mar 09 13:44:38 crc kubenswrapper[4764]: I0309 13:44:38.132418 4764 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3912c156-63a8-4756-bc55-4e403c3807f8-config\") pod \"dnsmasq-dns-578b8d767c-f8lbq\" (UID: \"3912c156-63a8-4756-bc55-4e403c3807f8\") " pod="openstack/dnsmasq-dns-578b8d767c-f8lbq" Mar 09 13:44:38 crc kubenswrapper[4764]: I0309 13:44:38.132466 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3912c156-63a8-4756-bc55-4e403c3807f8-ovsdbserver-sb\") pod \"dnsmasq-dns-578b8d767c-f8lbq\" (UID: \"3912c156-63a8-4756-bc55-4e403c3807f8\") " pod="openstack/dnsmasq-dns-578b8d767c-f8lbq" Mar 09 13:44:38 crc kubenswrapper[4764]: I0309 13:44:38.132534 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ws2cl\" (UniqueName: \"kubernetes.io/projected/3912c156-63a8-4756-bc55-4e403c3807f8-kube-api-access-ws2cl\") pod \"dnsmasq-dns-578b8d767c-f8lbq\" (UID: \"3912c156-63a8-4756-bc55-4e403c3807f8\") " pod="openstack/dnsmasq-dns-578b8d767c-f8lbq" Mar 09 13:44:38 crc kubenswrapper[4764]: I0309 13:44:38.133255 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/3912c156-63a8-4756-bc55-4e403c3807f8-openstack-edpm-ipam\") pod \"dnsmasq-dns-578b8d767c-f8lbq\" (UID: \"3912c156-63a8-4756-bc55-4e403c3807f8\") " pod="openstack/dnsmasq-dns-578b8d767c-f8lbq" Mar 09 13:44:38 crc kubenswrapper[4764]: I0309 13:44:38.137498 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3912c156-63a8-4756-bc55-4e403c3807f8-dns-svc\") pod \"dnsmasq-dns-578b8d767c-f8lbq\" (UID: \"3912c156-63a8-4756-bc55-4e403c3807f8\") " pod="openstack/dnsmasq-dns-578b8d767c-f8lbq" Mar 09 13:44:38 crc kubenswrapper[4764]: I0309 13:44:38.138345 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/3912c156-63a8-4756-bc55-4e403c3807f8-config\") pod \"dnsmasq-dns-578b8d767c-f8lbq\" (UID: \"3912c156-63a8-4756-bc55-4e403c3807f8\") " pod="openstack/dnsmasq-dns-578b8d767c-f8lbq" Mar 09 13:44:38 crc kubenswrapper[4764]: I0309 13:44:38.138784 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3912c156-63a8-4756-bc55-4e403c3807f8-ovsdbserver-nb\") pod \"dnsmasq-dns-578b8d767c-f8lbq\" (UID: \"3912c156-63a8-4756-bc55-4e403c3807f8\") " pod="openstack/dnsmasq-dns-578b8d767c-f8lbq" Mar 09 13:44:38 crc kubenswrapper[4764]: I0309 13:44:38.140175 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3912c156-63a8-4756-bc55-4e403c3807f8-ovsdbserver-sb\") pod \"dnsmasq-dns-578b8d767c-f8lbq\" (UID: \"3912c156-63a8-4756-bc55-4e403c3807f8\") " pod="openstack/dnsmasq-dns-578b8d767c-f8lbq" Mar 09 13:44:38 crc kubenswrapper[4764]: I0309 13:44:38.162292 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ws2cl\" (UniqueName: \"kubernetes.io/projected/3912c156-63a8-4756-bc55-4e403c3807f8-kube-api-access-ws2cl\") pod \"dnsmasq-dns-578b8d767c-f8lbq\" (UID: \"3912c156-63a8-4756-bc55-4e403c3807f8\") " pod="openstack/dnsmasq-dns-578b8d767c-f8lbq" Mar 09 13:44:38 crc kubenswrapper[4764]: I0309 13:44:38.201871 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-578b8d767c-f8lbq" Mar 09 13:44:38 crc kubenswrapper[4764]: I0309 13:44:38.425892 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"b19144b6-cc4c-41d6-ad2e-409c021f657c","Type":"ContainerStarted","Data":"d5545c1d9524a5328fcea840cbc5054f1b2e1c872c15c4e573f15f8e5aa3158c"} Mar 09 13:44:38 crc kubenswrapper[4764]: I0309 13:44:38.437875 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 09 13:44:38 crc kubenswrapper[4764]: W0309 13:44:38.440973 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc5579dd7_5380_4042_8c78_c6837d841d5e.slice/crio-351cf9a9ccaf4afea3cf8872dfac012cf03f69c5fcff9ceafa4f3c49e86bbd21 WatchSource:0}: Error finding container 351cf9a9ccaf4afea3cf8872dfac012cf03f69c5fcff9ceafa4f3c49e86bbd21: Status 404 returned error can't find the container with id 351cf9a9ccaf4afea3cf8872dfac012cf03f69c5fcff9ceafa4f3c49e86bbd21 Mar 09 13:44:38 crc kubenswrapper[4764]: I0309 13:44:38.694596 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-578b8d767c-f8lbq"] Mar 09 13:44:39 crc kubenswrapper[4764]: I0309 13:44:39.445128 4764 generic.go:334] "Generic (PLEG): container finished" podID="3912c156-63a8-4756-bc55-4e403c3807f8" containerID="56f82ff470e1fffea03da2d37cd7b44a6189cee402a05697723d71f2a014b32a" exitCode=0 Mar 09 13:44:39 crc kubenswrapper[4764]: I0309 13:44:39.445258 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-578b8d767c-f8lbq" event={"ID":"3912c156-63a8-4756-bc55-4e403c3807f8","Type":"ContainerDied","Data":"56f82ff470e1fffea03da2d37cd7b44a6189cee402a05697723d71f2a014b32a"} Mar 09 13:44:39 crc kubenswrapper[4764]: I0309 13:44:39.445836 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-578b8d767c-f8lbq" 
event={"ID":"3912c156-63a8-4756-bc55-4e403c3807f8","Type":"ContainerStarted","Data":"05c41c0e7884eaf2ee86d117de131aea3691b93a29e2fa0912581efbd369cd9a"} Mar 09 13:44:39 crc kubenswrapper[4764]: I0309 13:44:39.447936 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"c5579dd7-5380-4042-8c78-c6837d841d5e","Type":"ContainerStarted","Data":"351cf9a9ccaf4afea3cf8872dfac012cf03f69c5fcff9ceafa4f3c49e86bbd21"} Mar 09 13:44:40 crc kubenswrapper[4764]: I0309 13:44:40.469640 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-578b8d767c-f8lbq" event={"ID":"3912c156-63a8-4756-bc55-4e403c3807f8","Type":"ContainerStarted","Data":"5a80bb357eacada450213ffb7482e6c0c27854fdde2db316d00ae017b880c5af"} Mar 09 13:44:40 crc kubenswrapper[4764]: I0309 13:44:40.470110 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-578b8d767c-f8lbq" Mar 09 13:44:40 crc kubenswrapper[4764]: I0309 13:44:40.473443 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"c5579dd7-5380-4042-8c78-c6837d841d5e","Type":"ContainerStarted","Data":"188de235de55b51ff40610026c7e52499fd41faec0f6ea44af2ec4f4625f4040"} Mar 09 13:44:40 crc kubenswrapper[4764]: I0309 13:44:40.478055 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"b19144b6-cc4c-41d6-ad2e-409c021f657c","Type":"ContainerStarted","Data":"4ebc5d892f1f305b7293fcf5f466f2290393eed4c595d382d19c94d2aa4f9a50"} Mar 09 13:44:40 crc kubenswrapper[4764]: I0309 13:44:40.501368 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-578b8d767c-f8lbq" podStartSLOduration=3.5013393539999997 podStartE2EDuration="3.501339354s" podCreationTimestamp="2026-03-09 13:44:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-03-09 13:44:40.489646972 +0000 UTC m=+1435.739818890" watchObservedRunningTime="2026-03-09 13:44:40.501339354 +0000 UTC m=+1435.751511262" Mar 09 13:44:48 crc kubenswrapper[4764]: I0309 13:44:48.203968 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-578b8d767c-f8lbq" Mar 09 13:44:48 crc kubenswrapper[4764]: I0309 13:44:48.273763 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-68d4b6d797-bwpkr"] Mar 09 13:44:48 crc kubenswrapper[4764]: I0309 13:44:48.274091 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-68d4b6d797-bwpkr" podUID="4ae25624-74af-4de4-8aa1-14ea5dbc7b68" containerName="dnsmasq-dns" containerID="cri-o://37fa28134b3929cb4f988c4f648598adc00dd97c039b66d6a61153de431009b4" gracePeriod=10 Mar 09 13:44:48 crc kubenswrapper[4764]: I0309 13:44:48.462514 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-fbc59fbb7-688wh"] Mar 09 13:44:48 crc kubenswrapper[4764]: I0309 13:44:48.464624 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-fbc59fbb7-688wh" Mar 09 13:44:48 crc kubenswrapper[4764]: I0309 13:44:48.484551 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-fbc59fbb7-688wh"] Mar 09 13:44:48 crc kubenswrapper[4764]: I0309 13:44:48.575502 4764 generic.go:334] "Generic (PLEG): container finished" podID="4ae25624-74af-4de4-8aa1-14ea5dbc7b68" containerID="37fa28134b3929cb4f988c4f648598adc00dd97c039b66d6a61153de431009b4" exitCode=0 Mar 09 13:44:48 crc kubenswrapper[4764]: I0309 13:44:48.575572 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-68d4b6d797-bwpkr" event={"ID":"4ae25624-74af-4de4-8aa1-14ea5dbc7b68","Type":"ContainerDied","Data":"37fa28134b3929cb4f988c4f648598adc00dd97c039b66d6a61153de431009b4"} Mar 09 13:44:48 crc kubenswrapper[4764]: I0309 13:44:48.589954 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6ed586ce-ac9a-4d0e-9686-6b526ab1c6ff-ovsdbserver-sb\") pod \"dnsmasq-dns-fbc59fbb7-688wh\" (UID: \"6ed586ce-ac9a-4d0e-9686-6b526ab1c6ff\") " pod="openstack/dnsmasq-dns-fbc59fbb7-688wh" Mar 09 13:44:48 crc kubenswrapper[4764]: I0309 13:44:48.590025 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6ed586ce-ac9a-4d0e-9686-6b526ab1c6ff-config\") pod \"dnsmasq-dns-fbc59fbb7-688wh\" (UID: \"6ed586ce-ac9a-4d0e-9686-6b526ab1c6ff\") " pod="openstack/dnsmasq-dns-fbc59fbb7-688wh" Mar 09 13:44:48 crc kubenswrapper[4764]: I0309 13:44:48.590049 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j76rw\" (UniqueName: \"kubernetes.io/projected/6ed586ce-ac9a-4d0e-9686-6b526ab1c6ff-kube-api-access-j76rw\") pod \"dnsmasq-dns-fbc59fbb7-688wh\" (UID: \"6ed586ce-ac9a-4d0e-9686-6b526ab1c6ff\") " 
pod="openstack/dnsmasq-dns-fbc59fbb7-688wh" Mar 09 13:44:48 crc kubenswrapper[4764]: I0309 13:44:48.590135 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6ed586ce-ac9a-4d0e-9686-6b526ab1c6ff-ovsdbserver-nb\") pod \"dnsmasq-dns-fbc59fbb7-688wh\" (UID: \"6ed586ce-ac9a-4d0e-9686-6b526ab1c6ff\") " pod="openstack/dnsmasq-dns-fbc59fbb7-688wh" Mar 09 13:44:48 crc kubenswrapper[4764]: I0309 13:44:48.590157 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/6ed586ce-ac9a-4d0e-9686-6b526ab1c6ff-openstack-edpm-ipam\") pod \"dnsmasq-dns-fbc59fbb7-688wh\" (UID: \"6ed586ce-ac9a-4d0e-9686-6b526ab1c6ff\") " pod="openstack/dnsmasq-dns-fbc59fbb7-688wh" Mar 09 13:44:48 crc kubenswrapper[4764]: I0309 13:44:48.590201 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6ed586ce-ac9a-4d0e-9686-6b526ab1c6ff-dns-svc\") pod \"dnsmasq-dns-fbc59fbb7-688wh\" (UID: \"6ed586ce-ac9a-4d0e-9686-6b526ab1c6ff\") " pod="openstack/dnsmasq-dns-fbc59fbb7-688wh" Mar 09 13:44:49 crc kubenswrapper[4764]: I0309 13:44:48.691909 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/6ed586ce-ac9a-4d0e-9686-6b526ab1c6ff-openstack-edpm-ipam\") pod \"dnsmasq-dns-fbc59fbb7-688wh\" (UID: \"6ed586ce-ac9a-4d0e-9686-6b526ab1c6ff\") " pod="openstack/dnsmasq-dns-fbc59fbb7-688wh" Mar 09 13:44:49 crc kubenswrapper[4764]: I0309 13:44:48.692039 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6ed586ce-ac9a-4d0e-9686-6b526ab1c6ff-dns-svc\") pod \"dnsmasq-dns-fbc59fbb7-688wh\" (UID: \"6ed586ce-ac9a-4d0e-9686-6b526ab1c6ff\") " 
pod="openstack/dnsmasq-dns-fbc59fbb7-688wh" Mar 09 13:44:49 crc kubenswrapper[4764]: I0309 13:44:48.692128 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6ed586ce-ac9a-4d0e-9686-6b526ab1c6ff-ovsdbserver-sb\") pod \"dnsmasq-dns-fbc59fbb7-688wh\" (UID: \"6ed586ce-ac9a-4d0e-9686-6b526ab1c6ff\") " pod="openstack/dnsmasq-dns-fbc59fbb7-688wh" Mar 09 13:44:49 crc kubenswrapper[4764]: I0309 13:44:48.692179 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6ed586ce-ac9a-4d0e-9686-6b526ab1c6ff-config\") pod \"dnsmasq-dns-fbc59fbb7-688wh\" (UID: \"6ed586ce-ac9a-4d0e-9686-6b526ab1c6ff\") " pod="openstack/dnsmasq-dns-fbc59fbb7-688wh" Mar 09 13:44:49 crc kubenswrapper[4764]: I0309 13:44:48.692201 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j76rw\" (UniqueName: \"kubernetes.io/projected/6ed586ce-ac9a-4d0e-9686-6b526ab1c6ff-kube-api-access-j76rw\") pod \"dnsmasq-dns-fbc59fbb7-688wh\" (UID: \"6ed586ce-ac9a-4d0e-9686-6b526ab1c6ff\") " pod="openstack/dnsmasq-dns-fbc59fbb7-688wh" Mar 09 13:44:49 crc kubenswrapper[4764]: I0309 13:44:48.692368 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6ed586ce-ac9a-4d0e-9686-6b526ab1c6ff-ovsdbserver-nb\") pod \"dnsmasq-dns-fbc59fbb7-688wh\" (UID: \"6ed586ce-ac9a-4d0e-9686-6b526ab1c6ff\") " pod="openstack/dnsmasq-dns-fbc59fbb7-688wh" Mar 09 13:44:49 crc kubenswrapper[4764]: I0309 13:44:48.693307 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6ed586ce-ac9a-4d0e-9686-6b526ab1c6ff-dns-svc\") pod \"dnsmasq-dns-fbc59fbb7-688wh\" (UID: \"6ed586ce-ac9a-4d0e-9686-6b526ab1c6ff\") " pod="openstack/dnsmasq-dns-fbc59fbb7-688wh" Mar 09 13:44:49 crc kubenswrapper[4764]: 
I0309 13:44:48.693366 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6ed586ce-ac9a-4d0e-9686-6b526ab1c6ff-ovsdbserver-nb\") pod \"dnsmasq-dns-fbc59fbb7-688wh\" (UID: \"6ed586ce-ac9a-4d0e-9686-6b526ab1c6ff\") " pod="openstack/dnsmasq-dns-fbc59fbb7-688wh" Mar 09 13:44:49 crc kubenswrapper[4764]: I0309 13:44:48.693982 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/6ed586ce-ac9a-4d0e-9686-6b526ab1c6ff-openstack-edpm-ipam\") pod \"dnsmasq-dns-fbc59fbb7-688wh\" (UID: \"6ed586ce-ac9a-4d0e-9686-6b526ab1c6ff\") " pod="openstack/dnsmasq-dns-fbc59fbb7-688wh" Mar 09 13:44:49 crc kubenswrapper[4764]: I0309 13:44:48.694568 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6ed586ce-ac9a-4d0e-9686-6b526ab1c6ff-config\") pod \"dnsmasq-dns-fbc59fbb7-688wh\" (UID: \"6ed586ce-ac9a-4d0e-9686-6b526ab1c6ff\") " pod="openstack/dnsmasq-dns-fbc59fbb7-688wh" Mar 09 13:44:49 crc kubenswrapper[4764]: I0309 13:44:48.695392 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6ed586ce-ac9a-4d0e-9686-6b526ab1c6ff-ovsdbserver-sb\") pod \"dnsmasq-dns-fbc59fbb7-688wh\" (UID: \"6ed586ce-ac9a-4d0e-9686-6b526ab1c6ff\") " pod="openstack/dnsmasq-dns-fbc59fbb7-688wh" Mar 09 13:44:49 crc kubenswrapper[4764]: I0309 13:44:48.721067 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j76rw\" (UniqueName: \"kubernetes.io/projected/6ed586ce-ac9a-4d0e-9686-6b526ab1c6ff-kube-api-access-j76rw\") pod \"dnsmasq-dns-fbc59fbb7-688wh\" (UID: \"6ed586ce-ac9a-4d0e-9686-6b526ab1c6ff\") " pod="openstack/dnsmasq-dns-fbc59fbb7-688wh" Mar 09 13:44:49 crc kubenswrapper[4764]: I0309 13:44:48.810020 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-fbc59fbb7-688wh" Mar 09 13:44:49 crc kubenswrapper[4764]: I0309 13:44:48.830459 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-68d4b6d797-bwpkr" Mar 09 13:44:49 crc kubenswrapper[4764]: I0309 13:44:48.999523 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4ae25624-74af-4de4-8aa1-14ea5dbc7b68-config\") pod \"4ae25624-74af-4de4-8aa1-14ea5dbc7b68\" (UID: \"4ae25624-74af-4de4-8aa1-14ea5dbc7b68\") " Mar 09 13:44:49 crc kubenswrapper[4764]: I0309 13:44:48.999631 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4ae25624-74af-4de4-8aa1-14ea5dbc7b68-ovsdbserver-nb\") pod \"4ae25624-74af-4de4-8aa1-14ea5dbc7b68\" (UID: \"4ae25624-74af-4de4-8aa1-14ea5dbc7b68\") " Mar 09 13:44:49 crc kubenswrapper[4764]: I0309 13:44:48.999790 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8jzfc\" (UniqueName: \"kubernetes.io/projected/4ae25624-74af-4de4-8aa1-14ea5dbc7b68-kube-api-access-8jzfc\") pod \"4ae25624-74af-4de4-8aa1-14ea5dbc7b68\" (UID: \"4ae25624-74af-4de4-8aa1-14ea5dbc7b68\") " Mar 09 13:44:49 crc kubenswrapper[4764]: I0309 13:44:48.999971 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4ae25624-74af-4de4-8aa1-14ea5dbc7b68-dns-svc\") pod \"4ae25624-74af-4de4-8aa1-14ea5dbc7b68\" (UID: \"4ae25624-74af-4de4-8aa1-14ea5dbc7b68\") " Mar 09 13:44:49 crc kubenswrapper[4764]: I0309 13:44:49.000046 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4ae25624-74af-4de4-8aa1-14ea5dbc7b68-ovsdbserver-sb\") pod \"4ae25624-74af-4de4-8aa1-14ea5dbc7b68\" (UID: 
\"4ae25624-74af-4de4-8aa1-14ea5dbc7b68\") " Mar 09 13:44:49 crc kubenswrapper[4764]: I0309 13:44:49.007355 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4ae25624-74af-4de4-8aa1-14ea5dbc7b68-kube-api-access-8jzfc" (OuterVolumeSpecName: "kube-api-access-8jzfc") pod "4ae25624-74af-4de4-8aa1-14ea5dbc7b68" (UID: "4ae25624-74af-4de4-8aa1-14ea5dbc7b68"). InnerVolumeSpecName "kube-api-access-8jzfc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:44:49 crc kubenswrapper[4764]: I0309 13:44:49.059890 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4ae25624-74af-4de4-8aa1-14ea5dbc7b68-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "4ae25624-74af-4de4-8aa1-14ea5dbc7b68" (UID: "4ae25624-74af-4de4-8aa1-14ea5dbc7b68"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:44:49 crc kubenswrapper[4764]: I0309 13:44:49.060228 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4ae25624-74af-4de4-8aa1-14ea5dbc7b68-config" (OuterVolumeSpecName: "config") pod "4ae25624-74af-4de4-8aa1-14ea5dbc7b68" (UID: "4ae25624-74af-4de4-8aa1-14ea5dbc7b68"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:44:49 crc kubenswrapper[4764]: I0309 13:44:49.060750 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4ae25624-74af-4de4-8aa1-14ea5dbc7b68-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "4ae25624-74af-4de4-8aa1-14ea5dbc7b68" (UID: "4ae25624-74af-4de4-8aa1-14ea5dbc7b68"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:44:49 crc kubenswrapper[4764]: I0309 13:44:49.061184 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4ae25624-74af-4de4-8aa1-14ea5dbc7b68-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "4ae25624-74af-4de4-8aa1-14ea5dbc7b68" (UID: "4ae25624-74af-4de4-8aa1-14ea5dbc7b68"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:44:49 crc kubenswrapper[4764]: I0309 13:44:49.102621 4764 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4ae25624-74af-4de4-8aa1-14ea5dbc7b68-config\") on node \"crc\" DevicePath \"\"" Mar 09 13:44:49 crc kubenswrapper[4764]: I0309 13:44:49.102670 4764 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4ae25624-74af-4de4-8aa1-14ea5dbc7b68-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 09 13:44:49 crc kubenswrapper[4764]: I0309 13:44:49.102683 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8jzfc\" (UniqueName: \"kubernetes.io/projected/4ae25624-74af-4de4-8aa1-14ea5dbc7b68-kube-api-access-8jzfc\") on node \"crc\" DevicePath \"\"" Mar 09 13:44:49 crc kubenswrapper[4764]: I0309 13:44:49.102692 4764 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4ae25624-74af-4de4-8aa1-14ea5dbc7b68-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 09 13:44:49 crc kubenswrapper[4764]: I0309 13:44:49.102709 4764 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4ae25624-74af-4de4-8aa1-14ea5dbc7b68-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 09 13:44:49 crc kubenswrapper[4764]: I0309 13:44:49.591425 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-68d4b6d797-bwpkr" 
event={"ID":"4ae25624-74af-4de4-8aa1-14ea5dbc7b68","Type":"ContainerDied","Data":"3ac89812f31874fa083542c67d0593b58a24f3774ac9cde06937d2e9c1a94aaf"} Mar 09 13:44:49 crc kubenswrapper[4764]: I0309 13:44:49.591481 4764 scope.go:117] "RemoveContainer" containerID="37fa28134b3929cb4f988c4f648598adc00dd97c039b66d6a61153de431009b4" Mar 09 13:44:49 crc kubenswrapper[4764]: I0309 13:44:49.591660 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-68d4b6d797-bwpkr" Mar 09 13:44:49 crc kubenswrapper[4764]: I0309 13:44:49.618233 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-68d4b6d797-bwpkr"] Mar 09 13:44:49 crc kubenswrapper[4764]: I0309 13:44:49.627314 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-68d4b6d797-bwpkr"] Mar 09 13:44:49 crc kubenswrapper[4764]: I0309 13:44:49.630673 4764 scope.go:117] "RemoveContainer" containerID="1417bd2805b59981136a1861cd39d3dca99a51d7e7919e8b5c01839da7f91123" Mar 09 13:44:49 crc kubenswrapper[4764]: I0309 13:44:49.996151 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-fbc59fbb7-688wh"] Mar 09 13:44:50 crc kubenswrapper[4764]: I0309 13:44:50.607451 4764 generic.go:334] "Generic (PLEG): container finished" podID="6ed586ce-ac9a-4d0e-9686-6b526ab1c6ff" containerID="3b50e9b2a9f264bfcb22b787b33fb1bad8b4dde68292e87c3d306716bd5422e2" exitCode=0 Mar 09 13:44:50 crc kubenswrapper[4764]: I0309 13:44:50.607585 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-fbc59fbb7-688wh" event={"ID":"6ed586ce-ac9a-4d0e-9686-6b526ab1c6ff","Type":"ContainerDied","Data":"3b50e9b2a9f264bfcb22b787b33fb1bad8b4dde68292e87c3d306716bd5422e2"} Mar 09 13:44:50 crc kubenswrapper[4764]: I0309 13:44:50.607696 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-fbc59fbb7-688wh" 
event={"ID":"6ed586ce-ac9a-4d0e-9686-6b526ab1c6ff","Type":"ContainerStarted","Data":"7623e23ae56b8d47625b88ad80b059396febc09e926f11449c6996f010201d29"} Mar 09 13:44:51 crc kubenswrapper[4764]: I0309 13:44:51.573174 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4ae25624-74af-4de4-8aa1-14ea5dbc7b68" path="/var/lib/kubelet/pods/4ae25624-74af-4de4-8aa1-14ea5dbc7b68/volumes" Mar 09 13:44:51 crc kubenswrapper[4764]: I0309 13:44:51.621568 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-fbc59fbb7-688wh" event={"ID":"6ed586ce-ac9a-4d0e-9686-6b526ab1c6ff","Type":"ContainerStarted","Data":"0b92f70675fc36bf2d871b5244dabd98b74bc41c0324eee9ebebf35ece42f834"} Mar 09 13:44:51 crc kubenswrapper[4764]: I0309 13:44:51.651937 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-fbc59fbb7-688wh" podStartSLOduration=3.651911845 podStartE2EDuration="3.651911845s" podCreationTimestamp="2026-03-09 13:44:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:44:51.643014648 +0000 UTC m=+1446.893186586" watchObservedRunningTime="2026-03-09 13:44:51.651911845 +0000 UTC m=+1446.902083763" Mar 09 13:44:52 crc kubenswrapper[4764]: I0309 13:44:52.631868 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-fbc59fbb7-688wh" Mar 09 13:44:58 crc kubenswrapper[4764]: I0309 13:44:58.812842 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-fbc59fbb7-688wh" Mar 09 13:44:58 crc kubenswrapper[4764]: I0309 13:44:58.890412 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-578b8d767c-f8lbq"] Mar 09 13:44:58 crc kubenswrapper[4764]: I0309 13:44:58.891611 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-578b8d767c-f8lbq" 
podUID="3912c156-63a8-4756-bc55-4e403c3807f8" containerName="dnsmasq-dns" containerID="cri-o://5a80bb357eacada450213ffb7482e6c0c27854fdde2db316d00ae017b880c5af" gracePeriod=10 Mar 09 13:44:59 crc kubenswrapper[4764]: I0309 13:44:59.389106 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-578b8d767c-f8lbq" Mar 09 13:44:59 crc kubenswrapper[4764]: I0309 13:44:59.560854 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ws2cl\" (UniqueName: \"kubernetes.io/projected/3912c156-63a8-4756-bc55-4e403c3807f8-kube-api-access-ws2cl\") pod \"3912c156-63a8-4756-bc55-4e403c3807f8\" (UID: \"3912c156-63a8-4756-bc55-4e403c3807f8\") " Mar 09 13:44:59 crc kubenswrapper[4764]: I0309 13:44:59.560917 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3912c156-63a8-4756-bc55-4e403c3807f8-config\") pod \"3912c156-63a8-4756-bc55-4e403c3807f8\" (UID: \"3912c156-63a8-4756-bc55-4e403c3807f8\") " Mar 09 13:44:59 crc kubenswrapper[4764]: I0309 13:44:59.560962 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3912c156-63a8-4756-bc55-4e403c3807f8-ovsdbserver-sb\") pod \"3912c156-63a8-4756-bc55-4e403c3807f8\" (UID: \"3912c156-63a8-4756-bc55-4e403c3807f8\") " Mar 09 13:44:59 crc kubenswrapper[4764]: I0309 13:44:59.561030 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/3912c156-63a8-4756-bc55-4e403c3807f8-openstack-edpm-ipam\") pod \"3912c156-63a8-4756-bc55-4e403c3807f8\" (UID: \"3912c156-63a8-4756-bc55-4e403c3807f8\") " Mar 09 13:44:59 crc kubenswrapper[4764]: I0309 13:44:59.561167 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/3912c156-63a8-4756-bc55-4e403c3807f8-dns-svc\") pod \"3912c156-63a8-4756-bc55-4e403c3807f8\" (UID: \"3912c156-63a8-4756-bc55-4e403c3807f8\") " Mar 09 13:44:59 crc kubenswrapper[4764]: I0309 13:44:59.561250 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3912c156-63a8-4756-bc55-4e403c3807f8-ovsdbserver-nb\") pod \"3912c156-63a8-4756-bc55-4e403c3807f8\" (UID: \"3912c156-63a8-4756-bc55-4e403c3807f8\") " Mar 09 13:44:59 crc kubenswrapper[4764]: I0309 13:44:59.574080 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3912c156-63a8-4756-bc55-4e403c3807f8-kube-api-access-ws2cl" (OuterVolumeSpecName: "kube-api-access-ws2cl") pod "3912c156-63a8-4756-bc55-4e403c3807f8" (UID: "3912c156-63a8-4756-bc55-4e403c3807f8"). InnerVolumeSpecName "kube-api-access-ws2cl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:44:59 crc kubenswrapper[4764]: I0309 13:44:59.612845 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3912c156-63a8-4756-bc55-4e403c3807f8-config" (OuterVolumeSpecName: "config") pod "3912c156-63a8-4756-bc55-4e403c3807f8" (UID: "3912c156-63a8-4756-bc55-4e403c3807f8"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:44:59 crc kubenswrapper[4764]: I0309 13:44:59.616796 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3912c156-63a8-4756-bc55-4e403c3807f8-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "3912c156-63a8-4756-bc55-4e403c3807f8" (UID: "3912c156-63a8-4756-bc55-4e403c3807f8"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:44:59 crc kubenswrapper[4764]: I0309 13:44:59.619693 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3912c156-63a8-4756-bc55-4e403c3807f8-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "3912c156-63a8-4756-bc55-4e403c3807f8" (UID: "3912c156-63a8-4756-bc55-4e403c3807f8"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:44:59 crc kubenswrapper[4764]: I0309 13:44:59.620143 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3912c156-63a8-4756-bc55-4e403c3807f8-openstack-edpm-ipam" (OuterVolumeSpecName: "openstack-edpm-ipam") pod "3912c156-63a8-4756-bc55-4e403c3807f8" (UID: "3912c156-63a8-4756-bc55-4e403c3807f8"). InnerVolumeSpecName "openstack-edpm-ipam". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:44:59 crc kubenswrapper[4764]: I0309 13:44:59.620198 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3912c156-63a8-4756-bc55-4e403c3807f8-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "3912c156-63a8-4756-bc55-4e403c3807f8" (UID: "3912c156-63a8-4756-bc55-4e403c3807f8"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:44:59 crc kubenswrapper[4764]: I0309 13:44:59.663966 4764 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3912c156-63a8-4756-bc55-4e403c3807f8-config\") on node \"crc\" DevicePath \"\"" Mar 09 13:44:59 crc kubenswrapper[4764]: I0309 13:44:59.664006 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ws2cl\" (UniqueName: \"kubernetes.io/projected/3912c156-63a8-4756-bc55-4e403c3807f8-kube-api-access-ws2cl\") on node \"crc\" DevicePath \"\"" Mar 09 13:44:59 crc kubenswrapper[4764]: I0309 13:44:59.664019 4764 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3912c156-63a8-4756-bc55-4e403c3807f8-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 09 13:44:59 crc kubenswrapper[4764]: I0309 13:44:59.664029 4764 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/3912c156-63a8-4756-bc55-4e403c3807f8-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 09 13:44:59 crc kubenswrapper[4764]: I0309 13:44:59.664038 4764 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3912c156-63a8-4756-bc55-4e403c3807f8-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 09 13:44:59 crc kubenswrapper[4764]: I0309 13:44:59.664051 4764 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3912c156-63a8-4756-bc55-4e403c3807f8-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 09 13:44:59 crc kubenswrapper[4764]: I0309 13:44:59.702510 4764 generic.go:334] "Generic (PLEG): container finished" podID="3912c156-63a8-4756-bc55-4e403c3807f8" containerID="5a80bb357eacada450213ffb7482e6c0c27854fdde2db316d00ae017b880c5af" exitCode=0 Mar 09 13:44:59 crc kubenswrapper[4764]: I0309 13:44:59.702582 4764 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-578b8d767c-f8lbq" event={"ID":"3912c156-63a8-4756-bc55-4e403c3807f8","Type":"ContainerDied","Data":"5a80bb357eacada450213ffb7482e6c0c27854fdde2db316d00ae017b880c5af"} Mar 09 13:44:59 crc kubenswrapper[4764]: I0309 13:44:59.702610 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-578b8d767c-f8lbq" Mar 09 13:44:59 crc kubenswrapper[4764]: I0309 13:44:59.702718 4764 scope.go:117] "RemoveContainer" containerID="5a80bb357eacada450213ffb7482e6c0c27854fdde2db316d00ae017b880c5af" Mar 09 13:44:59 crc kubenswrapper[4764]: I0309 13:44:59.702627 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-578b8d767c-f8lbq" event={"ID":"3912c156-63a8-4756-bc55-4e403c3807f8","Type":"ContainerDied","Data":"05c41c0e7884eaf2ee86d117de131aea3691b93a29e2fa0912581efbd369cd9a"} Mar 09 13:44:59 crc kubenswrapper[4764]: I0309 13:44:59.731362 4764 scope.go:117] "RemoveContainer" containerID="56f82ff470e1fffea03da2d37cd7b44a6189cee402a05697723d71f2a014b32a" Mar 09 13:44:59 crc kubenswrapper[4764]: I0309 13:44:59.743415 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-578b8d767c-f8lbq"] Mar 09 13:44:59 crc kubenswrapper[4764]: I0309 13:44:59.751881 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-578b8d767c-f8lbq"] Mar 09 13:44:59 crc kubenswrapper[4764]: I0309 13:44:59.781356 4764 scope.go:117] "RemoveContainer" containerID="5a80bb357eacada450213ffb7482e6c0c27854fdde2db316d00ae017b880c5af" Mar 09 13:44:59 crc kubenswrapper[4764]: E0309 13:44:59.782068 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5a80bb357eacada450213ffb7482e6c0c27854fdde2db316d00ae017b880c5af\": container with ID starting with 5a80bb357eacada450213ffb7482e6c0c27854fdde2db316d00ae017b880c5af not found: ID does not exist" 
containerID="5a80bb357eacada450213ffb7482e6c0c27854fdde2db316d00ae017b880c5af" Mar 09 13:44:59 crc kubenswrapper[4764]: I0309 13:44:59.782137 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5a80bb357eacada450213ffb7482e6c0c27854fdde2db316d00ae017b880c5af"} err="failed to get container status \"5a80bb357eacada450213ffb7482e6c0c27854fdde2db316d00ae017b880c5af\": rpc error: code = NotFound desc = could not find container \"5a80bb357eacada450213ffb7482e6c0c27854fdde2db316d00ae017b880c5af\": container with ID starting with 5a80bb357eacada450213ffb7482e6c0c27854fdde2db316d00ae017b880c5af not found: ID does not exist" Mar 09 13:44:59 crc kubenswrapper[4764]: I0309 13:44:59.782177 4764 scope.go:117] "RemoveContainer" containerID="56f82ff470e1fffea03da2d37cd7b44a6189cee402a05697723d71f2a014b32a" Mar 09 13:44:59 crc kubenswrapper[4764]: E0309 13:44:59.783060 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"56f82ff470e1fffea03da2d37cd7b44a6189cee402a05697723d71f2a014b32a\": container with ID starting with 56f82ff470e1fffea03da2d37cd7b44a6189cee402a05697723d71f2a014b32a not found: ID does not exist" containerID="56f82ff470e1fffea03da2d37cd7b44a6189cee402a05697723d71f2a014b32a" Mar 09 13:44:59 crc kubenswrapper[4764]: I0309 13:44:59.783108 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"56f82ff470e1fffea03da2d37cd7b44a6189cee402a05697723d71f2a014b32a"} err="failed to get container status \"56f82ff470e1fffea03da2d37cd7b44a6189cee402a05697723d71f2a014b32a\": rpc error: code = NotFound desc = could not find container \"56f82ff470e1fffea03da2d37cd7b44a6189cee402a05697723d71f2a014b32a\": container with ID starting with 56f82ff470e1fffea03da2d37cd7b44a6189cee402a05697723d71f2a014b32a not found: ID does not exist" Mar 09 13:45:00 crc kubenswrapper[4764]: I0309 13:45:00.151973 4764 kubelet.go:2421] 
"SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29551065-gdnvt"] Mar 09 13:45:00 crc kubenswrapper[4764]: E0309 13:45:00.152776 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4ae25624-74af-4de4-8aa1-14ea5dbc7b68" containerName="dnsmasq-dns" Mar 09 13:45:00 crc kubenswrapper[4764]: I0309 13:45:00.152792 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ae25624-74af-4de4-8aa1-14ea5dbc7b68" containerName="dnsmasq-dns" Mar 09 13:45:00 crc kubenswrapper[4764]: E0309 13:45:00.152839 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4ae25624-74af-4de4-8aa1-14ea5dbc7b68" containerName="init" Mar 09 13:45:00 crc kubenswrapper[4764]: I0309 13:45:00.152846 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ae25624-74af-4de4-8aa1-14ea5dbc7b68" containerName="init" Mar 09 13:45:00 crc kubenswrapper[4764]: E0309 13:45:00.152859 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3912c156-63a8-4756-bc55-4e403c3807f8" containerName="init" Mar 09 13:45:00 crc kubenswrapper[4764]: I0309 13:45:00.152865 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="3912c156-63a8-4756-bc55-4e403c3807f8" containerName="init" Mar 09 13:45:00 crc kubenswrapper[4764]: E0309 13:45:00.152878 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3912c156-63a8-4756-bc55-4e403c3807f8" containerName="dnsmasq-dns" Mar 09 13:45:00 crc kubenswrapper[4764]: I0309 13:45:00.152884 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="3912c156-63a8-4756-bc55-4e403c3807f8" containerName="dnsmasq-dns" Mar 09 13:45:00 crc kubenswrapper[4764]: I0309 13:45:00.153056 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="3912c156-63a8-4756-bc55-4e403c3807f8" containerName="dnsmasq-dns" Mar 09 13:45:00 crc kubenswrapper[4764]: I0309 13:45:00.153076 4764 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="4ae25624-74af-4de4-8aa1-14ea5dbc7b68" containerName="dnsmasq-dns" Mar 09 13:45:00 crc kubenswrapper[4764]: I0309 13:45:00.154020 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29551065-gdnvt" Mar 09 13:45:00 crc kubenswrapper[4764]: I0309 13:45:00.158562 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 09 13:45:00 crc kubenswrapper[4764]: I0309 13:45:00.159845 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 09 13:45:00 crc kubenswrapper[4764]: I0309 13:45:00.165480 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29551065-gdnvt"] Mar 09 13:45:00 crc kubenswrapper[4764]: I0309 13:45:00.173449 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5eb53f8f-cfd0-4061-b4ac-741d77ed6f0d-config-volume\") pod \"collect-profiles-29551065-gdnvt\" (UID: \"5eb53f8f-cfd0-4061-b4ac-741d77ed6f0d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551065-gdnvt" Mar 09 13:45:00 crc kubenswrapper[4764]: I0309 13:45:00.173533 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5eb53f8f-cfd0-4061-b4ac-741d77ed6f0d-secret-volume\") pod \"collect-profiles-29551065-gdnvt\" (UID: \"5eb53f8f-cfd0-4061-b4ac-741d77ed6f0d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551065-gdnvt" Mar 09 13:45:00 crc kubenswrapper[4764]: I0309 13:45:00.173703 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4hpj2\" (UniqueName: 
\"kubernetes.io/projected/5eb53f8f-cfd0-4061-b4ac-741d77ed6f0d-kube-api-access-4hpj2\") pod \"collect-profiles-29551065-gdnvt\" (UID: \"5eb53f8f-cfd0-4061-b4ac-741d77ed6f0d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551065-gdnvt" Mar 09 13:45:00 crc kubenswrapper[4764]: I0309 13:45:00.275718 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5eb53f8f-cfd0-4061-b4ac-741d77ed6f0d-secret-volume\") pod \"collect-profiles-29551065-gdnvt\" (UID: \"5eb53f8f-cfd0-4061-b4ac-741d77ed6f0d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551065-gdnvt" Mar 09 13:45:00 crc kubenswrapper[4764]: I0309 13:45:00.275845 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4hpj2\" (UniqueName: \"kubernetes.io/projected/5eb53f8f-cfd0-4061-b4ac-741d77ed6f0d-kube-api-access-4hpj2\") pod \"collect-profiles-29551065-gdnvt\" (UID: \"5eb53f8f-cfd0-4061-b4ac-741d77ed6f0d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551065-gdnvt" Mar 09 13:45:00 crc kubenswrapper[4764]: I0309 13:45:00.275937 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5eb53f8f-cfd0-4061-b4ac-741d77ed6f0d-config-volume\") pod \"collect-profiles-29551065-gdnvt\" (UID: \"5eb53f8f-cfd0-4061-b4ac-741d77ed6f0d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551065-gdnvt" Mar 09 13:45:00 crc kubenswrapper[4764]: I0309 13:45:00.276875 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5eb53f8f-cfd0-4061-b4ac-741d77ed6f0d-config-volume\") pod \"collect-profiles-29551065-gdnvt\" (UID: \"5eb53f8f-cfd0-4061-b4ac-741d77ed6f0d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551065-gdnvt" Mar 09 13:45:00 crc kubenswrapper[4764]: I0309 
13:45:00.284087 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5eb53f8f-cfd0-4061-b4ac-741d77ed6f0d-secret-volume\") pod \"collect-profiles-29551065-gdnvt\" (UID: \"5eb53f8f-cfd0-4061-b4ac-741d77ed6f0d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551065-gdnvt" Mar 09 13:45:00 crc kubenswrapper[4764]: I0309 13:45:00.299226 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4hpj2\" (UniqueName: \"kubernetes.io/projected/5eb53f8f-cfd0-4061-b4ac-741d77ed6f0d-kube-api-access-4hpj2\") pod \"collect-profiles-29551065-gdnvt\" (UID: \"5eb53f8f-cfd0-4061-b4ac-741d77ed6f0d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551065-gdnvt" Mar 09 13:45:00 crc kubenswrapper[4764]: I0309 13:45:00.477926 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29551065-gdnvt" Mar 09 13:45:00 crc kubenswrapper[4764]: I0309 13:45:00.941325 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29551065-gdnvt"] Mar 09 13:45:01 crc kubenswrapper[4764]: I0309 13:45:01.575941 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3912c156-63a8-4756-bc55-4e403c3807f8" path="/var/lib/kubelet/pods/3912c156-63a8-4756-bc55-4e403c3807f8/volumes" Mar 09 13:45:01 crc kubenswrapper[4764]: I0309 13:45:01.736424 4764 generic.go:334] "Generic (PLEG): container finished" podID="5eb53f8f-cfd0-4061-b4ac-741d77ed6f0d" containerID="797b0150f56f98282613dd9aa20b75479aae0c1306bab3b67c8a86168757b198" exitCode=0 Mar 09 13:45:01 crc kubenswrapper[4764]: I0309 13:45:01.736483 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29551065-gdnvt" 
event={"ID":"5eb53f8f-cfd0-4061-b4ac-741d77ed6f0d","Type":"ContainerDied","Data":"797b0150f56f98282613dd9aa20b75479aae0c1306bab3b67c8a86168757b198"} Mar 09 13:45:01 crc kubenswrapper[4764]: I0309 13:45:01.736524 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29551065-gdnvt" event={"ID":"5eb53f8f-cfd0-4061-b4ac-741d77ed6f0d","Type":"ContainerStarted","Data":"6d6a58716a56d842f085937a0ecf38105dd034284c1e2744c05cc090d42d339a"} Mar 09 13:45:03 crc kubenswrapper[4764]: I0309 13:45:03.079072 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29551065-gdnvt" Mar 09 13:45:03 crc kubenswrapper[4764]: I0309 13:45:03.134409 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5eb53f8f-cfd0-4061-b4ac-741d77ed6f0d-secret-volume\") pod \"5eb53f8f-cfd0-4061-b4ac-741d77ed6f0d\" (UID: \"5eb53f8f-cfd0-4061-b4ac-741d77ed6f0d\") " Mar 09 13:45:03 crc kubenswrapper[4764]: I0309 13:45:03.134825 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4hpj2\" (UniqueName: \"kubernetes.io/projected/5eb53f8f-cfd0-4061-b4ac-741d77ed6f0d-kube-api-access-4hpj2\") pod \"5eb53f8f-cfd0-4061-b4ac-741d77ed6f0d\" (UID: \"5eb53f8f-cfd0-4061-b4ac-741d77ed6f0d\") " Mar 09 13:45:03 crc kubenswrapper[4764]: I0309 13:45:03.142141 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5eb53f8f-cfd0-4061-b4ac-741d77ed6f0d-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "5eb53f8f-cfd0-4061-b4ac-741d77ed6f0d" (UID: "5eb53f8f-cfd0-4061-b4ac-741d77ed6f0d"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:45:03 crc kubenswrapper[4764]: I0309 13:45:03.142187 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5eb53f8f-cfd0-4061-b4ac-741d77ed6f0d-kube-api-access-4hpj2" (OuterVolumeSpecName: "kube-api-access-4hpj2") pod "5eb53f8f-cfd0-4061-b4ac-741d77ed6f0d" (UID: "5eb53f8f-cfd0-4061-b4ac-741d77ed6f0d"). InnerVolumeSpecName "kube-api-access-4hpj2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:45:03 crc kubenswrapper[4764]: I0309 13:45:03.237081 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5eb53f8f-cfd0-4061-b4ac-741d77ed6f0d-config-volume\") pod \"5eb53f8f-cfd0-4061-b4ac-741d77ed6f0d\" (UID: \"5eb53f8f-cfd0-4061-b4ac-741d77ed6f0d\") " Mar 09 13:45:03 crc kubenswrapper[4764]: I0309 13:45:03.237577 4764 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5eb53f8f-cfd0-4061-b4ac-741d77ed6f0d-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 09 13:45:03 crc kubenswrapper[4764]: I0309 13:45:03.237608 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4hpj2\" (UniqueName: \"kubernetes.io/projected/5eb53f8f-cfd0-4061-b4ac-741d77ed6f0d-kube-api-access-4hpj2\") on node \"crc\" DevicePath \"\"" Mar 09 13:45:03 crc kubenswrapper[4764]: I0309 13:45:03.237951 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5eb53f8f-cfd0-4061-b4ac-741d77ed6f0d-config-volume" (OuterVolumeSpecName: "config-volume") pod "5eb53f8f-cfd0-4061-b4ac-741d77ed6f0d" (UID: "5eb53f8f-cfd0-4061-b4ac-741d77ed6f0d"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:45:03 crc kubenswrapper[4764]: I0309 13:45:03.340760 4764 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5eb53f8f-cfd0-4061-b4ac-741d77ed6f0d-config-volume\") on node \"crc\" DevicePath \"\"" Mar 09 13:45:03 crc kubenswrapper[4764]: I0309 13:45:03.762411 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29551065-gdnvt" event={"ID":"5eb53f8f-cfd0-4061-b4ac-741d77ed6f0d","Type":"ContainerDied","Data":"6d6a58716a56d842f085937a0ecf38105dd034284c1e2744c05cc090d42d339a"} Mar 09 13:45:03 crc kubenswrapper[4764]: I0309 13:45:03.762475 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6d6a58716a56d842f085937a0ecf38105dd034284c1e2744c05cc090d42d339a" Mar 09 13:45:03 crc kubenswrapper[4764]: I0309 13:45:03.762476 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29551065-gdnvt" Mar 09 13:45:05 crc kubenswrapper[4764]: I0309 13:45:05.013884 4764 scope.go:117] "RemoveContainer" containerID="c0f759ebfc59d520d002517970a3889749936b75779f65069038f6e34fd87723" Mar 09 13:45:08 crc kubenswrapper[4764]: I0309 13:45:08.994097 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-89zj7"] Mar 09 13:45:08 crc kubenswrapper[4764]: E0309 13:45:08.995336 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5eb53f8f-cfd0-4061-b4ac-741d77ed6f0d" containerName="collect-profiles" Mar 09 13:45:08 crc kubenswrapper[4764]: I0309 13:45:08.995353 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="5eb53f8f-cfd0-4061-b4ac-741d77ed6f0d" containerName="collect-profiles" Mar 09 13:45:08 crc kubenswrapper[4764]: I0309 13:45:08.995549 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="5eb53f8f-cfd0-4061-b4ac-741d77ed6f0d" containerName="collect-profiles" Mar 09 13:45:08 crc kubenswrapper[4764]: I0309 13:45:08.996340 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-89zj7" Mar 09 13:45:08 crc kubenswrapper[4764]: I0309 13:45:08.999332 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 09 13:45:09 crc kubenswrapper[4764]: I0309 13:45:08.999996 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-ctrj8" Mar 09 13:45:09 crc kubenswrapper[4764]: I0309 13:45:09.000385 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 09 13:45:09 crc kubenswrapper[4764]: I0309 13:45:09.000597 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 09 13:45:09 crc kubenswrapper[4764]: I0309 13:45:09.012680 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-89zj7"] Mar 09 13:45:09 crc kubenswrapper[4764]: I0309 13:45:09.170197 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-npdx4\" (UniqueName: \"kubernetes.io/projected/07f61b11-aba4-469c-a5ed-9566f1951559-kube-api-access-npdx4\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-89zj7\" (UID: \"07f61b11-aba4-469c-a5ed-9566f1951559\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-89zj7" Mar 09 13:45:09 crc kubenswrapper[4764]: I0309 13:45:09.170361 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/07f61b11-aba4-469c-a5ed-9566f1951559-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-89zj7\" (UID: \"07f61b11-aba4-469c-a5ed-9566f1951559\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-89zj7" Mar 09 13:45:09 crc kubenswrapper[4764]: I0309 13:45:09.170475 4764 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/07f61b11-aba4-469c-a5ed-9566f1951559-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-89zj7\" (UID: \"07f61b11-aba4-469c-a5ed-9566f1951559\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-89zj7" Mar 09 13:45:09 crc kubenswrapper[4764]: I0309 13:45:09.170504 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/07f61b11-aba4-469c-a5ed-9566f1951559-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-89zj7\" (UID: \"07f61b11-aba4-469c-a5ed-9566f1951559\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-89zj7" Mar 09 13:45:09 crc kubenswrapper[4764]: I0309 13:45:09.272291 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/07f61b11-aba4-469c-a5ed-9566f1951559-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-89zj7\" (UID: \"07f61b11-aba4-469c-a5ed-9566f1951559\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-89zj7" Mar 09 13:45:09 crc kubenswrapper[4764]: I0309 13:45:09.272356 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/07f61b11-aba4-469c-a5ed-9566f1951559-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-89zj7\" (UID: \"07f61b11-aba4-469c-a5ed-9566f1951559\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-89zj7" Mar 09 13:45:09 crc kubenswrapper[4764]: I0309 13:45:09.272391 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/07f61b11-aba4-469c-a5ed-9566f1951559-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-89zj7\" (UID: \"07f61b11-aba4-469c-a5ed-9566f1951559\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-89zj7" Mar 09 13:45:09 crc kubenswrapper[4764]: I0309 13:45:09.272479 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-npdx4\" (UniqueName: \"kubernetes.io/projected/07f61b11-aba4-469c-a5ed-9566f1951559-kube-api-access-npdx4\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-89zj7\" (UID: \"07f61b11-aba4-469c-a5ed-9566f1951559\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-89zj7" Mar 09 13:45:09 crc kubenswrapper[4764]: I0309 13:45:09.289726 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/07f61b11-aba4-469c-a5ed-9566f1951559-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-89zj7\" (UID: \"07f61b11-aba4-469c-a5ed-9566f1951559\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-89zj7" Mar 09 13:45:09 crc kubenswrapper[4764]: I0309 13:45:09.293558 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/07f61b11-aba4-469c-a5ed-9566f1951559-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-89zj7\" (UID: \"07f61b11-aba4-469c-a5ed-9566f1951559\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-89zj7" Mar 09 13:45:09 crc kubenswrapper[4764]: I0309 13:45:09.296953 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/07f61b11-aba4-469c-a5ed-9566f1951559-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-89zj7\" (UID: \"07f61b11-aba4-469c-a5ed-9566f1951559\") " 
pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-89zj7" Mar 09 13:45:09 crc kubenswrapper[4764]: I0309 13:45:09.304771 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-npdx4\" (UniqueName: \"kubernetes.io/projected/07f61b11-aba4-469c-a5ed-9566f1951559-kube-api-access-npdx4\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-89zj7\" (UID: \"07f61b11-aba4-469c-a5ed-9566f1951559\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-89zj7" Mar 09 13:45:09 crc kubenswrapper[4764]: I0309 13:45:09.328357 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-89zj7" Mar 09 13:45:09 crc kubenswrapper[4764]: I0309 13:45:09.883728 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-89zj7"] Mar 09 13:45:09 crc kubenswrapper[4764]: I0309 13:45:09.891514 4764 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 09 13:45:10 crc kubenswrapper[4764]: I0309 13:45:10.834496 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-89zj7" event={"ID":"07f61b11-aba4-469c-a5ed-9566f1951559","Type":"ContainerStarted","Data":"126e7156c2e3006e0a3f1b9868e0b20082a6b17496796473f809ef087c89923d"} Mar 09 13:45:12 crc kubenswrapper[4764]: I0309 13:45:12.861982 4764 generic.go:334] "Generic (PLEG): container finished" podID="c5579dd7-5380-4042-8c78-c6837d841d5e" containerID="188de235de55b51ff40610026c7e52499fd41faec0f6ea44af2ec4f4625f4040" exitCode=0 Mar 09 13:45:12 crc kubenswrapper[4764]: I0309 13:45:12.862493 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"c5579dd7-5380-4042-8c78-c6837d841d5e","Type":"ContainerDied","Data":"188de235de55b51ff40610026c7e52499fd41faec0f6ea44af2ec4f4625f4040"} Mar 09 13:45:12 crc 
kubenswrapper[4764]: I0309 13:45:12.866600 4764 generic.go:334] "Generic (PLEG): container finished" podID="b19144b6-cc4c-41d6-ad2e-409c021f657c" containerID="4ebc5d892f1f305b7293fcf5f466f2290393eed4c595d382d19c94d2aa4f9a50" exitCode=0 Mar 09 13:45:12 crc kubenswrapper[4764]: I0309 13:45:12.866677 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"b19144b6-cc4c-41d6-ad2e-409c021f657c","Type":"ContainerDied","Data":"4ebc5d892f1f305b7293fcf5f466f2290393eed4c595d382d19c94d2aa4f9a50"} Mar 09 13:45:13 crc kubenswrapper[4764]: I0309 13:45:13.877314 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"c5579dd7-5380-4042-8c78-c6837d841d5e","Type":"ContainerStarted","Data":"6c3db23c5bed35b1a88f34166c094e7dd5ad5d574e5e1e8d461ac988365c8de5"} Mar 09 13:45:13 crc kubenswrapper[4764]: I0309 13:45:13.878217 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Mar 09 13:45:13 crc kubenswrapper[4764]: I0309 13:45:13.879021 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"b19144b6-cc4c-41d6-ad2e-409c021f657c","Type":"ContainerStarted","Data":"e39b9f2171082500f2f291393f6f825360523292c0d053435a6028812f74ae5a"} Mar 09 13:45:13 crc kubenswrapper[4764]: I0309 13:45:13.879494 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Mar 09 13:45:13 crc kubenswrapper[4764]: I0309 13:45:13.914105 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=36.914082065 podStartE2EDuration="36.914082065s" podCreationTimestamp="2026-03-09 13:44:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:45:13.908462675 +0000 UTC m=+1469.158634613" 
watchObservedRunningTime="2026-03-09 13:45:13.914082065 +0000 UTC m=+1469.164253983" Mar 09 13:45:13 crc kubenswrapper[4764]: I0309 13:45:13.944753 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=37.944727812 podStartE2EDuration="37.944727812s" podCreationTimestamp="2026-03-09 13:44:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:45:13.934990482 +0000 UTC m=+1469.185162390" watchObservedRunningTime="2026-03-09 13:45:13.944727812 +0000 UTC m=+1469.194899720" Mar 09 13:45:19 crc kubenswrapper[4764]: I0309 13:45:19.942834 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-89zj7" event={"ID":"07f61b11-aba4-469c-a5ed-9566f1951559","Type":"ContainerStarted","Data":"bde45a06faa972981715117bca5bea2ffa8f521b6f7a639569a4330b9afefe5a"} Mar 09 13:45:19 crc kubenswrapper[4764]: I0309 13:45:19.963808 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-89zj7" podStartSLOduration=2.64079206 podStartE2EDuration="11.963784219s" podCreationTimestamp="2026-03-09 13:45:08 +0000 UTC" firstStartedPulling="2026-03-09 13:45:09.891264778 +0000 UTC m=+1465.141436686" lastFinishedPulling="2026-03-09 13:45:19.214256937 +0000 UTC m=+1474.464428845" observedRunningTime="2026-03-09 13:45:19.9623275 +0000 UTC m=+1475.212499428" watchObservedRunningTime="2026-03-09 13:45:19.963784219 +0000 UTC m=+1475.213956147" Mar 09 13:45:27 crc kubenswrapper[4764]: I0309 13:45:27.155020 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Mar 09 13:45:27 crc kubenswrapper[4764]: I0309 13:45:27.900537 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Mar 09 13:45:30 
crc kubenswrapper[4764]: I0309 13:45:30.044924 4764 generic.go:334] "Generic (PLEG): container finished" podID="07f61b11-aba4-469c-a5ed-9566f1951559" containerID="bde45a06faa972981715117bca5bea2ffa8f521b6f7a639569a4330b9afefe5a" exitCode=0 Mar 09 13:45:30 crc kubenswrapper[4764]: I0309 13:45:30.045021 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-89zj7" event={"ID":"07f61b11-aba4-469c-a5ed-9566f1951559","Type":"ContainerDied","Data":"bde45a06faa972981715117bca5bea2ffa8f521b6f7a639569a4330b9afefe5a"} Mar 09 13:45:31 crc kubenswrapper[4764]: I0309 13:45:31.783227 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-89zj7" Mar 09 13:45:31 crc kubenswrapper[4764]: I0309 13:45:31.923296 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/07f61b11-aba4-469c-a5ed-9566f1951559-ssh-key-openstack-edpm-ipam\") pod \"07f61b11-aba4-469c-a5ed-9566f1951559\" (UID: \"07f61b11-aba4-469c-a5ed-9566f1951559\") " Mar 09 13:45:31 crc kubenswrapper[4764]: I0309 13:45:31.923688 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-npdx4\" (UniqueName: \"kubernetes.io/projected/07f61b11-aba4-469c-a5ed-9566f1951559-kube-api-access-npdx4\") pod \"07f61b11-aba4-469c-a5ed-9566f1951559\" (UID: \"07f61b11-aba4-469c-a5ed-9566f1951559\") " Mar 09 13:45:31 crc kubenswrapper[4764]: I0309 13:45:31.923838 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/07f61b11-aba4-469c-a5ed-9566f1951559-inventory\") pod \"07f61b11-aba4-469c-a5ed-9566f1951559\" (UID: \"07f61b11-aba4-469c-a5ed-9566f1951559\") " Mar 09 13:45:31 crc kubenswrapper[4764]: I0309 13:45:31.924270 4764 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/07f61b11-aba4-469c-a5ed-9566f1951559-repo-setup-combined-ca-bundle\") pod \"07f61b11-aba4-469c-a5ed-9566f1951559\" (UID: \"07f61b11-aba4-469c-a5ed-9566f1951559\") " Mar 09 13:45:31 crc kubenswrapper[4764]: I0309 13:45:31.931096 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/07f61b11-aba4-469c-a5ed-9566f1951559-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "07f61b11-aba4-469c-a5ed-9566f1951559" (UID: "07f61b11-aba4-469c-a5ed-9566f1951559"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:45:31 crc kubenswrapper[4764]: I0309 13:45:31.935687 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/07f61b11-aba4-469c-a5ed-9566f1951559-kube-api-access-npdx4" (OuterVolumeSpecName: "kube-api-access-npdx4") pod "07f61b11-aba4-469c-a5ed-9566f1951559" (UID: "07f61b11-aba4-469c-a5ed-9566f1951559"). InnerVolumeSpecName "kube-api-access-npdx4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:45:31 crc kubenswrapper[4764]: I0309 13:45:31.952661 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/07f61b11-aba4-469c-a5ed-9566f1951559-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "07f61b11-aba4-469c-a5ed-9566f1951559" (UID: "07f61b11-aba4-469c-a5ed-9566f1951559"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:45:31 crc kubenswrapper[4764]: I0309 13:45:31.954889 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/07f61b11-aba4-469c-a5ed-9566f1951559-inventory" (OuterVolumeSpecName: "inventory") pod "07f61b11-aba4-469c-a5ed-9566f1951559" (UID: "07f61b11-aba4-469c-a5ed-9566f1951559"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:45:32 crc kubenswrapper[4764]: I0309 13:45:32.026913 4764 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/07f61b11-aba4-469c-a5ed-9566f1951559-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 13:45:32 crc kubenswrapper[4764]: I0309 13:45:32.026964 4764 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/07f61b11-aba4-469c-a5ed-9566f1951559-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 09 13:45:32 crc kubenswrapper[4764]: I0309 13:45:32.026976 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-npdx4\" (UniqueName: \"kubernetes.io/projected/07f61b11-aba4-469c-a5ed-9566f1951559-kube-api-access-npdx4\") on node \"crc\" DevicePath \"\"" Mar 09 13:45:32 crc kubenswrapper[4764]: I0309 13:45:32.026987 4764 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/07f61b11-aba4-469c-a5ed-9566f1951559-inventory\") on node \"crc\" DevicePath \"\"" Mar 09 13:45:32 crc kubenswrapper[4764]: I0309 13:45:32.073814 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-89zj7" event={"ID":"07f61b11-aba4-469c-a5ed-9566f1951559","Type":"ContainerDied","Data":"126e7156c2e3006e0a3f1b9868e0b20082a6b17496796473f809ef087c89923d"} Mar 09 13:45:32 crc kubenswrapper[4764]: I0309 13:45:32.073878 
4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="126e7156c2e3006e0a3f1b9868e0b20082a6b17496796473f809ef087c89923d" Mar 09 13:45:32 crc kubenswrapper[4764]: I0309 13:45:32.074027 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-89zj7" Mar 09 13:45:32 crc kubenswrapper[4764]: I0309 13:45:32.154721 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-8lzvr"] Mar 09 13:45:32 crc kubenswrapper[4764]: E0309 13:45:32.155264 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="07f61b11-aba4-469c-a5ed-9566f1951559" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Mar 09 13:45:32 crc kubenswrapper[4764]: I0309 13:45:32.155287 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="07f61b11-aba4-469c-a5ed-9566f1951559" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Mar 09 13:45:32 crc kubenswrapper[4764]: I0309 13:45:32.155482 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="07f61b11-aba4-469c-a5ed-9566f1951559" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Mar 09 13:45:32 crc kubenswrapper[4764]: I0309 13:45:32.156279 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-8lzvr" Mar 09 13:45:32 crc kubenswrapper[4764]: I0309 13:45:32.159714 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 09 13:45:32 crc kubenswrapper[4764]: I0309 13:45:32.160831 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 09 13:45:32 crc kubenswrapper[4764]: I0309 13:45:32.161043 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-ctrj8" Mar 09 13:45:32 crc kubenswrapper[4764]: I0309 13:45:32.164098 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 09 13:45:32 crc kubenswrapper[4764]: I0309 13:45:32.177722 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-8lzvr"] Mar 09 13:45:32 crc kubenswrapper[4764]: I0309 13:45:32.231542 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe0d9990-083b-428b-baec-a40ae99487db-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-8lzvr\" (UID: \"fe0d9990-083b-428b-baec-a40ae99487db\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-8lzvr" Mar 09 13:45:32 crc kubenswrapper[4764]: I0309 13:45:32.231633 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rrpw2\" (UniqueName: \"kubernetes.io/projected/fe0d9990-083b-428b-baec-a40ae99487db-kube-api-access-rrpw2\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-8lzvr\" (UID: \"fe0d9990-083b-428b-baec-a40ae99487db\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-8lzvr" Mar 09 13:45:32 crc kubenswrapper[4764]: I0309 
13:45:32.231681 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/fe0d9990-083b-428b-baec-a40ae99487db-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-8lzvr\" (UID: \"fe0d9990-083b-428b-baec-a40ae99487db\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-8lzvr" Mar 09 13:45:32 crc kubenswrapper[4764]: I0309 13:45:32.231713 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fe0d9990-083b-428b-baec-a40ae99487db-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-8lzvr\" (UID: \"fe0d9990-083b-428b-baec-a40ae99487db\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-8lzvr" Mar 09 13:45:32 crc kubenswrapper[4764]: I0309 13:45:32.335021 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe0d9990-083b-428b-baec-a40ae99487db-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-8lzvr\" (UID: \"fe0d9990-083b-428b-baec-a40ae99487db\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-8lzvr" Mar 09 13:45:32 crc kubenswrapper[4764]: I0309 13:45:32.335237 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rrpw2\" (UniqueName: \"kubernetes.io/projected/fe0d9990-083b-428b-baec-a40ae99487db-kube-api-access-rrpw2\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-8lzvr\" (UID: \"fe0d9990-083b-428b-baec-a40ae99487db\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-8lzvr" Mar 09 13:45:32 crc kubenswrapper[4764]: I0309 13:45:32.335288 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/fe0d9990-083b-428b-baec-a40ae99487db-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-8lzvr\" (UID: \"fe0d9990-083b-428b-baec-a40ae99487db\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-8lzvr" Mar 09 13:45:32 crc kubenswrapper[4764]: I0309 13:45:32.335335 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fe0d9990-083b-428b-baec-a40ae99487db-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-8lzvr\" (UID: \"fe0d9990-083b-428b-baec-a40ae99487db\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-8lzvr" Mar 09 13:45:32 crc kubenswrapper[4764]: I0309 13:45:32.340737 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/fe0d9990-083b-428b-baec-a40ae99487db-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-8lzvr\" (UID: \"fe0d9990-083b-428b-baec-a40ae99487db\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-8lzvr" Mar 09 13:45:32 crc kubenswrapper[4764]: I0309 13:45:32.341312 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe0d9990-083b-428b-baec-a40ae99487db-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-8lzvr\" (UID: \"fe0d9990-083b-428b-baec-a40ae99487db\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-8lzvr" Mar 09 13:45:32 crc kubenswrapper[4764]: I0309 13:45:32.346559 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fe0d9990-083b-428b-baec-a40ae99487db-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-8lzvr\" (UID: \"fe0d9990-083b-428b-baec-a40ae99487db\") " 
pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-8lzvr" Mar 09 13:45:32 crc kubenswrapper[4764]: I0309 13:45:32.360691 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rrpw2\" (UniqueName: \"kubernetes.io/projected/fe0d9990-083b-428b-baec-a40ae99487db-kube-api-access-rrpw2\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-8lzvr\" (UID: \"fe0d9990-083b-428b-baec-a40ae99487db\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-8lzvr" Mar 09 13:45:32 crc kubenswrapper[4764]: I0309 13:45:32.476950 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-8lzvr" Mar 09 13:45:33 crc kubenswrapper[4764]: I0309 13:45:33.034617 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-8lzvr"] Mar 09 13:45:33 crc kubenswrapper[4764]: W0309 13:45:33.036102 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfe0d9990_083b_428b_baec_a40ae99487db.slice/crio-88f1f6de4e55f98cf3fb91581e4ad15ec69d81b47c69d9187bda289cfb67da48 WatchSource:0}: Error finding container 88f1f6de4e55f98cf3fb91581e4ad15ec69d81b47c69d9187bda289cfb67da48: Status 404 returned error can't find the container with id 88f1f6de4e55f98cf3fb91581e4ad15ec69d81b47c69d9187bda289cfb67da48 Mar 09 13:45:33 crc kubenswrapper[4764]: I0309 13:45:33.085800 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-8lzvr" event={"ID":"fe0d9990-083b-428b-baec-a40ae99487db","Type":"ContainerStarted","Data":"88f1f6de4e55f98cf3fb91581e4ad15ec69d81b47c69d9187bda289cfb67da48"} Mar 09 13:45:34 crc kubenswrapper[4764]: I0309 13:45:34.099152 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-8lzvr" 
event={"ID":"fe0d9990-083b-428b-baec-a40ae99487db","Type":"ContainerStarted","Data":"6e4f8f0feb8ec9ecf660634549716b0c350cba173b98c033e4c7e09aa6bd108d"} Mar 09 13:45:34 crc kubenswrapper[4764]: I0309 13:45:34.126799 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-8lzvr" podStartSLOduration=1.692755281 podStartE2EDuration="2.126773931s" podCreationTimestamp="2026-03-09 13:45:32 +0000 UTC" firstStartedPulling="2026-03-09 13:45:33.041133228 +0000 UTC m=+1488.291305136" lastFinishedPulling="2026-03-09 13:45:33.475151878 +0000 UTC m=+1488.725323786" observedRunningTime="2026-03-09 13:45:34.119027624 +0000 UTC m=+1489.369199542" watchObservedRunningTime="2026-03-09 13:45:34.126773931 +0000 UTC m=+1489.376945849" Mar 09 13:45:47 crc kubenswrapper[4764]: I0309 13:45:47.107078 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-m5wvh"] Mar 09 13:45:47 crc kubenswrapper[4764]: I0309 13:45:47.110691 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-m5wvh" Mar 09 13:45:47 crc kubenswrapper[4764]: I0309 13:45:47.117029 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-m5wvh"] Mar 09 13:45:47 crc kubenswrapper[4764]: I0309 13:45:47.186505 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fhw7v\" (UniqueName: \"kubernetes.io/projected/17216f70-2204-498b-9a97-97d6ce40bd8d-kube-api-access-fhw7v\") pod \"redhat-operators-m5wvh\" (UID: \"17216f70-2204-498b-9a97-97d6ce40bd8d\") " pod="openshift-marketplace/redhat-operators-m5wvh" Mar 09 13:45:47 crc kubenswrapper[4764]: I0309 13:45:47.186733 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/17216f70-2204-498b-9a97-97d6ce40bd8d-catalog-content\") pod \"redhat-operators-m5wvh\" (UID: \"17216f70-2204-498b-9a97-97d6ce40bd8d\") " pod="openshift-marketplace/redhat-operators-m5wvh" Mar 09 13:45:47 crc kubenswrapper[4764]: I0309 13:45:47.186776 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/17216f70-2204-498b-9a97-97d6ce40bd8d-utilities\") pod \"redhat-operators-m5wvh\" (UID: \"17216f70-2204-498b-9a97-97d6ce40bd8d\") " pod="openshift-marketplace/redhat-operators-m5wvh" Mar 09 13:45:47 crc kubenswrapper[4764]: I0309 13:45:47.290065 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fhw7v\" (UniqueName: \"kubernetes.io/projected/17216f70-2204-498b-9a97-97d6ce40bd8d-kube-api-access-fhw7v\") pod \"redhat-operators-m5wvh\" (UID: \"17216f70-2204-498b-9a97-97d6ce40bd8d\") " pod="openshift-marketplace/redhat-operators-m5wvh" Mar 09 13:45:47 crc kubenswrapper[4764]: I0309 13:45:47.290202 4764 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/17216f70-2204-498b-9a97-97d6ce40bd8d-catalog-content\") pod \"redhat-operators-m5wvh\" (UID: \"17216f70-2204-498b-9a97-97d6ce40bd8d\") " pod="openshift-marketplace/redhat-operators-m5wvh" Mar 09 13:45:47 crc kubenswrapper[4764]: I0309 13:45:47.290246 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/17216f70-2204-498b-9a97-97d6ce40bd8d-utilities\") pod \"redhat-operators-m5wvh\" (UID: \"17216f70-2204-498b-9a97-97d6ce40bd8d\") " pod="openshift-marketplace/redhat-operators-m5wvh" Mar 09 13:45:47 crc kubenswrapper[4764]: I0309 13:45:47.291006 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/17216f70-2204-498b-9a97-97d6ce40bd8d-utilities\") pod \"redhat-operators-m5wvh\" (UID: \"17216f70-2204-498b-9a97-97d6ce40bd8d\") " pod="openshift-marketplace/redhat-operators-m5wvh" Mar 09 13:45:47 crc kubenswrapper[4764]: I0309 13:45:47.291086 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/17216f70-2204-498b-9a97-97d6ce40bd8d-catalog-content\") pod \"redhat-operators-m5wvh\" (UID: \"17216f70-2204-498b-9a97-97d6ce40bd8d\") " pod="openshift-marketplace/redhat-operators-m5wvh" Mar 09 13:45:47 crc kubenswrapper[4764]: I0309 13:45:47.313543 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fhw7v\" (UniqueName: \"kubernetes.io/projected/17216f70-2204-498b-9a97-97d6ce40bd8d-kube-api-access-fhw7v\") pod \"redhat-operators-m5wvh\" (UID: \"17216f70-2204-498b-9a97-97d6ce40bd8d\") " pod="openshift-marketplace/redhat-operators-m5wvh" Mar 09 13:45:47 crc kubenswrapper[4764]: I0309 13:45:47.442545 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-m5wvh" Mar 09 13:45:47 crc kubenswrapper[4764]: I0309 13:45:47.978334 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-m5wvh"] Mar 09 13:45:48 crc kubenswrapper[4764]: I0309 13:45:48.262113 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-m5wvh" event={"ID":"17216f70-2204-498b-9a97-97d6ce40bd8d","Type":"ContainerStarted","Data":"6365ac478b93353f9f38b36c4bfb228fb56e0098a8f6f51ebddbad0e5763fd55"} Mar 09 13:45:49 crc kubenswrapper[4764]: I0309 13:45:49.273302 4764 generic.go:334] "Generic (PLEG): container finished" podID="17216f70-2204-498b-9a97-97d6ce40bd8d" containerID="4408ae7df69e11c3a006ccacaf85cc6b3509b54b8182706c5b060cacebfd0ac1" exitCode=0 Mar 09 13:45:49 crc kubenswrapper[4764]: I0309 13:45:49.273393 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-m5wvh" event={"ID":"17216f70-2204-498b-9a97-97d6ce40bd8d","Type":"ContainerDied","Data":"4408ae7df69e11c3a006ccacaf85cc6b3509b54b8182706c5b060cacebfd0ac1"} Mar 09 13:45:53 crc kubenswrapper[4764]: I0309 13:45:53.320676 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-m5wvh" event={"ID":"17216f70-2204-498b-9a97-97d6ce40bd8d","Type":"ContainerStarted","Data":"1f65c2b840e0a2ef446a8cc3913e0633276d45ba57e7a31e6e4b8ecb08764b0e"} Mar 09 13:45:55 crc kubenswrapper[4764]: I0309 13:45:55.346269 4764 generic.go:334] "Generic (PLEG): container finished" podID="17216f70-2204-498b-9a97-97d6ce40bd8d" containerID="1f65c2b840e0a2ef446a8cc3913e0633276d45ba57e7a31e6e4b8ecb08764b0e" exitCode=0 Mar 09 13:45:55 crc kubenswrapper[4764]: I0309 13:45:55.346325 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-m5wvh" 
event={"ID":"17216f70-2204-498b-9a97-97d6ce40bd8d","Type":"ContainerDied","Data":"1f65c2b840e0a2ef446a8cc3913e0633276d45ba57e7a31e6e4b8ecb08764b0e"} Mar 09 13:45:56 crc kubenswrapper[4764]: I0309 13:45:56.366507 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-m5wvh" event={"ID":"17216f70-2204-498b-9a97-97d6ce40bd8d","Type":"ContainerStarted","Data":"fcd44c7ba1ba4d3dd3602f774658ca367a62c8198a3597b77cff64b7b33a5be7"} Mar 09 13:45:56 crc kubenswrapper[4764]: I0309 13:45:56.396246 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-m5wvh" podStartSLOduration=3.95046263 podStartE2EDuration="9.396220113s" podCreationTimestamp="2026-03-09 13:45:47 +0000 UTC" firstStartedPulling="2026-03-09 13:45:50.287468895 +0000 UTC m=+1505.537640803" lastFinishedPulling="2026-03-09 13:45:55.733226358 +0000 UTC m=+1510.983398286" observedRunningTime="2026-03-09 13:45:56.39423434 +0000 UTC m=+1511.644406268" watchObservedRunningTime="2026-03-09 13:45:56.396220113 +0000 UTC m=+1511.646392051" Mar 09 13:45:57 crc kubenswrapper[4764]: I0309 13:45:57.442703 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-m5wvh" Mar 09 13:45:57 crc kubenswrapper[4764]: I0309 13:45:57.443822 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-m5wvh" Mar 09 13:45:58 crc kubenswrapper[4764]: I0309 13:45:58.488538 4764 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-m5wvh" podUID="17216f70-2204-498b-9a97-97d6ce40bd8d" containerName="registry-server" probeResult="failure" output=< Mar 09 13:45:58 crc kubenswrapper[4764]: timeout: failed to connect service ":50051" within 1s Mar 09 13:45:58 crc kubenswrapper[4764]: > Mar 09 13:46:00 crc kubenswrapper[4764]: I0309 13:46:00.145934 4764 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openshift-infra/auto-csr-approver-29551066-q2m88"] Mar 09 13:46:00 crc kubenswrapper[4764]: I0309 13:46:00.147948 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551066-q2m88" Mar 09 13:46:00 crc kubenswrapper[4764]: I0309 13:46:00.150951 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 09 13:46:00 crc kubenswrapper[4764]: I0309 13:46:00.151376 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 09 13:46:00 crc kubenswrapper[4764]: I0309 13:46:00.154589 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-4s7mr" Mar 09 13:46:00 crc kubenswrapper[4764]: I0309 13:46:00.179333 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551066-q2m88"] Mar 09 13:46:00 crc kubenswrapper[4764]: I0309 13:46:00.306961 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pffnw\" (UniqueName: \"kubernetes.io/projected/2f277802-4cc0-41e2-90f9-a9e2ac441979-kube-api-access-pffnw\") pod \"auto-csr-approver-29551066-q2m88\" (UID: \"2f277802-4cc0-41e2-90f9-a9e2ac441979\") " pod="openshift-infra/auto-csr-approver-29551066-q2m88" Mar 09 13:46:00 crc kubenswrapper[4764]: I0309 13:46:00.410098 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pffnw\" (UniqueName: \"kubernetes.io/projected/2f277802-4cc0-41e2-90f9-a9e2ac441979-kube-api-access-pffnw\") pod \"auto-csr-approver-29551066-q2m88\" (UID: \"2f277802-4cc0-41e2-90f9-a9e2ac441979\") " pod="openshift-infra/auto-csr-approver-29551066-q2m88" Mar 09 13:46:00 crc kubenswrapper[4764]: I0309 13:46:00.433204 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pffnw\" (UniqueName: 
\"kubernetes.io/projected/2f277802-4cc0-41e2-90f9-a9e2ac441979-kube-api-access-pffnw\") pod \"auto-csr-approver-29551066-q2m88\" (UID: \"2f277802-4cc0-41e2-90f9-a9e2ac441979\") " pod="openshift-infra/auto-csr-approver-29551066-q2m88"
Mar 09 13:46:00 crc kubenswrapper[4764]: I0309 13:46:00.506463 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551066-q2m88"
Mar 09 13:46:01 crc kubenswrapper[4764]: I0309 13:46:01.013418 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551066-q2m88"]
Mar 09 13:46:01 crc kubenswrapper[4764]: I0309 13:46:01.415509 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551066-q2m88" event={"ID":"2f277802-4cc0-41e2-90f9-a9e2ac441979","Type":"ContainerStarted","Data":"e09f6c465a0041bea1efa258d3a816638c67fbb6e52766cc7015b1ba22284328"}
Mar 09 13:46:02 crc kubenswrapper[4764]: I0309 13:46:02.427666 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551066-q2m88" event={"ID":"2f277802-4cc0-41e2-90f9-a9e2ac441979","Type":"ContainerStarted","Data":"ad8cc18bf0e9a68496606eb3c3aef5b4008faa06bd6343718c7f8c425fd14c07"}
Mar 09 13:46:03 crc kubenswrapper[4764]: I0309 13:46:03.441402 4764 generic.go:334] "Generic (PLEG): container finished" podID="2f277802-4cc0-41e2-90f9-a9e2ac441979" containerID="ad8cc18bf0e9a68496606eb3c3aef5b4008faa06bd6343718c7f8c425fd14c07" exitCode=0
Mar 09 13:46:03 crc kubenswrapper[4764]: I0309 13:46:03.441455 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551066-q2m88" event={"ID":"2f277802-4cc0-41e2-90f9-a9e2ac441979","Type":"ContainerDied","Data":"ad8cc18bf0e9a68496606eb3c3aef5b4008faa06bd6343718c7f8c425fd14c07"}
Mar 09 13:46:04 crc kubenswrapper[4764]: I0309 13:46:04.876715 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551066-q2m88"
Mar 09 13:46:05 crc kubenswrapper[4764]: I0309 13:46:05.013480 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pffnw\" (UniqueName: \"kubernetes.io/projected/2f277802-4cc0-41e2-90f9-a9e2ac441979-kube-api-access-pffnw\") pod \"2f277802-4cc0-41e2-90f9-a9e2ac441979\" (UID: \"2f277802-4cc0-41e2-90f9-a9e2ac441979\") "
Mar 09 13:46:05 crc kubenswrapper[4764]: I0309 13:46:05.021596 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2f277802-4cc0-41e2-90f9-a9e2ac441979-kube-api-access-pffnw" (OuterVolumeSpecName: "kube-api-access-pffnw") pod "2f277802-4cc0-41e2-90f9-a9e2ac441979" (UID: "2f277802-4cc0-41e2-90f9-a9e2ac441979"). InnerVolumeSpecName "kube-api-access-pffnw". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 13:46:05 crc kubenswrapper[4764]: I0309 13:46:05.116341 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pffnw\" (UniqueName: \"kubernetes.io/projected/2f277802-4cc0-41e2-90f9-a9e2ac441979-kube-api-access-pffnw\") on node \"crc\" DevicePath \"\""
Mar 09 13:46:05 crc kubenswrapper[4764]: I0309 13:46:05.178518 4764 scope.go:117] "RemoveContainer" containerID="9847def4972a082de5de5ce66af1aea1d5c2f1147d6cb0e73ffadb728f7879a5"
Mar 09 13:46:05 crc kubenswrapper[4764]: I0309 13:46:05.469572 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551066-q2m88" event={"ID":"2f277802-4cc0-41e2-90f9-a9e2ac441979","Type":"ContainerDied","Data":"e09f6c465a0041bea1efa258d3a816638c67fbb6e52766cc7015b1ba22284328"}
Mar 09 13:46:05 crc kubenswrapper[4764]: I0309 13:46:05.469625 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e09f6c465a0041bea1efa258d3a816638c67fbb6e52766cc7015b1ba22284328"
Mar 09 13:46:05 crc kubenswrapper[4764]: I0309 13:46:05.469708 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551066-q2m88"
Mar 09 13:46:05 crc kubenswrapper[4764]: I0309 13:46:05.532397 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29551060-n7f54"]
Mar 09 13:46:05 crc kubenswrapper[4764]: I0309 13:46:05.550477 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29551060-n7f54"]
Mar 09 13:46:05 crc kubenswrapper[4764]: I0309 13:46:05.574399 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="16623a65-1bef-4faa-a891-bae0a7d04977" path="/var/lib/kubelet/pods/16623a65-1bef-4faa-a891-bae0a7d04977/volumes"
Mar 09 13:46:07 crc kubenswrapper[4764]: I0309 13:46:07.498776 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-m5wvh"
Mar 09 13:46:07 crc kubenswrapper[4764]: I0309 13:46:07.571430 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-m5wvh"
Mar 09 13:46:07 crc kubenswrapper[4764]: I0309 13:46:07.753445 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-m5wvh"]
Mar 09 13:46:09 crc kubenswrapper[4764]: I0309 13:46:09.523349 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-m5wvh" podUID="17216f70-2204-498b-9a97-97d6ce40bd8d" containerName="registry-server" containerID="cri-o://fcd44c7ba1ba4d3dd3602f774658ca367a62c8198a3597b77cff64b7b33a5be7" gracePeriod=2
Mar 09 13:46:09 crc kubenswrapper[4764]: I0309 13:46:09.992585 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-m5wvh"
Mar 09 13:46:10 crc kubenswrapper[4764]: I0309 13:46:10.128831 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/17216f70-2204-498b-9a97-97d6ce40bd8d-catalog-content\") pod \"17216f70-2204-498b-9a97-97d6ce40bd8d\" (UID: \"17216f70-2204-498b-9a97-97d6ce40bd8d\") "
Mar 09 13:46:10 crc kubenswrapper[4764]: I0309 13:46:10.128931 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fhw7v\" (UniqueName: \"kubernetes.io/projected/17216f70-2204-498b-9a97-97d6ce40bd8d-kube-api-access-fhw7v\") pod \"17216f70-2204-498b-9a97-97d6ce40bd8d\" (UID: \"17216f70-2204-498b-9a97-97d6ce40bd8d\") "
Mar 09 13:46:10 crc kubenswrapper[4764]: I0309 13:46:10.129266 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/17216f70-2204-498b-9a97-97d6ce40bd8d-utilities\") pod \"17216f70-2204-498b-9a97-97d6ce40bd8d\" (UID: \"17216f70-2204-498b-9a97-97d6ce40bd8d\") "
Mar 09 13:46:10 crc kubenswrapper[4764]: I0309 13:46:10.129943 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/17216f70-2204-498b-9a97-97d6ce40bd8d-utilities" (OuterVolumeSpecName: "utilities") pod "17216f70-2204-498b-9a97-97d6ce40bd8d" (UID: "17216f70-2204-498b-9a97-97d6ce40bd8d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 09 13:46:10 crc kubenswrapper[4764]: I0309 13:46:10.135499 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/17216f70-2204-498b-9a97-97d6ce40bd8d-kube-api-access-fhw7v" (OuterVolumeSpecName: "kube-api-access-fhw7v") pod "17216f70-2204-498b-9a97-97d6ce40bd8d" (UID: "17216f70-2204-498b-9a97-97d6ce40bd8d"). InnerVolumeSpecName "kube-api-access-fhw7v". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 13:46:10 crc kubenswrapper[4764]: I0309 13:46:10.231593 4764 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/17216f70-2204-498b-9a97-97d6ce40bd8d-utilities\") on node \"crc\" DevicePath \"\""
Mar 09 13:46:10 crc kubenswrapper[4764]: I0309 13:46:10.231666 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fhw7v\" (UniqueName: \"kubernetes.io/projected/17216f70-2204-498b-9a97-97d6ce40bd8d-kube-api-access-fhw7v\") on node \"crc\" DevicePath \"\""
Mar 09 13:46:10 crc kubenswrapper[4764]: I0309 13:46:10.272898 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/17216f70-2204-498b-9a97-97d6ce40bd8d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "17216f70-2204-498b-9a97-97d6ce40bd8d" (UID: "17216f70-2204-498b-9a97-97d6ce40bd8d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 09 13:46:10 crc kubenswrapper[4764]: I0309 13:46:10.334232 4764 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/17216f70-2204-498b-9a97-97d6ce40bd8d-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 09 13:46:10 crc kubenswrapper[4764]: I0309 13:46:10.539515 4764 generic.go:334] "Generic (PLEG): container finished" podID="17216f70-2204-498b-9a97-97d6ce40bd8d" containerID="fcd44c7ba1ba4d3dd3602f774658ca367a62c8198a3597b77cff64b7b33a5be7" exitCode=0
Mar 09 13:46:10 crc kubenswrapper[4764]: I0309 13:46:10.539580 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-m5wvh" event={"ID":"17216f70-2204-498b-9a97-97d6ce40bd8d","Type":"ContainerDied","Data":"fcd44c7ba1ba4d3dd3602f774658ca367a62c8198a3597b77cff64b7b33a5be7"}
Mar 09 13:46:10 crc kubenswrapper[4764]: I0309 13:46:10.539611 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-m5wvh"
Mar 09 13:46:10 crc kubenswrapper[4764]: I0309 13:46:10.539695 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-m5wvh" event={"ID":"17216f70-2204-498b-9a97-97d6ce40bd8d","Type":"ContainerDied","Data":"6365ac478b93353f9f38b36c4bfb228fb56e0098a8f6f51ebddbad0e5763fd55"}
Mar 09 13:46:10 crc kubenswrapper[4764]: I0309 13:46:10.539724 4764 scope.go:117] "RemoveContainer" containerID="fcd44c7ba1ba4d3dd3602f774658ca367a62c8198a3597b77cff64b7b33a5be7"
Mar 09 13:46:10 crc kubenswrapper[4764]: I0309 13:46:10.566829 4764 scope.go:117] "RemoveContainer" containerID="1f65c2b840e0a2ef446a8cc3913e0633276d45ba57e7a31e6e4b8ecb08764b0e"
Mar 09 13:46:10 crc kubenswrapper[4764]: I0309 13:46:10.596339 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-m5wvh"]
Mar 09 13:46:10 crc kubenswrapper[4764]: I0309 13:46:10.603305 4764 scope.go:117] "RemoveContainer" containerID="4408ae7df69e11c3a006ccacaf85cc6b3509b54b8182706c5b060cacebfd0ac1"
Mar 09 13:46:10 crc kubenswrapper[4764]: I0309 13:46:10.605610 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-m5wvh"]
Mar 09 13:46:10 crc kubenswrapper[4764]: I0309 13:46:10.644551 4764 scope.go:117] "RemoveContainer" containerID="fcd44c7ba1ba4d3dd3602f774658ca367a62c8198a3597b77cff64b7b33a5be7"
Mar 09 13:46:10 crc kubenswrapper[4764]: E0309 13:46:10.645245 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fcd44c7ba1ba4d3dd3602f774658ca367a62c8198a3597b77cff64b7b33a5be7\": container with ID starting with fcd44c7ba1ba4d3dd3602f774658ca367a62c8198a3597b77cff64b7b33a5be7 not found: ID does not exist" containerID="fcd44c7ba1ba4d3dd3602f774658ca367a62c8198a3597b77cff64b7b33a5be7"
Mar 09 13:46:10 crc kubenswrapper[4764]: I0309 13:46:10.645289 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fcd44c7ba1ba4d3dd3602f774658ca367a62c8198a3597b77cff64b7b33a5be7"} err="failed to get container status \"fcd44c7ba1ba4d3dd3602f774658ca367a62c8198a3597b77cff64b7b33a5be7\": rpc error: code = NotFound desc = could not find container \"fcd44c7ba1ba4d3dd3602f774658ca367a62c8198a3597b77cff64b7b33a5be7\": container with ID starting with fcd44c7ba1ba4d3dd3602f774658ca367a62c8198a3597b77cff64b7b33a5be7 not found: ID does not exist"
Mar 09 13:46:10 crc kubenswrapper[4764]: I0309 13:46:10.645319 4764 scope.go:117] "RemoveContainer" containerID="1f65c2b840e0a2ef446a8cc3913e0633276d45ba57e7a31e6e4b8ecb08764b0e"
Mar 09 13:46:10 crc kubenswrapper[4764]: E0309 13:46:10.645864 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1f65c2b840e0a2ef446a8cc3913e0633276d45ba57e7a31e6e4b8ecb08764b0e\": container with ID starting with 1f65c2b840e0a2ef446a8cc3913e0633276d45ba57e7a31e6e4b8ecb08764b0e not found: ID does not exist" containerID="1f65c2b840e0a2ef446a8cc3913e0633276d45ba57e7a31e6e4b8ecb08764b0e"
Mar 09 13:46:10 crc kubenswrapper[4764]: I0309 13:46:10.645897 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1f65c2b840e0a2ef446a8cc3913e0633276d45ba57e7a31e6e4b8ecb08764b0e"} err="failed to get container status \"1f65c2b840e0a2ef446a8cc3913e0633276d45ba57e7a31e6e4b8ecb08764b0e\": rpc error: code = NotFound desc = could not find container \"1f65c2b840e0a2ef446a8cc3913e0633276d45ba57e7a31e6e4b8ecb08764b0e\": container with ID starting with 1f65c2b840e0a2ef446a8cc3913e0633276d45ba57e7a31e6e4b8ecb08764b0e not found: ID does not exist"
Mar 09 13:46:10 crc kubenswrapper[4764]: I0309 13:46:10.645920 4764 scope.go:117] "RemoveContainer" containerID="4408ae7df69e11c3a006ccacaf85cc6b3509b54b8182706c5b060cacebfd0ac1"
Mar 09 13:46:10 crc kubenswrapper[4764]: E0309 13:46:10.646273 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4408ae7df69e11c3a006ccacaf85cc6b3509b54b8182706c5b060cacebfd0ac1\": container with ID starting with 4408ae7df69e11c3a006ccacaf85cc6b3509b54b8182706c5b060cacebfd0ac1 not found: ID does not exist" containerID="4408ae7df69e11c3a006ccacaf85cc6b3509b54b8182706c5b060cacebfd0ac1"
Mar 09 13:46:10 crc kubenswrapper[4764]: I0309 13:46:10.646314 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4408ae7df69e11c3a006ccacaf85cc6b3509b54b8182706c5b060cacebfd0ac1"} err="failed to get container status \"4408ae7df69e11c3a006ccacaf85cc6b3509b54b8182706c5b060cacebfd0ac1\": rpc error: code = NotFound desc = could not find container \"4408ae7df69e11c3a006ccacaf85cc6b3509b54b8182706c5b060cacebfd0ac1\": container with ID starting with 4408ae7df69e11c3a006ccacaf85cc6b3509b54b8182706c5b060cacebfd0ac1 not found: ID does not exist"
Mar 09 13:46:11 crc kubenswrapper[4764]: I0309 13:46:11.570612 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="17216f70-2204-498b-9a97-97d6ce40bd8d" path="/var/lib/kubelet/pods/17216f70-2204-498b-9a97-97d6ce40bd8d/volumes"
Mar 09 13:46:28 crc kubenswrapper[4764]: I0309 13:46:28.370324 4764 patch_prober.go:28] interesting pod/machine-config-daemon-xxczl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 09 13:46:28 crc kubenswrapper[4764]: I0309 13:46:28.371023 4764 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 09 13:46:58 crc kubenswrapper[4764]: I0309 13:46:58.370675 4764 patch_prober.go:28] interesting pod/machine-config-daemon-xxczl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 09 13:46:58 crc kubenswrapper[4764]: I0309 13:46:58.371422 4764 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 09 13:47:05 crc kubenswrapper[4764]: I0309 13:47:05.256179 4764 scope.go:117] "RemoveContainer" containerID="14bfbdfc3fcd7dcb9efec055a62cdfeec5d1e0e90e453a55c55ffd01ca49ad5a"
Mar 09 13:47:05 crc kubenswrapper[4764]: I0309 13:47:05.312440 4764 scope.go:117] "RemoveContainer" containerID="e53ea6806faef5cb682bac2b7668fda78ebb3a7b30791520082d7ab2a13f5aa0"
Mar 09 13:47:05 crc kubenswrapper[4764]: I0309 13:47:05.378428 4764 scope.go:117] "RemoveContainer" containerID="e74ba0435a2089e000c8df414a7a0d71d67c7b4a00cc240811cbefae67ccd184"
Mar 09 13:47:05 crc kubenswrapper[4764]: I0309 13:47:05.405199 4764 scope.go:117] "RemoveContainer" containerID="8959c8a0509c3231f91de89d62e7a7d8e52e391a4941e4498d6d53a3afa1efea"
Mar 09 13:47:05 crc kubenswrapper[4764]: I0309 13:47:05.452898 4764 scope.go:117] "RemoveContainer" containerID="f865dbb42b23c282654ed11e8388916b4f7e6868644331082fbaed39e3cbb723"
Mar 09 13:47:28 crc kubenswrapper[4764]: I0309 13:47:28.370089 4764 patch_prober.go:28] interesting pod/machine-config-daemon-xxczl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 09 13:47:28 crc kubenswrapper[4764]: I0309 13:47:28.370841 4764 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 09 13:47:28 crc kubenswrapper[4764]: I0309 13:47:28.370902 4764 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-xxczl"
Mar 09 13:47:28 crc kubenswrapper[4764]: I0309 13:47:28.371852 4764 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"a87b87af769425372dc2009201df4d5e0d5a91fd68dbae94c9002e0b7115dafb"} pod="openshift-machine-config-operator/machine-config-daemon-xxczl" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Mar 09 13:47:28 crc kubenswrapper[4764]: I0309 13:47:28.371900 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922" containerName="machine-config-daemon" containerID="cri-o://a87b87af769425372dc2009201df4d5e0d5a91fd68dbae94c9002e0b7115dafb" gracePeriod=600
Mar 09 13:47:28 crc kubenswrapper[4764]: E0309 13:47:28.492425 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xxczl_openshift-machine-config-operator(6bcdd179-43c2-427c-9fac-7155c122e922)\"" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922"
Mar 09 13:47:29 crc kubenswrapper[4764]: I0309 13:47:29.420599 4764 generic.go:334] "Generic (PLEG): container finished" podID="6bcdd179-43c2-427c-9fac-7155c122e922" containerID="a87b87af769425372dc2009201df4d5e0d5a91fd68dbae94c9002e0b7115dafb" exitCode=0
Mar 09 13:47:29 crc kubenswrapper[4764]: I0309 13:47:29.420691 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" event={"ID":"6bcdd179-43c2-427c-9fac-7155c122e922","Type":"ContainerDied","Data":"a87b87af769425372dc2009201df4d5e0d5a91fd68dbae94c9002e0b7115dafb"}
Mar 09 13:47:29 crc kubenswrapper[4764]: I0309 13:47:29.421115 4764 scope.go:117] "RemoveContainer" containerID="69810f80271be58962525ba5a5a37ce68d5e172f7c253c440a5a45f3f3beab77"
Mar 09 13:47:29 crc kubenswrapper[4764]: I0309 13:47:29.421998 4764 scope.go:117] "RemoveContainer" containerID="a87b87af769425372dc2009201df4d5e0d5a91fd68dbae94c9002e0b7115dafb"
Mar 09 13:47:29 crc kubenswrapper[4764]: E0309 13:47:29.422325 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xxczl_openshift-machine-config-operator(6bcdd179-43c2-427c-9fac-7155c122e922)\"" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922"
Mar 09 13:47:43 crc kubenswrapper[4764]: I0309 13:47:43.561555 4764 scope.go:117] "RemoveContainer" containerID="a87b87af769425372dc2009201df4d5e0d5a91fd68dbae94c9002e0b7115dafb"
Mar 09 13:47:43 crc kubenswrapper[4764]: E0309 13:47:43.562641 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xxczl_openshift-machine-config-operator(6bcdd179-43c2-427c-9fac-7155c122e922)\"" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922"
Mar 09 13:47:54 crc kubenswrapper[4764]: I0309 13:47:54.932864 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-nxs42"]
Mar 09 13:47:54 crc kubenswrapper[4764]: E0309 13:47:54.934128 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2f277802-4cc0-41e2-90f9-a9e2ac441979" containerName="oc"
Mar 09 13:47:54 crc kubenswrapper[4764]: I0309 13:47:54.934145 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="2f277802-4cc0-41e2-90f9-a9e2ac441979" containerName="oc"
Mar 09 13:47:54 crc kubenswrapper[4764]: E0309 13:47:54.934166 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="17216f70-2204-498b-9a97-97d6ce40bd8d" containerName="extract-utilities"
Mar 09 13:47:54 crc kubenswrapper[4764]: I0309 13:47:54.934176 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="17216f70-2204-498b-9a97-97d6ce40bd8d" containerName="extract-utilities"
Mar 09 13:47:54 crc kubenswrapper[4764]: E0309 13:47:54.934220 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="17216f70-2204-498b-9a97-97d6ce40bd8d" containerName="extract-content"
Mar 09 13:47:54 crc kubenswrapper[4764]: I0309 13:47:54.934230 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="17216f70-2204-498b-9a97-97d6ce40bd8d" containerName="extract-content"
Mar 09 13:47:54 crc kubenswrapper[4764]: E0309 13:47:54.934257 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="17216f70-2204-498b-9a97-97d6ce40bd8d" containerName="registry-server"
Mar 09 13:47:54 crc kubenswrapper[4764]: I0309 13:47:54.934266 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="17216f70-2204-498b-9a97-97d6ce40bd8d" containerName="registry-server"
Mar 09 13:47:54 crc kubenswrapper[4764]: I0309 13:47:54.934483 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="17216f70-2204-498b-9a97-97d6ce40bd8d" containerName="registry-server"
Mar 09 13:47:54 crc kubenswrapper[4764]: I0309 13:47:54.934514 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="2f277802-4cc0-41e2-90f9-a9e2ac441979" containerName="oc"
Mar 09 13:47:54 crc kubenswrapper[4764]: I0309 13:47:54.936680 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-nxs42"
Mar 09 13:47:54 crc kubenswrapper[4764]: I0309 13:47:54.946955 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-nxs42"]
Mar 09 13:47:55 crc kubenswrapper[4764]: I0309 13:47:55.072625 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1852f78c-18c6-481e-bf04-c3eba97b11e7-utilities\") pod \"community-operators-nxs42\" (UID: \"1852f78c-18c6-481e-bf04-c3eba97b11e7\") " pod="openshift-marketplace/community-operators-nxs42"
Mar 09 13:47:55 crc kubenswrapper[4764]: I0309 13:47:55.072811 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q6gwd\" (UniqueName: \"kubernetes.io/projected/1852f78c-18c6-481e-bf04-c3eba97b11e7-kube-api-access-q6gwd\") pod \"community-operators-nxs42\" (UID: \"1852f78c-18c6-481e-bf04-c3eba97b11e7\") " pod="openshift-marketplace/community-operators-nxs42"
Mar 09 13:47:55 crc kubenswrapper[4764]: I0309 13:47:55.072839 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1852f78c-18c6-481e-bf04-c3eba97b11e7-catalog-content\") pod \"community-operators-nxs42\" (UID: \"1852f78c-18c6-481e-bf04-c3eba97b11e7\") " pod="openshift-marketplace/community-operators-nxs42"
Mar 09 13:47:55 crc kubenswrapper[4764]: I0309 13:47:55.174802 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q6gwd\" (UniqueName: \"kubernetes.io/projected/1852f78c-18c6-481e-bf04-c3eba97b11e7-kube-api-access-q6gwd\") pod \"community-operators-nxs42\" (UID: \"1852f78c-18c6-481e-bf04-c3eba97b11e7\") " pod="openshift-marketplace/community-operators-nxs42"
Mar 09 13:47:55 crc kubenswrapper[4764]: I0309 13:47:55.174869 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1852f78c-18c6-481e-bf04-c3eba97b11e7-catalog-content\") pod \"community-operators-nxs42\" (UID: \"1852f78c-18c6-481e-bf04-c3eba97b11e7\") " pod="openshift-marketplace/community-operators-nxs42"
Mar 09 13:47:55 crc kubenswrapper[4764]: I0309 13:47:55.174986 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1852f78c-18c6-481e-bf04-c3eba97b11e7-utilities\") pod \"community-operators-nxs42\" (UID: \"1852f78c-18c6-481e-bf04-c3eba97b11e7\") " pod="openshift-marketplace/community-operators-nxs42"
Mar 09 13:47:55 crc kubenswrapper[4764]: I0309 13:47:55.175566 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1852f78c-18c6-481e-bf04-c3eba97b11e7-utilities\") pod \"community-operators-nxs42\" (UID: \"1852f78c-18c6-481e-bf04-c3eba97b11e7\") " pod="openshift-marketplace/community-operators-nxs42"
Mar 09 13:47:55 crc kubenswrapper[4764]: I0309 13:47:55.175585 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1852f78c-18c6-481e-bf04-c3eba97b11e7-catalog-content\") pod \"community-operators-nxs42\" (UID: \"1852f78c-18c6-481e-bf04-c3eba97b11e7\") " pod="openshift-marketplace/community-operators-nxs42"
Mar 09 13:47:55 crc kubenswrapper[4764]: I0309 13:47:55.198107 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q6gwd\" (UniqueName: \"kubernetes.io/projected/1852f78c-18c6-481e-bf04-c3eba97b11e7-kube-api-access-q6gwd\") pod \"community-operators-nxs42\" (UID: \"1852f78c-18c6-481e-bf04-c3eba97b11e7\") " pod="openshift-marketplace/community-operators-nxs42"
Mar 09 13:47:55 crc kubenswrapper[4764]: I0309 13:47:55.302116 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-nxs42"
Mar 09 13:47:55 crc kubenswrapper[4764]: I0309 13:47:55.910426 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-nxs42"]
Mar 09 13:47:56 crc kubenswrapper[4764]: I0309 13:47:56.746406 4764 generic.go:334] "Generic (PLEG): container finished" podID="1852f78c-18c6-481e-bf04-c3eba97b11e7" containerID="be5da138897f4a01b72eee2c62c947a531c8838ff68ca8adea703659041c2147" exitCode=0
Mar 09 13:47:56 crc kubenswrapper[4764]: I0309 13:47:56.746488 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nxs42" event={"ID":"1852f78c-18c6-481e-bf04-c3eba97b11e7","Type":"ContainerDied","Data":"be5da138897f4a01b72eee2c62c947a531c8838ff68ca8adea703659041c2147"}
Mar 09 13:47:56 crc kubenswrapper[4764]: I0309 13:47:56.746829 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nxs42" event={"ID":"1852f78c-18c6-481e-bf04-c3eba97b11e7","Type":"ContainerStarted","Data":"296edd151f10a4f45fecda1a67c35226a5a35655cc126212c50ad827e9d7aed5"}
Mar 09 13:47:57 crc kubenswrapper[4764]: I0309 13:47:57.560340 4764 scope.go:117] "RemoveContainer" containerID="a87b87af769425372dc2009201df4d5e0d5a91fd68dbae94c9002e0b7115dafb"
Mar 09 13:47:57 crc kubenswrapper[4764]: E0309 13:47:57.560782 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xxczl_openshift-machine-config-operator(6bcdd179-43c2-427c-9fac-7155c122e922)\"" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922"
Mar 09 13:47:58 crc kubenswrapper[4764]: I0309 13:47:58.768180 4764 generic.go:334] "Generic (PLEG): container finished" podID="1852f78c-18c6-481e-bf04-c3eba97b11e7" containerID="379ba48f962950be95366dcbf3c94dd62e989eb1302cd5bb42af707eed9f0e85" exitCode=0
Mar 09 13:47:58 crc kubenswrapper[4764]: I0309 13:47:58.768242 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nxs42" event={"ID":"1852f78c-18c6-481e-bf04-c3eba97b11e7","Type":"ContainerDied","Data":"379ba48f962950be95366dcbf3c94dd62e989eb1302cd5bb42af707eed9f0e85"}
Mar 09 13:47:59 crc kubenswrapper[4764]: I0309 13:47:59.783331 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nxs42" event={"ID":"1852f78c-18c6-481e-bf04-c3eba97b11e7","Type":"ContainerStarted","Data":"06a8e00ee599e76b9570f836b2bc6f955ebe71a9de168cbb360d725880a2a573"}
Mar 09 13:47:59 crc kubenswrapper[4764]: I0309 13:47:59.807630 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-nxs42" podStartSLOduration=3.321709317 podStartE2EDuration="5.807609756s" podCreationTimestamp="2026-03-09 13:47:54 +0000 UTC" firstStartedPulling="2026-03-09 13:47:56.74942312 +0000 UTC m=+1631.999595028" lastFinishedPulling="2026-03-09 13:47:59.235323559 +0000 UTC m=+1634.485495467" observedRunningTime="2026-03-09 13:47:59.805752337 +0000 UTC m=+1635.055924275" watchObservedRunningTime="2026-03-09 13:47:59.807609756 +0000 UTC m=+1635.057781674"
Mar 09 13:48:00 crc kubenswrapper[4764]: I0309 13:48:00.157754 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29551068-v6md5"]
Mar 09 13:48:00 crc kubenswrapper[4764]: I0309 13:48:00.159488 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551068-v6md5"
Mar 09 13:48:00 crc kubenswrapper[4764]: I0309 13:48:00.162211 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 09 13:48:00 crc kubenswrapper[4764]: I0309 13:48:00.164125 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-4s7mr"
Mar 09 13:48:00 crc kubenswrapper[4764]: I0309 13:48:00.164812 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 09 13:48:00 crc kubenswrapper[4764]: I0309 13:48:00.170371 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551068-v6md5"]
Mar 09 13:48:00 crc kubenswrapper[4764]: I0309 13:48:00.294193 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w7trk\" (UniqueName: \"kubernetes.io/projected/ab11b944-7857-4998-b32b-264ac7683616-kube-api-access-w7trk\") pod \"auto-csr-approver-29551068-v6md5\" (UID: \"ab11b944-7857-4998-b32b-264ac7683616\") " pod="openshift-infra/auto-csr-approver-29551068-v6md5"
Mar 09 13:48:00 crc kubenswrapper[4764]: I0309 13:48:00.395922 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w7trk\" (UniqueName: \"kubernetes.io/projected/ab11b944-7857-4998-b32b-264ac7683616-kube-api-access-w7trk\") pod \"auto-csr-approver-29551068-v6md5\" (UID: \"ab11b944-7857-4998-b32b-264ac7683616\") " pod="openshift-infra/auto-csr-approver-29551068-v6md5"
Mar 09 13:48:00 crc kubenswrapper[4764]: I0309 13:48:00.423458 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w7trk\" (UniqueName: \"kubernetes.io/projected/ab11b944-7857-4998-b32b-264ac7683616-kube-api-access-w7trk\") pod \"auto-csr-approver-29551068-v6md5\" (UID: \"ab11b944-7857-4998-b32b-264ac7683616\") " pod="openshift-infra/auto-csr-approver-29551068-v6md5"
Mar 09 13:48:00 crc kubenswrapper[4764]: I0309 13:48:00.484222 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551068-v6md5"
Mar 09 13:48:01 crc kubenswrapper[4764]: I0309 13:48:00.993800 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551068-v6md5"]
Mar 09 13:48:01 crc kubenswrapper[4764]: I0309 13:48:01.812061 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551068-v6md5" event={"ID":"ab11b944-7857-4998-b32b-264ac7683616","Type":"ContainerStarted","Data":"f126d2e1b174f599134d0ceb8f1afe3dc279a52c5b8d7026721c430819068c70"}
Mar 09 13:48:02 crc kubenswrapper[4764]: I0309 13:48:02.824919 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551068-v6md5" event={"ID":"ab11b944-7857-4998-b32b-264ac7683616","Type":"ContainerStarted","Data":"2a7ab8e616981504d9d31e4fc4313f083401f97f7e60e7b3cdc2825dfc09335b"}
Mar 09 13:48:02 crc kubenswrapper[4764]: I0309 13:48:02.851122 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29551068-v6md5" podStartSLOduration=1.541986385 podStartE2EDuration="2.851096952s" podCreationTimestamp="2026-03-09 13:48:00 +0000 UTC" firstStartedPulling="2026-03-09 13:48:01.00170779 +0000 UTC m=+1636.251879698" lastFinishedPulling="2026-03-09 13:48:02.310818357 +0000 UTC m=+1637.560990265" observedRunningTime="2026-03-09 13:48:02.841428774 +0000 UTC m=+1638.091600702" watchObservedRunningTime="2026-03-09 13:48:02.851096952 +0000 UTC m=+1638.101268860"
Mar 09 13:48:03 crc kubenswrapper[4764]: I0309 13:48:03.840686 4764 generic.go:334] "Generic (PLEG): container finished" podID="ab11b944-7857-4998-b32b-264ac7683616" containerID="2a7ab8e616981504d9d31e4fc4313f083401f97f7e60e7b3cdc2825dfc09335b" exitCode=0
Mar 09 13:48:03 crc kubenswrapper[4764]: I0309 13:48:03.840821 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551068-v6md5" event={"ID":"ab11b944-7857-4998-b32b-264ac7683616","Type":"ContainerDied","Data":"2a7ab8e616981504d9d31e4fc4313f083401f97f7e60e7b3cdc2825dfc09335b"}
Mar 09 13:48:05 crc kubenswrapper[4764]: I0309 13:48:05.293996 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551068-v6md5"
Mar 09 13:48:05 crc kubenswrapper[4764]: I0309 13:48:05.302989 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-nxs42"
Mar 09 13:48:05 crc kubenswrapper[4764]: I0309 13:48:05.303089 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-nxs42"
Mar 09 13:48:05 crc kubenswrapper[4764]: I0309 13:48:05.367368 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-nxs42"
Mar 09 13:48:05 crc kubenswrapper[4764]: I0309 13:48:05.450858 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7trk\" (UniqueName: \"kubernetes.io/projected/ab11b944-7857-4998-b32b-264ac7683616-kube-api-access-w7trk\") pod \"ab11b944-7857-4998-b32b-264ac7683616\" (UID: \"ab11b944-7857-4998-b32b-264ac7683616\") "
Mar 09 13:48:05 crc kubenswrapper[4764]: I0309 13:48:05.459088 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ab11b944-7857-4998-b32b-264ac7683616-kube-api-access-w7trk" (OuterVolumeSpecName: "kube-api-access-w7trk") pod "ab11b944-7857-4998-b32b-264ac7683616" (UID: "ab11b944-7857-4998-b32b-264ac7683616"). InnerVolumeSpecName "kube-api-access-w7trk". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 13:48:05 crc kubenswrapper[4764]: I0309 13:48:05.554427 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7trk\" (UniqueName: \"kubernetes.io/projected/ab11b944-7857-4998-b32b-264ac7683616-kube-api-access-w7trk\") on node \"crc\" DevicePath \"\""
Mar 09 13:48:05 crc kubenswrapper[4764]: I0309 13:48:05.656472 4764 scope.go:117] "RemoveContainer" containerID="8419a9151f23a452d93428da7d46b2dd5d9147864269035236c144959618c6a4"
Mar 09 13:48:05 crc kubenswrapper[4764]: I0309 13:48:05.709567 4764 scope.go:117] "RemoveContainer" containerID="20dab72734e5903795d478bac44970ac57f0690f8f454dee9dae223fbd3e92ac"
Mar 09 13:48:05 crc kubenswrapper[4764]: I0309 13:48:05.872801 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551068-v6md5" event={"ID":"ab11b944-7857-4998-b32b-264ac7683616","Type":"ContainerDied","Data":"f126d2e1b174f599134d0ceb8f1afe3dc279a52c5b8d7026721c430819068c70"}
Mar 09 13:48:05 crc kubenswrapper[4764]: I0309 13:48:05.872862 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551068-v6md5"
Mar 09 13:48:05 crc kubenswrapper[4764]: I0309 13:48:05.872880 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f126d2e1b174f599134d0ceb8f1afe3dc279a52c5b8d7026721c430819068c70"
Mar 09 13:48:05 crc kubenswrapper[4764]: I0309 13:48:05.930269 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29551062-wl7gf"]
Mar 09 13:48:05 crc kubenswrapper[4764]: I0309 13:48:05.932995 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-nxs42"
Mar 09 13:48:05 crc kubenswrapper[4764]: I0309 13:48:05.939271 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29551062-wl7gf"]
Mar 09 13:48:05 crc kubenswrapper[4764]: I0309 13:48:05.994187 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-nxs42"]
Mar 09 13:48:07 crc kubenswrapper[4764]: I0309 13:48:07.572666 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="61f41fcc-bb74-4c90-a6af-bfcd168ef2cb" path="/var/lib/kubelet/pods/61f41fcc-bb74-4c90-a6af-bfcd168ef2cb/volumes"
Mar 09 13:48:07 crc kubenswrapper[4764]: I0309 13:48:07.893532 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-nxs42" podUID="1852f78c-18c6-481e-bf04-c3eba97b11e7" containerName="registry-server" containerID="cri-o://06a8e00ee599e76b9570f836b2bc6f955ebe71a9de168cbb360d725880a2a573" gracePeriod=2
Mar 09 13:48:08 crc kubenswrapper[4764]: I0309 13:48:08.361882 4764 util.go:48] "No ready sandbox for pod can be 
Need to start a new one" pod="openshift-marketplace/community-operators-nxs42" Mar 09 13:48:08 crc kubenswrapper[4764]: I0309 13:48:08.517306 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1852f78c-18c6-481e-bf04-c3eba97b11e7-utilities\") pod \"1852f78c-18c6-481e-bf04-c3eba97b11e7\" (UID: \"1852f78c-18c6-481e-bf04-c3eba97b11e7\") " Mar 09 13:48:08 crc kubenswrapper[4764]: I0309 13:48:08.517496 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q6gwd\" (UniqueName: \"kubernetes.io/projected/1852f78c-18c6-481e-bf04-c3eba97b11e7-kube-api-access-q6gwd\") pod \"1852f78c-18c6-481e-bf04-c3eba97b11e7\" (UID: \"1852f78c-18c6-481e-bf04-c3eba97b11e7\") " Mar 09 13:48:08 crc kubenswrapper[4764]: I0309 13:48:08.517610 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1852f78c-18c6-481e-bf04-c3eba97b11e7-catalog-content\") pod \"1852f78c-18c6-481e-bf04-c3eba97b11e7\" (UID: \"1852f78c-18c6-481e-bf04-c3eba97b11e7\") " Mar 09 13:48:08 crc kubenswrapper[4764]: I0309 13:48:08.521156 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1852f78c-18c6-481e-bf04-c3eba97b11e7-utilities" (OuterVolumeSpecName: "utilities") pod "1852f78c-18c6-481e-bf04-c3eba97b11e7" (UID: "1852f78c-18c6-481e-bf04-c3eba97b11e7"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 13:48:08 crc kubenswrapper[4764]: I0309 13:48:08.532021 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1852f78c-18c6-481e-bf04-c3eba97b11e7-kube-api-access-q6gwd" (OuterVolumeSpecName: "kube-api-access-q6gwd") pod "1852f78c-18c6-481e-bf04-c3eba97b11e7" (UID: "1852f78c-18c6-481e-bf04-c3eba97b11e7"). InnerVolumeSpecName "kube-api-access-q6gwd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:48:08 crc kubenswrapper[4764]: I0309 13:48:08.604839 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1852f78c-18c6-481e-bf04-c3eba97b11e7-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1852f78c-18c6-481e-bf04-c3eba97b11e7" (UID: "1852f78c-18c6-481e-bf04-c3eba97b11e7"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 13:48:08 crc kubenswrapper[4764]: I0309 13:48:08.621078 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q6gwd\" (UniqueName: \"kubernetes.io/projected/1852f78c-18c6-481e-bf04-c3eba97b11e7-kube-api-access-q6gwd\") on node \"crc\" DevicePath \"\"" Mar 09 13:48:08 crc kubenswrapper[4764]: I0309 13:48:08.621116 4764 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1852f78c-18c6-481e-bf04-c3eba97b11e7-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 09 13:48:08 crc kubenswrapper[4764]: I0309 13:48:08.621127 4764 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1852f78c-18c6-481e-bf04-c3eba97b11e7-utilities\") on node \"crc\" DevicePath \"\"" Mar 09 13:48:08 crc kubenswrapper[4764]: I0309 13:48:08.907304 4764 generic.go:334] "Generic (PLEG): container finished" podID="1852f78c-18c6-481e-bf04-c3eba97b11e7" containerID="06a8e00ee599e76b9570f836b2bc6f955ebe71a9de168cbb360d725880a2a573" exitCode=0 Mar 09 13:48:08 crc kubenswrapper[4764]: I0309 13:48:08.907842 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nxs42" event={"ID":"1852f78c-18c6-481e-bf04-c3eba97b11e7","Type":"ContainerDied","Data":"06a8e00ee599e76b9570f836b2bc6f955ebe71a9de168cbb360d725880a2a573"} Mar 09 13:48:08 crc kubenswrapper[4764]: I0309 13:48:08.907890 4764 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/community-operators-nxs42" event={"ID":"1852f78c-18c6-481e-bf04-c3eba97b11e7","Type":"ContainerDied","Data":"296edd151f10a4f45fecda1a67c35226a5a35655cc126212c50ad827e9d7aed5"} Mar 09 13:48:08 crc kubenswrapper[4764]: I0309 13:48:08.907920 4764 scope.go:117] "RemoveContainer" containerID="06a8e00ee599e76b9570f836b2bc6f955ebe71a9de168cbb360d725880a2a573" Mar 09 13:48:08 crc kubenswrapper[4764]: I0309 13:48:08.908157 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-nxs42" Mar 09 13:48:08 crc kubenswrapper[4764]: I0309 13:48:08.967104 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-nxs42"] Mar 09 13:48:08 crc kubenswrapper[4764]: I0309 13:48:08.967965 4764 scope.go:117] "RemoveContainer" containerID="379ba48f962950be95366dcbf3c94dd62e989eb1302cd5bb42af707eed9f0e85" Mar 09 13:48:08 crc kubenswrapper[4764]: I0309 13:48:08.977269 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-nxs42"] Mar 09 13:48:09 crc kubenswrapper[4764]: I0309 13:48:09.008429 4764 scope.go:117] "RemoveContainer" containerID="be5da138897f4a01b72eee2c62c947a531c8838ff68ca8adea703659041c2147" Mar 09 13:48:09 crc kubenswrapper[4764]: I0309 13:48:09.041266 4764 scope.go:117] "RemoveContainer" containerID="06a8e00ee599e76b9570f836b2bc6f955ebe71a9de168cbb360d725880a2a573" Mar 09 13:48:09 crc kubenswrapper[4764]: E0309 13:48:09.041779 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"06a8e00ee599e76b9570f836b2bc6f955ebe71a9de168cbb360d725880a2a573\": container with ID starting with 06a8e00ee599e76b9570f836b2bc6f955ebe71a9de168cbb360d725880a2a573 not found: ID does not exist" containerID="06a8e00ee599e76b9570f836b2bc6f955ebe71a9de168cbb360d725880a2a573" Mar 09 13:48:09 crc kubenswrapper[4764]: I0309 
13:48:09.041830 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"06a8e00ee599e76b9570f836b2bc6f955ebe71a9de168cbb360d725880a2a573"} err="failed to get container status \"06a8e00ee599e76b9570f836b2bc6f955ebe71a9de168cbb360d725880a2a573\": rpc error: code = NotFound desc = could not find container \"06a8e00ee599e76b9570f836b2bc6f955ebe71a9de168cbb360d725880a2a573\": container with ID starting with 06a8e00ee599e76b9570f836b2bc6f955ebe71a9de168cbb360d725880a2a573 not found: ID does not exist" Mar 09 13:48:09 crc kubenswrapper[4764]: I0309 13:48:09.041864 4764 scope.go:117] "RemoveContainer" containerID="379ba48f962950be95366dcbf3c94dd62e989eb1302cd5bb42af707eed9f0e85" Mar 09 13:48:09 crc kubenswrapper[4764]: E0309 13:48:09.042133 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"379ba48f962950be95366dcbf3c94dd62e989eb1302cd5bb42af707eed9f0e85\": container with ID starting with 379ba48f962950be95366dcbf3c94dd62e989eb1302cd5bb42af707eed9f0e85 not found: ID does not exist" containerID="379ba48f962950be95366dcbf3c94dd62e989eb1302cd5bb42af707eed9f0e85" Mar 09 13:48:09 crc kubenswrapper[4764]: I0309 13:48:09.042168 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"379ba48f962950be95366dcbf3c94dd62e989eb1302cd5bb42af707eed9f0e85"} err="failed to get container status \"379ba48f962950be95366dcbf3c94dd62e989eb1302cd5bb42af707eed9f0e85\": rpc error: code = NotFound desc = could not find container \"379ba48f962950be95366dcbf3c94dd62e989eb1302cd5bb42af707eed9f0e85\": container with ID starting with 379ba48f962950be95366dcbf3c94dd62e989eb1302cd5bb42af707eed9f0e85 not found: ID does not exist" Mar 09 13:48:09 crc kubenswrapper[4764]: I0309 13:48:09.042190 4764 scope.go:117] "RemoveContainer" containerID="be5da138897f4a01b72eee2c62c947a531c8838ff68ca8adea703659041c2147" Mar 09 13:48:09 crc 
kubenswrapper[4764]: E0309 13:48:09.042410 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"be5da138897f4a01b72eee2c62c947a531c8838ff68ca8adea703659041c2147\": container with ID starting with be5da138897f4a01b72eee2c62c947a531c8838ff68ca8adea703659041c2147 not found: ID does not exist" containerID="be5da138897f4a01b72eee2c62c947a531c8838ff68ca8adea703659041c2147" Mar 09 13:48:09 crc kubenswrapper[4764]: I0309 13:48:09.042442 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"be5da138897f4a01b72eee2c62c947a531c8838ff68ca8adea703659041c2147"} err="failed to get container status \"be5da138897f4a01b72eee2c62c947a531c8838ff68ca8adea703659041c2147\": rpc error: code = NotFound desc = could not find container \"be5da138897f4a01b72eee2c62c947a531c8838ff68ca8adea703659041c2147\": container with ID starting with be5da138897f4a01b72eee2c62c947a531c8838ff68ca8adea703659041c2147 not found: ID does not exist" Mar 09 13:48:09 crc kubenswrapper[4764]: I0309 13:48:09.600015 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1852f78c-18c6-481e-bf04-c3eba97b11e7" path="/var/lib/kubelet/pods/1852f78c-18c6-481e-bf04-c3eba97b11e7/volumes" Mar 09 13:48:10 crc kubenswrapper[4764]: I0309 13:48:10.560335 4764 scope.go:117] "RemoveContainer" containerID="a87b87af769425372dc2009201df4d5e0d5a91fd68dbae94c9002e0b7115dafb" Mar 09 13:48:10 crc kubenswrapper[4764]: E0309 13:48:10.561260 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xxczl_openshift-machine-config-operator(6bcdd179-43c2-427c-9fac-7155c122e922)\"" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922" Mar 09 13:48:21 crc 
kubenswrapper[4764]: I0309 13:48:21.481449 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-5j7k8"] Mar 09 13:48:21 crc kubenswrapper[4764]: E0309 13:48:21.482969 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ab11b944-7857-4998-b32b-264ac7683616" containerName="oc" Mar 09 13:48:21 crc kubenswrapper[4764]: I0309 13:48:21.482993 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="ab11b944-7857-4998-b32b-264ac7683616" containerName="oc" Mar 09 13:48:21 crc kubenswrapper[4764]: E0309 13:48:21.483010 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1852f78c-18c6-481e-bf04-c3eba97b11e7" containerName="registry-server" Mar 09 13:48:21 crc kubenswrapper[4764]: I0309 13:48:21.483018 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="1852f78c-18c6-481e-bf04-c3eba97b11e7" containerName="registry-server" Mar 09 13:48:21 crc kubenswrapper[4764]: E0309 13:48:21.483040 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1852f78c-18c6-481e-bf04-c3eba97b11e7" containerName="extract-content" Mar 09 13:48:21 crc kubenswrapper[4764]: I0309 13:48:21.483048 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="1852f78c-18c6-481e-bf04-c3eba97b11e7" containerName="extract-content" Mar 09 13:48:21 crc kubenswrapper[4764]: E0309 13:48:21.483086 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1852f78c-18c6-481e-bf04-c3eba97b11e7" containerName="extract-utilities" Mar 09 13:48:21 crc kubenswrapper[4764]: I0309 13:48:21.483095 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="1852f78c-18c6-481e-bf04-c3eba97b11e7" containerName="extract-utilities" Mar 09 13:48:21 crc kubenswrapper[4764]: I0309 13:48:21.483332 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="1852f78c-18c6-481e-bf04-c3eba97b11e7" containerName="registry-server" Mar 09 13:48:21 crc kubenswrapper[4764]: I0309 13:48:21.483356 4764 
memory_manager.go:354] "RemoveStaleState removing state" podUID="ab11b944-7857-4998-b32b-264ac7683616" containerName="oc" Mar 09 13:48:21 crc kubenswrapper[4764]: I0309 13:48:21.485210 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5j7k8" Mar 09 13:48:21 crc kubenswrapper[4764]: I0309 13:48:21.500695 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-5j7k8"] Mar 09 13:48:21 crc kubenswrapper[4764]: I0309 13:48:21.543779 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dqhfj\" (UniqueName: \"kubernetes.io/projected/4752ae6f-41a1-4958-a438-d02f33f433b9-kube-api-access-dqhfj\") pod \"redhat-marketplace-5j7k8\" (UID: \"4752ae6f-41a1-4958-a438-d02f33f433b9\") " pod="openshift-marketplace/redhat-marketplace-5j7k8" Mar 09 13:48:21 crc kubenswrapper[4764]: I0309 13:48:21.543970 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4752ae6f-41a1-4958-a438-d02f33f433b9-utilities\") pod \"redhat-marketplace-5j7k8\" (UID: \"4752ae6f-41a1-4958-a438-d02f33f433b9\") " pod="openshift-marketplace/redhat-marketplace-5j7k8" Mar 09 13:48:21 crc kubenswrapper[4764]: I0309 13:48:21.544025 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4752ae6f-41a1-4958-a438-d02f33f433b9-catalog-content\") pod \"redhat-marketplace-5j7k8\" (UID: \"4752ae6f-41a1-4958-a438-d02f33f433b9\") " pod="openshift-marketplace/redhat-marketplace-5j7k8" Mar 09 13:48:21 crc kubenswrapper[4764]: I0309 13:48:21.560075 4764 scope.go:117] "RemoveContainer" containerID="a87b87af769425372dc2009201df4d5e0d5a91fd68dbae94c9002e0b7115dafb" Mar 09 13:48:21 crc kubenswrapper[4764]: E0309 13:48:21.560413 4764 pod_workers.go:1301] 
"Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xxczl_openshift-machine-config-operator(6bcdd179-43c2-427c-9fac-7155c122e922)\"" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922" Mar 09 13:48:21 crc kubenswrapper[4764]: I0309 13:48:21.651350 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4752ae6f-41a1-4958-a438-d02f33f433b9-utilities\") pod \"redhat-marketplace-5j7k8\" (UID: \"4752ae6f-41a1-4958-a438-d02f33f433b9\") " pod="openshift-marketplace/redhat-marketplace-5j7k8" Mar 09 13:48:21 crc kubenswrapper[4764]: I0309 13:48:21.651450 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4752ae6f-41a1-4958-a438-d02f33f433b9-catalog-content\") pod \"redhat-marketplace-5j7k8\" (UID: \"4752ae6f-41a1-4958-a438-d02f33f433b9\") " pod="openshift-marketplace/redhat-marketplace-5j7k8" Mar 09 13:48:21 crc kubenswrapper[4764]: I0309 13:48:21.651532 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dqhfj\" (UniqueName: \"kubernetes.io/projected/4752ae6f-41a1-4958-a438-d02f33f433b9-kube-api-access-dqhfj\") pod \"redhat-marketplace-5j7k8\" (UID: \"4752ae6f-41a1-4958-a438-d02f33f433b9\") " pod="openshift-marketplace/redhat-marketplace-5j7k8" Mar 09 13:48:21 crc kubenswrapper[4764]: I0309 13:48:21.652711 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4752ae6f-41a1-4958-a438-d02f33f433b9-catalog-content\") pod \"redhat-marketplace-5j7k8\" (UID: \"4752ae6f-41a1-4958-a438-d02f33f433b9\") " pod="openshift-marketplace/redhat-marketplace-5j7k8" Mar 09 13:48:21 crc 
kubenswrapper[4764]: I0309 13:48:21.652706 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4752ae6f-41a1-4958-a438-d02f33f433b9-utilities\") pod \"redhat-marketplace-5j7k8\" (UID: \"4752ae6f-41a1-4958-a438-d02f33f433b9\") " pod="openshift-marketplace/redhat-marketplace-5j7k8" Mar 09 13:48:21 crc kubenswrapper[4764]: I0309 13:48:21.675178 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dqhfj\" (UniqueName: \"kubernetes.io/projected/4752ae6f-41a1-4958-a438-d02f33f433b9-kube-api-access-dqhfj\") pod \"redhat-marketplace-5j7k8\" (UID: \"4752ae6f-41a1-4958-a438-d02f33f433b9\") " pod="openshift-marketplace/redhat-marketplace-5j7k8" Mar 09 13:48:21 crc kubenswrapper[4764]: I0309 13:48:21.814840 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5j7k8" Mar 09 13:48:22 crc kubenswrapper[4764]: I0309 13:48:22.315635 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-5j7k8"] Mar 09 13:48:23 crc kubenswrapper[4764]: I0309 13:48:23.057222 4764 generic.go:334] "Generic (PLEG): container finished" podID="4752ae6f-41a1-4958-a438-d02f33f433b9" containerID="e26198f59e1d2cba30bef93d2fc072b025f9c4c68243d49420c21a271867be6c" exitCode=0 Mar 09 13:48:23 crc kubenswrapper[4764]: I0309 13:48:23.059108 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5j7k8" event={"ID":"4752ae6f-41a1-4958-a438-d02f33f433b9","Type":"ContainerDied","Data":"e26198f59e1d2cba30bef93d2fc072b025f9c4c68243d49420c21a271867be6c"} Mar 09 13:48:23 crc kubenswrapper[4764]: I0309 13:48:23.059228 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5j7k8" 
event={"ID":"4752ae6f-41a1-4958-a438-d02f33f433b9","Type":"ContainerStarted","Data":"cdfbd1bebce105b4241f3b0f024b996f4c76295bcaa2064aa1ca4accccba2294"} Mar 09 13:48:24 crc kubenswrapper[4764]: I0309 13:48:24.072441 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5j7k8" event={"ID":"4752ae6f-41a1-4958-a438-d02f33f433b9","Type":"ContainerStarted","Data":"60cd6fbe18efa3c4fddc9f30d3caa7b50e639752b292b13cb630a9f474a0231a"} Mar 09 13:48:25 crc kubenswrapper[4764]: I0309 13:48:25.084386 4764 generic.go:334] "Generic (PLEG): container finished" podID="4752ae6f-41a1-4958-a438-d02f33f433b9" containerID="60cd6fbe18efa3c4fddc9f30d3caa7b50e639752b292b13cb630a9f474a0231a" exitCode=0 Mar 09 13:48:25 crc kubenswrapper[4764]: I0309 13:48:25.084448 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5j7k8" event={"ID":"4752ae6f-41a1-4958-a438-d02f33f433b9","Type":"ContainerDied","Data":"60cd6fbe18efa3c4fddc9f30d3caa7b50e639752b292b13cb630a9f474a0231a"} Mar 09 13:48:26 crc kubenswrapper[4764]: I0309 13:48:26.097083 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5j7k8" event={"ID":"4752ae6f-41a1-4958-a438-d02f33f433b9","Type":"ContainerStarted","Data":"2e3c4f2311885cf47d183a10bee85def44f67398c81277a86e814ec507b51816"} Mar 09 13:48:26 crc kubenswrapper[4764]: I0309 13:48:26.125009 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-5j7k8" podStartSLOduration=2.49528295 podStartE2EDuration="5.124980191s" podCreationTimestamp="2026-03-09 13:48:21 +0000 UTC" firstStartedPulling="2026-03-09 13:48:23.062090989 +0000 UTC m=+1658.312262897" lastFinishedPulling="2026-03-09 13:48:25.69178823 +0000 UTC m=+1660.941960138" observedRunningTime="2026-03-09 13:48:26.121449027 +0000 UTC m=+1661.371620955" watchObservedRunningTime="2026-03-09 13:48:26.124980191 +0000 UTC 
m=+1661.375152099" Mar 09 13:48:31 crc kubenswrapper[4764]: I0309 13:48:31.815281 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-5j7k8" Mar 09 13:48:31 crc kubenswrapper[4764]: I0309 13:48:31.816173 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-5j7k8" Mar 09 13:48:31 crc kubenswrapper[4764]: I0309 13:48:31.872159 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-5j7k8" Mar 09 13:48:32 crc kubenswrapper[4764]: I0309 13:48:32.216145 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-5j7k8" Mar 09 13:48:32 crc kubenswrapper[4764]: I0309 13:48:32.753303 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-5j7k8"] Mar 09 13:48:34 crc kubenswrapper[4764]: I0309 13:48:34.189232 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-5j7k8" podUID="4752ae6f-41a1-4958-a438-d02f33f433b9" containerName="registry-server" containerID="cri-o://2e3c4f2311885cf47d183a10bee85def44f67398c81277a86e814ec507b51816" gracePeriod=2 Mar 09 13:48:34 crc kubenswrapper[4764]: I0309 13:48:34.676400 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5j7k8" Mar 09 13:48:34 crc kubenswrapper[4764]: I0309 13:48:34.775569 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4752ae6f-41a1-4958-a438-d02f33f433b9-catalog-content\") pod \"4752ae6f-41a1-4958-a438-d02f33f433b9\" (UID: \"4752ae6f-41a1-4958-a438-d02f33f433b9\") " Mar 09 13:48:34 crc kubenswrapper[4764]: I0309 13:48:34.775698 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4752ae6f-41a1-4958-a438-d02f33f433b9-utilities\") pod \"4752ae6f-41a1-4958-a438-d02f33f433b9\" (UID: \"4752ae6f-41a1-4958-a438-d02f33f433b9\") " Mar 09 13:48:34 crc kubenswrapper[4764]: I0309 13:48:34.775895 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dqhfj\" (UniqueName: \"kubernetes.io/projected/4752ae6f-41a1-4958-a438-d02f33f433b9-kube-api-access-dqhfj\") pod \"4752ae6f-41a1-4958-a438-d02f33f433b9\" (UID: \"4752ae6f-41a1-4958-a438-d02f33f433b9\") " Mar 09 13:48:34 crc kubenswrapper[4764]: I0309 13:48:34.776815 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4752ae6f-41a1-4958-a438-d02f33f433b9-utilities" (OuterVolumeSpecName: "utilities") pod "4752ae6f-41a1-4958-a438-d02f33f433b9" (UID: "4752ae6f-41a1-4958-a438-d02f33f433b9"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 13:48:34 crc kubenswrapper[4764]: I0309 13:48:34.784595 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4752ae6f-41a1-4958-a438-d02f33f433b9-kube-api-access-dqhfj" (OuterVolumeSpecName: "kube-api-access-dqhfj") pod "4752ae6f-41a1-4958-a438-d02f33f433b9" (UID: "4752ae6f-41a1-4958-a438-d02f33f433b9"). InnerVolumeSpecName "kube-api-access-dqhfj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:48:34 crc kubenswrapper[4764]: I0309 13:48:34.813247 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4752ae6f-41a1-4958-a438-d02f33f433b9-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4752ae6f-41a1-4958-a438-d02f33f433b9" (UID: "4752ae6f-41a1-4958-a438-d02f33f433b9"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 13:48:34 crc kubenswrapper[4764]: I0309 13:48:34.879210 4764 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4752ae6f-41a1-4958-a438-d02f33f433b9-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 09 13:48:34 crc kubenswrapper[4764]: I0309 13:48:34.879265 4764 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4752ae6f-41a1-4958-a438-d02f33f433b9-utilities\") on node \"crc\" DevicePath \"\"" Mar 09 13:48:34 crc kubenswrapper[4764]: I0309 13:48:34.879342 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dqhfj\" (UniqueName: \"kubernetes.io/projected/4752ae6f-41a1-4958-a438-d02f33f433b9-kube-api-access-dqhfj\") on node \"crc\" DevicePath \"\"" Mar 09 13:48:35 crc kubenswrapper[4764]: I0309 13:48:35.206360 4764 generic.go:334] "Generic (PLEG): container finished" podID="4752ae6f-41a1-4958-a438-d02f33f433b9" containerID="2e3c4f2311885cf47d183a10bee85def44f67398c81277a86e814ec507b51816" exitCode=0 Mar 09 13:48:35 crc kubenswrapper[4764]: I0309 13:48:35.206430 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5j7k8" event={"ID":"4752ae6f-41a1-4958-a438-d02f33f433b9","Type":"ContainerDied","Data":"2e3c4f2311885cf47d183a10bee85def44f67398c81277a86e814ec507b51816"} Mar 09 13:48:35 crc kubenswrapper[4764]: I0309 13:48:35.206486 4764 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-marketplace-5j7k8" event={"ID":"4752ae6f-41a1-4958-a438-d02f33f433b9","Type":"ContainerDied","Data":"cdfbd1bebce105b4241f3b0f024b996f4c76295bcaa2064aa1ca4accccba2294"}
Mar 09 13:48:35 crc kubenswrapper[4764]: I0309 13:48:35.206505 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5j7k8"
Mar 09 13:48:35 crc kubenswrapper[4764]: I0309 13:48:35.206523 4764 scope.go:117] "RemoveContainer" containerID="2e3c4f2311885cf47d183a10bee85def44f67398c81277a86e814ec507b51816"
Mar 09 13:48:35 crc kubenswrapper[4764]: I0309 13:48:35.252993 4764 scope.go:117] "RemoveContainer" containerID="60cd6fbe18efa3c4fddc9f30d3caa7b50e639752b292b13cb630a9f474a0231a"
Mar 09 13:48:35 crc kubenswrapper[4764]: I0309 13:48:35.271715 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-5j7k8"]
Mar 09 13:48:35 crc kubenswrapper[4764]: I0309 13:48:35.280423 4764 scope.go:117] "RemoveContainer" containerID="e26198f59e1d2cba30bef93d2fc072b025f9c4c68243d49420c21a271867be6c"
Mar 09 13:48:35 crc kubenswrapper[4764]: I0309 13:48:35.286439 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-5j7k8"]
Mar 09 13:48:35 crc kubenswrapper[4764]: I0309 13:48:35.322960 4764 scope.go:117] "RemoveContainer" containerID="2e3c4f2311885cf47d183a10bee85def44f67398c81277a86e814ec507b51816"
Mar 09 13:48:35 crc kubenswrapper[4764]: E0309 13:48:35.323533 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2e3c4f2311885cf47d183a10bee85def44f67398c81277a86e814ec507b51816\": container with ID starting with 2e3c4f2311885cf47d183a10bee85def44f67398c81277a86e814ec507b51816 not found: ID does not exist" containerID="2e3c4f2311885cf47d183a10bee85def44f67398c81277a86e814ec507b51816"
Mar 09 13:48:35 crc kubenswrapper[4764]: I0309 13:48:35.323576 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2e3c4f2311885cf47d183a10bee85def44f67398c81277a86e814ec507b51816"} err="failed to get container status \"2e3c4f2311885cf47d183a10bee85def44f67398c81277a86e814ec507b51816\": rpc error: code = NotFound desc = could not find container \"2e3c4f2311885cf47d183a10bee85def44f67398c81277a86e814ec507b51816\": container with ID starting with 2e3c4f2311885cf47d183a10bee85def44f67398c81277a86e814ec507b51816 not found: ID does not exist"
Mar 09 13:48:35 crc kubenswrapper[4764]: I0309 13:48:35.323606 4764 scope.go:117] "RemoveContainer" containerID="60cd6fbe18efa3c4fddc9f30d3caa7b50e639752b292b13cb630a9f474a0231a"
Mar 09 13:48:35 crc kubenswrapper[4764]: E0309 13:48:35.324139 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"60cd6fbe18efa3c4fddc9f30d3caa7b50e639752b292b13cb630a9f474a0231a\": container with ID starting with 60cd6fbe18efa3c4fddc9f30d3caa7b50e639752b292b13cb630a9f474a0231a not found: ID does not exist" containerID="60cd6fbe18efa3c4fddc9f30d3caa7b50e639752b292b13cb630a9f474a0231a"
Mar 09 13:48:35 crc kubenswrapper[4764]: I0309 13:48:35.324171 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"60cd6fbe18efa3c4fddc9f30d3caa7b50e639752b292b13cb630a9f474a0231a"} err="failed to get container status \"60cd6fbe18efa3c4fddc9f30d3caa7b50e639752b292b13cb630a9f474a0231a\": rpc error: code = NotFound desc = could not find container \"60cd6fbe18efa3c4fddc9f30d3caa7b50e639752b292b13cb630a9f474a0231a\": container with ID starting with 60cd6fbe18efa3c4fddc9f30d3caa7b50e639752b292b13cb630a9f474a0231a not found: ID does not exist"
Mar 09 13:48:35 crc kubenswrapper[4764]: I0309 13:48:35.324188 4764 scope.go:117] "RemoveContainer" containerID="e26198f59e1d2cba30bef93d2fc072b025f9c4c68243d49420c21a271867be6c"
Mar 09 13:48:35 crc kubenswrapper[4764]: E0309 13:48:35.324628 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e26198f59e1d2cba30bef93d2fc072b025f9c4c68243d49420c21a271867be6c\": container with ID starting with e26198f59e1d2cba30bef93d2fc072b025f9c4c68243d49420c21a271867be6c not found: ID does not exist" containerID="e26198f59e1d2cba30bef93d2fc072b025f9c4c68243d49420c21a271867be6c"
Mar 09 13:48:35 crc kubenswrapper[4764]: I0309 13:48:35.324722 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e26198f59e1d2cba30bef93d2fc072b025f9c4c68243d49420c21a271867be6c"} err="failed to get container status \"e26198f59e1d2cba30bef93d2fc072b025f9c4c68243d49420c21a271867be6c\": rpc error: code = NotFound desc = could not find container \"e26198f59e1d2cba30bef93d2fc072b025f9c4c68243d49420c21a271867be6c\": container with ID starting with e26198f59e1d2cba30bef93d2fc072b025f9c4c68243d49420c21a271867be6c not found: ID does not exist"
Mar 09 13:48:35 crc kubenswrapper[4764]: I0309 13:48:35.571970 4764 scope.go:117] "RemoveContainer" containerID="a87b87af769425372dc2009201df4d5e0d5a91fd68dbae94c9002e0b7115dafb"
Mar 09 13:48:35 crc kubenswrapper[4764]: E0309 13:48:35.572467 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xxczl_openshift-machine-config-operator(6bcdd179-43c2-427c-9fac-7155c122e922)\"" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922"
Mar 09 13:48:35 crc kubenswrapper[4764]: I0309 13:48:35.579541 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4752ae6f-41a1-4958-a438-d02f33f433b9" path="/var/lib/kubelet/pods/4752ae6f-41a1-4958-a438-d02f33f433b9/volumes"
Mar 09 13:48:36 crc kubenswrapper[4764]: I0309 13:48:36.218817 4764 generic.go:334] "Generic (PLEG): container finished" podID="fe0d9990-083b-428b-baec-a40ae99487db" containerID="6e4f8f0feb8ec9ecf660634549716b0c350cba173b98c033e4c7e09aa6bd108d" exitCode=0
Mar 09 13:48:36 crc kubenswrapper[4764]: I0309 13:48:36.218893 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-8lzvr" event={"ID":"fe0d9990-083b-428b-baec-a40ae99487db","Type":"ContainerDied","Data":"6e4f8f0feb8ec9ecf660634549716b0c350cba173b98c033e4c7e09aa6bd108d"}
Mar 09 13:48:37 crc kubenswrapper[4764]: I0309 13:48:37.685200 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-8lzvr"
Mar 09 13:48:37 crc kubenswrapper[4764]: I0309 13:48:37.846149 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe0d9990-083b-428b-baec-a40ae99487db-bootstrap-combined-ca-bundle\") pod \"fe0d9990-083b-428b-baec-a40ae99487db\" (UID: \"fe0d9990-083b-428b-baec-a40ae99487db\") "
Mar 09 13:48:37 crc kubenswrapper[4764]: I0309 13:48:37.846206 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rrpw2\" (UniqueName: \"kubernetes.io/projected/fe0d9990-083b-428b-baec-a40ae99487db-kube-api-access-rrpw2\") pod \"fe0d9990-083b-428b-baec-a40ae99487db\" (UID: \"fe0d9990-083b-428b-baec-a40ae99487db\") "
Mar 09 13:48:37 crc kubenswrapper[4764]: I0309 13:48:37.846289 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/fe0d9990-083b-428b-baec-a40ae99487db-ssh-key-openstack-edpm-ipam\") pod \"fe0d9990-083b-428b-baec-a40ae99487db\" (UID: \"fe0d9990-083b-428b-baec-a40ae99487db\") "
Mar 09 13:48:37 crc kubenswrapper[4764]: I0309 13:48:37.846557 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fe0d9990-083b-428b-baec-a40ae99487db-inventory\") pod \"fe0d9990-083b-428b-baec-a40ae99487db\" (UID: \"fe0d9990-083b-428b-baec-a40ae99487db\") "
Mar 09 13:48:37 crc kubenswrapper[4764]: I0309 13:48:37.854580 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fe0d9990-083b-428b-baec-a40ae99487db-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "fe0d9990-083b-428b-baec-a40ae99487db" (UID: "fe0d9990-083b-428b-baec-a40ae99487db"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 13:48:37 crc kubenswrapper[4764]: I0309 13:48:37.855094 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fe0d9990-083b-428b-baec-a40ae99487db-kube-api-access-rrpw2" (OuterVolumeSpecName: "kube-api-access-rrpw2") pod "fe0d9990-083b-428b-baec-a40ae99487db" (UID: "fe0d9990-083b-428b-baec-a40ae99487db"). InnerVolumeSpecName "kube-api-access-rrpw2". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 13:48:37 crc kubenswrapper[4764]: I0309 13:48:37.877541 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fe0d9990-083b-428b-baec-a40ae99487db-inventory" (OuterVolumeSpecName: "inventory") pod "fe0d9990-083b-428b-baec-a40ae99487db" (UID: "fe0d9990-083b-428b-baec-a40ae99487db"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 13:48:37 crc kubenswrapper[4764]: I0309 13:48:37.884100 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fe0d9990-083b-428b-baec-a40ae99487db-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "fe0d9990-083b-428b-baec-a40ae99487db" (UID: "fe0d9990-083b-428b-baec-a40ae99487db"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 13:48:37 crc kubenswrapper[4764]: I0309 13:48:37.949448 4764 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fe0d9990-083b-428b-baec-a40ae99487db-inventory\") on node \"crc\" DevicePath \"\""
Mar 09 13:48:37 crc kubenswrapper[4764]: I0309 13:48:37.949509 4764 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe0d9990-083b-428b-baec-a40ae99487db-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 09 13:48:37 crc kubenswrapper[4764]: I0309 13:48:37.949525 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rrpw2\" (UniqueName: \"kubernetes.io/projected/fe0d9990-083b-428b-baec-a40ae99487db-kube-api-access-rrpw2\") on node \"crc\" DevicePath \"\""
Mar 09 13:48:37 crc kubenswrapper[4764]: I0309 13:48:37.949538 4764 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/fe0d9990-083b-428b-baec-a40ae99487db-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\""
Mar 09 13:48:38 crc kubenswrapper[4764]: I0309 13:48:38.243911 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-8lzvr" event={"ID":"fe0d9990-083b-428b-baec-a40ae99487db","Type":"ContainerDied","Data":"88f1f6de4e55f98cf3fb91581e4ad15ec69d81b47c69d9187bda289cfb67da48"}
Mar 09 13:48:38 crc kubenswrapper[4764]: I0309 13:48:38.244461 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="88f1f6de4e55f98cf3fb91581e4ad15ec69d81b47c69d9187bda289cfb67da48"
Mar 09 13:48:38 crc kubenswrapper[4764]: I0309 13:48:38.244266 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-8lzvr"
Mar 09 13:48:38 crc kubenswrapper[4764]: I0309 13:48:38.341848 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-4pj8q"]
Mar 09 13:48:38 crc kubenswrapper[4764]: E0309 13:48:38.342471 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4752ae6f-41a1-4958-a438-d02f33f433b9" containerName="registry-server"
Mar 09 13:48:38 crc kubenswrapper[4764]: I0309 13:48:38.342503 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="4752ae6f-41a1-4958-a438-d02f33f433b9" containerName="registry-server"
Mar 09 13:48:38 crc kubenswrapper[4764]: E0309 13:48:38.342529 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4752ae6f-41a1-4958-a438-d02f33f433b9" containerName="extract-content"
Mar 09 13:48:38 crc kubenswrapper[4764]: I0309 13:48:38.342539 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="4752ae6f-41a1-4958-a438-d02f33f433b9" containerName="extract-content"
Mar 09 13:48:38 crc kubenswrapper[4764]: E0309 13:48:38.342572 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fe0d9990-083b-428b-baec-a40ae99487db" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam"
Mar 09 13:48:38 crc kubenswrapper[4764]: I0309 13:48:38.342586 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="fe0d9990-083b-428b-baec-a40ae99487db" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam"
Mar 09 13:48:38 crc kubenswrapper[4764]: E0309 13:48:38.342603 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4752ae6f-41a1-4958-a438-d02f33f433b9" containerName="extract-utilities"
Mar 09 13:48:38 crc kubenswrapper[4764]: I0309 13:48:38.342612 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="4752ae6f-41a1-4958-a438-d02f33f433b9" containerName="extract-utilities"
Mar 09 13:48:38 crc kubenswrapper[4764]: I0309 13:48:38.342807 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="4752ae6f-41a1-4958-a438-d02f33f433b9" containerName="registry-server"
Mar 09 13:48:38 crc kubenswrapper[4764]: I0309 13:48:38.342831 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="fe0d9990-083b-428b-baec-a40ae99487db" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam"
Mar 09 13:48:38 crc kubenswrapper[4764]: I0309 13:48:38.343637 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-4pj8q"
Mar 09 13:48:38 crc kubenswrapper[4764]: I0309 13:48:38.349364 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-ctrj8"
Mar 09 13:48:38 crc kubenswrapper[4764]: I0309 13:48:38.349793 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Mar 09 13:48:38 crc kubenswrapper[4764]: I0309 13:48:38.349966 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Mar 09 13:48:38 crc kubenswrapper[4764]: I0309 13:48:38.350077 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Mar 09 13:48:38 crc kubenswrapper[4764]: I0309 13:48:38.354415 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-4pj8q"]
Mar 09 13:48:38 crc kubenswrapper[4764]: I0309 13:48:38.461632 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/85b3887c-ea0d-4ca0-a862-0134f0ae08b5-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-4pj8q\" (UID: \"85b3887c-ea0d-4ca0-a862-0134f0ae08b5\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-4pj8q"
Mar 09 13:48:38 crc kubenswrapper[4764]: I0309 13:48:38.461821 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h9cnl\" (UniqueName: \"kubernetes.io/projected/85b3887c-ea0d-4ca0-a862-0134f0ae08b5-kube-api-access-h9cnl\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-4pj8q\" (UID: \"85b3887c-ea0d-4ca0-a862-0134f0ae08b5\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-4pj8q"
Mar 09 13:48:38 crc kubenswrapper[4764]: I0309 13:48:38.461909 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/85b3887c-ea0d-4ca0-a862-0134f0ae08b5-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-4pj8q\" (UID: \"85b3887c-ea0d-4ca0-a862-0134f0ae08b5\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-4pj8q"
Mar 09 13:48:38 crc kubenswrapper[4764]: I0309 13:48:38.565860 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/85b3887c-ea0d-4ca0-a862-0134f0ae08b5-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-4pj8q\" (UID: \"85b3887c-ea0d-4ca0-a862-0134f0ae08b5\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-4pj8q"
Mar 09 13:48:38 crc kubenswrapper[4764]: I0309 13:48:38.566089 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/85b3887c-ea0d-4ca0-a862-0134f0ae08b5-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-4pj8q\" (UID: \"85b3887c-ea0d-4ca0-a862-0134f0ae08b5\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-4pj8q"
Mar 09 13:48:38 crc kubenswrapper[4764]: I0309 13:48:38.566184 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h9cnl\" (UniqueName: \"kubernetes.io/projected/85b3887c-ea0d-4ca0-a862-0134f0ae08b5-kube-api-access-h9cnl\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-4pj8q\" (UID: \"85b3887c-ea0d-4ca0-a862-0134f0ae08b5\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-4pj8q"
Mar 09 13:48:38 crc kubenswrapper[4764]: I0309 13:48:38.571790 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/85b3887c-ea0d-4ca0-a862-0134f0ae08b5-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-4pj8q\" (UID: \"85b3887c-ea0d-4ca0-a862-0134f0ae08b5\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-4pj8q"
Mar 09 13:48:38 crc kubenswrapper[4764]: I0309 13:48:38.574905 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/85b3887c-ea0d-4ca0-a862-0134f0ae08b5-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-4pj8q\" (UID: \"85b3887c-ea0d-4ca0-a862-0134f0ae08b5\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-4pj8q"
Mar 09 13:48:38 crc kubenswrapper[4764]: I0309 13:48:38.592628 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h9cnl\" (UniqueName: \"kubernetes.io/projected/85b3887c-ea0d-4ca0-a862-0134f0ae08b5-kube-api-access-h9cnl\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-4pj8q\" (UID: \"85b3887c-ea0d-4ca0-a862-0134f0ae08b5\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-4pj8q"
Mar 09 13:48:38 crc kubenswrapper[4764]: I0309 13:48:38.666180 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-4pj8q"
Mar 09 13:48:39 crc kubenswrapper[4764]: I0309 13:48:39.235394 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-4pj8q"]
Mar 09 13:48:39 crc kubenswrapper[4764]: I0309 13:48:39.260098 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-4pj8q" event={"ID":"85b3887c-ea0d-4ca0-a862-0134f0ae08b5","Type":"ContainerStarted","Data":"57eae1f53cc5fcb5ec3680529144f8d305dbe72b97f96900aa48da343bdb99f9"}
Mar 09 13:48:41 crc kubenswrapper[4764]: I0309 13:48:41.286912 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-4pj8q" event={"ID":"85b3887c-ea0d-4ca0-a862-0134f0ae08b5","Type":"ContainerStarted","Data":"8097883da308046a256f53d8042acf3d3dddd33e51cc0f93ed512385c36b57c0"}
Mar 09 13:48:41 crc kubenswrapper[4764]: I0309 13:48:41.316364 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-4pj8q" podStartSLOduration=2.5255194530000002 podStartE2EDuration="3.316310071s" podCreationTimestamp="2026-03-09 13:48:38 +0000 UTC" firstStartedPulling="2026-03-09 13:48:39.24884586 +0000 UTC m=+1674.499017768" lastFinishedPulling="2026-03-09 13:48:40.039636468 +0000 UTC m=+1675.289808386" observedRunningTime="2026-03-09 13:48:41.306701245 +0000 UTC m=+1676.556873183" watchObservedRunningTime="2026-03-09 13:48:41.316310071 +0000 UTC m=+1676.566481979"
Mar 09 13:48:46 crc kubenswrapper[4764]: I0309 13:48:46.560188 4764 scope.go:117] "RemoveContainer" containerID="a87b87af769425372dc2009201df4d5e0d5a91fd68dbae94c9002e0b7115dafb"
Mar 09 13:48:46 crc kubenswrapper[4764]: E0309 13:48:46.561379 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xxczl_openshift-machine-config-operator(6bcdd179-43c2-427c-9fac-7155c122e922)\"" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922"
Mar 09 13:48:59 crc kubenswrapper[4764]: I0309 13:48:59.560682 4764 scope.go:117] "RemoveContainer" containerID="a87b87af769425372dc2009201df4d5e0d5a91fd68dbae94c9002e0b7115dafb"
Mar 09 13:48:59 crc kubenswrapper[4764]: E0309 13:48:59.561849 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xxczl_openshift-machine-config-operator(6bcdd179-43c2-427c-9fac-7155c122e922)\"" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922"
Mar 09 13:49:05 crc kubenswrapper[4764]: I0309 13:49:05.786881 4764 scope.go:117] "RemoveContainer" containerID="81d1407cfee85c86ab81cdf9ec86e68979d06f5a586c4a54b9f7e6a537e80948"
Mar 09 13:49:05 crc kubenswrapper[4764]: I0309 13:49:05.810595 4764 scope.go:117] "RemoveContainer" containerID="eb86255e700a4d6357cce0dc3140544f102a32bacc79bd2992bea6bc6385fe75"
Mar 09 13:49:05 crc kubenswrapper[4764]: I0309 13:49:05.832542 4764 scope.go:117] "RemoveContainer" containerID="9f52cd378552f4424230b7014d94477e188c2eb48ce843f7df141a1298245bd6"
Mar 09 13:49:14 crc kubenswrapper[4764]: I0309 13:49:14.560299 4764 scope.go:117] "RemoveContainer" containerID="a87b87af769425372dc2009201df4d5e0d5a91fd68dbae94c9002e0b7115dafb"
Mar 09 13:49:14 crc kubenswrapper[4764]: E0309 13:49:14.561336 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xxczl_openshift-machine-config-operator(6bcdd179-43c2-427c-9fac-7155c122e922)\"" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922"
Mar 09 13:49:25 crc kubenswrapper[4764]: I0309 13:49:25.560617 4764 scope.go:117] "RemoveContainer" containerID="a87b87af769425372dc2009201df4d5e0d5a91fd68dbae94c9002e0b7115dafb"
Mar 09 13:49:25 crc kubenswrapper[4764]: E0309 13:49:25.561728 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xxczl_openshift-machine-config-operator(6bcdd179-43c2-427c-9fac-7155c122e922)\"" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922"
Mar 09 13:49:37 crc kubenswrapper[4764]: I0309 13:49:37.560285 4764 scope.go:117] "RemoveContainer" containerID="a87b87af769425372dc2009201df4d5e0d5a91fd68dbae94c9002e0b7115dafb"
Mar 09 13:49:37 crc kubenswrapper[4764]: E0309 13:49:37.561370 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xxczl_openshift-machine-config-operator(6bcdd179-43c2-427c-9fac-7155c122e922)\"" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922"
Mar 09 13:49:47 crc kubenswrapper[4764]: I0309 13:49:47.983230 4764 generic.go:334] "Generic (PLEG): container finished" podID="85b3887c-ea0d-4ca0-a862-0134f0ae08b5" containerID="8097883da308046a256f53d8042acf3d3dddd33e51cc0f93ed512385c36b57c0" exitCode=0
Mar 09 13:49:47 crc kubenswrapper[4764]: I0309 13:49:47.983352 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-4pj8q" event={"ID":"85b3887c-ea0d-4ca0-a862-0134f0ae08b5","Type":"ContainerDied","Data":"8097883da308046a256f53d8042acf3d3dddd33e51cc0f93ed512385c36b57c0"}
Mar 09 13:49:49 crc kubenswrapper[4764]: I0309 13:49:49.443095 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-4pj8q"
Mar 09 13:49:49 crc kubenswrapper[4764]: I0309 13:49:49.567314 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h9cnl\" (UniqueName: \"kubernetes.io/projected/85b3887c-ea0d-4ca0-a862-0134f0ae08b5-kube-api-access-h9cnl\") pod \"85b3887c-ea0d-4ca0-a862-0134f0ae08b5\" (UID: \"85b3887c-ea0d-4ca0-a862-0134f0ae08b5\") "
Mar 09 13:49:49 crc kubenswrapper[4764]: I0309 13:49:49.568253 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/85b3887c-ea0d-4ca0-a862-0134f0ae08b5-ssh-key-openstack-edpm-ipam\") pod \"85b3887c-ea0d-4ca0-a862-0134f0ae08b5\" (UID: \"85b3887c-ea0d-4ca0-a862-0134f0ae08b5\") "
Mar 09 13:49:49 crc kubenswrapper[4764]: I0309 13:49:49.568303 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/85b3887c-ea0d-4ca0-a862-0134f0ae08b5-inventory\") pod \"85b3887c-ea0d-4ca0-a862-0134f0ae08b5\" (UID: \"85b3887c-ea0d-4ca0-a862-0134f0ae08b5\") "
Mar 09 13:49:49 crc kubenswrapper[4764]: I0309 13:49:49.580025 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/85b3887c-ea0d-4ca0-a862-0134f0ae08b5-kube-api-access-h9cnl" (OuterVolumeSpecName: "kube-api-access-h9cnl") pod "85b3887c-ea0d-4ca0-a862-0134f0ae08b5" (UID: "85b3887c-ea0d-4ca0-a862-0134f0ae08b5"). InnerVolumeSpecName "kube-api-access-h9cnl". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 13:49:49 crc kubenswrapper[4764]: I0309 13:49:49.597965 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/85b3887c-ea0d-4ca0-a862-0134f0ae08b5-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "85b3887c-ea0d-4ca0-a862-0134f0ae08b5" (UID: "85b3887c-ea0d-4ca0-a862-0134f0ae08b5"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 13:49:49 crc kubenswrapper[4764]: I0309 13:49:49.605313 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/85b3887c-ea0d-4ca0-a862-0134f0ae08b5-inventory" (OuterVolumeSpecName: "inventory") pod "85b3887c-ea0d-4ca0-a862-0134f0ae08b5" (UID: "85b3887c-ea0d-4ca0-a862-0134f0ae08b5"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 13:49:49 crc kubenswrapper[4764]: I0309 13:49:49.671736 4764 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/85b3887c-ea0d-4ca0-a862-0134f0ae08b5-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\""
Mar 09 13:49:49 crc kubenswrapper[4764]: I0309 13:49:49.671803 4764 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/85b3887c-ea0d-4ca0-a862-0134f0ae08b5-inventory\") on node \"crc\" DevicePath \"\""
Mar 09 13:49:49 crc kubenswrapper[4764]: I0309 13:49:49.671823 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h9cnl\" (UniqueName: \"kubernetes.io/projected/85b3887c-ea0d-4ca0-a862-0134f0ae08b5-kube-api-access-h9cnl\") on node \"crc\" DevicePath \"\""
Mar 09 13:49:50 crc kubenswrapper[4764]: I0309 13:49:50.009601 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-4pj8q" event={"ID":"85b3887c-ea0d-4ca0-a862-0134f0ae08b5","Type":"ContainerDied","Data":"57eae1f53cc5fcb5ec3680529144f8d305dbe72b97f96900aa48da343bdb99f9"}
Mar 09 13:49:50 crc kubenswrapper[4764]: I0309 13:49:50.009682 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="57eae1f53cc5fcb5ec3680529144f8d305dbe72b97f96900aa48da343bdb99f9"
Mar 09 13:49:50 crc kubenswrapper[4764]: I0309 13:49:50.010021 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-4pj8q"
Mar 09 13:49:50 crc kubenswrapper[4764]: I0309 13:49:50.142926 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-g84h2"]
Mar 09 13:49:50 crc kubenswrapper[4764]: E0309 13:49:50.143523 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="85b3887c-ea0d-4ca0-a862-0134f0ae08b5" containerName="configure-network-edpm-deployment-openstack-edpm-ipam"
Mar 09 13:49:50 crc kubenswrapper[4764]: I0309 13:49:50.143550 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="85b3887c-ea0d-4ca0-a862-0134f0ae08b5" containerName="configure-network-edpm-deployment-openstack-edpm-ipam"
Mar 09 13:49:50 crc kubenswrapper[4764]: I0309 13:49:50.149917 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="85b3887c-ea0d-4ca0-a862-0134f0ae08b5" containerName="configure-network-edpm-deployment-openstack-edpm-ipam"
Mar 09 13:49:50 crc kubenswrapper[4764]: I0309 13:49:50.150925 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-g84h2"
Mar 09 13:49:50 crc kubenswrapper[4764]: I0309 13:49:50.153891 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Mar 09 13:49:50 crc kubenswrapper[4764]: I0309 13:49:50.154385 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-ctrj8"
Mar 09 13:49:50 crc kubenswrapper[4764]: I0309 13:49:50.162283 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Mar 09 13:49:50 crc kubenswrapper[4764]: I0309 13:49:50.162711 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Mar 09 13:49:50 crc kubenswrapper[4764]: I0309 13:49:50.191252 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-g84h2"]
Mar 09 13:49:50 crc kubenswrapper[4764]: I0309 13:49:50.285970 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/38445f30-348d-4c11-94c5-81bca885cc36-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-g84h2\" (UID: \"38445f30-348d-4c11-94c5-81bca885cc36\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-g84h2"
Mar 09 13:49:50 crc kubenswrapper[4764]: I0309 13:49:50.286086 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/38445f30-348d-4c11-94c5-81bca885cc36-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-g84h2\" (UID: \"38445f30-348d-4c11-94c5-81bca885cc36\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-g84h2"
Mar 09 13:49:50 crc kubenswrapper[4764]: I0309 13:49:50.286126 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tlf82\" (UniqueName: \"kubernetes.io/projected/38445f30-348d-4c11-94c5-81bca885cc36-kube-api-access-tlf82\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-g84h2\" (UID: \"38445f30-348d-4c11-94c5-81bca885cc36\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-g84h2"
Mar 09 13:49:50 crc kubenswrapper[4764]: I0309 13:49:50.388769 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/38445f30-348d-4c11-94c5-81bca885cc36-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-g84h2\" (UID: \"38445f30-348d-4c11-94c5-81bca885cc36\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-g84h2"
Mar 09 13:49:50 crc kubenswrapper[4764]: I0309 13:49:50.388951 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/38445f30-348d-4c11-94c5-81bca885cc36-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-g84h2\" (UID: \"38445f30-348d-4c11-94c5-81bca885cc36\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-g84h2"
Mar 09 13:49:50 crc kubenswrapper[4764]: I0309 13:49:50.389015 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tlf82\" (UniqueName: \"kubernetes.io/projected/38445f30-348d-4c11-94c5-81bca885cc36-kube-api-access-tlf82\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-g84h2\" (UID: \"38445f30-348d-4c11-94c5-81bca885cc36\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-g84h2"
Mar 09 13:49:50 crc kubenswrapper[4764]: I0309 13:49:50.394224 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/38445f30-348d-4c11-94c5-81bca885cc36-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-g84h2\" (UID: \"38445f30-348d-4c11-94c5-81bca885cc36\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-g84h2"
Mar 09 13:49:50 crc kubenswrapper[4764]: I0309 13:49:50.395549 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/38445f30-348d-4c11-94c5-81bca885cc36-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-g84h2\" (UID: \"38445f30-348d-4c11-94c5-81bca885cc36\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-g84h2"
Mar 09 13:49:50 crc kubenswrapper[4764]: I0309 13:49:50.410317 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tlf82\" (UniqueName: \"kubernetes.io/projected/38445f30-348d-4c11-94c5-81bca885cc36-kube-api-access-tlf82\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-g84h2\" (UID: \"38445f30-348d-4c11-94c5-81bca885cc36\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-g84h2"
Mar 09 13:49:50 crc kubenswrapper[4764]: I0309 13:49:50.481337 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-g84h2"
Mar 09 13:49:51 crc kubenswrapper[4764]: I0309 13:49:51.040854 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-g84h2"]
Mar 09 13:49:51 crc kubenswrapper[4764]: I0309 13:49:51.560464 4764 scope.go:117] "RemoveContainer" containerID="a87b87af769425372dc2009201df4d5e0d5a91fd68dbae94c9002e0b7115dafb"
Mar 09 13:49:51 crc kubenswrapper[4764]: E0309 13:49:51.561373 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xxczl_openshift-machine-config-operator(6bcdd179-43c2-427c-9fac-7155c122e922)\"" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922"
Mar 09 13:49:52 crc kubenswrapper[4764]: I0309 13:49:52.032294 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-g84h2" event={"ID":"38445f30-348d-4c11-94c5-81bca885cc36","Type":"ContainerStarted","Data":"f0aca49c15e8c97fb96eb53f9577a50ff9c6c25e52f97ab9fc13ad081c1cd506"}
Mar 09 13:49:52 crc kubenswrapper[4764]: I0309 13:49:52.032856 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-g84h2" event={"ID":"38445f30-348d-4c11-94c5-81bca885cc36","Type":"ContainerStarted","Data":"31a768dc42ab731fb41bb76184444fdc0ca9a4645ca28028647c2c27cf959d03"}
Mar 09 13:49:52 crc kubenswrapper[4764]: I0309 13:49:52.061733 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-g84h2" podStartSLOduration=1.632767631 podStartE2EDuration="2.061701388s" podCreationTimestamp="2026-03-09 13:49:50 +0000 UTC" firstStartedPulling="2026-03-09 13:49:51.045402218 +0000 UTC m=+1746.295574136" lastFinishedPulling="2026-03-09 13:49:51.474335965 +0000 UTC m=+1746.724507893" observedRunningTime="2026-03-09 13:49:52.04677134 +0000 UTC m=+1747.296943288" watchObservedRunningTime="2026-03-09 13:49:52.061701388 +0000 UTC m=+1747.311873336"
Mar 09 13:49:57 crc kubenswrapper[4764]: I0309 13:49:57.094173 4764 generic.go:334] "Generic (PLEG): container finished" podID="38445f30-348d-4c11-94c5-81bca885cc36" containerID="f0aca49c15e8c97fb96eb53f9577a50ff9c6c25e52f97ab9fc13ad081c1cd506" exitCode=0
Mar 09 13:49:57 crc kubenswrapper[4764]: I0309 13:49:57.094290 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-g84h2" event={"ID":"38445f30-348d-4c11-94c5-81bca885cc36","Type":"ContainerDied","Data":"f0aca49c15e8c97fb96eb53f9577a50ff9c6c25e52f97ab9fc13ad081c1cd506"}
Mar 09 13:49:58 crc kubenswrapper[4764]: I0309 13:49:58.527431 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-g84h2" Mar 09 13:49:58 crc kubenswrapper[4764]: I0309 13:49:58.688030 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/38445f30-348d-4c11-94c5-81bca885cc36-ssh-key-openstack-edpm-ipam\") pod \"38445f30-348d-4c11-94c5-81bca885cc36\" (UID: \"38445f30-348d-4c11-94c5-81bca885cc36\") " Mar 09 13:49:58 crc kubenswrapper[4764]: I0309 13:49:58.688585 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/38445f30-348d-4c11-94c5-81bca885cc36-inventory\") pod \"38445f30-348d-4c11-94c5-81bca885cc36\" (UID: \"38445f30-348d-4c11-94c5-81bca885cc36\") " Mar 09 13:49:58 crc kubenswrapper[4764]: I0309 13:49:58.688745 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tlf82\" (UniqueName: \"kubernetes.io/projected/38445f30-348d-4c11-94c5-81bca885cc36-kube-api-access-tlf82\") pod \"38445f30-348d-4c11-94c5-81bca885cc36\" (UID: \"38445f30-348d-4c11-94c5-81bca885cc36\") " Mar 09 13:49:58 crc kubenswrapper[4764]: I0309 13:49:58.704156 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/38445f30-348d-4c11-94c5-81bca885cc36-kube-api-access-tlf82" (OuterVolumeSpecName: "kube-api-access-tlf82") pod "38445f30-348d-4c11-94c5-81bca885cc36" (UID: "38445f30-348d-4c11-94c5-81bca885cc36"). InnerVolumeSpecName "kube-api-access-tlf82". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:49:58 crc kubenswrapper[4764]: I0309 13:49:58.718003 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/38445f30-348d-4c11-94c5-81bca885cc36-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "38445f30-348d-4c11-94c5-81bca885cc36" (UID: "38445f30-348d-4c11-94c5-81bca885cc36"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:49:58 crc kubenswrapper[4764]: I0309 13:49:58.723834 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/38445f30-348d-4c11-94c5-81bca885cc36-inventory" (OuterVolumeSpecName: "inventory") pod "38445f30-348d-4c11-94c5-81bca885cc36" (UID: "38445f30-348d-4c11-94c5-81bca885cc36"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:49:58 crc kubenswrapper[4764]: I0309 13:49:58.791912 4764 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/38445f30-348d-4c11-94c5-81bca885cc36-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 09 13:49:58 crc kubenswrapper[4764]: I0309 13:49:58.791956 4764 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/38445f30-348d-4c11-94c5-81bca885cc36-inventory\") on node \"crc\" DevicePath \"\"" Mar 09 13:49:58 crc kubenswrapper[4764]: I0309 13:49:58.791965 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tlf82\" (UniqueName: \"kubernetes.io/projected/38445f30-348d-4c11-94c5-81bca885cc36-kube-api-access-tlf82\") on node \"crc\" DevicePath \"\"" Mar 09 13:49:59 crc kubenswrapper[4764]: I0309 13:49:59.125390 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-g84h2" 
event={"ID":"38445f30-348d-4c11-94c5-81bca885cc36","Type":"ContainerDied","Data":"31a768dc42ab731fb41bb76184444fdc0ca9a4645ca28028647c2c27cf959d03"} Mar 09 13:49:59 crc kubenswrapper[4764]: I0309 13:49:59.125464 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="31a768dc42ab731fb41bb76184444fdc0ca9a4645ca28028647c2c27cf959d03" Mar 09 13:49:59 crc kubenswrapper[4764]: I0309 13:49:59.125484 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-g84h2" Mar 09 13:49:59 crc kubenswrapper[4764]: I0309 13:49:59.211809 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-j2fzg"] Mar 09 13:49:59 crc kubenswrapper[4764]: E0309 13:49:59.212499 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="38445f30-348d-4c11-94c5-81bca885cc36" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Mar 09 13:49:59 crc kubenswrapper[4764]: I0309 13:49:59.212526 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="38445f30-348d-4c11-94c5-81bca885cc36" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Mar 09 13:49:59 crc kubenswrapper[4764]: I0309 13:49:59.212841 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="38445f30-348d-4c11-94c5-81bca885cc36" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Mar 09 13:49:59 crc kubenswrapper[4764]: I0309 13:49:59.213780 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-j2fzg" Mar 09 13:49:59 crc kubenswrapper[4764]: I0309 13:49:59.216307 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 09 13:49:59 crc kubenswrapper[4764]: I0309 13:49:59.216736 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 09 13:49:59 crc kubenswrapper[4764]: I0309 13:49:59.217194 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 09 13:49:59 crc kubenswrapper[4764]: I0309 13:49:59.217388 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-ctrj8" Mar 09 13:49:59 crc kubenswrapper[4764]: I0309 13:49:59.230367 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-j2fzg"] Mar 09 13:49:59 crc kubenswrapper[4764]: I0309 13:49:59.305599 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7ee15cfe-dd3c-4cc7-bf8f-b324397f4add-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-j2fzg\" (UID: \"7ee15cfe-dd3c-4cc7-bf8f-b324397f4add\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-j2fzg" Mar 09 13:49:59 crc kubenswrapper[4764]: I0309 13:49:59.305833 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7ee15cfe-dd3c-4cc7-bf8f-b324397f4add-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-j2fzg\" (UID: \"7ee15cfe-dd3c-4cc7-bf8f-b324397f4add\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-j2fzg" Mar 09 13:49:59 crc kubenswrapper[4764]: I0309 13:49:59.305865 4764 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k7snl\" (UniqueName: \"kubernetes.io/projected/7ee15cfe-dd3c-4cc7-bf8f-b324397f4add-kube-api-access-k7snl\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-j2fzg\" (UID: \"7ee15cfe-dd3c-4cc7-bf8f-b324397f4add\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-j2fzg" Mar 09 13:49:59 crc kubenswrapper[4764]: I0309 13:49:59.407716 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7ee15cfe-dd3c-4cc7-bf8f-b324397f4add-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-j2fzg\" (UID: \"7ee15cfe-dd3c-4cc7-bf8f-b324397f4add\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-j2fzg" Mar 09 13:49:59 crc kubenswrapper[4764]: I0309 13:49:59.407761 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k7snl\" (UniqueName: \"kubernetes.io/projected/7ee15cfe-dd3c-4cc7-bf8f-b324397f4add-kube-api-access-k7snl\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-j2fzg\" (UID: \"7ee15cfe-dd3c-4cc7-bf8f-b324397f4add\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-j2fzg" Mar 09 13:49:59 crc kubenswrapper[4764]: I0309 13:49:59.407851 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7ee15cfe-dd3c-4cc7-bf8f-b324397f4add-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-j2fzg\" (UID: \"7ee15cfe-dd3c-4cc7-bf8f-b324397f4add\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-j2fzg" Mar 09 13:49:59 crc kubenswrapper[4764]: I0309 13:49:59.414350 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/7ee15cfe-dd3c-4cc7-bf8f-b324397f4add-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-j2fzg\" (UID: \"7ee15cfe-dd3c-4cc7-bf8f-b324397f4add\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-j2fzg" Mar 09 13:49:59 crc kubenswrapper[4764]: I0309 13:49:59.414561 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7ee15cfe-dd3c-4cc7-bf8f-b324397f4add-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-j2fzg\" (UID: \"7ee15cfe-dd3c-4cc7-bf8f-b324397f4add\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-j2fzg" Mar 09 13:49:59 crc kubenswrapper[4764]: I0309 13:49:59.428252 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k7snl\" (UniqueName: \"kubernetes.io/projected/7ee15cfe-dd3c-4cc7-bf8f-b324397f4add-kube-api-access-k7snl\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-j2fzg\" (UID: \"7ee15cfe-dd3c-4cc7-bf8f-b324397f4add\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-j2fzg" Mar 09 13:49:59 crc kubenswrapper[4764]: I0309 13:49:59.547808 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-j2fzg" Mar 09 13:50:00 crc kubenswrapper[4764]: I0309 13:50:00.106971 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-j2fzg"] Mar 09 13:50:00 crc kubenswrapper[4764]: I0309 13:50:00.139846 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-j2fzg" event={"ID":"7ee15cfe-dd3c-4cc7-bf8f-b324397f4add","Type":"ContainerStarted","Data":"63d45c8e8448e981e4b409a90ffed5471ff40c994d9269a761fcbf3a1ba24c89"} Mar 09 13:50:00 crc kubenswrapper[4764]: I0309 13:50:00.154038 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29551070-x977g"] Mar 09 13:50:00 crc kubenswrapper[4764]: I0309 13:50:00.155877 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551070-x977g" Mar 09 13:50:00 crc kubenswrapper[4764]: I0309 13:50:00.166638 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551070-x977g"] Mar 09 13:50:00 crc kubenswrapper[4764]: I0309 13:50:00.193332 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-4s7mr" Mar 09 13:50:00 crc kubenswrapper[4764]: I0309 13:50:00.193689 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 09 13:50:00 crc kubenswrapper[4764]: I0309 13:50:00.195816 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 09 13:50:00 crc kubenswrapper[4764]: I0309 13:50:00.327206 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7kq22\" (UniqueName: \"kubernetes.io/projected/14ef871d-e371-41df-9380-53505557d7ac-kube-api-access-7kq22\") pod 
\"auto-csr-approver-29551070-x977g\" (UID: \"14ef871d-e371-41df-9380-53505557d7ac\") " pod="openshift-infra/auto-csr-approver-29551070-x977g" Mar 09 13:50:00 crc kubenswrapper[4764]: I0309 13:50:00.430655 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7kq22\" (UniqueName: \"kubernetes.io/projected/14ef871d-e371-41df-9380-53505557d7ac-kube-api-access-7kq22\") pod \"auto-csr-approver-29551070-x977g\" (UID: \"14ef871d-e371-41df-9380-53505557d7ac\") " pod="openshift-infra/auto-csr-approver-29551070-x977g" Mar 09 13:50:00 crc kubenswrapper[4764]: I0309 13:50:00.453210 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7kq22\" (UniqueName: \"kubernetes.io/projected/14ef871d-e371-41df-9380-53505557d7ac-kube-api-access-7kq22\") pod \"auto-csr-approver-29551070-x977g\" (UID: \"14ef871d-e371-41df-9380-53505557d7ac\") " pod="openshift-infra/auto-csr-approver-29551070-x977g" Mar 09 13:50:00 crc kubenswrapper[4764]: I0309 13:50:00.537579 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551070-x977g" Mar 09 13:50:01 crc kubenswrapper[4764]: I0309 13:50:01.016652 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551070-x977g"] Mar 09 13:50:01 crc kubenswrapper[4764]: W0309 13:50:01.020180 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod14ef871d_e371_41df_9380_53505557d7ac.slice/crio-669db91d9f1fcc1767a7b1f486e235588479de39ae56a765a944be467e990bb9 WatchSource:0}: Error finding container 669db91d9f1fcc1767a7b1f486e235588479de39ae56a765a944be467e990bb9: Status 404 returned error can't find the container with id 669db91d9f1fcc1767a7b1f486e235588479de39ae56a765a944be467e990bb9 Mar 09 13:50:01 crc kubenswrapper[4764]: I0309 13:50:01.151655 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551070-x977g" event={"ID":"14ef871d-e371-41df-9380-53505557d7ac","Type":"ContainerStarted","Data":"669db91d9f1fcc1767a7b1f486e235588479de39ae56a765a944be467e990bb9"} Mar 09 13:50:01 crc kubenswrapper[4764]: I0309 13:50:01.153867 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-j2fzg" event={"ID":"7ee15cfe-dd3c-4cc7-bf8f-b324397f4add","Type":"ContainerStarted","Data":"1c7c071cdcf2595562a14fc8211f12dd741cead95f0450c54862b35cde78ac5b"} Mar 09 13:50:01 crc kubenswrapper[4764]: I0309 13:50:01.185411 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-j2fzg" podStartSLOduration=1.609421414 podStartE2EDuration="2.185376753s" podCreationTimestamp="2026-03-09 13:49:59 +0000 UTC" firstStartedPulling="2026-03-09 13:50:00.122509801 +0000 UTC m=+1755.372681709" lastFinishedPulling="2026-03-09 13:50:00.69846514 +0000 UTC m=+1755.948637048" observedRunningTime="2026-03-09 13:50:01.17699197 +0000 
UTC m=+1756.427163898" watchObservedRunningTime="2026-03-09 13:50:01.185376753 +0000 UTC m=+1756.435548671" Mar 09 13:50:02 crc kubenswrapper[4764]: I0309 13:50:02.054559 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-d7e9-account-create-update-n7gsb"] Mar 09 13:50:02 crc kubenswrapper[4764]: I0309 13:50:02.068947 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-d7e9-account-create-update-n7gsb"] Mar 09 13:50:03 crc kubenswrapper[4764]: I0309 13:50:03.038811 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-kn2lh"] Mar 09 13:50:03 crc kubenswrapper[4764]: I0309 13:50:03.052570 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-wkxnp"] Mar 09 13:50:03 crc kubenswrapper[4764]: I0309 13:50:03.061930 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-kn2lh"] Mar 09 13:50:03 crc kubenswrapper[4764]: I0309 13:50:03.070150 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-wkxnp"] Mar 09 13:50:03 crc kubenswrapper[4764]: I0309 13:50:03.182099 4764 generic.go:334] "Generic (PLEG): container finished" podID="14ef871d-e371-41df-9380-53505557d7ac" containerID="e8df8bc509784d8d529ce5673fdcec9ae8d5b4cbcf9d86d0b27b355b5bffd393" exitCode=0 Mar 09 13:50:03 crc kubenswrapper[4764]: I0309 13:50:03.182160 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551070-x977g" event={"ID":"14ef871d-e371-41df-9380-53505557d7ac","Type":"ContainerDied","Data":"e8df8bc509784d8d529ce5673fdcec9ae8d5b4cbcf9d86d0b27b355b5bffd393"} Mar 09 13:50:03 crc kubenswrapper[4764]: I0309 13:50:03.560175 4764 scope.go:117] "RemoveContainer" containerID="a87b87af769425372dc2009201df4d5e0d5a91fd68dbae94c9002e0b7115dafb" Mar 09 13:50:03 crc kubenswrapper[4764]: E0309 13:50:03.560468 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xxczl_openshift-machine-config-operator(6bcdd179-43c2-427c-9fac-7155c122e922)\"" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922" Mar 09 13:50:03 crc kubenswrapper[4764]: I0309 13:50:03.591074 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="693ba99b-99d0-4b09-9f49-9deefe05abac" path="/var/lib/kubelet/pods/693ba99b-99d0-4b09-9f49-9deefe05abac/volumes" Mar 09 13:50:03 crc kubenswrapper[4764]: I0309 13:50:03.591826 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="75f29150-3689-48a6-9248-b6774f85fcd2" path="/var/lib/kubelet/pods/75f29150-3689-48a6-9248-b6774f85fcd2/volumes" Mar 09 13:50:03 crc kubenswrapper[4764]: I0309 13:50:03.592464 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7d681487-9af9-48e3-bb79-569b8c7bf26d" path="/var/lib/kubelet/pods/7d681487-9af9-48e3-bb79-569b8c7bf26d/volumes" Mar 09 13:50:04 crc kubenswrapper[4764]: I0309 13:50:04.040189 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-66ln9"] Mar 09 13:50:04 crc kubenswrapper[4764]: I0309 13:50:04.049855 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-594d-account-create-update-dxsw5"] Mar 09 13:50:04 crc kubenswrapper[4764]: I0309 13:50:04.061881 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-0f8b-account-create-update-mxbcn"] Mar 09 13:50:04 crc kubenswrapper[4764]: I0309 13:50:04.070901 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-0f8b-account-create-update-mxbcn"] Mar 09 13:50:04 crc kubenswrapper[4764]: I0309 13:50:04.078561 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-66ln9"] Mar 09 13:50:04 crc 
kubenswrapper[4764]: I0309 13:50:04.085426 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-594d-account-create-update-dxsw5"] Mar 09 13:50:04 crc kubenswrapper[4764]: I0309 13:50:04.537784 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551070-x977g" Mar 09 13:50:04 crc kubenswrapper[4764]: I0309 13:50:04.635236 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7kq22\" (UniqueName: \"kubernetes.io/projected/14ef871d-e371-41df-9380-53505557d7ac-kube-api-access-7kq22\") pod \"14ef871d-e371-41df-9380-53505557d7ac\" (UID: \"14ef871d-e371-41df-9380-53505557d7ac\") " Mar 09 13:50:04 crc kubenswrapper[4764]: I0309 13:50:04.644100 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/14ef871d-e371-41df-9380-53505557d7ac-kube-api-access-7kq22" (OuterVolumeSpecName: "kube-api-access-7kq22") pod "14ef871d-e371-41df-9380-53505557d7ac" (UID: "14ef871d-e371-41df-9380-53505557d7ac"). InnerVolumeSpecName "kube-api-access-7kq22". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:50:04 crc kubenswrapper[4764]: I0309 13:50:04.738142 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7kq22\" (UniqueName: \"kubernetes.io/projected/14ef871d-e371-41df-9380-53505557d7ac-kube-api-access-7kq22\") on node \"crc\" DevicePath \"\"" Mar 09 13:50:05 crc kubenswrapper[4764]: I0309 13:50:05.207264 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551070-x977g" event={"ID":"14ef871d-e371-41df-9380-53505557d7ac","Type":"ContainerDied","Data":"669db91d9f1fcc1767a7b1f486e235588479de39ae56a765a944be467e990bb9"} Mar 09 13:50:05 crc kubenswrapper[4764]: I0309 13:50:05.207674 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="669db91d9f1fcc1767a7b1f486e235588479de39ae56a765a944be467e990bb9" Mar 09 13:50:05 crc kubenswrapper[4764]: I0309 13:50:05.207357 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551070-x977g" Mar 09 13:50:05 crc kubenswrapper[4764]: I0309 13:50:05.581617 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01e4dc90-6790-447b-ac2a-d2dfcde88d17" path="/var/lib/kubelet/pods/01e4dc90-6790-447b-ac2a-d2dfcde88d17/volumes" Mar 09 13:50:05 crc kubenswrapper[4764]: I0309 13:50:05.583226 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="811ef770-3be6-4f3b-9fc3-dee4df710c4f" path="/var/lib/kubelet/pods/811ef770-3be6-4f3b-9fc3-dee4df710c4f/volumes" Mar 09 13:50:05 crc kubenswrapper[4764]: I0309 13:50:05.585088 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d27c011-b8dd-4f14-9833-413f7a8faf8a" path="/var/lib/kubelet/pods/9d27c011-b8dd-4f14-9833-413f7a8faf8a/volumes" Mar 09 13:50:05 crc kubenswrapper[4764]: I0309 13:50:05.620997 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-infra/auto-csr-approver-29551064-mbrph"] Mar 09 13:50:05 crc kubenswrapper[4764]: I0309 13:50:05.632756 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29551064-mbrph"] Mar 09 13:50:05 crc kubenswrapper[4764]: I0309 13:50:05.976810 4764 scope.go:117] "RemoveContainer" containerID="cb8ff00b99c398bb890b473678e6ce951f3d71cc55317946f469bc84cba9ae54" Mar 09 13:50:06 crc kubenswrapper[4764]: I0309 13:50:06.017490 4764 scope.go:117] "RemoveContainer" containerID="01ef1c726b7e7270c862ba5b9e31c73fecc7bf0ea188cc226f1bf52e6cb5af33" Mar 09 13:50:06 crc kubenswrapper[4764]: I0309 13:50:06.065153 4764 scope.go:117] "RemoveContainer" containerID="6bccf5846329079b445a150aae1cbe1d8637bb12a3644ba9afce7863cefc0fe8" Mar 09 13:50:06 crc kubenswrapper[4764]: I0309 13:50:06.115588 4764 scope.go:117] "RemoveContainer" containerID="98058d7a077d55dcc4dc3081744bc24f8ca7e82793695f8748e2850e19fbd5a3" Mar 09 13:50:06 crc kubenswrapper[4764]: I0309 13:50:06.162766 4764 scope.go:117] "RemoveContainer" containerID="da76ff68a5f6a4d7d50712be1138bbea55547749d69b4508eda52be21fd47ce6" Mar 09 13:50:06 crc kubenswrapper[4764]: I0309 13:50:06.208473 4764 scope.go:117] "RemoveContainer" containerID="6de1cfcabddc41034a2e1c58fb3ef2484991e0985876cbbdc822fe1df3641db1" Mar 09 13:50:07 crc kubenswrapper[4764]: I0309 13:50:07.571873 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="77179ff3-861b-4aab-b1b2-db4d12041264" path="/var/lib/kubelet/pods/77179ff3-861b-4aab-b1b2-db4d12041264/volumes" Mar 09 13:50:14 crc kubenswrapper[4764]: I0309 13:50:14.560518 4764 scope.go:117] "RemoveContainer" containerID="a87b87af769425372dc2009201df4d5e0d5a91fd68dbae94c9002e0b7115dafb" Mar 09 13:50:14 crc kubenswrapper[4764]: E0309 13:50:14.561573 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-xxczl_openshift-machine-config-operator(6bcdd179-43c2-427c-9fac-7155c122e922)\"" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922" Mar 09 13:50:25 crc kubenswrapper[4764]: I0309 13:50:25.052782 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-htpf8"] Mar 09 13:50:25 crc kubenswrapper[4764]: I0309 13:50:25.064978 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-htpf8"] Mar 09 13:50:25 crc kubenswrapper[4764]: I0309 13:50:25.575636 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="88e0f4c9-1553-4aca-83f2-e0461ddf062b" path="/var/lib/kubelet/pods/88e0f4c9-1553-4aca-83f2-e0461ddf062b/volumes" Mar 09 13:50:26 crc kubenswrapper[4764]: I0309 13:50:26.559516 4764 scope.go:117] "RemoveContainer" containerID="a87b87af769425372dc2009201df4d5e0d5a91fd68dbae94c9002e0b7115dafb" Mar 09 13:50:26 crc kubenswrapper[4764]: E0309 13:50:26.560214 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xxczl_openshift-machine-config-operator(6bcdd179-43c2-427c-9fac-7155c122e922)\"" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922" Mar 09 13:50:31 crc kubenswrapper[4764]: I0309 13:50:31.039248 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-thhcb"] Mar 09 13:50:31 crc kubenswrapper[4764]: I0309 13:50:31.048735 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-mqv59"] Mar 09 13:50:31 crc kubenswrapper[4764]: I0309 13:50:31.082680 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-thhcb"] Mar 09 
13:50:31 crc kubenswrapper[4764]: I0309 13:50:31.091900 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-15a9-account-create-update-5s8tj"]
Mar 09 13:50:31 crc kubenswrapper[4764]: I0309 13:50:31.100868 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-mqv59"]
Mar 09 13:50:31 crc kubenswrapper[4764]: I0309 13:50:31.111927 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-15a9-account-create-update-5s8tj"]
Mar 09 13:50:31 crc kubenswrapper[4764]: I0309 13:50:31.570795 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="46124175-b282-444f-8d9c-0397e35cf8ae" path="/var/lib/kubelet/pods/46124175-b282-444f-8d9c-0397e35cf8ae/volumes"
Mar 09 13:50:31 crc kubenswrapper[4764]: I0309 13:50:31.571676 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="642b5df5-dec0-47cc-9595-02b254277452" path="/var/lib/kubelet/pods/642b5df5-dec0-47cc-9595-02b254277452/volumes"
Mar 09 13:50:31 crc kubenswrapper[4764]: I0309 13:50:31.572437 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="82410bc0-aa4c-450d-8fbc-67cfb9dd615b" path="/var/lib/kubelet/pods/82410bc0-aa4c-450d-8fbc-67cfb9dd615b/volumes"
Mar 09 13:50:32 crc kubenswrapper[4764]: I0309 13:50:32.038527 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-a166-account-create-update-kswwc"]
Mar 09 13:50:32 crc kubenswrapper[4764]: I0309 13:50:32.057521 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-gkf9g"]
Mar 09 13:50:32 crc kubenswrapper[4764]: I0309 13:50:32.070006 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-6d6f-account-create-update-qxm8j"]
Mar 09 13:50:32 crc kubenswrapper[4764]: I0309 13:50:32.081806 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-gkf9g"]
Mar 09 13:50:32 crc kubenswrapper[4764]: I0309 13:50:32.090999 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-6d6f-account-create-update-qxm8j"]
Mar 09 13:50:32 crc kubenswrapper[4764]: I0309 13:50:32.100059 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-a166-account-create-update-kswwc"]
Mar 09 13:50:33 crc kubenswrapper[4764]: I0309 13:50:33.571595 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf72fda-56e5-427c-b2d0-8267613d8a9e" path="/var/lib/kubelet/pods/1bf72fda-56e5-427c-b2d0-8267613d8a9e/volumes"
Mar 09 13:50:33 crc kubenswrapper[4764]: I0309 13:50:33.572573 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b86f9b8-6493-4a60-85b3-12057a6a8f65" path="/var/lib/kubelet/pods/5b86f9b8-6493-4a60-85b3-12057a6a8f65/volumes"
Mar 09 13:50:33 crc kubenswrapper[4764]: I0309 13:50:33.574725 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ad7d32c2-ffe4-43d5-8640-6219f863bc2a" path="/var/lib/kubelet/pods/ad7d32c2-ffe4-43d5-8640-6219f863bc2a/volumes"
Mar 09 13:50:34 crc kubenswrapper[4764]: I0309 13:50:34.528933 4764 generic.go:334] "Generic (PLEG): container finished" podID="7ee15cfe-dd3c-4cc7-bf8f-b324397f4add" containerID="1c7c071cdcf2595562a14fc8211f12dd741cead95f0450c54862b35cde78ac5b" exitCode=0
Mar 09 13:50:34 crc kubenswrapper[4764]: I0309 13:50:34.529015 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-j2fzg" event={"ID":"7ee15cfe-dd3c-4cc7-bf8f-b324397f4add","Type":"ContainerDied","Data":"1c7c071cdcf2595562a14fc8211f12dd741cead95f0450c54862b35cde78ac5b"}
Mar 09 13:50:35 crc kubenswrapper[4764]: I0309 13:50:35.036048 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-vzxr2"]
Mar 09 13:50:35 crc kubenswrapper[4764]: I0309 13:50:35.044339 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-vzxr2"]
Mar 09 13:50:35 crc kubenswrapper[4764]: I0309 13:50:35.577229 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="29e20119-f7d3-4b10-82c3-afbfa462c831" path="/var/lib/kubelet/pods/29e20119-f7d3-4b10-82c3-afbfa462c831/volumes"
Mar 09 13:50:35 crc kubenswrapper[4764]: I0309 13:50:35.980354 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-j2fzg"
Mar 09 13:50:36 crc kubenswrapper[4764]: I0309 13:50:36.104269 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7ee15cfe-dd3c-4cc7-bf8f-b324397f4add-inventory\") pod \"7ee15cfe-dd3c-4cc7-bf8f-b324397f4add\" (UID: \"7ee15cfe-dd3c-4cc7-bf8f-b324397f4add\") "
Mar 09 13:50:36 crc kubenswrapper[4764]: I0309 13:50:36.104444 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k7snl\" (UniqueName: \"kubernetes.io/projected/7ee15cfe-dd3c-4cc7-bf8f-b324397f4add-kube-api-access-k7snl\") pod \"7ee15cfe-dd3c-4cc7-bf8f-b324397f4add\" (UID: \"7ee15cfe-dd3c-4cc7-bf8f-b324397f4add\") "
Mar 09 13:50:36 crc kubenswrapper[4764]: I0309 13:50:36.104494 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7ee15cfe-dd3c-4cc7-bf8f-b324397f4add-ssh-key-openstack-edpm-ipam\") pod \"7ee15cfe-dd3c-4cc7-bf8f-b324397f4add\" (UID: \"7ee15cfe-dd3c-4cc7-bf8f-b324397f4add\") "
Mar 09 13:50:36 crc kubenswrapper[4764]: I0309 13:50:36.115165 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7ee15cfe-dd3c-4cc7-bf8f-b324397f4add-kube-api-access-k7snl" (OuterVolumeSpecName: "kube-api-access-k7snl") pod "7ee15cfe-dd3c-4cc7-bf8f-b324397f4add" (UID: "7ee15cfe-dd3c-4cc7-bf8f-b324397f4add"). InnerVolumeSpecName "kube-api-access-k7snl". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 13:50:36 crc kubenswrapper[4764]: E0309 13:50:36.130935 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7ee15cfe-dd3c-4cc7-bf8f-b324397f4add-ssh-key-openstack-edpm-ipam podName:7ee15cfe-dd3c-4cc7-bf8f-b324397f4add nodeName:}" failed. No retries permitted until 2026-03-09 13:50:36.630890801 +0000 UTC m=+1791.881062709 (durationBeforeRetry 500ms). Error: error cleaning subPath mounts for volume "ssh-key-openstack-edpm-ipam" (UniqueName: "kubernetes.io/secret/7ee15cfe-dd3c-4cc7-bf8f-b324397f4add-ssh-key-openstack-edpm-ipam") pod "7ee15cfe-dd3c-4cc7-bf8f-b324397f4add" (UID: "7ee15cfe-dd3c-4cc7-bf8f-b324397f4add") : error deleting /var/lib/kubelet/pods/7ee15cfe-dd3c-4cc7-bf8f-b324397f4add/volume-subpaths: remove /var/lib/kubelet/pods/7ee15cfe-dd3c-4cc7-bf8f-b324397f4add/volume-subpaths: no such file or directory
Mar 09 13:50:36 crc kubenswrapper[4764]: I0309 13:50:36.134039 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7ee15cfe-dd3c-4cc7-bf8f-b324397f4add-inventory" (OuterVolumeSpecName: "inventory") pod "7ee15cfe-dd3c-4cc7-bf8f-b324397f4add" (UID: "7ee15cfe-dd3c-4cc7-bf8f-b324397f4add"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 13:50:36 crc kubenswrapper[4764]: I0309 13:50:36.207898 4764 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7ee15cfe-dd3c-4cc7-bf8f-b324397f4add-inventory\") on node \"crc\" DevicePath \"\""
Mar 09 13:50:36 crc kubenswrapper[4764]: I0309 13:50:36.207937 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k7snl\" (UniqueName: \"kubernetes.io/projected/7ee15cfe-dd3c-4cc7-bf8f-b324397f4add-kube-api-access-k7snl\") on node \"crc\" DevicePath \"\""
Mar 09 13:50:36 crc kubenswrapper[4764]: I0309 13:50:36.551363 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-j2fzg" event={"ID":"7ee15cfe-dd3c-4cc7-bf8f-b324397f4add","Type":"ContainerDied","Data":"63d45c8e8448e981e4b409a90ffed5471ff40c994d9269a761fcbf3a1ba24c89"}
Mar 09 13:50:36 crc kubenswrapper[4764]: I0309 13:50:36.551855 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="63d45c8e8448e981e4b409a90ffed5471ff40c994d9269a761fcbf3a1ba24c89"
Mar 09 13:50:36 crc kubenswrapper[4764]: I0309 13:50:36.552136 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-j2fzg"
Mar 09 13:50:36 crc kubenswrapper[4764]: I0309 13:50:36.658143 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-n4wgh"]
Mar 09 13:50:36 crc kubenswrapper[4764]: E0309 13:50:36.658614 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ee15cfe-dd3c-4cc7-bf8f-b324397f4add" containerName="install-os-edpm-deployment-openstack-edpm-ipam"
Mar 09 13:50:36 crc kubenswrapper[4764]: I0309 13:50:36.658630 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ee15cfe-dd3c-4cc7-bf8f-b324397f4add" containerName="install-os-edpm-deployment-openstack-edpm-ipam"
Mar 09 13:50:36 crc kubenswrapper[4764]: E0309 13:50:36.658660 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="14ef871d-e371-41df-9380-53505557d7ac" containerName="oc"
Mar 09 13:50:36 crc kubenswrapper[4764]: I0309 13:50:36.658667 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="14ef871d-e371-41df-9380-53505557d7ac" containerName="oc"
Mar 09 13:50:36 crc kubenswrapper[4764]: I0309 13:50:36.658879 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="7ee15cfe-dd3c-4cc7-bf8f-b324397f4add" containerName="install-os-edpm-deployment-openstack-edpm-ipam"
Mar 09 13:50:36 crc kubenswrapper[4764]: I0309 13:50:36.658890 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="14ef871d-e371-41df-9380-53505557d7ac" containerName="oc"
Mar 09 13:50:36 crc kubenswrapper[4764]: I0309 13:50:36.659568 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-n4wgh"]
Mar 09 13:50:36 crc kubenswrapper[4764]: I0309 13:50:36.659690 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-n4wgh"
Mar 09 13:50:36 crc kubenswrapper[4764]: I0309 13:50:36.720770 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7ee15cfe-dd3c-4cc7-bf8f-b324397f4add-ssh-key-openstack-edpm-ipam\") pod \"7ee15cfe-dd3c-4cc7-bf8f-b324397f4add\" (UID: \"7ee15cfe-dd3c-4cc7-bf8f-b324397f4add\") "
Mar 09 13:50:36 crc kubenswrapper[4764]: I0309 13:50:36.725454 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7ee15cfe-dd3c-4cc7-bf8f-b324397f4add-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "7ee15cfe-dd3c-4cc7-bf8f-b324397f4add" (UID: "7ee15cfe-dd3c-4cc7-bf8f-b324397f4add"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 13:50:36 crc kubenswrapper[4764]: I0309 13:50:36.825356 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/9fc5b263-ac73-4b6e-8e41-4ed508765c55-ssh-key-openstack-edpm-ipam\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-n4wgh\" (UID: \"9fc5b263-ac73-4b6e-8e41-4ed508765c55\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-n4wgh"
Mar 09 13:50:36 crc kubenswrapper[4764]: I0309 13:50:36.825461 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p4v4b\" (UniqueName: \"kubernetes.io/projected/9fc5b263-ac73-4b6e-8e41-4ed508765c55-kube-api-access-p4v4b\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-n4wgh\" (UID: \"9fc5b263-ac73-4b6e-8e41-4ed508765c55\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-n4wgh"
Mar 09 13:50:36 crc kubenswrapper[4764]: I0309 13:50:36.825536 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9fc5b263-ac73-4b6e-8e41-4ed508765c55-inventory\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-n4wgh\" (UID: \"9fc5b263-ac73-4b6e-8e41-4ed508765c55\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-n4wgh"
Mar 09 13:50:36 crc kubenswrapper[4764]: I0309 13:50:36.825637 4764 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7ee15cfe-dd3c-4cc7-bf8f-b324397f4add-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\""
Mar 09 13:50:36 crc kubenswrapper[4764]: I0309 13:50:36.927376 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/9fc5b263-ac73-4b6e-8e41-4ed508765c55-ssh-key-openstack-edpm-ipam\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-n4wgh\" (UID: \"9fc5b263-ac73-4b6e-8e41-4ed508765c55\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-n4wgh"
Mar 09 13:50:36 crc kubenswrapper[4764]: I0309 13:50:36.927451 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p4v4b\" (UniqueName: \"kubernetes.io/projected/9fc5b263-ac73-4b6e-8e41-4ed508765c55-kube-api-access-p4v4b\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-n4wgh\" (UID: \"9fc5b263-ac73-4b6e-8e41-4ed508765c55\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-n4wgh"
Mar 09 13:50:36 crc kubenswrapper[4764]: I0309 13:50:36.927521 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9fc5b263-ac73-4b6e-8e41-4ed508765c55-inventory\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-n4wgh\" (UID: \"9fc5b263-ac73-4b6e-8e41-4ed508765c55\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-n4wgh"
Mar 09 13:50:36 crc kubenswrapper[4764]: I0309 13:50:36.930887 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9fc5b263-ac73-4b6e-8e41-4ed508765c55-inventory\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-n4wgh\" (UID: \"9fc5b263-ac73-4b6e-8e41-4ed508765c55\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-n4wgh"
Mar 09 13:50:36 crc kubenswrapper[4764]: I0309 13:50:36.931030 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/9fc5b263-ac73-4b6e-8e41-4ed508765c55-ssh-key-openstack-edpm-ipam\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-n4wgh\" (UID: \"9fc5b263-ac73-4b6e-8e41-4ed508765c55\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-n4wgh"
Mar 09 13:50:36 crc kubenswrapper[4764]: I0309 13:50:36.951814 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p4v4b\" (UniqueName: \"kubernetes.io/projected/9fc5b263-ac73-4b6e-8e41-4ed508765c55-kube-api-access-p4v4b\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-n4wgh\" (UID: \"9fc5b263-ac73-4b6e-8e41-4ed508765c55\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-n4wgh"
Mar 09 13:50:37 crc kubenswrapper[4764]: I0309 13:50:37.020837 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-n4wgh"
Mar 09 13:50:37 crc kubenswrapper[4764]: I0309 13:50:37.612473 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-n4wgh"]
Mar 09 13:50:37 crc kubenswrapper[4764]: I0309 13:50:37.641245 4764 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Mar 09 13:50:38 crc kubenswrapper[4764]: I0309 13:50:38.613896 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-n4wgh" event={"ID":"9fc5b263-ac73-4b6e-8e41-4ed508765c55","Type":"ContainerStarted","Data":"3910a772e45d62c401146624cbc27497eee00dea57c88064e7e1af0b907bbfcc"}
Mar 09 13:50:38 crc kubenswrapper[4764]: I0309 13:50:38.614294 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-n4wgh" event={"ID":"9fc5b263-ac73-4b6e-8e41-4ed508765c55","Type":"ContainerStarted","Data":"8d3ad1d27f26f2b3a6ec93499c393cc129f483e4e450e92a9faf68cba3228e37"}
Mar 09 13:50:38 crc kubenswrapper[4764]: I0309 13:50:38.640488 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-n4wgh" podStartSLOduration=2.149468757 podStartE2EDuration="2.640464689s" podCreationTimestamp="2026-03-09 13:50:36 +0000 UTC" firstStartedPulling="2026-03-09 13:50:37.639172 +0000 UTC m=+1792.889343908" lastFinishedPulling="2026-03-09 13:50:38.130167932 +0000 UTC m=+1793.380339840" observedRunningTime="2026-03-09 13:50:38.631982373 +0000 UTC m=+1793.882154281" watchObservedRunningTime="2026-03-09 13:50:38.640464689 +0000 UTC m=+1793.890636597"
Mar 09 13:50:40 crc kubenswrapper[4764]: I0309 13:50:40.560619 4764 scope.go:117] "RemoveContainer" containerID="a87b87af769425372dc2009201df4d5e0d5a91fd68dbae94c9002e0b7115dafb"
Mar 09 13:50:40 crc kubenswrapper[4764]: E0309 13:50:40.561329 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xxczl_openshift-machine-config-operator(6bcdd179-43c2-427c-9fac-7155c122e922)\"" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922"
Mar 09 13:50:41 crc kubenswrapper[4764]: I0309 13:50:41.033247 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-rjx7v"]
Mar 09 13:50:41 crc kubenswrapper[4764]: I0309 13:50:41.041932 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-rjx7v"]
Mar 09 13:50:41 crc kubenswrapper[4764]: I0309 13:50:41.572077 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ea9388c2-526b-49ff-8f42-03ca66ae08dd" path="/var/lib/kubelet/pods/ea9388c2-526b-49ff-8f42-03ca66ae08dd/volumes"
Mar 09 13:50:42 crc kubenswrapper[4764]: I0309 13:50:42.658600 4764 generic.go:334] "Generic (PLEG): container finished" podID="9fc5b263-ac73-4b6e-8e41-4ed508765c55" containerID="3910a772e45d62c401146624cbc27497eee00dea57c88064e7e1af0b907bbfcc" exitCode=0
Mar 09 13:50:42 crc kubenswrapper[4764]: I0309 13:50:42.658728 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-n4wgh" event={"ID":"9fc5b263-ac73-4b6e-8e41-4ed508765c55","Type":"ContainerDied","Data":"3910a772e45d62c401146624cbc27497eee00dea57c88064e7e1af0b907bbfcc"}
Mar 09 13:50:44 crc kubenswrapper[4764]: I0309 13:50:44.127967 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-n4wgh"
Mar 09 13:50:44 crc kubenswrapper[4764]: I0309 13:50:44.312166 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p4v4b\" (UniqueName: \"kubernetes.io/projected/9fc5b263-ac73-4b6e-8e41-4ed508765c55-kube-api-access-p4v4b\") pod \"9fc5b263-ac73-4b6e-8e41-4ed508765c55\" (UID: \"9fc5b263-ac73-4b6e-8e41-4ed508765c55\") "
Mar 09 13:50:44 crc kubenswrapper[4764]: I0309 13:50:44.312488 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/9fc5b263-ac73-4b6e-8e41-4ed508765c55-ssh-key-openstack-edpm-ipam\") pod \"9fc5b263-ac73-4b6e-8e41-4ed508765c55\" (UID: \"9fc5b263-ac73-4b6e-8e41-4ed508765c55\") "
Mar 09 13:50:44 crc kubenswrapper[4764]: I0309 13:50:44.312568 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9fc5b263-ac73-4b6e-8e41-4ed508765c55-inventory\") pod \"9fc5b263-ac73-4b6e-8e41-4ed508765c55\" (UID: \"9fc5b263-ac73-4b6e-8e41-4ed508765c55\") "
Mar 09 13:50:44 crc kubenswrapper[4764]: I0309 13:50:44.319913 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9fc5b263-ac73-4b6e-8e41-4ed508765c55-kube-api-access-p4v4b" (OuterVolumeSpecName: "kube-api-access-p4v4b") pod "9fc5b263-ac73-4b6e-8e41-4ed508765c55" (UID: "9fc5b263-ac73-4b6e-8e41-4ed508765c55"). InnerVolumeSpecName "kube-api-access-p4v4b". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 13:50:44 crc kubenswrapper[4764]: I0309 13:50:44.344322 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9fc5b263-ac73-4b6e-8e41-4ed508765c55-inventory" (OuterVolumeSpecName: "inventory") pod "9fc5b263-ac73-4b6e-8e41-4ed508765c55" (UID: "9fc5b263-ac73-4b6e-8e41-4ed508765c55"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 13:50:44 crc kubenswrapper[4764]: I0309 13:50:44.347945 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9fc5b263-ac73-4b6e-8e41-4ed508765c55-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "9fc5b263-ac73-4b6e-8e41-4ed508765c55" (UID: "9fc5b263-ac73-4b6e-8e41-4ed508765c55"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 13:50:44 crc kubenswrapper[4764]: I0309 13:50:44.415487 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p4v4b\" (UniqueName: \"kubernetes.io/projected/9fc5b263-ac73-4b6e-8e41-4ed508765c55-kube-api-access-p4v4b\") on node \"crc\" DevicePath \"\""
Mar 09 13:50:44 crc kubenswrapper[4764]: I0309 13:50:44.415837 4764 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/9fc5b263-ac73-4b6e-8e41-4ed508765c55-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\""
Mar 09 13:50:44 crc kubenswrapper[4764]: I0309 13:50:44.415991 4764 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9fc5b263-ac73-4b6e-8e41-4ed508765c55-inventory\") on node \"crc\" DevicePath \"\""
Mar 09 13:50:44 crc kubenswrapper[4764]: I0309 13:50:44.684576 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-n4wgh" event={"ID":"9fc5b263-ac73-4b6e-8e41-4ed508765c55","Type":"ContainerDied","Data":"8d3ad1d27f26f2b3a6ec93499c393cc129f483e4e450e92a9faf68cba3228e37"}
Mar 09 13:50:44 crc kubenswrapper[4764]: I0309 13:50:44.684631 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-n4wgh"
Mar 09 13:50:44 crc kubenswrapper[4764]: I0309 13:50:44.684672 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8d3ad1d27f26f2b3a6ec93499c393cc129f483e4e450e92a9faf68cba3228e37"
Mar 09 13:50:44 crc kubenswrapper[4764]: I0309 13:50:44.769981 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-hl84w"]
Mar 09 13:50:44 crc kubenswrapper[4764]: E0309 13:50:44.770676 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9fc5b263-ac73-4b6e-8e41-4ed508765c55" containerName="ceph-hci-pre-edpm-deployment-openstack-edpm-ipam"
Mar 09 13:50:44 crc kubenswrapper[4764]: I0309 13:50:44.770699 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="9fc5b263-ac73-4b6e-8e41-4ed508765c55" containerName="ceph-hci-pre-edpm-deployment-openstack-edpm-ipam"
Mar 09 13:50:44 crc kubenswrapper[4764]: I0309 13:50:44.770920 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="9fc5b263-ac73-4b6e-8e41-4ed508765c55" containerName="ceph-hci-pre-edpm-deployment-openstack-edpm-ipam"
Mar 09 13:50:44 crc kubenswrapper[4764]: I0309 13:50:44.771836 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-hl84w"
Mar 09 13:50:44 crc kubenswrapper[4764]: I0309 13:50:44.775066 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Mar 09 13:50:44 crc kubenswrapper[4764]: I0309 13:50:44.775148 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Mar 09 13:50:44 crc kubenswrapper[4764]: I0309 13:50:44.775351 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Mar 09 13:50:44 crc kubenswrapper[4764]: I0309 13:50:44.775396 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-ctrj8"
Mar 09 13:50:44 crc kubenswrapper[4764]: I0309 13:50:44.781707 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-hl84w"]
Mar 09 13:50:44 crc kubenswrapper[4764]: I0309 13:50:44.826129 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1dbc4eda-5f77-4951-962f-9ed0b1308df0-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-hl84w\" (UID: \"1dbc4eda-5f77-4951-962f-9ed0b1308df0\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-hl84w"
Mar 09 13:50:44 crc kubenswrapper[4764]: I0309 13:50:44.826208 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p475q\" (UniqueName: \"kubernetes.io/projected/1dbc4eda-5f77-4951-962f-9ed0b1308df0-kube-api-access-p475q\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-hl84w\" (UID: \"1dbc4eda-5f77-4951-962f-9ed0b1308df0\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-hl84w"
Mar 09 13:50:44 crc kubenswrapper[4764]: I0309 13:50:44.826379 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/1dbc4eda-5f77-4951-962f-9ed0b1308df0-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-hl84w\" (UID: \"1dbc4eda-5f77-4951-962f-9ed0b1308df0\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-hl84w"
Mar 09 13:50:44 crc kubenswrapper[4764]: I0309 13:50:44.928962 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/1dbc4eda-5f77-4951-962f-9ed0b1308df0-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-hl84w\" (UID: \"1dbc4eda-5f77-4951-962f-9ed0b1308df0\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-hl84w"
Mar 09 13:50:44 crc kubenswrapper[4764]: I0309 13:50:44.929247 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1dbc4eda-5f77-4951-962f-9ed0b1308df0-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-hl84w\" (UID: \"1dbc4eda-5f77-4951-962f-9ed0b1308df0\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-hl84w"
Mar 09 13:50:44 crc kubenswrapper[4764]: I0309 13:50:44.929338 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p475q\" (UniqueName: \"kubernetes.io/projected/1dbc4eda-5f77-4951-962f-9ed0b1308df0-kube-api-access-p475q\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-hl84w\" (UID: \"1dbc4eda-5f77-4951-962f-9ed0b1308df0\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-hl84w"
Mar 09 13:50:44 crc kubenswrapper[4764]: I0309 13:50:44.937053 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1dbc4eda-5f77-4951-962f-9ed0b1308df0-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-hl84w\" (UID: \"1dbc4eda-5f77-4951-962f-9ed0b1308df0\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-hl84w"
Mar 09 13:50:44 crc kubenswrapper[4764]: I0309 13:50:44.939764 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/1dbc4eda-5f77-4951-962f-9ed0b1308df0-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-hl84w\" (UID: \"1dbc4eda-5f77-4951-962f-9ed0b1308df0\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-hl84w"
Mar 09 13:50:44 crc kubenswrapper[4764]: I0309 13:50:44.951250 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p475q\" (UniqueName: \"kubernetes.io/projected/1dbc4eda-5f77-4951-962f-9ed0b1308df0-kube-api-access-p475q\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-hl84w\" (UID: \"1dbc4eda-5f77-4951-962f-9ed0b1308df0\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-hl84w"
Mar 09 13:50:45 crc kubenswrapper[4764]: I0309 13:50:45.099627 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-hl84w"
Mar 09 13:50:45 crc kubenswrapper[4764]: I0309 13:50:45.672117 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-hl84w"]
Mar 09 13:50:45 crc kubenswrapper[4764]: I0309 13:50:45.696584 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-hl84w" event={"ID":"1dbc4eda-5f77-4951-962f-9ed0b1308df0","Type":"ContainerStarted","Data":"785c9329b6d9887145f8b240e44234fa1c35f72d9b221c2ae8340f8c7636b913"}
Mar 09 13:50:46 crc kubenswrapper[4764]: I0309 13:50:46.110271 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Mar 09 13:50:46 crc kubenswrapper[4764]: I0309 13:50:46.709689 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-hl84w" event={"ID":"1dbc4eda-5f77-4951-962f-9ed0b1308df0","Type":"ContainerStarted","Data":"b4cf48a07b982624103805e852925a23f44a6c1f17fbc126f6f6ed00345f5ccd"}
Mar 09 13:50:46 crc kubenswrapper[4764]: I0309 13:50:46.743202 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-hl84w" podStartSLOduration=2.315529596 podStartE2EDuration="2.743175559s" podCreationTimestamp="2026-03-09 13:50:44 +0000 UTC" firstStartedPulling="2026-03-09 13:50:45.680063502 +0000 UTC m=+1800.930235400" lastFinishedPulling="2026-03-09 13:50:46.107709455 +0000 UTC m=+1801.357881363" observedRunningTime="2026-03-09 13:50:46.734022045 +0000 UTC m=+1801.984194003" watchObservedRunningTime="2026-03-09 13:50:46.743175559 +0000 UTC m=+1801.993347467"
Mar 09 13:50:52 crc kubenswrapper[4764]: I0309 13:50:52.560160 4764 scope.go:117] "RemoveContainer" containerID="a87b87af769425372dc2009201df4d5e0d5a91fd68dbae94c9002e0b7115dafb"
Mar 09 13:50:52 crc kubenswrapper[4764]: E0309 13:50:52.561445 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xxczl_openshift-machine-config-operator(6bcdd179-43c2-427c-9fac-7155c122e922)\"" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922"
Mar 09 13:51:04 crc kubenswrapper[4764]: I0309 13:51:04.055996 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-cmhtp"]
Mar 09 13:51:04 crc kubenswrapper[4764]: I0309 13:51:04.065914 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-cmhtp"]
Mar 09 13:51:05 crc kubenswrapper[4764]: I0309 13:51:05.569592 4764 scope.go:117] "RemoveContainer" containerID="a87b87af769425372dc2009201df4d5e0d5a91fd68dbae94c9002e0b7115dafb"
Mar 09 13:51:05 crc kubenswrapper[4764]: E0309 13:51:05.570362 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xxczl_openshift-machine-config-operator(6bcdd179-43c2-427c-9fac-7155c122e922)\"" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922"
Mar 09 13:51:05 crc kubenswrapper[4764]: I0309 13:51:05.574996 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="34466abc-30eb-4a0c-b4ea-50b5ab368fa1" path="/var/lib/kubelet/pods/34466abc-30eb-4a0c-b4ea-50b5ab368fa1/volumes"
Mar 09 13:51:06 crc kubenswrapper[4764]: I0309 13:51:06.365343 4764 scope.go:117] "RemoveContainer" containerID="6b1dbed6bf6e61f7c12ad4f9dbf8d714fbaa8de7054f1b548bd8d7d1b560e4d6"
Mar 09 13:51:06 crc kubenswrapper[4764]: I0309 13:51:06.423536 4764 scope.go:117] "RemoveContainer" containerID="6b21bf7421d738d162c19d823aaa8d5749330a4586aa8ead5eb3f5fbb8ebcd4e"
Mar 09 13:51:06 crc kubenswrapper[4764]: I0309 13:51:06.451390 4764 scope.go:117] "RemoveContainer" containerID="339bad8557439d7afa5c329a435cd35a995246fc604b25e4a8e250f1a7b99368"
Mar 09 13:51:06 crc kubenswrapper[4764]: I0309 13:51:06.503759 4764 scope.go:117] "RemoveContainer" containerID="bffbf528e6b05eccf8c0e302264ead1f4f679cc141d19e6aa19cb09ac29fba17"
Mar 09 13:51:06 crc kubenswrapper[4764]: I0309 13:51:06.576446 4764 scope.go:117] "RemoveContainer" containerID="06f1bf05b0fa436ae59020308ffb3c9a4a73f8d30f844f8ee3828c34f9dbc702"
Mar 09 13:51:06 crc kubenswrapper[4764]: I0309 13:51:06.622392 4764 scope.go:117] "RemoveContainer" containerID="cf71be9d097dd827ce8f90129691408cafc7024d1662f6cee61aed9d868f11b5"
Mar 09 13:51:06 crc kubenswrapper[4764]: I0309 13:51:06.651259 4764 scope.go:117] "RemoveContainer" containerID="ccd695e4102eac8ad1bb7aec25a2c1252558ea5824f39855b788c7a3324a1c3b"
Mar 09 13:51:06 crc kubenswrapper[4764]: I0309 13:51:06.675206 4764 scope.go:117] "RemoveContainer" containerID="77b865a9fe4889c82ec1f58b4a21addc379165d3a6897c87913c6042aa1f357b"
Mar 09 13:51:06 crc kubenswrapper[4764]: I0309 13:51:06.723451 4764 scope.go:117] "RemoveContainer" containerID="63ff43316b127bbf8f65a169da3d3d1192d7c9af3a1493dff44991331dc6c723"
Mar 09 13:51:06 crc kubenswrapper[4764]: I0309 13:51:06.751960 4764 scope.go:117] "RemoveContainer" containerID="da404af8a75d74966ac6aec712c910704da97b03d85b23af80911b87587e1ef7"
Mar 09 13:51:06 crc kubenswrapper[4764]: I0309 13:51:06.777370 4764 scope.go:117] "RemoveContainer" containerID="8799a3258f6d77e33d1068648c90a3371654d54435ab51ff0aa268e01baf2e9b"
Mar 09 13:51:09 crc kubenswrapper[4764]: I0309 13:51:09.035876 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-bnpcj"]
Mar 09 13:51:09 crc kubenswrapper[4764]: I0309 13:51:09.043931 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-bnpcj"]
Mar 09 13:51:09 crc kubenswrapper[4764]: I0309 13:51:09.581302 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1004910c-0db4-4e3d-aac5-358a557ee268" path="/var/lib/kubelet/pods/1004910c-0db4-4e3d-aac5-358a557ee268/volumes"
Mar 09 13:51:11 crc kubenswrapper[4764]: I0309 13:51:11.044554 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-9sj6m"]
Mar 09 13:51:11 crc kubenswrapper[4764]: I0309 13:51:11.056045 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-9sj6m"]
Mar 09 13:51:11 crc kubenswrapper[4764]: I0309 13:51:11.580946 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2a338463-1443-4863-830e-0621abc3ed15" path="/var/lib/kubelet/pods/2a338463-1443-4863-830e-0621abc3ed15/volumes"
Mar 09 13:51:19 crc kubenswrapper[4764]: I0309 13:51:19.559873 4764 scope.go:117] "RemoveContainer" containerID="a87b87af769425372dc2009201df4d5e0d5a91fd68dbae94c9002e0b7115dafb"
Mar 09 13:51:19 crc kubenswrapper[4764]: E0309 13:51:19.560890 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xxczl_openshift-machine-config-operator(6bcdd179-43c2-427c-9fac-7155c122e922)\"" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922"
Mar 09 13:51:22 crc kubenswrapper[4764]: I0309 13:51:22.048889 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-x9gvc"]
Mar 09 13:51:22 crc kubenswrapper[4764]: I0309 13:51:22.057424 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-x9gvc"]
Mar 09 13:51:23 crc kubenswrapper[4764]: I0309 13:51:23.573492 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cb54f57d-afb6-4e53-be9a-4b22573a9450" path="/var/lib/kubelet/pods/cb54f57d-afb6-4e53-be9a-4b22573a9450/volumes"
Mar 09 13:51:30 crc kubenswrapper[4764]: I0309 13:51:30.039140 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-dp5x6"]
Mar 09 13:51:30 crc kubenswrapper[4764]: I0309 13:51:30.049010 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-dp5x6"]
Mar 09 13:51:31 crc kubenswrapper[4764]: I0309 13:51:31.166714 4764 generic.go:334] "Generic (PLEG): container finished" podID="1dbc4eda-5f77-4951-962f-9ed0b1308df0" containerID="b4cf48a07b982624103805e852925a23f44a6c1f17fbc126f6f6ed00345f5ccd" exitCode=0
Mar 09 13:51:31 crc kubenswrapper[4764]: I0309 13:51:31.166784 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-hl84w" event={"ID":"1dbc4eda-5f77-4951-962f-9ed0b1308df0","Type":"ContainerDied","Data":"b4cf48a07b982624103805e852925a23f44a6c1f17fbc126f6f6ed00345f5ccd"}
Mar 09 13:51:31 crc kubenswrapper[4764]: I0309 13:51:31.573760 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="74146b7d-9780-4d2d-9454-853296f88955" path="/var/lib/kubelet/pods/74146b7d-9780-4d2d-9454-853296f88955/volumes"
Mar 09 13:51:32 crc kubenswrapper[4764]: I0309 13:51:32.676486 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-hl84w"
Mar 09 13:51:32 crc kubenswrapper[4764]: I0309 13:51:32.732167 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1dbc4eda-5f77-4951-962f-9ed0b1308df0-inventory\") pod \"1dbc4eda-5f77-4951-962f-9ed0b1308df0\" (UID: \"1dbc4eda-5f77-4951-962f-9ed0b1308df0\") "
Mar 09 13:51:32 crc kubenswrapper[4764]: I0309 13:51:32.732744 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/1dbc4eda-5f77-4951-962f-9ed0b1308df0-ssh-key-openstack-edpm-ipam\") pod \"1dbc4eda-5f77-4951-962f-9ed0b1308df0\" (UID: \"1dbc4eda-5f77-4951-962f-9ed0b1308df0\") "
Mar 09 13:51:32 crc kubenswrapper[4764]: I0309 13:51:32.733213 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p475q\" (UniqueName: \"kubernetes.io/projected/1dbc4eda-5f77-4951-962f-9ed0b1308df0-kube-api-access-p475q\") pod \"1dbc4eda-5f77-4951-962f-9ed0b1308df0\" (UID: \"1dbc4eda-5f77-4951-962f-9ed0b1308df0\") "
Mar 09 13:51:32 crc kubenswrapper[4764]: I0309 13:51:32.740331 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1dbc4eda-5f77-4951-962f-9ed0b1308df0-kube-api-access-p475q" (OuterVolumeSpecName: "kube-api-access-p475q") pod "1dbc4eda-5f77-4951-962f-9ed0b1308df0" (UID: "1dbc4eda-5f77-4951-962f-9ed0b1308df0"). InnerVolumeSpecName "kube-api-access-p475q".
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:51:32 crc kubenswrapper[4764]: I0309 13:51:32.765596 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1dbc4eda-5f77-4951-962f-9ed0b1308df0-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "1dbc4eda-5f77-4951-962f-9ed0b1308df0" (UID: "1dbc4eda-5f77-4951-962f-9ed0b1308df0"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:51:32 crc kubenswrapper[4764]: I0309 13:51:32.765980 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1dbc4eda-5f77-4951-962f-9ed0b1308df0-inventory" (OuterVolumeSpecName: "inventory") pod "1dbc4eda-5f77-4951-962f-9ed0b1308df0" (UID: "1dbc4eda-5f77-4951-962f-9ed0b1308df0"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:51:32 crc kubenswrapper[4764]: I0309 13:51:32.834589 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p475q\" (UniqueName: \"kubernetes.io/projected/1dbc4eda-5f77-4951-962f-9ed0b1308df0-kube-api-access-p475q\") on node \"crc\" DevicePath \"\"" Mar 09 13:51:32 crc kubenswrapper[4764]: I0309 13:51:32.834630 4764 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1dbc4eda-5f77-4951-962f-9ed0b1308df0-inventory\") on node \"crc\" DevicePath \"\"" Mar 09 13:51:32 crc kubenswrapper[4764]: I0309 13:51:32.834655 4764 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/1dbc4eda-5f77-4951-962f-9ed0b1308df0-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 09 13:51:33 crc kubenswrapper[4764]: I0309 13:51:33.191151 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-hl84w" 
event={"ID":"1dbc4eda-5f77-4951-962f-9ed0b1308df0","Type":"ContainerDied","Data":"785c9329b6d9887145f8b240e44234fa1c35f72d9b221c2ae8340f8c7636b913"} Mar 09 13:51:33 crc kubenswrapper[4764]: I0309 13:51:33.191215 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="785c9329b6d9887145f8b240e44234fa1c35f72d9b221c2ae8340f8c7636b913" Mar 09 13:51:33 crc kubenswrapper[4764]: I0309 13:51:33.191237 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-hl84w" Mar 09 13:51:33 crc kubenswrapper[4764]: I0309 13:51:33.282572 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-ljpp5"] Mar 09 13:51:33 crc kubenswrapper[4764]: E0309 13:51:33.283033 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1dbc4eda-5f77-4951-962f-9ed0b1308df0" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Mar 09 13:51:33 crc kubenswrapper[4764]: I0309 13:51:33.283055 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="1dbc4eda-5f77-4951-962f-9ed0b1308df0" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Mar 09 13:51:33 crc kubenswrapper[4764]: I0309 13:51:33.283225 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="1dbc4eda-5f77-4951-962f-9ed0b1308df0" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Mar 09 13:51:33 crc kubenswrapper[4764]: I0309 13:51:33.283957 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-ljpp5" Mar 09 13:51:33 crc kubenswrapper[4764]: I0309 13:51:33.287635 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 09 13:51:33 crc kubenswrapper[4764]: I0309 13:51:33.288562 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-ctrj8" Mar 09 13:51:33 crc kubenswrapper[4764]: I0309 13:51:33.288722 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 09 13:51:33 crc kubenswrapper[4764]: I0309 13:51:33.288946 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 09 13:51:33 crc kubenswrapper[4764]: I0309 13:51:33.299010 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-ljpp5"] Mar 09 13:51:33 crc kubenswrapper[4764]: I0309 13:51:33.347007 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/9ec29e7b-3537-459a-bfb4-acc93e1e5ec2-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-ljpp5\" (UID: \"9ec29e7b-3537-459a-bfb4-acc93e1e5ec2\") " pod="openstack/ssh-known-hosts-edpm-deployment-ljpp5" Mar 09 13:51:33 crc kubenswrapper[4764]: I0309 13:51:33.347091 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8b66q\" (UniqueName: \"kubernetes.io/projected/9ec29e7b-3537-459a-bfb4-acc93e1e5ec2-kube-api-access-8b66q\") pod \"ssh-known-hosts-edpm-deployment-ljpp5\" (UID: \"9ec29e7b-3537-459a-bfb4-acc93e1e5ec2\") " pod="openstack/ssh-known-hosts-edpm-deployment-ljpp5" Mar 09 13:51:33 crc kubenswrapper[4764]: I0309 13:51:33.347684 4764 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/9ec29e7b-3537-459a-bfb4-acc93e1e5ec2-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-ljpp5\" (UID: \"9ec29e7b-3537-459a-bfb4-acc93e1e5ec2\") " pod="openstack/ssh-known-hosts-edpm-deployment-ljpp5" Mar 09 13:51:33 crc kubenswrapper[4764]: I0309 13:51:33.449900 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/9ec29e7b-3537-459a-bfb4-acc93e1e5ec2-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-ljpp5\" (UID: \"9ec29e7b-3537-459a-bfb4-acc93e1e5ec2\") " pod="openstack/ssh-known-hosts-edpm-deployment-ljpp5" Mar 09 13:51:33 crc kubenswrapper[4764]: I0309 13:51:33.450021 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/9ec29e7b-3537-459a-bfb4-acc93e1e5ec2-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-ljpp5\" (UID: \"9ec29e7b-3537-459a-bfb4-acc93e1e5ec2\") " pod="openstack/ssh-known-hosts-edpm-deployment-ljpp5" Mar 09 13:51:33 crc kubenswrapper[4764]: I0309 13:51:33.450120 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8b66q\" (UniqueName: \"kubernetes.io/projected/9ec29e7b-3537-459a-bfb4-acc93e1e5ec2-kube-api-access-8b66q\") pod \"ssh-known-hosts-edpm-deployment-ljpp5\" (UID: \"9ec29e7b-3537-459a-bfb4-acc93e1e5ec2\") " pod="openstack/ssh-known-hosts-edpm-deployment-ljpp5" Mar 09 13:51:33 crc kubenswrapper[4764]: I0309 13:51:33.456024 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/9ec29e7b-3537-459a-bfb4-acc93e1e5ec2-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-ljpp5\" (UID: \"9ec29e7b-3537-459a-bfb4-acc93e1e5ec2\") " pod="openstack/ssh-known-hosts-edpm-deployment-ljpp5" Mar 
09 13:51:33 crc kubenswrapper[4764]: I0309 13:51:33.458179 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/9ec29e7b-3537-459a-bfb4-acc93e1e5ec2-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-ljpp5\" (UID: \"9ec29e7b-3537-459a-bfb4-acc93e1e5ec2\") " pod="openstack/ssh-known-hosts-edpm-deployment-ljpp5" Mar 09 13:51:33 crc kubenswrapper[4764]: I0309 13:51:33.471991 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8b66q\" (UniqueName: \"kubernetes.io/projected/9ec29e7b-3537-459a-bfb4-acc93e1e5ec2-kube-api-access-8b66q\") pod \"ssh-known-hosts-edpm-deployment-ljpp5\" (UID: \"9ec29e7b-3537-459a-bfb4-acc93e1e5ec2\") " pod="openstack/ssh-known-hosts-edpm-deployment-ljpp5" Mar 09 13:51:33 crc kubenswrapper[4764]: I0309 13:51:33.561254 4764 scope.go:117] "RemoveContainer" containerID="a87b87af769425372dc2009201df4d5e0d5a91fd68dbae94c9002e0b7115dafb" Mar 09 13:51:33 crc kubenswrapper[4764]: E0309 13:51:33.561486 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xxczl_openshift-machine-config-operator(6bcdd179-43c2-427c-9fac-7155c122e922)\"" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922" Mar 09 13:51:33 crc kubenswrapper[4764]: I0309 13:51:33.679164 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-ljpp5" Mar 09 13:51:34 crc kubenswrapper[4764]: I0309 13:51:34.239689 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-ljpp5"] Mar 09 13:51:35 crc kubenswrapper[4764]: I0309 13:51:35.215605 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-ljpp5" event={"ID":"9ec29e7b-3537-459a-bfb4-acc93e1e5ec2","Type":"ContainerStarted","Data":"a382abef431f7e8dce93c27f5f5efe18631bdeb380db25088eb96acfd690b493"} Mar 09 13:51:35 crc kubenswrapper[4764]: I0309 13:51:35.216075 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-ljpp5" event={"ID":"9ec29e7b-3537-459a-bfb4-acc93e1e5ec2","Type":"ContainerStarted","Data":"590f3b4e6365bbda04e720f9a91726b9088576faa0e67efa55c191f171559bef"} Mar 09 13:51:35 crc kubenswrapper[4764]: I0309 13:51:35.239304 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ssh-known-hosts-edpm-deployment-ljpp5" podStartSLOduration=1.776795481 podStartE2EDuration="2.23928054s" podCreationTimestamp="2026-03-09 13:51:33 +0000 UTC" firstStartedPulling="2026-03-09 13:51:34.248662562 +0000 UTC m=+1849.498834480" lastFinishedPulling="2026-03-09 13:51:34.711147611 +0000 UTC m=+1849.961319539" observedRunningTime="2026-03-09 13:51:35.237610106 +0000 UTC m=+1850.487782064" watchObservedRunningTime="2026-03-09 13:51:35.23928054 +0000 UTC m=+1850.489452448" Mar 09 13:51:42 crc kubenswrapper[4764]: I0309 13:51:42.289864 4764 generic.go:334] "Generic (PLEG): container finished" podID="9ec29e7b-3537-459a-bfb4-acc93e1e5ec2" containerID="a382abef431f7e8dce93c27f5f5efe18631bdeb380db25088eb96acfd690b493" exitCode=0 Mar 09 13:51:42 crc kubenswrapper[4764]: I0309 13:51:42.290049 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-ljpp5" 
event={"ID":"9ec29e7b-3537-459a-bfb4-acc93e1e5ec2","Type":"ContainerDied","Data":"a382abef431f7e8dce93c27f5f5efe18631bdeb380db25088eb96acfd690b493"} Mar 09 13:51:43 crc kubenswrapper[4764]: I0309 13:51:43.760345 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-ljpp5" Mar 09 13:51:43 crc kubenswrapper[4764]: I0309 13:51:43.801905 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/9ec29e7b-3537-459a-bfb4-acc93e1e5ec2-inventory-0\") pod \"9ec29e7b-3537-459a-bfb4-acc93e1e5ec2\" (UID: \"9ec29e7b-3537-459a-bfb4-acc93e1e5ec2\") " Mar 09 13:51:43 crc kubenswrapper[4764]: I0309 13:51:43.801996 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/9ec29e7b-3537-459a-bfb4-acc93e1e5ec2-ssh-key-openstack-edpm-ipam\") pod \"9ec29e7b-3537-459a-bfb4-acc93e1e5ec2\" (UID: \"9ec29e7b-3537-459a-bfb4-acc93e1e5ec2\") " Mar 09 13:51:43 crc kubenswrapper[4764]: I0309 13:51:43.802179 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8b66q\" (UniqueName: \"kubernetes.io/projected/9ec29e7b-3537-459a-bfb4-acc93e1e5ec2-kube-api-access-8b66q\") pod \"9ec29e7b-3537-459a-bfb4-acc93e1e5ec2\" (UID: \"9ec29e7b-3537-459a-bfb4-acc93e1e5ec2\") " Mar 09 13:51:43 crc kubenswrapper[4764]: I0309 13:51:43.833321 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9ec29e7b-3537-459a-bfb4-acc93e1e5ec2-kube-api-access-8b66q" (OuterVolumeSpecName: "kube-api-access-8b66q") pod "9ec29e7b-3537-459a-bfb4-acc93e1e5ec2" (UID: "9ec29e7b-3537-459a-bfb4-acc93e1e5ec2"). InnerVolumeSpecName "kube-api-access-8b66q". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:51:43 crc kubenswrapper[4764]: I0309 13:51:43.836858 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9ec29e7b-3537-459a-bfb4-acc93e1e5ec2-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "9ec29e7b-3537-459a-bfb4-acc93e1e5ec2" (UID: "9ec29e7b-3537-459a-bfb4-acc93e1e5ec2"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:51:43 crc kubenswrapper[4764]: I0309 13:51:43.861307 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9ec29e7b-3537-459a-bfb4-acc93e1e5ec2-inventory-0" (OuterVolumeSpecName: "inventory-0") pod "9ec29e7b-3537-459a-bfb4-acc93e1e5ec2" (UID: "9ec29e7b-3537-459a-bfb4-acc93e1e5ec2"). InnerVolumeSpecName "inventory-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:51:43 crc kubenswrapper[4764]: I0309 13:51:43.905919 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8b66q\" (UniqueName: \"kubernetes.io/projected/9ec29e7b-3537-459a-bfb4-acc93e1e5ec2-kube-api-access-8b66q\") on node \"crc\" DevicePath \"\"" Mar 09 13:51:43 crc kubenswrapper[4764]: I0309 13:51:43.905967 4764 reconciler_common.go:293] "Volume detached for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/9ec29e7b-3537-459a-bfb4-acc93e1e5ec2-inventory-0\") on node \"crc\" DevicePath \"\"" Mar 09 13:51:43 crc kubenswrapper[4764]: I0309 13:51:43.905982 4764 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/9ec29e7b-3537-459a-bfb4-acc93e1e5ec2-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 09 13:51:44 crc kubenswrapper[4764]: I0309 13:51:44.312074 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-ljpp5" 
event={"ID":"9ec29e7b-3537-459a-bfb4-acc93e1e5ec2","Type":"ContainerDied","Data":"590f3b4e6365bbda04e720f9a91726b9088576faa0e67efa55c191f171559bef"} Mar 09 13:51:44 crc kubenswrapper[4764]: I0309 13:51:44.312464 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="590f3b4e6365bbda04e720f9a91726b9088576faa0e67efa55c191f171559bef" Mar 09 13:51:44 crc kubenswrapper[4764]: I0309 13:51:44.312164 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-ljpp5" Mar 09 13:51:44 crc kubenswrapper[4764]: I0309 13:51:44.404461 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-x4txb"] Mar 09 13:51:44 crc kubenswrapper[4764]: E0309 13:51:44.405178 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9ec29e7b-3537-459a-bfb4-acc93e1e5ec2" containerName="ssh-known-hosts-edpm-deployment" Mar 09 13:51:44 crc kubenswrapper[4764]: I0309 13:51:44.405207 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ec29e7b-3537-459a-bfb4-acc93e1e5ec2" containerName="ssh-known-hosts-edpm-deployment" Mar 09 13:51:44 crc kubenswrapper[4764]: I0309 13:51:44.405432 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="9ec29e7b-3537-459a-bfb4-acc93e1e5ec2" containerName="ssh-known-hosts-edpm-deployment" Mar 09 13:51:44 crc kubenswrapper[4764]: I0309 13:51:44.406362 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-x4txb" Mar 09 13:51:44 crc kubenswrapper[4764]: I0309 13:51:44.408633 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 09 13:51:44 crc kubenswrapper[4764]: I0309 13:51:44.408994 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 09 13:51:44 crc kubenswrapper[4764]: I0309 13:51:44.409033 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 09 13:51:44 crc kubenswrapper[4764]: I0309 13:51:44.409345 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-ctrj8" Mar 09 13:51:44 crc kubenswrapper[4764]: I0309 13:51:44.423392 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-x4txb"] Mar 09 13:51:44 crc kubenswrapper[4764]: I0309 13:51:44.537465 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d5f7b3e4-69e8-4529-9973-63b6af6b5e5c-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-x4txb\" (UID: \"d5f7b3e4-69e8-4529-9973-63b6af6b5e5c\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-x4txb" Mar 09 13:51:44 crc kubenswrapper[4764]: I0309 13:51:44.537735 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d5f7b3e4-69e8-4529-9973-63b6af6b5e5c-ssh-key-openstack-edpm-ipam\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-x4txb\" (UID: \"d5f7b3e4-69e8-4529-9973-63b6af6b5e5c\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-x4txb" Mar 09 13:51:44 crc kubenswrapper[4764]: I0309 13:51:44.537803 4764 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lnhps\" (UniqueName: \"kubernetes.io/projected/d5f7b3e4-69e8-4529-9973-63b6af6b5e5c-kube-api-access-lnhps\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-x4txb\" (UID: \"d5f7b3e4-69e8-4529-9973-63b6af6b5e5c\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-x4txb" Mar 09 13:51:44 crc kubenswrapper[4764]: I0309 13:51:44.639427 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d5f7b3e4-69e8-4529-9973-63b6af6b5e5c-ssh-key-openstack-edpm-ipam\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-x4txb\" (UID: \"d5f7b3e4-69e8-4529-9973-63b6af6b5e5c\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-x4txb" Mar 09 13:51:44 crc kubenswrapper[4764]: I0309 13:51:44.639818 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lnhps\" (UniqueName: \"kubernetes.io/projected/d5f7b3e4-69e8-4529-9973-63b6af6b5e5c-kube-api-access-lnhps\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-x4txb\" (UID: \"d5f7b3e4-69e8-4529-9973-63b6af6b5e5c\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-x4txb" Mar 09 13:51:44 crc kubenswrapper[4764]: I0309 13:51:44.640018 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d5f7b3e4-69e8-4529-9973-63b6af6b5e5c-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-x4txb\" (UID: \"d5f7b3e4-69e8-4529-9973-63b6af6b5e5c\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-x4txb" Mar 09 13:51:44 crc kubenswrapper[4764]: I0309 13:51:44.646383 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d5f7b3e4-69e8-4529-9973-63b6af6b5e5c-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-x4txb\" (UID: 
\"d5f7b3e4-69e8-4529-9973-63b6af6b5e5c\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-x4txb" Mar 09 13:51:44 crc kubenswrapper[4764]: I0309 13:51:44.646442 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d5f7b3e4-69e8-4529-9973-63b6af6b5e5c-ssh-key-openstack-edpm-ipam\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-x4txb\" (UID: \"d5f7b3e4-69e8-4529-9973-63b6af6b5e5c\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-x4txb" Mar 09 13:51:44 crc kubenswrapper[4764]: I0309 13:51:44.660067 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lnhps\" (UniqueName: \"kubernetes.io/projected/d5f7b3e4-69e8-4529-9973-63b6af6b5e5c-kube-api-access-lnhps\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-x4txb\" (UID: \"d5f7b3e4-69e8-4529-9973-63b6af6b5e5c\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-x4txb" Mar 09 13:51:44 crc kubenswrapper[4764]: I0309 13:51:44.728585 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-x4txb" Mar 09 13:51:45 crc kubenswrapper[4764]: I0309 13:51:45.316370 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-x4txb"] Mar 09 13:51:45 crc kubenswrapper[4764]: I0309 13:51:45.570138 4764 scope.go:117] "RemoveContainer" containerID="a87b87af769425372dc2009201df4d5e0d5a91fd68dbae94c9002e0b7115dafb" Mar 09 13:51:45 crc kubenswrapper[4764]: E0309 13:51:45.570968 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xxczl_openshift-machine-config-operator(6bcdd179-43c2-427c-9fac-7155c122e922)\"" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922" Mar 09 13:51:45 crc kubenswrapper[4764]: I0309 13:51:45.841110 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 09 13:51:46 crc kubenswrapper[4764]: I0309 13:51:46.352146 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-x4txb" event={"ID":"d5f7b3e4-69e8-4529-9973-63b6af6b5e5c","Type":"ContainerStarted","Data":"b589f4b8fd5031689b4581120afd3a37af8b44cf26e7b85ac17d85bceae4a6f2"} Mar 09 13:51:46 crc kubenswrapper[4764]: I0309 13:51:46.352716 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-x4txb" event={"ID":"d5f7b3e4-69e8-4529-9973-63b6af6b5e5c","Type":"ContainerStarted","Data":"75cbd682df57607b22391ec4a112086032d72c0b62435c3d41b1c5b15dfde7fd"} Mar 09 13:51:46 crc kubenswrapper[4764]: I0309 13:51:46.372381 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-x4txb" 
podStartSLOduration=1.871959952 podStartE2EDuration="2.372358882s" podCreationTimestamp="2026-03-09 13:51:44 +0000 UTC" firstStartedPulling="2026-03-09 13:51:45.337401291 +0000 UTC m=+1860.587573209" lastFinishedPulling="2026-03-09 13:51:45.837800211 +0000 UTC m=+1861.087972139" observedRunningTime="2026-03-09 13:51:46.368987202 +0000 UTC m=+1861.619159110" watchObservedRunningTime="2026-03-09 13:51:46.372358882 +0000 UTC m=+1861.622530790" Mar 09 13:51:53 crc kubenswrapper[4764]: I0309 13:51:53.423511 4764 generic.go:334] "Generic (PLEG): container finished" podID="d5f7b3e4-69e8-4529-9973-63b6af6b5e5c" containerID="b589f4b8fd5031689b4581120afd3a37af8b44cf26e7b85ac17d85bceae4a6f2" exitCode=0 Mar 09 13:51:53 crc kubenswrapper[4764]: I0309 13:51:53.423622 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-x4txb" event={"ID":"d5f7b3e4-69e8-4529-9973-63b6af6b5e5c","Type":"ContainerDied","Data":"b589f4b8fd5031689b4581120afd3a37af8b44cf26e7b85ac17d85bceae4a6f2"} Mar 09 13:51:54 crc kubenswrapper[4764]: I0309 13:51:54.882837 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-x4txb" Mar 09 13:51:54 crc kubenswrapper[4764]: I0309 13:51:54.995772 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lnhps\" (UniqueName: \"kubernetes.io/projected/d5f7b3e4-69e8-4529-9973-63b6af6b5e5c-kube-api-access-lnhps\") pod \"d5f7b3e4-69e8-4529-9973-63b6af6b5e5c\" (UID: \"d5f7b3e4-69e8-4529-9973-63b6af6b5e5c\") " Mar 09 13:51:54 crc kubenswrapper[4764]: I0309 13:51:54.996111 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d5f7b3e4-69e8-4529-9973-63b6af6b5e5c-inventory\") pod \"d5f7b3e4-69e8-4529-9973-63b6af6b5e5c\" (UID: \"d5f7b3e4-69e8-4529-9973-63b6af6b5e5c\") " Mar 09 13:51:54 crc kubenswrapper[4764]: I0309 13:51:54.996151 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d5f7b3e4-69e8-4529-9973-63b6af6b5e5c-ssh-key-openstack-edpm-ipam\") pod \"d5f7b3e4-69e8-4529-9973-63b6af6b5e5c\" (UID: \"d5f7b3e4-69e8-4529-9973-63b6af6b5e5c\") " Mar 09 13:51:55 crc kubenswrapper[4764]: I0309 13:51:55.003522 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d5f7b3e4-69e8-4529-9973-63b6af6b5e5c-kube-api-access-lnhps" (OuterVolumeSpecName: "kube-api-access-lnhps") pod "d5f7b3e4-69e8-4529-9973-63b6af6b5e5c" (UID: "d5f7b3e4-69e8-4529-9973-63b6af6b5e5c"). InnerVolumeSpecName "kube-api-access-lnhps". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:51:55 crc kubenswrapper[4764]: I0309 13:51:55.037208 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d5f7b3e4-69e8-4529-9973-63b6af6b5e5c-inventory" (OuterVolumeSpecName: "inventory") pod "d5f7b3e4-69e8-4529-9973-63b6af6b5e5c" (UID: "d5f7b3e4-69e8-4529-9973-63b6af6b5e5c"). 
InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:51:55 crc kubenswrapper[4764]: I0309 13:51:55.037526 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d5f7b3e4-69e8-4529-9973-63b6af6b5e5c-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "d5f7b3e4-69e8-4529-9973-63b6af6b5e5c" (UID: "d5f7b3e4-69e8-4529-9973-63b6af6b5e5c"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:51:55 crc kubenswrapper[4764]: I0309 13:51:55.099499 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lnhps\" (UniqueName: \"kubernetes.io/projected/d5f7b3e4-69e8-4529-9973-63b6af6b5e5c-kube-api-access-lnhps\") on node \"crc\" DevicePath \"\"" Mar 09 13:51:55 crc kubenswrapper[4764]: I0309 13:51:55.099572 4764 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d5f7b3e4-69e8-4529-9973-63b6af6b5e5c-inventory\") on node \"crc\" DevicePath \"\"" Mar 09 13:51:55 crc kubenswrapper[4764]: I0309 13:51:55.099594 4764 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d5f7b3e4-69e8-4529-9973-63b6af6b5e5c-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 09 13:51:55 crc kubenswrapper[4764]: I0309 13:51:55.452133 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-x4txb" event={"ID":"d5f7b3e4-69e8-4529-9973-63b6af6b5e5c","Type":"ContainerDied","Data":"75cbd682df57607b22391ec4a112086032d72c0b62435c3d41b1c5b15dfde7fd"} Mar 09 13:51:55 crc kubenswrapper[4764]: I0309 13:51:55.452189 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="75cbd682df57607b22391ec4a112086032d72c0b62435c3d41b1c5b15dfde7fd" Mar 09 13:51:55 crc kubenswrapper[4764]: I0309 
13:51:55.452238 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-x4txb" Mar 09 13:51:55 crc kubenswrapper[4764]: I0309 13:51:55.637610 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-84wxw"] Mar 09 13:51:55 crc kubenswrapper[4764]: E0309 13:51:55.638144 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d5f7b3e4-69e8-4529-9973-63b6af6b5e5c" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Mar 09 13:51:55 crc kubenswrapper[4764]: I0309 13:51:55.638162 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="d5f7b3e4-69e8-4529-9973-63b6af6b5e5c" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Mar 09 13:51:55 crc kubenswrapper[4764]: I0309 13:51:55.638356 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="d5f7b3e4-69e8-4529-9973-63b6af6b5e5c" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Mar 09 13:51:55 crc kubenswrapper[4764]: I0309 13:51:55.639486 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-84wxw" Mar 09 13:51:55 crc kubenswrapper[4764]: I0309 13:51:55.642191 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 09 13:51:55 crc kubenswrapper[4764]: I0309 13:51:55.642218 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 09 13:51:55 crc kubenswrapper[4764]: I0309 13:51:55.642538 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 09 13:51:55 crc kubenswrapper[4764]: I0309 13:51:55.642991 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-ctrj8" Mar 09 13:51:55 crc kubenswrapper[4764]: I0309 13:51:55.656999 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-84wxw"] Mar 09 13:51:55 crc kubenswrapper[4764]: I0309 13:51:55.712337 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-65n7r\" (UniqueName: \"kubernetes.io/projected/c0456561-9f16-4a32-b3ec-6ab6aa808b76-kube-api-access-65n7r\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-84wxw\" (UID: \"c0456561-9f16-4a32-b3ec-6ab6aa808b76\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-84wxw" Mar 09 13:51:55 crc kubenswrapper[4764]: I0309 13:51:55.712425 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c0456561-9f16-4a32-b3ec-6ab6aa808b76-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-84wxw\" (UID: \"c0456561-9f16-4a32-b3ec-6ab6aa808b76\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-84wxw" Mar 09 13:51:55 crc kubenswrapper[4764]: I0309 13:51:55.712661 4764 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/c0456561-9f16-4a32-b3ec-6ab6aa808b76-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-84wxw\" (UID: \"c0456561-9f16-4a32-b3ec-6ab6aa808b76\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-84wxw" Mar 09 13:51:55 crc kubenswrapper[4764]: I0309 13:51:55.814896 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c0456561-9f16-4a32-b3ec-6ab6aa808b76-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-84wxw\" (UID: \"c0456561-9f16-4a32-b3ec-6ab6aa808b76\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-84wxw" Mar 09 13:51:55 crc kubenswrapper[4764]: I0309 13:51:55.815306 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/c0456561-9f16-4a32-b3ec-6ab6aa808b76-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-84wxw\" (UID: \"c0456561-9f16-4a32-b3ec-6ab6aa808b76\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-84wxw" Mar 09 13:51:55 crc kubenswrapper[4764]: I0309 13:51:55.815553 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-65n7r\" (UniqueName: \"kubernetes.io/projected/c0456561-9f16-4a32-b3ec-6ab6aa808b76-kube-api-access-65n7r\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-84wxw\" (UID: \"c0456561-9f16-4a32-b3ec-6ab6aa808b76\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-84wxw" Mar 09 13:51:55 crc kubenswrapper[4764]: I0309 13:51:55.819508 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c0456561-9f16-4a32-b3ec-6ab6aa808b76-inventory\") pod 
\"reboot-os-edpm-deployment-openstack-edpm-ipam-84wxw\" (UID: \"c0456561-9f16-4a32-b3ec-6ab6aa808b76\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-84wxw" Mar 09 13:51:55 crc kubenswrapper[4764]: I0309 13:51:55.822683 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/c0456561-9f16-4a32-b3ec-6ab6aa808b76-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-84wxw\" (UID: \"c0456561-9f16-4a32-b3ec-6ab6aa808b76\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-84wxw" Mar 09 13:51:55 crc kubenswrapper[4764]: I0309 13:51:55.834631 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-65n7r\" (UniqueName: \"kubernetes.io/projected/c0456561-9f16-4a32-b3ec-6ab6aa808b76-kube-api-access-65n7r\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-84wxw\" (UID: \"c0456561-9f16-4a32-b3ec-6ab6aa808b76\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-84wxw" Mar 09 13:51:55 crc kubenswrapper[4764]: I0309 13:51:55.960711 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-84wxw" Mar 09 13:51:56 crc kubenswrapper[4764]: I0309 13:51:56.530927 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-84wxw"] Mar 09 13:51:57 crc kubenswrapper[4764]: I0309 13:51:57.473173 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-84wxw" event={"ID":"c0456561-9f16-4a32-b3ec-6ab6aa808b76","Type":"ContainerStarted","Data":"860e53941ce3fa2c04ed10b54cb5f6fa59f9d4631186670908ce0c966139e37f"} Mar 09 13:51:57 crc kubenswrapper[4764]: I0309 13:51:57.474853 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-84wxw" event={"ID":"c0456561-9f16-4a32-b3ec-6ab6aa808b76","Type":"ContainerStarted","Data":"7d9162164851ca102e807aeca92eff9a13dffecebd98044e7f6059bda0fecc3a"} Mar 09 13:51:57 crc kubenswrapper[4764]: I0309 13:51:57.510843 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-84wxw" podStartSLOduration=2.078802439 podStartE2EDuration="2.510810984s" podCreationTimestamp="2026-03-09 13:51:55 +0000 UTC" firstStartedPulling="2026-03-09 13:51:56.539717397 +0000 UTC m=+1871.789889305" lastFinishedPulling="2026-03-09 13:51:56.971725942 +0000 UTC m=+1872.221897850" observedRunningTime="2026-03-09 13:51:57.500998213 +0000 UTC m=+1872.751170171" watchObservedRunningTime="2026-03-09 13:51:57.510810984 +0000 UTC m=+1872.760982922" Mar 09 13:52:00 crc kubenswrapper[4764]: I0309 13:52:00.142587 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29551072-2q8rg"] Mar 09 13:52:00 crc kubenswrapper[4764]: I0309 13:52:00.144485 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551072-2q8rg" Mar 09 13:52:00 crc kubenswrapper[4764]: I0309 13:52:00.146811 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-4s7mr" Mar 09 13:52:00 crc kubenswrapper[4764]: I0309 13:52:00.147153 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 09 13:52:00 crc kubenswrapper[4764]: I0309 13:52:00.147791 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 09 13:52:00 crc kubenswrapper[4764]: I0309 13:52:00.153414 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551072-2q8rg"] Mar 09 13:52:00 crc kubenswrapper[4764]: I0309 13:52:00.213971 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-czbpz\" (UniqueName: \"kubernetes.io/projected/f422975f-b0ee-4ef9-be32-3aac0003a54d-kube-api-access-czbpz\") pod \"auto-csr-approver-29551072-2q8rg\" (UID: \"f422975f-b0ee-4ef9-be32-3aac0003a54d\") " pod="openshift-infra/auto-csr-approver-29551072-2q8rg" Mar 09 13:52:00 crc kubenswrapper[4764]: I0309 13:52:00.316963 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-czbpz\" (UniqueName: \"kubernetes.io/projected/f422975f-b0ee-4ef9-be32-3aac0003a54d-kube-api-access-czbpz\") pod \"auto-csr-approver-29551072-2q8rg\" (UID: \"f422975f-b0ee-4ef9-be32-3aac0003a54d\") " pod="openshift-infra/auto-csr-approver-29551072-2q8rg" Mar 09 13:52:00 crc kubenswrapper[4764]: I0309 13:52:00.338102 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-czbpz\" (UniqueName: \"kubernetes.io/projected/f422975f-b0ee-4ef9-be32-3aac0003a54d-kube-api-access-czbpz\") pod \"auto-csr-approver-29551072-2q8rg\" (UID: \"f422975f-b0ee-4ef9-be32-3aac0003a54d\") " 
pod="openshift-infra/auto-csr-approver-29551072-2q8rg" Mar 09 13:52:00 crc kubenswrapper[4764]: I0309 13:52:00.479425 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551072-2q8rg" Mar 09 13:52:00 crc kubenswrapper[4764]: I0309 13:52:00.560859 4764 scope.go:117] "RemoveContainer" containerID="a87b87af769425372dc2009201df4d5e0d5a91fd68dbae94c9002e0b7115dafb" Mar 09 13:52:00 crc kubenswrapper[4764]: E0309 13:52:00.561190 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xxczl_openshift-machine-config-operator(6bcdd179-43c2-427c-9fac-7155c122e922)\"" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922" Mar 09 13:52:00 crc kubenswrapper[4764]: I0309 13:52:00.939672 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551072-2q8rg"] Mar 09 13:52:00 crc kubenswrapper[4764]: W0309 13:52:00.946041 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf422975f_b0ee_4ef9_be32_3aac0003a54d.slice/crio-0dbd38e51658ebae4316351941269fda1f723a2cb9d62a11c545d824da270a51 WatchSource:0}: Error finding container 0dbd38e51658ebae4316351941269fda1f723a2cb9d62a11c545d824da270a51: Status 404 returned error can't find the container with id 0dbd38e51658ebae4316351941269fda1f723a2cb9d62a11c545d824da270a51 Mar 09 13:52:01 crc kubenswrapper[4764]: I0309 13:52:01.517849 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551072-2q8rg" event={"ID":"f422975f-b0ee-4ef9-be32-3aac0003a54d","Type":"ContainerStarted","Data":"0dbd38e51658ebae4316351941269fda1f723a2cb9d62a11c545d824da270a51"} Mar 09 13:52:02 crc 
kubenswrapper[4764]: I0309 13:52:02.532207 4764 generic.go:334] "Generic (PLEG): container finished" podID="f422975f-b0ee-4ef9-be32-3aac0003a54d" containerID="c5badba421cb7b57d824a0a46932addc59fbe992b12c15d75cb49304a00e2761" exitCode=0 Mar 09 13:52:02 crc kubenswrapper[4764]: I0309 13:52:02.532334 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551072-2q8rg" event={"ID":"f422975f-b0ee-4ef9-be32-3aac0003a54d","Type":"ContainerDied","Data":"c5badba421cb7b57d824a0a46932addc59fbe992b12c15d75cb49304a00e2761"} Mar 09 13:52:03 crc kubenswrapper[4764]: I0309 13:52:03.937348 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551072-2q8rg" Mar 09 13:52:04 crc kubenswrapper[4764]: I0309 13:52:04.004687 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-czbpz\" (UniqueName: \"kubernetes.io/projected/f422975f-b0ee-4ef9-be32-3aac0003a54d-kube-api-access-czbpz\") pod \"f422975f-b0ee-4ef9-be32-3aac0003a54d\" (UID: \"f422975f-b0ee-4ef9-be32-3aac0003a54d\") " Mar 09 13:52:04 crc kubenswrapper[4764]: I0309 13:52:04.015387 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f422975f-b0ee-4ef9-be32-3aac0003a54d-kube-api-access-czbpz" (OuterVolumeSpecName: "kube-api-access-czbpz") pod "f422975f-b0ee-4ef9-be32-3aac0003a54d" (UID: "f422975f-b0ee-4ef9-be32-3aac0003a54d"). InnerVolumeSpecName "kube-api-access-czbpz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:52:04 crc kubenswrapper[4764]: I0309 13:52:04.107018 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-czbpz\" (UniqueName: \"kubernetes.io/projected/f422975f-b0ee-4ef9-be32-3aac0003a54d-kube-api-access-czbpz\") on node \"crc\" DevicePath \"\"" Mar 09 13:52:04 crc kubenswrapper[4764]: I0309 13:52:04.557592 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551072-2q8rg" event={"ID":"f422975f-b0ee-4ef9-be32-3aac0003a54d","Type":"ContainerDied","Data":"0dbd38e51658ebae4316351941269fda1f723a2cb9d62a11c545d824da270a51"} Mar 09 13:52:04 crc kubenswrapper[4764]: I0309 13:52:04.558218 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0dbd38e51658ebae4316351941269fda1f723a2cb9d62a11c545d824da270a51" Mar 09 13:52:04 crc kubenswrapper[4764]: I0309 13:52:04.557678 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551072-2q8rg" Mar 09 13:52:05 crc kubenswrapper[4764]: I0309 13:52:05.021545 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29551066-q2m88"] Mar 09 13:52:05 crc kubenswrapper[4764]: I0309 13:52:05.030972 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29551066-q2m88"] Mar 09 13:52:05 crc kubenswrapper[4764]: I0309 13:52:05.589034 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2f277802-4cc0-41e2-90f9-a9e2ac441979" path="/var/lib/kubelet/pods/2f277802-4cc0-41e2-90f9-a9e2ac441979/volumes" Mar 09 13:52:06 crc kubenswrapper[4764]: I0309 13:52:06.617552 4764 generic.go:334] "Generic (PLEG): container finished" podID="c0456561-9f16-4a32-b3ec-6ab6aa808b76" containerID="860e53941ce3fa2c04ed10b54cb5f6fa59f9d4631186670908ce0c966139e37f" exitCode=0 Mar 09 13:52:06 crc kubenswrapper[4764]: I0309 13:52:06.617616 4764 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-84wxw" event={"ID":"c0456561-9f16-4a32-b3ec-6ab6aa808b76","Type":"ContainerDied","Data":"860e53941ce3fa2c04ed10b54cb5f6fa59f9d4631186670908ce0c966139e37f"} Mar 09 13:52:07 crc kubenswrapper[4764]: I0309 13:52:07.025684 4764 scope.go:117] "RemoveContainer" containerID="bcb17cd274a1a85d7439c286f89b2351717de890e1de960b1f945d832ef377e9" Mar 09 13:52:07 crc kubenswrapper[4764]: I0309 13:52:07.073191 4764 scope.go:117] "RemoveContainer" containerID="ad8cc18bf0e9a68496606eb3c3aef5b4008faa06bd6343718c7f8c425fd14c07" Mar 09 13:52:07 crc kubenswrapper[4764]: I0309 13:52:07.129520 4764 scope.go:117] "RemoveContainer" containerID="e95470c676ddedadba89efafc3707fbce528908b0bfa879405e032128d81cc49" Mar 09 13:52:07 crc kubenswrapper[4764]: I0309 13:52:07.178231 4764 scope.go:117] "RemoveContainer" containerID="ec23e6a8c58e7f0bc7312e60b63b0c20314c347b59fda500de1bb63ca3c5e6c6" Mar 09 13:52:07 crc kubenswrapper[4764]: I0309 13:52:07.216774 4764 scope.go:117] "RemoveContainer" containerID="1a80c7f5de2d815f10d1b147b52b1c923bf4e3266278afd841a98b5bc66a8ad6" Mar 09 13:52:08 crc kubenswrapper[4764]: I0309 13:52:08.012848 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-84wxw" Mar 09 13:52:08 crc kubenswrapper[4764]: I0309 13:52:08.099804 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-65n7r\" (UniqueName: \"kubernetes.io/projected/c0456561-9f16-4a32-b3ec-6ab6aa808b76-kube-api-access-65n7r\") pod \"c0456561-9f16-4a32-b3ec-6ab6aa808b76\" (UID: \"c0456561-9f16-4a32-b3ec-6ab6aa808b76\") " Mar 09 13:52:08 crc kubenswrapper[4764]: I0309 13:52:08.099902 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/c0456561-9f16-4a32-b3ec-6ab6aa808b76-ssh-key-openstack-edpm-ipam\") pod \"c0456561-9f16-4a32-b3ec-6ab6aa808b76\" (UID: \"c0456561-9f16-4a32-b3ec-6ab6aa808b76\") " Mar 09 13:52:08 crc kubenswrapper[4764]: I0309 13:52:08.100001 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c0456561-9f16-4a32-b3ec-6ab6aa808b76-inventory\") pod \"c0456561-9f16-4a32-b3ec-6ab6aa808b76\" (UID: \"c0456561-9f16-4a32-b3ec-6ab6aa808b76\") " Mar 09 13:52:08 crc kubenswrapper[4764]: I0309 13:52:08.107041 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c0456561-9f16-4a32-b3ec-6ab6aa808b76-kube-api-access-65n7r" (OuterVolumeSpecName: "kube-api-access-65n7r") pod "c0456561-9f16-4a32-b3ec-6ab6aa808b76" (UID: "c0456561-9f16-4a32-b3ec-6ab6aa808b76"). InnerVolumeSpecName "kube-api-access-65n7r". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:52:08 crc kubenswrapper[4764]: I0309 13:52:08.127509 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c0456561-9f16-4a32-b3ec-6ab6aa808b76-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "c0456561-9f16-4a32-b3ec-6ab6aa808b76" (UID: "c0456561-9f16-4a32-b3ec-6ab6aa808b76"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:52:08 crc kubenswrapper[4764]: I0309 13:52:08.133079 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c0456561-9f16-4a32-b3ec-6ab6aa808b76-inventory" (OuterVolumeSpecName: "inventory") pod "c0456561-9f16-4a32-b3ec-6ab6aa808b76" (UID: "c0456561-9f16-4a32-b3ec-6ab6aa808b76"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:52:08 crc kubenswrapper[4764]: I0309 13:52:08.201278 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-65n7r\" (UniqueName: \"kubernetes.io/projected/c0456561-9f16-4a32-b3ec-6ab6aa808b76-kube-api-access-65n7r\") on node \"crc\" DevicePath \"\"" Mar 09 13:52:08 crc kubenswrapper[4764]: I0309 13:52:08.201309 4764 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/c0456561-9f16-4a32-b3ec-6ab6aa808b76-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 09 13:52:08 crc kubenswrapper[4764]: I0309 13:52:08.201323 4764 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c0456561-9f16-4a32-b3ec-6ab6aa808b76-inventory\") on node \"crc\" DevicePath \"\"" Mar 09 13:52:08 crc kubenswrapper[4764]: I0309 13:52:08.643611 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-84wxw" 
event={"ID":"c0456561-9f16-4a32-b3ec-6ab6aa808b76","Type":"ContainerDied","Data":"7d9162164851ca102e807aeca92eff9a13dffecebd98044e7f6059bda0fecc3a"} Mar 09 13:52:08 crc kubenswrapper[4764]: I0309 13:52:08.643689 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7d9162164851ca102e807aeca92eff9a13dffecebd98044e7f6059bda0fecc3a" Mar 09 13:52:08 crc kubenswrapper[4764]: I0309 13:52:08.643770 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-84wxw" Mar 09 13:52:10 crc kubenswrapper[4764]: I0309 13:52:10.034307 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-tchwl"] Mar 09 13:52:10 crc kubenswrapper[4764]: I0309 13:52:10.047883 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-tchwl"] Mar 09 13:52:11 crc kubenswrapper[4764]: I0309 13:52:11.038738 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-tbf9j"] Mar 09 13:52:11 crc kubenswrapper[4764]: I0309 13:52:11.048337 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-mnqg7"] Mar 09 13:52:11 crc kubenswrapper[4764]: I0309 13:52:11.057527 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-f7d8-account-create-update-rp748"] Mar 09 13:52:11 crc kubenswrapper[4764]: I0309 13:52:11.069818 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-mnqg7"] Mar 09 13:52:11 crc kubenswrapper[4764]: I0309 13:52:11.080367 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-tbf9j"] Mar 09 13:52:11 crc kubenswrapper[4764]: I0309 13:52:11.090348 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cc98-account-create-update-sjfqm"] Mar 09 13:52:11 crc kubenswrapper[4764]: I0309 13:52:11.098806 4764 kubelet.go:2437] 
"SyncLoop DELETE" source="api" pods=["openstack/nova-api-78eb-account-create-update-2dqgt"] Mar 09 13:52:11 crc kubenswrapper[4764]: I0309 13:52:11.107921 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-f7d8-account-create-update-rp748"] Mar 09 13:52:11 crc kubenswrapper[4764]: I0309 13:52:11.115776 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cc98-account-create-update-sjfqm"] Mar 09 13:52:11 crc kubenswrapper[4764]: I0309 13:52:11.127106 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-78eb-account-create-update-2dqgt"] Mar 09 13:52:11 crc kubenswrapper[4764]: I0309 13:52:11.572613 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3db8af07-1310-4cd5-be07-3fd062fe89a7" path="/var/lib/kubelet/pods/3db8af07-1310-4cd5-be07-3fd062fe89a7/volumes" Mar 09 13:52:11 crc kubenswrapper[4764]: I0309 13:52:11.573463 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5d6e8b3d-dc6c-4b1a-946f-1392d1a5222c" path="/var/lib/kubelet/pods/5d6e8b3d-dc6c-4b1a-946f-1392d1a5222c/volumes" Mar 09 13:52:11 crc kubenswrapper[4764]: I0309 13:52:11.574243 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="66003ca3-e579-4dab-b714-b5b2baa26bad" path="/var/lib/kubelet/pods/66003ca3-e579-4dab-b714-b5b2baa26bad/volumes" Mar 09 13:52:11 crc kubenswrapper[4764]: I0309 13:52:11.575031 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8fa35355-06e1-403f-9691-92398769ac09" path="/var/lib/kubelet/pods/8fa35355-06e1-403f-9691-92398769ac09/volumes" Mar 09 13:52:11 crc kubenswrapper[4764]: I0309 13:52:11.576183 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a75ea85a-1e66-4e8d-92d7-6f9b766abfda" path="/var/lib/kubelet/pods/a75ea85a-1e66-4e8d-92d7-6f9b766abfda/volumes" Mar 09 13:52:11 crc kubenswrapper[4764]: I0309 13:52:11.576807 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod 
volumes dir" podUID="b5daba6a-a01a-4400-aa87-01f9efd3abd8" path="/var/lib/kubelet/pods/b5daba6a-a01a-4400-aa87-01f9efd3abd8/volumes" Mar 09 13:52:12 crc kubenswrapper[4764]: I0309 13:52:12.560167 4764 scope.go:117] "RemoveContainer" containerID="a87b87af769425372dc2009201df4d5e0d5a91fd68dbae94c9002e0b7115dafb" Mar 09 13:52:12 crc kubenswrapper[4764]: E0309 13:52:12.560535 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xxczl_openshift-machine-config-operator(6bcdd179-43c2-427c-9fac-7155c122e922)\"" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922" Mar 09 13:52:23 crc kubenswrapper[4764]: I0309 13:52:23.559617 4764 scope.go:117] "RemoveContainer" containerID="a87b87af769425372dc2009201df4d5e0d5a91fd68dbae94c9002e0b7115dafb" Mar 09 13:52:23 crc kubenswrapper[4764]: E0309 13:52:23.560826 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xxczl_openshift-machine-config-operator(6bcdd179-43c2-427c-9fac-7155c122e922)\"" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922" Mar 09 13:52:38 crc kubenswrapper[4764]: I0309 13:52:38.560213 4764 scope.go:117] "RemoveContainer" containerID="a87b87af769425372dc2009201df4d5e0d5a91fd68dbae94c9002e0b7115dafb" Mar 09 13:52:38 crc kubenswrapper[4764]: I0309 13:52:38.957672 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" event={"ID":"6bcdd179-43c2-427c-9fac-7155c122e922","Type":"ContainerStarted","Data":"c4a2ad399af00232f256b54f6dba8cd11d56a98b4177187d6164b2b8cca69e65"} Mar 
09 13:52:39 crc kubenswrapper[4764]: I0309 13:52:39.059501 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-kkcml"] Mar 09 13:52:39 crc kubenswrapper[4764]: I0309 13:52:39.071690 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-kkcml"] Mar 09 13:52:39 crc kubenswrapper[4764]: I0309 13:52:39.572341 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c0a40476-ff1d-443d-846f-a54cd956aaa3" path="/var/lib/kubelet/pods/c0a40476-ff1d-443d-846f-a54cd956aaa3/volumes" Mar 09 13:52:54 crc kubenswrapper[4764]: I0309 13:52:54.755234 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-7zckj"] Mar 09 13:52:54 crc kubenswrapper[4764]: E0309 13:52:54.756566 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c0456561-9f16-4a32-b3ec-6ab6aa808b76" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Mar 09 13:52:54 crc kubenswrapper[4764]: I0309 13:52:54.756589 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="c0456561-9f16-4a32-b3ec-6ab6aa808b76" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Mar 09 13:52:54 crc kubenswrapper[4764]: E0309 13:52:54.756611 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f422975f-b0ee-4ef9-be32-3aac0003a54d" containerName="oc" Mar 09 13:52:54 crc kubenswrapper[4764]: I0309 13:52:54.756619 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="f422975f-b0ee-4ef9-be32-3aac0003a54d" containerName="oc" Mar 09 13:52:54 crc kubenswrapper[4764]: I0309 13:52:54.756850 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="f422975f-b0ee-4ef9-be32-3aac0003a54d" containerName="oc" Mar 09 13:52:54 crc kubenswrapper[4764]: I0309 13:52:54.756862 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="c0456561-9f16-4a32-b3ec-6ab6aa808b76" 
containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Mar 09 13:52:54 crc kubenswrapper[4764]: I0309 13:52:54.758487 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-7zckj" Mar 09 13:52:54 crc kubenswrapper[4764]: I0309 13:52:54.768920 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-7zckj"] Mar 09 13:52:54 crc kubenswrapper[4764]: I0309 13:52:54.870421 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/53cd597b-146b-436f-9f54-1fa50726458b-utilities\") pod \"certified-operators-7zckj\" (UID: \"53cd597b-146b-436f-9f54-1fa50726458b\") " pod="openshift-marketplace/certified-operators-7zckj" Mar 09 13:52:54 crc kubenswrapper[4764]: I0309 13:52:54.870520 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/53cd597b-146b-436f-9f54-1fa50726458b-catalog-content\") pod \"certified-operators-7zckj\" (UID: \"53cd597b-146b-436f-9f54-1fa50726458b\") " pod="openshift-marketplace/certified-operators-7zckj" Mar 09 13:52:54 crc kubenswrapper[4764]: I0309 13:52:54.871033 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wtj56\" (UniqueName: \"kubernetes.io/projected/53cd597b-146b-436f-9f54-1fa50726458b-kube-api-access-wtj56\") pod \"certified-operators-7zckj\" (UID: \"53cd597b-146b-436f-9f54-1fa50726458b\") " pod="openshift-marketplace/certified-operators-7zckj" Mar 09 13:52:54 crc kubenswrapper[4764]: I0309 13:52:54.973788 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/53cd597b-146b-436f-9f54-1fa50726458b-utilities\") pod \"certified-operators-7zckj\" (UID: 
\"53cd597b-146b-436f-9f54-1fa50726458b\") " pod="openshift-marketplace/certified-operators-7zckj" Mar 09 13:52:54 crc kubenswrapper[4764]: I0309 13:52:54.973933 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/53cd597b-146b-436f-9f54-1fa50726458b-catalog-content\") pod \"certified-operators-7zckj\" (UID: \"53cd597b-146b-436f-9f54-1fa50726458b\") " pod="openshift-marketplace/certified-operators-7zckj" Mar 09 13:52:54 crc kubenswrapper[4764]: I0309 13:52:54.974023 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wtj56\" (UniqueName: \"kubernetes.io/projected/53cd597b-146b-436f-9f54-1fa50726458b-kube-api-access-wtj56\") pod \"certified-operators-7zckj\" (UID: \"53cd597b-146b-436f-9f54-1fa50726458b\") " pod="openshift-marketplace/certified-operators-7zckj" Mar 09 13:52:54 crc kubenswrapper[4764]: I0309 13:52:54.974431 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/53cd597b-146b-436f-9f54-1fa50726458b-utilities\") pod \"certified-operators-7zckj\" (UID: \"53cd597b-146b-436f-9f54-1fa50726458b\") " pod="openshift-marketplace/certified-operators-7zckj" Mar 09 13:52:54 crc kubenswrapper[4764]: I0309 13:52:54.974610 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/53cd597b-146b-436f-9f54-1fa50726458b-catalog-content\") pod \"certified-operators-7zckj\" (UID: \"53cd597b-146b-436f-9f54-1fa50726458b\") " pod="openshift-marketplace/certified-operators-7zckj" Mar 09 13:52:55 crc kubenswrapper[4764]: I0309 13:52:54.998432 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wtj56\" (UniqueName: \"kubernetes.io/projected/53cd597b-146b-436f-9f54-1fa50726458b-kube-api-access-wtj56\") pod \"certified-operators-7zckj\" (UID: 
\"53cd597b-146b-436f-9f54-1fa50726458b\") " pod="openshift-marketplace/certified-operators-7zckj" Mar 09 13:52:55 crc kubenswrapper[4764]: I0309 13:52:55.084151 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-7zckj" Mar 09 13:52:55 crc kubenswrapper[4764]: I0309 13:52:55.646574 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-7zckj"] Mar 09 13:52:56 crc kubenswrapper[4764]: I0309 13:52:56.129178 4764 generic.go:334] "Generic (PLEG): container finished" podID="53cd597b-146b-436f-9f54-1fa50726458b" containerID="0adb8a88fa46fd13d25ba3d3e14b2b447ec8398ba9dd882cdf16f9bcdadba4c0" exitCode=0 Mar 09 13:52:56 crc kubenswrapper[4764]: I0309 13:52:56.129316 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7zckj" event={"ID":"53cd597b-146b-436f-9f54-1fa50726458b","Type":"ContainerDied","Data":"0adb8a88fa46fd13d25ba3d3e14b2b447ec8398ba9dd882cdf16f9bcdadba4c0"} Mar 09 13:52:56 crc kubenswrapper[4764]: I0309 13:52:56.130362 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7zckj" event={"ID":"53cd597b-146b-436f-9f54-1fa50726458b","Type":"ContainerStarted","Data":"f91c88d7aa8ba5dc934caea9344568d20f4e3ca0531d94b4dcabea782162e573"} Mar 09 13:52:57 crc kubenswrapper[4764]: I0309 13:52:57.141032 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7zckj" event={"ID":"53cd597b-146b-436f-9f54-1fa50726458b","Type":"ContainerStarted","Data":"52679f972d1aa1ea08ab209079a34253c07c25a5a7b269c4910c13b1a6d5748c"} Mar 09 13:52:58 crc kubenswrapper[4764]: I0309 13:52:58.153411 4764 generic.go:334] "Generic (PLEG): container finished" podID="53cd597b-146b-436f-9f54-1fa50726458b" containerID="52679f972d1aa1ea08ab209079a34253c07c25a5a7b269c4910c13b1a6d5748c" exitCode=0 Mar 09 13:52:58 crc kubenswrapper[4764]: I0309 
13:52:58.153479 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7zckj" event={"ID":"53cd597b-146b-436f-9f54-1fa50726458b","Type":"ContainerDied","Data":"52679f972d1aa1ea08ab209079a34253c07c25a5a7b269c4910c13b1a6d5748c"} Mar 09 13:52:59 crc kubenswrapper[4764]: I0309 13:52:59.164534 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7zckj" event={"ID":"53cd597b-146b-436f-9f54-1fa50726458b","Type":"ContainerStarted","Data":"919bab3d178ecf7ae1c880f3441a6903919b868385a924c647dfdacd21b81aa9"} Mar 09 13:52:59 crc kubenswrapper[4764]: I0309 13:52:59.230396 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-7zckj" podStartSLOduration=2.755607779 podStartE2EDuration="5.230368892s" podCreationTimestamp="2026-03-09 13:52:54 +0000 UTC" firstStartedPulling="2026-03-09 13:52:56.131165571 +0000 UTC m=+1931.381337469" lastFinishedPulling="2026-03-09 13:52:58.605926674 +0000 UTC m=+1933.856098582" observedRunningTime="2026-03-09 13:52:59.214013536 +0000 UTC m=+1934.464185464" watchObservedRunningTime="2026-03-09 13:52:59.230368892 +0000 UTC m=+1934.480540820" Mar 09 13:53:03 crc kubenswrapper[4764]: I0309 13:53:03.049141 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-5k46p"] Mar 09 13:53:03 crc kubenswrapper[4764]: I0309 13:53:03.063776 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-nlxjx"] Mar 09 13:53:03 crc kubenswrapper[4764]: I0309 13:53:03.077517 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-5k46p"] Mar 09 13:53:03 crc kubenswrapper[4764]: I0309 13:53:03.091119 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-nlxjx"] Mar 09 13:53:03 crc kubenswrapper[4764]: I0309 13:53:03.575193 4764 kubelet_volumes.go:163] 
"Cleaned up orphaned pod volumes dir" podUID="9f09c604-028e-4965-aef8-6005ae365be9" path="/var/lib/kubelet/pods/9f09c604-028e-4965-aef8-6005ae365be9/volumes" Mar 09 13:53:03 crc kubenswrapper[4764]: I0309 13:53:03.576181 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b60c99da-3ae5-4340-bcb0-870731679c16" path="/var/lib/kubelet/pods/b60c99da-3ae5-4340-bcb0-870731679c16/volumes" Mar 09 13:53:05 crc kubenswrapper[4764]: I0309 13:53:05.085310 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-7zckj" Mar 09 13:53:05 crc kubenswrapper[4764]: I0309 13:53:05.085548 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-7zckj" Mar 09 13:53:05 crc kubenswrapper[4764]: I0309 13:53:05.136118 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-7zckj" Mar 09 13:53:05 crc kubenswrapper[4764]: I0309 13:53:05.291237 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-7zckj" Mar 09 13:53:07 crc kubenswrapper[4764]: I0309 13:53:07.414815 4764 scope.go:117] "RemoveContainer" containerID="83d6da43ad98ba05734f9db9d33ae5993158ca7ef62b6b91c4677e94546cef00" Mar 09 13:53:07 crc kubenswrapper[4764]: I0309 13:53:07.457033 4764 scope.go:117] "RemoveContainer" containerID="68e1ea9724cff446a48722f5d3beb35f95107e4f92a2482a35467b3ee65b6d69" Mar 09 13:53:07 crc kubenswrapper[4764]: I0309 13:53:07.479717 4764 scope.go:117] "RemoveContainer" containerID="d7855c57065ea118e0a7a66a2f52d85558ba845a318ccc1f11c06fba1afae771" Mar 09 13:53:07 crc kubenswrapper[4764]: I0309 13:53:07.542334 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-7zckj"] Mar 09 13:53:07 crc kubenswrapper[4764]: I0309 13:53:07.554913 4764 scope.go:117] "RemoveContainer" 
containerID="2c7f017bce7c92c14d6be609c4899f3333d712b1c6dc036ede4a982ad2477f70" Mar 09 13:53:07 crc kubenswrapper[4764]: I0309 13:53:07.580370 4764 scope.go:117] "RemoveContainer" containerID="ea1b24536350a8debd8d3a73db534419bac8b767469b0a62084cf6812d4fbd33" Mar 09 13:53:07 crc kubenswrapper[4764]: I0309 13:53:07.627539 4764 scope.go:117] "RemoveContainer" containerID="17c99031bb7ad39e3a6bf2e953eeaec9098edceedb7c2ac4d40bb3b9857101a1" Mar 09 13:53:07 crc kubenswrapper[4764]: I0309 13:53:07.670132 4764 scope.go:117] "RemoveContainer" containerID="5ac07a8dfc2c0d6d538f5fde06f5c7806d78d08f4964c253a85bccef288caf3e" Mar 09 13:53:07 crc kubenswrapper[4764]: I0309 13:53:07.697682 4764 scope.go:117] "RemoveContainer" containerID="01f8a44cba8ccff51deb3a73f62a17d6b208f1f8fb17e65f80892f3b0a90b431" Mar 09 13:53:07 crc kubenswrapper[4764]: I0309 13:53:07.738926 4764 scope.go:117] "RemoveContainer" containerID="d053ca844a6ffecdf233cee0796c2188368f4f6b2a5097474b6a14b438549f4b" Mar 09 13:53:08 crc kubenswrapper[4764]: I0309 13:53:08.268693 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-7zckj" podUID="53cd597b-146b-436f-9f54-1fa50726458b" containerName="registry-server" containerID="cri-o://919bab3d178ecf7ae1c880f3441a6903919b868385a924c647dfdacd21b81aa9" gracePeriod=2 Mar 09 13:53:09 crc kubenswrapper[4764]: I0309 13:53:09.233858 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-7zckj" Mar 09 13:53:09 crc kubenswrapper[4764]: I0309 13:53:09.294338 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wtj56\" (UniqueName: \"kubernetes.io/projected/53cd597b-146b-436f-9f54-1fa50726458b-kube-api-access-wtj56\") pod \"53cd597b-146b-436f-9f54-1fa50726458b\" (UID: \"53cd597b-146b-436f-9f54-1fa50726458b\") " Mar 09 13:53:09 crc kubenswrapper[4764]: I0309 13:53:09.294570 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/53cd597b-146b-436f-9f54-1fa50726458b-catalog-content\") pod \"53cd597b-146b-436f-9f54-1fa50726458b\" (UID: \"53cd597b-146b-436f-9f54-1fa50726458b\") " Mar 09 13:53:09 crc kubenswrapper[4764]: I0309 13:53:09.294612 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/53cd597b-146b-436f-9f54-1fa50726458b-utilities\") pod \"53cd597b-146b-436f-9f54-1fa50726458b\" (UID: \"53cd597b-146b-436f-9f54-1fa50726458b\") " Mar 09 13:53:09 crc kubenswrapper[4764]: I0309 13:53:09.296009 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/53cd597b-146b-436f-9f54-1fa50726458b-utilities" (OuterVolumeSpecName: "utilities") pod "53cd597b-146b-436f-9f54-1fa50726458b" (UID: "53cd597b-146b-436f-9f54-1fa50726458b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 13:53:09 crc kubenswrapper[4764]: I0309 13:53:09.299008 4764 generic.go:334] "Generic (PLEG): container finished" podID="53cd597b-146b-436f-9f54-1fa50726458b" containerID="919bab3d178ecf7ae1c880f3441a6903919b868385a924c647dfdacd21b81aa9" exitCode=0 Mar 09 13:53:09 crc kubenswrapper[4764]: I0309 13:53:09.299128 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-7zckj" Mar 09 13:53:09 crc kubenswrapper[4764]: I0309 13:53:09.299128 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7zckj" event={"ID":"53cd597b-146b-436f-9f54-1fa50726458b","Type":"ContainerDied","Data":"919bab3d178ecf7ae1c880f3441a6903919b868385a924c647dfdacd21b81aa9"} Mar 09 13:53:09 crc kubenswrapper[4764]: I0309 13:53:09.299615 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7zckj" event={"ID":"53cd597b-146b-436f-9f54-1fa50726458b","Type":"ContainerDied","Data":"f91c88d7aa8ba5dc934caea9344568d20f4e3ca0531d94b4dcabea782162e573"} Mar 09 13:53:09 crc kubenswrapper[4764]: I0309 13:53:09.299683 4764 scope.go:117] "RemoveContainer" containerID="919bab3d178ecf7ae1c880f3441a6903919b868385a924c647dfdacd21b81aa9" Mar 09 13:53:09 crc kubenswrapper[4764]: I0309 13:53:09.304043 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/53cd597b-146b-436f-9f54-1fa50726458b-kube-api-access-wtj56" (OuterVolumeSpecName: "kube-api-access-wtj56") pod "53cd597b-146b-436f-9f54-1fa50726458b" (UID: "53cd597b-146b-436f-9f54-1fa50726458b"). InnerVolumeSpecName "kube-api-access-wtj56". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:53:09 crc kubenswrapper[4764]: I0309 13:53:09.364625 4764 scope.go:117] "RemoveContainer" containerID="52679f972d1aa1ea08ab209079a34253c07c25a5a7b269c4910c13b1a6d5748c" Mar 09 13:53:09 crc kubenswrapper[4764]: I0309 13:53:09.381883 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/53cd597b-146b-436f-9f54-1fa50726458b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "53cd597b-146b-436f-9f54-1fa50726458b" (UID: "53cd597b-146b-436f-9f54-1fa50726458b"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 13:53:09 crc kubenswrapper[4764]: I0309 13:53:09.388006 4764 scope.go:117] "RemoveContainer" containerID="0adb8a88fa46fd13d25ba3d3e14b2b447ec8398ba9dd882cdf16f9bcdadba4c0" Mar 09 13:53:09 crc kubenswrapper[4764]: I0309 13:53:09.398053 4764 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/53cd597b-146b-436f-9f54-1fa50726458b-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 09 13:53:09 crc kubenswrapper[4764]: I0309 13:53:09.398086 4764 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/53cd597b-146b-436f-9f54-1fa50726458b-utilities\") on node \"crc\" DevicePath \"\"" Mar 09 13:53:09 crc kubenswrapper[4764]: I0309 13:53:09.398099 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wtj56\" (UniqueName: \"kubernetes.io/projected/53cd597b-146b-436f-9f54-1fa50726458b-kube-api-access-wtj56\") on node \"crc\" DevicePath \"\"" Mar 09 13:53:09 crc kubenswrapper[4764]: I0309 13:53:09.431407 4764 scope.go:117] "RemoveContainer" containerID="919bab3d178ecf7ae1c880f3441a6903919b868385a924c647dfdacd21b81aa9" Mar 09 13:53:09 crc kubenswrapper[4764]: E0309 13:53:09.436313 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"919bab3d178ecf7ae1c880f3441a6903919b868385a924c647dfdacd21b81aa9\": container with ID starting with 919bab3d178ecf7ae1c880f3441a6903919b868385a924c647dfdacd21b81aa9 not found: ID does not exist" containerID="919bab3d178ecf7ae1c880f3441a6903919b868385a924c647dfdacd21b81aa9" Mar 09 13:53:09 crc kubenswrapper[4764]: I0309 13:53:09.436369 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"919bab3d178ecf7ae1c880f3441a6903919b868385a924c647dfdacd21b81aa9"} err="failed to get container status 
\"919bab3d178ecf7ae1c880f3441a6903919b868385a924c647dfdacd21b81aa9\": rpc error: code = NotFound desc = could not find container \"919bab3d178ecf7ae1c880f3441a6903919b868385a924c647dfdacd21b81aa9\": container with ID starting with 919bab3d178ecf7ae1c880f3441a6903919b868385a924c647dfdacd21b81aa9 not found: ID does not exist" Mar 09 13:53:09 crc kubenswrapper[4764]: I0309 13:53:09.436401 4764 scope.go:117] "RemoveContainer" containerID="52679f972d1aa1ea08ab209079a34253c07c25a5a7b269c4910c13b1a6d5748c" Mar 09 13:53:09 crc kubenswrapper[4764]: E0309 13:53:09.437134 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"52679f972d1aa1ea08ab209079a34253c07c25a5a7b269c4910c13b1a6d5748c\": container with ID starting with 52679f972d1aa1ea08ab209079a34253c07c25a5a7b269c4910c13b1a6d5748c not found: ID does not exist" containerID="52679f972d1aa1ea08ab209079a34253c07c25a5a7b269c4910c13b1a6d5748c" Mar 09 13:53:09 crc kubenswrapper[4764]: I0309 13:53:09.437160 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"52679f972d1aa1ea08ab209079a34253c07c25a5a7b269c4910c13b1a6d5748c"} err="failed to get container status \"52679f972d1aa1ea08ab209079a34253c07c25a5a7b269c4910c13b1a6d5748c\": rpc error: code = NotFound desc = could not find container \"52679f972d1aa1ea08ab209079a34253c07c25a5a7b269c4910c13b1a6d5748c\": container with ID starting with 52679f972d1aa1ea08ab209079a34253c07c25a5a7b269c4910c13b1a6d5748c not found: ID does not exist" Mar 09 13:53:09 crc kubenswrapper[4764]: I0309 13:53:09.437176 4764 scope.go:117] "RemoveContainer" containerID="0adb8a88fa46fd13d25ba3d3e14b2b447ec8398ba9dd882cdf16f9bcdadba4c0" Mar 09 13:53:09 crc kubenswrapper[4764]: E0309 13:53:09.438869 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"0adb8a88fa46fd13d25ba3d3e14b2b447ec8398ba9dd882cdf16f9bcdadba4c0\": container with ID starting with 0adb8a88fa46fd13d25ba3d3e14b2b447ec8398ba9dd882cdf16f9bcdadba4c0 not found: ID does not exist" containerID="0adb8a88fa46fd13d25ba3d3e14b2b447ec8398ba9dd882cdf16f9bcdadba4c0" Mar 09 13:53:09 crc kubenswrapper[4764]: I0309 13:53:09.438905 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0adb8a88fa46fd13d25ba3d3e14b2b447ec8398ba9dd882cdf16f9bcdadba4c0"} err="failed to get container status \"0adb8a88fa46fd13d25ba3d3e14b2b447ec8398ba9dd882cdf16f9bcdadba4c0\": rpc error: code = NotFound desc = could not find container \"0adb8a88fa46fd13d25ba3d3e14b2b447ec8398ba9dd882cdf16f9bcdadba4c0\": container with ID starting with 0adb8a88fa46fd13d25ba3d3e14b2b447ec8398ba9dd882cdf16f9bcdadba4c0 not found: ID does not exist" Mar 09 13:53:09 crc kubenswrapper[4764]: I0309 13:53:09.637093 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-7zckj"] Mar 09 13:53:09 crc kubenswrapper[4764]: I0309 13:53:09.651899 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-7zckj"] Mar 09 13:53:11 crc kubenswrapper[4764]: I0309 13:53:11.585911 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="53cd597b-146b-436f-9f54-1fa50726458b" path="/var/lib/kubelet/pods/53cd597b-146b-436f-9f54-1fa50726458b/volumes" Mar 09 13:53:48 crc kubenswrapper[4764]: I0309 13:53:48.052412 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-q8h4m"] Mar 09 13:53:48 crc kubenswrapper[4764]: I0309 13:53:48.092787 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-q8h4m"] Mar 09 13:53:49 crc kubenswrapper[4764]: I0309 13:53:49.574225 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d98526d5-8eaa-44a7-a25d-662a4fc8758b" 
path="/var/lib/kubelet/pods/d98526d5-8eaa-44a7-a25d-662a4fc8758b/volumes" Mar 09 13:54:00 crc kubenswrapper[4764]: I0309 13:54:00.163302 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29551074-ztbcz"] Mar 09 13:54:00 crc kubenswrapper[4764]: E0309 13:54:00.165921 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="53cd597b-146b-436f-9f54-1fa50726458b" containerName="registry-server" Mar 09 13:54:00 crc kubenswrapper[4764]: I0309 13:54:00.166044 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="53cd597b-146b-436f-9f54-1fa50726458b" containerName="registry-server" Mar 09 13:54:00 crc kubenswrapper[4764]: E0309 13:54:00.166147 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="53cd597b-146b-436f-9f54-1fa50726458b" containerName="extract-utilities" Mar 09 13:54:00 crc kubenswrapper[4764]: I0309 13:54:00.166224 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="53cd597b-146b-436f-9f54-1fa50726458b" containerName="extract-utilities" Mar 09 13:54:00 crc kubenswrapper[4764]: E0309 13:54:00.166334 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="53cd597b-146b-436f-9f54-1fa50726458b" containerName="extract-content" Mar 09 13:54:00 crc kubenswrapper[4764]: I0309 13:54:00.166412 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="53cd597b-146b-436f-9f54-1fa50726458b" containerName="extract-content" Mar 09 13:54:00 crc kubenswrapper[4764]: I0309 13:54:00.166791 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="53cd597b-146b-436f-9f54-1fa50726458b" containerName="registry-server" Mar 09 13:54:00 crc kubenswrapper[4764]: I0309 13:54:00.168026 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551074-ztbcz" Mar 09 13:54:00 crc kubenswrapper[4764]: I0309 13:54:00.171517 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-4s7mr" Mar 09 13:54:00 crc kubenswrapper[4764]: I0309 13:54:00.172156 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 09 13:54:00 crc kubenswrapper[4764]: I0309 13:54:00.172435 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 09 13:54:00 crc kubenswrapper[4764]: I0309 13:54:00.182786 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551074-ztbcz"] Mar 09 13:54:00 crc kubenswrapper[4764]: I0309 13:54:00.262418 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vj6fm\" (UniqueName: \"kubernetes.io/projected/909c58d5-d4d7-4042-94f0-df77bda9590a-kube-api-access-vj6fm\") pod \"auto-csr-approver-29551074-ztbcz\" (UID: \"909c58d5-d4d7-4042-94f0-df77bda9590a\") " pod="openshift-infra/auto-csr-approver-29551074-ztbcz" Mar 09 13:54:00 crc kubenswrapper[4764]: I0309 13:54:00.365536 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vj6fm\" (UniqueName: \"kubernetes.io/projected/909c58d5-d4d7-4042-94f0-df77bda9590a-kube-api-access-vj6fm\") pod \"auto-csr-approver-29551074-ztbcz\" (UID: \"909c58d5-d4d7-4042-94f0-df77bda9590a\") " pod="openshift-infra/auto-csr-approver-29551074-ztbcz" Mar 09 13:54:00 crc kubenswrapper[4764]: I0309 13:54:00.397198 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vj6fm\" (UniqueName: \"kubernetes.io/projected/909c58d5-d4d7-4042-94f0-df77bda9590a-kube-api-access-vj6fm\") pod \"auto-csr-approver-29551074-ztbcz\" (UID: \"909c58d5-d4d7-4042-94f0-df77bda9590a\") " 
pod="openshift-infra/auto-csr-approver-29551074-ztbcz" Mar 09 13:54:00 crc kubenswrapper[4764]: I0309 13:54:00.507955 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551074-ztbcz" Mar 09 13:54:01 crc kubenswrapper[4764]: I0309 13:54:01.015092 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551074-ztbcz"] Mar 09 13:54:01 crc kubenswrapper[4764]: W0309 13:54:01.024422 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod909c58d5_d4d7_4042_94f0_df77bda9590a.slice/crio-faf0a4667a66e59562a883a2c9fef864e6e3bfde2c7c8428277f182730288329 WatchSource:0}: Error finding container faf0a4667a66e59562a883a2c9fef864e6e3bfde2c7c8428277f182730288329: Status 404 returned error can't find the container with id faf0a4667a66e59562a883a2c9fef864e6e3bfde2c7c8428277f182730288329 Mar 09 13:54:01 crc kubenswrapper[4764]: I0309 13:54:01.837564 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551074-ztbcz" event={"ID":"909c58d5-d4d7-4042-94f0-df77bda9590a","Type":"ContainerStarted","Data":"faf0a4667a66e59562a883a2c9fef864e6e3bfde2c7c8428277f182730288329"} Mar 09 13:54:02 crc kubenswrapper[4764]: I0309 13:54:02.851936 4764 generic.go:334] "Generic (PLEG): container finished" podID="909c58d5-d4d7-4042-94f0-df77bda9590a" containerID="5510a27aa618536b31f12ced254c914aa21f71e4d8962e547b119bb29d1548f1" exitCode=0 Mar 09 13:54:02 crc kubenswrapper[4764]: I0309 13:54:02.852067 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551074-ztbcz" event={"ID":"909c58d5-d4d7-4042-94f0-df77bda9590a","Type":"ContainerDied","Data":"5510a27aa618536b31f12ced254c914aa21f71e4d8962e547b119bb29d1548f1"} Mar 09 13:54:04 crc kubenswrapper[4764]: I0309 13:54:04.182261 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551074-ztbcz" Mar 09 13:54:04 crc kubenswrapper[4764]: I0309 13:54:04.251740 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vj6fm\" (UniqueName: \"kubernetes.io/projected/909c58d5-d4d7-4042-94f0-df77bda9590a-kube-api-access-vj6fm\") pod \"909c58d5-d4d7-4042-94f0-df77bda9590a\" (UID: \"909c58d5-d4d7-4042-94f0-df77bda9590a\") " Mar 09 13:54:04 crc kubenswrapper[4764]: I0309 13:54:04.258948 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/909c58d5-d4d7-4042-94f0-df77bda9590a-kube-api-access-vj6fm" (OuterVolumeSpecName: "kube-api-access-vj6fm") pod "909c58d5-d4d7-4042-94f0-df77bda9590a" (UID: "909c58d5-d4d7-4042-94f0-df77bda9590a"). InnerVolumeSpecName "kube-api-access-vj6fm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:54:04 crc kubenswrapper[4764]: I0309 13:54:04.355607 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vj6fm\" (UniqueName: \"kubernetes.io/projected/909c58d5-d4d7-4042-94f0-df77bda9590a-kube-api-access-vj6fm\") on node \"crc\" DevicePath \"\"" Mar 09 13:54:04 crc kubenswrapper[4764]: I0309 13:54:04.873348 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551074-ztbcz" event={"ID":"909c58d5-d4d7-4042-94f0-df77bda9590a","Type":"ContainerDied","Data":"faf0a4667a66e59562a883a2c9fef864e6e3bfde2c7c8428277f182730288329"} Mar 09 13:54:04 crc kubenswrapper[4764]: I0309 13:54:04.873410 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="faf0a4667a66e59562a883a2c9fef864e6e3bfde2c7c8428277f182730288329" Mar 09 13:54:04 crc kubenswrapper[4764]: I0309 13:54:04.873464 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551074-ztbcz" Mar 09 13:54:05 crc kubenswrapper[4764]: I0309 13:54:05.261364 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29551068-v6md5"] Mar 09 13:54:05 crc kubenswrapper[4764]: I0309 13:54:05.270864 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29551068-v6md5"] Mar 09 13:54:05 crc kubenswrapper[4764]: I0309 13:54:05.575625 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ab11b944-7857-4998-b32b-264ac7683616" path="/var/lib/kubelet/pods/ab11b944-7857-4998-b32b-264ac7683616/volumes" Mar 09 13:54:07 crc kubenswrapper[4764]: I0309 13:54:07.931189 4764 scope.go:117] "RemoveContainer" containerID="2a7ab8e616981504d9d31e4fc4313f083401f97f7e60e7b3cdc2825dfc09335b" Mar 09 13:54:07 crc kubenswrapper[4764]: I0309 13:54:07.993075 4764 scope.go:117] "RemoveContainer" containerID="858a961159a6d1cec6dd22f3429517714ebd1b6a16fd98b28235cb60fc1a94ee" Mar 09 13:54:58 crc kubenswrapper[4764]: I0309 13:54:58.370371 4764 patch_prober.go:28] interesting pod/machine-config-daemon-xxczl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 09 13:54:58 crc kubenswrapper[4764]: I0309 13:54:58.371077 4764 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 09 13:55:28 crc kubenswrapper[4764]: I0309 13:55:28.370258 4764 patch_prober.go:28] interesting pod/machine-config-daemon-xxczl container/machine-config-daemon namespace/openshift-machine-config-operator: 
Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 09 13:55:28 crc kubenswrapper[4764]: I0309 13:55:28.371051 4764 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 09 13:55:58 crc kubenswrapper[4764]: I0309 13:55:58.371031 4764 patch_prober.go:28] interesting pod/machine-config-daemon-xxczl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 09 13:55:58 crc kubenswrapper[4764]: I0309 13:55:58.371967 4764 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 09 13:55:58 crc kubenswrapper[4764]: I0309 13:55:58.372059 4764 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" Mar 09 13:55:58 crc kubenswrapper[4764]: I0309 13:55:58.373422 4764 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"c4a2ad399af00232f256b54f6dba8cd11d56a98b4177187d6164b2b8cca69e65"} pod="openshift-machine-config-operator/machine-config-daemon-xxczl" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 09 13:55:58 crc kubenswrapper[4764]: I0309 13:55:58.373498 4764 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922" containerName="machine-config-daemon" containerID="cri-o://c4a2ad399af00232f256b54f6dba8cd11d56a98b4177187d6164b2b8cca69e65" gracePeriod=600 Mar 09 13:55:59 crc kubenswrapper[4764]: I0309 13:55:59.009927 4764 generic.go:334] "Generic (PLEG): container finished" podID="6bcdd179-43c2-427c-9fac-7155c122e922" containerID="c4a2ad399af00232f256b54f6dba8cd11d56a98b4177187d6164b2b8cca69e65" exitCode=0 Mar 09 13:55:59 crc kubenswrapper[4764]: I0309 13:55:59.010000 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" event={"ID":"6bcdd179-43c2-427c-9fac-7155c122e922","Type":"ContainerDied","Data":"c4a2ad399af00232f256b54f6dba8cd11d56a98b4177187d6164b2b8cca69e65"} Mar 09 13:55:59 crc kubenswrapper[4764]: I0309 13:55:59.010755 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" event={"ID":"6bcdd179-43c2-427c-9fac-7155c122e922","Type":"ContainerStarted","Data":"827bb36b06cc9f42c479b857320543a7b207fe3138757e8a034890cd3b0b0211"} Mar 09 13:55:59 crc kubenswrapper[4764]: I0309 13:55:59.010796 4764 scope.go:117] "RemoveContainer" containerID="a87b87af769425372dc2009201df4d5e0d5a91fd68dbae94c9002e0b7115dafb" Mar 09 13:56:00 crc kubenswrapper[4764]: I0309 13:56:00.161191 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29551076-qrqw5"] Mar 09 13:56:00 crc kubenswrapper[4764]: E0309 13:56:00.162352 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="909c58d5-d4d7-4042-94f0-df77bda9590a" containerName="oc" Mar 09 13:56:00 crc kubenswrapper[4764]: I0309 13:56:00.162373 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="909c58d5-d4d7-4042-94f0-df77bda9590a" containerName="oc" Mar 09 13:56:00 crc 
kubenswrapper[4764]: I0309 13:56:00.162614 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="909c58d5-d4d7-4042-94f0-df77bda9590a" containerName="oc" Mar 09 13:56:00 crc kubenswrapper[4764]: I0309 13:56:00.163705 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551076-qrqw5" Mar 09 13:56:00 crc kubenswrapper[4764]: I0309 13:56:00.166808 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 09 13:56:00 crc kubenswrapper[4764]: I0309 13:56:00.167483 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-4s7mr" Mar 09 13:56:00 crc kubenswrapper[4764]: I0309 13:56:00.167625 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 09 13:56:00 crc kubenswrapper[4764]: I0309 13:56:00.169939 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551076-qrqw5"] Mar 09 13:56:00 crc kubenswrapper[4764]: I0309 13:56:00.258276 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v8v9c\" (UniqueName: \"kubernetes.io/projected/7a9d864e-dad7-4c7d-a639-d4042bb3339d-kube-api-access-v8v9c\") pod \"auto-csr-approver-29551076-qrqw5\" (UID: \"7a9d864e-dad7-4c7d-a639-d4042bb3339d\") " pod="openshift-infra/auto-csr-approver-29551076-qrqw5" Mar 09 13:56:00 crc kubenswrapper[4764]: I0309 13:56:00.361843 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v8v9c\" (UniqueName: \"kubernetes.io/projected/7a9d864e-dad7-4c7d-a639-d4042bb3339d-kube-api-access-v8v9c\") pod \"auto-csr-approver-29551076-qrqw5\" (UID: \"7a9d864e-dad7-4c7d-a639-d4042bb3339d\") " pod="openshift-infra/auto-csr-approver-29551076-qrqw5" Mar 09 13:56:00 crc kubenswrapper[4764]: I0309 13:56:00.389530 4764 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v8v9c\" (UniqueName: \"kubernetes.io/projected/7a9d864e-dad7-4c7d-a639-d4042bb3339d-kube-api-access-v8v9c\") pod \"auto-csr-approver-29551076-qrqw5\" (UID: \"7a9d864e-dad7-4c7d-a639-d4042bb3339d\") " pod="openshift-infra/auto-csr-approver-29551076-qrqw5" Mar 09 13:56:00 crc kubenswrapper[4764]: I0309 13:56:00.489271 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551076-qrqw5" Mar 09 13:56:00 crc kubenswrapper[4764]: I0309 13:56:00.960259 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551076-qrqw5"] Mar 09 13:56:00 crc kubenswrapper[4764]: I0309 13:56:00.976971 4764 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 09 13:56:01 crc kubenswrapper[4764]: I0309 13:56:01.034316 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-n4wgh"] Mar 09 13:56:01 crc kubenswrapper[4764]: I0309 13:56:01.043763 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-g84h2"] Mar 09 13:56:01 crc kubenswrapper[4764]: I0309 13:56:01.046043 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551076-qrqw5" event={"ID":"7a9d864e-dad7-4c7d-a639-d4042bb3339d","Type":"ContainerStarted","Data":"71b0efab7d289e8510ef41b131942c784f08b8a9d5fbe745d921f9dc131aa089"} Mar 09 13:56:01 crc kubenswrapper[4764]: I0309 13:56:01.058706 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-x4txb"] Mar 09 13:56:01 crc kubenswrapper[4764]: I0309 13:56:01.066906 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-ljpp5"] Mar 09 13:56:01 crc kubenswrapper[4764]: I0309 
13:56:01.095106 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-8lzvr"] Mar 09 13:56:01 crc kubenswrapper[4764]: I0309 13:56:01.105385 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-89zj7"] Mar 09 13:56:01 crc kubenswrapper[4764]: I0309 13:56:01.115332 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-g84h2"] Mar 09 13:56:01 crc kubenswrapper[4764]: I0309 13:56:01.123333 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-j2fzg"] Mar 09 13:56:01 crc kubenswrapper[4764]: I0309 13:56:01.129993 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-n4wgh"] Mar 09 13:56:01 crc kubenswrapper[4764]: I0309 13:56:01.137164 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-89zj7"] Mar 09 13:56:01 crc kubenswrapper[4764]: I0309 13:56:01.143933 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-4pj8q"] Mar 09 13:56:01 crc kubenswrapper[4764]: I0309 13:56:01.152471 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-8lzvr"] Mar 09 13:56:01 crc kubenswrapper[4764]: I0309 13:56:01.161095 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-ljpp5"] Mar 09 13:56:01 crc kubenswrapper[4764]: I0309 13:56:01.168500 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-84wxw"] Mar 09 13:56:01 crc kubenswrapper[4764]: I0309 13:56:01.175698 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-x4txb"] Mar 09 13:56:01 crc kubenswrapper[4764]: I0309 13:56:01.182276 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-hl84w"] Mar 09 13:56:01 crc kubenswrapper[4764]: I0309 13:56:01.189582 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-j2fzg"] Mar 09 13:56:01 crc kubenswrapper[4764]: I0309 13:56:01.200863 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-84wxw"] Mar 09 13:56:01 crc kubenswrapper[4764]: I0309 13:56:01.215217 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-4pj8q"] Mar 09 13:56:01 crc kubenswrapper[4764]: I0309 13:56:01.229958 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-hl84w"] Mar 09 13:56:01 crc kubenswrapper[4764]: I0309 13:56:01.571882 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="07f61b11-aba4-469c-a5ed-9566f1951559" path="/var/lib/kubelet/pods/07f61b11-aba4-469c-a5ed-9566f1951559/volumes" Mar 09 13:56:01 crc kubenswrapper[4764]: I0309 13:56:01.572788 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1dbc4eda-5f77-4951-962f-9ed0b1308df0" path="/var/lib/kubelet/pods/1dbc4eda-5f77-4951-962f-9ed0b1308df0/volumes" Mar 09 13:56:01 crc kubenswrapper[4764]: I0309 13:56:01.573401 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="38445f30-348d-4c11-94c5-81bca885cc36" path="/var/lib/kubelet/pods/38445f30-348d-4c11-94c5-81bca885cc36/volumes" Mar 09 13:56:01 crc kubenswrapper[4764]: I0309 13:56:01.573968 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7ee15cfe-dd3c-4cc7-bf8f-b324397f4add" 
path="/var/lib/kubelet/pods/7ee15cfe-dd3c-4cc7-bf8f-b324397f4add/volumes" Mar 09 13:56:01 crc kubenswrapper[4764]: I0309 13:56:01.575092 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="85b3887c-ea0d-4ca0-a862-0134f0ae08b5" path="/var/lib/kubelet/pods/85b3887c-ea0d-4ca0-a862-0134f0ae08b5/volumes" Mar 09 13:56:01 crc kubenswrapper[4764]: I0309 13:56:01.575634 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9ec29e7b-3537-459a-bfb4-acc93e1e5ec2" path="/var/lib/kubelet/pods/9ec29e7b-3537-459a-bfb4-acc93e1e5ec2/volumes" Mar 09 13:56:01 crc kubenswrapper[4764]: I0309 13:56:01.576156 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9fc5b263-ac73-4b6e-8e41-4ed508765c55" path="/var/lib/kubelet/pods/9fc5b263-ac73-4b6e-8e41-4ed508765c55/volumes" Mar 09 13:56:01 crc kubenswrapper[4764]: I0309 13:56:01.577195 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c0456561-9f16-4a32-b3ec-6ab6aa808b76" path="/var/lib/kubelet/pods/c0456561-9f16-4a32-b3ec-6ab6aa808b76/volumes" Mar 09 13:56:01 crc kubenswrapper[4764]: I0309 13:56:01.577740 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d5f7b3e4-69e8-4529-9973-63b6af6b5e5c" path="/var/lib/kubelet/pods/d5f7b3e4-69e8-4529-9973-63b6af6b5e5c/volumes" Mar 09 13:56:01 crc kubenswrapper[4764]: I0309 13:56:01.578277 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fe0d9990-083b-428b-baec-a40ae99487db" path="/var/lib/kubelet/pods/fe0d9990-083b-428b-baec-a40ae99487db/volumes" Mar 09 13:56:03 crc kubenswrapper[4764]: I0309 13:56:03.070112 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551076-qrqw5" event={"ID":"7a9d864e-dad7-4c7d-a639-d4042bb3339d","Type":"ContainerStarted","Data":"8f590f50cdda09a93b8757a7d03d71c1018f7d81bfe8e8784e29856175854a29"} Mar 09 13:56:03 crc kubenswrapper[4764]: I0309 13:56:03.102890 4764 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29551076-qrqw5" podStartSLOduration=1.472397691 podStartE2EDuration="3.102865283s" podCreationTimestamp="2026-03-09 13:56:00 +0000 UTC" firstStartedPulling="2026-03-09 13:56:00.976624903 +0000 UTC m=+2116.226796821" lastFinishedPulling="2026-03-09 13:56:02.607092505 +0000 UTC m=+2117.857264413" observedRunningTime="2026-03-09 13:56:03.090560625 +0000 UTC m=+2118.340732543" watchObservedRunningTime="2026-03-09 13:56:03.102865283 +0000 UTC m=+2118.353037201" Mar 09 13:56:04 crc kubenswrapper[4764]: I0309 13:56:04.082888 4764 generic.go:334] "Generic (PLEG): container finished" podID="7a9d864e-dad7-4c7d-a639-d4042bb3339d" containerID="8f590f50cdda09a93b8757a7d03d71c1018f7d81bfe8e8784e29856175854a29" exitCode=0 Mar 09 13:56:04 crc kubenswrapper[4764]: I0309 13:56:04.082959 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551076-qrqw5" event={"ID":"7a9d864e-dad7-4c7d-a639-d4042bb3339d","Type":"ContainerDied","Data":"8f590f50cdda09a93b8757a7d03d71c1018f7d81bfe8e8784e29856175854a29"} Mar 09 13:56:05 crc kubenswrapper[4764]: I0309 13:56:05.470234 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551076-qrqw5" Mar 09 13:56:05 crc kubenswrapper[4764]: I0309 13:56:05.577521 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v8v9c\" (UniqueName: \"kubernetes.io/projected/7a9d864e-dad7-4c7d-a639-d4042bb3339d-kube-api-access-v8v9c\") pod \"7a9d864e-dad7-4c7d-a639-d4042bb3339d\" (UID: \"7a9d864e-dad7-4c7d-a639-d4042bb3339d\") " Mar 09 13:56:05 crc kubenswrapper[4764]: I0309 13:56:05.584745 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7a9d864e-dad7-4c7d-a639-d4042bb3339d-kube-api-access-v8v9c" (OuterVolumeSpecName: "kube-api-access-v8v9c") pod "7a9d864e-dad7-4c7d-a639-d4042bb3339d" (UID: "7a9d864e-dad7-4c7d-a639-d4042bb3339d"). InnerVolumeSpecName "kube-api-access-v8v9c". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:56:05 crc kubenswrapper[4764]: I0309 13:56:05.680670 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v8v9c\" (UniqueName: \"kubernetes.io/projected/7a9d864e-dad7-4c7d-a639-d4042bb3339d-kube-api-access-v8v9c\") on node \"crc\" DevicePath \"\"" Mar 09 13:56:06 crc kubenswrapper[4764]: I0309 13:56:06.104559 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551076-qrqw5" event={"ID":"7a9d864e-dad7-4c7d-a639-d4042bb3339d","Type":"ContainerDied","Data":"71b0efab7d289e8510ef41b131942c784f08b8a9d5fbe745d921f9dc131aa089"} Mar 09 13:56:06 crc kubenswrapper[4764]: I0309 13:56:06.104889 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="71b0efab7d289e8510ef41b131942c784f08b8a9d5fbe745d921f9dc131aa089" Mar 09 13:56:06 crc kubenswrapper[4764]: I0309 13:56:06.104696 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551076-qrqw5" Mar 09 13:56:06 crc kubenswrapper[4764]: I0309 13:56:06.177611 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29551070-x977g"] Mar 09 13:56:06 crc kubenswrapper[4764]: I0309 13:56:06.191822 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29551070-x977g"] Mar 09 13:56:06 crc kubenswrapper[4764]: I0309 13:56:06.631956 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-d46h5"] Mar 09 13:56:06 crc kubenswrapper[4764]: E0309 13:56:06.634046 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a9d864e-dad7-4c7d-a639-d4042bb3339d" containerName="oc" Mar 09 13:56:06 crc kubenswrapper[4764]: I0309 13:56:06.634174 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a9d864e-dad7-4c7d-a639-d4042bb3339d" containerName="oc" Mar 09 13:56:06 crc kubenswrapper[4764]: I0309 13:56:06.634501 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="7a9d864e-dad7-4c7d-a639-d4042bb3339d" containerName="oc" Mar 09 13:56:06 crc kubenswrapper[4764]: I0309 13:56:06.635600 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-d46h5" Mar 09 13:56:06 crc kubenswrapper[4764]: I0309 13:56:06.638879 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Mar 09 13:56:06 crc kubenswrapper[4764]: I0309 13:56:06.639142 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 09 13:56:06 crc kubenswrapper[4764]: I0309 13:56:06.639477 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 09 13:56:06 crc kubenswrapper[4764]: I0309 13:56:06.639702 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-ctrj8" Mar 09 13:56:06 crc kubenswrapper[4764]: I0309 13:56:06.640508 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 09 13:56:06 crc kubenswrapper[4764]: I0309 13:56:06.648309 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-d46h5"] Mar 09 13:56:06 crc kubenswrapper[4764]: I0309 13:56:06.701723 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/de1bf125-47e1-499c-9cfe-ffbd5c03d194-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-d46h5\" (UID: \"de1bf125-47e1-499c-9cfe-ffbd5c03d194\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-d46h5" Mar 09 13:56:06 crc kubenswrapper[4764]: I0309 13:56:06.701781 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/de1bf125-47e1-499c-9cfe-ffbd5c03d194-ceph\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-d46h5\" (UID: \"de1bf125-47e1-499c-9cfe-ffbd5c03d194\") " 
pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-d46h5" Mar 09 13:56:06 crc kubenswrapper[4764]: I0309 13:56:06.701848 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/de1bf125-47e1-499c-9cfe-ffbd5c03d194-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-d46h5\" (UID: \"de1bf125-47e1-499c-9cfe-ffbd5c03d194\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-d46h5" Mar 09 13:56:06 crc kubenswrapper[4764]: I0309 13:56:06.701901 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ptkxs\" (UniqueName: \"kubernetes.io/projected/de1bf125-47e1-499c-9cfe-ffbd5c03d194-kube-api-access-ptkxs\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-d46h5\" (UID: \"de1bf125-47e1-499c-9cfe-ffbd5c03d194\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-d46h5" Mar 09 13:56:06 crc kubenswrapper[4764]: I0309 13:56:06.702095 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de1bf125-47e1-499c-9cfe-ffbd5c03d194-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-d46h5\" (UID: \"de1bf125-47e1-499c-9cfe-ffbd5c03d194\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-d46h5" Mar 09 13:56:06 crc kubenswrapper[4764]: I0309 13:56:06.804007 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ptkxs\" (UniqueName: \"kubernetes.io/projected/de1bf125-47e1-499c-9cfe-ffbd5c03d194-kube-api-access-ptkxs\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-d46h5\" (UID: \"de1bf125-47e1-499c-9cfe-ffbd5c03d194\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-d46h5" Mar 09 13:56:06 crc kubenswrapper[4764]: I0309 
13:56:06.804067 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de1bf125-47e1-499c-9cfe-ffbd5c03d194-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-d46h5\" (UID: \"de1bf125-47e1-499c-9cfe-ffbd5c03d194\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-d46h5" Mar 09 13:56:06 crc kubenswrapper[4764]: I0309 13:56:06.804180 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/de1bf125-47e1-499c-9cfe-ffbd5c03d194-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-d46h5\" (UID: \"de1bf125-47e1-499c-9cfe-ffbd5c03d194\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-d46h5" Mar 09 13:56:06 crc kubenswrapper[4764]: I0309 13:56:06.804238 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/de1bf125-47e1-499c-9cfe-ffbd5c03d194-ceph\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-d46h5\" (UID: \"de1bf125-47e1-499c-9cfe-ffbd5c03d194\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-d46h5" Mar 09 13:56:06 crc kubenswrapper[4764]: I0309 13:56:06.804279 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/de1bf125-47e1-499c-9cfe-ffbd5c03d194-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-d46h5\" (UID: \"de1bf125-47e1-499c-9cfe-ffbd5c03d194\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-d46h5" Mar 09 13:56:06 crc kubenswrapper[4764]: I0309 13:56:06.809854 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/de1bf125-47e1-499c-9cfe-ffbd5c03d194-inventory\") pod 
\"repo-setup-edpm-deployment-openstack-edpm-ipam-d46h5\" (UID: \"de1bf125-47e1-499c-9cfe-ffbd5c03d194\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-d46h5" Mar 09 13:56:06 crc kubenswrapper[4764]: I0309 13:56:06.810059 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/de1bf125-47e1-499c-9cfe-ffbd5c03d194-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-d46h5\" (UID: \"de1bf125-47e1-499c-9cfe-ffbd5c03d194\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-d46h5" Mar 09 13:56:06 crc kubenswrapper[4764]: I0309 13:56:06.812033 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/de1bf125-47e1-499c-9cfe-ffbd5c03d194-ceph\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-d46h5\" (UID: \"de1bf125-47e1-499c-9cfe-ffbd5c03d194\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-d46h5" Mar 09 13:56:06 crc kubenswrapper[4764]: I0309 13:56:06.813773 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de1bf125-47e1-499c-9cfe-ffbd5c03d194-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-d46h5\" (UID: \"de1bf125-47e1-499c-9cfe-ffbd5c03d194\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-d46h5" Mar 09 13:56:06 crc kubenswrapper[4764]: I0309 13:56:06.826478 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ptkxs\" (UniqueName: \"kubernetes.io/projected/de1bf125-47e1-499c-9cfe-ffbd5c03d194-kube-api-access-ptkxs\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-d46h5\" (UID: \"de1bf125-47e1-499c-9cfe-ffbd5c03d194\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-d46h5" Mar 09 13:56:06 crc kubenswrapper[4764]: I0309 13:56:06.958005 
4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-d46h5" Mar 09 13:56:07 crc kubenswrapper[4764]: I0309 13:56:07.477268 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-d46h5"] Mar 09 13:56:07 crc kubenswrapper[4764]: W0309 13:56:07.490257 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podde1bf125_47e1_499c_9cfe_ffbd5c03d194.slice/crio-56f8a50aaf961365025d9562e61abf237ceb3b14bb828ea24cce3d59f02d7a83 WatchSource:0}: Error finding container 56f8a50aaf961365025d9562e61abf237ceb3b14bb828ea24cce3d59f02d7a83: Status 404 returned error can't find the container with id 56f8a50aaf961365025d9562e61abf237ceb3b14bb828ea24cce3d59f02d7a83 Mar 09 13:56:07 crc kubenswrapper[4764]: I0309 13:56:07.592375 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="14ef871d-e371-41df-9380-53505557d7ac" path="/var/lib/kubelet/pods/14ef871d-e371-41df-9380-53505557d7ac/volumes" Mar 09 13:56:08 crc kubenswrapper[4764]: I0309 13:56:08.122959 4764 scope.go:117] "RemoveContainer" containerID="f0aca49c15e8c97fb96eb53f9577a50ff9c6c25e52f97ab9fc13ad081c1cd506" Mar 09 13:56:08 crc kubenswrapper[4764]: I0309 13:56:08.123672 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-d46h5" event={"ID":"de1bf125-47e1-499c-9cfe-ffbd5c03d194","Type":"ContainerStarted","Data":"56f8a50aaf961365025d9562e61abf237ceb3b14bb828ea24cce3d59f02d7a83"} Mar 09 13:56:08 crc kubenswrapper[4764]: I0309 13:56:08.170043 4764 scope.go:117] "RemoveContainer" containerID="8097883da308046a256f53d8042acf3d3dddd33e51cc0f93ed512385c36b57c0" Mar 09 13:56:08 crc kubenswrapper[4764]: I0309 13:56:08.240543 4764 scope.go:117] "RemoveContainer" containerID="bde45a06faa972981715117bca5bea2ffa8f521b6f7a639569a4330b9afefe5a" 
Mar 09 13:56:08 crc kubenswrapper[4764]: I0309 13:56:08.274617 4764 scope.go:117] "RemoveContainer" containerID="1c7c071cdcf2595562a14fc8211f12dd741cead95f0450c54862b35cde78ac5b" Mar 09 13:56:08 crc kubenswrapper[4764]: I0309 13:56:08.318321 4764 scope.go:117] "RemoveContainer" containerID="6e4f8f0feb8ec9ecf660634549716b0c350cba173b98c033e4c7e09aa6bd108d" Mar 09 13:56:08 crc kubenswrapper[4764]: I0309 13:56:08.396679 4764 scope.go:117] "RemoveContainer" containerID="e8df8bc509784d8d529ce5673fdcec9ae8d5b4cbcf9d86d0b27b355b5bffd393" Mar 09 13:56:09 crc kubenswrapper[4764]: I0309 13:56:09.134492 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-d46h5" event={"ID":"de1bf125-47e1-499c-9cfe-ffbd5c03d194","Type":"ContainerStarted","Data":"f3660ffb8bd5fb3872ac27e05718cfd3efdfa21835cdaa2254ffd4445a2924fc"} Mar 09 13:56:09 crc kubenswrapper[4764]: I0309 13:56:09.155114 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-d46h5" podStartSLOduration=2.745221798 podStartE2EDuration="3.155096334s" podCreationTimestamp="2026-03-09 13:56:06 +0000 UTC" firstStartedPulling="2026-03-09 13:56:07.494026565 +0000 UTC m=+2122.744198483" lastFinishedPulling="2026-03-09 13:56:07.903901111 +0000 UTC m=+2123.154073019" observedRunningTime="2026-03-09 13:56:09.151326454 +0000 UTC m=+2124.401498352" watchObservedRunningTime="2026-03-09 13:56:09.155096334 +0000 UTC m=+2124.405268242" Mar 09 13:56:19 crc kubenswrapper[4764]: I0309 13:56:19.226858 4764 generic.go:334] "Generic (PLEG): container finished" podID="de1bf125-47e1-499c-9cfe-ffbd5c03d194" containerID="f3660ffb8bd5fb3872ac27e05718cfd3efdfa21835cdaa2254ffd4445a2924fc" exitCode=0 Mar 09 13:56:19 crc kubenswrapper[4764]: I0309 13:56:19.226984 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-d46h5" 
event={"ID":"de1bf125-47e1-499c-9cfe-ffbd5c03d194","Type":"ContainerDied","Data":"f3660ffb8bd5fb3872ac27e05718cfd3efdfa21835cdaa2254ffd4445a2924fc"} Mar 09 13:56:20 crc kubenswrapper[4764]: I0309 13:56:20.657088 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-d46h5" Mar 09 13:56:20 crc kubenswrapper[4764]: I0309 13:56:20.721935 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/de1bf125-47e1-499c-9cfe-ffbd5c03d194-ceph\") pod \"de1bf125-47e1-499c-9cfe-ffbd5c03d194\" (UID: \"de1bf125-47e1-499c-9cfe-ffbd5c03d194\") " Mar 09 13:56:20 crc kubenswrapper[4764]: I0309 13:56:20.721991 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ptkxs\" (UniqueName: \"kubernetes.io/projected/de1bf125-47e1-499c-9cfe-ffbd5c03d194-kube-api-access-ptkxs\") pod \"de1bf125-47e1-499c-9cfe-ffbd5c03d194\" (UID: \"de1bf125-47e1-499c-9cfe-ffbd5c03d194\") " Mar 09 13:56:20 crc kubenswrapper[4764]: I0309 13:56:20.722085 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/de1bf125-47e1-499c-9cfe-ffbd5c03d194-inventory\") pod \"de1bf125-47e1-499c-9cfe-ffbd5c03d194\" (UID: \"de1bf125-47e1-499c-9cfe-ffbd5c03d194\") " Mar 09 13:56:20 crc kubenswrapper[4764]: I0309 13:56:20.722217 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/de1bf125-47e1-499c-9cfe-ffbd5c03d194-ssh-key-openstack-edpm-ipam\") pod \"de1bf125-47e1-499c-9cfe-ffbd5c03d194\" (UID: \"de1bf125-47e1-499c-9cfe-ffbd5c03d194\") " Mar 09 13:56:20 crc kubenswrapper[4764]: I0309 13:56:20.722251 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/de1bf125-47e1-499c-9cfe-ffbd5c03d194-repo-setup-combined-ca-bundle\") pod \"de1bf125-47e1-499c-9cfe-ffbd5c03d194\" (UID: \"de1bf125-47e1-499c-9cfe-ffbd5c03d194\") " Mar 09 13:56:20 crc kubenswrapper[4764]: I0309 13:56:20.729909 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/de1bf125-47e1-499c-9cfe-ffbd5c03d194-kube-api-access-ptkxs" (OuterVolumeSpecName: "kube-api-access-ptkxs") pod "de1bf125-47e1-499c-9cfe-ffbd5c03d194" (UID: "de1bf125-47e1-499c-9cfe-ffbd5c03d194"). InnerVolumeSpecName "kube-api-access-ptkxs". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:56:20 crc kubenswrapper[4764]: I0309 13:56:20.730048 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de1bf125-47e1-499c-9cfe-ffbd5c03d194-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "de1bf125-47e1-499c-9cfe-ffbd5c03d194" (UID: "de1bf125-47e1-499c-9cfe-ffbd5c03d194"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:56:20 crc kubenswrapper[4764]: I0309 13:56:20.731245 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de1bf125-47e1-499c-9cfe-ffbd5c03d194-ceph" (OuterVolumeSpecName: "ceph") pod "de1bf125-47e1-499c-9cfe-ffbd5c03d194" (UID: "de1bf125-47e1-499c-9cfe-ffbd5c03d194"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:56:20 crc kubenswrapper[4764]: I0309 13:56:20.751954 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de1bf125-47e1-499c-9cfe-ffbd5c03d194-inventory" (OuterVolumeSpecName: "inventory") pod "de1bf125-47e1-499c-9cfe-ffbd5c03d194" (UID: "de1bf125-47e1-499c-9cfe-ffbd5c03d194"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:56:20 crc kubenswrapper[4764]: I0309 13:56:20.766374 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de1bf125-47e1-499c-9cfe-ffbd5c03d194-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "de1bf125-47e1-499c-9cfe-ffbd5c03d194" (UID: "de1bf125-47e1-499c-9cfe-ffbd5c03d194"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:56:20 crc kubenswrapper[4764]: I0309 13:56:20.825241 4764 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/de1bf125-47e1-499c-9cfe-ffbd5c03d194-inventory\") on node \"crc\" DevicePath \"\"" Mar 09 13:56:20 crc kubenswrapper[4764]: I0309 13:56:20.825288 4764 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/de1bf125-47e1-499c-9cfe-ffbd5c03d194-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 09 13:56:20 crc kubenswrapper[4764]: I0309 13:56:20.825305 4764 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de1bf125-47e1-499c-9cfe-ffbd5c03d194-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 13:56:20 crc kubenswrapper[4764]: I0309 13:56:20.825317 4764 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/de1bf125-47e1-499c-9cfe-ffbd5c03d194-ceph\") on node \"crc\" DevicePath \"\"" Mar 09 13:56:20 crc kubenswrapper[4764]: I0309 13:56:20.825330 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ptkxs\" (UniqueName: \"kubernetes.io/projected/de1bf125-47e1-499c-9cfe-ffbd5c03d194-kube-api-access-ptkxs\") on node \"crc\" DevicePath \"\"" Mar 09 13:56:21 crc kubenswrapper[4764]: I0309 13:56:21.244241 4764 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-d46h5" event={"ID":"de1bf125-47e1-499c-9cfe-ffbd5c03d194","Type":"ContainerDied","Data":"56f8a50aaf961365025d9562e61abf237ceb3b14bb828ea24cce3d59f02d7a83"} Mar 09 13:56:21 crc kubenswrapper[4764]: I0309 13:56:21.244300 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="56f8a50aaf961365025d9562e61abf237ceb3b14bb828ea24cce3d59f02d7a83" Mar 09 13:56:21 crc kubenswrapper[4764]: I0309 13:56:21.244351 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-d46h5" Mar 09 13:56:21 crc kubenswrapper[4764]: I0309 13:56:21.328270 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-7t2tk"] Mar 09 13:56:21 crc kubenswrapper[4764]: E0309 13:56:21.328853 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de1bf125-47e1-499c-9cfe-ffbd5c03d194" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Mar 09 13:56:21 crc kubenswrapper[4764]: I0309 13:56:21.328878 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="de1bf125-47e1-499c-9cfe-ffbd5c03d194" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Mar 09 13:56:21 crc kubenswrapper[4764]: I0309 13:56:21.329054 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="de1bf125-47e1-499c-9cfe-ffbd5c03d194" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Mar 09 13:56:21 crc kubenswrapper[4764]: I0309 13:56:21.329871 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-7t2tk" Mar 09 13:56:21 crc kubenswrapper[4764]: I0309 13:56:21.334100 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 09 13:56:21 crc kubenswrapper[4764]: I0309 13:56:21.334121 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Mar 09 13:56:21 crc kubenswrapper[4764]: I0309 13:56:21.334516 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 09 13:56:21 crc kubenswrapper[4764]: I0309 13:56:21.334673 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 09 13:56:21 crc kubenswrapper[4764]: I0309 13:56:21.334732 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-ctrj8" Mar 09 13:56:21 crc kubenswrapper[4764]: I0309 13:56:21.346601 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-7t2tk"] Mar 09 13:56:21 crc kubenswrapper[4764]: I0309 13:56:21.438111 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7bd401e1-1592-4b49-8eb2-b6dcba296b36-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-7t2tk\" (UID: \"7bd401e1-1592-4b49-8eb2-b6dcba296b36\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-7t2tk" Mar 09 13:56:21 crc kubenswrapper[4764]: I0309 13:56:21.438589 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/7bd401e1-1592-4b49-8eb2-b6dcba296b36-ceph\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-7t2tk\" (UID: \"7bd401e1-1592-4b49-8eb2-b6dcba296b36\") " 
pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-7t2tk" Mar 09 13:56:21 crc kubenswrapper[4764]: I0309 13:56:21.438681 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7bd401e1-1592-4b49-8eb2-b6dcba296b36-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-7t2tk\" (UID: \"7bd401e1-1592-4b49-8eb2-b6dcba296b36\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-7t2tk" Mar 09 13:56:21 crc kubenswrapper[4764]: I0309 13:56:21.438764 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x4gb8\" (UniqueName: \"kubernetes.io/projected/7bd401e1-1592-4b49-8eb2-b6dcba296b36-kube-api-access-x4gb8\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-7t2tk\" (UID: \"7bd401e1-1592-4b49-8eb2-b6dcba296b36\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-7t2tk" Mar 09 13:56:21 crc kubenswrapper[4764]: I0309 13:56:21.438926 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7bd401e1-1592-4b49-8eb2-b6dcba296b36-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-7t2tk\" (UID: \"7bd401e1-1592-4b49-8eb2-b6dcba296b36\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-7t2tk" Mar 09 13:56:21 crc kubenswrapper[4764]: I0309 13:56:21.540440 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/7bd401e1-1592-4b49-8eb2-b6dcba296b36-ceph\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-7t2tk\" (UID: \"7bd401e1-1592-4b49-8eb2-b6dcba296b36\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-7t2tk" Mar 09 13:56:21 crc kubenswrapper[4764]: I0309 13:56:21.540498 4764 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7bd401e1-1592-4b49-8eb2-b6dcba296b36-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-7t2tk\" (UID: \"7bd401e1-1592-4b49-8eb2-b6dcba296b36\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-7t2tk" Mar 09 13:56:21 crc kubenswrapper[4764]: I0309 13:56:21.540539 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x4gb8\" (UniqueName: \"kubernetes.io/projected/7bd401e1-1592-4b49-8eb2-b6dcba296b36-kube-api-access-x4gb8\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-7t2tk\" (UID: \"7bd401e1-1592-4b49-8eb2-b6dcba296b36\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-7t2tk" Mar 09 13:56:21 crc kubenswrapper[4764]: I0309 13:56:21.540635 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7bd401e1-1592-4b49-8eb2-b6dcba296b36-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-7t2tk\" (UID: \"7bd401e1-1592-4b49-8eb2-b6dcba296b36\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-7t2tk" Mar 09 13:56:21 crc kubenswrapper[4764]: I0309 13:56:21.541126 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7bd401e1-1592-4b49-8eb2-b6dcba296b36-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-7t2tk\" (UID: \"7bd401e1-1592-4b49-8eb2-b6dcba296b36\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-7t2tk" Mar 09 13:56:21 crc kubenswrapper[4764]: I0309 13:56:21.546367 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7bd401e1-1592-4b49-8eb2-b6dcba296b36-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-7t2tk\" (UID: 
\"7bd401e1-1592-4b49-8eb2-b6dcba296b36\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-7t2tk" Mar 09 13:56:21 crc kubenswrapper[4764]: I0309 13:56:21.546370 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7bd401e1-1592-4b49-8eb2-b6dcba296b36-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-7t2tk\" (UID: \"7bd401e1-1592-4b49-8eb2-b6dcba296b36\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-7t2tk" Mar 09 13:56:21 crc kubenswrapper[4764]: I0309 13:56:21.547849 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7bd401e1-1592-4b49-8eb2-b6dcba296b36-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-7t2tk\" (UID: \"7bd401e1-1592-4b49-8eb2-b6dcba296b36\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-7t2tk" Mar 09 13:56:21 crc kubenswrapper[4764]: I0309 13:56:21.550287 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/7bd401e1-1592-4b49-8eb2-b6dcba296b36-ceph\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-7t2tk\" (UID: \"7bd401e1-1592-4b49-8eb2-b6dcba296b36\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-7t2tk" Mar 09 13:56:21 crc kubenswrapper[4764]: I0309 13:56:21.558673 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x4gb8\" (UniqueName: \"kubernetes.io/projected/7bd401e1-1592-4b49-8eb2-b6dcba296b36-kube-api-access-x4gb8\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-7t2tk\" (UID: \"7bd401e1-1592-4b49-8eb2-b6dcba296b36\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-7t2tk" Mar 09 13:56:21 crc kubenswrapper[4764]: I0309 13:56:21.658613 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-7t2tk" Mar 09 13:56:22 crc kubenswrapper[4764]: W0309 13:56:22.205074 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7bd401e1_1592_4b49_8eb2_b6dcba296b36.slice/crio-a87e4c6da3d7df1147833b51aecae9346095d370c45db0367159e0a123965636 WatchSource:0}: Error finding container a87e4c6da3d7df1147833b51aecae9346095d370c45db0367159e0a123965636: Status 404 returned error can't find the container with id a87e4c6da3d7df1147833b51aecae9346095d370c45db0367159e0a123965636 Mar 09 13:56:22 crc kubenswrapper[4764]: I0309 13:56:22.210295 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-7t2tk"] Mar 09 13:56:22 crc kubenswrapper[4764]: I0309 13:56:22.255916 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-7t2tk" event={"ID":"7bd401e1-1592-4b49-8eb2-b6dcba296b36","Type":"ContainerStarted","Data":"a87e4c6da3d7df1147833b51aecae9346095d370c45db0367159e0a123965636"} Mar 09 13:56:23 crc kubenswrapper[4764]: I0309 13:56:23.267889 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-7t2tk" event={"ID":"7bd401e1-1592-4b49-8eb2-b6dcba296b36","Type":"ContainerStarted","Data":"2aa45ebc227a6925b794add5ccb7315a01e0a8cf206d439f62db0022923d1a2b"} Mar 09 13:56:23 crc kubenswrapper[4764]: I0309 13:56:23.292565 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-7t2tk" podStartSLOduration=1.549942337 podStartE2EDuration="2.29253615s" podCreationTimestamp="2026-03-09 13:56:21 +0000 UTC" firstStartedPulling="2026-03-09 13:56:22.209407891 +0000 UTC m=+2137.459579799" lastFinishedPulling="2026-03-09 13:56:22.952001704 +0000 UTC m=+2138.202173612" 
observedRunningTime="2026-03-09 13:56:23.287153516 +0000 UTC m=+2138.537325424" watchObservedRunningTime="2026-03-09 13:56:23.29253615 +0000 UTC m=+2138.542708058" Mar 09 13:57:08 crc kubenswrapper[4764]: I0309 13:57:08.573725 4764 scope.go:117] "RemoveContainer" containerID="3910a772e45d62c401146624cbc27497eee00dea57c88064e7e1af0b907bbfcc" Mar 09 13:57:08 crc kubenswrapper[4764]: I0309 13:57:08.616689 4764 scope.go:117] "RemoveContainer" containerID="b4cf48a07b982624103805e852925a23f44a6c1f17fbc126f6f6ed00345f5ccd" Mar 09 13:57:56 crc kubenswrapper[4764]: I0309 13:57:56.467488 4764 generic.go:334] "Generic (PLEG): container finished" podID="7bd401e1-1592-4b49-8eb2-b6dcba296b36" containerID="2aa45ebc227a6925b794add5ccb7315a01e0a8cf206d439f62db0022923d1a2b" exitCode=0 Mar 09 13:57:56 crc kubenswrapper[4764]: I0309 13:57:56.467578 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-7t2tk" event={"ID":"7bd401e1-1592-4b49-8eb2-b6dcba296b36","Type":"ContainerDied","Data":"2aa45ebc227a6925b794add5ccb7315a01e0a8cf206d439f62db0022923d1a2b"} Mar 09 13:57:57 crc kubenswrapper[4764]: I0309 13:57:57.950060 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-7t2tk" Mar 09 13:57:58 crc kubenswrapper[4764]: I0309 13:57:58.058607 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7bd401e1-1592-4b49-8eb2-b6dcba296b36-ssh-key-openstack-edpm-ipam\") pod \"7bd401e1-1592-4b49-8eb2-b6dcba296b36\" (UID: \"7bd401e1-1592-4b49-8eb2-b6dcba296b36\") " Mar 09 13:57:58 crc kubenswrapper[4764]: I0309 13:57:58.059082 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7bd401e1-1592-4b49-8eb2-b6dcba296b36-inventory\") pod \"7bd401e1-1592-4b49-8eb2-b6dcba296b36\" (UID: \"7bd401e1-1592-4b49-8eb2-b6dcba296b36\") " Mar 09 13:57:58 crc kubenswrapper[4764]: I0309 13:57:58.059164 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4gb8\" (UniqueName: \"kubernetes.io/projected/7bd401e1-1592-4b49-8eb2-b6dcba296b36-kube-api-access-x4gb8\") pod \"7bd401e1-1592-4b49-8eb2-b6dcba296b36\" (UID: \"7bd401e1-1592-4b49-8eb2-b6dcba296b36\") " Mar 09 13:57:58 crc kubenswrapper[4764]: I0309 13:57:58.059219 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7bd401e1-1592-4b49-8eb2-b6dcba296b36-bootstrap-combined-ca-bundle\") pod \"7bd401e1-1592-4b49-8eb2-b6dcba296b36\" (UID: \"7bd401e1-1592-4b49-8eb2-b6dcba296b36\") " Mar 09 13:57:58 crc kubenswrapper[4764]: I0309 13:57:58.059993 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/7bd401e1-1592-4b49-8eb2-b6dcba296b36-ceph\") pod \"7bd401e1-1592-4b49-8eb2-b6dcba296b36\" (UID: \"7bd401e1-1592-4b49-8eb2-b6dcba296b36\") " Mar 09 13:57:58 crc kubenswrapper[4764]: I0309 13:57:58.067010 4764 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bd401e1-1592-4b49-8eb2-b6dcba296b36-kube-api-access-x4gb8" (OuterVolumeSpecName: "kube-api-access-x4gb8") pod "7bd401e1-1592-4b49-8eb2-b6dcba296b36" (UID: "7bd401e1-1592-4b49-8eb2-b6dcba296b36"). InnerVolumeSpecName "kube-api-access-x4gb8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:57:58 crc kubenswrapper[4764]: I0309 13:57:58.067876 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7bd401e1-1592-4b49-8eb2-b6dcba296b36-ceph" (OuterVolumeSpecName: "ceph") pod "7bd401e1-1592-4b49-8eb2-b6dcba296b36" (UID: "7bd401e1-1592-4b49-8eb2-b6dcba296b36"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:57:58 crc kubenswrapper[4764]: I0309 13:57:58.069956 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7bd401e1-1592-4b49-8eb2-b6dcba296b36-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "7bd401e1-1592-4b49-8eb2-b6dcba296b36" (UID: "7bd401e1-1592-4b49-8eb2-b6dcba296b36"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:57:58 crc kubenswrapper[4764]: I0309 13:57:58.088361 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7bd401e1-1592-4b49-8eb2-b6dcba296b36-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "7bd401e1-1592-4b49-8eb2-b6dcba296b36" (UID: "7bd401e1-1592-4b49-8eb2-b6dcba296b36"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:57:58 crc kubenswrapper[4764]: I0309 13:57:58.091976 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7bd401e1-1592-4b49-8eb2-b6dcba296b36-inventory" (OuterVolumeSpecName: "inventory") pod "7bd401e1-1592-4b49-8eb2-b6dcba296b36" (UID: "7bd401e1-1592-4b49-8eb2-b6dcba296b36"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:57:58 crc kubenswrapper[4764]: I0309 13:57:58.163266 4764 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7bd401e1-1592-4b49-8eb2-b6dcba296b36-inventory\") on node \"crc\" DevicePath \"\"" Mar 09 13:57:58 crc kubenswrapper[4764]: I0309 13:57:58.163481 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4gb8\" (UniqueName: \"kubernetes.io/projected/7bd401e1-1592-4b49-8eb2-b6dcba296b36-kube-api-access-x4gb8\") on node \"crc\" DevicePath \"\"" Mar 09 13:57:58 crc kubenswrapper[4764]: I0309 13:57:58.163615 4764 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7bd401e1-1592-4b49-8eb2-b6dcba296b36-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 13:57:58 crc kubenswrapper[4764]: I0309 13:57:58.163703 4764 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/7bd401e1-1592-4b49-8eb2-b6dcba296b36-ceph\") on node \"crc\" DevicePath \"\"" Mar 09 13:57:58 crc kubenswrapper[4764]: I0309 13:57:58.163799 4764 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7bd401e1-1592-4b49-8eb2-b6dcba296b36-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 09 13:57:58 crc kubenswrapper[4764]: I0309 13:57:58.370785 4764 patch_prober.go:28] interesting pod/machine-config-daemon-xxczl 
container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 09 13:57:58 crc kubenswrapper[4764]: I0309 13:57:58.370885 4764 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 09 13:57:58 crc kubenswrapper[4764]: I0309 13:57:58.491846 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-7t2tk" event={"ID":"7bd401e1-1592-4b49-8eb2-b6dcba296b36","Type":"ContainerDied","Data":"a87e4c6da3d7df1147833b51aecae9346095d370c45db0367159e0a123965636"} Mar 09 13:57:58 crc kubenswrapper[4764]: I0309 13:57:58.491910 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a87e4c6da3d7df1147833b51aecae9346095d370c45db0367159e0a123965636" Mar 09 13:57:58 crc kubenswrapper[4764]: I0309 13:57:58.491934 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-7t2tk" Mar 09 13:57:58 crc kubenswrapper[4764]: I0309 13:57:58.594037 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-trkg2"] Mar 09 13:57:58 crc kubenswrapper[4764]: E0309 13:57:58.594542 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7bd401e1-1592-4b49-8eb2-b6dcba296b36" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Mar 09 13:57:58 crc kubenswrapper[4764]: I0309 13:57:58.594567 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="7bd401e1-1592-4b49-8eb2-b6dcba296b36" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Mar 09 13:57:58 crc kubenswrapper[4764]: I0309 13:57:58.594788 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="7bd401e1-1592-4b49-8eb2-b6dcba296b36" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Mar 09 13:57:58 crc kubenswrapper[4764]: I0309 13:57:58.595492 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-trkg2" Mar 09 13:57:58 crc kubenswrapper[4764]: I0309 13:57:58.599694 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Mar 09 13:57:58 crc kubenswrapper[4764]: I0309 13:57:58.599799 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 09 13:57:58 crc kubenswrapper[4764]: I0309 13:57:58.600035 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 09 13:57:58 crc kubenswrapper[4764]: I0309 13:57:58.601865 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-ctrj8" Mar 09 13:57:58 crc kubenswrapper[4764]: I0309 13:57:58.602365 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 09 13:57:58 crc kubenswrapper[4764]: I0309 13:57:58.616610 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-trkg2"] Mar 09 13:57:58 crc kubenswrapper[4764]: I0309 13:57:58.675009 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6j42q\" (UniqueName: \"kubernetes.io/projected/5141dc8e-4ab7-4488-8ad4-8af7f8c66dcd-kube-api-access-6j42q\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-trkg2\" (UID: \"5141dc8e-4ab7-4488-8ad4-8af7f8c66dcd\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-trkg2" Mar 09 13:57:58 crc kubenswrapper[4764]: I0309 13:57:58.675093 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5141dc8e-4ab7-4488-8ad4-8af7f8c66dcd-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-trkg2\" (UID: 
\"5141dc8e-4ab7-4488-8ad4-8af7f8c66dcd\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-trkg2" Mar 09 13:57:58 crc kubenswrapper[4764]: I0309 13:57:58.675484 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/5141dc8e-4ab7-4488-8ad4-8af7f8c66dcd-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-trkg2\" (UID: \"5141dc8e-4ab7-4488-8ad4-8af7f8c66dcd\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-trkg2" Mar 09 13:57:58 crc kubenswrapper[4764]: I0309 13:57:58.675694 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/5141dc8e-4ab7-4488-8ad4-8af7f8c66dcd-ceph\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-trkg2\" (UID: \"5141dc8e-4ab7-4488-8ad4-8af7f8c66dcd\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-trkg2" Mar 09 13:57:58 crc kubenswrapper[4764]: I0309 13:57:58.778157 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5141dc8e-4ab7-4488-8ad4-8af7f8c66dcd-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-trkg2\" (UID: \"5141dc8e-4ab7-4488-8ad4-8af7f8c66dcd\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-trkg2" Mar 09 13:57:58 crc kubenswrapper[4764]: I0309 13:57:58.778498 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/5141dc8e-4ab7-4488-8ad4-8af7f8c66dcd-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-trkg2\" (UID: \"5141dc8e-4ab7-4488-8ad4-8af7f8c66dcd\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-trkg2" Mar 09 13:57:58 crc 
kubenswrapper[4764]: I0309 13:57:58.778683 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/5141dc8e-4ab7-4488-8ad4-8af7f8c66dcd-ceph\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-trkg2\" (UID: \"5141dc8e-4ab7-4488-8ad4-8af7f8c66dcd\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-trkg2" Mar 09 13:57:58 crc kubenswrapper[4764]: I0309 13:57:58.778925 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6j42q\" (UniqueName: \"kubernetes.io/projected/5141dc8e-4ab7-4488-8ad4-8af7f8c66dcd-kube-api-access-6j42q\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-trkg2\" (UID: \"5141dc8e-4ab7-4488-8ad4-8af7f8c66dcd\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-trkg2" Mar 09 13:57:58 crc kubenswrapper[4764]: I0309 13:57:58.784442 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5141dc8e-4ab7-4488-8ad4-8af7f8c66dcd-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-trkg2\" (UID: \"5141dc8e-4ab7-4488-8ad4-8af7f8c66dcd\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-trkg2" Mar 09 13:57:58 crc kubenswrapper[4764]: I0309 13:57:58.785293 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/5141dc8e-4ab7-4488-8ad4-8af7f8c66dcd-ceph\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-trkg2\" (UID: \"5141dc8e-4ab7-4488-8ad4-8af7f8c66dcd\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-trkg2" Mar 09 13:57:58 crc kubenswrapper[4764]: I0309 13:57:58.786461 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/5141dc8e-4ab7-4488-8ad4-8af7f8c66dcd-ssh-key-openstack-edpm-ipam\") pod 
\"configure-network-edpm-deployment-openstack-edpm-ipam-trkg2\" (UID: \"5141dc8e-4ab7-4488-8ad4-8af7f8c66dcd\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-trkg2" Mar 09 13:57:58 crc kubenswrapper[4764]: I0309 13:57:58.800147 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6j42q\" (UniqueName: \"kubernetes.io/projected/5141dc8e-4ab7-4488-8ad4-8af7f8c66dcd-kube-api-access-6j42q\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-trkg2\" (UID: \"5141dc8e-4ab7-4488-8ad4-8af7f8c66dcd\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-trkg2" Mar 09 13:57:58 crc kubenswrapper[4764]: I0309 13:57:58.921378 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-trkg2" Mar 09 13:57:59 crc kubenswrapper[4764]: I0309 13:57:59.511902 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-trkg2"] Mar 09 13:58:00 crc kubenswrapper[4764]: I0309 13:58:00.198429 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29551078-z7ms2"] Mar 09 13:58:00 crc kubenswrapper[4764]: I0309 13:58:00.200788 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551078-z7ms2" Mar 09 13:58:00 crc kubenswrapper[4764]: I0309 13:58:00.204598 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-4s7mr" Mar 09 13:58:00 crc kubenswrapper[4764]: I0309 13:58:00.205831 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 09 13:58:00 crc kubenswrapper[4764]: I0309 13:58:00.205833 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 09 13:58:00 crc kubenswrapper[4764]: I0309 13:58:00.207238 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551078-z7ms2"] Mar 09 13:58:00 crc kubenswrapper[4764]: I0309 13:58:00.312266 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fh64m\" (UniqueName: \"kubernetes.io/projected/f26fbbd6-fe1a-4ca6-82a8-e425edc3d3d9-kube-api-access-fh64m\") pod \"auto-csr-approver-29551078-z7ms2\" (UID: \"f26fbbd6-fe1a-4ca6-82a8-e425edc3d3d9\") " pod="openshift-infra/auto-csr-approver-29551078-z7ms2" Mar 09 13:58:00 crc kubenswrapper[4764]: I0309 13:58:00.415041 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fh64m\" (UniqueName: \"kubernetes.io/projected/f26fbbd6-fe1a-4ca6-82a8-e425edc3d3d9-kube-api-access-fh64m\") pod \"auto-csr-approver-29551078-z7ms2\" (UID: \"f26fbbd6-fe1a-4ca6-82a8-e425edc3d3d9\") " pod="openshift-infra/auto-csr-approver-29551078-z7ms2" Mar 09 13:58:00 crc kubenswrapper[4764]: I0309 13:58:00.434093 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fh64m\" (UniqueName: \"kubernetes.io/projected/f26fbbd6-fe1a-4ca6-82a8-e425edc3d3d9-kube-api-access-fh64m\") pod \"auto-csr-approver-29551078-z7ms2\" (UID: \"f26fbbd6-fe1a-4ca6-82a8-e425edc3d3d9\") " 
pod="openshift-infra/auto-csr-approver-29551078-z7ms2" Mar 09 13:58:00 crc kubenswrapper[4764]: I0309 13:58:00.510016 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-trkg2" event={"ID":"5141dc8e-4ab7-4488-8ad4-8af7f8c66dcd","Type":"ContainerStarted","Data":"f10ba1ca1d3dcf319b1b4430ee734cd598ce5903404c6d034a09dcda8740d33c"} Mar 09 13:58:00 crc kubenswrapper[4764]: I0309 13:58:00.510077 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-trkg2" event={"ID":"5141dc8e-4ab7-4488-8ad4-8af7f8c66dcd","Type":"ContainerStarted","Data":"29f661a48fc5aac47d0eb4e904f376b751a090d605e3f0814a502912e09baee7"} Mar 09 13:58:00 crc kubenswrapper[4764]: I0309 13:58:00.528507 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-trkg2" podStartSLOduration=1.949712946 podStartE2EDuration="2.528485579s" podCreationTimestamp="2026-03-09 13:57:58 +0000 UTC" firstStartedPulling="2026-03-09 13:57:59.522734424 +0000 UTC m=+2234.772906332" lastFinishedPulling="2026-03-09 13:58:00.101507047 +0000 UTC m=+2235.351678965" observedRunningTime="2026-03-09 13:58:00.525789507 +0000 UTC m=+2235.775961415" watchObservedRunningTime="2026-03-09 13:58:00.528485579 +0000 UTC m=+2235.778657487" Mar 09 13:58:00 crc kubenswrapper[4764]: I0309 13:58:00.575681 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551078-z7ms2" Mar 09 13:58:01 crc kubenswrapper[4764]: I0309 13:58:01.025378 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551078-z7ms2"] Mar 09 13:58:01 crc kubenswrapper[4764]: W0309 13:58:01.035018 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf26fbbd6_fe1a_4ca6_82a8_e425edc3d3d9.slice/crio-185c831b6a62c35a276639340a74c1d8f2e9196ce4cf803064d20712a6e4370a WatchSource:0}: Error finding container 185c831b6a62c35a276639340a74c1d8f2e9196ce4cf803064d20712a6e4370a: Status 404 returned error can't find the container with id 185c831b6a62c35a276639340a74c1d8f2e9196ce4cf803064d20712a6e4370a Mar 09 13:58:01 crc kubenswrapper[4764]: I0309 13:58:01.522792 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551078-z7ms2" event={"ID":"f26fbbd6-fe1a-4ca6-82a8-e425edc3d3d9","Type":"ContainerStarted","Data":"185c831b6a62c35a276639340a74c1d8f2e9196ce4cf803064d20712a6e4370a"} Mar 09 13:58:02 crc kubenswrapper[4764]: I0309 13:58:02.533042 4764 generic.go:334] "Generic (PLEG): container finished" podID="f26fbbd6-fe1a-4ca6-82a8-e425edc3d3d9" containerID="e98df59174ca147f240e577b9eb7747712c4ce30b3cb22cafa3c795a0cc708fe" exitCode=0 Mar 09 13:58:02 crc kubenswrapper[4764]: I0309 13:58:02.533159 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551078-z7ms2" event={"ID":"f26fbbd6-fe1a-4ca6-82a8-e425edc3d3d9","Type":"ContainerDied","Data":"e98df59174ca147f240e577b9eb7747712c4ce30b3cb22cafa3c795a0cc708fe"} Mar 09 13:58:04 crc kubenswrapper[4764]: I0309 13:58:04.116661 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551078-z7ms2" Mar 09 13:58:04 crc kubenswrapper[4764]: I0309 13:58:04.199905 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fh64m\" (UniqueName: \"kubernetes.io/projected/f26fbbd6-fe1a-4ca6-82a8-e425edc3d3d9-kube-api-access-fh64m\") pod \"f26fbbd6-fe1a-4ca6-82a8-e425edc3d3d9\" (UID: \"f26fbbd6-fe1a-4ca6-82a8-e425edc3d3d9\") " Mar 09 13:58:04 crc kubenswrapper[4764]: I0309 13:58:04.206701 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f26fbbd6-fe1a-4ca6-82a8-e425edc3d3d9-kube-api-access-fh64m" (OuterVolumeSpecName: "kube-api-access-fh64m") pod "f26fbbd6-fe1a-4ca6-82a8-e425edc3d3d9" (UID: "f26fbbd6-fe1a-4ca6-82a8-e425edc3d3d9"). InnerVolumeSpecName "kube-api-access-fh64m". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:58:04 crc kubenswrapper[4764]: I0309 13:58:04.302741 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fh64m\" (UniqueName: \"kubernetes.io/projected/f26fbbd6-fe1a-4ca6-82a8-e425edc3d3d9-kube-api-access-fh64m\") on node \"crc\" DevicePath \"\"" Mar 09 13:58:04 crc kubenswrapper[4764]: I0309 13:58:04.557606 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551078-z7ms2" event={"ID":"f26fbbd6-fe1a-4ca6-82a8-e425edc3d3d9","Type":"ContainerDied","Data":"185c831b6a62c35a276639340a74c1d8f2e9196ce4cf803064d20712a6e4370a"} Mar 09 13:58:04 crc kubenswrapper[4764]: I0309 13:58:04.557730 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="185c831b6a62c35a276639340a74c1d8f2e9196ce4cf803064d20712a6e4370a" Mar 09 13:58:04 crc kubenswrapper[4764]: I0309 13:58:04.557733 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551078-z7ms2" Mar 09 13:58:05 crc kubenswrapper[4764]: I0309 13:58:05.201915 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29551072-2q8rg"] Mar 09 13:58:05 crc kubenswrapper[4764]: I0309 13:58:05.219637 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29551072-2q8rg"] Mar 09 13:58:05 crc kubenswrapper[4764]: I0309 13:58:05.571753 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f422975f-b0ee-4ef9-be32-3aac0003a54d" path="/var/lib/kubelet/pods/f422975f-b0ee-4ef9-be32-3aac0003a54d/volumes" Mar 09 13:58:08 crc kubenswrapper[4764]: I0309 13:58:08.720223 4764 scope.go:117] "RemoveContainer" containerID="c5badba421cb7b57d824a0a46932addc59fbe992b12c15d75cb49304a00e2761" Mar 09 13:58:08 crc kubenswrapper[4764]: I0309 13:58:08.763575 4764 scope.go:117] "RemoveContainer" containerID="860e53941ce3fa2c04ed10b54cb5f6fa59f9d4631186670908ce0c966139e37f" Mar 09 13:58:08 crc kubenswrapper[4764]: I0309 13:58:08.816979 4764 scope.go:117] "RemoveContainer" containerID="b589f4b8fd5031689b4581120afd3a37af8b44cf26e7b85ac17d85bceae4a6f2" Mar 09 13:58:08 crc kubenswrapper[4764]: I0309 13:58:08.858787 4764 scope.go:117] "RemoveContainer" containerID="a382abef431f7e8dce93c27f5f5efe18631bdeb380db25088eb96acfd690b493" Mar 09 13:58:12 crc kubenswrapper[4764]: I0309 13:58:12.760230 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-f9bgf"] Mar 09 13:58:12 crc kubenswrapper[4764]: E0309 13:58:12.761225 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f26fbbd6-fe1a-4ca6-82a8-e425edc3d3d9" containerName="oc" Mar 09 13:58:12 crc kubenswrapper[4764]: I0309 13:58:12.761244 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="f26fbbd6-fe1a-4ca6-82a8-e425edc3d3d9" containerName="oc" Mar 09 13:58:12 crc kubenswrapper[4764]: I0309 13:58:12.761489 
4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="f26fbbd6-fe1a-4ca6-82a8-e425edc3d3d9" containerName="oc" Mar 09 13:58:12 crc kubenswrapper[4764]: I0309 13:58:12.785263 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-f9bgf" Mar 09 13:58:12 crc kubenswrapper[4764]: I0309 13:58:12.825348 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-f9bgf"] Mar 09 13:58:12 crc kubenswrapper[4764]: I0309 13:58:12.890799 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gcfsx\" (UniqueName: \"kubernetes.io/projected/1931244b-286c-4ad0-88f6-8377df60b155-kube-api-access-gcfsx\") pod \"community-operators-f9bgf\" (UID: \"1931244b-286c-4ad0-88f6-8377df60b155\") " pod="openshift-marketplace/community-operators-f9bgf" Mar 09 13:58:12 crc kubenswrapper[4764]: I0309 13:58:12.890887 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1931244b-286c-4ad0-88f6-8377df60b155-utilities\") pod \"community-operators-f9bgf\" (UID: \"1931244b-286c-4ad0-88f6-8377df60b155\") " pod="openshift-marketplace/community-operators-f9bgf" Mar 09 13:58:12 crc kubenswrapper[4764]: I0309 13:58:12.890908 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1931244b-286c-4ad0-88f6-8377df60b155-catalog-content\") pod \"community-operators-f9bgf\" (UID: \"1931244b-286c-4ad0-88f6-8377df60b155\") " pod="openshift-marketplace/community-operators-f9bgf" Mar 09 13:58:12 crc kubenswrapper[4764]: I0309 13:58:12.992978 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1931244b-286c-4ad0-88f6-8377df60b155-utilities\") pod 
\"community-operators-f9bgf\" (UID: \"1931244b-286c-4ad0-88f6-8377df60b155\") " pod="openshift-marketplace/community-operators-f9bgf" Mar 09 13:58:12 crc kubenswrapper[4764]: I0309 13:58:12.993045 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1931244b-286c-4ad0-88f6-8377df60b155-catalog-content\") pod \"community-operators-f9bgf\" (UID: \"1931244b-286c-4ad0-88f6-8377df60b155\") " pod="openshift-marketplace/community-operators-f9bgf" Mar 09 13:58:12 crc kubenswrapper[4764]: I0309 13:58:12.993253 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gcfsx\" (UniqueName: \"kubernetes.io/projected/1931244b-286c-4ad0-88f6-8377df60b155-kube-api-access-gcfsx\") pod \"community-operators-f9bgf\" (UID: \"1931244b-286c-4ad0-88f6-8377df60b155\") " pod="openshift-marketplace/community-operators-f9bgf" Mar 09 13:58:12 crc kubenswrapper[4764]: I0309 13:58:12.993791 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1931244b-286c-4ad0-88f6-8377df60b155-utilities\") pod \"community-operators-f9bgf\" (UID: \"1931244b-286c-4ad0-88f6-8377df60b155\") " pod="openshift-marketplace/community-operators-f9bgf" Mar 09 13:58:12 crc kubenswrapper[4764]: I0309 13:58:12.993968 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1931244b-286c-4ad0-88f6-8377df60b155-catalog-content\") pod \"community-operators-f9bgf\" (UID: \"1931244b-286c-4ad0-88f6-8377df60b155\") " pod="openshift-marketplace/community-operators-f9bgf" Mar 09 13:58:13 crc kubenswrapper[4764]: I0309 13:58:13.022911 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gcfsx\" (UniqueName: \"kubernetes.io/projected/1931244b-286c-4ad0-88f6-8377df60b155-kube-api-access-gcfsx\") pod 
\"community-operators-f9bgf\" (UID: \"1931244b-286c-4ad0-88f6-8377df60b155\") " pod="openshift-marketplace/community-operators-f9bgf" Mar 09 13:58:13 crc kubenswrapper[4764]: I0309 13:58:13.128295 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-f9bgf" Mar 09 13:58:13 crc kubenswrapper[4764]: I0309 13:58:13.714216 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-f9bgf"] Mar 09 13:58:14 crc kubenswrapper[4764]: I0309 13:58:14.666712 4764 generic.go:334] "Generic (PLEG): container finished" podID="1931244b-286c-4ad0-88f6-8377df60b155" containerID="9c4e5d81c1d20062b0e3343e0f1ab702fc7945a117dd27b6ebbcbe5d4c9462db" exitCode=0 Mar 09 13:58:14 crc kubenswrapper[4764]: I0309 13:58:14.666783 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-f9bgf" event={"ID":"1931244b-286c-4ad0-88f6-8377df60b155","Type":"ContainerDied","Data":"9c4e5d81c1d20062b0e3343e0f1ab702fc7945a117dd27b6ebbcbe5d4c9462db"} Mar 09 13:58:14 crc kubenswrapper[4764]: I0309 13:58:14.667038 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-f9bgf" event={"ID":"1931244b-286c-4ad0-88f6-8377df60b155","Type":"ContainerStarted","Data":"5a07e8c2852fbb585a65409f1e4c8c4d2e35c904ba61adc7668c8876210cbafa"} Mar 09 13:58:15 crc kubenswrapper[4764]: I0309 13:58:15.678093 4764 generic.go:334] "Generic (PLEG): container finished" podID="1931244b-286c-4ad0-88f6-8377df60b155" containerID="b0c90e0f9557a0ca138f4428d931980ec160ff5828cc0e82cda4301e32feb3a6" exitCode=0 Mar 09 13:58:15 crc kubenswrapper[4764]: I0309 13:58:15.678157 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-f9bgf" event={"ID":"1931244b-286c-4ad0-88f6-8377df60b155","Type":"ContainerDied","Data":"b0c90e0f9557a0ca138f4428d931980ec160ff5828cc0e82cda4301e32feb3a6"} Mar 09 13:58:15 crc 
kubenswrapper[4764]: I0309 13:58:15.952076 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-9ffpb"] Mar 09 13:58:15 crc kubenswrapper[4764]: I0309 13:58:15.954681 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-9ffpb" Mar 09 13:58:15 crc kubenswrapper[4764]: I0309 13:58:15.975617 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-9ffpb"] Mar 09 13:58:16 crc kubenswrapper[4764]: I0309 13:58:16.058673 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-57mcq\" (UniqueName: \"kubernetes.io/projected/fc434dcc-281c-4972-8abf-f1353e818c92-kube-api-access-57mcq\") pod \"redhat-operators-9ffpb\" (UID: \"fc434dcc-281c-4972-8abf-f1353e818c92\") " pod="openshift-marketplace/redhat-operators-9ffpb" Mar 09 13:58:16 crc kubenswrapper[4764]: I0309 13:58:16.058728 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fc434dcc-281c-4972-8abf-f1353e818c92-utilities\") pod \"redhat-operators-9ffpb\" (UID: \"fc434dcc-281c-4972-8abf-f1353e818c92\") " pod="openshift-marketplace/redhat-operators-9ffpb" Mar 09 13:58:16 crc kubenswrapper[4764]: I0309 13:58:16.058766 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fc434dcc-281c-4972-8abf-f1353e818c92-catalog-content\") pod \"redhat-operators-9ffpb\" (UID: \"fc434dcc-281c-4972-8abf-f1353e818c92\") " pod="openshift-marketplace/redhat-operators-9ffpb" Mar 09 13:58:16 crc kubenswrapper[4764]: I0309 13:58:16.160362 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fc434dcc-281c-4972-8abf-f1353e818c92-catalog-content\") pod 
\"redhat-operators-9ffpb\" (UID: \"fc434dcc-281c-4972-8abf-f1353e818c92\") " pod="openshift-marketplace/redhat-operators-9ffpb" Mar 09 13:58:16 crc kubenswrapper[4764]: I0309 13:58:16.160568 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-57mcq\" (UniqueName: \"kubernetes.io/projected/fc434dcc-281c-4972-8abf-f1353e818c92-kube-api-access-57mcq\") pod \"redhat-operators-9ffpb\" (UID: \"fc434dcc-281c-4972-8abf-f1353e818c92\") " pod="openshift-marketplace/redhat-operators-9ffpb" Mar 09 13:58:16 crc kubenswrapper[4764]: I0309 13:58:16.160599 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fc434dcc-281c-4972-8abf-f1353e818c92-utilities\") pod \"redhat-operators-9ffpb\" (UID: \"fc434dcc-281c-4972-8abf-f1353e818c92\") " pod="openshift-marketplace/redhat-operators-9ffpb" Mar 09 13:58:16 crc kubenswrapper[4764]: I0309 13:58:16.161145 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fc434dcc-281c-4972-8abf-f1353e818c92-utilities\") pod \"redhat-operators-9ffpb\" (UID: \"fc434dcc-281c-4972-8abf-f1353e818c92\") " pod="openshift-marketplace/redhat-operators-9ffpb" Mar 09 13:58:16 crc kubenswrapper[4764]: I0309 13:58:16.161158 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fc434dcc-281c-4972-8abf-f1353e818c92-catalog-content\") pod \"redhat-operators-9ffpb\" (UID: \"fc434dcc-281c-4972-8abf-f1353e818c92\") " pod="openshift-marketplace/redhat-operators-9ffpb" Mar 09 13:58:16 crc kubenswrapper[4764]: I0309 13:58:16.188345 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-57mcq\" (UniqueName: \"kubernetes.io/projected/fc434dcc-281c-4972-8abf-f1353e818c92-kube-api-access-57mcq\") pod \"redhat-operators-9ffpb\" (UID: 
\"fc434dcc-281c-4972-8abf-f1353e818c92\") " pod="openshift-marketplace/redhat-operators-9ffpb" Mar 09 13:58:16 crc kubenswrapper[4764]: I0309 13:58:16.290075 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-9ffpb" Mar 09 13:58:16 crc kubenswrapper[4764]: I0309 13:58:16.717895 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-f9bgf" event={"ID":"1931244b-286c-4ad0-88f6-8377df60b155","Type":"ContainerStarted","Data":"3bafd88819c82083f9d265cba9c11ed09b324c27b40c3b4f93655a2d0ec404d1"} Mar 09 13:58:16 crc kubenswrapper[4764]: I0309 13:58:16.756456 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-f9bgf" podStartSLOduration=3.3328415590000002 podStartE2EDuration="4.756432112s" podCreationTimestamp="2026-03-09 13:58:12 +0000 UTC" firstStartedPulling="2026-03-09 13:58:14.66950032 +0000 UTC m=+2249.919672248" lastFinishedPulling="2026-03-09 13:58:16.093090893 +0000 UTC m=+2251.343262801" observedRunningTime="2026-03-09 13:58:16.750459443 +0000 UTC m=+2252.000631351" watchObservedRunningTime="2026-03-09 13:58:16.756432112 +0000 UTC m=+2252.006604030" Mar 09 13:58:16 crc kubenswrapper[4764]: I0309 13:58:16.839367 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-9ffpb"] Mar 09 13:58:17 crc kubenswrapper[4764]: I0309 13:58:17.737117 4764 generic.go:334] "Generic (PLEG): container finished" podID="fc434dcc-281c-4972-8abf-f1353e818c92" containerID="61a825d63b826dd28b590964b6ad42d2157024d995f261bf76f1dc00cd69059e" exitCode=0 Mar 09 13:58:17 crc kubenswrapper[4764]: I0309 13:58:17.737195 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9ffpb" event={"ID":"fc434dcc-281c-4972-8abf-f1353e818c92","Type":"ContainerDied","Data":"61a825d63b826dd28b590964b6ad42d2157024d995f261bf76f1dc00cd69059e"} Mar 
09 13:58:17 crc kubenswrapper[4764]: I0309 13:58:17.737242 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9ffpb" event={"ID":"fc434dcc-281c-4972-8abf-f1353e818c92","Type":"ContainerStarted","Data":"b93f8daca4444928beadd1d0399b3e6b30bb625dfbaf7a7e54bb81b9084ca36a"} Mar 09 13:58:18 crc kubenswrapper[4764]: I0309 13:58:18.748302 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9ffpb" event={"ID":"fc434dcc-281c-4972-8abf-f1353e818c92","Type":"ContainerStarted","Data":"a7e8d42d98e3d5f94cb4b2154da79aff60ec5bc3c48a9fbccd39a399f98f3e94"} Mar 09 13:58:22 crc kubenswrapper[4764]: I0309 13:58:22.785008 4764 generic.go:334] "Generic (PLEG): container finished" podID="fc434dcc-281c-4972-8abf-f1353e818c92" containerID="a7e8d42d98e3d5f94cb4b2154da79aff60ec5bc3c48a9fbccd39a399f98f3e94" exitCode=0 Mar 09 13:58:22 crc kubenswrapper[4764]: I0309 13:58:22.785145 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9ffpb" event={"ID":"fc434dcc-281c-4972-8abf-f1353e818c92","Type":"ContainerDied","Data":"a7e8d42d98e3d5f94cb4b2154da79aff60ec5bc3c48a9fbccd39a399f98f3e94"} Mar 09 13:58:23 crc kubenswrapper[4764]: I0309 13:58:23.138673 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-f9bgf" Mar 09 13:58:23 crc kubenswrapper[4764]: I0309 13:58:23.138763 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-f9bgf" Mar 09 13:58:23 crc kubenswrapper[4764]: I0309 13:58:23.198161 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-f9bgf" Mar 09 13:58:23 crc kubenswrapper[4764]: I0309 13:58:23.796070 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9ffpb" 
event={"ID":"fc434dcc-281c-4972-8abf-f1353e818c92","Type":"ContainerStarted","Data":"732763b672e5e098d63dd45c1ee15b7732366f951097a6849b5bfe81e2f7e6d0"} Mar 09 13:58:23 crc kubenswrapper[4764]: I0309 13:58:23.828818 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-9ffpb" podStartSLOduration=3.339945283 podStartE2EDuration="8.828789313s" podCreationTimestamp="2026-03-09 13:58:15 +0000 UTC" firstStartedPulling="2026-03-09 13:58:17.73944178 +0000 UTC m=+2252.989613698" lastFinishedPulling="2026-03-09 13:58:23.22828582 +0000 UTC m=+2258.478457728" observedRunningTime="2026-03-09 13:58:23.821514968 +0000 UTC m=+2259.071686896" watchObservedRunningTime="2026-03-09 13:58:23.828789313 +0000 UTC m=+2259.078961221" Mar 09 13:58:23 crc kubenswrapper[4764]: I0309 13:58:23.844492 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-f9bgf" Mar 09 13:58:24 crc kubenswrapper[4764]: I0309 13:58:24.807969 4764 generic.go:334] "Generic (PLEG): container finished" podID="5141dc8e-4ab7-4488-8ad4-8af7f8c66dcd" containerID="f10ba1ca1d3dcf319b1b4430ee734cd598ce5903404c6d034a09dcda8740d33c" exitCode=0 Mar 09 13:58:24 crc kubenswrapper[4764]: I0309 13:58:24.808085 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-trkg2" event={"ID":"5141dc8e-4ab7-4488-8ad4-8af7f8c66dcd","Type":"ContainerDied","Data":"f10ba1ca1d3dcf319b1b4430ee734cd598ce5903404c6d034a09dcda8740d33c"} Mar 09 13:58:25 crc kubenswrapper[4764]: I0309 13:58:25.148012 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-f9bgf"] Mar 09 13:58:25 crc kubenswrapper[4764]: I0309 13:58:25.816727 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-f9bgf" podUID="1931244b-286c-4ad0-88f6-8377df60b155" 
containerName="registry-server" containerID="cri-o://3bafd88819c82083f9d265cba9c11ed09b324c27b40c3b4f93655a2d0ec404d1" gracePeriod=2 Mar 09 13:58:26 crc kubenswrapper[4764]: I0309 13:58:26.291135 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-9ffpb" Mar 09 13:58:26 crc kubenswrapper[4764]: I0309 13:58:26.291726 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-9ffpb" Mar 09 13:58:26 crc kubenswrapper[4764]: I0309 13:58:26.377139 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-trkg2" Mar 09 13:58:26 crc kubenswrapper[4764]: I0309 13:58:26.384884 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-f9bgf" Mar 09 13:58:26 crc kubenswrapper[4764]: I0309 13:58:26.514095 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/5141dc8e-4ab7-4488-8ad4-8af7f8c66dcd-ceph\") pod \"5141dc8e-4ab7-4488-8ad4-8af7f8c66dcd\" (UID: \"5141dc8e-4ab7-4488-8ad4-8af7f8c66dcd\") " Mar 09 13:58:26 crc kubenswrapper[4764]: I0309 13:58:26.514379 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6j42q\" (UniqueName: \"kubernetes.io/projected/5141dc8e-4ab7-4488-8ad4-8af7f8c66dcd-kube-api-access-6j42q\") pod \"5141dc8e-4ab7-4488-8ad4-8af7f8c66dcd\" (UID: \"5141dc8e-4ab7-4488-8ad4-8af7f8c66dcd\") " Mar 09 13:58:26 crc kubenswrapper[4764]: I0309 13:58:26.514440 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gcfsx\" (UniqueName: \"kubernetes.io/projected/1931244b-286c-4ad0-88f6-8377df60b155-kube-api-access-gcfsx\") pod \"1931244b-286c-4ad0-88f6-8377df60b155\" (UID: \"1931244b-286c-4ad0-88f6-8377df60b155\") " Mar 09 
13:58:26 crc kubenswrapper[4764]: I0309 13:58:26.514470 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/5141dc8e-4ab7-4488-8ad4-8af7f8c66dcd-ssh-key-openstack-edpm-ipam\") pod \"5141dc8e-4ab7-4488-8ad4-8af7f8c66dcd\" (UID: \"5141dc8e-4ab7-4488-8ad4-8af7f8c66dcd\") " Mar 09 13:58:26 crc kubenswrapper[4764]: I0309 13:58:26.514497 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1931244b-286c-4ad0-88f6-8377df60b155-catalog-content\") pod \"1931244b-286c-4ad0-88f6-8377df60b155\" (UID: \"1931244b-286c-4ad0-88f6-8377df60b155\") " Mar 09 13:58:26 crc kubenswrapper[4764]: I0309 13:58:26.514520 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5141dc8e-4ab7-4488-8ad4-8af7f8c66dcd-inventory\") pod \"5141dc8e-4ab7-4488-8ad4-8af7f8c66dcd\" (UID: \"5141dc8e-4ab7-4488-8ad4-8af7f8c66dcd\") " Mar 09 13:58:26 crc kubenswrapper[4764]: I0309 13:58:26.514584 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1931244b-286c-4ad0-88f6-8377df60b155-utilities\") pod \"1931244b-286c-4ad0-88f6-8377df60b155\" (UID: \"1931244b-286c-4ad0-88f6-8377df60b155\") " Mar 09 13:58:26 crc kubenswrapper[4764]: I0309 13:58:26.515800 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1931244b-286c-4ad0-88f6-8377df60b155-utilities" (OuterVolumeSpecName: "utilities") pod "1931244b-286c-4ad0-88f6-8377df60b155" (UID: "1931244b-286c-4ad0-88f6-8377df60b155"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 13:58:26 crc kubenswrapper[4764]: I0309 13:58:26.521236 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1931244b-286c-4ad0-88f6-8377df60b155-kube-api-access-gcfsx" (OuterVolumeSpecName: "kube-api-access-gcfsx") pod "1931244b-286c-4ad0-88f6-8377df60b155" (UID: "1931244b-286c-4ad0-88f6-8377df60b155"). InnerVolumeSpecName "kube-api-access-gcfsx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:58:26 crc kubenswrapper[4764]: I0309 13:58:26.521350 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5141dc8e-4ab7-4488-8ad4-8af7f8c66dcd-ceph" (OuterVolumeSpecName: "ceph") pod "5141dc8e-4ab7-4488-8ad4-8af7f8c66dcd" (UID: "5141dc8e-4ab7-4488-8ad4-8af7f8c66dcd"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:58:26 crc kubenswrapper[4764]: I0309 13:58:26.522857 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5141dc8e-4ab7-4488-8ad4-8af7f8c66dcd-kube-api-access-6j42q" (OuterVolumeSpecName: "kube-api-access-6j42q") pod "5141dc8e-4ab7-4488-8ad4-8af7f8c66dcd" (UID: "5141dc8e-4ab7-4488-8ad4-8af7f8c66dcd"). InnerVolumeSpecName "kube-api-access-6j42q". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:58:26 crc kubenswrapper[4764]: I0309 13:58:26.545587 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5141dc8e-4ab7-4488-8ad4-8af7f8c66dcd-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "5141dc8e-4ab7-4488-8ad4-8af7f8c66dcd" (UID: "5141dc8e-4ab7-4488-8ad4-8af7f8c66dcd"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:58:26 crc kubenswrapper[4764]: I0309 13:58:26.546986 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5141dc8e-4ab7-4488-8ad4-8af7f8c66dcd-inventory" (OuterVolumeSpecName: "inventory") pod "5141dc8e-4ab7-4488-8ad4-8af7f8c66dcd" (UID: "5141dc8e-4ab7-4488-8ad4-8af7f8c66dcd"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:58:26 crc kubenswrapper[4764]: I0309 13:58:26.576446 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1931244b-286c-4ad0-88f6-8377df60b155-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1931244b-286c-4ad0-88f6-8377df60b155" (UID: "1931244b-286c-4ad0-88f6-8377df60b155"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 13:58:26 crc kubenswrapper[4764]: I0309 13:58:26.616556 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gcfsx\" (UniqueName: \"kubernetes.io/projected/1931244b-286c-4ad0-88f6-8377df60b155-kube-api-access-gcfsx\") on node \"crc\" DevicePath \"\"" Mar 09 13:58:26 crc kubenswrapper[4764]: I0309 13:58:26.616603 4764 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1931244b-286c-4ad0-88f6-8377df60b155-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 09 13:58:26 crc kubenswrapper[4764]: I0309 13:58:26.616613 4764 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/5141dc8e-4ab7-4488-8ad4-8af7f8c66dcd-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 09 13:58:26 crc kubenswrapper[4764]: I0309 13:58:26.616627 4764 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5141dc8e-4ab7-4488-8ad4-8af7f8c66dcd-inventory\") on node \"crc\" 
DevicePath \"\"" Mar 09 13:58:26 crc kubenswrapper[4764]: I0309 13:58:26.616730 4764 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1931244b-286c-4ad0-88f6-8377df60b155-utilities\") on node \"crc\" DevicePath \"\"" Mar 09 13:58:26 crc kubenswrapper[4764]: I0309 13:58:26.616746 4764 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/5141dc8e-4ab7-4488-8ad4-8af7f8c66dcd-ceph\") on node \"crc\" DevicePath \"\"" Mar 09 13:58:26 crc kubenswrapper[4764]: I0309 13:58:26.616757 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6j42q\" (UniqueName: \"kubernetes.io/projected/5141dc8e-4ab7-4488-8ad4-8af7f8c66dcd-kube-api-access-6j42q\") on node \"crc\" DevicePath \"\"" Mar 09 13:58:26 crc kubenswrapper[4764]: I0309 13:58:26.828944 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-trkg2" event={"ID":"5141dc8e-4ab7-4488-8ad4-8af7f8c66dcd","Type":"ContainerDied","Data":"29f661a48fc5aac47d0eb4e904f376b751a090d605e3f0814a502912e09baee7"} Mar 09 13:58:26 crc kubenswrapper[4764]: I0309 13:58:26.829397 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="29f661a48fc5aac47d0eb4e904f376b751a090d605e3f0814a502912e09baee7" Mar 09 13:58:26 crc kubenswrapper[4764]: I0309 13:58:26.829015 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-trkg2" Mar 09 13:58:26 crc kubenswrapper[4764]: I0309 13:58:26.831437 4764 generic.go:334] "Generic (PLEG): container finished" podID="1931244b-286c-4ad0-88f6-8377df60b155" containerID="3bafd88819c82083f9d265cba9c11ed09b324c27b40c3b4f93655a2d0ec404d1" exitCode=0 Mar 09 13:58:26 crc kubenswrapper[4764]: I0309 13:58:26.831516 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-f9bgf" event={"ID":"1931244b-286c-4ad0-88f6-8377df60b155","Type":"ContainerDied","Data":"3bafd88819c82083f9d265cba9c11ed09b324c27b40c3b4f93655a2d0ec404d1"} Mar 09 13:58:26 crc kubenswrapper[4764]: I0309 13:58:26.831560 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-f9bgf" event={"ID":"1931244b-286c-4ad0-88f6-8377df60b155","Type":"ContainerDied","Data":"5a07e8c2852fbb585a65409f1e4c8c4d2e35c904ba61adc7668c8876210cbafa"} Mar 09 13:58:26 crc kubenswrapper[4764]: I0309 13:58:26.831572 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-f9bgf" Mar 09 13:58:26 crc kubenswrapper[4764]: I0309 13:58:26.831579 4764 scope.go:117] "RemoveContainer" containerID="3bafd88819c82083f9d265cba9c11ed09b324c27b40c3b4f93655a2d0ec404d1" Mar 09 13:58:26 crc kubenswrapper[4764]: I0309 13:58:26.863038 4764 scope.go:117] "RemoveContainer" containerID="b0c90e0f9557a0ca138f4428d931980ec160ff5828cc0e82cda4301e32feb3a6" Mar 09 13:58:26 crc kubenswrapper[4764]: I0309 13:58:26.900072 4764 scope.go:117] "RemoveContainer" containerID="9c4e5d81c1d20062b0e3343e0f1ab702fc7945a117dd27b6ebbcbe5d4c9462db" Mar 09 13:58:26 crc kubenswrapper[4764]: I0309 13:58:26.905063 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-f9bgf"] Mar 09 13:58:26 crc kubenswrapper[4764]: I0309 13:58:26.914633 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-f9bgf"] Mar 09 13:58:26 crc kubenswrapper[4764]: I0309 13:58:26.926951 4764 scope.go:117] "RemoveContainer" containerID="3bafd88819c82083f9d265cba9c11ed09b324c27b40c3b4f93655a2d0ec404d1" Mar 09 13:58:26 crc kubenswrapper[4764]: E0309 13:58:26.930985 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3bafd88819c82083f9d265cba9c11ed09b324c27b40c3b4f93655a2d0ec404d1\": container with ID starting with 3bafd88819c82083f9d265cba9c11ed09b324c27b40c3b4f93655a2d0ec404d1 not found: ID does not exist" containerID="3bafd88819c82083f9d265cba9c11ed09b324c27b40c3b4f93655a2d0ec404d1" Mar 09 13:58:26 crc kubenswrapper[4764]: I0309 13:58:26.931050 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3bafd88819c82083f9d265cba9c11ed09b324c27b40c3b4f93655a2d0ec404d1"} err="failed to get container status \"3bafd88819c82083f9d265cba9c11ed09b324c27b40c3b4f93655a2d0ec404d1\": rpc error: code = NotFound desc = could not find 
container \"3bafd88819c82083f9d265cba9c11ed09b324c27b40c3b4f93655a2d0ec404d1\": container with ID starting with 3bafd88819c82083f9d265cba9c11ed09b324c27b40c3b4f93655a2d0ec404d1 not found: ID does not exist" Mar 09 13:58:26 crc kubenswrapper[4764]: I0309 13:58:26.931092 4764 scope.go:117] "RemoveContainer" containerID="b0c90e0f9557a0ca138f4428d931980ec160ff5828cc0e82cda4301e32feb3a6" Mar 09 13:58:26 crc kubenswrapper[4764]: E0309 13:58:26.931469 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b0c90e0f9557a0ca138f4428d931980ec160ff5828cc0e82cda4301e32feb3a6\": container with ID starting with b0c90e0f9557a0ca138f4428d931980ec160ff5828cc0e82cda4301e32feb3a6 not found: ID does not exist" containerID="b0c90e0f9557a0ca138f4428d931980ec160ff5828cc0e82cda4301e32feb3a6" Mar 09 13:58:26 crc kubenswrapper[4764]: I0309 13:58:26.931499 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b0c90e0f9557a0ca138f4428d931980ec160ff5828cc0e82cda4301e32feb3a6"} err="failed to get container status \"b0c90e0f9557a0ca138f4428d931980ec160ff5828cc0e82cda4301e32feb3a6\": rpc error: code = NotFound desc = could not find container \"b0c90e0f9557a0ca138f4428d931980ec160ff5828cc0e82cda4301e32feb3a6\": container with ID starting with b0c90e0f9557a0ca138f4428d931980ec160ff5828cc0e82cda4301e32feb3a6 not found: ID does not exist" Mar 09 13:58:26 crc kubenswrapper[4764]: I0309 13:58:26.931516 4764 scope.go:117] "RemoveContainer" containerID="9c4e5d81c1d20062b0e3343e0f1ab702fc7945a117dd27b6ebbcbe5d4c9462db" Mar 09 13:58:26 crc kubenswrapper[4764]: E0309 13:58:26.932990 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9c4e5d81c1d20062b0e3343e0f1ab702fc7945a117dd27b6ebbcbe5d4c9462db\": container with ID starting with 9c4e5d81c1d20062b0e3343e0f1ab702fc7945a117dd27b6ebbcbe5d4c9462db not found: ID does 
not exist" containerID="9c4e5d81c1d20062b0e3343e0f1ab702fc7945a117dd27b6ebbcbe5d4c9462db" Mar 09 13:58:26 crc kubenswrapper[4764]: I0309 13:58:26.933023 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9c4e5d81c1d20062b0e3343e0f1ab702fc7945a117dd27b6ebbcbe5d4c9462db"} err="failed to get container status \"9c4e5d81c1d20062b0e3343e0f1ab702fc7945a117dd27b6ebbcbe5d4c9462db\": rpc error: code = NotFound desc = could not find container \"9c4e5d81c1d20062b0e3343e0f1ab702fc7945a117dd27b6ebbcbe5d4c9462db\": container with ID starting with 9c4e5d81c1d20062b0e3343e0f1ab702fc7945a117dd27b6ebbcbe5d4c9462db not found: ID does not exist" Mar 09 13:58:26 crc kubenswrapper[4764]: I0309 13:58:26.944841 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-jhs5m"] Mar 09 13:58:26 crc kubenswrapper[4764]: E0309 13:58:26.945417 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1931244b-286c-4ad0-88f6-8377df60b155" containerName="extract-content" Mar 09 13:58:26 crc kubenswrapper[4764]: I0309 13:58:26.945437 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="1931244b-286c-4ad0-88f6-8377df60b155" containerName="extract-content" Mar 09 13:58:26 crc kubenswrapper[4764]: E0309 13:58:26.945456 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1931244b-286c-4ad0-88f6-8377df60b155" containerName="extract-utilities" Mar 09 13:58:26 crc kubenswrapper[4764]: I0309 13:58:26.945463 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="1931244b-286c-4ad0-88f6-8377df60b155" containerName="extract-utilities" Mar 09 13:58:26 crc kubenswrapper[4764]: E0309 13:58:26.945479 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1931244b-286c-4ad0-88f6-8377df60b155" containerName="registry-server" Mar 09 13:58:26 crc kubenswrapper[4764]: I0309 13:58:26.945485 4764 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="1931244b-286c-4ad0-88f6-8377df60b155" containerName="registry-server" Mar 09 13:58:26 crc kubenswrapper[4764]: E0309 13:58:26.945505 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5141dc8e-4ab7-4488-8ad4-8af7f8c66dcd" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Mar 09 13:58:26 crc kubenswrapper[4764]: I0309 13:58:26.945512 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="5141dc8e-4ab7-4488-8ad4-8af7f8c66dcd" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Mar 09 13:58:26 crc kubenswrapper[4764]: I0309 13:58:26.945712 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="5141dc8e-4ab7-4488-8ad4-8af7f8c66dcd" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Mar 09 13:58:26 crc kubenswrapper[4764]: I0309 13:58:26.945729 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="1931244b-286c-4ad0-88f6-8377df60b155" containerName="registry-server" Mar 09 13:58:26 crc kubenswrapper[4764]: I0309 13:58:26.946526 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-jhs5m" Mar 09 13:58:26 crc kubenswrapper[4764]: I0309 13:58:26.949982 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 09 13:58:26 crc kubenswrapper[4764]: I0309 13:58:26.950396 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Mar 09 13:58:26 crc kubenswrapper[4764]: I0309 13:58:26.950584 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 09 13:58:26 crc kubenswrapper[4764]: I0309 13:58:26.951048 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 09 13:58:26 crc kubenswrapper[4764]: I0309 13:58:26.952036 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-ctrj8" Mar 09 13:58:26 crc kubenswrapper[4764]: I0309 13:58:26.956430 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-jhs5m"] Mar 09 13:58:27 crc kubenswrapper[4764]: I0309 13:58:27.028315 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/93b0ad6c-7720-4b43-b65c-83b7b7a8c3ab-ceph\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-jhs5m\" (UID: \"93b0ad6c-7720-4b43-b65c-83b7b7a8c3ab\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-jhs5m" Mar 09 13:58:27 crc kubenswrapper[4764]: I0309 13:58:27.028381 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/93b0ad6c-7720-4b43-b65c-83b7b7a8c3ab-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-jhs5m\" (UID: 
\"93b0ad6c-7720-4b43-b65c-83b7b7a8c3ab\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-jhs5m" Mar 09 13:58:27 crc kubenswrapper[4764]: I0309 13:58:27.028633 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pzwdg\" (UniqueName: \"kubernetes.io/projected/93b0ad6c-7720-4b43-b65c-83b7b7a8c3ab-kube-api-access-pzwdg\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-jhs5m\" (UID: \"93b0ad6c-7720-4b43-b65c-83b7b7a8c3ab\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-jhs5m" Mar 09 13:58:27 crc kubenswrapper[4764]: I0309 13:58:27.028857 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/93b0ad6c-7720-4b43-b65c-83b7b7a8c3ab-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-jhs5m\" (UID: \"93b0ad6c-7720-4b43-b65c-83b7b7a8c3ab\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-jhs5m" Mar 09 13:58:27 crc kubenswrapper[4764]: I0309 13:58:27.130533 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/93b0ad6c-7720-4b43-b65c-83b7b7a8c3ab-ceph\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-jhs5m\" (UID: \"93b0ad6c-7720-4b43-b65c-83b7b7a8c3ab\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-jhs5m" Mar 09 13:58:27 crc kubenswrapper[4764]: I0309 13:58:27.130619 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/93b0ad6c-7720-4b43-b65c-83b7b7a8c3ab-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-jhs5m\" (UID: \"93b0ad6c-7720-4b43-b65c-83b7b7a8c3ab\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-jhs5m" Mar 09 13:58:27 crc kubenswrapper[4764]: 
I0309 13:58:27.130709 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pzwdg\" (UniqueName: \"kubernetes.io/projected/93b0ad6c-7720-4b43-b65c-83b7b7a8c3ab-kube-api-access-pzwdg\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-jhs5m\" (UID: \"93b0ad6c-7720-4b43-b65c-83b7b7a8c3ab\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-jhs5m" Mar 09 13:58:27 crc kubenswrapper[4764]: I0309 13:58:27.130831 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/93b0ad6c-7720-4b43-b65c-83b7b7a8c3ab-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-jhs5m\" (UID: \"93b0ad6c-7720-4b43-b65c-83b7b7a8c3ab\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-jhs5m" Mar 09 13:58:27 crc kubenswrapper[4764]: I0309 13:58:27.136187 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/93b0ad6c-7720-4b43-b65c-83b7b7a8c3ab-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-jhs5m\" (UID: \"93b0ad6c-7720-4b43-b65c-83b7b7a8c3ab\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-jhs5m" Mar 09 13:58:27 crc kubenswrapper[4764]: I0309 13:58:27.136268 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/93b0ad6c-7720-4b43-b65c-83b7b7a8c3ab-ceph\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-jhs5m\" (UID: \"93b0ad6c-7720-4b43-b65c-83b7b7a8c3ab\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-jhs5m" Mar 09 13:58:27 crc kubenswrapper[4764]: I0309 13:58:27.140261 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/93b0ad6c-7720-4b43-b65c-83b7b7a8c3ab-ssh-key-openstack-edpm-ipam\") pod 
\"validate-network-edpm-deployment-openstack-edpm-ipam-jhs5m\" (UID: \"93b0ad6c-7720-4b43-b65c-83b7b7a8c3ab\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-jhs5m" Mar 09 13:58:27 crc kubenswrapper[4764]: I0309 13:58:27.155165 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pzwdg\" (UniqueName: \"kubernetes.io/projected/93b0ad6c-7720-4b43-b65c-83b7b7a8c3ab-kube-api-access-pzwdg\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-jhs5m\" (UID: \"93b0ad6c-7720-4b43-b65c-83b7b7a8c3ab\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-jhs5m" Mar 09 13:58:27 crc kubenswrapper[4764]: I0309 13:58:27.308174 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-jhs5m" Mar 09 13:58:27 crc kubenswrapper[4764]: I0309 13:58:27.337905 4764 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-9ffpb" podUID="fc434dcc-281c-4972-8abf-f1353e818c92" containerName="registry-server" probeResult="failure" output=< Mar 09 13:58:27 crc kubenswrapper[4764]: timeout: failed to connect service ":50051" within 1s Mar 09 13:58:27 crc kubenswrapper[4764]: > Mar 09 13:58:27 crc kubenswrapper[4764]: I0309 13:58:27.607068 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1931244b-286c-4ad0-88f6-8377df60b155" path="/var/lib/kubelet/pods/1931244b-286c-4ad0-88f6-8377df60b155/volumes" Mar 09 13:58:27 crc kubenswrapper[4764]: I0309 13:58:27.920105 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-jhs5m"] Mar 09 13:58:27 crc kubenswrapper[4764]: W0309 13:58:27.925015 4764 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod93b0ad6c_7720_4b43_b65c_83b7b7a8c3ab.slice/crio-86eca36a2fd702dd87d457eb0f60f37cd82a4160e83acda801e0cfe2b332e2c9 WatchSource:0}: Error finding container 86eca36a2fd702dd87d457eb0f60f37cd82a4160e83acda801e0cfe2b332e2c9: Status 404 returned error can't find the container with id 86eca36a2fd702dd87d457eb0f60f37cd82a4160e83acda801e0cfe2b332e2c9 Mar 09 13:58:28 crc kubenswrapper[4764]: I0309 13:58:28.370078 4764 patch_prober.go:28] interesting pod/machine-config-daemon-xxczl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 09 13:58:28 crc kubenswrapper[4764]: I0309 13:58:28.370415 4764 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 09 13:58:28 crc kubenswrapper[4764]: I0309 13:58:28.855252 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-jhs5m" event={"ID":"93b0ad6c-7720-4b43-b65c-83b7b7a8c3ab","Type":"ContainerStarted","Data":"68b7fc8b415d7fd18af743d52f845a521dd96c27435d9abcf4b8b98fd1b7cbb5"} Mar 09 13:58:28 crc kubenswrapper[4764]: I0309 13:58:28.855627 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-jhs5m" event={"ID":"93b0ad6c-7720-4b43-b65c-83b7b7a8c3ab","Type":"ContainerStarted","Data":"86eca36a2fd702dd87d457eb0f60f37cd82a4160e83acda801e0cfe2b332e2c9"} Mar 09 13:58:28 crc kubenswrapper[4764]: I0309 13:58:28.883975 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-jhs5m" podStartSLOduration=2.451100662 podStartE2EDuration="2.88395125s" podCreationTimestamp="2026-03-09 13:58:26 +0000 UTC" firstStartedPulling="2026-03-09 13:58:27.932352631 +0000 UTC m=+2263.182524539" lastFinishedPulling="2026-03-09 13:58:28.365203219 +0000 UTC m=+2263.615375127" observedRunningTime="2026-03-09 13:58:28.875954417 +0000 UTC m=+2264.126126335" watchObservedRunningTime="2026-03-09 13:58:28.88395125 +0000 UTC m=+2264.134123168" Mar 09 13:58:33 crc kubenswrapper[4764]: I0309 13:58:33.902476 4764 generic.go:334] "Generic (PLEG): container finished" podID="93b0ad6c-7720-4b43-b65c-83b7b7a8c3ab" containerID="68b7fc8b415d7fd18af743d52f845a521dd96c27435d9abcf4b8b98fd1b7cbb5" exitCode=0 Mar 09 13:58:33 crc kubenswrapper[4764]: I0309 13:58:33.902711 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-jhs5m" event={"ID":"93b0ad6c-7720-4b43-b65c-83b7b7a8c3ab","Type":"ContainerDied","Data":"68b7fc8b415d7fd18af743d52f845a521dd96c27435d9abcf4b8b98fd1b7cbb5"} Mar 09 13:58:35 crc kubenswrapper[4764]: I0309 13:58:35.387531 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-jhs5m" Mar 09 13:58:35 crc kubenswrapper[4764]: I0309 13:58:35.583333 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/93b0ad6c-7720-4b43-b65c-83b7b7a8c3ab-inventory\") pod \"93b0ad6c-7720-4b43-b65c-83b7b7a8c3ab\" (UID: \"93b0ad6c-7720-4b43-b65c-83b7b7a8c3ab\") " Mar 09 13:58:35 crc kubenswrapper[4764]: I0309 13:58:35.583492 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/93b0ad6c-7720-4b43-b65c-83b7b7a8c3ab-ssh-key-openstack-edpm-ipam\") pod \"93b0ad6c-7720-4b43-b65c-83b7b7a8c3ab\" (UID: \"93b0ad6c-7720-4b43-b65c-83b7b7a8c3ab\") " Mar 09 13:58:35 crc kubenswrapper[4764]: I0309 13:58:35.583580 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/93b0ad6c-7720-4b43-b65c-83b7b7a8c3ab-ceph\") pod \"93b0ad6c-7720-4b43-b65c-83b7b7a8c3ab\" (UID: \"93b0ad6c-7720-4b43-b65c-83b7b7a8c3ab\") " Mar 09 13:58:35 crc kubenswrapper[4764]: I0309 13:58:35.583663 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pzwdg\" (UniqueName: \"kubernetes.io/projected/93b0ad6c-7720-4b43-b65c-83b7b7a8c3ab-kube-api-access-pzwdg\") pod \"93b0ad6c-7720-4b43-b65c-83b7b7a8c3ab\" (UID: \"93b0ad6c-7720-4b43-b65c-83b7b7a8c3ab\") " Mar 09 13:58:35 crc kubenswrapper[4764]: I0309 13:58:35.602993 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/93b0ad6c-7720-4b43-b65c-83b7b7a8c3ab-ceph" (OuterVolumeSpecName: "ceph") pod "93b0ad6c-7720-4b43-b65c-83b7b7a8c3ab" (UID: "93b0ad6c-7720-4b43-b65c-83b7b7a8c3ab"). InnerVolumeSpecName "ceph". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:58:35 crc kubenswrapper[4764]: I0309 13:58:35.603969 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/93b0ad6c-7720-4b43-b65c-83b7b7a8c3ab-kube-api-access-pzwdg" (OuterVolumeSpecName: "kube-api-access-pzwdg") pod "93b0ad6c-7720-4b43-b65c-83b7b7a8c3ab" (UID: "93b0ad6c-7720-4b43-b65c-83b7b7a8c3ab"). InnerVolumeSpecName "kube-api-access-pzwdg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:58:35 crc kubenswrapper[4764]: I0309 13:58:35.614540 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/93b0ad6c-7720-4b43-b65c-83b7b7a8c3ab-inventory" (OuterVolumeSpecName: "inventory") pod "93b0ad6c-7720-4b43-b65c-83b7b7a8c3ab" (UID: "93b0ad6c-7720-4b43-b65c-83b7b7a8c3ab"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:58:35 crc kubenswrapper[4764]: I0309 13:58:35.617931 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/93b0ad6c-7720-4b43-b65c-83b7b7a8c3ab-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "93b0ad6c-7720-4b43-b65c-83b7b7a8c3ab" (UID: "93b0ad6c-7720-4b43-b65c-83b7b7a8c3ab"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:58:35 crc kubenswrapper[4764]: I0309 13:58:35.687267 4764 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/93b0ad6c-7720-4b43-b65c-83b7b7a8c3ab-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 09 13:58:35 crc kubenswrapper[4764]: I0309 13:58:35.687334 4764 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/93b0ad6c-7720-4b43-b65c-83b7b7a8c3ab-ceph\") on node \"crc\" DevicePath \"\"" Mar 09 13:58:35 crc kubenswrapper[4764]: I0309 13:58:35.687354 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pzwdg\" (UniqueName: \"kubernetes.io/projected/93b0ad6c-7720-4b43-b65c-83b7b7a8c3ab-kube-api-access-pzwdg\") on node \"crc\" DevicePath \"\"" Mar 09 13:58:35 crc kubenswrapper[4764]: I0309 13:58:35.687371 4764 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/93b0ad6c-7720-4b43-b65c-83b7b7a8c3ab-inventory\") on node \"crc\" DevicePath \"\"" Mar 09 13:58:35 crc kubenswrapper[4764]: I0309 13:58:35.932010 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-jhs5m" event={"ID":"93b0ad6c-7720-4b43-b65c-83b7b7a8c3ab","Type":"ContainerDied","Data":"86eca36a2fd702dd87d457eb0f60f37cd82a4160e83acda801e0cfe2b332e2c9"} Mar 09 13:58:35 crc kubenswrapper[4764]: I0309 13:58:35.932083 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="86eca36a2fd702dd87d457eb0f60f37cd82a4160e83acda801e0cfe2b332e2c9" Mar 09 13:58:35 crc kubenswrapper[4764]: I0309 13:58:35.932127 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-jhs5m" Mar 09 13:58:36 crc kubenswrapper[4764]: I0309 13:58:36.021757 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-bhp5p"] Mar 09 13:58:36 crc kubenswrapper[4764]: E0309 13:58:36.022385 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="93b0ad6c-7720-4b43-b65c-83b7b7a8c3ab" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Mar 09 13:58:36 crc kubenswrapper[4764]: I0309 13:58:36.022416 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="93b0ad6c-7720-4b43-b65c-83b7b7a8c3ab" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Mar 09 13:58:36 crc kubenswrapper[4764]: I0309 13:58:36.022686 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="93b0ad6c-7720-4b43-b65c-83b7b7a8c3ab" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Mar 09 13:58:36 crc kubenswrapper[4764]: I0309 13:58:36.023678 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-bhp5p" Mar 09 13:58:36 crc kubenswrapper[4764]: I0309 13:58:36.026564 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-ctrj8" Mar 09 13:58:36 crc kubenswrapper[4764]: I0309 13:58:36.026628 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Mar 09 13:58:36 crc kubenswrapper[4764]: I0309 13:58:36.026755 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 09 13:58:36 crc kubenswrapper[4764]: I0309 13:58:36.027074 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 09 13:58:36 crc kubenswrapper[4764]: I0309 13:58:36.031959 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-bhp5p"] Mar 09 13:58:36 crc kubenswrapper[4764]: I0309 13:58:36.042345 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 09 13:58:36 crc kubenswrapper[4764]: I0309 13:58:36.198371 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dzzj2\" (UniqueName: \"kubernetes.io/projected/2d2ddcdd-77bf-4dc5-8170-02d297378dcb-kube-api-access-dzzj2\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-bhp5p\" (UID: \"2d2ddcdd-77bf-4dc5-8170-02d297378dcb\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-bhp5p" Mar 09 13:58:36 crc kubenswrapper[4764]: I0309 13:58:36.198449 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2d2ddcdd-77bf-4dc5-8170-02d297378dcb-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-bhp5p\" (UID: 
\"2d2ddcdd-77bf-4dc5-8170-02d297378dcb\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-bhp5p" Mar 09 13:58:36 crc kubenswrapper[4764]: I0309 13:58:36.198479 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2d2ddcdd-77bf-4dc5-8170-02d297378dcb-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-bhp5p\" (UID: \"2d2ddcdd-77bf-4dc5-8170-02d297378dcb\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-bhp5p" Mar 09 13:58:36 crc kubenswrapper[4764]: I0309 13:58:36.198946 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/2d2ddcdd-77bf-4dc5-8170-02d297378dcb-ceph\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-bhp5p\" (UID: \"2d2ddcdd-77bf-4dc5-8170-02d297378dcb\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-bhp5p" Mar 09 13:58:36 crc kubenswrapper[4764]: I0309 13:58:36.301322 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/2d2ddcdd-77bf-4dc5-8170-02d297378dcb-ceph\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-bhp5p\" (UID: \"2d2ddcdd-77bf-4dc5-8170-02d297378dcb\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-bhp5p" Mar 09 13:58:36 crc kubenswrapper[4764]: I0309 13:58:36.301569 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dzzj2\" (UniqueName: \"kubernetes.io/projected/2d2ddcdd-77bf-4dc5-8170-02d297378dcb-kube-api-access-dzzj2\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-bhp5p\" (UID: \"2d2ddcdd-77bf-4dc5-8170-02d297378dcb\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-bhp5p" Mar 09 13:58:36 crc kubenswrapper[4764]: I0309 13:58:36.301631 4764 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2d2ddcdd-77bf-4dc5-8170-02d297378dcb-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-bhp5p\" (UID: \"2d2ddcdd-77bf-4dc5-8170-02d297378dcb\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-bhp5p" Mar 09 13:58:36 crc kubenswrapper[4764]: I0309 13:58:36.301730 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2d2ddcdd-77bf-4dc5-8170-02d297378dcb-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-bhp5p\" (UID: \"2d2ddcdd-77bf-4dc5-8170-02d297378dcb\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-bhp5p" Mar 09 13:58:36 crc kubenswrapper[4764]: I0309 13:58:36.305927 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/2d2ddcdd-77bf-4dc5-8170-02d297378dcb-ceph\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-bhp5p\" (UID: \"2d2ddcdd-77bf-4dc5-8170-02d297378dcb\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-bhp5p" Mar 09 13:58:36 crc kubenswrapper[4764]: I0309 13:58:36.306832 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2d2ddcdd-77bf-4dc5-8170-02d297378dcb-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-bhp5p\" (UID: \"2d2ddcdd-77bf-4dc5-8170-02d297378dcb\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-bhp5p" Mar 09 13:58:36 crc kubenswrapper[4764]: I0309 13:58:36.310545 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2d2ddcdd-77bf-4dc5-8170-02d297378dcb-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-bhp5p\" (UID: \"2d2ddcdd-77bf-4dc5-8170-02d297378dcb\") " 
pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-bhp5p" Mar 09 13:58:36 crc kubenswrapper[4764]: I0309 13:58:36.329532 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dzzj2\" (UniqueName: \"kubernetes.io/projected/2d2ddcdd-77bf-4dc5-8170-02d297378dcb-kube-api-access-dzzj2\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-bhp5p\" (UID: \"2d2ddcdd-77bf-4dc5-8170-02d297378dcb\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-bhp5p" Mar 09 13:58:36 crc kubenswrapper[4764]: I0309 13:58:36.337626 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-9ffpb" Mar 09 13:58:36 crc kubenswrapper[4764]: I0309 13:58:36.350443 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-bhp5p" Mar 09 13:58:36 crc kubenswrapper[4764]: I0309 13:58:36.395713 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-9ffpb" Mar 09 13:58:36 crc kubenswrapper[4764]: I0309 13:58:36.899864 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-bhp5p"] Mar 09 13:58:36 crc kubenswrapper[4764]: I0309 13:58:36.946365 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-bhp5p" event={"ID":"2d2ddcdd-77bf-4dc5-8170-02d297378dcb","Type":"ContainerStarted","Data":"be0e4cdf0dcf35386be7b745eaa83c77d9a655ec0e8c4111fb3dec8d28ca749e"} Mar 09 13:58:37 crc kubenswrapper[4764]: I0309 13:58:37.748205 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-9ffpb"] Mar 09 13:58:37 crc kubenswrapper[4764]: I0309 13:58:37.958775 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-bhp5p" 
event={"ID":"2d2ddcdd-77bf-4dc5-8170-02d297378dcb","Type":"ContainerStarted","Data":"e0d17ea39e7555c976bc1934118f99c866e5a434f1c8b871b8eaacb7b52e87e5"} Mar 09 13:58:37 crc kubenswrapper[4764]: I0309 13:58:37.959092 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-9ffpb" podUID="fc434dcc-281c-4972-8abf-f1353e818c92" containerName="registry-server" containerID="cri-o://732763b672e5e098d63dd45c1ee15b7732366f951097a6849b5bfe81e2f7e6d0" gracePeriod=2 Mar 09 13:58:37 crc kubenswrapper[4764]: I0309 13:58:37.984543 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-bhp5p" podStartSLOduration=2.548110863 podStartE2EDuration="2.984521147s" podCreationTimestamp="2026-03-09 13:58:35 +0000 UTC" firstStartedPulling="2026-03-09 13:58:36.904066398 +0000 UTC m=+2272.154238296" lastFinishedPulling="2026-03-09 13:58:37.340476672 +0000 UTC m=+2272.590648580" observedRunningTime="2026-03-09 13:58:37.983026817 +0000 UTC m=+2273.233198725" watchObservedRunningTime="2026-03-09 13:58:37.984521147 +0000 UTC m=+2273.234693055" Mar 09 13:58:38 crc kubenswrapper[4764]: I0309 13:58:38.408261 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-9ffpb" Mar 09 13:58:38 crc kubenswrapper[4764]: I0309 13:58:38.564123 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fc434dcc-281c-4972-8abf-f1353e818c92-utilities\") pod \"fc434dcc-281c-4972-8abf-f1353e818c92\" (UID: \"fc434dcc-281c-4972-8abf-f1353e818c92\") " Mar 09 13:58:38 crc kubenswrapper[4764]: I0309 13:58:38.564250 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-57mcq\" (UniqueName: \"kubernetes.io/projected/fc434dcc-281c-4972-8abf-f1353e818c92-kube-api-access-57mcq\") pod \"fc434dcc-281c-4972-8abf-f1353e818c92\" (UID: \"fc434dcc-281c-4972-8abf-f1353e818c92\") " Mar 09 13:58:38 crc kubenswrapper[4764]: I0309 13:58:38.564545 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fc434dcc-281c-4972-8abf-f1353e818c92-catalog-content\") pod \"fc434dcc-281c-4972-8abf-f1353e818c92\" (UID: \"fc434dcc-281c-4972-8abf-f1353e818c92\") " Mar 09 13:58:38 crc kubenswrapper[4764]: I0309 13:58:38.565076 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fc434dcc-281c-4972-8abf-f1353e818c92-utilities" (OuterVolumeSpecName: "utilities") pod "fc434dcc-281c-4972-8abf-f1353e818c92" (UID: "fc434dcc-281c-4972-8abf-f1353e818c92"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 13:58:38 crc kubenswrapper[4764]: I0309 13:58:38.580071 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fc434dcc-281c-4972-8abf-f1353e818c92-kube-api-access-57mcq" (OuterVolumeSpecName: "kube-api-access-57mcq") pod "fc434dcc-281c-4972-8abf-f1353e818c92" (UID: "fc434dcc-281c-4972-8abf-f1353e818c92"). InnerVolumeSpecName "kube-api-access-57mcq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:58:38 crc kubenswrapper[4764]: I0309 13:58:38.667390 4764 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fc434dcc-281c-4972-8abf-f1353e818c92-utilities\") on node \"crc\" DevicePath \"\"" Mar 09 13:58:38 crc kubenswrapper[4764]: I0309 13:58:38.667435 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-57mcq\" (UniqueName: \"kubernetes.io/projected/fc434dcc-281c-4972-8abf-f1353e818c92-kube-api-access-57mcq\") on node \"crc\" DevicePath \"\"" Mar 09 13:58:38 crc kubenswrapper[4764]: I0309 13:58:38.718146 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fc434dcc-281c-4972-8abf-f1353e818c92-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "fc434dcc-281c-4972-8abf-f1353e818c92" (UID: "fc434dcc-281c-4972-8abf-f1353e818c92"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 13:58:38 crc kubenswrapper[4764]: I0309 13:58:38.770050 4764 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fc434dcc-281c-4972-8abf-f1353e818c92-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 09 13:58:38 crc kubenswrapper[4764]: I0309 13:58:38.974007 4764 generic.go:334] "Generic (PLEG): container finished" podID="fc434dcc-281c-4972-8abf-f1353e818c92" containerID="732763b672e5e098d63dd45c1ee15b7732366f951097a6849b5bfe81e2f7e6d0" exitCode=0 Mar 09 13:58:38 crc kubenswrapper[4764]: I0309 13:58:38.974105 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-9ffpb" Mar 09 13:58:38 crc kubenswrapper[4764]: I0309 13:58:38.974093 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9ffpb" event={"ID":"fc434dcc-281c-4972-8abf-f1353e818c92","Type":"ContainerDied","Data":"732763b672e5e098d63dd45c1ee15b7732366f951097a6849b5bfe81e2f7e6d0"} Mar 09 13:58:38 crc kubenswrapper[4764]: I0309 13:58:38.974714 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9ffpb" event={"ID":"fc434dcc-281c-4972-8abf-f1353e818c92","Type":"ContainerDied","Data":"b93f8daca4444928beadd1d0399b3e6b30bb625dfbaf7a7e54bb81b9084ca36a"} Mar 09 13:58:38 crc kubenswrapper[4764]: I0309 13:58:38.974756 4764 scope.go:117] "RemoveContainer" containerID="732763b672e5e098d63dd45c1ee15b7732366f951097a6849b5bfe81e2f7e6d0" Mar 09 13:58:39 crc kubenswrapper[4764]: I0309 13:58:39.009566 4764 scope.go:117] "RemoveContainer" containerID="a7e8d42d98e3d5f94cb4b2154da79aff60ec5bc3c48a9fbccd39a399f98f3e94" Mar 09 13:58:39 crc kubenswrapper[4764]: I0309 13:58:39.024498 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-9ffpb"] Mar 09 13:58:39 crc kubenswrapper[4764]: I0309 13:58:39.031885 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-9ffpb"] Mar 09 13:58:39 crc kubenswrapper[4764]: I0309 13:58:39.049705 4764 scope.go:117] "RemoveContainer" containerID="61a825d63b826dd28b590964b6ad42d2157024d995f261bf76f1dc00cd69059e" Mar 09 13:58:39 crc kubenswrapper[4764]: I0309 13:58:39.091993 4764 scope.go:117] "RemoveContainer" containerID="732763b672e5e098d63dd45c1ee15b7732366f951097a6849b5bfe81e2f7e6d0" Mar 09 13:58:39 crc kubenswrapper[4764]: E0309 13:58:39.092916 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"732763b672e5e098d63dd45c1ee15b7732366f951097a6849b5bfe81e2f7e6d0\": container with ID starting with 732763b672e5e098d63dd45c1ee15b7732366f951097a6849b5bfe81e2f7e6d0 not found: ID does not exist" containerID="732763b672e5e098d63dd45c1ee15b7732366f951097a6849b5bfe81e2f7e6d0" Mar 09 13:58:39 crc kubenswrapper[4764]: I0309 13:58:39.092983 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"732763b672e5e098d63dd45c1ee15b7732366f951097a6849b5bfe81e2f7e6d0"} err="failed to get container status \"732763b672e5e098d63dd45c1ee15b7732366f951097a6849b5bfe81e2f7e6d0\": rpc error: code = NotFound desc = could not find container \"732763b672e5e098d63dd45c1ee15b7732366f951097a6849b5bfe81e2f7e6d0\": container with ID starting with 732763b672e5e098d63dd45c1ee15b7732366f951097a6849b5bfe81e2f7e6d0 not found: ID does not exist" Mar 09 13:58:39 crc kubenswrapper[4764]: I0309 13:58:39.093039 4764 scope.go:117] "RemoveContainer" containerID="a7e8d42d98e3d5f94cb4b2154da79aff60ec5bc3c48a9fbccd39a399f98f3e94" Mar 09 13:58:39 crc kubenswrapper[4764]: E0309 13:58:39.093761 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a7e8d42d98e3d5f94cb4b2154da79aff60ec5bc3c48a9fbccd39a399f98f3e94\": container with ID starting with a7e8d42d98e3d5f94cb4b2154da79aff60ec5bc3c48a9fbccd39a399f98f3e94 not found: ID does not exist" containerID="a7e8d42d98e3d5f94cb4b2154da79aff60ec5bc3c48a9fbccd39a399f98f3e94" Mar 09 13:58:39 crc kubenswrapper[4764]: I0309 13:58:39.093807 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a7e8d42d98e3d5f94cb4b2154da79aff60ec5bc3c48a9fbccd39a399f98f3e94"} err="failed to get container status \"a7e8d42d98e3d5f94cb4b2154da79aff60ec5bc3c48a9fbccd39a399f98f3e94\": rpc error: code = NotFound desc = could not find container \"a7e8d42d98e3d5f94cb4b2154da79aff60ec5bc3c48a9fbccd39a399f98f3e94\": container with ID 
starting with a7e8d42d98e3d5f94cb4b2154da79aff60ec5bc3c48a9fbccd39a399f98f3e94 not found: ID does not exist" Mar 09 13:58:39 crc kubenswrapper[4764]: I0309 13:58:39.093841 4764 scope.go:117] "RemoveContainer" containerID="61a825d63b826dd28b590964b6ad42d2157024d995f261bf76f1dc00cd69059e" Mar 09 13:58:39 crc kubenswrapper[4764]: E0309 13:58:39.094461 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"61a825d63b826dd28b590964b6ad42d2157024d995f261bf76f1dc00cd69059e\": container with ID starting with 61a825d63b826dd28b590964b6ad42d2157024d995f261bf76f1dc00cd69059e not found: ID does not exist" containerID="61a825d63b826dd28b590964b6ad42d2157024d995f261bf76f1dc00cd69059e" Mar 09 13:58:39 crc kubenswrapper[4764]: I0309 13:58:39.094503 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"61a825d63b826dd28b590964b6ad42d2157024d995f261bf76f1dc00cd69059e"} err="failed to get container status \"61a825d63b826dd28b590964b6ad42d2157024d995f261bf76f1dc00cd69059e\": rpc error: code = NotFound desc = could not find container \"61a825d63b826dd28b590964b6ad42d2157024d995f261bf76f1dc00cd69059e\": container with ID starting with 61a825d63b826dd28b590964b6ad42d2157024d995f261bf76f1dc00cd69059e not found: ID does not exist" Mar 09 13:58:39 crc kubenswrapper[4764]: I0309 13:58:39.572524 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fc434dcc-281c-4972-8abf-f1353e818c92" path="/var/lib/kubelet/pods/fc434dcc-281c-4972-8abf-f1353e818c92/volumes" Mar 09 13:58:58 crc kubenswrapper[4764]: I0309 13:58:58.370636 4764 patch_prober.go:28] interesting pod/machine-config-daemon-xxczl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 09 13:58:58 crc kubenswrapper[4764]: I0309 
13:58:58.371331 4764 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 09 13:58:58 crc kubenswrapper[4764]: I0309 13:58:58.371380 4764 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" Mar 09 13:58:58 crc kubenswrapper[4764]: I0309 13:58:58.372326 4764 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"827bb36b06cc9f42c479b857320543a7b207fe3138757e8a034890cd3b0b0211"} pod="openshift-machine-config-operator/machine-config-daemon-xxczl" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 09 13:58:58 crc kubenswrapper[4764]: I0309 13:58:58.372398 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922" containerName="machine-config-daemon" containerID="cri-o://827bb36b06cc9f42c479b857320543a7b207fe3138757e8a034890cd3b0b0211" gracePeriod=600 Mar 09 13:58:58 crc kubenswrapper[4764]: E0309 13:58:58.505728 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xxczl_openshift-machine-config-operator(6bcdd179-43c2-427c-9fac-7155c122e922)\"" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922" Mar 09 13:58:59 crc kubenswrapper[4764]: I0309 13:58:59.177379 4764 generic.go:334] "Generic (PLEG): container finished" 
podID="6bcdd179-43c2-427c-9fac-7155c122e922" containerID="827bb36b06cc9f42c479b857320543a7b207fe3138757e8a034890cd3b0b0211" exitCode=0 Mar 09 13:58:59 crc kubenswrapper[4764]: I0309 13:58:59.177434 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" event={"ID":"6bcdd179-43c2-427c-9fac-7155c122e922","Type":"ContainerDied","Data":"827bb36b06cc9f42c479b857320543a7b207fe3138757e8a034890cd3b0b0211"} Mar 09 13:58:59 crc kubenswrapper[4764]: I0309 13:58:59.177926 4764 scope.go:117] "RemoveContainer" containerID="c4a2ad399af00232f256b54f6dba8cd11d56a98b4177187d6164b2b8cca69e65" Mar 09 13:58:59 crc kubenswrapper[4764]: I0309 13:58:59.179112 4764 scope.go:117] "RemoveContainer" containerID="827bb36b06cc9f42c479b857320543a7b207fe3138757e8a034890cd3b0b0211" Mar 09 13:58:59 crc kubenswrapper[4764]: E0309 13:58:59.179609 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xxczl_openshift-machine-config-operator(6bcdd179-43c2-427c-9fac-7155c122e922)\"" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922" Mar 09 13:59:09 crc kubenswrapper[4764]: I0309 13:59:09.277815 4764 generic.go:334] "Generic (PLEG): container finished" podID="2d2ddcdd-77bf-4dc5-8170-02d297378dcb" containerID="e0d17ea39e7555c976bc1934118f99c866e5a434f1c8b871b8eaacb7b52e87e5" exitCode=0 Mar 09 13:59:09 crc kubenswrapper[4764]: I0309 13:59:09.277867 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-bhp5p" event={"ID":"2d2ddcdd-77bf-4dc5-8170-02d297378dcb","Type":"ContainerDied","Data":"e0d17ea39e7555c976bc1934118f99c866e5a434f1c8b871b8eaacb7b52e87e5"} Mar 09 13:59:10 crc kubenswrapper[4764]: I0309 13:59:10.735519 4764 util.go:48] "No 
ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-bhp5p" Mar 09 13:59:10 crc kubenswrapper[4764]: I0309 13:59:10.820593 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dzzj2\" (UniqueName: \"kubernetes.io/projected/2d2ddcdd-77bf-4dc5-8170-02d297378dcb-kube-api-access-dzzj2\") pod \"2d2ddcdd-77bf-4dc5-8170-02d297378dcb\" (UID: \"2d2ddcdd-77bf-4dc5-8170-02d297378dcb\") " Mar 09 13:59:10 crc kubenswrapper[4764]: I0309 13:59:10.820732 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2d2ddcdd-77bf-4dc5-8170-02d297378dcb-inventory\") pod \"2d2ddcdd-77bf-4dc5-8170-02d297378dcb\" (UID: \"2d2ddcdd-77bf-4dc5-8170-02d297378dcb\") " Mar 09 13:59:10 crc kubenswrapper[4764]: I0309 13:59:10.821002 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/2d2ddcdd-77bf-4dc5-8170-02d297378dcb-ceph\") pod \"2d2ddcdd-77bf-4dc5-8170-02d297378dcb\" (UID: \"2d2ddcdd-77bf-4dc5-8170-02d297378dcb\") " Mar 09 13:59:10 crc kubenswrapper[4764]: I0309 13:59:10.821081 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2d2ddcdd-77bf-4dc5-8170-02d297378dcb-ssh-key-openstack-edpm-ipam\") pod \"2d2ddcdd-77bf-4dc5-8170-02d297378dcb\" (UID: \"2d2ddcdd-77bf-4dc5-8170-02d297378dcb\") " Mar 09 13:59:10 crc kubenswrapper[4764]: I0309 13:59:10.827407 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2d2ddcdd-77bf-4dc5-8170-02d297378dcb-ceph" (OuterVolumeSpecName: "ceph") pod "2d2ddcdd-77bf-4dc5-8170-02d297378dcb" (UID: "2d2ddcdd-77bf-4dc5-8170-02d297378dcb"). InnerVolumeSpecName "ceph". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:59:10 crc kubenswrapper[4764]: I0309 13:59:10.837247 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2d2ddcdd-77bf-4dc5-8170-02d297378dcb-kube-api-access-dzzj2" (OuterVolumeSpecName: "kube-api-access-dzzj2") pod "2d2ddcdd-77bf-4dc5-8170-02d297378dcb" (UID: "2d2ddcdd-77bf-4dc5-8170-02d297378dcb"). InnerVolumeSpecName "kube-api-access-dzzj2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:59:10 crc kubenswrapper[4764]: I0309 13:59:10.850277 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2d2ddcdd-77bf-4dc5-8170-02d297378dcb-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "2d2ddcdd-77bf-4dc5-8170-02d297378dcb" (UID: "2d2ddcdd-77bf-4dc5-8170-02d297378dcb"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:59:10 crc kubenswrapper[4764]: I0309 13:59:10.850361 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2d2ddcdd-77bf-4dc5-8170-02d297378dcb-inventory" (OuterVolumeSpecName: "inventory") pod "2d2ddcdd-77bf-4dc5-8170-02d297378dcb" (UID: "2d2ddcdd-77bf-4dc5-8170-02d297378dcb"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:59:10 crc kubenswrapper[4764]: I0309 13:59:10.926143 4764 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2d2ddcdd-77bf-4dc5-8170-02d297378dcb-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 09 13:59:10 crc kubenswrapper[4764]: I0309 13:59:10.926477 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dzzj2\" (UniqueName: \"kubernetes.io/projected/2d2ddcdd-77bf-4dc5-8170-02d297378dcb-kube-api-access-dzzj2\") on node \"crc\" DevicePath \"\"" Mar 09 13:59:10 crc kubenswrapper[4764]: I0309 13:59:10.926556 4764 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2d2ddcdd-77bf-4dc5-8170-02d297378dcb-inventory\") on node \"crc\" DevicePath \"\"" Mar 09 13:59:10 crc kubenswrapper[4764]: I0309 13:59:10.926631 4764 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/2d2ddcdd-77bf-4dc5-8170-02d297378dcb-ceph\") on node \"crc\" DevicePath \"\"" Mar 09 13:59:11 crc kubenswrapper[4764]: I0309 13:59:11.297141 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-bhp5p" event={"ID":"2d2ddcdd-77bf-4dc5-8170-02d297378dcb","Type":"ContainerDied","Data":"be0e4cdf0dcf35386be7b745eaa83c77d9a655ec0e8c4111fb3dec8d28ca749e"} Mar 09 13:59:11 crc kubenswrapper[4764]: I0309 13:59:11.297220 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="be0e4cdf0dcf35386be7b745eaa83c77d9a655ec0e8c4111fb3dec8d28ca749e" Mar 09 13:59:11 crc kubenswrapper[4764]: I0309 13:59:11.297237 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-bhp5p" Mar 09 13:59:11 crc kubenswrapper[4764]: I0309 13:59:11.400942 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-4njtv"] Mar 09 13:59:11 crc kubenswrapper[4764]: E0309 13:59:11.401475 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc434dcc-281c-4972-8abf-f1353e818c92" containerName="registry-server" Mar 09 13:59:11 crc kubenswrapper[4764]: I0309 13:59:11.401504 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc434dcc-281c-4972-8abf-f1353e818c92" containerName="registry-server" Mar 09 13:59:11 crc kubenswrapper[4764]: E0309 13:59:11.401533 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2d2ddcdd-77bf-4dc5-8170-02d297378dcb" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Mar 09 13:59:11 crc kubenswrapper[4764]: I0309 13:59:11.401546 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="2d2ddcdd-77bf-4dc5-8170-02d297378dcb" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Mar 09 13:59:11 crc kubenswrapper[4764]: E0309 13:59:11.401568 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc434dcc-281c-4972-8abf-f1353e818c92" containerName="extract-content" Mar 09 13:59:11 crc kubenswrapper[4764]: I0309 13:59:11.401577 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc434dcc-281c-4972-8abf-f1353e818c92" containerName="extract-content" Mar 09 13:59:11 crc kubenswrapper[4764]: E0309 13:59:11.401589 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc434dcc-281c-4972-8abf-f1353e818c92" containerName="extract-utilities" Mar 09 13:59:11 crc kubenswrapper[4764]: I0309 13:59:11.401597 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc434dcc-281c-4972-8abf-f1353e818c92" containerName="extract-utilities" Mar 09 13:59:11 crc kubenswrapper[4764]: I0309 13:59:11.401893 
4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="2d2ddcdd-77bf-4dc5-8170-02d297378dcb" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Mar 09 13:59:11 crc kubenswrapper[4764]: I0309 13:59:11.401946 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="fc434dcc-281c-4972-8abf-f1353e818c92" containerName="registry-server" Mar 09 13:59:11 crc kubenswrapper[4764]: I0309 13:59:11.402832 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-4njtv" Mar 09 13:59:11 crc kubenswrapper[4764]: I0309 13:59:11.405084 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Mar 09 13:59:11 crc kubenswrapper[4764]: I0309 13:59:11.405127 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 09 13:59:11 crc kubenswrapper[4764]: I0309 13:59:11.405587 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-ctrj8" Mar 09 13:59:11 crc kubenswrapper[4764]: I0309 13:59:11.405789 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 09 13:59:11 crc kubenswrapper[4764]: I0309 13:59:11.410582 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 09 13:59:11 crc kubenswrapper[4764]: I0309 13:59:11.414431 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-4njtv"] Mar 09 13:59:11 crc kubenswrapper[4764]: I0309 13:59:11.545037 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/cbffc6a1-81df-479c-b40e-3f865c187a73-ceph\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-4njtv\" (UID: 
\"cbffc6a1-81df-479c-b40e-3f865c187a73\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-4njtv" Mar 09 13:59:11 crc kubenswrapper[4764]: I0309 13:59:11.545160 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/cbffc6a1-81df-479c-b40e-3f865c187a73-ssh-key-openstack-edpm-ipam\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-4njtv\" (UID: \"cbffc6a1-81df-479c-b40e-3f865c187a73\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-4njtv" Mar 09 13:59:11 crc kubenswrapper[4764]: I0309 13:59:11.545211 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xs48l\" (UniqueName: \"kubernetes.io/projected/cbffc6a1-81df-479c-b40e-3f865c187a73-kube-api-access-xs48l\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-4njtv\" (UID: \"cbffc6a1-81df-479c-b40e-3f865c187a73\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-4njtv" Mar 09 13:59:11 crc kubenswrapper[4764]: I0309 13:59:11.545242 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cbffc6a1-81df-479c-b40e-3f865c187a73-inventory\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-4njtv\" (UID: \"cbffc6a1-81df-479c-b40e-3f865c187a73\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-4njtv" Mar 09 13:59:11 crc kubenswrapper[4764]: I0309 13:59:11.646885 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/cbffc6a1-81df-479c-b40e-3f865c187a73-ssh-key-openstack-edpm-ipam\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-4njtv\" (UID: \"cbffc6a1-81df-479c-b40e-3f865c187a73\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-4njtv" Mar 09 13:59:11 
crc kubenswrapper[4764]: I0309 13:59:11.647407 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xs48l\" (UniqueName: \"kubernetes.io/projected/cbffc6a1-81df-479c-b40e-3f865c187a73-kube-api-access-xs48l\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-4njtv\" (UID: \"cbffc6a1-81df-479c-b40e-3f865c187a73\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-4njtv" Mar 09 13:59:11 crc kubenswrapper[4764]: I0309 13:59:11.647531 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cbffc6a1-81df-479c-b40e-3f865c187a73-inventory\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-4njtv\" (UID: \"cbffc6a1-81df-479c-b40e-3f865c187a73\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-4njtv" Mar 09 13:59:11 crc kubenswrapper[4764]: I0309 13:59:11.648267 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/cbffc6a1-81df-479c-b40e-3f865c187a73-ceph\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-4njtv\" (UID: \"cbffc6a1-81df-479c-b40e-3f865c187a73\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-4njtv" Mar 09 13:59:11 crc kubenswrapper[4764]: I0309 13:59:11.652400 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/cbffc6a1-81df-479c-b40e-3f865c187a73-ceph\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-4njtv\" (UID: \"cbffc6a1-81df-479c-b40e-3f865c187a73\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-4njtv" Mar 09 13:59:11 crc kubenswrapper[4764]: I0309 13:59:11.652644 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/cbffc6a1-81df-479c-b40e-3f865c187a73-ssh-key-openstack-edpm-ipam\") pod 
\"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-4njtv\" (UID: \"cbffc6a1-81df-479c-b40e-3f865c187a73\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-4njtv" Mar 09 13:59:11 crc kubenswrapper[4764]: I0309 13:59:11.655254 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cbffc6a1-81df-479c-b40e-3f865c187a73-inventory\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-4njtv\" (UID: \"cbffc6a1-81df-479c-b40e-3f865c187a73\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-4njtv" Mar 09 13:59:11 crc kubenswrapper[4764]: I0309 13:59:11.665350 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xs48l\" (UniqueName: \"kubernetes.io/projected/cbffc6a1-81df-479c-b40e-3f865c187a73-kube-api-access-xs48l\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-4njtv\" (UID: \"cbffc6a1-81df-479c-b40e-3f865c187a73\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-4njtv" Mar 09 13:59:11 crc kubenswrapper[4764]: I0309 13:59:11.721589 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-4njtv" Mar 09 13:59:12 crc kubenswrapper[4764]: I0309 13:59:12.268122 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-4njtv"] Mar 09 13:59:12 crc kubenswrapper[4764]: I0309 13:59:12.308682 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-4njtv" event={"ID":"cbffc6a1-81df-479c-b40e-3f865c187a73","Type":"ContainerStarted","Data":"2851c1e5435a239fbe006f90989fb7fbe06dda075e0fa33347b764f78afacfae"} Mar 09 13:59:13 crc kubenswrapper[4764]: I0309 13:59:13.320033 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-4njtv" event={"ID":"cbffc6a1-81df-479c-b40e-3f865c187a73","Type":"ContainerStarted","Data":"a872ead50e271b964852737e0eae6103b43599f8554d92a0cf781c21b93321d7"} Mar 09 13:59:13 crc kubenswrapper[4764]: I0309 13:59:13.345043 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-4njtv" podStartSLOduration=1.805215078 podStartE2EDuration="2.345017991s" podCreationTimestamp="2026-03-09 13:59:11 +0000 UTC" firstStartedPulling="2026-03-09 13:59:12.275309019 +0000 UTC m=+2307.525480927" lastFinishedPulling="2026-03-09 13:59:12.815111932 +0000 UTC m=+2308.065283840" observedRunningTime="2026-03-09 13:59:13.342878444 +0000 UTC m=+2308.593050352" watchObservedRunningTime="2026-03-09 13:59:13.345017991 +0000 UTC m=+2308.595189909" Mar 09 13:59:14 crc kubenswrapper[4764]: I0309 13:59:14.561024 4764 scope.go:117] "RemoveContainer" containerID="827bb36b06cc9f42c479b857320543a7b207fe3138757e8a034890cd3b0b0211" Mar 09 13:59:14 crc kubenswrapper[4764]: E0309 13:59:14.562148 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xxczl_openshift-machine-config-operator(6bcdd179-43c2-427c-9fac-7155c122e922)\"" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922" Mar 09 13:59:17 crc kubenswrapper[4764]: I0309 13:59:17.377695 4764 generic.go:334] "Generic (PLEG): container finished" podID="cbffc6a1-81df-479c-b40e-3f865c187a73" containerID="a872ead50e271b964852737e0eae6103b43599f8554d92a0cf781c21b93321d7" exitCode=0 Mar 09 13:59:17 crc kubenswrapper[4764]: I0309 13:59:17.377801 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-4njtv" event={"ID":"cbffc6a1-81df-479c-b40e-3f865c187a73","Type":"ContainerDied","Data":"a872ead50e271b964852737e0eae6103b43599f8554d92a0cf781c21b93321d7"} Mar 09 13:59:18 crc kubenswrapper[4764]: I0309 13:59:18.857873 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-4njtv" Mar 09 13:59:18 crc kubenswrapper[4764]: I0309 13:59:18.913598 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/cbffc6a1-81df-479c-b40e-3f865c187a73-ceph\") pod \"cbffc6a1-81df-479c-b40e-3f865c187a73\" (UID: \"cbffc6a1-81df-479c-b40e-3f865c187a73\") " Mar 09 13:59:18 crc kubenswrapper[4764]: I0309 13:59:18.913764 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cbffc6a1-81df-479c-b40e-3f865c187a73-inventory\") pod \"cbffc6a1-81df-479c-b40e-3f865c187a73\" (UID: \"cbffc6a1-81df-479c-b40e-3f865c187a73\") " Mar 09 13:59:18 crc kubenswrapper[4764]: I0309 13:59:18.913924 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xs48l\" (UniqueName: \"kubernetes.io/projected/cbffc6a1-81df-479c-b40e-3f865c187a73-kube-api-access-xs48l\") pod \"cbffc6a1-81df-479c-b40e-3f865c187a73\" (UID: \"cbffc6a1-81df-479c-b40e-3f865c187a73\") " Mar 09 13:59:18 crc kubenswrapper[4764]: I0309 13:59:18.914015 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/cbffc6a1-81df-479c-b40e-3f865c187a73-ssh-key-openstack-edpm-ipam\") pod \"cbffc6a1-81df-479c-b40e-3f865c187a73\" (UID: \"cbffc6a1-81df-479c-b40e-3f865c187a73\") " Mar 09 13:59:18 crc kubenswrapper[4764]: I0309 13:59:18.921751 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cbffc6a1-81df-479c-b40e-3f865c187a73-ceph" (OuterVolumeSpecName: "ceph") pod "cbffc6a1-81df-479c-b40e-3f865c187a73" (UID: "cbffc6a1-81df-479c-b40e-3f865c187a73"). InnerVolumeSpecName "ceph". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:59:18 crc kubenswrapper[4764]: I0309 13:59:18.923935 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cbffc6a1-81df-479c-b40e-3f865c187a73-kube-api-access-xs48l" (OuterVolumeSpecName: "kube-api-access-xs48l") pod "cbffc6a1-81df-479c-b40e-3f865c187a73" (UID: "cbffc6a1-81df-479c-b40e-3f865c187a73"). InnerVolumeSpecName "kube-api-access-xs48l". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:59:18 crc kubenswrapper[4764]: E0309 13:59:18.942823 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cbffc6a1-81df-479c-b40e-3f865c187a73-ssh-key-openstack-edpm-ipam podName:cbffc6a1-81df-479c-b40e-3f865c187a73 nodeName:}" failed. No retries permitted until 2026-03-09 13:59:19.442779766 +0000 UTC m=+2314.692951674 (durationBeforeRetry 500ms). Error: error cleaning subPath mounts for volume "ssh-key-openstack-edpm-ipam" (UniqueName: "kubernetes.io/secret/cbffc6a1-81df-479c-b40e-3f865c187a73-ssh-key-openstack-edpm-ipam") pod "cbffc6a1-81df-479c-b40e-3f865c187a73" (UID: "cbffc6a1-81df-479c-b40e-3f865c187a73") : error deleting /var/lib/kubelet/pods/cbffc6a1-81df-479c-b40e-3f865c187a73/volume-subpaths: remove /var/lib/kubelet/pods/cbffc6a1-81df-479c-b40e-3f865c187a73/volume-subpaths: no such file or directory Mar 09 13:59:18 crc kubenswrapper[4764]: I0309 13:59:18.946268 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cbffc6a1-81df-479c-b40e-3f865c187a73-inventory" (OuterVolumeSpecName: "inventory") pod "cbffc6a1-81df-479c-b40e-3f865c187a73" (UID: "cbffc6a1-81df-479c-b40e-3f865c187a73"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:59:19 crc kubenswrapper[4764]: I0309 13:59:19.017043 4764 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cbffc6a1-81df-479c-b40e-3f865c187a73-inventory\") on node \"crc\" DevicePath \"\"" Mar 09 13:59:19 crc kubenswrapper[4764]: I0309 13:59:19.017087 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xs48l\" (UniqueName: \"kubernetes.io/projected/cbffc6a1-81df-479c-b40e-3f865c187a73-kube-api-access-xs48l\") on node \"crc\" DevicePath \"\"" Mar 09 13:59:19 crc kubenswrapper[4764]: I0309 13:59:19.017099 4764 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/cbffc6a1-81df-479c-b40e-3f865c187a73-ceph\") on node \"crc\" DevicePath \"\"" Mar 09 13:59:19 crc kubenswrapper[4764]: I0309 13:59:19.399839 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-4njtv" event={"ID":"cbffc6a1-81df-479c-b40e-3f865c187a73","Type":"ContainerDied","Data":"2851c1e5435a239fbe006f90989fb7fbe06dda075e0fa33347b764f78afacfae"} Mar 09 13:59:19 crc kubenswrapper[4764]: I0309 13:59:19.400330 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2851c1e5435a239fbe006f90989fb7fbe06dda075e0fa33347b764f78afacfae" Mar 09 13:59:19 crc kubenswrapper[4764]: I0309 13:59:19.399923 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-4njtv" Mar 09 13:59:19 crc kubenswrapper[4764]: I0309 13:59:19.528308 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/cbffc6a1-81df-479c-b40e-3f865c187a73-ssh-key-openstack-edpm-ipam\") pod \"cbffc6a1-81df-479c-b40e-3f865c187a73\" (UID: \"cbffc6a1-81df-479c-b40e-3f865c187a73\") " Mar 09 13:59:19 crc kubenswrapper[4764]: I0309 13:59:19.528431 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-5b2nl"] Mar 09 13:59:19 crc kubenswrapper[4764]: E0309 13:59:19.528988 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cbffc6a1-81df-479c-b40e-3f865c187a73" containerName="ceph-hci-pre-edpm-deployment-openstack-edpm-ipam" Mar 09 13:59:19 crc kubenswrapper[4764]: I0309 13:59:19.529013 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="cbffc6a1-81df-479c-b40e-3f865c187a73" containerName="ceph-hci-pre-edpm-deployment-openstack-edpm-ipam" Mar 09 13:59:19 crc kubenswrapper[4764]: I0309 13:59:19.529225 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="cbffc6a1-81df-479c-b40e-3f865c187a73" containerName="ceph-hci-pre-edpm-deployment-openstack-edpm-ipam" Mar 09 13:59:19 crc kubenswrapper[4764]: I0309 13:59:19.530098 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-5b2nl" Mar 09 13:59:19 crc kubenswrapper[4764]: I0309 13:59:19.535565 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cbffc6a1-81df-479c-b40e-3f865c187a73-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "cbffc6a1-81df-479c-b40e-3f865c187a73" (UID: "cbffc6a1-81df-479c-b40e-3f865c187a73"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:59:19 crc kubenswrapper[4764]: I0309 13:59:19.548438 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-5b2nl"] Mar 09 13:59:19 crc kubenswrapper[4764]: I0309 13:59:19.631606 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ea3a2b04-e009-4dcd-8eca-543cc084b329-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-5b2nl\" (UID: \"ea3a2b04-e009-4dcd-8eca-543cc084b329\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-5b2nl" Mar 09 13:59:19 crc kubenswrapper[4764]: I0309 13:59:19.631730 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/ea3a2b04-e009-4dcd-8eca-543cc084b329-ceph\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-5b2nl\" (UID: \"ea3a2b04-e009-4dcd-8eca-543cc084b329\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-5b2nl" Mar 09 13:59:19 crc kubenswrapper[4764]: I0309 13:59:19.631789 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ea3a2b04-e009-4dcd-8eca-543cc084b329-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-5b2nl\" (UID: \"ea3a2b04-e009-4dcd-8eca-543cc084b329\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-5b2nl" Mar 09 13:59:19 crc kubenswrapper[4764]: I0309 13:59:19.632360 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4t9pd\" (UniqueName: \"kubernetes.io/projected/ea3a2b04-e009-4dcd-8eca-543cc084b329-kube-api-access-4t9pd\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-5b2nl\" (UID: 
\"ea3a2b04-e009-4dcd-8eca-543cc084b329\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-5b2nl" Mar 09 13:59:19 crc kubenswrapper[4764]: I0309 13:59:19.632834 4764 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/cbffc6a1-81df-479c-b40e-3f865c187a73-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 09 13:59:19 crc kubenswrapper[4764]: I0309 13:59:19.735200 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ea3a2b04-e009-4dcd-8eca-543cc084b329-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-5b2nl\" (UID: \"ea3a2b04-e009-4dcd-8eca-543cc084b329\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-5b2nl" Mar 09 13:59:19 crc kubenswrapper[4764]: I0309 13:59:19.735373 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4t9pd\" (UniqueName: \"kubernetes.io/projected/ea3a2b04-e009-4dcd-8eca-543cc084b329-kube-api-access-4t9pd\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-5b2nl\" (UID: \"ea3a2b04-e009-4dcd-8eca-543cc084b329\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-5b2nl" Mar 09 13:59:19 crc kubenswrapper[4764]: I0309 13:59:19.735448 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ea3a2b04-e009-4dcd-8eca-543cc084b329-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-5b2nl\" (UID: \"ea3a2b04-e009-4dcd-8eca-543cc084b329\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-5b2nl" Mar 09 13:59:19 crc kubenswrapper[4764]: I0309 13:59:19.735515 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/ea3a2b04-e009-4dcd-8eca-543cc084b329-ceph\") pod 
\"configure-os-edpm-deployment-openstack-edpm-ipam-5b2nl\" (UID: \"ea3a2b04-e009-4dcd-8eca-543cc084b329\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-5b2nl" Mar 09 13:59:19 crc kubenswrapper[4764]: I0309 13:59:19.740018 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/ea3a2b04-e009-4dcd-8eca-543cc084b329-ceph\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-5b2nl\" (UID: \"ea3a2b04-e009-4dcd-8eca-543cc084b329\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-5b2nl" Mar 09 13:59:19 crc kubenswrapper[4764]: I0309 13:59:19.740213 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ea3a2b04-e009-4dcd-8eca-543cc084b329-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-5b2nl\" (UID: \"ea3a2b04-e009-4dcd-8eca-543cc084b329\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-5b2nl" Mar 09 13:59:19 crc kubenswrapper[4764]: I0309 13:59:19.741458 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ea3a2b04-e009-4dcd-8eca-543cc084b329-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-5b2nl\" (UID: \"ea3a2b04-e009-4dcd-8eca-543cc084b329\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-5b2nl" Mar 09 13:59:19 crc kubenswrapper[4764]: I0309 13:59:19.763228 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4t9pd\" (UniqueName: \"kubernetes.io/projected/ea3a2b04-e009-4dcd-8eca-543cc084b329-kube-api-access-4t9pd\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-5b2nl\" (UID: \"ea3a2b04-e009-4dcd-8eca-543cc084b329\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-5b2nl" Mar 09 13:59:19 crc kubenswrapper[4764]: I0309 13:59:19.896231 4764 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-5b2nl" Mar 09 13:59:20 crc kubenswrapper[4764]: I0309 13:59:20.509278 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-5b2nl"] Mar 09 13:59:20 crc kubenswrapper[4764]: W0309 13:59:20.513208 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podea3a2b04_e009_4dcd_8eca_543cc084b329.slice/crio-29a0e5d6218a8119d39ffa3dd74128ff15504a3840777dcd48ebe0d5cf642624 WatchSource:0}: Error finding container 29a0e5d6218a8119d39ffa3dd74128ff15504a3840777dcd48ebe0d5cf642624: Status 404 returned error can't find the container with id 29a0e5d6218a8119d39ffa3dd74128ff15504a3840777dcd48ebe0d5cf642624 Mar 09 13:59:21 crc kubenswrapper[4764]: I0309 13:59:21.425244 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-5b2nl" event={"ID":"ea3a2b04-e009-4dcd-8eca-543cc084b329","Type":"ContainerStarted","Data":"91d47ab481b9f29f3401d128d19aa21a508e0899ce7b78b79fba40dc480e7870"} Mar 09 13:59:21 crc kubenswrapper[4764]: I0309 13:59:21.425667 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-5b2nl" event={"ID":"ea3a2b04-e009-4dcd-8eca-543cc084b329","Type":"ContainerStarted","Data":"29a0e5d6218a8119d39ffa3dd74128ff15504a3840777dcd48ebe0d5cf642624"} Mar 09 13:59:21 crc kubenswrapper[4764]: I0309 13:59:21.462631 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-5b2nl" podStartSLOduration=2.026957245 podStartE2EDuration="2.462602359s" podCreationTimestamp="2026-03-09 13:59:19 +0000 UTC" firstStartedPulling="2026-03-09 13:59:20.517278476 +0000 UTC m=+2315.767450404" lastFinishedPulling="2026-03-09 13:59:20.95292361 
+0000 UTC m=+2316.203095518" observedRunningTime="2026-03-09 13:59:21.452082779 +0000 UTC m=+2316.702254697" watchObservedRunningTime="2026-03-09 13:59:21.462602359 +0000 UTC m=+2316.712774267" Mar 09 13:59:25 crc kubenswrapper[4764]: I0309 13:59:25.567986 4764 scope.go:117] "RemoveContainer" containerID="827bb36b06cc9f42c479b857320543a7b207fe3138757e8a034890cd3b0b0211" Mar 09 13:59:25 crc kubenswrapper[4764]: E0309 13:59:25.569012 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xxczl_openshift-machine-config-operator(6bcdd179-43c2-427c-9fac-7155c122e922)\"" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922" Mar 09 13:59:34 crc kubenswrapper[4764]: I0309 13:59:34.938869 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-9bhcc"] Mar 09 13:59:34 crc kubenswrapper[4764]: I0309 13:59:34.942586 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9bhcc" Mar 09 13:59:34 crc kubenswrapper[4764]: I0309 13:59:34.951212 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-9bhcc"] Mar 09 13:59:34 crc kubenswrapper[4764]: I0309 13:59:34.975718 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5bbjx\" (UniqueName: \"kubernetes.io/projected/9d992693-633d-4d51-9c8d-965e2ee308f6-kube-api-access-5bbjx\") pod \"redhat-marketplace-9bhcc\" (UID: \"9d992693-633d-4d51-9c8d-965e2ee308f6\") " pod="openshift-marketplace/redhat-marketplace-9bhcc" Mar 09 13:59:34 crc kubenswrapper[4764]: I0309 13:59:34.975829 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9d992693-633d-4d51-9c8d-965e2ee308f6-catalog-content\") pod \"redhat-marketplace-9bhcc\" (UID: \"9d992693-633d-4d51-9c8d-965e2ee308f6\") " pod="openshift-marketplace/redhat-marketplace-9bhcc" Mar 09 13:59:34 crc kubenswrapper[4764]: I0309 13:59:34.975956 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9d992693-633d-4d51-9c8d-965e2ee308f6-utilities\") pod \"redhat-marketplace-9bhcc\" (UID: \"9d992693-633d-4d51-9c8d-965e2ee308f6\") " pod="openshift-marketplace/redhat-marketplace-9bhcc" Mar 09 13:59:35 crc kubenswrapper[4764]: I0309 13:59:35.078382 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9d992693-633d-4d51-9c8d-965e2ee308f6-catalog-content\") pod \"redhat-marketplace-9bhcc\" (UID: \"9d992693-633d-4d51-9c8d-965e2ee308f6\") " pod="openshift-marketplace/redhat-marketplace-9bhcc" Mar 09 13:59:35 crc kubenswrapper[4764]: I0309 13:59:35.078526 4764 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9d992693-633d-4d51-9c8d-965e2ee308f6-utilities\") pod \"redhat-marketplace-9bhcc\" (UID: \"9d992693-633d-4d51-9c8d-965e2ee308f6\") " pod="openshift-marketplace/redhat-marketplace-9bhcc" Mar 09 13:59:35 crc kubenswrapper[4764]: I0309 13:59:35.078621 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5bbjx\" (UniqueName: \"kubernetes.io/projected/9d992693-633d-4d51-9c8d-965e2ee308f6-kube-api-access-5bbjx\") pod \"redhat-marketplace-9bhcc\" (UID: \"9d992693-633d-4d51-9c8d-965e2ee308f6\") " pod="openshift-marketplace/redhat-marketplace-9bhcc" Mar 09 13:59:35 crc kubenswrapper[4764]: I0309 13:59:35.078895 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9d992693-633d-4d51-9c8d-965e2ee308f6-catalog-content\") pod \"redhat-marketplace-9bhcc\" (UID: \"9d992693-633d-4d51-9c8d-965e2ee308f6\") " pod="openshift-marketplace/redhat-marketplace-9bhcc" Mar 09 13:59:35 crc kubenswrapper[4764]: I0309 13:59:35.079212 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9d992693-633d-4d51-9c8d-965e2ee308f6-utilities\") pod \"redhat-marketplace-9bhcc\" (UID: \"9d992693-633d-4d51-9c8d-965e2ee308f6\") " pod="openshift-marketplace/redhat-marketplace-9bhcc" Mar 09 13:59:35 crc kubenswrapper[4764]: I0309 13:59:35.108166 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5bbjx\" (UniqueName: \"kubernetes.io/projected/9d992693-633d-4d51-9c8d-965e2ee308f6-kube-api-access-5bbjx\") pod \"redhat-marketplace-9bhcc\" (UID: \"9d992693-633d-4d51-9c8d-965e2ee308f6\") " pod="openshift-marketplace/redhat-marketplace-9bhcc" Mar 09 13:59:35 crc kubenswrapper[4764]: I0309 13:59:35.278240 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9bhcc" Mar 09 13:59:35 crc kubenswrapper[4764]: I0309 13:59:35.626385 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-9bhcc"] Mar 09 13:59:35 crc kubenswrapper[4764]: W0309 13:59:35.647407 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9d992693_633d_4d51_9c8d_965e2ee308f6.slice/crio-9b55baf19f6fada22e37fe1cc527b71c3a02abd3b4bda4dc7937acfff5ac91bf WatchSource:0}: Error finding container 9b55baf19f6fada22e37fe1cc527b71c3a02abd3b4bda4dc7937acfff5ac91bf: Status 404 returned error can't find the container with id 9b55baf19f6fada22e37fe1cc527b71c3a02abd3b4bda4dc7937acfff5ac91bf Mar 09 13:59:36 crc kubenswrapper[4764]: I0309 13:59:36.561714 4764 generic.go:334] "Generic (PLEG): container finished" podID="9d992693-633d-4d51-9c8d-965e2ee308f6" containerID="10b85fab7d63359ecd39b07502ac92df194594bb649d4a8636d61a702e9c2ca9" exitCode=0 Mar 09 13:59:36 crc kubenswrapper[4764]: I0309 13:59:36.561791 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9bhcc" event={"ID":"9d992693-633d-4d51-9c8d-965e2ee308f6","Type":"ContainerDied","Data":"10b85fab7d63359ecd39b07502ac92df194594bb649d4a8636d61a702e9c2ca9"} Mar 09 13:59:36 crc kubenswrapper[4764]: I0309 13:59:36.561856 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9bhcc" event={"ID":"9d992693-633d-4d51-9c8d-965e2ee308f6","Type":"ContainerStarted","Data":"9b55baf19f6fada22e37fe1cc527b71c3a02abd3b4bda4dc7937acfff5ac91bf"} Mar 09 13:59:37 crc kubenswrapper[4764]: I0309 13:59:37.559638 4764 scope.go:117] "RemoveContainer" containerID="827bb36b06cc9f42c479b857320543a7b207fe3138757e8a034890cd3b0b0211" Mar 09 13:59:37 crc kubenswrapper[4764]: E0309 13:59:37.560427 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed 
to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xxczl_openshift-machine-config-operator(6bcdd179-43c2-427c-9fac-7155c122e922)\"" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922" Mar 09 13:59:37 crc kubenswrapper[4764]: I0309 13:59:37.575055 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9bhcc" event={"ID":"9d992693-633d-4d51-9c8d-965e2ee308f6","Type":"ContainerStarted","Data":"5665e0e89795cb215fdc2084c3f5e88666f9a37297088c9e625faf6bee01a1c4"} Mar 09 13:59:38 crc kubenswrapper[4764]: I0309 13:59:38.613811 4764 generic.go:334] "Generic (PLEG): container finished" podID="9d992693-633d-4d51-9c8d-965e2ee308f6" containerID="5665e0e89795cb215fdc2084c3f5e88666f9a37297088c9e625faf6bee01a1c4" exitCode=0 Mar 09 13:59:38 crc kubenswrapper[4764]: I0309 13:59:38.613928 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9bhcc" event={"ID":"9d992693-633d-4d51-9c8d-965e2ee308f6","Type":"ContainerDied","Data":"5665e0e89795cb215fdc2084c3f5e88666f9a37297088c9e625faf6bee01a1c4"} Mar 09 13:59:39 crc kubenswrapper[4764]: I0309 13:59:39.628523 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9bhcc" event={"ID":"9d992693-633d-4d51-9c8d-965e2ee308f6","Type":"ContainerStarted","Data":"c2852137dee859a501c3f8a28af6eb95c9b607c41dee834f3e668b1c1c9858e0"} Mar 09 13:59:39 crc kubenswrapper[4764]: I0309 13:59:39.653178 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-9bhcc" podStartSLOduration=3.119546915 podStartE2EDuration="5.653151056s" podCreationTimestamp="2026-03-09 13:59:34 +0000 UTC" firstStartedPulling="2026-03-09 13:59:36.564545397 +0000 UTC m=+2331.814717305" 
lastFinishedPulling="2026-03-09 13:59:39.098149538 +0000 UTC m=+2334.348321446" observedRunningTime="2026-03-09 13:59:39.651483382 +0000 UTC m=+2334.901655310" watchObservedRunningTime="2026-03-09 13:59:39.653151056 +0000 UTC m=+2334.903322984" Mar 09 13:59:45 crc kubenswrapper[4764]: I0309 13:59:45.279709 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-9bhcc" Mar 09 13:59:45 crc kubenswrapper[4764]: I0309 13:59:45.280793 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-9bhcc" Mar 09 13:59:45 crc kubenswrapper[4764]: I0309 13:59:45.339120 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-9bhcc" Mar 09 13:59:45 crc kubenswrapper[4764]: I0309 13:59:45.758090 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-9bhcc" Mar 09 13:59:45 crc kubenswrapper[4764]: I0309 13:59:45.821395 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-9bhcc"] Mar 09 13:59:47 crc kubenswrapper[4764]: I0309 13:59:47.716204 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-9bhcc" podUID="9d992693-633d-4d51-9c8d-965e2ee308f6" containerName="registry-server" containerID="cri-o://c2852137dee859a501c3f8a28af6eb95c9b607c41dee834f3e668b1c1c9858e0" gracePeriod=2 Mar 09 13:59:48 crc kubenswrapper[4764]: I0309 13:59:48.560401 4764 scope.go:117] "RemoveContainer" containerID="827bb36b06cc9f42c479b857320543a7b207fe3138757e8a034890cd3b0b0211" Mar 09 13:59:48 crc kubenswrapper[4764]: E0309 13:59:48.561219 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-xxczl_openshift-machine-config-operator(6bcdd179-43c2-427c-9fac-7155c122e922)\"" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922" Mar 09 13:59:48 crc kubenswrapper[4764]: I0309 13:59:48.706391 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9bhcc" Mar 09 13:59:48 crc kubenswrapper[4764]: I0309 13:59:48.730589 4764 generic.go:334] "Generic (PLEG): container finished" podID="9d992693-633d-4d51-9c8d-965e2ee308f6" containerID="c2852137dee859a501c3f8a28af6eb95c9b607c41dee834f3e668b1c1c9858e0" exitCode=0 Mar 09 13:59:48 crc kubenswrapper[4764]: I0309 13:59:48.730705 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9bhcc" event={"ID":"9d992693-633d-4d51-9c8d-965e2ee308f6","Type":"ContainerDied","Data":"c2852137dee859a501c3f8a28af6eb95c9b607c41dee834f3e668b1c1c9858e0"} Mar 09 13:59:48 crc kubenswrapper[4764]: I0309 13:59:48.730816 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9bhcc" event={"ID":"9d992693-633d-4d51-9c8d-965e2ee308f6","Type":"ContainerDied","Data":"9b55baf19f6fada22e37fe1cc527b71c3a02abd3b4bda4dc7937acfff5ac91bf"} Mar 09 13:59:48 crc kubenswrapper[4764]: I0309 13:59:48.730847 4764 scope.go:117] "RemoveContainer" containerID="c2852137dee859a501c3f8a28af6eb95c9b607c41dee834f3e668b1c1c9858e0" Mar 09 13:59:48 crc kubenswrapper[4764]: I0309 13:59:48.730737 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9bhcc" Mar 09 13:59:48 crc kubenswrapper[4764]: I0309 13:59:48.771111 4764 scope.go:117] "RemoveContainer" containerID="5665e0e89795cb215fdc2084c3f5e88666f9a37297088c9e625faf6bee01a1c4" Mar 09 13:59:48 crc kubenswrapper[4764]: I0309 13:59:48.796334 4764 scope.go:117] "RemoveContainer" containerID="10b85fab7d63359ecd39b07502ac92df194594bb649d4a8636d61a702e9c2ca9" Mar 09 13:59:48 crc kubenswrapper[4764]: I0309 13:59:48.818000 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5bbjx\" (UniqueName: \"kubernetes.io/projected/9d992693-633d-4d51-9c8d-965e2ee308f6-kube-api-access-5bbjx\") pod \"9d992693-633d-4d51-9c8d-965e2ee308f6\" (UID: \"9d992693-633d-4d51-9c8d-965e2ee308f6\") " Mar 09 13:59:48 crc kubenswrapper[4764]: I0309 13:59:48.818075 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9d992693-633d-4d51-9c8d-965e2ee308f6-utilities\") pod \"9d992693-633d-4d51-9c8d-965e2ee308f6\" (UID: \"9d992693-633d-4d51-9c8d-965e2ee308f6\") " Mar 09 13:59:48 crc kubenswrapper[4764]: I0309 13:59:48.818371 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9d992693-633d-4d51-9c8d-965e2ee308f6-catalog-content\") pod \"9d992693-633d-4d51-9c8d-965e2ee308f6\" (UID: \"9d992693-633d-4d51-9c8d-965e2ee308f6\") " Mar 09 13:59:48 crc kubenswrapper[4764]: I0309 13:59:48.819528 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9d992693-633d-4d51-9c8d-965e2ee308f6-utilities" (OuterVolumeSpecName: "utilities") pod "9d992693-633d-4d51-9c8d-965e2ee308f6" (UID: "9d992693-633d-4d51-9c8d-965e2ee308f6"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 13:59:48 crc kubenswrapper[4764]: I0309 13:59:48.831960 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d992693-633d-4d51-9c8d-965e2ee308f6-kube-api-access-5bbjx" (OuterVolumeSpecName: "kube-api-access-5bbjx") pod "9d992693-633d-4d51-9c8d-965e2ee308f6" (UID: "9d992693-633d-4d51-9c8d-965e2ee308f6"). InnerVolumeSpecName "kube-api-access-5bbjx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:59:48 crc kubenswrapper[4764]: I0309 13:59:48.849290 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9d992693-633d-4d51-9c8d-965e2ee308f6-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9d992693-633d-4d51-9c8d-965e2ee308f6" (UID: "9d992693-633d-4d51-9c8d-965e2ee308f6"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 13:59:48 crc kubenswrapper[4764]: I0309 13:59:48.851046 4764 scope.go:117] "RemoveContainer" containerID="c2852137dee859a501c3f8a28af6eb95c9b607c41dee834f3e668b1c1c9858e0" Mar 09 13:59:48 crc kubenswrapper[4764]: E0309 13:59:48.852860 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c2852137dee859a501c3f8a28af6eb95c9b607c41dee834f3e668b1c1c9858e0\": container with ID starting with c2852137dee859a501c3f8a28af6eb95c9b607c41dee834f3e668b1c1c9858e0 not found: ID does not exist" containerID="c2852137dee859a501c3f8a28af6eb95c9b607c41dee834f3e668b1c1c9858e0" Mar 09 13:59:48 crc kubenswrapper[4764]: I0309 13:59:48.852901 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c2852137dee859a501c3f8a28af6eb95c9b607c41dee834f3e668b1c1c9858e0"} err="failed to get container status \"c2852137dee859a501c3f8a28af6eb95c9b607c41dee834f3e668b1c1c9858e0\": rpc error: code = NotFound desc = could not find 
container \"c2852137dee859a501c3f8a28af6eb95c9b607c41dee834f3e668b1c1c9858e0\": container with ID starting with c2852137dee859a501c3f8a28af6eb95c9b607c41dee834f3e668b1c1c9858e0 not found: ID does not exist" Mar 09 13:59:48 crc kubenswrapper[4764]: I0309 13:59:48.852929 4764 scope.go:117] "RemoveContainer" containerID="5665e0e89795cb215fdc2084c3f5e88666f9a37297088c9e625faf6bee01a1c4" Mar 09 13:59:48 crc kubenswrapper[4764]: E0309 13:59:48.853516 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5665e0e89795cb215fdc2084c3f5e88666f9a37297088c9e625faf6bee01a1c4\": container with ID starting with 5665e0e89795cb215fdc2084c3f5e88666f9a37297088c9e625faf6bee01a1c4 not found: ID does not exist" containerID="5665e0e89795cb215fdc2084c3f5e88666f9a37297088c9e625faf6bee01a1c4" Mar 09 13:59:48 crc kubenswrapper[4764]: I0309 13:59:48.853564 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5665e0e89795cb215fdc2084c3f5e88666f9a37297088c9e625faf6bee01a1c4"} err="failed to get container status \"5665e0e89795cb215fdc2084c3f5e88666f9a37297088c9e625faf6bee01a1c4\": rpc error: code = NotFound desc = could not find container \"5665e0e89795cb215fdc2084c3f5e88666f9a37297088c9e625faf6bee01a1c4\": container with ID starting with 5665e0e89795cb215fdc2084c3f5e88666f9a37297088c9e625faf6bee01a1c4 not found: ID does not exist" Mar 09 13:59:48 crc kubenswrapper[4764]: I0309 13:59:48.853598 4764 scope.go:117] "RemoveContainer" containerID="10b85fab7d63359ecd39b07502ac92df194594bb649d4a8636d61a702e9c2ca9" Mar 09 13:59:48 crc kubenswrapper[4764]: E0309 13:59:48.854554 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"10b85fab7d63359ecd39b07502ac92df194594bb649d4a8636d61a702e9c2ca9\": container with ID starting with 10b85fab7d63359ecd39b07502ac92df194594bb649d4a8636d61a702e9c2ca9 not found: ID does 
not exist" containerID="10b85fab7d63359ecd39b07502ac92df194594bb649d4a8636d61a702e9c2ca9" Mar 09 13:59:48 crc kubenswrapper[4764]: I0309 13:59:48.854626 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"10b85fab7d63359ecd39b07502ac92df194594bb649d4a8636d61a702e9c2ca9"} err="failed to get container status \"10b85fab7d63359ecd39b07502ac92df194594bb649d4a8636d61a702e9c2ca9\": rpc error: code = NotFound desc = could not find container \"10b85fab7d63359ecd39b07502ac92df194594bb649d4a8636d61a702e9c2ca9\": container with ID starting with 10b85fab7d63359ecd39b07502ac92df194594bb649d4a8636d61a702e9c2ca9 not found: ID does not exist" Mar 09 13:59:48 crc kubenswrapper[4764]: I0309 13:59:48.921730 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5bbjx\" (UniqueName: \"kubernetes.io/projected/9d992693-633d-4d51-9c8d-965e2ee308f6-kube-api-access-5bbjx\") on node \"crc\" DevicePath \"\"" Mar 09 13:59:48 crc kubenswrapper[4764]: I0309 13:59:48.921772 4764 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9d992693-633d-4d51-9c8d-965e2ee308f6-utilities\") on node \"crc\" DevicePath \"\"" Mar 09 13:59:48 crc kubenswrapper[4764]: I0309 13:59:48.921786 4764 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9d992693-633d-4d51-9c8d-965e2ee308f6-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 09 13:59:49 crc kubenswrapper[4764]: I0309 13:59:49.075754 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-9bhcc"] Mar 09 13:59:49 crc kubenswrapper[4764]: I0309 13:59:49.088169 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-9bhcc"] Mar 09 13:59:49 crc kubenswrapper[4764]: E0309 13:59:49.224192 4764 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" 
err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9d992693_633d_4d51_9c8d_965e2ee308f6.slice/crio-9b55baf19f6fada22e37fe1cc527b71c3a02abd3b4bda4dc7937acfff5ac91bf\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9d992693_633d_4d51_9c8d_965e2ee308f6.slice\": RecentStats: unable to find data in memory cache]" Mar 09 13:59:49 crc kubenswrapper[4764]: I0309 13:59:49.574093 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d992693-633d-4d51-9c8d-965e2ee308f6" path="/var/lib/kubelet/pods/9d992693-633d-4d51-9c8d-965e2ee308f6/volumes" Mar 09 13:59:57 crc kubenswrapper[4764]: I0309 13:59:57.837918 4764 generic.go:334] "Generic (PLEG): container finished" podID="ea3a2b04-e009-4dcd-8eca-543cc084b329" containerID="91d47ab481b9f29f3401d128d19aa21a508e0899ce7b78b79fba40dc480e7870" exitCode=0 Mar 09 13:59:57 crc kubenswrapper[4764]: I0309 13:59:57.838550 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-5b2nl" event={"ID":"ea3a2b04-e009-4dcd-8eca-543cc084b329","Type":"ContainerDied","Data":"91d47ab481b9f29f3401d128d19aa21a508e0899ce7b78b79fba40dc480e7870"} Mar 09 13:59:59 crc kubenswrapper[4764]: I0309 13:59:59.280794 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-5b2nl" Mar 09 13:59:59 crc kubenswrapper[4764]: I0309 13:59:59.352019 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/ea3a2b04-e009-4dcd-8eca-543cc084b329-ceph\") pod \"ea3a2b04-e009-4dcd-8eca-543cc084b329\" (UID: \"ea3a2b04-e009-4dcd-8eca-543cc084b329\") " Mar 09 13:59:59 crc kubenswrapper[4764]: I0309 13:59:59.352313 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ea3a2b04-e009-4dcd-8eca-543cc084b329-inventory\") pod \"ea3a2b04-e009-4dcd-8eca-543cc084b329\" (UID: \"ea3a2b04-e009-4dcd-8eca-543cc084b329\") " Mar 09 13:59:59 crc kubenswrapper[4764]: I0309 13:59:59.352373 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4t9pd\" (UniqueName: \"kubernetes.io/projected/ea3a2b04-e009-4dcd-8eca-543cc084b329-kube-api-access-4t9pd\") pod \"ea3a2b04-e009-4dcd-8eca-543cc084b329\" (UID: \"ea3a2b04-e009-4dcd-8eca-543cc084b329\") " Mar 09 13:59:59 crc kubenswrapper[4764]: I0309 13:59:59.352403 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ea3a2b04-e009-4dcd-8eca-543cc084b329-ssh-key-openstack-edpm-ipam\") pod \"ea3a2b04-e009-4dcd-8eca-543cc084b329\" (UID: \"ea3a2b04-e009-4dcd-8eca-543cc084b329\") " Mar 09 13:59:59 crc kubenswrapper[4764]: I0309 13:59:59.360500 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ea3a2b04-e009-4dcd-8eca-543cc084b329-ceph" (OuterVolumeSpecName: "ceph") pod "ea3a2b04-e009-4dcd-8eca-543cc084b329" (UID: "ea3a2b04-e009-4dcd-8eca-543cc084b329"). InnerVolumeSpecName "ceph". 
PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 13:59:59 crc kubenswrapper[4764]: I0309 13:59:59.360937 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ea3a2b04-e009-4dcd-8eca-543cc084b329-kube-api-access-4t9pd" (OuterVolumeSpecName: "kube-api-access-4t9pd") pod "ea3a2b04-e009-4dcd-8eca-543cc084b329" (UID: "ea3a2b04-e009-4dcd-8eca-543cc084b329"). InnerVolumeSpecName "kube-api-access-4t9pd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 13:59:59 crc kubenswrapper[4764]: I0309 13:59:59.385737 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ea3a2b04-e009-4dcd-8eca-543cc084b329-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "ea3a2b04-e009-4dcd-8eca-543cc084b329" (UID: "ea3a2b04-e009-4dcd-8eca-543cc084b329"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 13:59:59 crc kubenswrapper[4764]: I0309 13:59:59.400961 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ea3a2b04-e009-4dcd-8eca-543cc084b329-inventory" (OuterVolumeSpecName: "inventory") pod "ea3a2b04-e009-4dcd-8eca-543cc084b329" (UID: "ea3a2b04-e009-4dcd-8eca-543cc084b329"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 13:59:59 crc kubenswrapper[4764]: I0309 13:59:59.457747 4764 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ea3a2b04-e009-4dcd-8eca-543cc084b329-inventory\") on node \"crc\" DevicePath \"\""
Mar 09 13:59:59 crc kubenswrapper[4764]: I0309 13:59:59.458398 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4t9pd\" (UniqueName: \"kubernetes.io/projected/ea3a2b04-e009-4dcd-8eca-543cc084b329-kube-api-access-4t9pd\") on node \"crc\" DevicePath \"\""
Mar 09 13:59:59 crc kubenswrapper[4764]: I0309 13:59:59.458416 4764 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ea3a2b04-e009-4dcd-8eca-543cc084b329-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\""
Mar 09 13:59:59 crc kubenswrapper[4764]: I0309 13:59:59.458430 4764 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/ea3a2b04-e009-4dcd-8eca-543cc084b329-ceph\") on node \"crc\" DevicePath \"\""
Mar 09 13:59:59 crc kubenswrapper[4764]: I0309 13:59:59.869631 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-5b2nl" event={"ID":"ea3a2b04-e009-4dcd-8eca-543cc084b329","Type":"ContainerDied","Data":"29a0e5d6218a8119d39ffa3dd74128ff15504a3840777dcd48ebe0d5cf642624"}
Mar 09 13:59:59 crc kubenswrapper[4764]: I0309 13:59:59.869703 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="29a0e5d6218a8119d39ffa3dd74128ff15504a3840777dcd48ebe0d5cf642624"
Mar 09 13:59:59 crc kubenswrapper[4764]: I0309 13:59:59.869805 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-5b2nl"
Mar 09 13:59:59 crc kubenswrapper[4764]: I0309 13:59:59.984018 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-x85q2"]
Mar 09 13:59:59 crc kubenswrapper[4764]: E0309 13:59:59.984580 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9d992693-633d-4d51-9c8d-965e2ee308f6" containerName="extract-content"
Mar 09 13:59:59 crc kubenswrapper[4764]: I0309 13:59:59.984600 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d992693-633d-4d51-9c8d-965e2ee308f6" containerName="extract-content"
Mar 09 13:59:59 crc kubenswrapper[4764]: E0309 13:59:59.984616 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9d992693-633d-4d51-9c8d-965e2ee308f6" containerName="extract-utilities"
Mar 09 13:59:59 crc kubenswrapper[4764]: I0309 13:59:59.984624 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d992693-633d-4d51-9c8d-965e2ee308f6" containerName="extract-utilities"
Mar 09 13:59:59 crc kubenswrapper[4764]: E0309 13:59:59.984659 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea3a2b04-e009-4dcd-8eca-543cc084b329" containerName="configure-os-edpm-deployment-openstack-edpm-ipam"
Mar 09 13:59:59 crc kubenswrapper[4764]: I0309 13:59:59.984667 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea3a2b04-e009-4dcd-8eca-543cc084b329" containerName="configure-os-edpm-deployment-openstack-edpm-ipam"
Mar 09 13:59:59 crc kubenswrapper[4764]: E0309 13:59:59.984693 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9d992693-633d-4d51-9c8d-965e2ee308f6" containerName="registry-server"
Mar 09 13:59:59 crc kubenswrapper[4764]: I0309 13:59:59.984700 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d992693-633d-4d51-9c8d-965e2ee308f6" containerName="registry-server"
Mar 09 13:59:59 crc kubenswrapper[4764]: I0309 13:59:59.984895 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="9d992693-633d-4d51-9c8d-965e2ee308f6" containerName="registry-server"
Mar 09 13:59:59 crc kubenswrapper[4764]: I0309 13:59:59.984915 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="ea3a2b04-e009-4dcd-8eca-543cc084b329" containerName="configure-os-edpm-deployment-openstack-edpm-ipam"
Mar 09 13:59:59 crc kubenswrapper[4764]: I0309 13:59:59.985815 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-x85q2"
Mar 09 13:59:59 crc kubenswrapper[4764]: I0309 13:59:59.992391 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Mar 09 13:59:59 crc kubenswrapper[4764]: I0309 13:59:59.992835 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files"
Mar 09 13:59:59 crc kubenswrapper[4764]: I0309 13:59:59.992863 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Mar 09 13:59:59 crc kubenswrapper[4764]: I0309 13:59:59.992908 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-ctrj8"
Mar 09 13:59:59 crc kubenswrapper[4764]: I0309 13:59:59.995048 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Mar 09 13:59:59 crc kubenswrapper[4764]: I0309 13:59:59.998254 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-x85q2"]
Mar 09 14:00:00 crc kubenswrapper[4764]: I0309 14:00:00.076402 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q2q9z\" (UniqueName: \"kubernetes.io/projected/23319545-4107-4a83-b7e1-955e4648bf7b-kube-api-access-q2q9z\") pod \"ssh-known-hosts-edpm-deployment-x85q2\" (UID: \"23319545-4107-4a83-b7e1-955e4648bf7b\") " pod="openstack/ssh-known-hosts-edpm-deployment-x85q2"
Mar 09 14:00:00 crc kubenswrapper[4764]: I0309 14:00:00.076471 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/23319545-4107-4a83-b7e1-955e4648bf7b-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-x85q2\" (UID: \"23319545-4107-4a83-b7e1-955e4648bf7b\") " pod="openstack/ssh-known-hosts-edpm-deployment-x85q2"
Mar 09 14:00:00 crc kubenswrapper[4764]: I0309 14:00:00.076537 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/23319545-4107-4a83-b7e1-955e4648bf7b-ceph\") pod \"ssh-known-hosts-edpm-deployment-x85q2\" (UID: \"23319545-4107-4a83-b7e1-955e4648bf7b\") " pod="openstack/ssh-known-hosts-edpm-deployment-x85q2"
Mar 09 14:00:00 crc kubenswrapper[4764]: I0309 14:00:00.076606 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/23319545-4107-4a83-b7e1-955e4648bf7b-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-x85q2\" (UID: \"23319545-4107-4a83-b7e1-955e4648bf7b\") " pod="openstack/ssh-known-hosts-edpm-deployment-x85q2"
Mar 09 14:00:00 crc kubenswrapper[4764]: I0309 14:00:00.139894 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29551080-svz2w"]
Mar 09 14:00:00 crc kubenswrapper[4764]: I0309 14:00:00.141583 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551080-svz2w"
Mar 09 14:00:00 crc kubenswrapper[4764]: I0309 14:00:00.144230 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 09 14:00:00 crc kubenswrapper[4764]: I0309 14:00:00.144263 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-4s7mr"
Mar 09 14:00:00 crc kubenswrapper[4764]: I0309 14:00:00.144312 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 09 14:00:00 crc kubenswrapper[4764]: I0309 14:00:00.151111 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551080-svz2w"]
Mar 09 14:00:00 crc kubenswrapper[4764]: I0309 14:00:00.179170 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q2q9z\" (UniqueName: \"kubernetes.io/projected/23319545-4107-4a83-b7e1-955e4648bf7b-kube-api-access-q2q9z\") pod \"ssh-known-hosts-edpm-deployment-x85q2\" (UID: \"23319545-4107-4a83-b7e1-955e4648bf7b\") " pod="openstack/ssh-known-hosts-edpm-deployment-x85q2"
Mar 09 14:00:00 crc kubenswrapper[4764]: I0309 14:00:00.179246 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/23319545-4107-4a83-b7e1-955e4648bf7b-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-x85q2\" (UID: \"23319545-4107-4a83-b7e1-955e4648bf7b\") " pod="openstack/ssh-known-hosts-edpm-deployment-x85q2"
Mar 09 14:00:00 crc kubenswrapper[4764]: I0309 14:00:00.179290 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/23319545-4107-4a83-b7e1-955e4648bf7b-ceph\") pod \"ssh-known-hosts-edpm-deployment-x85q2\" (UID: \"23319545-4107-4a83-b7e1-955e4648bf7b\") " pod="openstack/ssh-known-hosts-edpm-deployment-x85q2"
Mar 09 14:00:00 crc kubenswrapper[4764]: I0309 14:00:00.179328 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/23319545-4107-4a83-b7e1-955e4648bf7b-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-x85q2\" (UID: \"23319545-4107-4a83-b7e1-955e4648bf7b\") " pod="openstack/ssh-known-hosts-edpm-deployment-x85q2"
Mar 09 14:00:00 crc kubenswrapper[4764]: I0309 14:00:00.185757 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/23319545-4107-4a83-b7e1-955e4648bf7b-ceph\") pod \"ssh-known-hosts-edpm-deployment-x85q2\" (UID: \"23319545-4107-4a83-b7e1-955e4648bf7b\") " pod="openstack/ssh-known-hosts-edpm-deployment-x85q2"
Mar 09 14:00:00 crc kubenswrapper[4764]: I0309 14:00:00.186063 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/23319545-4107-4a83-b7e1-955e4648bf7b-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-x85q2\" (UID: \"23319545-4107-4a83-b7e1-955e4648bf7b\") " pod="openstack/ssh-known-hosts-edpm-deployment-x85q2"
Mar 09 14:00:00 crc kubenswrapper[4764]: I0309 14:00:00.186676 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/23319545-4107-4a83-b7e1-955e4648bf7b-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-x85q2\" (UID: \"23319545-4107-4a83-b7e1-955e4648bf7b\") " pod="openstack/ssh-known-hosts-edpm-deployment-x85q2"
Mar 09 14:00:00 crc kubenswrapper[4764]: I0309 14:00:00.196564 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q2q9z\" (UniqueName: \"kubernetes.io/projected/23319545-4107-4a83-b7e1-955e4648bf7b-kube-api-access-q2q9z\") pod \"ssh-known-hosts-edpm-deployment-x85q2\" (UID: \"23319545-4107-4a83-b7e1-955e4648bf7b\") " pod="openstack/ssh-known-hosts-edpm-deployment-x85q2"
Mar 09 14:00:00 crc kubenswrapper[4764]: I0309 14:00:00.245136 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29551080-n964g"]
Mar 09 14:00:00 crc kubenswrapper[4764]: I0309 14:00:00.247929 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29551080-n964g"
Mar 09 14:00:00 crc kubenswrapper[4764]: I0309 14:00:00.253234 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Mar 09 14:00:00 crc kubenswrapper[4764]: I0309 14:00:00.253567 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Mar 09 14:00:00 crc kubenswrapper[4764]: I0309 14:00:00.257977 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29551080-n964g"]
Mar 09 14:00:00 crc kubenswrapper[4764]: I0309 14:00:00.281076 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d8vx5\" (UniqueName: \"kubernetes.io/projected/f6eebc0e-7e89-4489-b808-7eebf0e54dca-kube-api-access-d8vx5\") pod \"auto-csr-approver-29551080-svz2w\" (UID: \"f6eebc0e-7e89-4489-b808-7eebf0e54dca\") " pod="openshift-infra/auto-csr-approver-29551080-svz2w"
Mar 09 14:00:00 crc kubenswrapper[4764]: I0309 14:00:00.309695 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-x85q2"
Mar 09 14:00:00 crc kubenswrapper[4764]: I0309 14:00:00.383597 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/4f13122f-94d3-47ba-9c7c-989ebe96468e-secret-volume\") pod \"collect-profiles-29551080-n964g\" (UID: \"4f13122f-94d3-47ba-9c7c-989ebe96468e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551080-n964g"
Mar 09 14:00:00 crc kubenswrapper[4764]: I0309 14:00:00.383704 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4f13122f-94d3-47ba-9c7c-989ebe96468e-config-volume\") pod \"collect-profiles-29551080-n964g\" (UID: \"4f13122f-94d3-47ba-9c7c-989ebe96468e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551080-n964g"
Mar 09 14:00:00 crc kubenswrapper[4764]: I0309 14:00:00.384174 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d5dvz\" (UniqueName: \"kubernetes.io/projected/4f13122f-94d3-47ba-9c7c-989ebe96468e-kube-api-access-d5dvz\") pod \"collect-profiles-29551080-n964g\" (UID: \"4f13122f-94d3-47ba-9c7c-989ebe96468e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551080-n964g"
Mar 09 14:00:00 crc kubenswrapper[4764]: I0309 14:00:00.384563 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d8vx5\" (UniqueName: \"kubernetes.io/projected/f6eebc0e-7e89-4489-b808-7eebf0e54dca-kube-api-access-d8vx5\") pod \"auto-csr-approver-29551080-svz2w\" (UID: \"f6eebc0e-7e89-4489-b808-7eebf0e54dca\") " pod="openshift-infra/auto-csr-approver-29551080-svz2w"
Mar 09 14:00:00 crc kubenswrapper[4764]: I0309 14:00:00.412858 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d8vx5\" (UniqueName: \"kubernetes.io/projected/f6eebc0e-7e89-4489-b808-7eebf0e54dca-kube-api-access-d8vx5\") pod \"auto-csr-approver-29551080-svz2w\" (UID: \"f6eebc0e-7e89-4489-b808-7eebf0e54dca\") " pod="openshift-infra/auto-csr-approver-29551080-svz2w"
Mar 09 14:00:00 crc kubenswrapper[4764]: I0309 14:00:00.460207 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551080-svz2w"
Mar 09 14:00:00 crc kubenswrapper[4764]: I0309 14:00:00.486356 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/4f13122f-94d3-47ba-9c7c-989ebe96468e-secret-volume\") pod \"collect-profiles-29551080-n964g\" (UID: \"4f13122f-94d3-47ba-9c7c-989ebe96468e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551080-n964g"
Mar 09 14:00:00 crc kubenswrapper[4764]: I0309 14:00:00.486463 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4f13122f-94d3-47ba-9c7c-989ebe96468e-config-volume\") pod \"collect-profiles-29551080-n964g\" (UID: \"4f13122f-94d3-47ba-9c7c-989ebe96468e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551080-n964g"
Mar 09 14:00:00 crc kubenswrapper[4764]: I0309 14:00:00.486557 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d5dvz\" (UniqueName: \"kubernetes.io/projected/4f13122f-94d3-47ba-9c7c-989ebe96468e-kube-api-access-d5dvz\") pod \"collect-profiles-29551080-n964g\" (UID: \"4f13122f-94d3-47ba-9c7c-989ebe96468e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551080-n964g"
Mar 09 14:00:00 crc kubenswrapper[4764]: I0309 14:00:00.487814 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4f13122f-94d3-47ba-9c7c-989ebe96468e-config-volume\") pod \"collect-profiles-29551080-n964g\" (UID: \"4f13122f-94d3-47ba-9c7c-989ebe96468e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551080-n964g"
Mar 09 14:00:00 crc kubenswrapper[4764]: I0309 14:00:00.490387 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/4f13122f-94d3-47ba-9c7c-989ebe96468e-secret-volume\") pod \"collect-profiles-29551080-n964g\" (UID: \"4f13122f-94d3-47ba-9c7c-989ebe96468e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551080-n964g"
Mar 09 14:00:00 crc kubenswrapper[4764]: I0309 14:00:00.511806 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d5dvz\" (UniqueName: \"kubernetes.io/projected/4f13122f-94d3-47ba-9c7c-989ebe96468e-kube-api-access-d5dvz\") pod \"collect-profiles-29551080-n964g\" (UID: \"4f13122f-94d3-47ba-9c7c-989ebe96468e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551080-n964g"
Mar 09 14:00:00 crc kubenswrapper[4764]: I0309 14:00:00.596780 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29551080-n964g"
Mar 09 14:00:00 crc kubenswrapper[4764]: I0309 14:00:00.888202 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-x85q2"]
Mar 09 14:00:00 crc kubenswrapper[4764]: I0309 14:00:00.903606 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551080-svz2w"]
Mar 09 14:00:00 crc kubenswrapper[4764]: W0309 14:00:00.905458 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf6eebc0e_7e89_4489_b808_7eebf0e54dca.slice/crio-b89a492d7501ec70f38c24543ed56396567de1540f48bf44ac303cff316d4171 WatchSource:0}: Error finding container b89a492d7501ec70f38c24543ed56396567de1540f48bf44ac303cff316d4171: Status 404 returned error can't find the container with id b89a492d7501ec70f38c24543ed56396567de1540f48bf44ac303cff316d4171
Mar 09 14:00:01 crc kubenswrapper[4764]: I0309 14:00:01.039782 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29551080-n964g"]
Mar 09 14:00:01 crc kubenswrapper[4764]: W0309 14:00:01.043198 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4f13122f_94d3_47ba_9c7c_989ebe96468e.slice/crio-243e774d47f8d9a657e2bda283d0e4b662d6968f445495e7b80e3a737580d3e8 WatchSource:0}: Error finding container 243e774d47f8d9a657e2bda283d0e4b662d6968f445495e7b80e3a737580d3e8: Status 404 returned error can't find the container with id 243e774d47f8d9a657e2bda283d0e4b662d6968f445495e7b80e3a737580d3e8
Mar 09 14:00:01 crc kubenswrapper[4764]: I0309 14:00:01.560730 4764 scope.go:117] "RemoveContainer" containerID="827bb36b06cc9f42c479b857320543a7b207fe3138757e8a034890cd3b0b0211"
Mar 09 14:00:01 crc kubenswrapper[4764]: E0309 14:00:01.561504 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xxczl_openshift-machine-config-operator(6bcdd179-43c2-427c-9fac-7155c122e922)\"" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922"
Mar 09 14:00:01 crc kubenswrapper[4764]: I0309 14:00:01.902724 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-x85q2" event={"ID":"23319545-4107-4a83-b7e1-955e4648bf7b","Type":"ContainerStarted","Data":"8012376628a2434c25ec25a001db6c303665a134e1d1b05ef3e255939acfe13c"}
Mar 09 14:00:01 crc kubenswrapper[4764]: I0309 14:00:01.903143 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-x85q2" event={"ID":"23319545-4107-4a83-b7e1-955e4648bf7b","Type":"ContainerStarted","Data":"892d7bf1897aed2e0545cb05dd3f303e9e4c066119aefab5bda9572a1af4261c"}
Mar 09 14:00:01 crc kubenswrapper[4764]: I0309 14:00:01.908474 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551080-svz2w" event={"ID":"f6eebc0e-7e89-4489-b808-7eebf0e54dca","Type":"ContainerStarted","Data":"b89a492d7501ec70f38c24543ed56396567de1540f48bf44ac303cff316d4171"}
Mar 09 14:00:01 crc kubenswrapper[4764]: I0309 14:00:01.912764 4764 generic.go:334] "Generic (PLEG): container finished" podID="4f13122f-94d3-47ba-9c7c-989ebe96468e" containerID="619f9f75884e6a89c52f4fcf55d2dfba6aa2fb01bdb5db49c229617e54ab7608" exitCode=0
Mar 09 14:00:01 crc kubenswrapper[4764]: I0309 14:00:01.912879 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29551080-n964g" event={"ID":"4f13122f-94d3-47ba-9c7c-989ebe96468e","Type":"ContainerDied","Data":"619f9f75884e6a89c52f4fcf55d2dfba6aa2fb01bdb5db49c229617e54ab7608"}
Mar 09 14:00:01 crc kubenswrapper[4764]: I0309 14:00:01.912994 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29551080-n964g" event={"ID":"4f13122f-94d3-47ba-9c7c-989ebe96468e","Type":"ContainerStarted","Data":"243e774d47f8d9a657e2bda283d0e4b662d6968f445495e7b80e3a737580d3e8"}
Mar 09 14:00:01 crc kubenswrapper[4764]: I0309 14:00:01.927379 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ssh-known-hosts-edpm-deployment-x85q2" podStartSLOduration=2.359672535 podStartE2EDuration="2.927351282s" podCreationTimestamp="2026-03-09 13:59:59 +0000 UTC" firstStartedPulling="2026-03-09 14:00:00.893425006 +0000 UTC m=+2356.143596904" lastFinishedPulling="2026-03-09 14:00:01.461103743 +0000 UTC m=+2356.711275651" observedRunningTime="2026-03-09 14:00:01.925053351 +0000 UTC m=+2357.175225259" watchObservedRunningTime="2026-03-09 14:00:01.927351282 +0000 UTC m=+2357.177523190"
Mar 09 14:00:03 crc kubenswrapper[4764]: I0309 14:00:03.265061 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29551080-n964g"
Mar 09 14:00:03 crc kubenswrapper[4764]: I0309 14:00:03.351755 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/4f13122f-94d3-47ba-9c7c-989ebe96468e-secret-volume\") pod \"4f13122f-94d3-47ba-9c7c-989ebe96468e\" (UID: \"4f13122f-94d3-47ba-9c7c-989ebe96468e\") "
Mar 09 14:00:03 crc kubenswrapper[4764]: I0309 14:00:03.351852 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d5dvz\" (UniqueName: \"kubernetes.io/projected/4f13122f-94d3-47ba-9c7c-989ebe96468e-kube-api-access-d5dvz\") pod \"4f13122f-94d3-47ba-9c7c-989ebe96468e\" (UID: \"4f13122f-94d3-47ba-9c7c-989ebe96468e\") "
Mar 09 14:00:03 crc kubenswrapper[4764]: I0309 14:00:03.351923 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4f13122f-94d3-47ba-9c7c-989ebe96468e-config-volume\") pod \"4f13122f-94d3-47ba-9c7c-989ebe96468e\" (UID: \"4f13122f-94d3-47ba-9c7c-989ebe96468e\") "
Mar 09 14:00:03 crc kubenswrapper[4764]: I0309 14:00:03.353251 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4f13122f-94d3-47ba-9c7c-989ebe96468e-config-volume" (OuterVolumeSpecName: "config-volume") pod "4f13122f-94d3-47ba-9c7c-989ebe96468e" (UID: "4f13122f-94d3-47ba-9c7c-989ebe96468e"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 14:00:03 crc kubenswrapper[4764]: I0309 14:00:03.360060 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4f13122f-94d3-47ba-9c7c-989ebe96468e-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "4f13122f-94d3-47ba-9c7c-989ebe96468e" (UID: "4f13122f-94d3-47ba-9c7c-989ebe96468e"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 14:00:03 crc kubenswrapper[4764]: I0309 14:00:03.361520 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4f13122f-94d3-47ba-9c7c-989ebe96468e-kube-api-access-d5dvz" (OuterVolumeSpecName: "kube-api-access-d5dvz") pod "4f13122f-94d3-47ba-9c7c-989ebe96468e" (UID: "4f13122f-94d3-47ba-9c7c-989ebe96468e"). InnerVolumeSpecName "kube-api-access-d5dvz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 14:00:03 crc kubenswrapper[4764]: I0309 14:00:03.454906 4764 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/4f13122f-94d3-47ba-9c7c-989ebe96468e-secret-volume\") on node \"crc\" DevicePath \"\""
Mar 09 14:00:03 crc kubenswrapper[4764]: I0309 14:00:03.454961 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d5dvz\" (UniqueName: \"kubernetes.io/projected/4f13122f-94d3-47ba-9c7c-989ebe96468e-kube-api-access-d5dvz\") on node \"crc\" DevicePath \"\""
Mar 09 14:00:03 crc kubenswrapper[4764]: I0309 14:00:03.454974 4764 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4f13122f-94d3-47ba-9c7c-989ebe96468e-config-volume\") on node \"crc\" DevicePath \"\""
Mar 09 14:00:03 crc kubenswrapper[4764]: I0309 14:00:03.933836 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29551080-n964g" event={"ID":"4f13122f-94d3-47ba-9c7c-989ebe96468e","Type":"ContainerDied","Data":"243e774d47f8d9a657e2bda283d0e4b662d6968f445495e7b80e3a737580d3e8"}
Mar 09 14:00:03 crc kubenswrapper[4764]: I0309 14:00:03.934295 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="243e774d47f8d9a657e2bda283d0e4b662d6968f445495e7b80e3a737580d3e8"
Mar 09 14:00:03 crc kubenswrapper[4764]: I0309 14:00:03.933893 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29551080-n964g"
Mar 09 14:00:04 crc kubenswrapper[4764]: I0309 14:00:04.360991 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29551035-5fpgn"]
Mar 09 14:00:04 crc kubenswrapper[4764]: I0309 14:00:04.376887 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29551035-5fpgn"]
Mar 09 14:00:04 crc kubenswrapper[4764]: I0309 14:00:04.945484 4764 generic.go:334] "Generic (PLEG): container finished" podID="f6eebc0e-7e89-4489-b808-7eebf0e54dca" containerID="55591ff62c6b7e1f8299cbaa3c157dfe4ead0808f086d1b63edc0dcadd27bf4a" exitCode=0
Mar 09 14:00:04 crc kubenswrapper[4764]: I0309 14:00:04.945554 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551080-svz2w" event={"ID":"f6eebc0e-7e89-4489-b808-7eebf0e54dca","Type":"ContainerDied","Data":"55591ff62c6b7e1f8299cbaa3c157dfe4ead0808f086d1b63edc0dcadd27bf4a"}
Mar 09 14:00:05 crc kubenswrapper[4764]: I0309 14:00:05.574100 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="39da5087-79bc-4154-b340-22183d9e4417" path="/var/lib/kubelet/pods/39da5087-79bc-4154-b340-22183d9e4417/volumes"
Mar 09 14:00:06 crc kubenswrapper[4764]: I0309 14:00:06.299061 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551080-svz2w"
Mar 09 14:00:06 crc kubenswrapper[4764]: I0309 14:00:06.450870 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d8vx5\" (UniqueName: \"kubernetes.io/projected/f6eebc0e-7e89-4489-b808-7eebf0e54dca-kube-api-access-d8vx5\") pod \"f6eebc0e-7e89-4489-b808-7eebf0e54dca\" (UID: \"f6eebc0e-7e89-4489-b808-7eebf0e54dca\") "
Mar 09 14:00:06 crc kubenswrapper[4764]: I0309 14:00:06.458370 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f6eebc0e-7e89-4489-b808-7eebf0e54dca-kube-api-access-d8vx5" (OuterVolumeSpecName: "kube-api-access-d8vx5") pod "f6eebc0e-7e89-4489-b808-7eebf0e54dca" (UID: "f6eebc0e-7e89-4489-b808-7eebf0e54dca"). InnerVolumeSpecName "kube-api-access-d8vx5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 14:00:06 crc kubenswrapper[4764]: I0309 14:00:06.553680 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d8vx5\" (UniqueName: \"kubernetes.io/projected/f6eebc0e-7e89-4489-b808-7eebf0e54dca-kube-api-access-d8vx5\") on node \"crc\" DevicePath \"\""
Mar 09 14:00:06 crc kubenswrapper[4764]: I0309 14:00:06.973707 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551080-svz2w" event={"ID":"f6eebc0e-7e89-4489-b808-7eebf0e54dca","Type":"ContainerDied","Data":"b89a492d7501ec70f38c24543ed56396567de1540f48bf44ac303cff316d4171"}
Mar 09 14:00:06 crc kubenswrapper[4764]: I0309 14:00:06.973765 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b89a492d7501ec70f38c24543ed56396567de1540f48bf44ac303cff316d4171"
Mar 09 14:00:06 crc kubenswrapper[4764]: I0309 14:00:06.973765 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551080-svz2w"
Mar 09 14:00:07 crc kubenswrapper[4764]: I0309 14:00:07.371853 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29551074-ztbcz"]
Mar 09 14:00:07 crc kubenswrapper[4764]: I0309 14:00:07.407628 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29551074-ztbcz"]
Mar 09 14:00:07 crc kubenswrapper[4764]: I0309 14:00:07.571858 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="909c58d5-d4d7-4042-94f0-df77bda9590a" path="/var/lib/kubelet/pods/909c58d5-d4d7-4042-94f0-df77bda9590a/volumes"
Mar 09 14:00:09 crc kubenswrapper[4764]: I0309 14:00:09.051172 4764 scope.go:117] "RemoveContainer" containerID="45383c97959a17ffdeaa9f0ab6e8a1110b113c90d84de0fa490663169c04fa26"
Mar 09 14:00:09 crc kubenswrapper[4764]: I0309 14:00:09.083080 4764 scope.go:117] "RemoveContainer" containerID="5510a27aa618536b31f12ced254c914aa21f71e4d8962e547b119bb29d1548f1"
Mar 09 14:00:10 crc kubenswrapper[4764]: I0309 14:00:10.004852 4764 generic.go:334] "Generic (PLEG): container finished" podID="23319545-4107-4a83-b7e1-955e4648bf7b" containerID="8012376628a2434c25ec25a001db6c303665a134e1d1b05ef3e255939acfe13c" exitCode=0
Mar 09 14:00:10 crc kubenswrapper[4764]: I0309 14:00:10.004923 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-x85q2" event={"ID":"23319545-4107-4a83-b7e1-955e4648bf7b","Type":"ContainerDied","Data":"8012376628a2434c25ec25a001db6c303665a134e1d1b05ef3e255939acfe13c"}
Mar 09 14:00:11 crc kubenswrapper[4764]: I0309 14:00:11.451312 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-x85q2"
Mar 09 14:00:11 crc kubenswrapper[4764]: I0309 14:00:11.560222 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/23319545-4107-4a83-b7e1-955e4648bf7b-ceph\") pod \"23319545-4107-4a83-b7e1-955e4648bf7b\" (UID: \"23319545-4107-4a83-b7e1-955e4648bf7b\") "
Mar 09 14:00:11 crc kubenswrapper[4764]: I0309 14:00:11.560806 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/23319545-4107-4a83-b7e1-955e4648bf7b-inventory-0\") pod \"23319545-4107-4a83-b7e1-955e4648bf7b\" (UID: \"23319545-4107-4a83-b7e1-955e4648bf7b\") "
Mar 09 14:00:11 crc kubenswrapper[4764]: I0309 14:00:11.560882 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q2q9z\" (UniqueName: \"kubernetes.io/projected/23319545-4107-4a83-b7e1-955e4648bf7b-kube-api-access-q2q9z\") pod \"23319545-4107-4a83-b7e1-955e4648bf7b\" (UID: \"23319545-4107-4a83-b7e1-955e4648bf7b\") "
Mar 09 14:00:11 crc kubenswrapper[4764]: I0309 14:00:11.560925 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/23319545-4107-4a83-b7e1-955e4648bf7b-ssh-key-openstack-edpm-ipam\") pod \"23319545-4107-4a83-b7e1-955e4648bf7b\" (UID: \"23319545-4107-4a83-b7e1-955e4648bf7b\") "
Mar 09 14:00:11 crc kubenswrapper[4764]: I0309 14:00:11.566832 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/23319545-4107-4a83-b7e1-955e4648bf7b-ceph" (OuterVolumeSpecName: "ceph") pod "23319545-4107-4a83-b7e1-955e4648bf7b" (UID: "23319545-4107-4a83-b7e1-955e4648bf7b"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 14:00:11 crc kubenswrapper[4764]: I0309 14:00:11.567496 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/23319545-4107-4a83-b7e1-955e4648bf7b-kube-api-access-q2q9z" (OuterVolumeSpecName: "kube-api-access-q2q9z") pod "23319545-4107-4a83-b7e1-955e4648bf7b" (UID: "23319545-4107-4a83-b7e1-955e4648bf7b"). InnerVolumeSpecName "kube-api-access-q2q9z". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 14:00:11 crc kubenswrapper[4764]: I0309 14:00:11.588268 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/23319545-4107-4a83-b7e1-955e4648bf7b-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "23319545-4107-4a83-b7e1-955e4648bf7b" (UID: "23319545-4107-4a83-b7e1-955e4648bf7b"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 14:00:11 crc kubenswrapper[4764]: I0309 14:00:11.593704 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/23319545-4107-4a83-b7e1-955e4648bf7b-inventory-0" (OuterVolumeSpecName: "inventory-0") pod "23319545-4107-4a83-b7e1-955e4648bf7b" (UID: "23319545-4107-4a83-b7e1-955e4648bf7b"). InnerVolumeSpecName "inventory-0". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 14:00:11 crc kubenswrapper[4764]: I0309 14:00:11.665838 4764 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/23319545-4107-4a83-b7e1-955e4648bf7b-ceph\") on node \"crc\" DevicePath \"\""
Mar 09 14:00:11 crc kubenswrapper[4764]: I0309 14:00:11.665879 4764 reconciler_common.go:293] "Volume detached for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/23319545-4107-4a83-b7e1-955e4648bf7b-inventory-0\") on node \"crc\" DevicePath \"\""
Mar 09 14:00:11 crc kubenswrapper[4764]: I0309 14:00:11.665893 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q2q9z\" (UniqueName: \"kubernetes.io/projected/23319545-4107-4a83-b7e1-955e4648bf7b-kube-api-access-q2q9z\") on node \"crc\" DevicePath \"\""
Mar 09 14:00:11 crc kubenswrapper[4764]: I0309 14:00:11.665907 4764 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/23319545-4107-4a83-b7e1-955e4648bf7b-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\""
Mar 09 14:00:12 crc kubenswrapper[4764]: I0309 14:00:12.028321 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-x85q2" event={"ID":"23319545-4107-4a83-b7e1-955e4648bf7b","Type":"ContainerDied","Data":"892d7bf1897aed2e0545cb05dd3f303e9e4c066119aefab5bda9572a1af4261c"}
Mar 09 14:00:12 crc kubenswrapper[4764]: I0309 14:00:12.028380 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="892d7bf1897aed2e0545cb05dd3f303e9e4c066119aefab5bda9572a1af4261c"
Mar 09 14:00:12 crc kubenswrapper[4764]: I0309 14:00:12.028473 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-x85q2" Mar 09 14:00:12 crc kubenswrapper[4764]: I0309 14:00:12.113088 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-c4rw6"] Mar 09 14:00:12 crc kubenswrapper[4764]: E0309 14:00:12.113707 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f6eebc0e-7e89-4489-b808-7eebf0e54dca" containerName="oc" Mar 09 14:00:12 crc kubenswrapper[4764]: I0309 14:00:12.113735 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6eebc0e-7e89-4489-b808-7eebf0e54dca" containerName="oc" Mar 09 14:00:12 crc kubenswrapper[4764]: E0309 14:00:12.113789 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="23319545-4107-4a83-b7e1-955e4648bf7b" containerName="ssh-known-hosts-edpm-deployment" Mar 09 14:00:12 crc kubenswrapper[4764]: I0309 14:00:12.113802 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="23319545-4107-4a83-b7e1-955e4648bf7b" containerName="ssh-known-hosts-edpm-deployment" Mar 09 14:00:12 crc kubenswrapper[4764]: E0309 14:00:12.113818 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4f13122f-94d3-47ba-9c7c-989ebe96468e" containerName="collect-profiles" Mar 09 14:00:12 crc kubenswrapper[4764]: I0309 14:00:12.113828 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f13122f-94d3-47ba-9c7c-989ebe96468e" containerName="collect-profiles" Mar 09 14:00:12 crc kubenswrapper[4764]: I0309 14:00:12.114083 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="4f13122f-94d3-47ba-9c7c-989ebe96468e" containerName="collect-profiles" Mar 09 14:00:12 crc kubenswrapper[4764]: I0309 14:00:12.114119 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="23319545-4107-4a83-b7e1-955e4648bf7b" containerName="ssh-known-hosts-edpm-deployment" Mar 09 14:00:12 crc kubenswrapper[4764]: I0309 14:00:12.114154 4764 memory_manager.go:354] "RemoveStaleState removing 
state" podUID="f6eebc0e-7e89-4489-b808-7eebf0e54dca" containerName="oc" Mar 09 14:00:12 crc kubenswrapper[4764]: I0309 14:00:12.115364 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-c4rw6" Mar 09 14:00:12 crc kubenswrapper[4764]: I0309 14:00:12.117716 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Mar 09 14:00:12 crc kubenswrapper[4764]: I0309 14:00:12.118265 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 09 14:00:12 crc kubenswrapper[4764]: I0309 14:00:12.118436 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 09 14:00:12 crc kubenswrapper[4764]: I0309 14:00:12.119766 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 09 14:00:12 crc kubenswrapper[4764]: I0309 14:00:12.119813 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-ctrj8" Mar 09 14:00:12 crc kubenswrapper[4764]: I0309 14:00:12.126131 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-c4rw6"] Mar 09 14:00:12 crc kubenswrapper[4764]: I0309 14:00:12.282266 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/942b7017-cdda-4d7a-8be8-521111f4fcd1-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-c4rw6\" (UID: \"942b7017-cdda-4d7a-8be8-521111f4fcd1\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-c4rw6" Mar 09 14:00:12 crc kubenswrapper[4764]: I0309 14:00:12.282574 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/942b7017-cdda-4d7a-8be8-521111f4fcd1-ssh-key-openstack-edpm-ipam\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-c4rw6\" (UID: \"942b7017-cdda-4d7a-8be8-521111f4fcd1\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-c4rw6" Mar 09 14:00:12 crc kubenswrapper[4764]: I0309 14:00:12.282812 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5zdw4\" (UniqueName: \"kubernetes.io/projected/942b7017-cdda-4d7a-8be8-521111f4fcd1-kube-api-access-5zdw4\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-c4rw6\" (UID: \"942b7017-cdda-4d7a-8be8-521111f4fcd1\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-c4rw6" Mar 09 14:00:12 crc kubenswrapper[4764]: I0309 14:00:12.283198 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/942b7017-cdda-4d7a-8be8-521111f4fcd1-ceph\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-c4rw6\" (UID: \"942b7017-cdda-4d7a-8be8-521111f4fcd1\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-c4rw6" Mar 09 14:00:12 crc kubenswrapper[4764]: I0309 14:00:12.385668 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/942b7017-cdda-4d7a-8be8-521111f4fcd1-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-c4rw6\" (UID: \"942b7017-cdda-4d7a-8be8-521111f4fcd1\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-c4rw6" Mar 09 14:00:12 crc kubenswrapper[4764]: I0309 14:00:12.385820 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/942b7017-cdda-4d7a-8be8-521111f4fcd1-ssh-key-openstack-edpm-ipam\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-c4rw6\" (UID: \"942b7017-cdda-4d7a-8be8-521111f4fcd1\") " 
pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-c4rw6" Mar 09 14:00:12 crc kubenswrapper[4764]: I0309 14:00:12.385864 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5zdw4\" (UniqueName: \"kubernetes.io/projected/942b7017-cdda-4d7a-8be8-521111f4fcd1-kube-api-access-5zdw4\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-c4rw6\" (UID: \"942b7017-cdda-4d7a-8be8-521111f4fcd1\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-c4rw6" Mar 09 14:00:12 crc kubenswrapper[4764]: I0309 14:00:12.385952 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/942b7017-cdda-4d7a-8be8-521111f4fcd1-ceph\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-c4rw6\" (UID: \"942b7017-cdda-4d7a-8be8-521111f4fcd1\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-c4rw6" Mar 09 14:00:12 crc kubenswrapper[4764]: I0309 14:00:12.391330 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/942b7017-cdda-4d7a-8be8-521111f4fcd1-ssh-key-openstack-edpm-ipam\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-c4rw6\" (UID: \"942b7017-cdda-4d7a-8be8-521111f4fcd1\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-c4rw6" Mar 09 14:00:12 crc kubenswrapper[4764]: I0309 14:00:12.394604 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/942b7017-cdda-4d7a-8be8-521111f4fcd1-ceph\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-c4rw6\" (UID: \"942b7017-cdda-4d7a-8be8-521111f4fcd1\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-c4rw6" Mar 09 14:00:12 crc kubenswrapper[4764]: I0309 14:00:12.403319 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5zdw4\" (UniqueName: 
\"kubernetes.io/projected/942b7017-cdda-4d7a-8be8-521111f4fcd1-kube-api-access-5zdw4\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-c4rw6\" (UID: \"942b7017-cdda-4d7a-8be8-521111f4fcd1\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-c4rw6" Mar 09 14:00:12 crc kubenswrapper[4764]: I0309 14:00:12.406246 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/942b7017-cdda-4d7a-8be8-521111f4fcd1-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-c4rw6\" (UID: \"942b7017-cdda-4d7a-8be8-521111f4fcd1\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-c4rw6" Mar 09 14:00:12 crc kubenswrapper[4764]: I0309 14:00:12.434248 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-c4rw6" Mar 09 14:00:12 crc kubenswrapper[4764]: I0309 14:00:12.966774 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-c4rw6"] Mar 09 14:00:13 crc kubenswrapper[4764]: I0309 14:00:13.040716 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-c4rw6" event={"ID":"942b7017-cdda-4d7a-8be8-521111f4fcd1","Type":"ContainerStarted","Data":"d92809745759714030457fe7c8af618ee88aa24996b77390082f9f223a4cefad"} Mar 09 14:00:14 crc kubenswrapper[4764]: I0309 14:00:14.560188 4764 scope.go:117] "RemoveContainer" containerID="827bb36b06cc9f42c479b857320543a7b207fe3138757e8a034890cd3b0b0211" Mar 09 14:00:14 crc kubenswrapper[4764]: E0309 14:00:14.561159 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xxczl_openshift-machine-config-operator(6bcdd179-43c2-427c-9fac-7155c122e922)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922" Mar 09 14:00:15 crc kubenswrapper[4764]: I0309 14:00:15.057610 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-c4rw6" event={"ID":"942b7017-cdda-4d7a-8be8-521111f4fcd1","Type":"ContainerStarted","Data":"a06d71f1a77dceca0be2bf0b483e0d47e9a68e3152d9acd920b824104f7f40be"} Mar 09 14:00:15 crc kubenswrapper[4764]: I0309 14:00:15.081569 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-c4rw6" podStartSLOduration=2.190152158 podStartE2EDuration="3.08153967s" podCreationTimestamp="2026-03-09 14:00:12 +0000 UTC" firstStartedPulling="2026-03-09 14:00:12.97155852 +0000 UTC m=+2368.221730428" lastFinishedPulling="2026-03-09 14:00:13.862946032 +0000 UTC m=+2369.113117940" observedRunningTime="2026-03-09 14:00:15.07330291 +0000 UTC m=+2370.323474838" watchObservedRunningTime="2026-03-09 14:00:15.08153967 +0000 UTC m=+2370.331711598" Mar 09 14:00:21 crc kubenswrapper[4764]: I0309 14:00:21.118479 4764 generic.go:334] "Generic (PLEG): container finished" podID="942b7017-cdda-4d7a-8be8-521111f4fcd1" containerID="a06d71f1a77dceca0be2bf0b483e0d47e9a68e3152d9acd920b824104f7f40be" exitCode=0 Mar 09 14:00:21 crc kubenswrapper[4764]: I0309 14:00:21.118580 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-c4rw6" event={"ID":"942b7017-cdda-4d7a-8be8-521111f4fcd1","Type":"ContainerDied","Data":"a06d71f1a77dceca0be2bf0b483e0d47e9a68e3152d9acd920b824104f7f40be"} Mar 09 14:00:22 crc kubenswrapper[4764]: I0309 14:00:22.570856 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-c4rw6" Mar 09 14:00:22 crc kubenswrapper[4764]: I0309 14:00:22.708645 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/942b7017-cdda-4d7a-8be8-521111f4fcd1-ssh-key-openstack-edpm-ipam\") pod \"942b7017-cdda-4d7a-8be8-521111f4fcd1\" (UID: \"942b7017-cdda-4d7a-8be8-521111f4fcd1\") " Mar 09 14:00:22 crc kubenswrapper[4764]: I0309 14:00:22.709264 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/942b7017-cdda-4d7a-8be8-521111f4fcd1-ceph\") pod \"942b7017-cdda-4d7a-8be8-521111f4fcd1\" (UID: \"942b7017-cdda-4d7a-8be8-521111f4fcd1\") " Mar 09 14:00:22 crc kubenswrapper[4764]: I0309 14:00:22.709478 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5zdw4\" (UniqueName: \"kubernetes.io/projected/942b7017-cdda-4d7a-8be8-521111f4fcd1-kube-api-access-5zdw4\") pod \"942b7017-cdda-4d7a-8be8-521111f4fcd1\" (UID: \"942b7017-cdda-4d7a-8be8-521111f4fcd1\") " Mar 09 14:00:22 crc kubenswrapper[4764]: I0309 14:00:22.709610 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/942b7017-cdda-4d7a-8be8-521111f4fcd1-inventory\") pod \"942b7017-cdda-4d7a-8be8-521111f4fcd1\" (UID: \"942b7017-cdda-4d7a-8be8-521111f4fcd1\") " Mar 09 14:00:22 crc kubenswrapper[4764]: I0309 14:00:22.716314 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/942b7017-cdda-4d7a-8be8-521111f4fcd1-ceph" (OuterVolumeSpecName: "ceph") pod "942b7017-cdda-4d7a-8be8-521111f4fcd1" (UID: "942b7017-cdda-4d7a-8be8-521111f4fcd1"). InnerVolumeSpecName "ceph". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:00:22 crc kubenswrapper[4764]: I0309 14:00:22.716850 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/942b7017-cdda-4d7a-8be8-521111f4fcd1-kube-api-access-5zdw4" (OuterVolumeSpecName: "kube-api-access-5zdw4") pod "942b7017-cdda-4d7a-8be8-521111f4fcd1" (UID: "942b7017-cdda-4d7a-8be8-521111f4fcd1"). InnerVolumeSpecName "kube-api-access-5zdw4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:00:22 crc kubenswrapper[4764]: I0309 14:00:22.741153 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/942b7017-cdda-4d7a-8be8-521111f4fcd1-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "942b7017-cdda-4d7a-8be8-521111f4fcd1" (UID: "942b7017-cdda-4d7a-8be8-521111f4fcd1"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:00:22 crc kubenswrapper[4764]: I0309 14:00:22.749194 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/942b7017-cdda-4d7a-8be8-521111f4fcd1-inventory" (OuterVolumeSpecName: "inventory") pod "942b7017-cdda-4d7a-8be8-521111f4fcd1" (UID: "942b7017-cdda-4d7a-8be8-521111f4fcd1"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:00:22 crc kubenswrapper[4764]: I0309 14:00:22.813039 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5zdw4\" (UniqueName: \"kubernetes.io/projected/942b7017-cdda-4d7a-8be8-521111f4fcd1-kube-api-access-5zdw4\") on node \"crc\" DevicePath \"\"" Mar 09 14:00:22 crc kubenswrapper[4764]: I0309 14:00:22.813111 4764 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/942b7017-cdda-4d7a-8be8-521111f4fcd1-inventory\") on node \"crc\" DevicePath \"\"" Mar 09 14:00:22 crc kubenswrapper[4764]: I0309 14:00:22.813124 4764 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/942b7017-cdda-4d7a-8be8-521111f4fcd1-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 09 14:00:22 crc kubenswrapper[4764]: I0309 14:00:22.813139 4764 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/942b7017-cdda-4d7a-8be8-521111f4fcd1-ceph\") on node \"crc\" DevicePath \"\"" Mar 09 14:00:23 crc kubenswrapper[4764]: I0309 14:00:23.141254 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-c4rw6" event={"ID":"942b7017-cdda-4d7a-8be8-521111f4fcd1","Type":"ContainerDied","Data":"d92809745759714030457fe7c8af618ee88aa24996b77390082f9f223a4cefad"} Mar 09 14:00:23 crc kubenswrapper[4764]: I0309 14:00:23.141311 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d92809745759714030457fe7c8af618ee88aa24996b77390082f9f223a4cefad" Mar 09 14:00:23 crc kubenswrapper[4764]: I0309 14:00:23.141350 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-c4rw6" Mar 09 14:00:23 crc kubenswrapper[4764]: I0309 14:00:23.225446 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-6hxsg"] Mar 09 14:00:23 crc kubenswrapper[4764]: E0309 14:00:23.226177 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="942b7017-cdda-4d7a-8be8-521111f4fcd1" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Mar 09 14:00:23 crc kubenswrapper[4764]: I0309 14:00:23.226203 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="942b7017-cdda-4d7a-8be8-521111f4fcd1" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Mar 09 14:00:23 crc kubenswrapper[4764]: I0309 14:00:23.226451 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="942b7017-cdda-4d7a-8be8-521111f4fcd1" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Mar 09 14:00:23 crc kubenswrapper[4764]: I0309 14:00:23.227390 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-6hxsg" Mar 09 14:00:23 crc kubenswrapper[4764]: I0309 14:00:23.234351 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 09 14:00:23 crc kubenswrapper[4764]: I0309 14:00:23.234681 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 09 14:00:23 crc kubenswrapper[4764]: I0309 14:00:23.234757 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 09 14:00:23 crc kubenswrapper[4764]: I0309 14:00:23.234863 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Mar 09 14:00:23 crc kubenswrapper[4764]: I0309 14:00:23.234641 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-ctrj8" Mar 09 14:00:23 crc kubenswrapper[4764]: I0309 14:00:23.237809 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-6hxsg"] Mar 09 14:00:23 crc kubenswrapper[4764]: I0309 14:00:23.324759 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-22ckp\" (UniqueName: \"kubernetes.io/projected/6ee5c8cc-9f2b-42f8-aed5-37c3540bd300-kube-api-access-22ckp\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-6hxsg\" (UID: \"6ee5c8cc-9f2b-42f8-aed5-37c3540bd300\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-6hxsg" Mar 09 14:00:23 crc kubenswrapper[4764]: I0309 14:00:23.324842 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6ee5c8cc-9f2b-42f8-aed5-37c3540bd300-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-6hxsg\" (UID: \"6ee5c8cc-9f2b-42f8-aed5-37c3540bd300\") " 
pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-6hxsg" Mar 09 14:00:23 crc kubenswrapper[4764]: I0309 14:00:23.325156 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/6ee5c8cc-9f2b-42f8-aed5-37c3540bd300-ceph\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-6hxsg\" (UID: \"6ee5c8cc-9f2b-42f8-aed5-37c3540bd300\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-6hxsg" Mar 09 14:00:23 crc kubenswrapper[4764]: I0309 14:00:23.325287 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/6ee5c8cc-9f2b-42f8-aed5-37c3540bd300-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-6hxsg\" (UID: \"6ee5c8cc-9f2b-42f8-aed5-37c3540bd300\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-6hxsg" Mar 09 14:00:23 crc kubenswrapper[4764]: I0309 14:00:23.428070 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/6ee5c8cc-9f2b-42f8-aed5-37c3540bd300-ceph\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-6hxsg\" (UID: \"6ee5c8cc-9f2b-42f8-aed5-37c3540bd300\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-6hxsg" Mar 09 14:00:23 crc kubenswrapper[4764]: I0309 14:00:23.428150 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/6ee5c8cc-9f2b-42f8-aed5-37c3540bd300-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-6hxsg\" (UID: \"6ee5c8cc-9f2b-42f8-aed5-37c3540bd300\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-6hxsg" Mar 09 14:00:23 crc kubenswrapper[4764]: I0309 14:00:23.428213 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-22ckp\" (UniqueName: \"kubernetes.io/projected/6ee5c8cc-9f2b-42f8-aed5-37c3540bd300-kube-api-access-22ckp\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-6hxsg\" (UID: \"6ee5c8cc-9f2b-42f8-aed5-37c3540bd300\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-6hxsg" Mar 09 14:00:23 crc kubenswrapper[4764]: I0309 14:00:23.428256 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6ee5c8cc-9f2b-42f8-aed5-37c3540bd300-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-6hxsg\" (UID: \"6ee5c8cc-9f2b-42f8-aed5-37c3540bd300\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-6hxsg" Mar 09 14:00:23 crc kubenswrapper[4764]: I0309 14:00:23.434603 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/6ee5c8cc-9f2b-42f8-aed5-37c3540bd300-ceph\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-6hxsg\" (UID: \"6ee5c8cc-9f2b-42f8-aed5-37c3540bd300\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-6hxsg" Mar 09 14:00:23 crc kubenswrapper[4764]: I0309 14:00:23.434603 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6ee5c8cc-9f2b-42f8-aed5-37c3540bd300-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-6hxsg\" (UID: \"6ee5c8cc-9f2b-42f8-aed5-37c3540bd300\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-6hxsg" Mar 09 14:00:23 crc kubenswrapper[4764]: I0309 14:00:23.434741 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/6ee5c8cc-9f2b-42f8-aed5-37c3540bd300-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-6hxsg\" (UID: \"6ee5c8cc-9f2b-42f8-aed5-37c3540bd300\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-6hxsg" Mar 
09 14:00:23 crc kubenswrapper[4764]: I0309 14:00:23.450607 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-22ckp\" (UniqueName: \"kubernetes.io/projected/6ee5c8cc-9f2b-42f8-aed5-37c3540bd300-kube-api-access-22ckp\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-6hxsg\" (UID: \"6ee5c8cc-9f2b-42f8-aed5-37c3540bd300\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-6hxsg" Mar 09 14:00:23 crc kubenswrapper[4764]: I0309 14:00:23.556487 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-6hxsg" Mar 09 14:00:24 crc kubenswrapper[4764]: I0309 14:00:24.325347 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-6hxsg"] Mar 09 14:00:25 crc kubenswrapper[4764]: I0309 14:00:25.160761 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-6hxsg" event={"ID":"6ee5c8cc-9f2b-42f8-aed5-37c3540bd300","Type":"ContainerStarted","Data":"4dd4c308fa6de7cca4815781796356da95e83110403baa4426c3bd13904126bc"} Mar 09 14:00:25 crc kubenswrapper[4764]: I0309 14:00:25.161172 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-6hxsg" event={"ID":"6ee5c8cc-9f2b-42f8-aed5-37c3540bd300","Type":"ContainerStarted","Data":"ca149789a5aed112c6c479c7602abb7c51c2df6acad7ebd1db01364782405ce0"} Mar 09 14:00:25 crc kubenswrapper[4764]: I0309 14:00:25.194954 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-6hxsg" podStartSLOduration=1.75363846 podStartE2EDuration="2.19481523s" podCreationTimestamp="2026-03-09 14:00:23 +0000 UTC" firstStartedPulling="2026-03-09 14:00:24.337298593 +0000 UTC m=+2379.587470501" lastFinishedPulling="2026-03-09 14:00:24.778475363 +0000 UTC m=+2380.028647271" 
observedRunningTime="2026-03-09 14:00:25.178870824 +0000 UTC m=+2380.429042752" watchObservedRunningTime="2026-03-09 14:00:25.19481523 +0000 UTC m=+2380.444987138" Mar 09 14:00:27 crc kubenswrapper[4764]: I0309 14:00:27.560752 4764 scope.go:117] "RemoveContainer" containerID="827bb36b06cc9f42c479b857320543a7b207fe3138757e8a034890cd3b0b0211" Mar 09 14:00:27 crc kubenswrapper[4764]: E0309 14:00:27.561695 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xxczl_openshift-machine-config-operator(6bcdd179-43c2-427c-9fac-7155c122e922)\"" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922" Mar 09 14:00:34 crc kubenswrapper[4764]: I0309 14:00:34.247830 4764 generic.go:334] "Generic (PLEG): container finished" podID="6ee5c8cc-9f2b-42f8-aed5-37c3540bd300" containerID="4dd4c308fa6de7cca4815781796356da95e83110403baa4426c3bd13904126bc" exitCode=0 Mar 09 14:00:34 crc kubenswrapper[4764]: I0309 14:00:34.247957 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-6hxsg" event={"ID":"6ee5c8cc-9f2b-42f8-aed5-37c3540bd300","Type":"ContainerDied","Data":"4dd4c308fa6de7cca4815781796356da95e83110403baa4426c3bd13904126bc"} Mar 09 14:00:35 crc kubenswrapper[4764]: I0309 14:00:35.677275 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-6hxsg" Mar 09 14:00:35 crc kubenswrapper[4764]: I0309 14:00:35.850187 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-22ckp\" (UniqueName: \"kubernetes.io/projected/6ee5c8cc-9f2b-42f8-aed5-37c3540bd300-kube-api-access-22ckp\") pod \"6ee5c8cc-9f2b-42f8-aed5-37c3540bd300\" (UID: \"6ee5c8cc-9f2b-42f8-aed5-37c3540bd300\") " Mar 09 14:00:35 crc kubenswrapper[4764]: I0309 14:00:35.850424 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/6ee5c8cc-9f2b-42f8-aed5-37c3540bd300-ssh-key-openstack-edpm-ipam\") pod \"6ee5c8cc-9f2b-42f8-aed5-37c3540bd300\" (UID: \"6ee5c8cc-9f2b-42f8-aed5-37c3540bd300\") " Mar 09 14:00:35 crc kubenswrapper[4764]: I0309 14:00:35.850472 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/6ee5c8cc-9f2b-42f8-aed5-37c3540bd300-ceph\") pod \"6ee5c8cc-9f2b-42f8-aed5-37c3540bd300\" (UID: \"6ee5c8cc-9f2b-42f8-aed5-37c3540bd300\") " Mar 09 14:00:35 crc kubenswrapper[4764]: I0309 14:00:35.850519 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6ee5c8cc-9f2b-42f8-aed5-37c3540bd300-inventory\") pod \"6ee5c8cc-9f2b-42f8-aed5-37c3540bd300\" (UID: \"6ee5c8cc-9f2b-42f8-aed5-37c3540bd300\") " Mar 09 14:00:35 crc kubenswrapper[4764]: I0309 14:00:35.857167 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ee5c8cc-9f2b-42f8-aed5-37c3540bd300-kube-api-access-22ckp" (OuterVolumeSpecName: "kube-api-access-22ckp") pod "6ee5c8cc-9f2b-42f8-aed5-37c3540bd300" (UID: "6ee5c8cc-9f2b-42f8-aed5-37c3540bd300"). InnerVolumeSpecName "kube-api-access-22ckp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:00:35 crc kubenswrapper[4764]: I0309 14:00:35.859329 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ee5c8cc-9f2b-42f8-aed5-37c3540bd300-ceph" (OuterVolumeSpecName: "ceph") pod "6ee5c8cc-9f2b-42f8-aed5-37c3540bd300" (UID: "6ee5c8cc-9f2b-42f8-aed5-37c3540bd300"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:00:35 crc kubenswrapper[4764]: I0309 14:00:35.882585 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ee5c8cc-9f2b-42f8-aed5-37c3540bd300-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "6ee5c8cc-9f2b-42f8-aed5-37c3540bd300" (UID: "6ee5c8cc-9f2b-42f8-aed5-37c3540bd300"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:00:35 crc kubenswrapper[4764]: I0309 14:00:35.882923 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ee5c8cc-9f2b-42f8-aed5-37c3540bd300-inventory" (OuterVolumeSpecName: "inventory") pod "6ee5c8cc-9f2b-42f8-aed5-37c3540bd300" (UID: "6ee5c8cc-9f2b-42f8-aed5-37c3540bd300"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:00:35 crc kubenswrapper[4764]: I0309 14:00:35.953732 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-22ckp\" (UniqueName: \"kubernetes.io/projected/6ee5c8cc-9f2b-42f8-aed5-37c3540bd300-kube-api-access-22ckp\") on node \"crc\" DevicePath \"\"" Mar 09 14:00:35 crc kubenswrapper[4764]: I0309 14:00:35.953796 4764 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/6ee5c8cc-9f2b-42f8-aed5-37c3540bd300-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 09 14:00:35 crc kubenswrapper[4764]: I0309 14:00:35.953811 4764 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/6ee5c8cc-9f2b-42f8-aed5-37c3540bd300-ceph\") on node \"crc\" DevicePath \"\"" Mar 09 14:00:35 crc kubenswrapper[4764]: I0309 14:00:35.953831 4764 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6ee5c8cc-9f2b-42f8-aed5-37c3540bd300-inventory\") on node \"crc\" DevicePath \"\"" Mar 09 14:00:36 crc kubenswrapper[4764]: I0309 14:00:36.269967 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-6hxsg" event={"ID":"6ee5c8cc-9f2b-42f8-aed5-37c3540bd300","Type":"ContainerDied","Data":"ca149789a5aed112c6c479c7602abb7c51c2df6acad7ebd1db01364782405ce0"} Mar 09 14:00:36 crc kubenswrapper[4764]: I0309 14:00:36.270017 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ca149789a5aed112c6c479c7602abb7c51c2df6acad7ebd1db01364782405ce0" Mar 09 14:00:36 crc kubenswrapper[4764]: I0309 14:00:36.270036 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-6hxsg" Mar 09 14:00:36 crc kubenswrapper[4764]: I0309 14:00:36.369956 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-5qt8s"] Mar 09 14:00:36 crc kubenswrapper[4764]: E0309 14:00:36.370423 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ee5c8cc-9f2b-42f8-aed5-37c3540bd300" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Mar 09 14:00:36 crc kubenswrapper[4764]: I0309 14:00:36.370444 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ee5c8cc-9f2b-42f8-aed5-37c3540bd300" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Mar 09 14:00:36 crc kubenswrapper[4764]: I0309 14:00:36.370638 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="6ee5c8cc-9f2b-42f8-aed5-37c3540bd300" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Mar 09 14:00:36 crc kubenswrapper[4764]: I0309 14:00:36.371474 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-5qt8s" Mar 09 14:00:36 crc kubenswrapper[4764]: I0309 14:00:36.374734 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-libvirt-default-certs-0" Mar 09 14:00:36 crc kubenswrapper[4764]: I0309 14:00:36.374734 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 09 14:00:36 crc kubenswrapper[4764]: I0309 14:00:36.375230 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-ctrj8" Mar 09 14:00:36 crc kubenswrapper[4764]: I0309 14:00:36.375896 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 09 14:00:36 crc kubenswrapper[4764]: I0309 14:00:36.375943 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 09 14:00:36 crc kubenswrapper[4764]: I0309 14:00:36.376268 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-ovn-default-certs-0" Mar 09 14:00:36 crc kubenswrapper[4764]: I0309 14:00:36.377795 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Mar 09 14:00:36 crc kubenswrapper[4764]: I0309 14:00:36.382377 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-neutron-metadata-default-certs-0" Mar 09 14:00:36 crc kubenswrapper[4764]: I0309 14:00:36.386929 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-5qt8s"] Mar 09 14:00:36 crc kubenswrapper[4764]: I0309 14:00:36.565976 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6zx9k\" (UniqueName: 
\"kubernetes.io/projected/949d7512-b3be-4068-b05a-20589fbc2b52-kube-api-access-6zx9k\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-5qt8s\" (UID: \"949d7512-b3be-4068-b05a-20589fbc2b52\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-5qt8s" Mar 09 14:00:36 crc kubenswrapper[4764]: I0309 14:00:36.566036 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/949d7512-b3be-4068-b05a-20589fbc2b52-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-5qt8s\" (UID: \"949d7512-b3be-4068-b05a-20589fbc2b52\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-5qt8s" Mar 09 14:00:36 crc kubenswrapper[4764]: I0309 14:00:36.566418 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/949d7512-b3be-4068-b05a-20589fbc2b52-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-5qt8s\" (UID: \"949d7512-b3be-4068-b05a-20589fbc2b52\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-5qt8s" Mar 09 14:00:36 crc kubenswrapper[4764]: I0309 14:00:36.566440 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/949d7512-b3be-4068-b05a-20589fbc2b52-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-5qt8s\" (UID: \"949d7512-b3be-4068-b05a-20589fbc2b52\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-5qt8s" Mar 09 14:00:36 crc kubenswrapper[4764]: I0309 14:00:36.566478 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: 
\"kubernetes.io/projected/949d7512-b3be-4068-b05a-20589fbc2b52-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-5qt8s\" (UID: \"949d7512-b3be-4068-b05a-20589fbc2b52\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-5qt8s" Mar 09 14:00:36 crc kubenswrapper[4764]: I0309 14:00:36.566719 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/949d7512-b3be-4068-b05a-20589fbc2b52-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-5qt8s\" (UID: \"949d7512-b3be-4068-b05a-20589fbc2b52\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-5qt8s" Mar 09 14:00:36 crc kubenswrapper[4764]: I0309 14:00:36.566764 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/949d7512-b3be-4068-b05a-20589fbc2b52-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-5qt8s\" (UID: \"949d7512-b3be-4068-b05a-20589fbc2b52\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-5qt8s" Mar 09 14:00:36 crc kubenswrapper[4764]: I0309 14:00:36.566871 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/949d7512-b3be-4068-b05a-20589fbc2b52-ssh-key-openstack-edpm-ipam\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-5qt8s\" (UID: \"949d7512-b3be-4068-b05a-20589fbc2b52\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-5qt8s" Mar 09 14:00:36 crc kubenswrapper[4764]: I0309 14:00:36.566918 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/949d7512-b3be-4068-b05a-20589fbc2b52-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-5qt8s\" (UID: \"949d7512-b3be-4068-b05a-20589fbc2b52\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-5qt8s" Mar 09 14:00:36 crc kubenswrapper[4764]: I0309 14:00:36.566954 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/949d7512-b3be-4068-b05a-20589fbc2b52-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-5qt8s\" (UID: \"949d7512-b3be-4068-b05a-20589fbc2b52\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-5qt8s" Mar 09 14:00:36 crc kubenswrapper[4764]: I0309 14:00:36.566991 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/949d7512-b3be-4068-b05a-20589fbc2b52-ceph\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-5qt8s\" (UID: \"949d7512-b3be-4068-b05a-20589fbc2b52\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-5qt8s" Mar 09 14:00:36 crc kubenswrapper[4764]: I0309 14:00:36.567023 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/949d7512-b3be-4068-b05a-20589fbc2b52-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-5qt8s\" (UID: \"949d7512-b3be-4068-b05a-20589fbc2b52\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-5qt8s" Mar 09 14:00:36 crc kubenswrapper[4764]: I0309 14:00:36.567104 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: 
\"kubernetes.io/projected/949d7512-b3be-4068-b05a-20589fbc2b52-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-5qt8s\" (UID: \"949d7512-b3be-4068-b05a-20589fbc2b52\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-5qt8s" Mar 09 14:00:36 crc kubenswrapper[4764]: I0309 14:00:36.668863 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/949d7512-b3be-4068-b05a-20589fbc2b52-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-5qt8s\" (UID: \"949d7512-b3be-4068-b05a-20589fbc2b52\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-5qt8s" Mar 09 14:00:36 crc kubenswrapper[4764]: I0309 14:00:36.668929 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6zx9k\" (UniqueName: \"kubernetes.io/projected/949d7512-b3be-4068-b05a-20589fbc2b52-kube-api-access-6zx9k\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-5qt8s\" (UID: \"949d7512-b3be-4068-b05a-20589fbc2b52\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-5qt8s" Mar 09 14:00:36 crc kubenswrapper[4764]: I0309 14:00:36.668963 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/949d7512-b3be-4068-b05a-20589fbc2b52-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-5qt8s\" (UID: \"949d7512-b3be-4068-b05a-20589fbc2b52\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-5qt8s" Mar 09 14:00:36 crc kubenswrapper[4764]: I0309 14:00:36.668989 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/949d7512-b3be-4068-b05a-20589fbc2b52-nova-combined-ca-bundle\") pod 
\"install-certs-edpm-deployment-openstack-edpm-ipam-5qt8s\" (UID: \"949d7512-b3be-4068-b05a-20589fbc2b52\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-5qt8s" Mar 09 14:00:36 crc kubenswrapper[4764]: I0309 14:00:36.669019 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/949d7512-b3be-4068-b05a-20589fbc2b52-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-5qt8s\" (UID: \"949d7512-b3be-4068-b05a-20589fbc2b52\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-5qt8s" Mar 09 14:00:36 crc kubenswrapper[4764]: I0309 14:00:36.669069 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/949d7512-b3be-4068-b05a-20589fbc2b52-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-5qt8s\" (UID: \"949d7512-b3be-4068-b05a-20589fbc2b52\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-5qt8s" Mar 09 14:00:36 crc kubenswrapper[4764]: I0309 14:00:36.669139 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/949d7512-b3be-4068-b05a-20589fbc2b52-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-5qt8s\" (UID: \"949d7512-b3be-4068-b05a-20589fbc2b52\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-5qt8s" Mar 09 14:00:36 crc kubenswrapper[4764]: I0309 14:00:36.669193 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/949d7512-b3be-4068-b05a-20589fbc2b52-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-5qt8s\" (UID: 
\"949d7512-b3be-4068-b05a-20589fbc2b52\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-5qt8s" Mar 09 14:00:36 crc kubenswrapper[4764]: I0309 14:00:36.669249 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/949d7512-b3be-4068-b05a-20589fbc2b52-ssh-key-openstack-edpm-ipam\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-5qt8s\" (UID: \"949d7512-b3be-4068-b05a-20589fbc2b52\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-5qt8s" Mar 09 14:00:36 crc kubenswrapper[4764]: I0309 14:00:36.669280 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/949d7512-b3be-4068-b05a-20589fbc2b52-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-5qt8s\" (UID: \"949d7512-b3be-4068-b05a-20589fbc2b52\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-5qt8s" Mar 09 14:00:36 crc kubenswrapper[4764]: I0309 14:00:36.669310 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/949d7512-b3be-4068-b05a-20589fbc2b52-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-5qt8s\" (UID: \"949d7512-b3be-4068-b05a-20589fbc2b52\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-5qt8s" Mar 09 14:00:36 crc kubenswrapper[4764]: I0309 14:00:36.669343 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/949d7512-b3be-4068-b05a-20589fbc2b52-ceph\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-5qt8s\" (UID: \"949d7512-b3be-4068-b05a-20589fbc2b52\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-5qt8s" Mar 09 14:00:36 crc kubenswrapper[4764]: I0309 
14:00:36.669371 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/949d7512-b3be-4068-b05a-20589fbc2b52-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-5qt8s\" (UID: \"949d7512-b3be-4068-b05a-20589fbc2b52\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-5qt8s" Mar 09 14:00:36 crc kubenswrapper[4764]: I0309 14:00:36.673731 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/949d7512-b3be-4068-b05a-20589fbc2b52-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-5qt8s\" (UID: \"949d7512-b3be-4068-b05a-20589fbc2b52\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-5qt8s" Mar 09 14:00:36 crc kubenswrapper[4764]: I0309 14:00:36.674465 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/949d7512-b3be-4068-b05a-20589fbc2b52-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-5qt8s\" (UID: \"949d7512-b3be-4068-b05a-20589fbc2b52\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-5qt8s" Mar 09 14:00:36 crc kubenswrapper[4764]: I0309 14:00:36.675451 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/949d7512-b3be-4068-b05a-20589fbc2b52-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-5qt8s\" (UID: \"949d7512-b3be-4068-b05a-20589fbc2b52\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-5qt8s" Mar 09 14:00:36 crc kubenswrapper[4764]: I0309 14:00:36.678135 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/949d7512-b3be-4068-b05a-20589fbc2b52-ssh-key-openstack-edpm-ipam\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-5qt8s\" (UID: \"949d7512-b3be-4068-b05a-20589fbc2b52\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-5qt8s" Mar 09 14:00:36 crc kubenswrapper[4764]: I0309 14:00:36.678798 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/949d7512-b3be-4068-b05a-20589fbc2b52-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-5qt8s\" (UID: \"949d7512-b3be-4068-b05a-20589fbc2b52\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-5qt8s" Mar 09 14:00:36 crc kubenswrapper[4764]: I0309 14:00:36.678881 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/949d7512-b3be-4068-b05a-20589fbc2b52-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-5qt8s\" (UID: \"949d7512-b3be-4068-b05a-20589fbc2b52\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-5qt8s" Mar 09 14:00:36 crc kubenswrapper[4764]: I0309 14:00:36.679890 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/949d7512-b3be-4068-b05a-20589fbc2b52-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-5qt8s\" (UID: \"949d7512-b3be-4068-b05a-20589fbc2b52\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-5qt8s" Mar 09 14:00:36 crc kubenswrapper[4764]: I0309 14:00:36.680516 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/949d7512-b3be-4068-b05a-20589fbc2b52-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-5qt8s\" (UID: 
\"949d7512-b3be-4068-b05a-20589fbc2b52\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-5qt8s" Mar 09 14:00:36 crc kubenswrapper[4764]: I0309 14:00:36.681553 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/949d7512-b3be-4068-b05a-20589fbc2b52-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-5qt8s\" (UID: \"949d7512-b3be-4068-b05a-20589fbc2b52\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-5qt8s" Mar 09 14:00:36 crc kubenswrapper[4764]: I0309 14:00:36.681980 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/949d7512-b3be-4068-b05a-20589fbc2b52-ceph\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-5qt8s\" (UID: \"949d7512-b3be-4068-b05a-20589fbc2b52\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-5qt8s" Mar 09 14:00:36 crc kubenswrapper[4764]: I0309 14:00:36.686476 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/949d7512-b3be-4068-b05a-20589fbc2b52-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-5qt8s\" (UID: \"949d7512-b3be-4068-b05a-20589fbc2b52\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-5qt8s" Mar 09 14:00:36 crc kubenswrapper[4764]: I0309 14:00:36.687108 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/949d7512-b3be-4068-b05a-20589fbc2b52-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-5qt8s\" (UID: \"949d7512-b3be-4068-b05a-20589fbc2b52\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-5qt8s" Mar 09 14:00:36 crc 
kubenswrapper[4764]: I0309 14:00:36.694263 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6zx9k\" (UniqueName: \"kubernetes.io/projected/949d7512-b3be-4068-b05a-20589fbc2b52-kube-api-access-6zx9k\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-5qt8s\" (UID: \"949d7512-b3be-4068-b05a-20589fbc2b52\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-5qt8s" Mar 09 14:00:36 crc kubenswrapper[4764]: I0309 14:00:36.992260 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-5qt8s" Mar 09 14:00:37 crc kubenswrapper[4764]: I0309 14:00:37.550895 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-5qt8s"] Mar 09 14:00:38 crc kubenswrapper[4764]: I0309 14:00:38.292258 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-5qt8s" event={"ID":"949d7512-b3be-4068-b05a-20589fbc2b52","Type":"ContainerStarted","Data":"fd45c11e7b2b7a0ad5cb655148b378fb1667ce6154e5c572398f320b6660f275"} Mar 09 14:00:38 crc kubenswrapper[4764]: I0309 14:00:38.559779 4764 scope.go:117] "RemoveContainer" containerID="827bb36b06cc9f42c479b857320543a7b207fe3138757e8a034890cd3b0b0211" Mar 09 14:00:38 crc kubenswrapper[4764]: E0309 14:00:38.560713 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xxczl_openshift-machine-config-operator(6bcdd179-43c2-427c-9fac-7155c122e922)\"" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922" Mar 09 14:00:39 crc kubenswrapper[4764]: I0309 14:00:39.304269 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-5qt8s" event={"ID":"949d7512-b3be-4068-b05a-20589fbc2b52","Type":"ContainerStarted","Data":"39847ae98cf6e663c2d6e6155c011decc4cdfda221bf659392fa3d752f138247"} Mar 09 14:00:39 crc kubenswrapper[4764]: I0309 14:00:39.336064 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-5qt8s" podStartSLOduration=2.833031782 podStartE2EDuration="3.336036723s" podCreationTimestamp="2026-03-09 14:00:36 +0000 UTC" firstStartedPulling="2026-03-09 14:00:37.565861226 +0000 UTC m=+2392.816033134" lastFinishedPulling="2026-03-09 14:00:38.068866177 +0000 UTC m=+2393.319038075" observedRunningTime="2026-03-09 14:00:39.329438376 +0000 UTC m=+2394.579610314" watchObservedRunningTime="2026-03-09 14:00:39.336036723 +0000 UTC m=+2394.586208641" Mar 09 14:00:51 crc kubenswrapper[4764]: I0309 14:00:51.560244 4764 scope.go:117] "RemoveContainer" containerID="827bb36b06cc9f42c479b857320543a7b207fe3138757e8a034890cd3b0b0211" Mar 09 14:00:51 crc kubenswrapper[4764]: E0309 14:00:51.561392 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xxczl_openshift-machine-config-operator(6bcdd179-43c2-427c-9fac-7155c122e922)\"" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922" Mar 09 14:01:00 crc kubenswrapper[4764]: I0309 14:01:00.156148 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-cron-29551081-wz9hv"] Mar 09 14:01:00 crc kubenswrapper[4764]: I0309 14:01:00.160588 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29551081-wz9hv" Mar 09 14:01:00 crc kubenswrapper[4764]: I0309 14:01:00.176337 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29551081-wz9hv"] Mar 09 14:01:00 crc kubenswrapper[4764]: I0309 14:01:00.234783 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6ec256c5-cf20-4b12-bb84-0f5d3e02460a-config-data\") pod \"keystone-cron-29551081-wz9hv\" (UID: \"6ec256c5-cf20-4b12-bb84-0f5d3e02460a\") " pod="openstack/keystone-cron-29551081-wz9hv" Mar 09 14:01:00 crc kubenswrapper[4764]: I0309 14:01:00.234894 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/6ec256c5-cf20-4b12-bb84-0f5d3e02460a-fernet-keys\") pod \"keystone-cron-29551081-wz9hv\" (UID: \"6ec256c5-cf20-4b12-bb84-0f5d3e02460a\") " pod="openstack/keystone-cron-29551081-wz9hv" Mar 09 14:01:00 crc kubenswrapper[4764]: I0309 14:01:00.234924 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ec256c5-cf20-4b12-bb84-0f5d3e02460a-combined-ca-bundle\") pod \"keystone-cron-29551081-wz9hv\" (UID: \"6ec256c5-cf20-4b12-bb84-0f5d3e02460a\") " pod="openstack/keystone-cron-29551081-wz9hv" Mar 09 14:01:00 crc kubenswrapper[4764]: I0309 14:01:00.234962 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jprwh\" (UniqueName: \"kubernetes.io/projected/6ec256c5-cf20-4b12-bb84-0f5d3e02460a-kube-api-access-jprwh\") pod \"keystone-cron-29551081-wz9hv\" (UID: \"6ec256c5-cf20-4b12-bb84-0f5d3e02460a\") " pod="openstack/keystone-cron-29551081-wz9hv" Mar 09 14:01:00 crc kubenswrapper[4764]: I0309 14:01:00.337833 4764 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6ec256c5-cf20-4b12-bb84-0f5d3e02460a-config-data\") pod \"keystone-cron-29551081-wz9hv\" (UID: \"6ec256c5-cf20-4b12-bb84-0f5d3e02460a\") " pod="openstack/keystone-cron-29551081-wz9hv" Mar 09 14:01:00 crc kubenswrapper[4764]: I0309 14:01:00.337934 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/6ec256c5-cf20-4b12-bb84-0f5d3e02460a-fernet-keys\") pod \"keystone-cron-29551081-wz9hv\" (UID: \"6ec256c5-cf20-4b12-bb84-0f5d3e02460a\") " pod="openstack/keystone-cron-29551081-wz9hv" Mar 09 14:01:00 crc kubenswrapper[4764]: I0309 14:01:00.337982 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ec256c5-cf20-4b12-bb84-0f5d3e02460a-combined-ca-bundle\") pod \"keystone-cron-29551081-wz9hv\" (UID: \"6ec256c5-cf20-4b12-bb84-0f5d3e02460a\") " pod="openstack/keystone-cron-29551081-wz9hv" Mar 09 14:01:00 crc kubenswrapper[4764]: I0309 14:01:00.338024 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jprwh\" (UniqueName: \"kubernetes.io/projected/6ec256c5-cf20-4b12-bb84-0f5d3e02460a-kube-api-access-jprwh\") pod \"keystone-cron-29551081-wz9hv\" (UID: \"6ec256c5-cf20-4b12-bb84-0f5d3e02460a\") " pod="openstack/keystone-cron-29551081-wz9hv" Mar 09 14:01:00 crc kubenswrapper[4764]: I0309 14:01:00.346578 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6ec256c5-cf20-4b12-bb84-0f5d3e02460a-config-data\") pod \"keystone-cron-29551081-wz9hv\" (UID: \"6ec256c5-cf20-4b12-bb84-0f5d3e02460a\") " pod="openstack/keystone-cron-29551081-wz9hv" Mar 09 14:01:00 crc kubenswrapper[4764]: I0309 14:01:00.347864 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: 
\"kubernetes.io/secret/6ec256c5-cf20-4b12-bb84-0f5d3e02460a-fernet-keys\") pod \"keystone-cron-29551081-wz9hv\" (UID: \"6ec256c5-cf20-4b12-bb84-0f5d3e02460a\") " pod="openstack/keystone-cron-29551081-wz9hv" Mar 09 14:01:00 crc kubenswrapper[4764]: I0309 14:01:00.356496 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ec256c5-cf20-4b12-bb84-0f5d3e02460a-combined-ca-bundle\") pod \"keystone-cron-29551081-wz9hv\" (UID: \"6ec256c5-cf20-4b12-bb84-0f5d3e02460a\") " pod="openstack/keystone-cron-29551081-wz9hv" Mar 09 14:01:00 crc kubenswrapper[4764]: I0309 14:01:00.367551 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jprwh\" (UniqueName: \"kubernetes.io/projected/6ec256c5-cf20-4b12-bb84-0f5d3e02460a-kube-api-access-jprwh\") pod \"keystone-cron-29551081-wz9hv\" (UID: \"6ec256c5-cf20-4b12-bb84-0f5d3e02460a\") " pod="openstack/keystone-cron-29551081-wz9hv" Mar 09 14:01:00 crc kubenswrapper[4764]: I0309 14:01:00.484022 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29551081-wz9hv" Mar 09 14:01:00 crc kubenswrapper[4764]: I0309 14:01:00.997951 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29551081-wz9hv"] Mar 09 14:01:01 crc kubenswrapper[4764]: I0309 14:01:01.501609 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29551081-wz9hv" event={"ID":"6ec256c5-cf20-4b12-bb84-0f5d3e02460a","Type":"ContainerStarted","Data":"74125e64c2f42534639475226ea119db97fbe809fe5256ca1d1f86b0e608d59a"} Mar 09 14:01:01 crc kubenswrapper[4764]: I0309 14:01:01.501744 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29551081-wz9hv" event={"ID":"6ec256c5-cf20-4b12-bb84-0f5d3e02460a","Type":"ContainerStarted","Data":"ae6f384de670520ca16ff59918eb69c70be5c807796ecc67aeb342a5218c418b"} Mar 09 14:01:01 crc kubenswrapper[4764]: I0309 14:01:01.529134 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-cron-29551081-wz9hv" podStartSLOduration=1.529099712 podStartE2EDuration="1.529099712s" podCreationTimestamp="2026-03-09 14:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 14:01:01.523979795 +0000 UTC m=+2416.774151703" watchObservedRunningTime="2026-03-09 14:01:01.529099712 +0000 UTC m=+2416.779271640" Mar 09 14:01:03 crc kubenswrapper[4764]: I0309 14:01:03.519857 4764 generic.go:334] "Generic (PLEG): container finished" podID="6ec256c5-cf20-4b12-bb84-0f5d3e02460a" containerID="74125e64c2f42534639475226ea119db97fbe809fe5256ca1d1f86b0e608d59a" exitCode=0 Mar 09 14:01:03 crc kubenswrapper[4764]: I0309 14:01:03.519940 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29551081-wz9hv" 
event={"ID":"6ec256c5-cf20-4b12-bb84-0f5d3e02460a","Type":"ContainerDied","Data":"74125e64c2f42534639475226ea119db97fbe809fe5256ca1d1f86b0e608d59a"} Mar 09 14:01:04 crc kubenswrapper[4764]: I0309 14:01:04.878361 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29551081-wz9hv" Mar 09 14:01:04 crc kubenswrapper[4764]: I0309 14:01:04.945303 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jprwh\" (UniqueName: \"kubernetes.io/projected/6ec256c5-cf20-4b12-bb84-0f5d3e02460a-kube-api-access-jprwh\") pod \"6ec256c5-cf20-4b12-bb84-0f5d3e02460a\" (UID: \"6ec256c5-cf20-4b12-bb84-0f5d3e02460a\") " Mar 09 14:01:04 crc kubenswrapper[4764]: I0309 14:01:04.945397 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ec256c5-cf20-4b12-bb84-0f5d3e02460a-combined-ca-bundle\") pod \"6ec256c5-cf20-4b12-bb84-0f5d3e02460a\" (UID: \"6ec256c5-cf20-4b12-bb84-0f5d3e02460a\") " Mar 09 14:01:04 crc kubenswrapper[4764]: I0309 14:01:04.945461 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6ec256c5-cf20-4b12-bb84-0f5d3e02460a-config-data\") pod \"6ec256c5-cf20-4b12-bb84-0f5d3e02460a\" (UID: \"6ec256c5-cf20-4b12-bb84-0f5d3e02460a\") " Mar 09 14:01:04 crc kubenswrapper[4764]: I0309 14:01:04.945618 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/6ec256c5-cf20-4b12-bb84-0f5d3e02460a-fernet-keys\") pod \"6ec256c5-cf20-4b12-bb84-0f5d3e02460a\" (UID: \"6ec256c5-cf20-4b12-bb84-0f5d3e02460a\") " Mar 09 14:01:04 crc kubenswrapper[4764]: I0309 14:01:04.952696 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ec256c5-cf20-4b12-bb84-0f5d3e02460a-fernet-keys" (OuterVolumeSpecName: 
"fernet-keys") pod "6ec256c5-cf20-4b12-bb84-0f5d3e02460a" (UID: "6ec256c5-cf20-4b12-bb84-0f5d3e02460a"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:01:04 crc kubenswrapper[4764]: I0309 14:01:04.956988 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ec256c5-cf20-4b12-bb84-0f5d3e02460a-kube-api-access-jprwh" (OuterVolumeSpecName: "kube-api-access-jprwh") pod "6ec256c5-cf20-4b12-bb84-0f5d3e02460a" (UID: "6ec256c5-cf20-4b12-bb84-0f5d3e02460a"). InnerVolumeSpecName "kube-api-access-jprwh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:01:04 crc kubenswrapper[4764]: I0309 14:01:04.975834 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ec256c5-cf20-4b12-bb84-0f5d3e02460a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6ec256c5-cf20-4b12-bb84-0f5d3e02460a" (UID: "6ec256c5-cf20-4b12-bb84-0f5d3e02460a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:01:04 crc kubenswrapper[4764]: I0309 14:01:04.999945 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ec256c5-cf20-4b12-bb84-0f5d3e02460a-config-data" (OuterVolumeSpecName: "config-data") pod "6ec256c5-cf20-4b12-bb84-0f5d3e02460a" (UID: "6ec256c5-cf20-4b12-bb84-0f5d3e02460a"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:01:05 crc kubenswrapper[4764]: I0309 14:01:05.047364 4764 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6ec256c5-cf20-4b12-bb84-0f5d3e02460a-config-data\") on node \"crc\" DevicePath \"\"" Mar 09 14:01:05 crc kubenswrapper[4764]: I0309 14:01:05.047746 4764 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/6ec256c5-cf20-4b12-bb84-0f5d3e02460a-fernet-keys\") on node \"crc\" DevicePath \"\"" Mar 09 14:01:05 crc kubenswrapper[4764]: I0309 14:01:05.047811 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jprwh\" (UniqueName: \"kubernetes.io/projected/6ec256c5-cf20-4b12-bb84-0f5d3e02460a-kube-api-access-jprwh\") on node \"crc\" DevicePath \"\"" Mar 09 14:01:05 crc kubenswrapper[4764]: I0309 14:01:05.047883 4764 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ec256c5-cf20-4b12-bb84-0f5d3e02460a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 14:01:05 crc kubenswrapper[4764]: I0309 14:01:05.542531 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29551081-wz9hv" event={"ID":"6ec256c5-cf20-4b12-bb84-0f5d3e02460a","Type":"ContainerDied","Data":"ae6f384de670520ca16ff59918eb69c70be5c807796ecc67aeb342a5218c418b"} Mar 09 14:01:05 crc kubenswrapper[4764]: I0309 14:01:05.542596 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ae6f384de670520ca16ff59918eb69c70be5c807796ecc67aeb342a5218c418b" Mar 09 14:01:05 crc kubenswrapper[4764]: I0309 14:01:05.542603 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29551081-wz9hv" Mar 09 14:01:05 crc kubenswrapper[4764]: I0309 14:01:05.580142 4764 scope.go:117] "RemoveContainer" containerID="827bb36b06cc9f42c479b857320543a7b207fe3138757e8a034890cd3b0b0211" Mar 09 14:01:05 crc kubenswrapper[4764]: E0309 14:01:05.582955 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xxczl_openshift-machine-config-operator(6bcdd179-43c2-427c-9fac-7155c122e922)\"" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922" Mar 09 14:01:06 crc kubenswrapper[4764]: I0309 14:01:06.555243 4764 generic.go:334] "Generic (PLEG): container finished" podID="949d7512-b3be-4068-b05a-20589fbc2b52" containerID="39847ae98cf6e663c2d6e6155c011decc4cdfda221bf659392fa3d752f138247" exitCode=0 Mar 09 14:01:06 crc kubenswrapper[4764]: I0309 14:01:06.555376 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-5qt8s" event={"ID":"949d7512-b3be-4068-b05a-20589fbc2b52","Type":"ContainerDied","Data":"39847ae98cf6e663c2d6e6155c011decc4cdfda221bf659392fa3d752f138247"} Mar 09 14:01:08 crc kubenswrapper[4764]: I0309 14:01:08.004272 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-5qt8s" Mar 09 14:01:08 crc kubenswrapper[4764]: I0309 14:01:08.115384 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/949d7512-b3be-4068-b05a-20589fbc2b52-openstack-edpm-ipam-ovn-default-certs-0\") pod \"949d7512-b3be-4068-b05a-20589fbc2b52\" (UID: \"949d7512-b3be-4068-b05a-20589fbc2b52\") " Mar 09 14:01:08 crc kubenswrapper[4764]: I0309 14:01:08.115456 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6zx9k\" (UniqueName: \"kubernetes.io/projected/949d7512-b3be-4068-b05a-20589fbc2b52-kube-api-access-6zx9k\") pod \"949d7512-b3be-4068-b05a-20589fbc2b52\" (UID: \"949d7512-b3be-4068-b05a-20589fbc2b52\") " Mar 09 14:01:08 crc kubenswrapper[4764]: I0309 14:01:08.115522 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/949d7512-b3be-4068-b05a-20589fbc2b52-libvirt-combined-ca-bundle\") pod \"949d7512-b3be-4068-b05a-20589fbc2b52\" (UID: \"949d7512-b3be-4068-b05a-20589fbc2b52\") " Mar 09 14:01:08 crc kubenswrapper[4764]: I0309 14:01:08.115562 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/949d7512-b3be-4068-b05a-20589fbc2b52-inventory\") pod \"949d7512-b3be-4068-b05a-20589fbc2b52\" (UID: \"949d7512-b3be-4068-b05a-20589fbc2b52\") " Mar 09 14:01:08 crc kubenswrapper[4764]: I0309 14:01:08.115616 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/949d7512-b3be-4068-b05a-20589fbc2b52-ceph\") pod \"949d7512-b3be-4068-b05a-20589fbc2b52\" (UID: \"949d7512-b3be-4068-b05a-20589fbc2b52\") " Mar 09 14:01:08 crc kubenswrapper[4764]: I0309 14:01:08.115745 4764 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/949d7512-b3be-4068-b05a-20589fbc2b52-nova-combined-ca-bundle\") pod \"949d7512-b3be-4068-b05a-20589fbc2b52\" (UID: \"949d7512-b3be-4068-b05a-20589fbc2b52\") " Mar 09 14:01:08 crc kubenswrapper[4764]: I0309 14:01:08.115801 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/949d7512-b3be-4068-b05a-20589fbc2b52-neutron-metadata-combined-ca-bundle\") pod \"949d7512-b3be-4068-b05a-20589fbc2b52\" (UID: \"949d7512-b3be-4068-b05a-20589fbc2b52\") " Mar 09 14:01:08 crc kubenswrapper[4764]: I0309 14:01:08.115869 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/949d7512-b3be-4068-b05a-20589fbc2b52-repo-setup-combined-ca-bundle\") pod \"949d7512-b3be-4068-b05a-20589fbc2b52\" (UID: \"949d7512-b3be-4068-b05a-20589fbc2b52\") " Mar 09 14:01:08 crc kubenswrapper[4764]: I0309 14:01:08.115972 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/949d7512-b3be-4068-b05a-20589fbc2b52-bootstrap-combined-ca-bundle\") pod \"949d7512-b3be-4068-b05a-20589fbc2b52\" (UID: \"949d7512-b3be-4068-b05a-20589fbc2b52\") " Mar 09 14:01:08 crc kubenswrapper[4764]: I0309 14:01:08.116005 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/949d7512-b3be-4068-b05a-20589fbc2b52-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"949d7512-b3be-4068-b05a-20589fbc2b52\" (UID: \"949d7512-b3be-4068-b05a-20589fbc2b52\") " Mar 09 14:01:08 crc kubenswrapper[4764]: I0309 14:01:08.116585 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/949d7512-b3be-4068-b05a-20589fbc2b52-ssh-key-openstack-edpm-ipam\") pod \"949d7512-b3be-4068-b05a-20589fbc2b52\" (UID: \"949d7512-b3be-4068-b05a-20589fbc2b52\") " Mar 09 14:01:08 crc kubenswrapper[4764]: I0309 14:01:08.116634 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/949d7512-b3be-4068-b05a-20589fbc2b52-ovn-combined-ca-bundle\") pod \"949d7512-b3be-4068-b05a-20589fbc2b52\" (UID: \"949d7512-b3be-4068-b05a-20589fbc2b52\") " Mar 09 14:01:08 crc kubenswrapper[4764]: I0309 14:01:08.116797 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/949d7512-b3be-4068-b05a-20589fbc2b52-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"949d7512-b3be-4068-b05a-20589fbc2b52\" (UID: \"949d7512-b3be-4068-b05a-20589fbc2b52\") " Mar 09 14:01:08 crc kubenswrapper[4764]: I0309 14:01:08.122878 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/949d7512-b3be-4068-b05a-20589fbc2b52-openstack-edpm-ipam-libvirt-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-libvirt-default-certs-0") pod "949d7512-b3be-4068-b05a-20589fbc2b52" (UID: "949d7512-b3be-4068-b05a-20589fbc2b52"). InnerVolumeSpecName "openstack-edpm-ipam-libvirt-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:01:08 crc kubenswrapper[4764]: I0309 14:01:08.123893 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/949d7512-b3be-4068-b05a-20589fbc2b52-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "949d7512-b3be-4068-b05a-20589fbc2b52" (UID: "949d7512-b3be-4068-b05a-20589fbc2b52"). 
InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:01:08 crc kubenswrapper[4764]: I0309 14:01:08.123934 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/949d7512-b3be-4068-b05a-20589fbc2b52-ceph" (OuterVolumeSpecName: "ceph") pod "949d7512-b3be-4068-b05a-20589fbc2b52" (UID: "949d7512-b3be-4068-b05a-20589fbc2b52"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:01:08 crc kubenswrapper[4764]: I0309 14:01:08.124073 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/949d7512-b3be-4068-b05a-20589fbc2b52-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "949d7512-b3be-4068-b05a-20589fbc2b52" (UID: "949d7512-b3be-4068-b05a-20589fbc2b52"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:01:08 crc kubenswrapper[4764]: I0309 14:01:08.124095 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/949d7512-b3be-4068-b05a-20589fbc2b52-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "949d7512-b3be-4068-b05a-20589fbc2b52" (UID: "949d7512-b3be-4068-b05a-20589fbc2b52"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:01:08 crc kubenswrapper[4764]: I0309 14:01:08.124685 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/949d7512-b3be-4068-b05a-20589fbc2b52-openstack-edpm-ipam-neutron-metadata-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-neutron-metadata-default-certs-0") pod "949d7512-b3be-4068-b05a-20589fbc2b52" (UID: "949d7512-b3be-4068-b05a-20589fbc2b52"). InnerVolumeSpecName "openstack-edpm-ipam-neutron-metadata-default-certs-0". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:01:08 crc kubenswrapper[4764]: I0309 14:01:08.124709 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/949d7512-b3be-4068-b05a-20589fbc2b52-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "949d7512-b3be-4068-b05a-20589fbc2b52" (UID: "949d7512-b3be-4068-b05a-20589fbc2b52"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:01:08 crc kubenswrapper[4764]: I0309 14:01:08.126862 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/949d7512-b3be-4068-b05a-20589fbc2b52-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "949d7512-b3be-4068-b05a-20589fbc2b52" (UID: "949d7512-b3be-4068-b05a-20589fbc2b52"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:01:08 crc kubenswrapper[4764]: I0309 14:01:08.129054 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/949d7512-b3be-4068-b05a-20589fbc2b52-openstack-edpm-ipam-ovn-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-ovn-default-certs-0") pod "949d7512-b3be-4068-b05a-20589fbc2b52" (UID: "949d7512-b3be-4068-b05a-20589fbc2b52"). InnerVolumeSpecName "openstack-edpm-ipam-ovn-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:01:08 crc kubenswrapper[4764]: I0309 14:01:08.129864 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/949d7512-b3be-4068-b05a-20589fbc2b52-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "949d7512-b3be-4068-b05a-20589fbc2b52" (UID: "949d7512-b3be-4068-b05a-20589fbc2b52"). InnerVolumeSpecName "nova-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:01:08 crc kubenswrapper[4764]: I0309 14:01:08.132254 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/949d7512-b3be-4068-b05a-20589fbc2b52-kube-api-access-6zx9k" (OuterVolumeSpecName: "kube-api-access-6zx9k") pod "949d7512-b3be-4068-b05a-20589fbc2b52" (UID: "949d7512-b3be-4068-b05a-20589fbc2b52"). InnerVolumeSpecName "kube-api-access-6zx9k". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:01:08 crc kubenswrapper[4764]: I0309 14:01:08.144915 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/949d7512-b3be-4068-b05a-20589fbc2b52-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "949d7512-b3be-4068-b05a-20589fbc2b52" (UID: "949d7512-b3be-4068-b05a-20589fbc2b52"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:01:08 crc kubenswrapper[4764]: I0309 14:01:08.148037 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/949d7512-b3be-4068-b05a-20589fbc2b52-inventory" (OuterVolumeSpecName: "inventory") pod "949d7512-b3be-4068-b05a-20589fbc2b52" (UID: "949d7512-b3be-4068-b05a-20589fbc2b52"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:01:08 crc kubenswrapper[4764]: I0309 14:01:08.219751 4764 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/949d7512-b3be-4068-b05a-20589fbc2b52-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 14:01:08 crc kubenswrapper[4764]: I0309 14:01:08.219799 4764 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/949d7512-b3be-4068-b05a-20589fbc2b52-openstack-edpm-ipam-neutron-metadata-default-certs-0\") on node \"crc\" DevicePath \"\"" Mar 09 14:01:08 crc kubenswrapper[4764]: I0309 14:01:08.219819 4764 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/949d7512-b3be-4068-b05a-20589fbc2b52-openstack-edpm-ipam-ovn-default-certs-0\") on node \"crc\" DevicePath \"\"" Mar 09 14:01:08 crc kubenswrapper[4764]: I0309 14:01:08.219834 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6zx9k\" (UniqueName: \"kubernetes.io/projected/949d7512-b3be-4068-b05a-20589fbc2b52-kube-api-access-6zx9k\") on node \"crc\" DevicePath \"\"" Mar 09 14:01:08 crc kubenswrapper[4764]: I0309 14:01:08.219850 4764 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/949d7512-b3be-4068-b05a-20589fbc2b52-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 14:01:08 crc kubenswrapper[4764]: I0309 14:01:08.219864 4764 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/949d7512-b3be-4068-b05a-20589fbc2b52-inventory\") on node \"crc\" DevicePath \"\"" Mar 09 14:01:08 crc kubenswrapper[4764]: I0309 14:01:08.219876 4764 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: 
\"kubernetes.io/secret/949d7512-b3be-4068-b05a-20589fbc2b52-ceph\") on node \"crc\" DevicePath \"\"" Mar 09 14:01:08 crc kubenswrapper[4764]: I0309 14:01:08.219889 4764 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/949d7512-b3be-4068-b05a-20589fbc2b52-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 14:01:08 crc kubenswrapper[4764]: I0309 14:01:08.219899 4764 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/949d7512-b3be-4068-b05a-20589fbc2b52-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 14:01:08 crc kubenswrapper[4764]: I0309 14:01:08.219910 4764 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/949d7512-b3be-4068-b05a-20589fbc2b52-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 14:01:08 crc kubenswrapper[4764]: I0309 14:01:08.219921 4764 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/949d7512-b3be-4068-b05a-20589fbc2b52-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 14:01:08 crc kubenswrapper[4764]: I0309 14:01:08.219932 4764 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/949d7512-b3be-4068-b05a-20589fbc2b52-openstack-edpm-ipam-libvirt-default-certs-0\") on node \"crc\" DevicePath \"\"" Mar 09 14:01:08 crc kubenswrapper[4764]: I0309 14:01:08.219943 4764 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/949d7512-b3be-4068-b05a-20589fbc2b52-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 09 14:01:08 crc kubenswrapper[4764]: I0309 14:01:08.575670 4764 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-5qt8s" event={"ID":"949d7512-b3be-4068-b05a-20589fbc2b52","Type":"ContainerDied","Data":"fd45c11e7b2b7a0ad5cb655148b378fb1667ce6154e5c572398f320b6660f275"} Mar 09 14:01:08 crc kubenswrapper[4764]: I0309 14:01:08.575729 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fd45c11e7b2b7a0ad5cb655148b378fb1667ce6154e5c572398f320b6660f275" Mar 09 14:01:08 crc kubenswrapper[4764]: I0309 14:01:08.575869 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-5qt8s" Mar 09 14:01:08 crc kubenswrapper[4764]: I0309 14:01:08.730106 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-lc4lp"] Mar 09 14:01:08 crc kubenswrapper[4764]: E0309 14:01:08.731301 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ec256c5-cf20-4b12-bb84-0f5d3e02460a" containerName="keystone-cron" Mar 09 14:01:08 crc kubenswrapper[4764]: I0309 14:01:08.731326 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ec256c5-cf20-4b12-bb84-0f5d3e02460a" containerName="keystone-cron" Mar 09 14:01:08 crc kubenswrapper[4764]: E0309 14:01:08.731375 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="949d7512-b3be-4068-b05a-20589fbc2b52" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Mar 09 14:01:08 crc kubenswrapper[4764]: I0309 14:01:08.731385 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="949d7512-b3be-4068-b05a-20589fbc2b52" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Mar 09 14:01:08 crc kubenswrapper[4764]: I0309 14:01:08.731615 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="6ec256c5-cf20-4b12-bb84-0f5d3e02460a" containerName="keystone-cron" Mar 09 14:01:08 crc kubenswrapper[4764]: I0309 14:01:08.731663 4764 
memory_manager.go:354] "RemoveStaleState removing state" podUID="949d7512-b3be-4068-b05a-20589fbc2b52" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Mar 09 14:01:08 crc kubenswrapper[4764]: I0309 14:01:08.732584 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-lc4lp" Mar 09 14:01:08 crc kubenswrapper[4764]: I0309 14:01:08.735758 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-ctrj8" Mar 09 14:01:08 crc kubenswrapper[4764]: I0309 14:01:08.735769 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 09 14:01:08 crc kubenswrapper[4764]: I0309 14:01:08.736083 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Mar 09 14:01:08 crc kubenswrapper[4764]: I0309 14:01:08.736091 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 09 14:01:08 crc kubenswrapper[4764]: I0309 14:01:08.736479 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 09 14:01:08 crc kubenswrapper[4764]: I0309 14:01:08.751159 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-lc4lp"] Mar 09 14:01:08 crc kubenswrapper[4764]: I0309 14:01:08.829413 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fnqm7\" (UniqueName: \"kubernetes.io/projected/e8ac27d6-e52e-4d38-b772-6ada493e746f-kube-api-access-fnqm7\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-lc4lp\" (UID: \"e8ac27d6-e52e-4d38-b772-6ada493e746f\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-lc4lp" Mar 09 14:01:08 crc kubenswrapper[4764]: I0309 14:01:08.829470 4764 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/e8ac27d6-e52e-4d38-b772-6ada493e746f-ceph\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-lc4lp\" (UID: \"e8ac27d6-e52e-4d38-b772-6ada493e746f\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-lc4lp" Mar 09 14:01:08 crc kubenswrapper[4764]: I0309 14:01:08.829520 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e8ac27d6-e52e-4d38-b772-6ada493e746f-ssh-key-openstack-edpm-ipam\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-lc4lp\" (UID: \"e8ac27d6-e52e-4d38-b772-6ada493e746f\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-lc4lp" Mar 09 14:01:08 crc kubenswrapper[4764]: I0309 14:01:08.829540 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e8ac27d6-e52e-4d38-b772-6ada493e746f-inventory\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-lc4lp\" (UID: \"e8ac27d6-e52e-4d38-b772-6ada493e746f\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-lc4lp" Mar 09 14:01:08 crc kubenswrapper[4764]: I0309 14:01:08.933117 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fnqm7\" (UniqueName: \"kubernetes.io/projected/e8ac27d6-e52e-4d38-b772-6ada493e746f-kube-api-access-fnqm7\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-lc4lp\" (UID: \"e8ac27d6-e52e-4d38-b772-6ada493e746f\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-lc4lp" Mar 09 14:01:08 crc kubenswrapper[4764]: I0309 14:01:08.933219 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/e8ac27d6-e52e-4d38-b772-6ada493e746f-ceph\") pod 
\"ceph-client-edpm-deployment-openstack-edpm-ipam-lc4lp\" (UID: \"e8ac27d6-e52e-4d38-b772-6ada493e746f\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-lc4lp" Mar 09 14:01:08 crc kubenswrapper[4764]: I0309 14:01:08.933318 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e8ac27d6-e52e-4d38-b772-6ada493e746f-ssh-key-openstack-edpm-ipam\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-lc4lp\" (UID: \"e8ac27d6-e52e-4d38-b772-6ada493e746f\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-lc4lp" Mar 09 14:01:08 crc kubenswrapper[4764]: I0309 14:01:08.933351 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e8ac27d6-e52e-4d38-b772-6ada493e746f-inventory\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-lc4lp\" (UID: \"e8ac27d6-e52e-4d38-b772-6ada493e746f\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-lc4lp" Mar 09 14:01:08 crc kubenswrapper[4764]: I0309 14:01:08.938673 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e8ac27d6-e52e-4d38-b772-6ada493e746f-ssh-key-openstack-edpm-ipam\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-lc4lp\" (UID: \"e8ac27d6-e52e-4d38-b772-6ada493e746f\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-lc4lp" Mar 09 14:01:08 crc kubenswrapper[4764]: I0309 14:01:08.938742 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/e8ac27d6-e52e-4d38-b772-6ada493e746f-ceph\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-lc4lp\" (UID: \"e8ac27d6-e52e-4d38-b772-6ada493e746f\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-lc4lp" Mar 09 14:01:08 crc kubenswrapper[4764]: I0309 14:01:08.939312 4764 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e8ac27d6-e52e-4d38-b772-6ada493e746f-inventory\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-lc4lp\" (UID: \"e8ac27d6-e52e-4d38-b772-6ada493e746f\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-lc4lp" Mar 09 14:01:08 crc kubenswrapper[4764]: I0309 14:01:08.954246 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fnqm7\" (UniqueName: \"kubernetes.io/projected/e8ac27d6-e52e-4d38-b772-6ada493e746f-kube-api-access-fnqm7\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-lc4lp\" (UID: \"e8ac27d6-e52e-4d38-b772-6ada493e746f\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-lc4lp" Mar 09 14:01:09 crc kubenswrapper[4764]: I0309 14:01:09.063534 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-lc4lp" Mar 09 14:01:09 crc kubenswrapper[4764]: I0309 14:01:09.628884 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-lc4lp"] Mar 09 14:01:09 crc kubenswrapper[4764]: I0309 14:01:09.631967 4764 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 09 14:01:10 crc kubenswrapper[4764]: I0309 14:01:10.597747 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-lc4lp" event={"ID":"e8ac27d6-e52e-4d38-b772-6ada493e746f","Type":"ContainerStarted","Data":"61754814ddd979027f1efaca27564db157bd11cd9aa5636858b9aa55143aff7c"} Mar 09 14:01:10 crc kubenswrapper[4764]: I0309 14:01:10.598285 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-lc4lp" 
event={"ID":"e8ac27d6-e52e-4d38-b772-6ada493e746f","Type":"ContainerStarted","Data":"85bb75c3b5612ff132d5c2c795945c0084bae3764401db098ff5980fb2ca16ec"} Mar 09 14:01:10 crc kubenswrapper[4764]: I0309 14:01:10.621097 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-lc4lp" podStartSLOduration=2.195519638 podStartE2EDuration="2.621069521s" podCreationTimestamp="2026-03-09 14:01:08 +0000 UTC" firstStartedPulling="2026-03-09 14:01:09.631604871 +0000 UTC m=+2424.881776779" lastFinishedPulling="2026-03-09 14:01:10.057154764 +0000 UTC m=+2425.307326662" observedRunningTime="2026-03-09 14:01:10.617769453 +0000 UTC m=+2425.867941381" watchObservedRunningTime="2026-03-09 14:01:10.621069521 +0000 UTC m=+2425.871241439" Mar 09 14:01:15 crc kubenswrapper[4764]: I0309 14:01:15.645162 4764 generic.go:334] "Generic (PLEG): container finished" podID="e8ac27d6-e52e-4d38-b772-6ada493e746f" containerID="61754814ddd979027f1efaca27564db157bd11cd9aa5636858b9aa55143aff7c" exitCode=0 Mar 09 14:01:15 crc kubenswrapper[4764]: I0309 14:01:15.645258 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-lc4lp" event={"ID":"e8ac27d6-e52e-4d38-b772-6ada493e746f","Type":"ContainerDied","Data":"61754814ddd979027f1efaca27564db157bd11cd9aa5636858b9aa55143aff7c"} Mar 09 14:01:17 crc kubenswrapper[4764]: I0309 14:01:17.128562 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-lc4lp" Mar 09 14:01:17 crc kubenswrapper[4764]: I0309 14:01:17.313389 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fnqm7\" (UniqueName: \"kubernetes.io/projected/e8ac27d6-e52e-4d38-b772-6ada493e746f-kube-api-access-fnqm7\") pod \"e8ac27d6-e52e-4d38-b772-6ada493e746f\" (UID: \"e8ac27d6-e52e-4d38-b772-6ada493e746f\") " Mar 09 14:01:17 crc kubenswrapper[4764]: I0309 14:01:17.313461 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/e8ac27d6-e52e-4d38-b772-6ada493e746f-ceph\") pod \"e8ac27d6-e52e-4d38-b772-6ada493e746f\" (UID: \"e8ac27d6-e52e-4d38-b772-6ada493e746f\") " Mar 09 14:01:17 crc kubenswrapper[4764]: I0309 14:01:17.313527 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e8ac27d6-e52e-4d38-b772-6ada493e746f-ssh-key-openstack-edpm-ipam\") pod \"e8ac27d6-e52e-4d38-b772-6ada493e746f\" (UID: \"e8ac27d6-e52e-4d38-b772-6ada493e746f\") " Mar 09 14:01:17 crc kubenswrapper[4764]: I0309 14:01:17.313612 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e8ac27d6-e52e-4d38-b772-6ada493e746f-inventory\") pod \"e8ac27d6-e52e-4d38-b772-6ada493e746f\" (UID: \"e8ac27d6-e52e-4d38-b772-6ada493e746f\") " Mar 09 14:01:17 crc kubenswrapper[4764]: I0309 14:01:17.321935 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e8ac27d6-e52e-4d38-b772-6ada493e746f-ceph" (OuterVolumeSpecName: "ceph") pod "e8ac27d6-e52e-4d38-b772-6ada493e746f" (UID: "e8ac27d6-e52e-4d38-b772-6ada493e746f"). InnerVolumeSpecName "ceph". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:01:17 crc kubenswrapper[4764]: I0309 14:01:17.327101 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e8ac27d6-e52e-4d38-b772-6ada493e746f-kube-api-access-fnqm7" (OuterVolumeSpecName: "kube-api-access-fnqm7") pod "e8ac27d6-e52e-4d38-b772-6ada493e746f" (UID: "e8ac27d6-e52e-4d38-b772-6ada493e746f"). InnerVolumeSpecName "kube-api-access-fnqm7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:01:17 crc kubenswrapper[4764]: I0309 14:01:17.347338 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e8ac27d6-e52e-4d38-b772-6ada493e746f-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "e8ac27d6-e52e-4d38-b772-6ada493e746f" (UID: "e8ac27d6-e52e-4d38-b772-6ada493e746f"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:01:17 crc kubenswrapper[4764]: I0309 14:01:17.349132 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e8ac27d6-e52e-4d38-b772-6ada493e746f-inventory" (OuterVolumeSpecName: "inventory") pod "e8ac27d6-e52e-4d38-b772-6ada493e746f" (UID: "e8ac27d6-e52e-4d38-b772-6ada493e746f"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:01:17 crc kubenswrapper[4764]: I0309 14:01:17.416165 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fnqm7\" (UniqueName: \"kubernetes.io/projected/e8ac27d6-e52e-4d38-b772-6ada493e746f-kube-api-access-fnqm7\") on node \"crc\" DevicePath \"\"" Mar 09 14:01:17 crc kubenswrapper[4764]: I0309 14:01:17.416217 4764 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/e8ac27d6-e52e-4d38-b772-6ada493e746f-ceph\") on node \"crc\" DevicePath \"\"" Mar 09 14:01:17 crc kubenswrapper[4764]: I0309 14:01:17.416233 4764 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e8ac27d6-e52e-4d38-b772-6ada493e746f-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 09 14:01:17 crc kubenswrapper[4764]: I0309 14:01:17.416245 4764 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e8ac27d6-e52e-4d38-b772-6ada493e746f-inventory\") on node \"crc\" DevicePath \"\"" Mar 09 14:01:17 crc kubenswrapper[4764]: I0309 14:01:17.665477 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-lc4lp" event={"ID":"e8ac27d6-e52e-4d38-b772-6ada493e746f","Type":"ContainerDied","Data":"85bb75c3b5612ff132d5c2c795945c0084bae3764401db098ff5980fb2ca16ec"} Mar 09 14:01:17 crc kubenswrapper[4764]: I0309 14:01:17.665529 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="85bb75c3b5612ff132d5c2c795945c0084bae3764401db098ff5980fb2ca16ec" Mar 09 14:01:17 crc kubenswrapper[4764]: I0309 14:01:17.665537 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-lc4lp" Mar 09 14:01:17 crc kubenswrapper[4764]: I0309 14:01:17.770228 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-bkvx8"] Mar 09 14:01:17 crc kubenswrapper[4764]: E0309 14:01:17.770795 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e8ac27d6-e52e-4d38-b772-6ada493e746f" containerName="ceph-client-edpm-deployment-openstack-edpm-ipam" Mar 09 14:01:17 crc kubenswrapper[4764]: I0309 14:01:17.770818 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="e8ac27d6-e52e-4d38-b772-6ada493e746f" containerName="ceph-client-edpm-deployment-openstack-edpm-ipam" Mar 09 14:01:17 crc kubenswrapper[4764]: I0309 14:01:17.771017 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="e8ac27d6-e52e-4d38-b772-6ada493e746f" containerName="ceph-client-edpm-deployment-openstack-edpm-ipam" Mar 09 14:01:17 crc kubenswrapper[4764]: I0309 14:01:17.771730 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-bkvx8" Mar 09 14:01:17 crc kubenswrapper[4764]: I0309 14:01:17.778686 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-bkvx8"] Mar 09 14:01:17 crc kubenswrapper[4764]: I0309 14:01:17.786401 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Mar 09 14:01:17 crc kubenswrapper[4764]: I0309 14:01:17.786471 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 09 14:01:17 crc kubenswrapper[4764]: I0309 14:01:17.786402 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 09 14:01:17 crc kubenswrapper[4764]: I0309 14:01:17.786764 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-config" Mar 09 14:01:17 crc kubenswrapper[4764]: I0309 14:01:17.786860 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-ctrj8" Mar 09 14:01:17 crc kubenswrapper[4764]: I0309 14:01:17.786932 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 09 14:01:17 crc kubenswrapper[4764]: I0309 14:01:17.836991 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ede2526d-593a-4258-9ec2-172270be638a-ssh-key-openstack-edpm-ipam\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-bkvx8\" (UID: \"ede2526d-593a-4258-9ec2-172270be638a\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-bkvx8" Mar 09 14:01:17 crc kubenswrapper[4764]: I0309 14:01:17.837076 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/ede2526d-593a-4258-9ec2-172270be638a-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-bkvx8\" (UID: \"ede2526d-593a-4258-9ec2-172270be638a\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-bkvx8" Mar 09 14:01:17 crc kubenswrapper[4764]: I0309 14:01:17.837160 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/ede2526d-593a-4258-9ec2-172270be638a-ceph\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-bkvx8\" (UID: \"ede2526d-593a-4258-9ec2-172270be638a\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-bkvx8" Mar 09 14:01:17 crc kubenswrapper[4764]: I0309 14:01:17.837191 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ede2526d-593a-4258-9ec2-172270be638a-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-bkvx8\" (UID: \"ede2526d-593a-4258-9ec2-172270be638a\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-bkvx8" Mar 09 14:01:17 crc kubenswrapper[4764]: I0309 14:01:17.837227 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l422c\" (UniqueName: \"kubernetes.io/projected/ede2526d-593a-4258-9ec2-172270be638a-kube-api-access-l422c\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-bkvx8\" (UID: \"ede2526d-593a-4258-9ec2-172270be638a\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-bkvx8" Mar 09 14:01:17 crc kubenswrapper[4764]: I0309 14:01:17.837268 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/ede2526d-593a-4258-9ec2-172270be638a-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-bkvx8\" (UID: \"ede2526d-593a-4258-9ec2-172270be638a\") " 
pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-bkvx8" Mar 09 14:01:17 crc kubenswrapper[4764]: I0309 14:01:17.939779 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ede2526d-593a-4258-9ec2-172270be638a-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-bkvx8\" (UID: \"ede2526d-593a-4258-9ec2-172270be638a\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-bkvx8" Mar 09 14:01:17 crc kubenswrapper[4764]: I0309 14:01:17.939874 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/ede2526d-593a-4258-9ec2-172270be638a-ceph\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-bkvx8\" (UID: \"ede2526d-593a-4258-9ec2-172270be638a\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-bkvx8" Mar 09 14:01:17 crc kubenswrapper[4764]: I0309 14:01:17.939898 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ede2526d-593a-4258-9ec2-172270be638a-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-bkvx8\" (UID: \"ede2526d-593a-4258-9ec2-172270be638a\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-bkvx8" Mar 09 14:01:17 crc kubenswrapper[4764]: I0309 14:01:17.939938 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l422c\" (UniqueName: \"kubernetes.io/projected/ede2526d-593a-4258-9ec2-172270be638a-kube-api-access-l422c\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-bkvx8\" (UID: \"ede2526d-593a-4258-9ec2-172270be638a\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-bkvx8" Mar 09 14:01:17 crc kubenswrapper[4764]: I0309 14:01:17.939982 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovncontroller-config-0\" (UniqueName: 
\"kubernetes.io/configmap/ede2526d-593a-4258-9ec2-172270be638a-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-bkvx8\" (UID: \"ede2526d-593a-4258-9ec2-172270be638a\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-bkvx8" Mar 09 14:01:17 crc kubenswrapper[4764]: I0309 14:01:17.940059 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ede2526d-593a-4258-9ec2-172270be638a-ssh-key-openstack-edpm-ipam\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-bkvx8\" (UID: \"ede2526d-593a-4258-9ec2-172270be638a\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-bkvx8" Mar 09 14:01:17 crc kubenswrapper[4764]: I0309 14:01:17.941639 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/ede2526d-593a-4258-9ec2-172270be638a-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-bkvx8\" (UID: \"ede2526d-593a-4258-9ec2-172270be638a\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-bkvx8" Mar 09 14:01:17 crc kubenswrapper[4764]: I0309 14:01:17.944439 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ede2526d-593a-4258-9ec2-172270be638a-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-bkvx8\" (UID: \"ede2526d-593a-4258-9ec2-172270be638a\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-bkvx8" Mar 09 14:01:17 crc kubenswrapper[4764]: I0309 14:01:17.945085 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ede2526d-593a-4258-9ec2-172270be638a-ssh-key-openstack-edpm-ipam\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-bkvx8\" (UID: \"ede2526d-593a-4258-9ec2-172270be638a\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-bkvx8" Mar 09 14:01:17 crc 
kubenswrapper[4764]: I0309 14:01:17.945150 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ede2526d-593a-4258-9ec2-172270be638a-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-bkvx8\" (UID: \"ede2526d-593a-4258-9ec2-172270be638a\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-bkvx8" Mar 09 14:01:17 crc kubenswrapper[4764]: I0309 14:01:17.945535 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/ede2526d-593a-4258-9ec2-172270be638a-ceph\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-bkvx8\" (UID: \"ede2526d-593a-4258-9ec2-172270be638a\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-bkvx8" Mar 09 14:01:17 crc kubenswrapper[4764]: I0309 14:01:17.963912 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l422c\" (UniqueName: \"kubernetes.io/projected/ede2526d-593a-4258-9ec2-172270be638a-kube-api-access-l422c\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-bkvx8\" (UID: \"ede2526d-593a-4258-9ec2-172270be638a\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-bkvx8" Mar 09 14:01:18 crc kubenswrapper[4764]: I0309 14:01:18.110456 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-bkvx8" Mar 09 14:01:18 crc kubenswrapper[4764]: I0309 14:01:18.658288 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-bkvx8"] Mar 09 14:01:18 crc kubenswrapper[4764]: I0309 14:01:18.676129 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-bkvx8" event={"ID":"ede2526d-593a-4258-9ec2-172270be638a","Type":"ContainerStarted","Data":"34e4e0e2e0553eae12a91e21b5fe948fe626bbec1e8d92ace1c4dceace8d957e"} Mar 09 14:01:19 crc kubenswrapper[4764]: I0309 14:01:19.688117 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-bkvx8" event={"ID":"ede2526d-593a-4258-9ec2-172270be638a","Type":"ContainerStarted","Data":"2dc5eadd32fcb5eb0fa35e7388b219c67069c6c548b4292850ea9586a4f91976"} Mar 09 14:01:20 crc kubenswrapper[4764]: I0309 14:01:20.560141 4764 scope.go:117] "RemoveContainer" containerID="827bb36b06cc9f42c479b857320543a7b207fe3138757e8a034890cd3b0b0211" Mar 09 14:01:20 crc kubenswrapper[4764]: E0309 14:01:20.560503 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xxczl_openshift-machine-config-operator(6bcdd179-43c2-427c-9fac-7155c122e922)\"" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922" Mar 09 14:01:31 crc kubenswrapper[4764]: I0309 14:01:31.560277 4764 scope.go:117] "RemoveContainer" containerID="827bb36b06cc9f42c479b857320543a7b207fe3138757e8a034890cd3b0b0211" Mar 09 14:01:31 crc kubenswrapper[4764]: E0309 14:01:31.561471 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 
5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xxczl_openshift-machine-config-operator(6bcdd179-43c2-427c-9fac-7155c122e922)\"" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922" Mar 09 14:01:33 crc kubenswrapper[4764]: I0309 14:01:33.906872 4764 prober.go:107] "Probe failed" probeType="Liveness" pod="metallb-system/frr-k8s-kl47c" podUID="9333a95c-85e4-4e7d-a142-ae2dd06b4146" containerName="frr" probeResult="failure" output="Get \"http://127.0.0.1:7573/livez\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 09 14:01:43 crc kubenswrapper[4764]: I0309 14:01:43.560523 4764 scope.go:117] "RemoveContainer" containerID="827bb36b06cc9f42c479b857320543a7b207fe3138757e8a034890cd3b0b0211" Mar 09 14:01:43 crc kubenswrapper[4764]: E0309 14:01:43.561633 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xxczl_openshift-machine-config-operator(6bcdd179-43c2-427c-9fac-7155c122e922)\"" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922" Mar 09 14:01:55 crc kubenswrapper[4764]: I0309 14:01:55.571138 4764 scope.go:117] "RemoveContainer" containerID="827bb36b06cc9f42c479b857320543a7b207fe3138757e8a034890cd3b0b0211" Mar 09 14:01:55 crc kubenswrapper[4764]: E0309 14:01:55.574008 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xxczl_openshift-machine-config-operator(6bcdd179-43c2-427c-9fac-7155c122e922)\"" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922" Mar 09 14:02:00 
crc kubenswrapper[4764]: I0309 14:02:00.155581 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-bkvx8" podStartSLOduration=42.751668104 podStartE2EDuration="43.155558589s" podCreationTimestamp="2026-03-09 14:01:17 +0000 UTC" firstStartedPulling="2026-03-09 14:01:18.666379334 +0000 UTC m=+2433.916551242" lastFinishedPulling="2026-03-09 14:01:19.070269819 +0000 UTC m=+2434.320441727" observedRunningTime="2026-03-09 14:01:19.71597605 +0000 UTC m=+2434.966147958" watchObservedRunningTime="2026-03-09 14:02:00.155558589 +0000 UTC m=+2475.405730497" Mar 09 14:02:00 crc kubenswrapper[4764]: I0309 14:02:00.160673 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29551082-9pffj"] Mar 09 14:02:00 crc kubenswrapper[4764]: I0309 14:02:00.162294 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551082-9pffj" Mar 09 14:02:00 crc kubenswrapper[4764]: I0309 14:02:00.166374 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 09 14:02:00 crc kubenswrapper[4764]: I0309 14:02:00.166600 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-4s7mr" Mar 09 14:02:00 crc kubenswrapper[4764]: I0309 14:02:00.166746 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 09 14:02:00 crc kubenswrapper[4764]: I0309 14:02:00.172415 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551082-9pffj"] Mar 09 14:02:00 crc kubenswrapper[4764]: I0309 14:02:00.295358 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cqd4d\" (UniqueName: \"kubernetes.io/projected/cbc0f639-3ece-4df6-bbaa-af1572005872-kube-api-access-cqd4d\") pod 
\"auto-csr-approver-29551082-9pffj\" (UID: \"cbc0f639-3ece-4df6-bbaa-af1572005872\") " pod="openshift-infra/auto-csr-approver-29551082-9pffj" Mar 09 14:02:00 crc kubenswrapper[4764]: I0309 14:02:00.397884 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqd4d\" (UniqueName: \"kubernetes.io/projected/cbc0f639-3ece-4df6-bbaa-af1572005872-kube-api-access-cqd4d\") pod \"auto-csr-approver-29551082-9pffj\" (UID: \"cbc0f639-3ece-4df6-bbaa-af1572005872\") " pod="openshift-infra/auto-csr-approver-29551082-9pffj" Mar 09 14:02:00 crc kubenswrapper[4764]: I0309 14:02:00.420575 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqd4d\" (UniqueName: \"kubernetes.io/projected/cbc0f639-3ece-4df6-bbaa-af1572005872-kube-api-access-cqd4d\") pod \"auto-csr-approver-29551082-9pffj\" (UID: \"cbc0f639-3ece-4df6-bbaa-af1572005872\") " pod="openshift-infra/auto-csr-approver-29551082-9pffj" Mar 09 14:02:00 crc kubenswrapper[4764]: I0309 14:02:00.491076 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551082-9pffj" Mar 09 14:02:00 crc kubenswrapper[4764]: I0309 14:02:00.955088 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551082-9pffj"] Mar 09 14:02:01 crc kubenswrapper[4764]: I0309 14:02:01.188170 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551082-9pffj" event={"ID":"cbc0f639-3ece-4df6-bbaa-af1572005872","Type":"ContainerStarted","Data":"9ba7e5964b950123ff567487976ceb2369a4ec0837aa4fde8a04e86b8df20854"} Mar 09 14:02:03 crc kubenswrapper[4764]: I0309 14:02:03.209693 4764 generic.go:334] "Generic (PLEG): container finished" podID="cbc0f639-3ece-4df6-bbaa-af1572005872" containerID="083b9151b9954604db8e0ffa63a89b0a3dc0a267db79fa28e9e68016aa4eee5c" exitCode=0 Mar 09 14:02:03 crc kubenswrapper[4764]: I0309 14:02:03.209763 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551082-9pffj" event={"ID":"cbc0f639-3ece-4df6-bbaa-af1572005872","Type":"ContainerDied","Data":"083b9151b9954604db8e0ffa63a89b0a3dc0a267db79fa28e9e68016aa4eee5c"} Mar 09 14:02:04 crc kubenswrapper[4764]: I0309 14:02:04.582356 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551082-9pffj" Mar 09 14:02:04 crc kubenswrapper[4764]: I0309 14:02:04.701504 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cqd4d\" (UniqueName: \"kubernetes.io/projected/cbc0f639-3ece-4df6-bbaa-af1572005872-kube-api-access-cqd4d\") pod \"cbc0f639-3ece-4df6-bbaa-af1572005872\" (UID: \"cbc0f639-3ece-4df6-bbaa-af1572005872\") " Mar 09 14:02:04 crc kubenswrapper[4764]: I0309 14:02:04.710223 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cbc0f639-3ece-4df6-bbaa-af1572005872-kube-api-access-cqd4d" (OuterVolumeSpecName: "kube-api-access-cqd4d") pod "cbc0f639-3ece-4df6-bbaa-af1572005872" (UID: "cbc0f639-3ece-4df6-bbaa-af1572005872"). InnerVolumeSpecName "kube-api-access-cqd4d". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:02:04 crc kubenswrapper[4764]: I0309 14:02:04.805859 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cqd4d\" (UniqueName: \"kubernetes.io/projected/cbc0f639-3ece-4df6-bbaa-af1572005872-kube-api-access-cqd4d\") on node \"crc\" DevicePath \"\"" Mar 09 14:02:05 crc kubenswrapper[4764]: I0309 14:02:05.232197 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551082-9pffj" event={"ID":"cbc0f639-3ece-4df6-bbaa-af1572005872","Type":"ContainerDied","Data":"9ba7e5964b950123ff567487976ceb2369a4ec0837aa4fde8a04e86b8df20854"} Mar 09 14:02:05 crc kubenswrapper[4764]: I0309 14:02:05.232609 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9ba7e5964b950123ff567487976ceb2369a4ec0837aa4fde8a04e86b8df20854" Mar 09 14:02:05 crc kubenswrapper[4764]: I0309 14:02:05.232315 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551082-9pffj" Mar 09 14:02:05 crc kubenswrapper[4764]: I0309 14:02:05.671848 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29551076-qrqw5"] Mar 09 14:02:05 crc kubenswrapper[4764]: I0309 14:02:05.679863 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29551076-qrqw5"] Mar 09 14:02:06 crc kubenswrapper[4764]: I0309 14:02:06.561433 4764 scope.go:117] "RemoveContainer" containerID="827bb36b06cc9f42c479b857320543a7b207fe3138757e8a034890cd3b0b0211" Mar 09 14:02:06 crc kubenswrapper[4764]: E0309 14:02:06.562329 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xxczl_openshift-machine-config-operator(6bcdd179-43c2-427c-9fac-7155c122e922)\"" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922" Mar 09 14:02:07 crc kubenswrapper[4764]: I0309 14:02:07.571571 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7a9d864e-dad7-4c7d-a639-d4042bb3339d" path="/var/lib/kubelet/pods/7a9d864e-dad7-4c7d-a639-d4042bb3339d/volumes" Mar 09 14:02:09 crc kubenswrapper[4764]: I0309 14:02:09.297006 4764 scope.go:117] "RemoveContainer" containerID="8f590f50cdda09a93b8757a7d03d71c1018f7d81bfe8e8784e29856175854a29" Mar 09 14:02:18 crc kubenswrapper[4764]: I0309 14:02:18.560774 4764 scope.go:117] "RemoveContainer" containerID="827bb36b06cc9f42c479b857320543a7b207fe3138757e8a034890cd3b0b0211" Mar 09 14:02:18 crc kubenswrapper[4764]: E0309 14:02:18.561683 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-xxczl_openshift-machine-config-operator(6bcdd179-43c2-427c-9fac-7155c122e922)\"" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922" Mar 09 14:02:20 crc kubenswrapper[4764]: I0309 14:02:20.438705 4764 generic.go:334] "Generic (PLEG): container finished" podID="ede2526d-593a-4258-9ec2-172270be638a" containerID="2dc5eadd32fcb5eb0fa35e7388b219c67069c6c548b4292850ea9586a4f91976" exitCode=0 Mar 09 14:02:20 crc kubenswrapper[4764]: I0309 14:02:20.439081 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-bkvx8" event={"ID":"ede2526d-593a-4258-9ec2-172270be638a","Type":"ContainerDied","Data":"2dc5eadd32fcb5eb0fa35e7388b219c67069c6c548b4292850ea9586a4f91976"} Mar 09 14:02:21 crc kubenswrapper[4764]: I0309 14:02:21.846224 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-bkvx8" Mar 09 14:02:21 crc kubenswrapper[4764]: I0309 14:02:21.905675 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/ede2526d-593a-4258-9ec2-172270be638a-ceph\") pod \"ede2526d-593a-4258-9ec2-172270be638a\" (UID: \"ede2526d-593a-4258-9ec2-172270be638a\") " Mar 09 14:02:21 crc kubenswrapper[4764]: I0309 14:02:21.905772 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ede2526d-593a-4258-9ec2-172270be638a-ssh-key-openstack-edpm-ipam\") pod \"ede2526d-593a-4258-9ec2-172270be638a\" (UID: \"ede2526d-593a-4258-9ec2-172270be638a\") " Mar 09 14:02:21 crc kubenswrapper[4764]: I0309 14:02:21.905852 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/ede2526d-593a-4258-9ec2-172270be638a-ovncontroller-config-0\") pod 
\"ede2526d-593a-4258-9ec2-172270be638a\" (UID: \"ede2526d-593a-4258-9ec2-172270be638a\") " Mar 09 14:02:21 crc kubenswrapper[4764]: I0309 14:02:21.905922 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ede2526d-593a-4258-9ec2-172270be638a-inventory\") pod \"ede2526d-593a-4258-9ec2-172270be638a\" (UID: \"ede2526d-593a-4258-9ec2-172270be638a\") " Mar 09 14:02:21 crc kubenswrapper[4764]: I0309 14:02:21.906065 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ede2526d-593a-4258-9ec2-172270be638a-ovn-combined-ca-bundle\") pod \"ede2526d-593a-4258-9ec2-172270be638a\" (UID: \"ede2526d-593a-4258-9ec2-172270be638a\") " Mar 09 14:02:21 crc kubenswrapper[4764]: I0309 14:02:21.906148 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l422c\" (UniqueName: \"kubernetes.io/projected/ede2526d-593a-4258-9ec2-172270be638a-kube-api-access-l422c\") pod \"ede2526d-593a-4258-9ec2-172270be638a\" (UID: \"ede2526d-593a-4258-9ec2-172270be638a\") " Mar 09 14:02:21 crc kubenswrapper[4764]: I0309 14:02:21.927988 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ede2526d-593a-4258-9ec2-172270be638a-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "ede2526d-593a-4258-9ec2-172270be638a" (UID: "ede2526d-593a-4258-9ec2-172270be638a"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:02:21 crc kubenswrapper[4764]: I0309 14:02:21.928054 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ede2526d-593a-4258-9ec2-172270be638a-ceph" (OuterVolumeSpecName: "ceph") pod "ede2526d-593a-4258-9ec2-172270be638a" (UID: "ede2526d-593a-4258-9ec2-172270be638a"). InnerVolumeSpecName "ceph". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:02:21 crc kubenswrapper[4764]: I0309 14:02:21.928115 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ede2526d-593a-4258-9ec2-172270be638a-kube-api-access-l422c" (OuterVolumeSpecName: "kube-api-access-l422c") pod "ede2526d-593a-4258-9ec2-172270be638a" (UID: "ede2526d-593a-4258-9ec2-172270be638a"). InnerVolumeSpecName "kube-api-access-l422c". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:02:21 crc kubenswrapper[4764]: I0309 14:02:21.932005 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ede2526d-593a-4258-9ec2-172270be638a-ovncontroller-config-0" (OuterVolumeSpecName: "ovncontroller-config-0") pod "ede2526d-593a-4258-9ec2-172270be638a" (UID: "ede2526d-593a-4258-9ec2-172270be638a"). InnerVolumeSpecName "ovncontroller-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 14:02:21 crc kubenswrapper[4764]: I0309 14:02:21.941357 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ede2526d-593a-4258-9ec2-172270be638a-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "ede2526d-593a-4258-9ec2-172270be638a" (UID: "ede2526d-593a-4258-9ec2-172270be638a"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:02:21 crc kubenswrapper[4764]: I0309 14:02:21.942298 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ede2526d-593a-4258-9ec2-172270be638a-inventory" (OuterVolumeSpecName: "inventory") pod "ede2526d-593a-4258-9ec2-172270be638a" (UID: "ede2526d-593a-4258-9ec2-172270be638a"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:02:22 crc kubenswrapper[4764]: I0309 14:02:22.009513 4764 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ede2526d-593a-4258-9ec2-172270be638a-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 14:02:22 crc kubenswrapper[4764]: I0309 14:02:22.009759 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l422c\" (UniqueName: \"kubernetes.io/projected/ede2526d-593a-4258-9ec2-172270be638a-kube-api-access-l422c\") on node \"crc\" DevicePath \"\"" Mar 09 14:02:22 crc kubenswrapper[4764]: I0309 14:02:22.009865 4764 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/ede2526d-593a-4258-9ec2-172270be638a-ceph\") on node \"crc\" DevicePath \"\"" Mar 09 14:02:22 crc kubenswrapper[4764]: I0309 14:02:22.009950 4764 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ede2526d-593a-4258-9ec2-172270be638a-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 09 14:02:22 crc kubenswrapper[4764]: I0309 14:02:22.010028 4764 reconciler_common.go:293] "Volume detached for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/ede2526d-593a-4258-9ec2-172270be638a-ovncontroller-config-0\") on node \"crc\" DevicePath \"\"" Mar 09 14:02:22 crc kubenswrapper[4764]: I0309 14:02:22.010097 4764 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ede2526d-593a-4258-9ec2-172270be638a-inventory\") on node \"crc\" DevicePath \"\"" Mar 09 14:02:22 crc kubenswrapper[4764]: I0309 14:02:22.461387 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-bkvx8" 
event={"ID":"ede2526d-593a-4258-9ec2-172270be638a","Type":"ContainerDied","Data":"34e4e0e2e0553eae12a91e21b5fe948fe626bbec1e8d92ace1c4dceace8d957e"} Mar 09 14:02:22 crc kubenswrapper[4764]: I0309 14:02:22.461469 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="34e4e0e2e0553eae12a91e21b5fe948fe626bbec1e8d92ace1c4dceace8d957e" Mar 09 14:02:22 crc kubenswrapper[4764]: I0309 14:02:22.461471 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-bkvx8" Mar 09 14:02:22 crc kubenswrapper[4764]: I0309 14:02:22.545771 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-zd8bj"] Mar 09 14:02:22 crc kubenswrapper[4764]: E0309 14:02:22.546404 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cbc0f639-3ece-4df6-bbaa-af1572005872" containerName="oc" Mar 09 14:02:22 crc kubenswrapper[4764]: I0309 14:02:22.546618 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="cbc0f639-3ece-4df6-bbaa-af1572005872" containerName="oc" Mar 09 14:02:22 crc kubenswrapper[4764]: E0309 14:02:22.546679 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ede2526d-593a-4258-9ec2-172270be638a" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Mar 09 14:02:22 crc kubenswrapper[4764]: I0309 14:02:22.546686 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="ede2526d-593a-4258-9ec2-172270be638a" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Mar 09 14:02:22 crc kubenswrapper[4764]: I0309 14:02:22.546891 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="cbc0f639-3ece-4df6-bbaa-af1572005872" containerName="oc" Mar 09 14:02:22 crc kubenswrapper[4764]: I0309 14:02:22.546930 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="ede2526d-593a-4258-9ec2-172270be638a" containerName="ovn-edpm-deployment-openstack-edpm-ipam" 
Mar 09 14:02:22 crc kubenswrapper[4764]: I0309 14:02:22.547843 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-zd8bj" Mar 09 14:02:22 crc kubenswrapper[4764]: I0309 14:02:22.553054 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 09 14:02:22 crc kubenswrapper[4764]: I0309 14:02:22.556580 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-zd8bj"] Mar 09 14:02:22 crc kubenswrapper[4764]: I0309 14:02:22.559289 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Mar 09 14:02:22 crc kubenswrapper[4764]: I0309 14:02:22.559307 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 09 14:02:22 crc kubenswrapper[4764]: I0309 14:02:22.559360 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-neutron-config" Mar 09 14:02:22 crc kubenswrapper[4764]: I0309 14:02:22.559419 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 09 14:02:22 crc kubenswrapper[4764]: I0309 14:02:22.561276 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-ctrj8" Mar 09 14:02:22 crc kubenswrapper[4764]: I0309 14:02:22.562444 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-ovn-metadata-agent-neutron-config" Mar 09 14:02:22 crc kubenswrapper[4764]: I0309 14:02:22.625031 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/8a38f1e2-ce88-47d9-883d-4d95c781d181-nova-metadata-neutron-config-0\") pod 
\"neutron-metadata-edpm-deployment-openstack-edpm-ipam-zd8bj\" (UID: \"8a38f1e2-ce88-47d9-883d-4d95c781d181\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-zd8bj" Mar 09 14:02:22 crc kubenswrapper[4764]: I0309 14:02:22.625297 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/8a38f1e2-ce88-47d9-883d-4d95c781d181-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-zd8bj\" (UID: \"8a38f1e2-ce88-47d9-883d-4d95c781d181\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-zd8bj" Mar 09 14:02:22 crc kubenswrapper[4764]: I0309 14:02:22.625388 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-phvr6\" (UniqueName: \"kubernetes.io/projected/8a38f1e2-ce88-47d9-883d-4d95c781d181-kube-api-access-phvr6\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-zd8bj\" (UID: \"8a38f1e2-ce88-47d9-883d-4d95c781d181\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-zd8bj" Mar 09 14:02:22 crc kubenswrapper[4764]: I0309 14:02:22.625467 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a38f1e2-ce88-47d9-883d-4d95c781d181-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-zd8bj\" (UID: \"8a38f1e2-ce88-47d9-883d-4d95c781d181\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-zd8bj" Mar 09 14:02:22 crc kubenswrapper[4764]: I0309 14:02:22.625668 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/8a38f1e2-ce88-47d9-883d-4d95c781d181-ssh-key-openstack-edpm-ipam\") 
pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-zd8bj\" (UID: \"8a38f1e2-ce88-47d9-883d-4d95c781d181\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-zd8bj" Mar 09 14:02:22 crc kubenswrapper[4764]: I0309 14:02:22.625757 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/8a38f1e2-ce88-47d9-883d-4d95c781d181-ceph\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-zd8bj\" (UID: \"8a38f1e2-ce88-47d9-883d-4d95c781d181\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-zd8bj" Mar 09 14:02:22 crc kubenswrapper[4764]: I0309 14:02:22.625869 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8a38f1e2-ce88-47d9-883d-4d95c781d181-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-zd8bj\" (UID: \"8a38f1e2-ce88-47d9-883d-4d95c781d181\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-zd8bj" Mar 09 14:02:22 crc kubenswrapper[4764]: I0309 14:02:22.728045 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8a38f1e2-ce88-47d9-883d-4d95c781d181-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-zd8bj\" (UID: \"8a38f1e2-ce88-47d9-883d-4d95c781d181\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-zd8bj" Mar 09 14:02:22 crc kubenswrapper[4764]: I0309 14:02:22.728213 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/8a38f1e2-ce88-47d9-883d-4d95c781d181-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-zd8bj\" (UID: \"8a38f1e2-ce88-47d9-883d-4d95c781d181\") " 
pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-zd8bj" Mar 09 14:02:22 crc kubenswrapper[4764]: I0309 14:02:22.728301 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/8a38f1e2-ce88-47d9-883d-4d95c781d181-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-zd8bj\" (UID: \"8a38f1e2-ce88-47d9-883d-4d95c781d181\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-zd8bj" Mar 09 14:02:22 crc kubenswrapper[4764]: I0309 14:02:22.728340 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-phvr6\" (UniqueName: \"kubernetes.io/projected/8a38f1e2-ce88-47d9-883d-4d95c781d181-kube-api-access-phvr6\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-zd8bj\" (UID: \"8a38f1e2-ce88-47d9-883d-4d95c781d181\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-zd8bj" Mar 09 14:02:22 crc kubenswrapper[4764]: I0309 14:02:22.728378 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a38f1e2-ce88-47d9-883d-4d95c781d181-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-zd8bj\" (UID: \"8a38f1e2-ce88-47d9-883d-4d95c781d181\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-zd8bj" Mar 09 14:02:22 crc kubenswrapper[4764]: I0309 14:02:22.728436 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/8a38f1e2-ce88-47d9-883d-4d95c781d181-ssh-key-openstack-edpm-ipam\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-zd8bj\" (UID: \"8a38f1e2-ce88-47d9-883d-4d95c781d181\") " 
pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-zd8bj" Mar 09 14:02:22 crc kubenswrapper[4764]: I0309 14:02:22.728469 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/8a38f1e2-ce88-47d9-883d-4d95c781d181-ceph\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-zd8bj\" (UID: \"8a38f1e2-ce88-47d9-883d-4d95c781d181\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-zd8bj" Mar 09 14:02:22 crc kubenswrapper[4764]: I0309 14:02:22.734412 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/8a38f1e2-ce88-47d9-883d-4d95c781d181-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-zd8bj\" (UID: \"8a38f1e2-ce88-47d9-883d-4d95c781d181\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-zd8bj" Mar 09 14:02:22 crc kubenswrapper[4764]: I0309 14:02:22.734412 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a38f1e2-ce88-47d9-883d-4d95c781d181-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-zd8bj\" (UID: \"8a38f1e2-ce88-47d9-883d-4d95c781d181\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-zd8bj" Mar 09 14:02:22 crc kubenswrapper[4764]: I0309 14:02:22.734855 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/8a38f1e2-ce88-47d9-883d-4d95c781d181-ceph\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-zd8bj\" (UID: \"8a38f1e2-ce88-47d9-883d-4d95c781d181\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-zd8bj" Mar 09 14:02:22 crc kubenswrapper[4764]: I0309 14:02:22.735164 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8a38f1e2-ce88-47d9-883d-4d95c781d181-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-zd8bj\" (UID: \"8a38f1e2-ce88-47d9-883d-4d95c781d181\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-zd8bj" Mar 09 14:02:22 crc kubenswrapper[4764]: I0309 14:02:22.736085 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/8a38f1e2-ce88-47d9-883d-4d95c781d181-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-zd8bj\" (UID: \"8a38f1e2-ce88-47d9-883d-4d95c781d181\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-zd8bj" Mar 09 14:02:22 crc kubenswrapper[4764]: I0309 14:02:22.741176 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/8a38f1e2-ce88-47d9-883d-4d95c781d181-ssh-key-openstack-edpm-ipam\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-zd8bj\" (UID: \"8a38f1e2-ce88-47d9-883d-4d95c781d181\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-zd8bj" Mar 09 14:02:22 crc kubenswrapper[4764]: I0309 14:02:22.751746 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-phvr6\" (UniqueName: \"kubernetes.io/projected/8a38f1e2-ce88-47d9-883d-4d95c781d181-kube-api-access-phvr6\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-zd8bj\" (UID: \"8a38f1e2-ce88-47d9-883d-4d95c781d181\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-zd8bj" Mar 09 14:02:22 crc kubenswrapper[4764]: I0309 14:02:22.868205 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-zd8bj" Mar 09 14:02:23 crc kubenswrapper[4764]: I0309 14:02:23.482403 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-zd8bj"] Mar 09 14:02:24 crc kubenswrapper[4764]: I0309 14:02:24.479951 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-zd8bj" event={"ID":"8a38f1e2-ce88-47d9-883d-4d95c781d181","Type":"ContainerStarted","Data":"d1f7816a6424fdde2f043a6817a44c6ba9d4601873f3fb65c8072848e7fb38fd"} Mar 09 14:02:24 crc kubenswrapper[4764]: I0309 14:02:24.480370 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-zd8bj" event={"ID":"8a38f1e2-ce88-47d9-883d-4d95c781d181","Type":"ContainerStarted","Data":"a5d86cbde00cd41578cfdae467292759f1c4ed53f38e6d7e158ec1fa5e1cbc67"} Mar 09 14:02:24 crc kubenswrapper[4764]: I0309 14:02:24.503498 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-zd8bj" podStartSLOduration=1.994006773 podStartE2EDuration="2.503473476s" podCreationTimestamp="2026-03-09 14:02:22 +0000 UTC" firstStartedPulling="2026-03-09 14:02:23.490474708 +0000 UTC m=+2498.740646616" lastFinishedPulling="2026-03-09 14:02:23.999941401 +0000 UTC m=+2499.250113319" observedRunningTime="2026-03-09 14:02:24.497805334 +0000 UTC m=+2499.747977252" watchObservedRunningTime="2026-03-09 14:02:24.503473476 +0000 UTC m=+2499.753645384" Mar 09 14:02:33 crc kubenswrapper[4764]: I0309 14:02:33.559530 4764 scope.go:117] "RemoveContainer" containerID="827bb36b06cc9f42c479b857320543a7b207fe3138757e8a034890cd3b0b0211" Mar 09 14:02:33 crc kubenswrapper[4764]: E0309 14:02:33.560322 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xxczl_openshift-machine-config-operator(6bcdd179-43c2-427c-9fac-7155c122e922)\"" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922" Mar 09 14:02:44 crc kubenswrapper[4764]: I0309 14:02:44.559884 4764 scope.go:117] "RemoveContainer" containerID="827bb36b06cc9f42c479b857320543a7b207fe3138757e8a034890cd3b0b0211" Mar 09 14:02:44 crc kubenswrapper[4764]: E0309 14:02:44.560801 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xxczl_openshift-machine-config-operator(6bcdd179-43c2-427c-9fac-7155c122e922)\"" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922" Mar 09 14:02:55 crc kubenswrapper[4764]: I0309 14:02:55.567236 4764 scope.go:117] "RemoveContainer" containerID="827bb36b06cc9f42c479b857320543a7b207fe3138757e8a034890cd3b0b0211" Mar 09 14:02:55 crc kubenswrapper[4764]: E0309 14:02:55.568324 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xxczl_openshift-machine-config-operator(6bcdd179-43c2-427c-9fac-7155c122e922)\"" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922" Mar 09 14:03:08 crc kubenswrapper[4764]: I0309 14:03:08.559897 4764 scope.go:117] "RemoveContainer" containerID="827bb36b06cc9f42c479b857320543a7b207fe3138757e8a034890cd3b0b0211" Mar 09 14:03:08 crc kubenswrapper[4764]: E0309 14:03:08.560824 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xxczl_openshift-machine-config-operator(6bcdd179-43c2-427c-9fac-7155c122e922)\"" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922" Mar 09 14:03:14 crc kubenswrapper[4764]: I0309 14:03:14.943136 4764 generic.go:334] "Generic (PLEG): container finished" podID="8a38f1e2-ce88-47d9-883d-4d95c781d181" containerID="d1f7816a6424fdde2f043a6817a44c6ba9d4601873f3fb65c8072848e7fb38fd" exitCode=0 Mar 09 14:03:14 crc kubenswrapper[4764]: I0309 14:03:14.943197 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-zd8bj" event={"ID":"8a38f1e2-ce88-47d9-883d-4d95c781d181","Type":"ContainerDied","Data":"d1f7816a6424fdde2f043a6817a44c6ba9d4601873f3fb65c8072848e7fb38fd"} Mar 09 14:03:16 crc kubenswrapper[4764]: I0309 14:03:16.971158 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-zd8bj" event={"ID":"8a38f1e2-ce88-47d9-883d-4d95c781d181","Type":"ContainerDied","Data":"a5d86cbde00cd41578cfdae467292759f1c4ed53f38e6d7e158ec1fa5e1cbc67"} Mar 09 14:03:16 crc kubenswrapper[4764]: I0309 14:03:16.971609 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a5d86cbde00cd41578cfdae467292759f1c4ed53f38e6d7e158ec1fa5e1cbc67" Mar 09 14:03:16 crc kubenswrapper[4764]: I0309 14:03:16.994825 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-zd8bj" Mar 09 14:03:17 crc kubenswrapper[4764]: I0309 14:03:17.015563 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-phvr6\" (UniqueName: \"kubernetes.io/projected/8a38f1e2-ce88-47d9-883d-4d95c781d181-kube-api-access-phvr6\") pod \"8a38f1e2-ce88-47d9-883d-4d95c781d181\" (UID: \"8a38f1e2-ce88-47d9-883d-4d95c781d181\") " Mar 09 14:03:17 crc kubenswrapper[4764]: I0309 14:03:17.015700 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/8a38f1e2-ce88-47d9-883d-4d95c781d181-nova-metadata-neutron-config-0\") pod \"8a38f1e2-ce88-47d9-883d-4d95c781d181\" (UID: \"8a38f1e2-ce88-47d9-883d-4d95c781d181\") " Mar 09 14:03:17 crc kubenswrapper[4764]: I0309 14:03:17.015719 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/8a38f1e2-ce88-47d9-883d-4d95c781d181-ssh-key-openstack-edpm-ipam\") pod \"8a38f1e2-ce88-47d9-883d-4d95c781d181\" (UID: \"8a38f1e2-ce88-47d9-883d-4d95c781d181\") " Mar 09 14:03:17 crc kubenswrapper[4764]: I0309 14:03:17.015742 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8a38f1e2-ce88-47d9-883d-4d95c781d181-inventory\") pod \"8a38f1e2-ce88-47d9-883d-4d95c781d181\" (UID: \"8a38f1e2-ce88-47d9-883d-4d95c781d181\") " Mar 09 14:03:17 crc kubenswrapper[4764]: I0309 14:03:17.015764 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a38f1e2-ce88-47d9-883d-4d95c781d181-neutron-metadata-combined-ca-bundle\") pod \"8a38f1e2-ce88-47d9-883d-4d95c781d181\" (UID: \"8a38f1e2-ce88-47d9-883d-4d95c781d181\") " Mar 09 14:03:17 crc 
kubenswrapper[4764]: I0309 14:03:17.015786 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/8a38f1e2-ce88-47d9-883d-4d95c781d181-ceph\") pod \"8a38f1e2-ce88-47d9-883d-4d95c781d181\" (UID: \"8a38f1e2-ce88-47d9-883d-4d95c781d181\") " Mar 09 14:03:17 crc kubenswrapper[4764]: I0309 14:03:17.015902 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/8a38f1e2-ce88-47d9-883d-4d95c781d181-neutron-ovn-metadata-agent-neutron-config-0\") pod \"8a38f1e2-ce88-47d9-883d-4d95c781d181\" (UID: \"8a38f1e2-ce88-47d9-883d-4d95c781d181\") " Mar 09 14:03:17 crc kubenswrapper[4764]: I0309 14:03:17.023595 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8a38f1e2-ce88-47d9-883d-4d95c781d181-kube-api-access-phvr6" (OuterVolumeSpecName: "kube-api-access-phvr6") pod "8a38f1e2-ce88-47d9-883d-4d95c781d181" (UID: "8a38f1e2-ce88-47d9-883d-4d95c781d181"). InnerVolumeSpecName "kube-api-access-phvr6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:03:17 crc kubenswrapper[4764]: I0309 14:03:17.024452 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a38f1e2-ce88-47d9-883d-4d95c781d181-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "8a38f1e2-ce88-47d9-883d-4d95c781d181" (UID: "8a38f1e2-ce88-47d9-883d-4d95c781d181"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:03:17 crc kubenswrapper[4764]: I0309 14:03:17.028043 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a38f1e2-ce88-47d9-883d-4d95c781d181-ceph" (OuterVolumeSpecName: "ceph") pod "8a38f1e2-ce88-47d9-883d-4d95c781d181" (UID: "8a38f1e2-ce88-47d9-883d-4d95c781d181"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:03:17 crc kubenswrapper[4764]: I0309 14:03:17.051963 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a38f1e2-ce88-47d9-883d-4d95c781d181-neutron-ovn-metadata-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-ovn-metadata-agent-neutron-config-0") pod "8a38f1e2-ce88-47d9-883d-4d95c781d181" (UID: "8a38f1e2-ce88-47d9-883d-4d95c781d181"). InnerVolumeSpecName "neutron-ovn-metadata-agent-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:03:17 crc kubenswrapper[4764]: I0309 14:03:17.063556 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a38f1e2-ce88-47d9-883d-4d95c781d181-nova-metadata-neutron-config-0" (OuterVolumeSpecName: "nova-metadata-neutron-config-0") pod "8a38f1e2-ce88-47d9-883d-4d95c781d181" (UID: "8a38f1e2-ce88-47d9-883d-4d95c781d181"). InnerVolumeSpecName "nova-metadata-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:03:17 crc kubenswrapper[4764]: I0309 14:03:17.076846 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a38f1e2-ce88-47d9-883d-4d95c781d181-inventory" (OuterVolumeSpecName: "inventory") pod "8a38f1e2-ce88-47d9-883d-4d95c781d181" (UID: "8a38f1e2-ce88-47d9-883d-4d95c781d181"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 14:03:17 crc kubenswrapper[4764]: I0309 14:03:17.087266 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a38f1e2-ce88-47d9-883d-4d95c781d181-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "8a38f1e2-ce88-47d9-883d-4d95c781d181" (UID: "8a38f1e2-ce88-47d9-883d-4d95c781d181"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 14:03:17 crc kubenswrapper[4764]: I0309 14:03:17.119302 4764 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/8a38f1e2-ce88-47d9-883d-4d95c781d181-nova-metadata-neutron-config-0\") on node \"crc\" DevicePath \"\""
Mar 09 14:03:17 crc kubenswrapper[4764]: I0309 14:03:17.119344 4764 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/8a38f1e2-ce88-47d9-883d-4d95c781d181-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\""
Mar 09 14:03:17 crc kubenswrapper[4764]: I0309 14:03:17.119358 4764 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8a38f1e2-ce88-47d9-883d-4d95c781d181-inventory\") on node \"crc\" DevicePath \"\""
Mar 09 14:03:17 crc kubenswrapper[4764]: I0309 14:03:17.119369 4764 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a38f1e2-ce88-47d9-883d-4d95c781d181-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 09 14:03:17 crc kubenswrapper[4764]: I0309 14:03:17.119381 4764 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/8a38f1e2-ce88-47d9-883d-4d95c781d181-ceph\") on node \"crc\" DevicePath \"\""
Mar 09 14:03:17 crc kubenswrapper[4764]: I0309 14:03:17.119395 4764 reconciler_common.go:293] "Volume detached for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/8a38f1e2-ce88-47d9-883d-4d95c781d181-neutron-ovn-metadata-agent-neutron-config-0\") on node \"crc\" DevicePath \"\""
Mar 09 14:03:17 crc kubenswrapper[4764]: I0309 14:03:17.119406 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-phvr6\" (UniqueName: \"kubernetes.io/projected/8a38f1e2-ce88-47d9-883d-4d95c781d181-kube-api-access-phvr6\") on node \"crc\" DevicePath \"\""
Mar 09 14:03:17 crc kubenswrapper[4764]: I0309 14:03:17.979086 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-zd8bj"
Mar 09 14:03:18 crc kubenswrapper[4764]: I0309 14:03:18.146812 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-9pl98"]
Mar 09 14:03:18 crc kubenswrapper[4764]: E0309 14:03:18.147477 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a38f1e2-ce88-47d9-883d-4d95c781d181" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam"
Mar 09 14:03:18 crc kubenswrapper[4764]: I0309 14:03:18.147518 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a38f1e2-ce88-47d9-883d-4d95c781d181" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam"
Mar 09 14:03:18 crc kubenswrapper[4764]: I0309 14:03:18.147845 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="8a38f1e2-ce88-47d9-883d-4d95c781d181" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam"
Mar 09 14:03:18 crc kubenswrapper[4764]: I0309 14:03:18.148965 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-9pl98"
Mar 09 14:03:18 crc kubenswrapper[4764]: I0309 14:03:18.152293 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files"
Mar 09 14:03:18 crc kubenswrapper[4764]: I0309 14:03:18.152940 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-ctrj8"
Mar 09 14:03:18 crc kubenswrapper[4764]: I0309 14:03:18.153412 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"libvirt-secret"
Mar 09 14:03:18 crc kubenswrapper[4764]: I0309 14:03:18.153584 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Mar 09 14:03:18 crc kubenswrapper[4764]: I0309 14:03:18.153720 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Mar 09 14:03:18 crc kubenswrapper[4764]: I0309 14:03:18.153723 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Mar 09 14:03:18 crc kubenswrapper[4764]: I0309 14:03:18.161886 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-9pl98"]
Mar 09 14:03:18 crc kubenswrapper[4764]: I0309 14:03:18.243752 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/0a9ed7f5-c296-41ac-ae0d-5845c66a385a-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-9pl98\" (UID: \"0a9ed7f5-c296-41ac-ae0d-5845c66a385a\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-9pl98"
Mar 09 14:03:18 crc kubenswrapper[4764]: I0309 14:03:18.243814 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/0a9ed7f5-c296-41ac-ae0d-5845c66a385a-ceph\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-9pl98\" (UID: \"0a9ed7f5-c296-41ac-ae0d-5845c66a385a\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-9pl98"
Mar 09 14:03:18 crc kubenswrapper[4764]: I0309 14:03:18.243838 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/0a9ed7f5-c296-41ac-ae0d-5845c66a385a-ssh-key-openstack-edpm-ipam\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-9pl98\" (UID: \"0a9ed7f5-c296-41ac-ae0d-5845c66a385a\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-9pl98"
Mar 09 14:03:18 crc kubenswrapper[4764]: I0309 14:03:18.244221 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pmjg6\" (UniqueName: \"kubernetes.io/projected/0a9ed7f5-c296-41ac-ae0d-5845c66a385a-kube-api-access-pmjg6\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-9pl98\" (UID: \"0a9ed7f5-c296-41ac-ae0d-5845c66a385a\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-9pl98"
Mar 09 14:03:18 crc kubenswrapper[4764]: I0309 14:03:18.244405 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a9ed7f5-c296-41ac-ae0d-5845c66a385a-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-9pl98\" (UID: \"0a9ed7f5-c296-41ac-ae0d-5845c66a385a\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-9pl98"
Mar 09 14:03:18 crc kubenswrapper[4764]: I0309 14:03:18.244470 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0a9ed7f5-c296-41ac-ae0d-5845c66a385a-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-9pl98\" (UID: \"0a9ed7f5-c296-41ac-ae0d-5845c66a385a\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-9pl98"
Mar 09 14:03:18 crc kubenswrapper[4764]: I0309 14:03:18.345544 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/0a9ed7f5-c296-41ac-ae0d-5845c66a385a-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-9pl98\" (UID: \"0a9ed7f5-c296-41ac-ae0d-5845c66a385a\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-9pl98"
Mar 09 14:03:18 crc kubenswrapper[4764]: I0309 14:03:18.345616 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/0a9ed7f5-c296-41ac-ae0d-5845c66a385a-ceph\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-9pl98\" (UID: \"0a9ed7f5-c296-41ac-ae0d-5845c66a385a\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-9pl98"
Mar 09 14:03:18 crc kubenswrapper[4764]: I0309 14:03:18.345667 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/0a9ed7f5-c296-41ac-ae0d-5845c66a385a-ssh-key-openstack-edpm-ipam\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-9pl98\" (UID: \"0a9ed7f5-c296-41ac-ae0d-5845c66a385a\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-9pl98"
Mar 09 14:03:18 crc kubenswrapper[4764]: I0309 14:03:18.345788 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pmjg6\" (UniqueName: \"kubernetes.io/projected/0a9ed7f5-c296-41ac-ae0d-5845c66a385a-kube-api-access-pmjg6\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-9pl98\" (UID: \"0a9ed7f5-c296-41ac-ae0d-5845c66a385a\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-9pl98"
Mar 09 14:03:18 crc kubenswrapper[4764]: I0309 14:03:18.345845 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a9ed7f5-c296-41ac-ae0d-5845c66a385a-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-9pl98\" (UID: \"0a9ed7f5-c296-41ac-ae0d-5845c66a385a\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-9pl98"
Mar 09 14:03:18 crc kubenswrapper[4764]: I0309 14:03:18.345879 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0a9ed7f5-c296-41ac-ae0d-5845c66a385a-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-9pl98\" (UID: \"0a9ed7f5-c296-41ac-ae0d-5845c66a385a\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-9pl98"
Mar 09 14:03:18 crc kubenswrapper[4764]: I0309 14:03:18.353033 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/0a9ed7f5-c296-41ac-ae0d-5845c66a385a-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-9pl98\" (UID: \"0a9ed7f5-c296-41ac-ae0d-5845c66a385a\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-9pl98"
Mar 09 14:03:18 crc kubenswrapper[4764]: I0309 14:03:18.353165 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/0a9ed7f5-c296-41ac-ae0d-5845c66a385a-ceph\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-9pl98\" (UID: \"0a9ed7f5-c296-41ac-ae0d-5845c66a385a\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-9pl98"
Mar 09 14:03:18 crc kubenswrapper[4764]: I0309 14:03:18.353773 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0a9ed7f5-c296-41ac-ae0d-5845c66a385a-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-9pl98\" (UID: \"0a9ed7f5-c296-41ac-ae0d-5845c66a385a\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-9pl98"
Mar 09 14:03:18 crc kubenswrapper[4764]: I0309 14:03:18.353806 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a9ed7f5-c296-41ac-ae0d-5845c66a385a-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-9pl98\" (UID: \"0a9ed7f5-c296-41ac-ae0d-5845c66a385a\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-9pl98"
Mar 09 14:03:18 crc kubenswrapper[4764]: I0309 14:03:18.364412 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/0a9ed7f5-c296-41ac-ae0d-5845c66a385a-ssh-key-openstack-edpm-ipam\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-9pl98\" (UID: \"0a9ed7f5-c296-41ac-ae0d-5845c66a385a\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-9pl98"
Mar 09 14:03:18 crc kubenswrapper[4764]: I0309 14:03:18.368809 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pmjg6\" (UniqueName: \"kubernetes.io/projected/0a9ed7f5-c296-41ac-ae0d-5845c66a385a-kube-api-access-pmjg6\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-9pl98\" (UID: \"0a9ed7f5-c296-41ac-ae0d-5845c66a385a\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-9pl98"
Mar 09 14:03:18 crc kubenswrapper[4764]: I0309 14:03:18.515075 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-9pl98"
Mar 09 14:03:19 crc kubenswrapper[4764]: I0309 14:03:19.089286 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-9pl98"]
Mar 09 14:03:20 crc kubenswrapper[4764]: I0309 14:03:20.002589 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-9pl98" event={"ID":"0a9ed7f5-c296-41ac-ae0d-5845c66a385a","Type":"ContainerStarted","Data":"a1d62db2728aab12ddd10813940dde51882f73fd4e8503f3f1ba3d93d56bff30"}
Mar 09 14:03:20 crc kubenswrapper[4764]: I0309 14:03:20.003462 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-9pl98" event={"ID":"0a9ed7f5-c296-41ac-ae0d-5845c66a385a","Type":"ContainerStarted","Data":"548509aa0f1e8c3468f071ff1b05af0aabf6c1afd7c27047c4f22b0be5b05951"}
Mar 09 14:03:20 crc kubenswrapper[4764]: I0309 14:03:20.027813 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-9pl98" podStartSLOduration=1.58397101 podStartE2EDuration="2.027787671s" podCreationTimestamp="2026-03-09 14:03:18 +0000 UTC" firstStartedPulling="2026-03-09 14:03:19.094949593 +0000 UTC m=+2554.345121491" lastFinishedPulling="2026-03-09 14:03:19.538766244 +0000 UTC m=+2554.788938152" observedRunningTime="2026-03-09 14:03:20.020955129 +0000 UTC m=+2555.271127037" watchObservedRunningTime="2026-03-09 14:03:20.027787671 +0000 UTC m=+2555.277959579"
Mar 09 14:03:21 crc kubenswrapper[4764]: I0309 14:03:21.560143 4764 scope.go:117] "RemoveContainer" containerID="827bb36b06cc9f42c479b857320543a7b207fe3138757e8a034890cd3b0b0211"
Mar 09 14:03:21 crc kubenswrapper[4764]: E0309 14:03:21.560444 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xxczl_openshift-machine-config-operator(6bcdd179-43c2-427c-9fac-7155c122e922)\"" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922"
Mar 09 14:03:33 crc kubenswrapper[4764]: I0309 14:03:33.210611 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-jx29x"]
Mar 09 14:03:33 crc kubenswrapper[4764]: I0309 14:03:33.214866 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-jx29x"
Mar 09 14:03:33 crc kubenswrapper[4764]: I0309 14:03:33.228158 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-jx29x"]
Mar 09 14:03:33 crc kubenswrapper[4764]: I0309 14:03:33.283978 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9twzf\" (UniqueName: \"kubernetes.io/projected/6154590f-34aa-4248-a339-14cb0d11da17-kube-api-access-9twzf\") pod \"certified-operators-jx29x\" (UID: \"6154590f-34aa-4248-a339-14cb0d11da17\") " pod="openshift-marketplace/certified-operators-jx29x"
Mar 09 14:03:33 crc kubenswrapper[4764]: I0309 14:03:33.284151 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6154590f-34aa-4248-a339-14cb0d11da17-catalog-content\") pod \"certified-operators-jx29x\" (UID: \"6154590f-34aa-4248-a339-14cb0d11da17\") " pod="openshift-marketplace/certified-operators-jx29x"
Mar 09 14:03:33 crc kubenswrapper[4764]: I0309 14:03:33.284196 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6154590f-34aa-4248-a339-14cb0d11da17-utilities\") pod \"certified-operators-jx29x\" (UID: \"6154590f-34aa-4248-a339-14cb0d11da17\") " pod="openshift-marketplace/certified-operators-jx29x"
Mar 09 14:03:33 crc kubenswrapper[4764]: I0309 14:03:33.385815 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9twzf\" (UniqueName: \"kubernetes.io/projected/6154590f-34aa-4248-a339-14cb0d11da17-kube-api-access-9twzf\") pod \"certified-operators-jx29x\" (UID: \"6154590f-34aa-4248-a339-14cb0d11da17\") " pod="openshift-marketplace/certified-operators-jx29x"
Mar 09 14:03:33 crc kubenswrapper[4764]: I0309 14:03:33.385894 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6154590f-34aa-4248-a339-14cb0d11da17-catalog-content\") pod \"certified-operators-jx29x\" (UID: \"6154590f-34aa-4248-a339-14cb0d11da17\") " pod="openshift-marketplace/certified-operators-jx29x"
Mar 09 14:03:33 crc kubenswrapper[4764]: I0309 14:03:33.385916 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6154590f-34aa-4248-a339-14cb0d11da17-utilities\") pod \"certified-operators-jx29x\" (UID: \"6154590f-34aa-4248-a339-14cb0d11da17\") " pod="openshift-marketplace/certified-operators-jx29x"
Mar 09 14:03:33 crc kubenswrapper[4764]: I0309 14:03:33.386469 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6154590f-34aa-4248-a339-14cb0d11da17-utilities\") pod \"certified-operators-jx29x\" (UID: \"6154590f-34aa-4248-a339-14cb0d11da17\") " pod="openshift-marketplace/certified-operators-jx29x"
Mar 09 14:03:33 crc kubenswrapper[4764]: I0309 14:03:33.386573 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6154590f-34aa-4248-a339-14cb0d11da17-catalog-content\") pod \"certified-operators-jx29x\" (UID: \"6154590f-34aa-4248-a339-14cb0d11da17\") " pod="openshift-marketplace/certified-operators-jx29x"
Mar 09 14:03:33 crc kubenswrapper[4764]: I0309 14:03:33.411452 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9twzf\" (UniqueName: \"kubernetes.io/projected/6154590f-34aa-4248-a339-14cb0d11da17-kube-api-access-9twzf\") pod \"certified-operators-jx29x\" (UID: \"6154590f-34aa-4248-a339-14cb0d11da17\") " pod="openshift-marketplace/certified-operators-jx29x"
Mar 09 14:03:33 crc kubenswrapper[4764]: I0309 14:03:33.546992 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-jx29x"
Mar 09 14:03:34 crc kubenswrapper[4764]: I0309 14:03:34.202824 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-jx29x"]
Mar 09 14:03:34 crc kubenswrapper[4764]: I0309 14:03:34.559621 4764 scope.go:117] "RemoveContainer" containerID="827bb36b06cc9f42c479b857320543a7b207fe3138757e8a034890cd3b0b0211"
Mar 09 14:03:34 crc kubenswrapper[4764]: E0309 14:03:34.560090 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xxczl_openshift-machine-config-operator(6bcdd179-43c2-427c-9fac-7155c122e922)\"" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922"
Mar 09 14:03:35 crc kubenswrapper[4764]: I0309 14:03:35.145198 4764 generic.go:334] "Generic (PLEG): container finished" podID="6154590f-34aa-4248-a339-14cb0d11da17" containerID="a231bd14b5a52e83e1070151054791b6cd46c3e0738c42bd2c7de1d6fbe77190" exitCode=0
Mar 09 14:03:35 crc kubenswrapper[4764]: I0309 14:03:35.145303 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jx29x" event={"ID":"6154590f-34aa-4248-a339-14cb0d11da17","Type":"ContainerDied","Data":"a231bd14b5a52e83e1070151054791b6cd46c3e0738c42bd2c7de1d6fbe77190"}
Mar 09 14:03:35 crc kubenswrapper[4764]: I0309 14:03:35.145828 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jx29x" event={"ID":"6154590f-34aa-4248-a339-14cb0d11da17","Type":"ContainerStarted","Data":"a8bed4ae015f91a1af51881a89b70eb39989a4c366d3d28f458614db53116da0"}
Mar 09 14:03:37 crc kubenswrapper[4764]: I0309 14:03:37.164593 4764 generic.go:334] "Generic (PLEG): container finished" podID="6154590f-34aa-4248-a339-14cb0d11da17" containerID="cb84120ece4662aa8eaf5e4da54ddf7a38310a0e535e3d45c9e333ddbaf651d6" exitCode=0
Mar 09 14:03:37 crc kubenswrapper[4764]: I0309 14:03:37.164698 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jx29x" event={"ID":"6154590f-34aa-4248-a339-14cb0d11da17","Type":"ContainerDied","Data":"cb84120ece4662aa8eaf5e4da54ddf7a38310a0e535e3d45c9e333ddbaf651d6"}
Mar 09 14:03:38 crc kubenswrapper[4764]: I0309 14:03:38.178337 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jx29x" event={"ID":"6154590f-34aa-4248-a339-14cb0d11da17","Type":"ContainerStarted","Data":"5ac092a4daa49b5d1556a11982584c1f9fc67103c309bac82f0cc93c736c239c"}
Mar 09 14:03:38 crc kubenswrapper[4764]: I0309 14:03:38.206553 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-jx29x" podStartSLOduration=2.7360647670000002 podStartE2EDuration="5.206524592s" podCreationTimestamp="2026-03-09 14:03:33 +0000 UTC" firstStartedPulling="2026-03-09 14:03:35.148504878 +0000 UTC m=+2570.398676796" lastFinishedPulling="2026-03-09 14:03:37.618964713 +0000 UTC m=+2572.869136621" observedRunningTime="2026-03-09 14:03:38.202820983 +0000 UTC m=+2573.452992911" watchObservedRunningTime="2026-03-09 14:03:38.206524592 +0000 UTC m=+2573.456696500"
Mar 09 14:03:43 crc kubenswrapper[4764]: I0309 14:03:43.547744 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-jx29x"
Mar 09 14:03:43 crc kubenswrapper[4764]: I0309 14:03:43.548666 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-jx29x"
Mar 09 14:03:43 crc kubenswrapper[4764]: I0309 14:03:43.601240 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-jx29x"
Mar 09 14:03:44 crc kubenswrapper[4764]: I0309 14:03:44.282700 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-jx29x"
Mar 09 14:03:44 crc kubenswrapper[4764]: I0309 14:03:44.348758 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-jx29x"]
Mar 09 14:03:45 crc kubenswrapper[4764]: I0309 14:03:45.572104 4764 scope.go:117] "RemoveContainer" containerID="827bb36b06cc9f42c479b857320543a7b207fe3138757e8a034890cd3b0b0211"
Mar 09 14:03:45 crc kubenswrapper[4764]: E0309 14:03:45.573162 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xxczl_openshift-machine-config-operator(6bcdd179-43c2-427c-9fac-7155c122e922)\"" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922"
Mar 09 14:03:46 crc kubenswrapper[4764]: I0309 14:03:46.247069 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-jx29x" podUID="6154590f-34aa-4248-a339-14cb0d11da17" containerName="registry-server" containerID="cri-o://5ac092a4daa49b5d1556a11982584c1f9fc67103c309bac82f0cc93c736c239c" gracePeriod=2
Mar 09 14:03:46 crc kubenswrapper[4764]: I0309 14:03:46.712623 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-jx29x"
Mar 09 14:03:46 crc kubenswrapper[4764]: I0309 14:03:46.883368 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9twzf\" (UniqueName: \"kubernetes.io/projected/6154590f-34aa-4248-a339-14cb0d11da17-kube-api-access-9twzf\") pod \"6154590f-34aa-4248-a339-14cb0d11da17\" (UID: \"6154590f-34aa-4248-a339-14cb0d11da17\") "
Mar 09 14:03:46 crc kubenswrapper[4764]: I0309 14:03:46.883434 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6154590f-34aa-4248-a339-14cb0d11da17-utilities\") pod \"6154590f-34aa-4248-a339-14cb0d11da17\" (UID: \"6154590f-34aa-4248-a339-14cb0d11da17\") "
Mar 09 14:03:46 crc kubenswrapper[4764]: I0309 14:03:46.883469 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6154590f-34aa-4248-a339-14cb0d11da17-catalog-content\") pod \"6154590f-34aa-4248-a339-14cb0d11da17\" (UID: \"6154590f-34aa-4248-a339-14cb0d11da17\") "
Mar 09 14:03:46 crc kubenswrapper[4764]: I0309 14:03:46.884800 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6154590f-34aa-4248-a339-14cb0d11da17-utilities" (OuterVolumeSpecName: "utilities") pod "6154590f-34aa-4248-a339-14cb0d11da17" (UID: "6154590f-34aa-4248-a339-14cb0d11da17"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 09 14:03:46 crc kubenswrapper[4764]: I0309 14:03:46.891044 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6154590f-34aa-4248-a339-14cb0d11da17-kube-api-access-9twzf" (OuterVolumeSpecName: "kube-api-access-9twzf") pod "6154590f-34aa-4248-a339-14cb0d11da17" (UID: "6154590f-34aa-4248-a339-14cb0d11da17"). InnerVolumeSpecName "kube-api-access-9twzf". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 14:03:46 crc kubenswrapper[4764]: I0309 14:03:46.988290 4764 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6154590f-34aa-4248-a339-14cb0d11da17-utilities\") on node \"crc\" DevicePath \"\""
Mar 09 14:03:46 crc kubenswrapper[4764]: I0309 14:03:46.988363 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9twzf\" (UniqueName: \"kubernetes.io/projected/6154590f-34aa-4248-a339-14cb0d11da17-kube-api-access-9twzf\") on node \"crc\" DevicePath \"\""
Mar 09 14:03:47 crc kubenswrapper[4764]: I0309 14:03:47.263280 4764 generic.go:334] "Generic (PLEG): container finished" podID="6154590f-34aa-4248-a339-14cb0d11da17" containerID="5ac092a4daa49b5d1556a11982584c1f9fc67103c309bac82f0cc93c736c239c" exitCode=0
Mar 09 14:03:47 crc kubenswrapper[4764]: I0309 14:03:47.263357 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jx29x" event={"ID":"6154590f-34aa-4248-a339-14cb0d11da17","Type":"ContainerDied","Data":"5ac092a4daa49b5d1556a11982584c1f9fc67103c309bac82f0cc93c736c239c"}
Mar 09 14:03:47 crc kubenswrapper[4764]: I0309 14:03:47.263398 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-jx29x"
Mar 09 14:03:47 crc kubenswrapper[4764]: I0309 14:03:47.263407 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jx29x" event={"ID":"6154590f-34aa-4248-a339-14cb0d11da17","Type":"ContainerDied","Data":"a8bed4ae015f91a1af51881a89b70eb39989a4c366d3d28f458614db53116da0"}
Mar 09 14:03:47 crc kubenswrapper[4764]: I0309 14:03:47.263442 4764 scope.go:117] "RemoveContainer" containerID="5ac092a4daa49b5d1556a11982584c1f9fc67103c309bac82f0cc93c736c239c"
Mar 09 14:03:47 crc kubenswrapper[4764]: I0309 14:03:47.285012 4764 scope.go:117] "RemoveContainer" containerID="cb84120ece4662aa8eaf5e4da54ddf7a38310a0e535e3d45c9e333ddbaf651d6"
Mar 09 14:03:47 crc kubenswrapper[4764]: I0309 14:03:47.316601 4764 scope.go:117] "RemoveContainer" containerID="a231bd14b5a52e83e1070151054791b6cd46c3e0738c42bd2c7de1d6fbe77190"
Mar 09 14:03:47 crc kubenswrapper[4764]: I0309 14:03:47.358357 4764 scope.go:117] "RemoveContainer" containerID="5ac092a4daa49b5d1556a11982584c1f9fc67103c309bac82f0cc93c736c239c"
Mar 09 14:03:47 crc kubenswrapper[4764]: E0309 14:03:47.359159 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5ac092a4daa49b5d1556a11982584c1f9fc67103c309bac82f0cc93c736c239c\": container with ID starting with 5ac092a4daa49b5d1556a11982584c1f9fc67103c309bac82f0cc93c736c239c not found: ID does not exist" containerID="5ac092a4daa49b5d1556a11982584c1f9fc67103c309bac82f0cc93c736c239c"
Mar 09 14:03:47 crc kubenswrapper[4764]: I0309 14:03:47.359225 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5ac092a4daa49b5d1556a11982584c1f9fc67103c309bac82f0cc93c736c239c"} err="failed to get container status \"5ac092a4daa49b5d1556a11982584c1f9fc67103c309bac82f0cc93c736c239c\": rpc error: code = NotFound desc = could not find container \"5ac092a4daa49b5d1556a11982584c1f9fc67103c309bac82f0cc93c736c239c\": container with ID starting with 5ac092a4daa49b5d1556a11982584c1f9fc67103c309bac82f0cc93c736c239c not found: ID does not exist"
Mar 09 14:03:47 crc kubenswrapper[4764]: I0309 14:03:47.359287 4764 scope.go:117] "RemoveContainer" containerID="cb84120ece4662aa8eaf5e4da54ddf7a38310a0e535e3d45c9e333ddbaf651d6"
Mar 09 14:03:47 crc kubenswrapper[4764]: E0309 14:03:47.359714 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cb84120ece4662aa8eaf5e4da54ddf7a38310a0e535e3d45c9e333ddbaf651d6\": container with ID starting with cb84120ece4662aa8eaf5e4da54ddf7a38310a0e535e3d45c9e333ddbaf651d6 not found: ID does not exist" containerID="cb84120ece4662aa8eaf5e4da54ddf7a38310a0e535e3d45c9e333ddbaf651d6"
Mar 09 14:03:47 crc kubenswrapper[4764]: I0309 14:03:47.359756 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cb84120ece4662aa8eaf5e4da54ddf7a38310a0e535e3d45c9e333ddbaf651d6"} err="failed to get container status \"cb84120ece4662aa8eaf5e4da54ddf7a38310a0e535e3d45c9e333ddbaf651d6\": rpc error: code = NotFound desc = could not find container \"cb84120ece4662aa8eaf5e4da54ddf7a38310a0e535e3d45c9e333ddbaf651d6\": container with ID starting with cb84120ece4662aa8eaf5e4da54ddf7a38310a0e535e3d45c9e333ddbaf651d6 not found: ID does not exist"
Mar 09 14:03:47 crc kubenswrapper[4764]: I0309 14:03:47.359785 4764 scope.go:117] "RemoveContainer" containerID="a231bd14b5a52e83e1070151054791b6cd46c3e0738c42bd2c7de1d6fbe77190"
Mar 09 14:03:47 crc kubenswrapper[4764]: E0309 14:03:47.360090 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a231bd14b5a52e83e1070151054791b6cd46c3e0738c42bd2c7de1d6fbe77190\": container with ID starting with a231bd14b5a52e83e1070151054791b6cd46c3e0738c42bd2c7de1d6fbe77190 not found: ID does not exist" containerID="a231bd14b5a52e83e1070151054791b6cd46c3e0738c42bd2c7de1d6fbe77190"
Mar 09 14:03:47 crc kubenswrapper[4764]: I0309 14:03:47.360129 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a231bd14b5a52e83e1070151054791b6cd46c3e0738c42bd2c7de1d6fbe77190"} err="failed to get container status \"a231bd14b5a52e83e1070151054791b6cd46c3e0738c42bd2c7de1d6fbe77190\": rpc error: code = NotFound desc = could not find container \"a231bd14b5a52e83e1070151054791b6cd46c3e0738c42bd2c7de1d6fbe77190\": container with ID starting with a231bd14b5a52e83e1070151054791b6cd46c3e0738c42bd2c7de1d6fbe77190 not found: ID does not exist"
Mar 09 14:03:47 crc kubenswrapper[4764]: I0309 14:03:47.385922 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6154590f-34aa-4248-a339-14cb0d11da17-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6154590f-34aa-4248-a339-14cb0d11da17" (UID: "6154590f-34aa-4248-a339-14cb0d11da17"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 09 14:03:47 crc kubenswrapper[4764]: I0309 14:03:47.403722 4764 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6154590f-34aa-4248-a339-14cb0d11da17-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 09 14:03:47 crc kubenswrapper[4764]: I0309 14:03:47.606925 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-jx29x"]
Mar 09 14:03:47 crc kubenswrapper[4764]: I0309 14:03:47.615693 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-jx29x"]
Mar 09 14:03:49 crc kubenswrapper[4764]: I0309 14:03:49.571402 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6154590f-34aa-4248-a339-14cb0d11da17" path="/var/lib/kubelet/pods/6154590f-34aa-4248-a339-14cb0d11da17/volumes"
Mar 09 14:03:57 crc kubenswrapper[4764]: I0309 14:03:57.560519 4764 scope.go:117] "RemoveContainer" containerID="827bb36b06cc9f42c479b857320543a7b207fe3138757e8a034890cd3b0b0211"
Mar 09 14:03:57 crc kubenswrapper[4764]: E0309 14:03:57.561720 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xxczl_openshift-machine-config-operator(6bcdd179-43c2-427c-9fac-7155c122e922)\"" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922"
Mar 09 14:04:00 crc kubenswrapper[4764]: I0309 14:04:00.147405 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29551084-pchq7"]
Mar 09 14:04:00 crc kubenswrapper[4764]: E0309 14:04:00.148606 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6154590f-34aa-4248-a339-14cb0d11da17" containerName="extract-utilities"
Mar 09 14:04:00 crc kubenswrapper[4764]: I0309 14:04:00.148622 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="6154590f-34aa-4248-a339-14cb0d11da17" containerName="extract-utilities"
Mar 09 14:04:00 crc kubenswrapper[4764]: E0309 14:04:00.148633 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6154590f-34aa-4248-a339-14cb0d11da17" containerName="extract-content"
Mar 09 14:04:00 crc kubenswrapper[4764]: I0309 14:04:00.148640 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="6154590f-34aa-4248-a339-14cb0d11da17" containerName="extract-content"
Mar 09 14:04:00 crc kubenswrapper[4764]: E0309 14:04:00.148691 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6154590f-34aa-4248-a339-14cb0d11da17" containerName="registry-server"
Mar 09 14:04:00 crc kubenswrapper[4764]: I0309 14:04:00.148699 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="6154590f-34aa-4248-a339-14cb0d11da17" containerName="registry-server"
Mar 09 14:04:00 crc kubenswrapper[4764]: I0309 14:04:00.148870 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="6154590f-34aa-4248-a339-14cb0d11da17" containerName="registry-server"
Mar 09 14:04:00 crc kubenswrapper[4764]: I0309 14:04:00.149640 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551084-pchq7"
Mar 09 14:04:00 crc kubenswrapper[4764]: I0309 14:04:00.152885 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 09 14:04:00 crc kubenswrapper[4764]: I0309 14:04:00.152911 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-4s7mr"
Mar 09 14:04:00 crc kubenswrapper[4764]: I0309 14:04:00.152885 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 09 14:04:00 crc kubenswrapper[4764]: I0309 14:04:00.159439 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551084-pchq7"]
Mar 09 14:04:00 crc kubenswrapper[4764]: I0309 14:04:00.291068 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zss62\" (UniqueName: \"kubernetes.io/projected/91d1d723-d4e8-40d8-9d17-3dfee51e7aef-kube-api-access-zss62\") pod \"auto-csr-approver-29551084-pchq7\" (UID: \"91d1d723-d4e8-40d8-9d17-3dfee51e7aef\") " pod="openshift-infra/auto-csr-approver-29551084-pchq7"
Mar 09 14:04:00 crc kubenswrapper[4764]: I0309 14:04:00.392795 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zss62\" (UniqueName: \"kubernetes.io/projected/91d1d723-d4e8-40d8-9d17-3dfee51e7aef-kube-api-access-zss62\") pod \"auto-csr-approver-29551084-pchq7\" (UID: \"91d1d723-d4e8-40d8-9d17-3dfee51e7aef\") " pod="openshift-infra/auto-csr-approver-29551084-pchq7"
Mar 09 14:04:00 crc kubenswrapper[4764]: I0309 14:04:00.414145 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zss62\" (UniqueName: \"kubernetes.io/projected/91d1d723-d4e8-40d8-9d17-3dfee51e7aef-kube-api-access-zss62\") pod \"auto-csr-approver-29551084-pchq7\" (UID: \"91d1d723-d4e8-40d8-9d17-3dfee51e7aef\") " pod="openshift-infra/auto-csr-approver-29551084-pchq7"
Mar 09 14:04:00 crc kubenswrapper[4764]: I0309 14:04:00.480332 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551084-pchq7"
Mar 09 14:04:00 crc kubenswrapper[4764]: I0309 14:04:00.744348 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551084-pchq7"]
Mar 09 14:04:01 crc kubenswrapper[4764]: I0309 14:04:01.400182 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551084-pchq7" event={"ID":"91d1d723-d4e8-40d8-9d17-3dfee51e7aef","Type":"ContainerStarted","Data":"72484cba728ec8aec28a5b633ccfe8a2f5e5486a0d2c3482ac695e52f4c532be"}
Mar 09 14:04:02 crc kubenswrapper[4764]: I0309 14:04:02.426218 4764 generic.go:334] "Generic (PLEG): container finished" podID="91d1d723-d4e8-40d8-9d17-3dfee51e7aef" containerID="6e6b381c4b8297d66803da8d662836720642fff63afbcf8243cdcd4157213d11" exitCode=0
Mar 09 14:04:02 crc kubenswrapper[4764]: I0309 14:04:02.426297 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551084-pchq7" event={"ID":"91d1d723-d4e8-40d8-9d17-3dfee51e7aef","Type":"ContainerDied","Data":"6e6b381c4b8297d66803da8d662836720642fff63afbcf8243cdcd4157213d11"}
Mar 09 14:04:03 crc kubenswrapper[4764]: I0309 14:04:03.812951 4764 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551084-pchq7" Mar 09 14:04:03 crc kubenswrapper[4764]: I0309 14:04:03.975698 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zss62\" (UniqueName: \"kubernetes.io/projected/91d1d723-d4e8-40d8-9d17-3dfee51e7aef-kube-api-access-zss62\") pod \"91d1d723-d4e8-40d8-9d17-3dfee51e7aef\" (UID: \"91d1d723-d4e8-40d8-9d17-3dfee51e7aef\") " Mar 09 14:04:03 crc kubenswrapper[4764]: I0309 14:04:03.982347 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/91d1d723-d4e8-40d8-9d17-3dfee51e7aef-kube-api-access-zss62" (OuterVolumeSpecName: "kube-api-access-zss62") pod "91d1d723-d4e8-40d8-9d17-3dfee51e7aef" (UID: "91d1d723-d4e8-40d8-9d17-3dfee51e7aef"). InnerVolumeSpecName "kube-api-access-zss62". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:04:04 crc kubenswrapper[4764]: I0309 14:04:04.078893 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zss62\" (UniqueName: \"kubernetes.io/projected/91d1d723-d4e8-40d8-9d17-3dfee51e7aef-kube-api-access-zss62\") on node \"crc\" DevicePath \"\"" Mar 09 14:04:04 crc kubenswrapper[4764]: I0309 14:04:04.448630 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551084-pchq7" event={"ID":"91d1d723-d4e8-40d8-9d17-3dfee51e7aef","Type":"ContainerDied","Data":"72484cba728ec8aec28a5b633ccfe8a2f5e5486a0d2c3482ac695e52f4c532be"} Mar 09 14:04:04 crc kubenswrapper[4764]: I0309 14:04:04.448705 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="72484cba728ec8aec28a5b633ccfe8a2f5e5486a0d2c3482ac695e52f4c532be" Mar 09 14:04:04 crc kubenswrapper[4764]: I0309 14:04:04.448743 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551084-pchq7" Mar 09 14:04:04 crc kubenswrapper[4764]: I0309 14:04:04.903863 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29551078-z7ms2"] Mar 09 14:04:04 crc kubenswrapper[4764]: I0309 14:04:04.916744 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29551078-z7ms2"] Mar 09 14:04:05 crc kubenswrapper[4764]: I0309 14:04:05.571930 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f26fbbd6-fe1a-4ca6-82a8-e425edc3d3d9" path="/var/lib/kubelet/pods/f26fbbd6-fe1a-4ca6-82a8-e425edc3d3d9/volumes" Mar 09 14:04:09 crc kubenswrapper[4764]: I0309 14:04:09.392327 4764 scope.go:117] "RemoveContainer" containerID="e98df59174ca147f240e577b9eb7747712c4ce30b3cb22cafa3c795a0cc708fe" Mar 09 14:04:11 crc kubenswrapper[4764]: I0309 14:04:11.561282 4764 scope.go:117] "RemoveContainer" containerID="827bb36b06cc9f42c479b857320543a7b207fe3138757e8a034890cd3b0b0211" Mar 09 14:04:12 crc kubenswrapper[4764]: I0309 14:04:12.540920 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" event={"ID":"6bcdd179-43c2-427c-9fac-7155c122e922","Type":"ContainerStarted","Data":"63776e57dad14b6a74a94432519b0b437cfc2082448e28e0a32706fa439569a0"} Mar 09 14:06:00 crc kubenswrapper[4764]: I0309 14:06:00.152601 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29551086-dn2gl"] Mar 09 14:06:00 crc kubenswrapper[4764]: E0309 14:06:00.154308 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="91d1d723-d4e8-40d8-9d17-3dfee51e7aef" containerName="oc" Mar 09 14:06:00 crc kubenswrapper[4764]: I0309 14:06:00.154328 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="91d1d723-d4e8-40d8-9d17-3dfee51e7aef" containerName="oc" Mar 09 14:06:00 crc kubenswrapper[4764]: I0309 14:06:00.154574 4764 
memory_manager.go:354] "RemoveStaleState removing state" podUID="91d1d723-d4e8-40d8-9d17-3dfee51e7aef" containerName="oc" Mar 09 14:06:00 crc kubenswrapper[4764]: I0309 14:06:00.155548 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551086-dn2gl" Mar 09 14:06:00 crc kubenswrapper[4764]: I0309 14:06:00.158423 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-4s7mr" Mar 09 14:06:00 crc kubenswrapper[4764]: I0309 14:06:00.158934 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 09 14:06:00 crc kubenswrapper[4764]: I0309 14:06:00.159702 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 09 14:06:00 crc kubenswrapper[4764]: I0309 14:06:00.172939 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551086-dn2gl"] Mar 09 14:06:00 crc kubenswrapper[4764]: I0309 14:06:00.288329 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p98kr\" (UniqueName: \"kubernetes.io/projected/55c3951f-6e8b-46f4-9332-9c5d658862e4-kube-api-access-p98kr\") pod \"auto-csr-approver-29551086-dn2gl\" (UID: \"55c3951f-6e8b-46f4-9332-9c5d658862e4\") " pod="openshift-infra/auto-csr-approver-29551086-dn2gl" Mar 09 14:06:00 crc kubenswrapper[4764]: I0309 14:06:00.391099 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p98kr\" (UniqueName: \"kubernetes.io/projected/55c3951f-6e8b-46f4-9332-9c5d658862e4-kube-api-access-p98kr\") pod \"auto-csr-approver-29551086-dn2gl\" (UID: \"55c3951f-6e8b-46f4-9332-9c5d658862e4\") " pod="openshift-infra/auto-csr-approver-29551086-dn2gl" Mar 09 14:06:00 crc kubenswrapper[4764]: I0309 14:06:00.414222 4764 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"kube-api-access-p98kr\" (UniqueName: \"kubernetes.io/projected/55c3951f-6e8b-46f4-9332-9c5d658862e4-kube-api-access-p98kr\") pod \"auto-csr-approver-29551086-dn2gl\" (UID: \"55c3951f-6e8b-46f4-9332-9c5d658862e4\") " pod="openshift-infra/auto-csr-approver-29551086-dn2gl" Mar 09 14:06:00 crc kubenswrapper[4764]: I0309 14:06:00.480943 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551086-dn2gl" Mar 09 14:06:00 crc kubenswrapper[4764]: I0309 14:06:00.992279 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551086-dn2gl"] Mar 09 14:06:01 crc kubenswrapper[4764]: I0309 14:06:01.634082 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551086-dn2gl" event={"ID":"55c3951f-6e8b-46f4-9332-9c5d658862e4","Type":"ContainerStarted","Data":"fb1ca94ed31e845ce1a0283cdd181ddc6c87515415c8bff16919dbdf8df21971"} Mar 09 14:06:02 crc kubenswrapper[4764]: I0309 14:06:02.645989 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551086-dn2gl" event={"ID":"55c3951f-6e8b-46f4-9332-9c5d658862e4","Type":"ContainerStarted","Data":"4dfa06f7a78f4ddaf54d7700f7fde17ae280a4e75103177bb66105376bf4e6a2"} Mar 09 14:06:02 crc kubenswrapper[4764]: I0309 14:06:02.672121 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29551086-dn2gl" podStartSLOduration=1.368834846 podStartE2EDuration="2.672098988s" podCreationTimestamp="2026-03-09 14:06:00 +0000 UTC" firstStartedPulling="2026-03-09 14:06:00.997863236 +0000 UTC m=+2716.248035144" lastFinishedPulling="2026-03-09 14:06:02.301127378 +0000 UTC m=+2717.551299286" observedRunningTime="2026-03-09 14:06:02.661263999 +0000 UTC m=+2717.911435907" watchObservedRunningTime="2026-03-09 14:06:02.672098988 +0000 UTC m=+2717.922270896" Mar 09 14:06:03 crc kubenswrapper[4764]: I0309 
14:06:03.671208 4764 generic.go:334] "Generic (PLEG): container finished" podID="55c3951f-6e8b-46f4-9332-9c5d658862e4" containerID="4dfa06f7a78f4ddaf54d7700f7fde17ae280a4e75103177bb66105376bf4e6a2" exitCode=0 Mar 09 14:06:03 crc kubenswrapper[4764]: I0309 14:06:03.671306 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551086-dn2gl" event={"ID":"55c3951f-6e8b-46f4-9332-9c5d658862e4","Type":"ContainerDied","Data":"4dfa06f7a78f4ddaf54d7700f7fde17ae280a4e75103177bb66105376bf4e6a2"} Mar 09 14:06:05 crc kubenswrapper[4764]: I0309 14:06:05.010036 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551086-dn2gl" Mar 09 14:06:05 crc kubenswrapper[4764]: I0309 14:06:05.126080 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p98kr\" (UniqueName: \"kubernetes.io/projected/55c3951f-6e8b-46f4-9332-9c5d658862e4-kube-api-access-p98kr\") pod \"55c3951f-6e8b-46f4-9332-9c5d658862e4\" (UID: \"55c3951f-6e8b-46f4-9332-9c5d658862e4\") " Mar 09 14:06:05 crc kubenswrapper[4764]: I0309 14:06:05.133881 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/55c3951f-6e8b-46f4-9332-9c5d658862e4-kube-api-access-p98kr" (OuterVolumeSpecName: "kube-api-access-p98kr") pod "55c3951f-6e8b-46f4-9332-9c5d658862e4" (UID: "55c3951f-6e8b-46f4-9332-9c5d658862e4"). InnerVolumeSpecName "kube-api-access-p98kr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:06:05 crc kubenswrapper[4764]: I0309 14:06:05.228995 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p98kr\" (UniqueName: \"kubernetes.io/projected/55c3951f-6e8b-46f4-9332-9c5d658862e4-kube-api-access-p98kr\") on node \"crc\" DevicePath \"\"" Mar 09 14:06:05 crc kubenswrapper[4764]: I0309 14:06:05.689486 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551086-dn2gl" event={"ID":"55c3951f-6e8b-46f4-9332-9c5d658862e4","Type":"ContainerDied","Data":"fb1ca94ed31e845ce1a0283cdd181ddc6c87515415c8bff16919dbdf8df21971"} Mar 09 14:06:05 crc kubenswrapper[4764]: I0309 14:06:05.689538 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fb1ca94ed31e845ce1a0283cdd181ddc6c87515415c8bff16919dbdf8df21971" Mar 09 14:06:05 crc kubenswrapper[4764]: I0309 14:06:05.689564 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551086-dn2gl" Mar 09 14:06:05 crc kubenswrapper[4764]: I0309 14:06:05.794318 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29551080-svz2w"] Mar 09 14:06:05 crc kubenswrapper[4764]: I0309 14:06:05.802467 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29551080-svz2w"] Mar 09 14:06:07 crc kubenswrapper[4764]: I0309 14:06:07.571483 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f6eebc0e-7e89-4489-b808-7eebf0e54dca" path="/var/lib/kubelet/pods/f6eebc0e-7e89-4489-b808-7eebf0e54dca/volumes" Mar 09 14:06:09 crc kubenswrapper[4764]: I0309 14:06:09.529761 4764 scope.go:117] "RemoveContainer" containerID="55591ff62c6b7e1f8299cbaa3c157dfe4ead0808f086d1b63edc0dcadd27bf4a" Mar 09 14:06:28 crc kubenswrapper[4764]: I0309 14:06:28.370299 4764 patch_prober.go:28] interesting pod/machine-config-daemon-xxczl 
container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 09 14:06:28 crc kubenswrapper[4764]: I0309 14:06:28.370946 4764 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 09 14:06:58 crc kubenswrapper[4764]: I0309 14:06:58.371019 4764 patch_prober.go:28] interesting pod/machine-config-daemon-xxczl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 09 14:06:58 crc kubenswrapper[4764]: I0309 14:06:58.371915 4764 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 09 14:07:01 crc kubenswrapper[4764]: I0309 14:07:01.251455 4764 generic.go:334] "Generic (PLEG): container finished" podID="0a9ed7f5-c296-41ac-ae0d-5845c66a385a" containerID="a1d62db2728aab12ddd10813940dde51882f73fd4e8503f3f1ba3d93d56bff30" exitCode=0 Mar 09 14:07:01 crc kubenswrapper[4764]: I0309 14:07:01.251592 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-9pl98" event={"ID":"0a9ed7f5-c296-41ac-ae0d-5845c66a385a","Type":"ContainerDied","Data":"a1d62db2728aab12ddd10813940dde51882f73fd4e8503f3f1ba3d93d56bff30"} Mar 09 14:07:02 crc kubenswrapper[4764]: I0309 
14:07:02.671047 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-9pl98" Mar 09 14:07:02 crc kubenswrapper[4764]: I0309 14:07:02.859707 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0a9ed7f5-c296-41ac-ae0d-5845c66a385a-inventory\") pod \"0a9ed7f5-c296-41ac-ae0d-5845c66a385a\" (UID: \"0a9ed7f5-c296-41ac-ae0d-5845c66a385a\") " Mar 09 14:07:02 crc kubenswrapper[4764]: I0309 14:07:02.859774 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/0a9ed7f5-c296-41ac-ae0d-5845c66a385a-libvirt-secret-0\") pod \"0a9ed7f5-c296-41ac-ae0d-5845c66a385a\" (UID: \"0a9ed7f5-c296-41ac-ae0d-5845c66a385a\") " Mar 09 14:07:02 crc kubenswrapper[4764]: I0309 14:07:02.859823 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/0a9ed7f5-c296-41ac-ae0d-5845c66a385a-ceph\") pod \"0a9ed7f5-c296-41ac-ae0d-5845c66a385a\" (UID: \"0a9ed7f5-c296-41ac-ae0d-5845c66a385a\") " Mar 09 14:07:02 crc kubenswrapper[4764]: I0309 14:07:02.859997 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a9ed7f5-c296-41ac-ae0d-5845c66a385a-libvirt-combined-ca-bundle\") pod \"0a9ed7f5-c296-41ac-ae0d-5845c66a385a\" (UID: \"0a9ed7f5-c296-41ac-ae0d-5845c66a385a\") " Mar 09 14:07:02 crc kubenswrapper[4764]: I0309 14:07:02.860051 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pmjg6\" (UniqueName: \"kubernetes.io/projected/0a9ed7f5-c296-41ac-ae0d-5845c66a385a-kube-api-access-pmjg6\") pod \"0a9ed7f5-c296-41ac-ae0d-5845c66a385a\" (UID: \"0a9ed7f5-c296-41ac-ae0d-5845c66a385a\") " Mar 09 14:07:02 crc kubenswrapper[4764]: I0309 14:07:02.860078 
4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/0a9ed7f5-c296-41ac-ae0d-5845c66a385a-ssh-key-openstack-edpm-ipam\") pod \"0a9ed7f5-c296-41ac-ae0d-5845c66a385a\" (UID: \"0a9ed7f5-c296-41ac-ae0d-5845c66a385a\") " Mar 09 14:07:02 crc kubenswrapper[4764]: I0309 14:07:02.867825 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a9ed7f5-c296-41ac-ae0d-5845c66a385a-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "0a9ed7f5-c296-41ac-ae0d-5845c66a385a" (UID: "0a9ed7f5-c296-41ac-ae0d-5845c66a385a"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:07:02 crc kubenswrapper[4764]: I0309 14:07:02.869167 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a9ed7f5-c296-41ac-ae0d-5845c66a385a-ceph" (OuterVolumeSpecName: "ceph") pod "0a9ed7f5-c296-41ac-ae0d-5845c66a385a" (UID: "0a9ed7f5-c296-41ac-ae0d-5845c66a385a"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:07:02 crc kubenswrapper[4764]: I0309 14:07:02.869970 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0a9ed7f5-c296-41ac-ae0d-5845c66a385a-kube-api-access-pmjg6" (OuterVolumeSpecName: "kube-api-access-pmjg6") pod "0a9ed7f5-c296-41ac-ae0d-5845c66a385a" (UID: "0a9ed7f5-c296-41ac-ae0d-5845c66a385a"). InnerVolumeSpecName "kube-api-access-pmjg6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:07:02 crc kubenswrapper[4764]: I0309 14:07:02.892365 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a9ed7f5-c296-41ac-ae0d-5845c66a385a-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "0a9ed7f5-c296-41ac-ae0d-5845c66a385a" (UID: "0a9ed7f5-c296-41ac-ae0d-5845c66a385a"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:07:02 crc kubenswrapper[4764]: I0309 14:07:02.894623 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a9ed7f5-c296-41ac-ae0d-5845c66a385a-inventory" (OuterVolumeSpecName: "inventory") pod "0a9ed7f5-c296-41ac-ae0d-5845c66a385a" (UID: "0a9ed7f5-c296-41ac-ae0d-5845c66a385a"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:07:02 crc kubenswrapper[4764]: I0309 14:07:02.901623 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a9ed7f5-c296-41ac-ae0d-5845c66a385a-libvirt-secret-0" (OuterVolumeSpecName: "libvirt-secret-0") pod "0a9ed7f5-c296-41ac-ae0d-5845c66a385a" (UID: "0a9ed7f5-c296-41ac-ae0d-5845c66a385a"). InnerVolumeSpecName "libvirt-secret-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:07:02 crc kubenswrapper[4764]: I0309 14:07:02.962582 4764 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0a9ed7f5-c296-41ac-ae0d-5845c66a385a-inventory\") on node \"crc\" DevicePath \"\"" Mar 09 14:07:02 crc kubenswrapper[4764]: I0309 14:07:02.962628 4764 reconciler_common.go:293] "Volume detached for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/0a9ed7f5-c296-41ac-ae0d-5845c66a385a-libvirt-secret-0\") on node \"crc\" DevicePath \"\"" Mar 09 14:07:02 crc kubenswrapper[4764]: I0309 14:07:02.962660 4764 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/0a9ed7f5-c296-41ac-ae0d-5845c66a385a-ceph\") on node \"crc\" DevicePath \"\"" Mar 09 14:07:02 crc kubenswrapper[4764]: I0309 14:07:02.962671 4764 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a9ed7f5-c296-41ac-ae0d-5845c66a385a-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 14:07:02 crc kubenswrapper[4764]: I0309 14:07:02.962682 4764 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/0a9ed7f5-c296-41ac-ae0d-5845c66a385a-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 09 14:07:02 crc kubenswrapper[4764]: I0309 14:07:02.962693 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pmjg6\" (UniqueName: \"kubernetes.io/projected/0a9ed7f5-c296-41ac-ae0d-5845c66a385a-kube-api-access-pmjg6\") on node \"crc\" DevicePath \"\"" Mar 09 14:07:03 crc kubenswrapper[4764]: I0309 14:07:03.271348 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-9pl98" 
event={"ID":"0a9ed7f5-c296-41ac-ae0d-5845c66a385a","Type":"ContainerDied","Data":"548509aa0f1e8c3468f071ff1b05af0aabf6c1afd7c27047c4f22b0be5b05951"} Mar 09 14:07:03 crc kubenswrapper[4764]: I0309 14:07:03.271857 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="548509aa0f1e8c3468f071ff1b05af0aabf6c1afd7c27047c4f22b0be5b05951" Mar 09 14:07:03 crc kubenswrapper[4764]: I0309 14:07:03.271426 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-9pl98" Mar 09 14:07:03 crc kubenswrapper[4764]: I0309 14:07:03.378018 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-5kztq"] Mar 09 14:07:03 crc kubenswrapper[4764]: E0309 14:07:03.378523 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="55c3951f-6e8b-46f4-9332-9c5d658862e4" containerName="oc" Mar 09 14:07:03 crc kubenswrapper[4764]: I0309 14:07:03.378546 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="55c3951f-6e8b-46f4-9332-9c5d658862e4" containerName="oc" Mar 09 14:07:03 crc kubenswrapper[4764]: E0309 14:07:03.378578 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0a9ed7f5-c296-41ac-ae0d-5845c66a385a" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Mar 09 14:07:03 crc kubenswrapper[4764]: I0309 14:07:03.378585 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a9ed7f5-c296-41ac-ae0d-5845c66a385a" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Mar 09 14:07:03 crc kubenswrapper[4764]: I0309 14:07:03.378789 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="55c3951f-6e8b-46f4-9332-9c5d658862e4" containerName="oc" Mar 09 14:07:03 crc kubenswrapper[4764]: I0309 14:07:03.378808 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="0a9ed7f5-c296-41ac-ae0d-5845c66a385a" 
containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Mar 09 14:07:03 crc kubenswrapper[4764]: I0309 14:07:03.379569 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-5kztq" Mar 09 14:07:03 crc kubenswrapper[4764]: I0309 14:07:03.384868 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"nova-extra-config" Mar 09 14:07:03 crc kubenswrapper[4764]: I0309 14:07:03.384999 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ceph-nova" Mar 09 14:07:03 crc kubenswrapper[4764]: I0309 14:07:03.385226 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 09 14:07:03 crc kubenswrapper[4764]: I0309 14:07:03.385338 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Mar 09 14:07:03 crc kubenswrapper[4764]: I0309 14:07:03.385508 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-migration-ssh-key" Mar 09 14:07:03 crc kubenswrapper[4764]: I0309 14:07:03.385555 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 09 14:07:03 crc kubenswrapper[4764]: I0309 14:07:03.385697 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-ctrj8" Mar 09 14:07:03 crc kubenswrapper[4764]: I0309 14:07:03.385763 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 09 14:07:03 crc kubenswrapper[4764]: I0309 14:07:03.385901 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-compute-config" Mar 09 14:07:03 crc kubenswrapper[4764]: I0309 14:07:03.397326 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-5kztq"] Mar 09 14:07:03 crc kubenswrapper[4764]: I0309 14:07:03.576240 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7c6xn\" (UniqueName: \"kubernetes.io/projected/eab144b6-e27c-4ffc-9dd5-6236ca12719f-kube-api-access-7c6xn\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-5kztq\" (UID: \"eab144b6-e27c-4ffc-9dd5-6236ca12719f\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-5kztq" Mar 09 14:07:03 crc kubenswrapper[4764]: I0309 14:07:03.576297 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-custom-ceph-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eab144b6-e27c-4ffc-9dd5-6236ca12719f-nova-custom-ceph-combined-ca-bundle\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-5kztq\" (UID: \"eab144b6-e27c-4ffc-9dd5-6236ca12719f\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-5kztq" Mar 09 14:07:03 crc kubenswrapper[4764]: I0309 14:07:03.576328 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/eab144b6-e27c-4ffc-9dd5-6236ca12719f-ssh-key-openstack-edpm-ipam\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-5kztq\" (UID: \"eab144b6-e27c-4ffc-9dd5-6236ca12719f\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-5kztq" Mar 09 14:07:03 crc kubenswrapper[4764]: I0309 14:07:03.576356 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/eab144b6-e27c-4ffc-9dd5-6236ca12719f-nova-cell1-compute-config-3\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-5kztq\" (UID: \"eab144b6-e27c-4ffc-9dd5-6236ca12719f\") " 
pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-5kztq" Mar 09 14:07:03 crc kubenswrapper[4764]: I0309 14:07:03.576378 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/eab144b6-e27c-4ffc-9dd5-6236ca12719f-nova-extra-config-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-5kztq\" (UID: \"eab144b6-e27c-4ffc-9dd5-6236ca12719f\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-5kztq" Mar 09 14:07:03 crc kubenswrapper[4764]: I0309 14:07:03.576419 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/eab144b6-e27c-4ffc-9dd5-6236ca12719f-nova-cell1-compute-config-1\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-5kztq\" (UID: \"eab144b6-e27c-4ffc-9dd5-6236ca12719f\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-5kztq" Mar 09 14:07:03 crc kubenswrapper[4764]: I0309 14:07:03.576451 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/eab144b6-e27c-4ffc-9dd5-6236ca12719f-ceph\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-5kztq\" (UID: \"eab144b6-e27c-4ffc-9dd5-6236ca12719f\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-5kztq" Mar 09 14:07:03 crc kubenswrapper[4764]: I0309 14:07:03.576478 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/eab144b6-e27c-4ffc-9dd5-6236ca12719f-nova-migration-ssh-key-1\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-5kztq\" (UID: \"eab144b6-e27c-4ffc-9dd5-6236ca12719f\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-5kztq" Mar 09 14:07:03 crc 
kubenswrapper[4764]: I0309 14:07:03.576515 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/eab144b6-e27c-4ffc-9dd5-6236ca12719f-nova-migration-ssh-key-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-5kztq\" (UID: \"eab144b6-e27c-4ffc-9dd5-6236ca12719f\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-5kztq" Mar 09 14:07:03 crc kubenswrapper[4764]: I0309 14:07:03.576553 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/eab144b6-e27c-4ffc-9dd5-6236ca12719f-nova-cell1-compute-config-2\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-5kztq\" (UID: \"eab144b6-e27c-4ffc-9dd5-6236ca12719f\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-5kztq" Mar 09 14:07:03 crc kubenswrapper[4764]: I0309 14:07:03.576583 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/eab144b6-e27c-4ffc-9dd5-6236ca12719f-nova-cell1-compute-config-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-5kztq\" (UID: \"eab144b6-e27c-4ffc-9dd5-6236ca12719f\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-5kztq" Mar 09 14:07:03 crc kubenswrapper[4764]: I0309 14:07:03.576605 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph-nova-0\" (UniqueName: \"kubernetes.io/configmap/eab144b6-e27c-4ffc-9dd5-6236ca12719f-ceph-nova-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-5kztq\" (UID: \"eab144b6-e27c-4ffc-9dd5-6236ca12719f\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-5kztq" Mar 09 14:07:03 crc kubenswrapper[4764]: I0309 14:07:03.576621 4764 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/eab144b6-e27c-4ffc-9dd5-6236ca12719f-inventory\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-5kztq\" (UID: \"eab144b6-e27c-4ffc-9dd5-6236ca12719f\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-5kztq" Mar 09 14:07:03 crc kubenswrapper[4764]: I0309 14:07:03.678777 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7c6xn\" (UniqueName: \"kubernetes.io/projected/eab144b6-e27c-4ffc-9dd5-6236ca12719f-kube-api-access-7c6xn\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-5kztq\" (UID: \"eab144b6-e27c-4ffc-9dd5-6236ca12719f\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-5kztq" Mar 09 14:07:03 crc kubenswrapper[4764]: I0309 14:07:03.678859 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-custom-ceph-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eab144b6-e27c-4ffc-9dd5-6236ca12719f-nova-custom-ceph-combined-ca-bundle\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-5kztq\" (UID: \"eab144b6-e27c-4ffc-9dd5-6236ca12719f\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-5kztq" Mar 09 14:07:03 crc kubenswrapper[4764]: I0309 14:07:03.678899 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/eab144b6-e27c-4ffc-9dd5-6236ca12719f-ssh-key-openstack-edpm-ipam\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-5kztq\" (UID: \"eab144b6-e27c-4ffc-9dd5-6236ca12719f\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-5kztq" Mar 09 14:07:03 crc kubenswrapper[4764]: I0309 14:07:03.678953 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-3\" (UniqueName: 
\"kubernetes.io/secret/eab144b6-e27c-4ffc-9dd5-6236ca12719f-nova-cell1-compute-config-3\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-5kztq\" (UID: \"eab144b6-e27c-4ffc-9dd5-6236ca12719f\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-5kztq" Mar 09 14:07:03 crc kubenswrapper[4764]: I0309 14:07:03.678981 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/eab144b6-e27c-4ffc-9dd5-6236ca12719f-nova-extra-config-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-5kztq\" (UID: \"eab144b6-e27c-4ffc-9dd5-6236ca12719f\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-5kztq" Mar 09 14:07:03 crc kubenswrapper[4764]: I0309 14:07:03.679032 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/eab144b6-e27c-4ffc-9dd5-6236ca12719f-nova-cell1-compute-config-1\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-5kztq\" (UID: \"eab144b6-e27c-4ffc-9dd5-6236ca12719f\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-5kztq" Mar 09 14:07:03 crc kubenswrapper[4764]: I0309 14:07:03.679104 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/eab144b6-e27c-4ffc-9dd5-6236ca12719f-ceph\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-5kztq\" (UID: \"eab144b6-e27c-4ffc-9dd5-6236ca12719f\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-5kztq" Mar 09 14:07:03 crc kubenswrapper[4764]: I0309 14:07:03.679143 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/eab144b6-e27c-4ffc-9dd5-6236ca12719f-nova-migration-ssh-key-1\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-5kztq\" (UID: 
\"eab144b6-e27c-4ffc-9dd5-6236ca12719f\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-5kztq" Mar 09 14:07:03 crc kubenswrapper[4764]: I0309 14:07:03.679252 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/eab144b6-e27c-4ffc-9dd5-6236ca12719f-nova-migration-ssh-key-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-5kztq\" (UID: \"eab144b6-e27c-4ffc-9dd5-6236ca12719f\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-5kztq" Mar 09 14:07:03 crc kubenswrapper[4764]: I0309 14:07:03.679294 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/eab144b6-e27c-4ffc-9dd5-6236ca12719f-nova-cell1-compute-config-2\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-5kztq\" (UID: \"eab144b6-e27c-4ffc-9dd5-6236ca12719f\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-5kztq" Mar 09 14:07:03 crc kubenswrapper[4764]: I0309 14:07:03.679334 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/eab144b6-e27c-4ffc-9dd5-6236ca12719f-nova-cell1-compute-config-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-5kztq\" (UID: \"eab144b6-e27c-4ffc-9dd5-6236ca12719f\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-5kztq" Mar 09 14:07:03 crc kubenswrapper[4764]: I0309 14:07:03.679358 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph-nova-0\" (UniqueName: \"kubernetes.io/configmap/eab144b6-e27c-4ffc-9dd5-6236ca12719f-ceph-nova-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-5kztq\" (UID: \"eab144b6-e27c-4ffc-9dd5-6236ca12719f\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-5kztq" Mar 09 14:07:03 crc 
kubenswrapper[4764]: I0309 14:07:03.679384 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/eab144b6-e27c-4ffc-9dd5-6236ca12719f-inventory\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-5kztq\" (UID: \"eab144b6-e27c-4ffc-9dd5-6236ca12719f\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-5kztq" Mar 09 14:07:03 crc kubenswrapper[4764]: I0309 14:07:03.681390 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/eab144b6-e27c-4ffc-9dd5-6236ca12719f-nova-extra-config-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-5kztq\" (UID: \"eab144b6-e27c-4ffc-9dd5-6236ca12719f\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-5kztq" Mar 09 14:07:03 crc kubenswrapper[4764]: I0309 14:07:03.681461 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph-nova-0\" (UniqueName: \"kubernetes.io/configmap/eab144b6-e27c-4ffc-9dd5-6236ca12719f-ceph-nova-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-5kztq\" (UID: \"eab144b6-e27c-4ffc-9dd5-6236ca12719f\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-5kztq" Mar 09 14:07:03 crc kubenswrapper[4764]: I0309 14:07:03.685156 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/eab144b6-e27c-4ffc-9dd5-6236ca12719f-nova-cell1-compute-config-3\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-5kztq\" (UID: \"eab144b6-e27c-4ffc-9dd5-6236ca12719f\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-5kztq" Mar 09 14:07:03 crc kubenswrapper[4764]: I0309 14:07:03.685334 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-custom-ceph-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/eab144b6-e27c-4ffc-9dd5-6236ca12719f-nova-custom-ceph-combined-ca-bundle\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-5kztq\" (UID: \"eab144b6-e27c-4ffc-9dd5-6236ca12719f\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-5kztq" Mar 09 14:07:03 crc kubenswrapper[4764]: I0309 14:07:03.685455 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/eab144b6-e27c-4ffc-9dd5-6236ca12719f-nova-cell1-compute-config-1\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-5kztq\" (UID: \"eab144b6-e27c-4ffc-9dd5-6236ca12719f\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-5kztq" Mar 09 14:07:03 crc kubenswrapper[4764]: I0309 14:07:03.685988 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/eab144b6-e27c-4ffc-9dd5-6236ca12719f-inventory\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-5kztq\" (UID: \"eab144b6-e27c-4ffc-9dd5-6236ca12719f\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-5kztq" Mar 09 14:07:03 crc kubenswrapper[4764]: I0309 14:07:03.686570 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/eab144b6-e27c-4ffc-9dd5-6236ca12719f-ssh-key-openstack-edpm-ipam\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-5kztq\" (UID: \"eab144b6-e27c-4ffc-9dd5-6236ca12719f\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-5kztq" Mar 09 14:07:03 crc kubenswrapper[4764]: I0309 14:07:03.686616 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/eab144b6-e27c-4ffc-9dd5-6236ca12719f-nova-migration-ssh-key-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-5kztq\" (UID: 
\"eab144b6-e27c-4ffc-9dd5-6236ca12719f\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-5kztq" Mar 09 14:07:03 crc kubenswrapper[4764]: I0309 14:07:03.686624 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/eab144b6-e27c-4ffc-9dd5-6236ca12719f-nova-cell1-compute-config-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-5kztq\" (UID: \"eab144b6-e27c-4ffc-9dd5-6236ca12719f\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-5kztq" Mar 09 14:07:03 crc kubenswrapper[4764]: I0309 14:07:03.687581 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/eab144b6-e27c-4ffc-9dd5-6236ca12719f-nova-cell1-compute-config-2\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-5kztq\" (UID: \"eab144b6-e27c-4ffc-9dd5-6236ca12719f\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-5kztq" Mar 09 14:07:03 crc kubenswrapper[4764]: I0309 14:07:03.687855 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/eab144b6-e27c-4ffc-9dd5-6236ca12719f-ceph\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-5kztq\" (UID: \"eab144b6-e27c-4ffc-9dd5-6236ca12719f\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-5kztq" Mar 09 14:07:03 crc kubenswrapper[4764]: I0309 14:07:03.693848 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/eab144b6-e27c-4ffc-9dd5-6236ca12719f-nova-migration-ssh-key-1\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-5kztq\" (UID: \"eab144b6-e27c-4ffc-9dd5-6236ca12719f\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-5kztq" Mar 09 14:07:03 crc kubenswrapper[4764]: I0309 14:07:03.696516 4764 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7c6xn\" (UniqueName: \"kubernetes.io/projected/eab144b6-e27c-4ffc-9dd5-6236ca12719f-kube-api-access-7c6xn\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-5kztq\" (UID: \"eab144b6-e27c-4ffc-9dd5-6236ca12719f\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-5kztq" Mar 09 14:07:03 crc kubenswrapper[4764]: I0309 14:07:03.700462 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-5kztq" Mar 09 14:07:04 crc kubenswrapper[4764]: I0309 14:07:04.247412 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-5kztq"] Mar 09 14:07:04 crc kubenswrapper[4764]: I0309 14:07:04.253486 4764 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 09 14:07:04 crc kubenswrapper[4764]: I0309 14:07:04.282143 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-5kztq" event={"ID":"eab144b6-e27c-4ffc-9dd5-6236ca12719f","Type":"ContainerStarted","Data":"764af20acdc888d995ca2db68f3ab58c7fcd79d0559f4ce7bd21b3b2183c8e62"} Mar 09 14:07:05 crc kubenswrapper[4764]: I0309 14:07:05.293952 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-5kztq" event={"ID":"eab144b6-e27c-4ffc-9dd5-6236ca12719f","Type":"ContainerStarted","Data":"86fa8795ecd86c7b9c42df1c80c762e511f08dce7f9815c18841bff01c585b45"} Mar 09 14:07:05 crc kubenswrapper[4764]: I0309 14:07:05.320029 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-5kztq" podStartSLOduration=1.590808214 podStartE2EDuration="2.320001905s" podCreationTimestamp="2026-03-09 14:07:03 +0000 UTC" 
firstStartedPulling="2026-03-09 14:07:04.2530766 +0000 UTC m=+2779.503248518" lastFinishedPulling="2026-03-09 14:07:04.982270301 +0000 UTC m=+2780.232442209" observedRunningTime="2026-03-09 14:07:05.314713494 +0000 UTC m=+2780.564885432" watchObservedRunningTime="2026-03-09 14:07:05.320001905 +0000 UTC m=+2780.570173823" Mar 09 14:07:28 crc kubenswrapper[4764]: I0309 14:07:28.370239 4764 patch_prober.go:28] interesting pod/machine-config-daemon-xxczl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 09 14:07:28 crc kubenswrapper[4764]: I0309 14:07:28.370822 4764 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 09 14:07:28 crc kubenswrapper[4764]: I0309 14:07:28.370878 4764 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" Mar 09 14:07:28 crc kubenswrapper[4764]: I0309 14:07:28.371752 4764 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"63776e57dad14b6a74a94432519b0b437cfc2082448e28e0a32706fa439569a0"} pod="openshift-machine-config-operator/machine-config-daemon-xxczl" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 09 14:07:28 crc kubenswrapper[4764]: I0309 14:07:28.371807 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922" 
containerName="machine-config-daemon" containerID="cri-o://63776e57dad14b6a74a94432519b0b437cfc2082448e28e0a32706fa439569a0" gracePeriod=600 Mar 09 14:07:28 crc kubenswrapper[4764]: I0309 14:07:28.524245 4764 generic.go:334] "Generic (PLEG): container finished" podID="6bcdd179-43c2-427c-9fac-7155c122e922" containerID="63776e57dad14b6a74a94432519b0b437cfc2082448e28e0a32706fa439569a0" exitCode=0 Mar 09 14:07:28 crc kubenswrapper[4764]: I0309 14:07:28.524568 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" event={"ID":"6bcdd179-43c2-427c-9fac-7155c122e922","Type":"ContainerDied","Data":"63776e57dad14b6a74a94432519b0b437cfc2082448e28e0a32706fa439569a0"} Mar 09 14:07:28 crc kubenswrapper[4764]: I0309 14:07:28.525124 4764 scope.go:117] "RemoveContainer" containerID="827bb36b06cc9f42c479b857320543a7b207fe3138757e8a034890cd3b0b0211" Mar 09 14:07:29 crc kubenswrapper[4764]: I0309 14:07:29.537128 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" event={"ID":"6bcdd179-43c2-427c-9fac-7155c122e922","Type":"ContainerStarted","Data":"085fb58e00f347ac089fd33d79299a3879c66993685cb2c39951f8d234eb8cfb"} Mar 09 14:08:00 crc kubenswrapper[4764]: I0309 14:08:00.146283 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29551088-wtbvc"] Mar 09 14:08:00 crc kubenswrapper[4764]: I0309 14:08:00.148272 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551088-wtbvc" Mar 09 14:08:00 crc kubenswrapper[4764]: I0309 14:08:00.154162 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-4s7mr" Mar 09 14:08:00 crc kubenswrapper[4764]: I0309 14:08:00.154586 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 09 14:08:00 crc kubenswrapper[4764]: I0309 14:08:00.154181 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 09 14:08:00 crc kubenswrapper[4764]: I0309 14:08:00.162088 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551088-wtbvc"] Mar 09 14:08:00 crc kubenswrapper[4764]: I0309 14:08:00.207739 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rczx5\" (UniqueName: \"kubernetes.io/projected/902ae1d9-a43c-46c6-a492-10ee0242e721-kube-api-access-rczx5\") pod \"auto-csr-approver-29551088-wtbvc\" (UID: \"902ae1d9-a43c-46c6-a492-10ee0242e721\") " pod="openshift-infra/auto-csr-approver-29551088-wtbvc" Mar 09 14:08:00 crc kubenswrapper[4764]: I0309 14:08:00.309574 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczx5\" (UniqueName: \"kubernetes.io/projected/902ae1d9-a43c-46c6-a492-10ee0242e721-kube-api-access-rczx5\") pod \"auto-csr-approver-29551088-wtbvc\" (UID: \"902ae1d9-a43c-46c6-a492-10ee0242e721\") " pod="openshift-infra/auto-csr-approver-29551088-wtbvc" Mar 09 14:08:00 crc kubenswrapper[4764]: I0309 14:08:00.347181 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczx5\" (UniqueName: \"kubernetes.io/projected/902ae1d9-a43c-46c6-a492-10ee0242e721-kube-api-access-rczx5\") pod \"auto-csr-approver-29551088-wtbvc\" (UID: \"902ae1d9-a43c-46c6-a492-10ee0242e721\") " 
pod="openshift-infra/auto-csr-approver-29551088-wtbvc" Mar 09 14:08:00 crc kubenswrapper[4764]: I0309 14:08:00.525484 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551088-wtbvc" Mar 09 14:08:00 crc kubenswrapper[4764]: I0309 14:08:00.997309 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551088-wtbvc"] Mar 09 14:08:01 crc kubenswrapper[4764]: I0309 14:08:01.865832 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551088-wtbvc" event={"ID":"902ae1d9-a43c-46c6-a492-10ee0242e721","Type":"ContainerStarted","Data":"c14e088b1d022613970d0112683f438fb3a8b48c6452e709953c210fc2585b9e"} Mar 09 14:08:02 crc kubenswrapper[4764]: I0309 14:08:02.878866 4764 generic.go:334] "Generic (PLEG): container finished" podID="902ae1d9-a43c-46c6-a492-10ee0242e721" containerID="20854488fc265f9f6a154273e0d45920e3a458753547063958f0c25388b08a64" exitCode=0 Mar 09 14:08:02 crc kubenswrapper[4764]: I0309 14:08:02.878933 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551088-wtbvc" event={"ID":"902ae1d9-a43c-46c6-a492-10ee0242e721","Type":"ContainerDied","Data":"20854488fc265f9f6a154273e0d45920e3a458753547063958f0c25388b08a64"} Mar 09 14:08:04 crc kubenswrapper[4764]: I0309 14:08:04.269251 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551088-wtbvc" Mar 09 14:08:04 crc kubenswrapper[4764]: I0309 14:08:04.401200 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rczx5\" (UniqueName: \"kubernetes.io/projected/902ae1d9-a43c-46c6-a492-10ee0242e721-kube-api-access-rczx5\") pod \"902ae1d9-a43c-46c6-a492-10ee0242e721\" (UID: \"902ae1d9-a43c-46c6-a492-10ee0242e721\") " Mar 09 14:08:04 crc kubenswrapper[4764]: I0309 14:08:04.411176 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/902ae1d9-a43c-46c6-a492-10ee0242e721-kube-api-access-rczx5" (OuterVolumeSpecName: "kube-api-access-rczx5") pod "902ae1d9-a43c-46c6-a492-10ee0242e721" (UID: "902ae1d9-a43c-46c6-a492-10ee0242e721"). InnerVolumeSpecName "kube-api-access-rczx5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:08:04 crc kubenswrapper[4764]: I0309 14:08:04.503158 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rczx5\" (UniqueName: \"kubernetes.io/projected/902ae1d9-a43c-46c6-a492-10ee0242e721-kube-api-access-rczx5\") on node \"crc\" DevicePath \"\"" Mar 09 14:08:04 crc kubenswrapper[4764]: I0309 14:08:04.901807 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551088-wtbvc" event={"ID":"902ae1d9-a43c-46c6-a492-10ee0242e721","Type":"ContainerDied","Data":"c14e088b1d022613970d0112683f438fb3a8b48c6452e709953c210fc2585b9e"} Mar 09 14:08:04 crc kubenswrapper[4764]: I0309 14:08:04.901857 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c14e088b1d022613970d0112683f438fb3a8b48c6452e709953c210fc2585b9e" Mar 09 14:08:04 crc kubenswrapper[4764]: I0309 14:08:04.902033 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551088-wtbvc" Mar 09 14:08:05 crc kubenswrapper[4764]: I0309 14:08:05.357494 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29551082-9pffj"] Mar 09 14:08:05 crc kubenswrapper[4764]: I0309 14:08:05.368127 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29551082-9pffj"] Mar 09 14:08:05 crc kubenswrapper[4764]: I0309 14:08:05.576319 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cbc0f639-3ece-4df6-bbaa-af1572005872" path="/var/lib/kubelet/pods/cbc0f639-3ece-4df6-bbaa-af1572005872/volumes" Mar 09 14:08:09 crc kubenswrapper[4764]: I0309 14:08:09.629568 4764 scope.go:117] "RemoveContainer" containerID="083b9151b9954604db8e0ffa63a89b0a3dc0a267db79fa28e9e68016aa4eee5c" Mar 09 14:08:30 crc kubenswrapper[4764]: I0309 14:08:30.741936 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-wfpcl"] Mar 09 14:08:30 crc kubenswrapper[4764]: E0309 14:08:30.743957 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="902ae1d9-a43c-46c6-a492-10ee0242e721" containerName="oc" Mar 09 14:08:30 crc kubenswrapper[4764]: I0309 14:08:30.744004 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="902ae1d9-a43c-46c6-a492-10ee0242e721" containerName="oc" Mar 09 14:08:30 crc kubenswrapper[4764]: I0309 14:08:30.744838 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="902ae1d9-a43c-46c6-a492-10ee0242e721" containerName="oc" Mar 09 14:08:30 crc kubenswrapper[4764]: I0309 14:08:30.748183 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-wfpcl" Mar 09 14:08:30 crc kubenswrapper[4764]: I0309 14:08:30.762480 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-wfpcl"] Mar 09 14:08:30 crc kubenswrapper[4764]: I0309 14:08:30.853730 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5f47d85f-63cb-45ce-b935-1b2a534523dc-catalog-content\") pod \"community-operators-wfpcl\" (UID: \"5f47d85f-63cb-45ce-b935-1b2a534523dc\") " pod="openshift-marketplace/community-operators-wfpcl" Mar 09 14:08:30 crc kubenswrapper[4764]: I0309 14:08:30.853872 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wnxnf\" (UniqueName: \"kubernetes.io/projected/5f47d85f-63cb-45ce-b935-1b2a534523dc-kube-api-access-wnxnf\") pod \"community-operators-wfpcl\" (UID: \"5f47d85f-63cb-45ce-b935-1b2a534523dc\") " pod="openshift-marketplace/community-operators-wfpcl" Mar 09 14:08:30 crc kubenswrapper[4764]: I0309 14:08:30.853936 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5f47d85f-63cb-45ce-b935-1b2a534523dc-utilities\") pod \"community-operators-wfpcl\" (UID: \"5f47d85f-63cb-45ce-b935-1b2a534523dc\") " pod="openshift-marketplace/community-operators-wfpcl" Mar 09 14:08:30 crc kubenswrapper[4764]: I0309 14:08:30.955617 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5f47d85f-63cb-45ce-b935-1b2a534523dc-utilities\") pod \"community-operators-wfpcl\" (UID: \"5f47d85f-63cb-45ce-b935-1b2a534523dc\") " pod="openshift-marketplace/community-operators-wfpcl" Mar 09 14:08:30 crc kubenswrapper[4764]: I0309 14:08:30.955799 4764 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5f47d85f-63cb-45ce-b935-1b2a534523dc-catalog-content\") pod \"community-operators-wfpcl\" (UID: \"5f47d85f-63cb-45ce-b935-1b2a534523dc\") " pod="openshift-marketplace/community-operators-wfpcl" Mar 09 14:08:30 crc kubenswrapper[4764]: I0309 14:08:30.955941 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wnxnf\" (UniqueName: \"kubernetes.io/projected/5f47d85f-63cb-45ce-b935-1b2a534523dc-kube-api-access-wnxnf\") pod \"community-operators-wfpcl\" (UID: \"5f47d85f-63cb-45ce-b935-1b2a534523dc\") " pod="openshift-marketplace/community-operators-wfpcl" Mar 09 14:08:30 crc kubenswrapper[4764]: I0309 14:08:30.956411 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5f47d85f-63cb-45ce-b935-1b2a534523dc-catalog-content\") pod \"community-operators-wfpcl\" (UID: \"5f47d85f-63cb-45ce-b935-1b2a534523dc\") " pod="openshift-marketplace/community-operators-wfpcl" Mar 09 14:08:30 crc kubenswrapper[4764]: I0309 14:08:30.956876 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5f47d85f-63cb-45ce-b935-1b2a534523dc-utilities\") pod \"community-operators-wfpcl\" (UID: \"5f47d85f-63cb-45ce-b935-1b2a534523dc\") " pod="openshift-marketplace/community-operators-wfpcl" Mar 09 14:08:30 crc kubenswrapper[4764]: I0309 14:08:30.987748 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wnxnf\" (UniqueName: \"kubernetes.io/projected/5f47d85f-63cb-45ce-b935-1b2a534523dc-kube-api-access-wnxnf\") pod \"community-operators-wfpcl\" (UID: \"5f47d85f-63cb-45ce-b935-1b2a534523dc\") " pod="openshift-marketplace/community-operators-wfpcl" Mar 09 14:08:31 crc kubenswrapper[4764]: I0309 14:08:31.095904 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-wfpcl"
Mar 09 14:08:31 crc kubenswrapper[4764]: I0309 14:08:31.701669 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-wfpcl"]
Mar 09 14:08:32 crc kubenswrapper[4764]: I0309 14:08:32.157438 4764 generic.go:334] "Generic (PLEG): container finished" podID="5f47d85f-63cb-45ce-b935-1b2a534523dc" containerID="bb942a0c55a1ddef8cb3af71369818a07d7c6cedd6474ba3eefacd8515ae9e03" exitCode=0
Mar 09 14:08:32 crc kubenswrapper[4764]: I0309 14:08:32.157571 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wfpcl" event={"ID":"5f47d85f-63cb-45ce-b935-1b2a534523dc","Type":"ContainerDied","Data":"bb942a0c55a1ddef8cb3af71369818a07d7c6cedd6474ba3eefacd8515ae9e03"}
Mar 09 14:08:32 crc kubenswrapper[4764]: I0309 14:08:32.157956 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wfpcl" event={"ID":"5f47d85f-63cb-45ce-b935-1b2a534523dc","Type":"ContainerStarted","Data":"cfc0c125ad4a9bdc4d2c39a4d4e59044968a7fa9e65b6f1d8fca1ba7dfcb96b5"}
Mar 09 14:08:33 crc kubenswrapper[4764]: I0309 14:08:33.176349 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wfpcl" event={"ID":"5f47d85f-63cb-45ce-b935-1b2a534523dc","Type":"ContainerStarted","Data":"5012025ebd18fb17d5d312a24940b888eecb61a59a4797cfdc87d49706f8c4cd"}
Mar 09 14:08:34 crc kubenswrapper[4764]: I0309 14:08:34.187347 4764 generic.go:334] "Generic (PLEG): container finished" podID="5f47d85f-63cb-45ce-b935-1b2a534523dc" containerID="5012025ebd18fb17d5d312a24940b888eecb61a59a4797cfdc87d49706f8c4cd" exitCode=0
Mar 09 14:08:34 crc kubenswrapper[4764]: I0309 14:08:34.187406 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wfpcl" event={"ID":"5f47d85f-63cb-45ce-b935-1b2a534523dc","Type":"ContainerDied","Data":"5012025ebd18fb17d5d312a24940b888eecb61a59a4797cfdc87d49706f8c4cd"}
Mar 09 14:08:35 crc kubenswrapper[4764]: I0309 14:08:35.201508 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wfpcl" event={"ID":"5f47d85f-63cb-45ce-b935-1b2a534523dc","Type":"ContainerStarted","Data":"da5f4a3cea7278ff0e8c3dfb2732a6e5cb3460daded67995e65716fc617874f5"}
Mar 09 14:08:35 crc kubenswrapper[4764]: I0309 14:08:35.230521 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-wfpcl" podStartSLOduration=2.786725282 podStartE2EDuration="5.230492851s" podCreationTimestamp="2026-03-09 14:08:30 +0000 UTC" firstStartedPulling="2026-03-09 14:08:32.162583547 +0000 UTC m=+2867.412755455" lastFinishedPulling="2026-03-09 14:08:34.606351116 +0000 UTC m=+2869.856523024" observedRunningTime="2026-03-09 14:08:35.22145231 +0000 UTC m=+2870.471624218" watchObservedRunningTime="2026-03-09 14:08:35.230492851 +0000 UTC m=+2870.480664759"
Mar 09 14:08:41 crc kubenswrapper[4764]: I0309 14:08:41.097043 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-wfpcl"
Mar 09 14:08:41 crc kubenswrapper[4764]: I0309 14:08:41.097912 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-wfpcl"
Mar 09 14:08:41 crc kubenswrapper[4764]: I0309 14:08:41.146292 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-wfpcl"
Mar 09 14:08:41 crc kubenswrapper[4764]: I0309 14:08:41.324233 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-wfpcl"
Mar 09 14:08:41 crc kubenswrapper[4764]: I0309 14:08:41.904230 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-vpnh7"]
Mar 09 14:08:41 crc kubenswrapper[4764]: I0309 14:08:41.907070 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-vpnh7"
Mar 09 14:08:41 crc kubenswrapper[4764]: I0309 14:08:41.933899 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-vpnh7"]
Mar 09 14:08:41 crc kubenswrapper[4764]: I0309 14:08:41.956023 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d4b9b0a7-b93e-402d-9b7a-3ce1815d62dc-utilities\") pod \"redhat-operators-vpnh7\" (UID: \"d4b9b0a7-b93e-402d-9b7a-3ce1815d62dc\") " pod="openshift-marketplace/redhat-operators-vpnh7"
Mar 09 14:08:41 crc kubenswrapper[4764]: I0309 14:08:41.956148 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d46th\" (UniqueName: \"kubernetes.io/projected/d4b9b0a7-b93e-402d-9b7a-3ce1815d62dc-kube-api-access-d46th\") pod \"redhat-operators-vpnh7\" (UID: \"d4b9b0a7-b93e-402d-9b7a-3ce1815d62dc\") " pod="openshift-marketplace/redhat-operators-vpnh7"
Mar 09 14:08:41 crc kubenswrapper[4764]: I0309 14:08:41.956213 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d4b9b0a7-b93e-402d-9b7a-3ce1815d62dc-catalog-content\") pod \"redhat-operators-vpnh7\" (UID: \"d4b9b0a7-b93e-402d-9b7a-3ce1815d62dc\") " pod="openshift-marketplace/redhat-operators-vpnh7"
Mar 09 14:08:42 crc kubenswrapper[4764]: I0309 14:08:42.058542 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d4b9b0a7-b93e-402d-9b7a-3ce1815d62dc-utilities\") pod \"redhat-operators-vpnh7\" (UID: \"d4b9b0a7-b93e-402d-9b7a-3ce1815d62dc\") " pod="openshift-marketplace/redhat-operators-vpnh7"
Mar 09 14:08:42 crc kubenswrapper[4764]: I0309 14:08:42.058988 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d46th\" (UniqueName: \"kubernetes.io/projected/d4b9b0a7-b93e-402d-9b7a-3ce1815d62dc-kube-api-access-d46th\") pod \"redhat-operators-vpnh7\" (UID: \"d4b9b0a7-b93e-402d-9b7a-3ce1815d62dc\") " pod="openshift-marketplace/redhat-operators-vpnh7"
Mar 09 14:08:42 crc kubenswrapper[4764]: I0309 14:08:42.059196 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d4b9b0a7-b93e-402d-9b7a-3ce1815d62dc-utilities\") pod \"redhat-operators-vpnh7\" (UID: \"d4b9b0a7-b93e-402d-9b7a-3ce1815d62dc\") " pod="openshift-marketplace/redhat-operators-vpnh7"
Mar 09 14:08:42 crc kubenswrapper[4764]: I0309 14:08:42.059386 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d4b9b0a7-b93e-402d-9b7a-3ce1815d62dc-catalog-content\") pod \"redhat-operators-vpnh7\" (UID: \"d4b9b0a7-b93e-402d-9b7a-3ce1815d62dc\") " pod="openshift-marketplace/redhat-operators-vpnh7"
Mar 09 14:08:42 crc kubenswrapper[4764]: I0309 14:08:42.059931 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d4b9b0a7-b93e-402d-9b7a-3ce1815d62dc-catalog-content\") pod \"redhat-operators-vpnh7\" (UID: \"d4b9b0a7-b93e-402d-9b7a-3ce1815d62dc\") " pod="openshift-marketplace/redhat-operators-vpnh7"
Mar 09 14:08:42 crc kubenswrapper[4764]: I0309 14:08:42.081735 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d46th\" (UniqueName: \"kubernetes.io/projected/d4b9b0a7-b93e-402d-9b7a-3ce1815d62dc-kube-api-access-d46th\") pod \"redhat-operators-vpnh7\" (UID: \"d4b9b0a7-b93e-402d-9b7a-3ce1815d62dc\") " pod="openshift-marketplace/redhat-operators-vpnh7"
Mar 09 14:08:42 crc kubenswrapper[4764]: I0309 14:08:42.235494 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-vpnh7"
Mar 09 14:08:42 crc kubenswrapper[4764]: I0309 14:08:42.757079 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-vpnh7"]
Mar 09 14:08:43 crc kubenswrapper[4764]: I0309 14:08:43.288894 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-wfpcl"]
Mar 09 14:08:43 crc kubenswrapper[4764]: I0309 14:08:43.289882 4764 generic.go:334] "Generic (PLEG): container finished" podID="d4b9b0a7-b93e-402d-9b7a-3ce1815d62dc" containerID="1a77f0233da3724318ca2ac4138ee23201ef7a67c89c708b7cafc81328a40522" exitCode=0
Mar 09 14:08:43 crc kubenswrapper[4764]: I0309 14:08:43.290015 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vpnh7" event={"ID":"d4b9b0a7-b93e-402d-9b7a-3ce1815d62dc","Type":"ContainerDied","Data":"1a77f0233da3724318ca2ac4138ee23201ef7a67c89c708b7cafc81328a40522"}
Mar 09 14:08:43 crc kubenswrapper[4764]: I0309 14:08:43.290085 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vpnh7" event={"ID":"d4b9b0a7-b93e-402d-9b7a-3ce1815d62dc","Type":"ContainerStarted","Data":"2afe4d1e32eda8deed06dfb789128c9ea6b154f3a0227143cbe819cb6c855247"}
Mar 09 14:08:44 crc kubenswrapper[4764]: I0309 14:08:44.302748 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-wfpcl" podUID="5f47d85f-63cb-45ce-b935-1b2a534523dc" containerName="registry-server" containerID="cri-o://da5f4a3cea7278ff0e8c3dfb2732a6e5cb3460daded67995e65716fc617874f5" gracePeriod=2
Mar 09 14:08:44 crc kubenswrapper[4764]: I0309 14:08:44.794803 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-wfpcl"
Mar 09 14:08:44 crc kubenswrapper[4764]: I0309 14:08:44.925574 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5f47d85f-63cb-45ce-b935-1b2a534523dc-catalog-content\") pod \"5f47d85f-63cb-45ce-b935-1b2a534523dc\" (UID: \"5f47d85f-63cb-45ce-b935-1b2a534523dc\") "
Mar 09 14:08:44 crc kubenswrapper[4764]: I0309 14:08:44.926016 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wnxnf\" (UniqueName: \"kubernetes.io/projected/5f47d85f-63cb-45ce-b935-1b2a534523dc-kube-api-access-wnxnf\") pod \"5f47d85f-63cb-45ce-b935-1b2a534523dc\" (UID: \"5f47d85f-63cb-45ce-b935-1b2a534523dc\") "
Mar 09 14:08:44 crc kubenswrapper[4764]: I0309 14:08:44.926113 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5f47d85f-63cb-45ce-b935-1b2a534523dc-utilities\") pod \"5f47d85f-63cb-45ce-b935-1b2a534523dc\" (UID: \"5f47d85f-63cb-45ce-b935-1b2a534523dc\") "
Mar 09 14:08:44 crc kubenswrapper[4764]: I0309 14:08:44.927606 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5f47d85f-63cb-45ce-b935-1b2a534523dc-utilities" (OuterVolumeSpecName: "utilities") pod "5f47d85f-63cb-45ce-b935-1b2a534523dc" (UID: "5f47d85f-63cb-45ce-b935-1b2a534523dc"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 09 14:08:44 crc kubenswrapper[4764]: I0309 14:08:44.936068 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5f47d85f-63cb-45ce-b935-1b2a534523dc-kube-api-access-wnxnf" (OuterVolumeSpecName: "kube-api-access-wnxnf") pod "5f47d85f-63cb-45ce-b935-1b2a534523dc" (UID: "5f47d85f-63cb-45ce-b935-1b2a534523dc"). InnerVolumeSpecName "kube-api-access-wnxnf". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 14:08:44 crc kubenswrapper[4764]: I0309 14:08:44.980342 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5f47d85f-63cb-45ce-b935-1b2a534523dc-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5f47d85f-63cb-45ce-b935-1b2a534523dc" (UID: "5f47d85f-63cb-45ce-b935-1b2a534523dc"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 09 14:08:45 crc kubenswrapper[4764]: I0309 14:08:45.029215 4764 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5f47d85f-63cb-45ce-b935-1b2a534523dc-utilities\") on node \"crc\" DevicePath \"\""
Mar 09 14:08:45 crc kubenswrapper[4764]: I0309 14:08:45.029285 4764 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5f47d85f-63cb-45ce-b935-1b2a534523dc-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 09 14:08:45 crc kubenswrapper[4764]: I0309 14:08:45.029306 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wnxnf\" (UniqueName: \"kubernetes.io/projected/5f47d85f-63cb-45ce-b935-1b2a534523dc-kube-api-access-wnxnf\") on node \"crc\" DevicePath \"\""
Mar 09 14:08:45 crc kubenswrapper[4764]: I0309 14:08:45.320874 4764 generic.go:334] "Generic (PLEG): container finished" podID="5f47d85f-63cb-45ce-b935-1b2a534523dc" containerID="da5f4a3cea7278ff0e8c3dfb2732a6e5cb3460daded67995e65716fc617874f5" exitCode=0
Mar 09 14:08:45 crc kubenswrapper[4764]: I0309 14:08:45.320959 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wfpcl" event={"ID":"5f47d85f-63cb-45ce-b935-1b2a534523dc","Type":"ContainerDied","Data":"da5f4a3cea7278ff0e8c3dfb2732a6e5cb3460daded67995e65716fc617874f5"}
Mar 09 14:08:45 crc kubenswrapper[4764]: I0309 14:08:45.320998 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wfpcl" event={"ID":"5f47d85f-63cb-45ce-b935-1b2a534523dc","Type":"ContainerDied","Data":"cfc0c125ad4a9bdc4d2c39a4d4e59044968a7fa9e65b6f1d8fca1ba7dfcb96b5"}
Mar 09 14:08:45 crc kubenswrapper[4764]: I0309 14:08:45.321029 4764 scope.go:117] "RemoveContainer" containerID="da5f4a3cea7278ff0e8c3dfb2732a6e5cb3460daded67995e65716fc617874f5"
Mar 09 14:08:45 crc kubenswrapper[4764]: I0309 14:08:45.321247 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-wfpcl"
Mar 09 14:08:45 crc kubenswrapper[4764]: I0309 14:08:45.326922 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vpnh7" event={"ID":"d4b9b0a7-b93e-402d-9b7a-3ce1815d62dc","Type":"ContainerStarted","Data":"dd64f84858d0a06586cf1c2b645ebbce25af53fb3c654cb2755a8ed059ce3f84"}
Mar 09 14:08:45 crc kubenswrapper[4764]: I0309 14:08:45.385291 4764 scope.go:117] "RemoveContainer" containerID="5012025ebd18fb17d5d312a24940b888eecb61a59a4797cfdc87d49706f8c4cd"
Mar 09 14:08:45 crc kubenswrapper[4764]: I0309 14:08:45.394088 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-wfpcl"]
Mar 09 14:08:45 crc kubenswrapper[4764]: I0309 14:08:45.408245 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-wfpcl"]
Mar 09 14:08:45 crc kubenswrapper[4764]: I0309 14:08:45.417827 4764 scope.go:117] "RemoveContainer" containerID="bb942a0c55a1ddef8cb3af71369818a07d7c6cedd6474ba3eefacd8515ae9e03"
Mar 09 14:08:45 crc kubenswrapper[4764]: I0309 14:08:45.470050 4764 scope.go:117] "RemoveContainer" containerID="da5f4a3cea7278ff0e8c3dfb2732a6e5cb3460daded67995e65716fc617874f5"
Mar 09 14:08:45 crc kubenswrapper[4764]: E0309 14:08:45.470896 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"da5f4a3cea7278ff0e8c3dfb2732a6e5cb3460daded67995e65716fc617874f5\": container with ID starting with da5f4a3cea7278ff0e8c3dfb2732a6e5cb3460daded67995e65716fc617874f5 not found: ID does not exist" containerID="da5f4a3cea7278ff0e8c3dfb2732a6e5cb3460daded67995e65716fc617874f5"
Mar 09 14:08:45 crc kubenswrapper[4764]: I0309 14:08:45.470957 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"da5f4a3cea7278ff0e8c3dfb2732a6e5cb3460daded67995e65716fc617874f5"} err="failed to get container status \"da5f4a3cea7278ff0e8c3dfb2732a6e5cb3460daded67995e65716fc617874f5\": rpc error: code = NotFound desc = could not find container \"da5f4a3cea7278ff0e8c3dfb2732a6e5cb3460daded67995e65716fc617874f5\": container with ID starting with da5f4a3cea7278ff0e8c3dfb2732a6e5cb3460daded67995e65716fc617874f5 not found: ID does not exist"
Mar 09 14:08:45 crc kubenswrapper[4764]: I0309 14:08:45.470994 4764 scope.go:117] "RemoveContainer" containerID="5012025ebd18fb17d5d312a24940b888eecb61a59a4797cfdc87d49706f8c4cd"
Mar 09 14:08:45 crc kubenswrapper[4764]: E0309 14:08:45.472236 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5012025ebd18fb17d5d312a24940b888eecb61a59a4797cfdc87d49706f8c4cd\": container with ID starting with 5012025ebd18fb17d5d312a24940b888eecb61a59a4797cfdc87d49706f8c4cd not found: ID does not exist" containerID="5012025ebd18fb17d5d312a24940b888eecb61a59a4797cfdc87d49706f8c4cd"
Mar 09 14:08:45 crc kubenswrapper[4764]: I0309 14:08:45.472268 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5012025ebd18fb17d5d312a24940b888eecb61a59a4797cfdc87d49706f8c4cd"} err="failed to get container status \"5012025ebd18fb17d5d312a24940b888eecb61a59a4797cfdc87d49706f8c4cd\": rpc error: code = NotFound desc = could not find container \"5012025ebd18fb17d5d312a24940b888eecb61a59a4797cfdc87d49706f8c4cd\": container with ID starting with 5012025ebd18fb17d5d312a24940b888eecb61a59a4797cfdc87d49706f8c4cd not found: ID does not exist"
Mar 09 14:08:45 crc kubenswrapper[4764]: I0309 14:08:45.472293 4764 scope.go:117] "RemoveContainer" containerID="bb942a0c55a1ddef8cb3af71369818a07d7c6cedd6474ba3eefacd8515ae9e03"
Mar 09 14:08:45 crc kubenswrapper[4764]: E0309 14:08:45.472992 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bb942a0c55a1ddef8cb3af71369818a07d7c6cedd6474ba3eefacd8515ae9e03\": container with ID starting with bb942a0c55a1ddef8cb3af71369818a07d7c6cedd6474ba3eefacd8515ae9e03 not found: ID does not exist" containerID="bb942a0c55a1ddef8cb3af71369818a07d7c6cedd6474ba3eefacd8515ae9e03"
Mar 09 14:08:45 crc kubenswrapper[4764]: I0309 14:08:45.473025 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bb942a0c55a1ddef8cb3af71369818a07d7c6cedd6474ba3eefacd8515ae9e03"} err="failed to get container status \"bb942a0c55a1ddef8cb3af71369818a07d7c6cedd6474ba3eefacd8515ae9e03\": rpc error: code = NotFound desc = could not find container \"bb942a0c55a1ddef8cb3af71369818a07d7c6cedd6474ba3eefacd8515ae9e03\": container with ID starting with bb942a0c55a1ddef8cb3af71369818a07d7c6cedd6474ba3eefacd8515ae9e03 not found: ID does not exist"
Mar 09 14:08:45 crc kubenswrapper[4764]: I0309 14:08:45.575475 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5f47d85f-63cb-45ce-b935-1b2a534523dc" path="/var/lib/kubelet/pods/5f47d85f-63cb-45ce-b935-1b2a534523dc/volumes"
Mar 09 14:08:46 crc kubenswrapper[4764]: I0309 14:08:46.348585 4764 generic.go:334] "Generic (PLEG): container finished" podID="d4b9b0a7-b93e-402d-9b7a-3ce1815d62dc" containerID="dd64f84858d0a06586cf1c2b645ebbce25af53fb3c654cb2755a8ed059ce3f84" exitCode=0
Mar 09 14:08:46 crc kubenswrapper[4764]: I0309 14:08:46.348694 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vpnh7" event={"ID":"d4b9b0a7-b93e-402d-9b7a-3ce1815d62dc","Type":"ContainerDied","Data":"dd64f84858d0a06586cf1c2b645ebbce25af53fb3c654cb2755a8ed059ce3f84"}
Mar 09 14:08:46 crc kubenswrapper[4764]: I0309 14:08:46.348766 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vpnh7" event={"ID":"d4b9b0a7-b93e-402d-9b7a-3ce1815d62dc","Type":"ContainerStarted","Data":"a837b89cd63d6dc6240374dd2b172b2ef7e641ed8c4d15da1021f90ddd848323"}
Mar 09 14:08:46 crc kubenswrapper[4764]: I0309 14:08:46.376037 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-vpnh7" podStartSLOduration=2.919296757 podStartE2EDuration="5.376017622s" podCreationTimestamp="2026-03-09 14:08:41 +0000 UTC" firstStartedPulling="2026-03-09 14:08:43.29208592 +0000 UTC m=+2878.542257828" lastFinishedPulling="2026-03-09 14:08:45.748806785 +0000 UTC m=+2880.998978693" observedRunningTime="2026-03-09 14:08:46.372063266 +0000 UTC m=+2881.622235214" watchObservedRunningTime="2026-03-09 14:08:46.376017622 +0000 UTC m=+2881.626189530"
Mar 09 14:08:52 crc kubenswrapper[4764]: I0309 14:08:52.263886 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-vpnh7"
Mar 09 14:08:52 crc kubenswrapper[4764]: I0309 14:08:52.283893 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-vpnh7"
Mar 09 14:08:52 crc kubenswrapper[4764]: I0309 14:08:52.333988 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-vpnh7"
Mar 09 14:08:52 crc kubenswrapper[4764]: I0309 14:08:52.459629 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-vpnh7"
Mar 09 14:08:52 crc kubenswrapper[4764]: I0309 14:08:52.576549 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-vpnh7"]
Mar 09 14:08:54 crc kubenswrapper[4764]: I0309 14:08:54.424766 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-vpnh7" podUID="d4b9b0a7-b93e-402d-9b7a-3ce1815d62dc" containerName="registry-server" containerID="cri-o://a837b89cd63d6dc6240374dd2b172b2ef7e641ed8c4d15da1021f90ddd848323" gracePeriod=2
Mar 09 14:08:54 crc kubenswrapper[4764]: I0309 14:08:54.928688 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-vpnh7"
Mar 09 14:08:54 crc kubenswrapper[4764]: I0309 14:08:54.961225 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d4b9b0a7-b93e-402d-9b7a-3ce1815d62dc-utilities\") pod \"d4b9b0a7-b93e-402d-9b7a-3ce1815d62dc\" (UID: \"d4b9b0a7-b93e-402d-9b7a-3ce1815d62dc\") "
Mar 09 14:08:54 crc kubenswrapper[4764]: I0309 14:08:54.961403 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d4b9b0a7-b93e-402d-9b7a-3ce1815d62dc-catalog-content\") pod \"d4b9b0a7-b93e-402d-9b7a-3ce1815d62dc\" (UID: \"d4b9b0a7-b93e-402d-9b7a-3ce1815d62dc\") "
Mar 09 14:08:54 crc kubenswrapper[4764]: I0309 14:08:54.961514 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d46th\" (UniqueName: \"kubernetes.io/projected/d4b9b0a7-b93e-402d-9b7a-3ce1815d62dc-kube-api-access-d46th\") pod \"d4b9b0a7-b93e-402d-9b7a-3ce1815d62dc\" (UID: \"d4b9b0a7-b93e-402d-9b7a-3ce1815d62dc\") "
Mar 09 14:08:54 crc kubenswrapper[4764]: I0309 14:08:54.963609 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d4b9b0a7-b93e-402d-9b7a-3ce1815d62dc-utilities" (OuterVolumeSpecName: "utilities") pod "d4b9b0a7-b93e-402d-9b7a-3ce1815d62dc" (UID: "d4b9b0a7-b93e-402d-9b7a-3ce1815d62dc"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 09 14:08:54 crc kubenswrapper[4764]: I0309 14:08:54.970553 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d4b9b0a7-b93e-402d-9b7a-3ce1815d62dc-kube-api-access-d46th" (OuterVolumeSpecName: "kube-api-access-d46th") pod "d4b9b0a7-b93e-402d-9b7a-3ce1815d62dc" (UID: "d4b9b0a7-b93e-402d-9b7a-3ce1815d62dc"). InnerVolumeSpecName "kube-api-access-d46th". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 14:08:55 crc kubenswrapper[4764]: I0309 14:08:55.063792 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d46th\" (UniqueName: \"kubernetes.io/projected/d4b9b0a7-b93e-402d-9b7a-3ce1815d62dc-kube-api-access-d46th\") on node \"crc\" DevicePath \"\""
Mar 09 14:08:55 crc kubenswrapper[4764]: I0309 14:08:55.063833 4764 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d4b9b0a7-b93e-402d-9b7a-3ce1815d62dc-utilities\") on node \"crc\" DevicePath \"\""
Mar 09 14:08:55 crc kubenswrapper[4764]: I0309 14:08:55.437537 4764 generic.go:334] "Generic (PLEG): container finished" podID="d4b9b0a7-b93e-402d-9b7a-3ce1815d62dc" containerID="a837b89cd63d6dc6240374dd2b172b2ef7e641ed8c4d15da1021f90ddd848323" exitCode=0
Mar 09 14:08:55 crc kubenswrapper[4764]: I0309 14:08:55.437593 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vpnh7" event={"ID":"d4b9b0a7-b93e-402d-9b7a-3ce1815d62dc","Type":"ContainerDied","Data":"a837b89cd63d6dc6240374dd2b172b2ef7e641ed8c4d15da1021f90ddd848323"}
Mar 09 14:08:55 crc kubenswrapper[4764]: I0309 14:08:55.437634 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vpnh7" event={"ID":"d4b9b0a7-b93e-402d-9b7a-3ce1815d62dc","Type":"ContainerDied","Data":"2afe4d1e32eda8deed06dfb789128c9ea6b154f3a0227143cbe819cb6c855247"}
Mar 09 14:08:55 crc kubenswrapper[4764]: I0309 14:08:55.437630 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-vpnh7"
Mar 09 14:08:55 crc kubenswrapper[4764]: I0309 14:08:55.437675 4764 scope.go:117] "RemoveContainer" containerID="a837b89cd63d6dc6240374dd2b172b2ef7e641ed8c4d15da1021f90ddd848323"
Mar 09 14:08:55 crc kubenswrapper[4764]: I0309 14:08:55.458352 4764 scope.go:117] "RemoveContainer" containerID="dd64f84858d0a06586cf1c2b645ebbce25af53fb3c654cb2755a8ed059ce3f84"
Mar 09 14:08:55 crc kubenswrapper[4764]: I0309 14:08:55.482306 4764 scope.go:117] "RemoveContainer" containerID="1a77f0233da3724318ca2ac4138ee23201ef7a67c89c708b7cafc81328a40522"
Mar 09 14:08:55 crc kubenswrapper[4764]: I0309 14:08:55.520726 4764 scope.go:117] "RemoveContainer" containerID="a837b89cd63d6dc6240374dd2b172b2ef7e641ed8c4d15da1021f90ddd848323"
Mar 09 14:08:55 crc kubenswrapper[4764]: E0309 14:08:55.521641 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a837b89cd63d6dc6240374dd2b172b2ef7e641ed8c4d15da1021f90ddd848323\": container with ID starting with a837b89cd63d6dc6240374dd2b172b2ef7e641ed8c4d15da1021f90ddd848323 not found: ID does not exist" containerID="a837b89cd63d6dc6240374dd2b172b2ef7e641ed8c4d15da1021f90ddd848323"
Mar 09 14:08:55 crc kubenswrapper[4764]: I0309 14:08:55.521724 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a837b89cd63d6dc6240374dd2b172b2ef7e641ed8c4d15da1021f90ddd848323"} err="failed to get container status \"a837b89cd63d6dc6240374dd2b172b2ef7e641ed8c4d15da1021f90ddd848323\": rpc error: code = NotFound desc = could not find container \"a837b89cd63d6dc6240374dd2b172b2ef7e641ed8c4d15da1021f90ddd848323\": container with ID starting with a837b89cd63d6dc6240374dd2b172b2ef7e641ed8c4d15da1021f90ddd848323 not found: ID does not exist"
Mar 09 14:08:55 crc kubenswrapper[4764]: I0309 14:08:55.521762 4764 scope.go:117] "RemoveContainer" containerID="dd64f84858d0a06586cf1c2b645ebbce25af53fb3c654cb2755a8ed059ce3f84"
Mar 09 14:08:55 crc kubenswrapper[4764]: E0309 14:08:55.522538 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dd64f84858d0a06586cf1c2b645ebbce25af53fb3c654cb2755a8ed059ce3f84\": container with ID starting with dd64f84858d0a06586cf1c2b645ebbce25af53fb3c654cb2755a8ed059ce3f84 not found: ID does not exist" containerID="dd64f84858d0a06586cf1c2b645ebbce25af53fb3c654cb2755a8ed059ce3f84"
Mar 09 14:08:55 crc kubenswrapper[4764]: I0309 14:08:55.522669 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dd64f84858d0a06586cf1c2b645ebbce25af53fb3c654cb2755a8ed059ce3f84"} err="failed to get container status \"dd64f84858d0a06586cf1c2b645ebbce25af53fb3c654cb2755a8ed059ce3f84\": rpc error: code = NotFound desc = could not find container \"dd64f84858d0a06586cf1c2b645ebbce25af53fb3c654cb2755a8ed059ce3f84\": container with ID starting with dd64f84858d0a06586cf1c2b645ebbce25af53fb3c654cb2755a8ed059ce3f84 not found: ID does not exist"
Mar 09 14:08:55 crc kubenswrapper[4764]: I0309 14:08:55.522759 4764 scope.go:117] "RemoveContainer" containerID="1a77f0233da3724318ca2ac4138ee23201ef7a67c89c708b7cafc81328a40522"
Mar 09 14:08:55 crc kubenswrapper[4764]: E0309 14:08:55.523502 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1a77f0233da3724318ca2ac4138ee23201ef7a67c89c708b7cafc81328a40522\": container with ID starting with 1a77f0233da3724318ca2ac4138ee23201ef7a67c89c708b7cafc81328a40522 not found: ID does not exist" containerID="1a77f0233da3724318ca2ac4138ee23201ef7a67c89c708b7cafc81328a40522"
Mar 09 14:08:55 crc kubenswrapper[4764]: I0309 14:08:55.523585 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1a77f0233da3724318ca2ac4138ee23201ef7a67c89c708b7cafc81328a40522"} err="failed to get container status \"1a77f0233da3724318ca2ac4138ee23201ef7a67c89c708b7cafc81328a40522\": rpc error: code = NotFound desc = could not find container \"1a77f0233da3724318ca2ac4138ee23201ef7a67c89c708b7cafc81328a40522\": container with ID starting with 1a77f0233da3724318ca2ac4138ee23201ef7a67c89c708b7cafc81328a40522 not found: ID does not exist"
Mar 09 14:08:56 crc kubenswrapper[4764]: I0309 14:08:56.278922 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d4b9b0a7-b93e-402d-9b7a-3ce1815d62dc-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d4b9b0a7-b93e-402d-9b7a-3ce1815d62dc" (UID: "d4b9b0a7-b93e-402d-9b7a-3ce1815d62dc"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 09 14:08:56 crc kubenswrapper[4764]: I0309 14:08:56.290774 4764 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d4b9b0a7-b93e-402d-9b7a-3ce1815d62dc-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 09 14:08:56 crc kubenswrapper[4764]: I0309 14:08:56.375607 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-vpnh7"]
Mar 09 14:08:56 crc kubenswrapper[4764]: I0309 14:08:56.384538 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-vpnh7"]
Mar 09 14:08:57 crc kubenswrapper[4764]: I0309 14:08:57.572617 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d4b9b0a7-b93e-402d-9b7a-3ce1815d62dc" path="/var/lib/kubelet/pods/d4b9b0a7-b93e-402d-9b7a-3ce1815d62dc/volumes"
Mar 09 14:09:24 crc kubenswrapper[4764]: I0309 14:09:24.734888 4764 generic.go:334] "Generic (PLEG): container finished" podID="eab144b6-e27c-4ffc-9dd5-6236ca12719f" containerID="86fa8795ecd86c7b9c42df1c80c762e511f08dce7f9815c18841bff01c585b45" exitCode=0
Mar 09 14:09:24 crc kubenswrapper[4764]: I0309 14:09:24.734996 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-5kztq" event={"ID":"eab144b6-e27c-4ffc-9dd5-6236ca12719f","Type":"ContainerDied","Data":"86fa8795ecd86c7b9c42df1c80c762e511f08dce7f9815c18841bff01c585b45"}
Mar 09 14:09:26 crc kubenswrapper[4764]: I0309 14:09:26.185190 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-5kztq"
Mar 09 14:09:26 crc kubenswrapper[4764]: I0309 14:09:26.288358 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/eab144b6-e27c-4ffc-9dd5-6236ca12719f-nova-migration-ssh-key-1\") pod \"eab144b6-e27c-4ffc-9dd5-6236ca12719f\" (UID: \"eab144b6-e27c-4ffc-9dd5-6236ca12719f\") "
Mar 09 14:09:26 crc kubenswrapper[4764]: I0309 14:09:26.288463 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c6xn\" (UniqueName: \"kubernetes.io/projected/eab144b6-e27c-4ffc-9dd5-6236ca12719f-kube-api-access-7c6xn\") pod \"eab144b6-e27c-4ffc-9dd5-6236ca12719f\" (UID: \"eab144b6-e27c-4ffc-9dd5-6236ca12719f\") "
Mar 09 14:09:26 crc kubenswrapper[4764]: I0309 14:09:26.288516 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/eab144b6-e27c-4ffc-9dd5-6236ca12719f-ssh-key-openstack-edpm-ipam\") pod \"eab144b6-e27c-4ffc-9dd5-6236ca12719f\" (UID: \"eab144b6-e27c-4ffc-9dd5-6236ca12719f\") "
Mar 09 14:09:26 crc kubenswrapper[4764]: I0309 14:09:26.288668 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/eab144b6-e27c-4ffc-9dd5-6236ca12719f-nova-extra-config-0\") pod \"eab144b6-e27c-4ffc-9dd5-6236ca12719f\" (UID: \"eab144b6-e27c-4ffc-9dd5-6236ca12719f\") "
Mar 09 14:09:26 crc kubenswrapper[4764]: I0309 14:09:26.288763 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/eab144b6-e27c-4ffc-9dd5-6236ca12719f-nova-cell1-compute-config-2\") pod \"eab144b6-e27c-4ffc-9dd5-6236ca12719f\" (UID: \"eab144b6-e27c-4ffc-9dd5-6236ca12719f\") "
Mar 09 14:09:26 crc kubenswrapper[4764]: I0309 14:09:26.288785 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/eab144b6-e27c-4ffc-9dd5-6236ca12719f-inventory\") pod \"eab144b6-e27c-4ffc-9dd5-6236ca12719f\" (UID: \"eab144b6-e27c-4ffc-9dd5-6236ca12719f\") "
Mar 09 14:09:26 crc kubenswrapper[4764]: I0309 14:09:26.288808 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/eab144b6-e27c-4ffc-9dd5-6236ca12719f-nova-cell1-compute-config-0\") pod \"eab144b6-e27c-4ffc-9dd5-6236ca12719f\" (UID: \"eab144b6-e27c-4ffc-9dd5-6236ca12719f\") "
Mar 09 14:09:26 crc kubenswrapper[4764]: I0309 14:09:26.288850 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-custom-ceph-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eab144b6-e27c-4ffc-9dd5-6236ca12719f-nova-custom-ceph-combined-ca-bundle\") pod \"eab144b6-e27c-4ffc-9dd5-6236ca12719f\" (UID: \"eab144b6-e27c-4ffc-9dd5-6236ca12719f\") "
Mar 09 14:09:26 crc kubenswrapper[4764]: I0309 14:09:26.288887 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph-nova-0\" (UniqueName: \"kubernetes.io/configmap/eab144b6-e27c-4ffc-9dd5-6236ca12719f-ceph-nova-0\") pod \"eab144b6-e27c-4ffc-9dd5-6236ca12719f\" (UID: \"eab144b6-e27c-4ffc-9dd5-6236ca12719f\") "
Mar 09 14:09:26 crc kubenswrapper[4764]: I0309 14:09:26.288919 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/eab144b6-e27c-4ffc-9dd5-6236ca12719f-nova-cell1-compute-config-1\") pod \"eab144b6-e27c-4ffc-9dd5-6236ca12719f\" (UID: \"eab144b6-e27c-4ffc-9dd5-6236ca12719f\") "
Mar 09 14:09:26 crc kubenswrapper[4764]: I0309 14:09:26.288944 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/eab144b6-e27c-4ffc-9dd5-6236ca12719f-nova-cell1-compute-config-3\") pod \"eab144b6-e27c-4ffc-9dd5-6236ca12719f\" (UID: \"eab144b6-e27c-4ffc-9dd5-6236ca12719f\") "
Mar 09 14:09:26 crc kubenswrapper[4764]: I0309 14:09:26.288961 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/eab144b6-e27c-4ffc-9dd5-6236ca12719f-nova-migration-ssh-key-0\") pod \"eab144b6-e27c-4ffc-9dd5-6236ca12719f\" (UID: \"eab144b6-e27c-4ffc-9dd5-6236ca12719f\") "
Mar 09 14:09:26 crc kubenswrapper[4764]: I0309 14:09:26.288985 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/eab144b6-e27c-4ffc-9dd5-6236ca12719f-ceph\") pod \"eab144b6-e27c-4ffc-9dd5-6236ca12719f\" (UID: \"eab144b6-e27c-4ffc-9dd5-6236ca12719f\") "
Mar 09 14:09:26 crc kubenswrapper[4764]: I0309 14:09:26.302660 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eab144b6-e27c-4ffc-9dd5-6236ca12719f-nova-custom-ceph-combined-ca-bundle" (OuterVolumeSpecName: "nova-custom-ceph-combined-ca-bundle") pod "eab144b6-e27c-4ffc-9dd5-6236ca12719f" (UID: "eab144b6-e27c-4ffc-9dd5-6236ca12719f"). InnerVolumeSpecName "nova-custom-ceph-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 14:09:26 crc kubenswrapper[4764]: I0309 14:09:26.302830 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eab144b6-e27c-4ffc-9dd5-6236ca12719f-ceph" (OuterVolumeSpecName: "ceph") pod "eab144b6-e27c-4ffc-9dd5-6236ca12719f" (UID: "eab144b6-e27c-4ffc-9dd5-6236ca12719f"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 14:09:26 crc kubenswrapper[4764]: I0309 14:09:26.303846 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eab144b6-e27c-4ffc-9dd5-6236ca12719f-kube-api-access-7c6xn" (OuterVolumeSpecName: "kube-api-access-7c6xn") pod "eab144b6-e27c-4ffc-9dd5-6236ca12719f" (UID: "eab144b6-e27c-4ffc-9dd5-6236ca12719f"). InnerVolumeSpecName "kube-api-access-7c6xn". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 14:09:26 crc kubenswrapper[4764]: I0309 14:09:26.323453 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eab144b6-e27c-4ffc-9dd5-6236ca12719f-ceph-nova-0" (OuterVolumeSpecName: "ceph-nova-0") pod "eab144b6-e27c-4ffc-9dd5-6236ca12719f" (UID: "eab144b6-e27c-4ffc-9dd5-6236ca12719f"). InnerVolumeSpecName "ceph-nova-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 14:09:26 crc kubenswrapper[4764]: I0309 14:09:26.324028 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eab144b6-e27c-4ffc-9dd5-6236ca12719f-nova-cell1-compute-config-0" (OuterVolumeSpecName: "nova-cell1-compute-config-0") pod "eab144b6-e27c-4ffc-9dd5-6236ca12719f" (UID: "eab144b6-e27c-4ffc-9dd5-6236ca12719f"). InnerVolumeSpecName "nova-cell1-compute-config-0". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 14:09:26 crc kubenswrapper[4764]: I0309 14:09:26.328209 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eab144b6-e27c-4ffc-9dd5-6236ca12719f-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "eab144b6-e27c-4ffc-9dd5-6236ca12719f" (UID: "eab144b6-e27c-4ffc-9dd5-6236ca12719f"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam".
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:09:26 crc kubenswrapper[4764]: I0309 14:09:26.330300 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eab144b6-e27c-4ffc-9dd5-6236ca12719f-inventory" (OuterVolumeSpecName: "inventory") pod "eab144b6-e27c-4ffc-9dd5-6236ca12719f" (UID: "eab144b6-e27c-4ffc-9dd5-6236ca12719f"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:09:26 crc kubenswrapper[4764]: I0309 14:09:26.335872 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eab144b6-e27c-4ffc-9dd5-6236ca12719f-nova-cell1-compute-config-3" (OuterVolumeSpecName: "nova-cell1-compute-config-3") pod "eab144b6-e27c-4ffc-9dd5-6236ca12719f" (UID: "eab144b6-e27c-4ffc-9dd5-6236ca12719f"). InnerVolumeSpecName "nova-cell1-compute-config-3". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:09:26 crc kubenswrapper[4764]: I0309 14:09:26.341702 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eab144b6-e27c-4ffc-9dd5-6236ca12719f-nova-cell1-compute-config-2" (OuterVolumeSpecName: "nova-cell1-compute-config-2") pod "eab144b6-e27c-4ffc-9dd5-6236ca12719f" (UID: "eab144b6-e27c-4ffc-9dd5-6236ca12719f"). InnerVolumeSpecName "nova-cell1-compute-config-2". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:09:26 crc kubenswrapper[4764]: I0309 14:09:26.341839 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eab144b6-e27c-4ffc-9dd5-6236ca12719f-nova-extra-config-0" (OuterVolumeSpecName: "nova-extra-config-0") pod "eab144b6-e27c-4ffc-9dd5-6236ca12719f" (UID: "eab144b6-e27c-4ffc-9dd5-6236ca12719f"). InnerVolumeSpecName "nova-extra-config-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 14:09:26 crc kubenswrapper[4764]: I0309 14:09:26.345110 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eab144b6-e27c-4ffc-9dd5-6236ca12719f-nova-cell1-compute-config-1" (OuterVolumeSpecName: "nova-cell1-compute-config-1") pod "eab144b6-e27c-4ffc-9dd5-6236ca12719f" (UID: "eab144b6-e27c-4ffc-9dd5-6236ca12719f"). InnerVolumeSpecName "nova-cell1-compute-config-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:09:26 crc kubenswrapper[4764]: I0309 14:09:26.345603 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eab144b6-e27c-4ffc-9dd5-6236ca12719f-nova-migration-ssh-key-1" (OuterVolumeSpecName: "nova-migration-ssh-key-1") pod "eab144b6-e27c-4ffc-9dd5-6236ca12719f" (UID: "eab144b6-e27c-4ffc-9dd5-6236ca12719f"). InnerVolumeSpecName "nova-migration-ssh-key-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:09:26 crc kubenswrapper[4764]: I0309 14:09:26.349543 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eab144b6-e27c-4ffc-9dd5-6236ca12719f-nova-migration-ssh-key-0" (OuterVolumeSpecName: "nova-migration-ssh-key-0") pod "eab144b6-e27c-4ffc-9dd5-6236ca12719f" (UID: "eab144b6-e27c-4ffc-9dd5-6236ca12719f"). InnerVolumeSpecName "nova-migration-ssh-key-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:09:26 crc kubenswrapper[4764]: I0309 14:09:26.391186 4764 reconciler_common.go:293] "Volume detached for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/eab144b6-e27c-4ffc-9dd5-6236ca12719f-nova-extra-config-0\") on node \"crc\" DevicePath \"\"" Mar 09 14:09:26 crc kubenswrapper[4764]: I0309 14:09:26.391237 4764 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/eab144b6-e27c-4ffc-9dd5-6236ca12719f-nova-cell1-compute-config-2\") on node \"crc\" DevicePath \"\"" Mar 09 14:09:26 crc kubenswrapper[4764]: I0309 14:09:26.391251 4764 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/eab144b6-e27c-4ffc-9dd5-6236ca12719f-inventory\") on node \"crc\" DevicePath \"\"" Mar 09 14:09:26 crc kubenswrapper[4764]: I0309 14:09:26.391260 4764 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/eab144b6-e27c-4ffc-9dd5-6236ca12719f-nova-cell1-compute-config-0\") on node \"crc\" DevicePath \"\"" Mar 09 14:09:26 crc kubenswrapper[4764]: I0309 14:09:26.391269 4764 reconciler_common.go:293] "Volume detached for volume \"nova-custom-ceph-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eab144b6-e27c-4ffc-9dd5-6236ca12719f-nova-custom-ceph-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 14:09:26 crc kubenswrapper[4764]: I0309 14:09:26.391282 4764 reconciler_common.go:293] "Volume detached for volume \"ceph-nova-0\" (UniqueName: \"kubernetes.io/configmap/eab144b6-e27c-4ffc-9dd5-6236ca12719f-ceph-nova-0\") on node \"crc\" DevicePath \"\"" Mar 09 14:09:26 crc kubenswrapper[4764]: I0309 14:09:26.391292 4764 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/eab144b6-e27c-4ffc-9dd5-6236ca12719f-nova-cell1-compute-config-1\") on 
node \"crc\" DevicePath \"\"" Mar 09 14:09:26 crc kubenswrapper[4764]: I0309 14:09:26.391302 4764 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/eab144b6-e27c-4ffc-9dd5-6236ca12719f-nova-cell1-compute-config-3\") on node \"crc\" DevicePath \"\"" Mar 09 14:09:26 crc kubenswrapper[4764]: I0309 14:09:26.391310 4764 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/eab144b6-e27c-4ffc-9dd5-6236ca12719f-nova-migration-ssh-key-0\") on node \"crc\" DevicePath \"\"" Mar 09 14:09:26 crc kubenswrapper[4764]: I0309 14:09:26.391319 4764 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/eab144b6-e27c-4ffc-9dd5-6236ca12719f-ceph\") on node \"crc\" DevicePath \"\"" Mar 09 14:09:26 crc kubenswrapper[4764]: I0309 14:09:26.391327 4764 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/eab144b6-e27c-4ffc-9dd5-6236ca12719f-nova-migration-ssh-key-1\") on node \"crc\" DevicePath \"\"" Mar 09 14:09:26 crc kubenswrapper[4764]: I0309 14:09:26.391358 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c6xn\" (UniqueName: \"kubernetes.io/projected/eab144b6-e27c-4ffc-9dd5-6236ca12719f-kube-api-access-7c6xn\") on node \"crc\" DevicePath \"\"" Mar 09 14:09:26 crc kubenswrapper[4764]: I0309 14:09:26.391370 4764 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/eab144b6-e27c-4ffc-9dd5-6236ca12719f-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 09 14:09:26 crc kubenswrapper[4764]: I0309 14:09:26.758078 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-5kztq" 
event={"ID":"eab144b6-e27c-4ffc-9dd5-6236ca12719f","Type":"ContainerDied","Data":"764af20acdc888d995ca2db68f3ab58c7fcd79d0559f4ce7bd21b3b2183c8e62"} Mar 09 14:09:26 crc kubenswrapper[4764]: I0309 14:09:26.758420 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="764af20acdc888d995ca2db68f3ab58c7fcd79d0559f4ce7bd21b3b2183c8e62" Mar 09 14:09:26 crc kubenswrapper[4764]: I0309 14:09:26.758135 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-5kztq" Mar 09 14:09:28 crc kubenswrapper[4764]: I0309 14:09:28.370390 4764 patch_prober.go:28] interesting pod/machine-config-daemon-xxczl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 09 14:09:28 crc kubenswrapper[4764]: I0309 14:09:28.370912 4764 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 09 14:09:40 crc kubenswrapper[4764]: I0309 14:09:40.985481 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-volume-volume1-0"] Mar 09 14:09:40 crc kubenswrapper[4764]: E0309 14:09:40.986933 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5f47d85f-63cb-45ce-b935-1b2a534523dc" containerName="extract-content" Mar 09 14:09:40 crc kubenswrapper[4764]: I0309 14:09:40.986954 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="5f47d85f-63cb-45ce-b935-1b2a534523dc" containerName="extract-content" Mar 09 14:09:40 crc kubenswrapper[4764]: E0309 14:09:40.986965 4764 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="5f47d85f-63cb-45ce-b935-1b2a534523dc" containerName="extract-utilities" Mar 09 14:09:40 crc kubenswrapper[4764]: I0309 14:09:40.986971 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="5f47d85f-63cb-45ce-b935-1b2a534523dc" containerName="extract-utilities" Mar 09 14:09:40 crc kubenswrapper[4764]: E0309 14:09:40.986987 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eab144b6-e27c-4ffc-9dd5-6236ca12719f" containerName="nova-custom-ceph-edpm-deployment-openstack-edpm-ipam" Mar 09 14:09:40 crc kubenswrapper[4764]: I0309 14:09:40.986999 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="eab144b6-e27c-4ffc-9dd5-6236ca12719f" containerName="nova-custom-ceph-edpm-deployment-openstack-edpm-ipam" Mar 09 14:09:40 crc kubenswrapper[4764]: E0309 14:09:40.987026 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5f47d85f-63cb-45ce-b935-1b2a534523dc" containerName="registry-server" Mar 09 14:09:40 crc kubenswrapper[4764]: I0309 14:09:40.987034 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="5f47d85f-63cb-45ce-b935-1b2a534523dc" containerName="registry-server" Mar 09 14:09:40 crc kubenswrapper[4764]: E0309 14:09:40.987046 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d4b9b0a7-b93e-402d-9b7a-3ce1815d62dc" containerName="extract-content" Mar 09 14:09:40 crc kubenswrapper[4764]: I0309 14:09:40.987052 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="d4b9b0a7-b93e-402d-9b7a-3ce1815d62dc" containerName="extract-content" Mar 09 14:09:40 crc kubenswrapper[4764]: E0309 14:09:40.987069 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d4b9b0a7-b93e-402d-9b7a-3ce1815d62dc" containerName="registry-server" Mar 09 14:09:40 crc kubenswrapper[4764]: I0309 14:09:40.987076 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="d4b9b0a7-b93e-402d-9b7a-3ce1815d62dc" containerName="registry-server" Mar 09 14:09:40 crc kubenswrapper[4764]: 
E0309 14:09:40.987089 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d4b9b0a7-b93e-402d-9b7a-3ce1815d62dc" containerName="extract-utilities" Mar 09 14:09:40 crc kubenswrapper[4764]: I0309 14:09:40.987096 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="d4b9b0a7-b93e-402d-9b7a-3ce1815d62dc" containerName="extract-utilities" Mar 09 14:09:40 crc kubenswrapper[4764]: I0309 14:09:40.987317 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="eab144b6-e27c-4ffc-9dd5-6236ca12719f" containerName="nova-custom-ceph-edpm-deployment-openstack-edpm-ipam" Mar 09 14:09:40 crc kubenswrapper[4764]: I0309 14:09:40.987336 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="d4b9b0a7-b93e-402d-9b7a-3ce1815d62dc" containerName="registry-server" Mar 09 14:09:40 crc kubenswrapper[4764]: I0309 14:09:40.987348 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="5f47d85f-63cb-45ce-b935-1b2a534523dc" containerName="registry-server" Mar 09 14:09:40 crc kubenswrapper[4764]: I0309 14:09:40.988460 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-volume-volume1-0" Mar 09 14:09:40 crc kubenswrapper[4764]: I0309 14:09:40.991240 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Mar 09 14:09:40 crc kubenswrapper[4764]: I0309 14:09:40.992147 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-volume-volume1-config-data" Mar 09 14:09:41 crc kubenswrapper[4764]: I0309 14:09:41.007226 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-volume-volume1-0"] Mar 09 14:09:41 crc kubenswrapper[4764]: I0309 14:09:41.054280 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-backup-0"] Mar 09 14:09:41 crc kubenswrapper[4764]: I0309 14:09:41.056218 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-backup-0" Mar 09 14:09:41 crc kubenswrapper[4764]: I0309 14:09:41.065175 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-backup-config-data" Mar 09 14:09:41 crc kubenswrapper[4764]: I0309 14:09:41.068807 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-backup-0"] Mar 09 14:09:41 crc kubenswrapper[4764]: I0309 14:09:41.132586 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b2kwv\" (UniqueName: \"kubernetes.io/projected/9aaa370f-a3d5-4fce-9761-873aeb8d7b1f-kube-api-access-b2kwv\") pod \"cinder-volume-volume1-0\" (UID: \"9aaa370f-a3d5-4fce-9761-873aeb8d7b1f\") " pod="openstack/cinder-volume-volume1-0" Mar 09 14:09:41 crc kubenswrapper[4764]: I0309 14:09:41.132752 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9aaa370f-a3d5-4fce-9761-873aeb8d7b1f-combined-ca-bundle\") pod \"cinder-volume-volume1-0\" (UID: \"9aaa370f-a3d5-4fce-9761-873aeb8d7b1f\") " pod="openstack/cinder-volume-volume1-0" Mar 09 14:09:41 crc kubenswrapper[4764]: I0309 14:09:41.132806 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9aaa370f-a3d5-4fce-9761-873aeb8d7b1f-etc-machine-id\") pod \"cinder-volume-volume1-0\" (UID: \"9aaa370f-a3d5-4fce-9761-873aeb8d7b1f\") " pod="openstack/cinder-volume-volume1-0" Mar 09 14:09:41 crc kubenswrapper[4764]: I0309 14:09:41.132834 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9aaa370f-a3d5-4fce-9761-873aeb8d7b1f-scripts\") pod \"cinder-volume-volume1-0\" (UID: \"9aaa370f-a3d5-4fce-9761-873aeb8d7b1f\") " pod="openstack/cinder-volume-volume1-0" Mar 09 14:09:41 crc 
kubenswrapper[4764]: I0309 14:09:41.132885 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/9aaa370f-a3d5-4fce-9761-873aeb8d7b1f-ceph\") pod \"cinder-volume-volume1-0\" (UID: \"9aaa370f-a3d5-4fce-9761-873aeb8d7b1f\") " pod="openstack/cinder-volume-volume1-0" Mar 09 14:09:41 crc kubenswrapper[4764]: I0309 14:09:41.132918 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/9aaa370f-a3d5-4fce-9761-873aeb8d7b1f-var-lib-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"9aaa370f-a3d5-4fce-9761-873aeb8d7b1f\") " pod="openstack/cinder-volume-volume1-0" Mar 09 14:09:41 crc kubenswrapper[4764]: I0309 14:09:41.132978 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/9aaa370f-a3d5-4fce-9761-873aeb8d7b1f-etc-nvme\") pod \"cinder-volume-volume1-0\" (UID: \"9aaa370f-a3d5-4fce-9761-873aeb8d7b1f\") " pod="openstack/cinder-volume-volume1-0" Mar 09 14:09:41 crc kubenswrapper[4764]: I0309 14:09:41.133013 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/9aaa370f-a3d5-4fce-9761-873aeb8d7b1f-run\") pod \"cinder-volume-volume1-0\" (UID: \"9aaa370f-a3d5-4fce-9761-873aeb8d7b1f\") " pod="openstack/cinder-volume-volume1-0" Mar 09 14:09:41 crc kubenswrapper[4764]: I0309 14:09:41.133049 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/9aaa370f-a3d5-4fce-9761-873aeb8d7b1f-var-locks-brick\") pod \"cinder-volume-volume1-0\" (UID: \"9aaa370f-a3d5-4fce-9761-873aeb8d7b1f\") " pod="openstack/cinder-volume-volume1-0" Mar 09 14:09:41 crc kubenswrapper[4764]: I0309 14:09:41.133073 4764 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/9aaa370f-a3d5-4fce-9761-873aeb8d7b1f-etc-iscsi\") pod \"cinder-volume-volume1-0\" (UID: \"9aaa370f-a3d5-4fce-9761-873aeb8d7b1f\") " pod="openstack/cinder-volume-volume1-0" Mar 09 14:09:41 crc kubenswrapper[4764]: I0309 14:09:41.133093 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/9aaa370f-a3d5-4fce-9761-873aeb8d7b1f-sys\") pod \"cinder-volume-volume1-0\" (UID: \"9aaa370f-a3d5-4fce-9761-873aeb8d7b1f\") " pod="openstack/cinder-volume-volume1-0" Mar 09 14:09:41 crc kubenswrapper[4764]: I0309 14:09:41.133137 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/9aaa370f-a3d5-4fce-9761-873aeb8d7b1f-lib-modules\") pod \"cinder-volume-volume1-0\" (UID: \"9aaa370f-a3d5-4fce-9761-873aeb8d7b1f\") " pod="openstack/cinder-volume-volume1-0" Mar 09 14:09:41 crc kubenswrapper[4764]: I0309 14:09:41.133211 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9aaa370f-a3d5-4fce-9761-873aeb8d7b1f-config-data\") pod \"cinder-volume-volume1-0\" (UID: \"9aaa370f-a3d5-4fce-9761-873aeb8d7b1f\") " pod="openstack/cinder-volume-volume1-0" Mar 09 14:09:41 crc kubenswrapper[4764]: I0309 14:09:41.133238 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/9aaa370f-a3d5-4fce-9761-873aeb8d7b1f-var-locks-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"9aaa370f-a3d5-4fce-9761-873aeb8d7b1f\") " pod="openstack/cinder-volume-volume1-0" Mar 09 14:09:41 crc kubenswrapper[4764]: I0309 14:09:41.133286 4764 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/9aaa370f-a3d5-4fce-9761-873aeb8d7b1f-dev\") pod \"cinder-volume-volume1-0\" (UID: \"9aaa370f-a3d5-4fce-9761-873aeb8d7b1f\") " pod="openstack/cinder-volume-volume1-0" Mar 09 14:09:41 crc kubenswrapper[4764]: I0309 14:09:41.133308 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9aaa370f-a3d5-4fce-9761-873aeb8d7b1f-config-data-custom\") pod \"cinder-volume-volume1-0\" (UID: \"9aaa370f-a3d5-4fce-9761-873aeb8d7b1f\") " pod="openstack/cinder-volume-volume1-0" Mar 09 14:09:41 crc kubenswrapper[4764]: I0309 14:09:41.235743 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9aaa370f-a3d5-4fce-9761-873aeb8d7b1f-combined-ca-bundle\") pod \"cinder-volume-volume1-0\" (UID: \"9aaa370f-a3d5-4fce-9761-873aeb8d7b1f\") " pod="openstack/cinder-volume-volume1-0" Mar 09 14:09:41 crc kubenswrapper[4764]: I0309 14:09:41.235823 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9aaa370f-a3d5-4fce-9761-873aeb8d7b1f-etc-machine-id\") pod \"cinder-volume-volume1-0\" (UID: \"9aaa370f-a3d5-4fce-9761-873aeb8d7b1f\") " pod="openstack/cinder-volume-volume1-0" Mar 09 14:09:41 crc kubenswrapper[4764]: I0309 14:09:41.235848 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9aaa370f-a3d5-4fce-9761-873aeb8d7b1f-scripts\") pod \"cinder-volume-volume1-0\" (UID: \"9aaa370f-a3d5-4fce-9761-873aeb8d7b1f\") " pod="openstack/cinder-volume-volume1-0" Mar 09 14:09:41 crc kubenswrapper[4764]: I0309 14:09:41.235874 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: 
\"kubernetes.io/projected/9aaa370f-a3d5-4fce-9761-873aeb8d7b1f-ceph\") pod \"cinder-volume-volume1-0\" (UID: \"9aaa370f-a3d5-4fce-9761-873aeb8d7b1f\") " pod="openstack/cinder-volume-volume1-0" Mar 09 14:09:41 crc kubenswrapper[4764]: I0309 14:09:41.235903 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/9aaa370f-a3d5-4fce-9761-873aeb8d7b1f-var-lib-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"9aaa370f-a3d5-4fce-9761-873aeb8d7b1f\") " pod="openstack/cinder-volume-volume1-0" Mar 09 14:09:41 crc kubenswrapper[4764]: I0309 14:09:41.235946 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/9aaa370f-a3d5-4fce-9761-873aeb8d7b1f-etc-nvme\") pod \"cinder-volume-volume1-0\" (UID: \"9aaa370f-a3d5-4fce-9761-873aeb8d7b1f\") " pod="openstack/cinder-volume-volume1-0" Mar 09 14:09:41 crc kubenswrapper[4764]: I0309 14:09:41.235987 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/9aaa370f-a3d5-4fce-9761-873aeb8d7b1f-run\") pod \"cinder-volume-volume1-0\" (UID: \"9aaa370f-a3d5-4fce-9761-873aeb8d7b1f\") " pod="openstack/cinder-volume-volume1-0" Mar 09 14:09:41 crc kubenswrapper[4764]: I0309 14:09:41.236007 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/9aaa370f-a3d5-4fce-9761-873aeb8d7b1f-var-locks-brick\") pod \"cinder-volume-volume1-0\" (UID: \"9aaa370f-a3d5-4fce-9761-873aeb8d7b1f\") " pod="openstack/cinder-volume-volume1-0" Mar 09 14:09:41 crc kubenswrapper[4764]: I0309 14:09:41.236032 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/9aaa370f-a3d5-4fce-9761-873aeb8d7b1f-etc-iscsi\") pod \"cinder-volume-volume1-0\" (UID: 
\"9aaa370f-a3d5-4fce-9761-873aeb8d7b1f\") " pod="openstack/cinder-volume-volume1-0" Mar 09 14:09:41 crc kubenswrapper[4764]: I0309 14:09:41.236058 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/9aaa370f-a3d5-4fce-9761-873aeb8d7b1f-sys\") pod \"cinder-volume-volume1-0\" (UID: \"9aaa370f-a3d5-4fce-9761-873aeb8d7b1f\") " pod="openstack/cinder-volume-volume1-0" Mar 09 14:09:41 crc kubenswrapper[4764]: I0309 14:09:41.236087 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/9aaa370f-a3d5-4fce-9761-873aeb8d7b1f-lib-modules\") pod \"cinder-volume-volume1-0\" (UID: \"9aaa370f-a3d5-4fce-9761-873aeb8d7b1f\") " pod="openstack/cinder-volume-volume1-0" Mar 09 14:09:41 crc kubenswrapper[4764]: I0309 14:09:41.236162 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9aaa370f-a3d5-4fce-9761-873aeb8d7b1f-config-data\") pod \"cinder-volume-volume1-0\" (UID: \"9aaa370f-a3d5-4fce-9761-873aeb8d7b1f\") " pod="openstack/cinder-volume-volume1-0" Mar 09 14:09:41 crc kubenswrapper[4764]: I0309 14:09:41.236193 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/9aaa370f-a3d5-4fce-9761-873aeb8d7b1f-var-locks-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"9aaa370f-a3d5-4fce-9761-873aeb8d7b1f\") " pod="openstack/cinder-volume-volume1-0" Mar 09 14:09:41 crc kubenswrapper[4764]: I0309 14:09:41.236230 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/9aaa370f-a3d5-4fce-9761-873aeb8d7b1f-dev\") pod \"cinder-volume-volume1-0\" (UID: \"9aaa370f-a3d5-4fce-9761-873aeb8d7b1f\") " pod="openstack/cinder-volume-volume1-0" Mar 09 14:09:41 crc kubenswrapper[4764]: I0309 14:09:41.236257 4764 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9aaa370f-a3d5-4fce-9761-873aeb8d7b1f-config-data-custom\") pod \"cinder-volume-volume1-0\" (UID: \"9aaa370f-a3d5-4fce-9761-873aeb8d7b1f\") " pod="openstack/cinder-volume-volume1-0" Mar 09 14:09:41 crc kubenswrapper[4764]: I0309 14:09:41.236301 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b2kwv\" (UniqueName: \"kubernetes.io/projected/9aaa370f-a3d5-4fce-9761-873aeb8d7b1f-kube-api-access-b2kwv\") pod \"cinder-volume-volume1-0\" (UID: \"9aaa370f-a3d5-4fce-9761-873aeb8d7b1f\") " pod="openstack/cinder-volume-volume1-0" Mar 09 14:09:41 crc kubenswrapper[4764]: I0309 14:09:41.243984 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9aaa370f-a3d5-4fce-9761-873aeb8d7b1f-combined-ca-bundle\") pod \"cinder-volume-volume1-0\" (UID: \"9aaa370f-a3d5-4fce-9761-873aeb8d7b1f\") " pod="openstack/cinder-volume-volume1-0" Mar 09 14:09:41 crc kubenswrapper[4764]: I0309 14:09:41.244060 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9aaa370f-a3d5-4fce-9761-873aeb8d7b1f-etc-machine-id\") pod \"cinder-volume-volume1-0\" (UID: \"9aaa370f-a3d5-4fce-9761-873aeb8d7b1f\") " pod="openstack/cinder-volume-volume1-0" Mar 09 14:09:41 crc kubenswrapper[4764]: I0309 14:09:41.246453 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9aaa370f-a3d5-4fce-9761-873aeb8d7b1f-scripts\") pod \"cinder-volume-volume1-0\" (UID: \"9aaa370f-a3d5-4fce-9761-873aeb8d7b1f\") " pod="openstack/cinder-volume-volume1-0" Mar 09 14:09:41 crc kubenswrapper[4764]: I0309 14:09:41.248567 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: 
\"kubernetes.io/projected/9aaa370f-a3d5-4fce-9761-873aeb8d7b1f-ceph\") pod \"cinder-volume-volume1-0\" (UID: \"9aaa370f-a3d5-4fce-9761-873aeb8d7b1f\") " pod="openstack/cinder-volume-volume1-0" Mar 09 14:09:41 crc kubenswrapper[4764]: I0309 14:09:41.248792 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/9aaa370f-a3d5-4fce-9761-873aeb8d7b1f-var-lib-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"9aaa370f-a3d5-4fce-9761-873aeb8d7b1f\") " pod="openstack/cinder-volume-volume1-0" Mar 09 14:09:41 crc kubenswrapper[4764]: I0309 14:09:41.248864 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/9aaa370f-a3d5-4fce-9761-873aeb8d7b1f-etc-nvme\") pod \"cinder-volume-volume1-0\" (UID: \"9aaa370f-a3d5-4fce-9761-873aeb8d7b1f\") " pod="openstack/cinder-volume-volume1-0" Mar 09 14:09:41 crc kubenswrapper[4764]: I0309 14:09:41.248887 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/9aaa370f-a3d5-4fce-9761-873aeb8d7b1f-run\") pod \"cinder-volume-volume1-0\" (UID: \"9aaa370f-a3d5-4fce-9761-873aeb8d7b1f\") " pod="openstack/cinder-volume-volume1-0" Mar 09 14:09:41 crc kubenswrapper[4764]: I0309 14:09:41.249101 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/9aaa370f-a3d5-4fce-9761-873aeb8d7b1f-var-locks-brick\") pod \"cinder-volume-volume1-0\" (UID: \"9aaa370f-a3d5-4fce-9761-873aeb8d7b1f\") " pod="openstack/cinder-volume-volume1-0" Mar 09 14:09:41 crc kubenswrapper[4764]: I0309 14:09:41.249153 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/9aaa370f-a3d5-4fce-9761-873aeb8d7b1f-etc-iscsi\") pod \"cinder-volume-volume1-0\" (UID: \"9aaa370f-a3d5-4fce-9761-873aeb8d7b1f\") " 
pod="openstack/cinder-volume-volume1-0" Mar 09 14:09:41 crc kubenswrapper[4764]: I0309 14:09:41.249178 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/9aaa370f-a3d5-4fce-9761-873aeb8d7b1f-sys\") pod \"cinder-volume-volume1-0\" (UID: \"9aaa370f-a3d5-4fce-9761-873aeb8d7b1f\") " pod="openstack/cinder-volume-volume1-0" Mar 09 14:09:41 crc kubenswrapper[4764]: I0309 14:09:41.249199 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/9aaa370f-a3d5-4fce-9761-873aeb8d7b1f-lib-modules\") pod \"cinder-volume-volume1-0\" (UID: \"9aaa370f-a3d5-4fce-9761-873aeb8d7b1f\") " pod="openstack/cinder-volume-volume1-0" Mar 09 14:09:41 crc kubenswrapper[4764]: I0309 14:09:41.252928 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9aaa370f-a3d5-4fce-9761-873aeb8d7b1f-config-data\") pod \"cinder-volume-volume1-0\" (UID: \"9aaa370f-a3d5-4fce-9761-873aeb8d7b1f\") " pod="openstack/cinder-volume-volume1-0" Mar 09 14:09:41 crc kubenswrapper[4764]: I0309 14:09:41.253074 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/9aaa370f-a3d5-4fce-9761-873aeb8d7b1f-var-locks-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"9aaa370f-a3d5-4fce-9761-873aeb8d7b1f\") " pod="openstack/cinder-volume-volume1-0" Mar 09 14:09:41 crc kubenswrapper[4764]: I0309 14:09:41.253113 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/9aaa370f-a3d5-4fce-9761-873aeb8d7b1f-dev\") pod \"cinder-volume-volume1-0\" (UID: \"9aaa370f-a3d5-4fce-9761-873aeb8d7b1f\") " pod="openstack/cinder-volume-volume1-0" Mar 09 14:09:41 crc kubenswrapper[4764]: I0309 14:09:41.256006 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" 
(UniqueName: \"kubernetes.io/secret/9aaa370f-a3d5-4fce-9761-873aeb8d7b1f-config-data-custom\") pod \"cinder-volume-volume1-0\" (UID: \"9aaa370f-a3d5-4fce-9761-873aeb8d7b1f\") " pod="openstack/cinder-volume-volume1-0" Mar 09 14:09:41 crc kubenswrapper[4764]: I0309 14:09:41.260132 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b2kwv\" (UniqueName: \"kubernetes.io/projected/9aaa370f-a3d5-4fce-9761-873aeb8d7b1f-kube-api-access-b2kwv\") pod \"cinder-volume-volume1-0\" (UID: \"9aaa370f-a3d5-4fce-9761-873aeb8d7b1f\") " pod="openstack/cinder-volume-volume1-0" Mar 09 14:09:41 crc kubenswrapper[4764]: I0309 14:09:41.345148 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-volume-volume1-0" Mar 09 14:09:42 crc kubenswrapper[4764]: I0309 14:09:42.022312 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/e6a8f674-82eb-4474-973d-54a90e5fd1e0-run\") pod \"cinder-backup-0\" (UID: \"e6a8f674-82eb-4474-973d-54a90e5fd1e0\") " pod="openstack/cinder-backup-0" Mar 09 14:09:42 crc kubenswrapper[4764]: I0309 14:09:42.023319 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nw5m9\" (UniqueName: \"kubernetes.io/projected/e6a8f674-82eb-4474-973d-54a90e5fd1e0-kube-api-access-nw5m9\") pod \"cinder-backup-0\" (UID: \"e6a8f674-82eb-4474-973d-54a90e5fd1e0\") " pod="openstack/cinder-backup-0" Mar 09 14:09:42 crc kubenswrapper[4764]: I0309 14:09:42.023487 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/e6a8f674-82eb-4474-973d-54a90e5fd1e0-dev\") pod \"cinder-backup-0\" (UID: \"e6a8f674-82eb-4474-973d-54a90e5fd1e0\") " pod="openstack/cinder-backup-0" Mar 09 14:09:42 crc kubenswrapper[4764]: I0309 14:09:42.057394 4764 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/e6a8f674-82eb-4474-973d-54a90e5fd1e0-ceph\") pod \"cinder-backup-0\" (UID: \"e6a8f674-82eb-4474-973d-54a90e5fd1e0\") " pod="openstack/cinder-backup-0" Mar 09 14:09:42 crc kubenswrapper[4764]: I0309 14:09:42.249484 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/e6a8f674-82eb-4474-973d-54a90e5fd1e0-etc-nvme\") pod \"cinder-backup-0\" (UID: \"e6a8f674-82eb-4474-973d-54a90e5fd1e0\") " pod="openstack/cinder-backup-0" Mar 09 14:09:42 crc kubenswrapper[4764]: I0309 14:09:42.253528 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e6a8f674-82eb-4474-973d-54a90e5fd1e0-etc-machine-id\") pod \"cinder-backup-0\" (UID: \"e6a8f674-82eb-4474-973d-54a90e5fd1e0\") " pod="openstack/cinder-backup-0" Mar 09 14:09:42 crc kubenswrapper[4764]: I0309 14:09:42.253611 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e6a8f674-82eb-4474-973d-54a90e5fd1e0-scripts\") pod \"cinder-backup-0\" (UID: \"e6a8f674-82eb-4474-973d-54a90e5fd1e0\") " pod="openstack/cinder-backup-0" Mar 09 14:09:42 crc kubenswrapper[4764]: I0309 14:09:42.253641 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/e6a8f674-82eb-4474-973d-54a90e5fd1e0-var-locks-cinder\") pod \"cinder-backup-0\" (UID: \"e6a8f674-82eb-4474-973d-54a90e5fd1e0\") " pod="openstack/cinder-backup-0" Mar 09 14:09:42 crc kubenswrapper[4764]: I0309 14:09:42.253797 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: 
\"kubernetes.io/host-path/e6a8f674-82eb-4474-973d-54a90e5fd1e0-etc-iscsi\") pod \"cinder-backup-0\" (UID: \"e6a8f674-82eb-4474-973d-54a90e5fd1e0\") " pod="openstack/cinder-backup-0" Mar 09 14:09:42 crc kubenswrapper[4764]: I0309 14:09:42.253868 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e6a8f674-82eb-4474-973d-54a90e5fd1e0-config-data-custom\") pod \"cinder-backup-0\" (UID: \"e6a8f674-82eb-4474-973d-54a90e5fd1e0\") " pod="openstack/cinder-backup-0" Mar 09 14:09:42 crc kubenswrapper[4764]: I0309 14:09:42.253918 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/e6a8f674-82eb-4474-973d-54a90e5fd1e0-var-locks-brick\") pod \"cinder-backup-0\" (UID: \"e6a8f674-82eb-4474-973d-54a90e5fd1e0\") " pod="openstack/cinder-backup-0" Mar 09 14:09:42 crc kubenswrapper[4764]: I0309 14:09:42.253949 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e6a8f674-82eb-4474-973d-54a90e5fd1e0-config-data\") pod \"cinder-backup-0\" (UID: \"e6a8f674-82eb-4474-973d-54a90e5fd1e0\") " pod="openstack/cinder-backup-0" Mar 09 14:09:42 crc kubenswrapper[4764]: I0309 14:09:42.254130 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/e6a8f674-82eb-4474-973d-54a90e5fd1e0-var-lib-cinder\") pod \"cinder-backup-0\" (UID: \"e6a8f674-82eb-4474-973d-54a90e5fd1e0\") " pod="openstack/cinder-backup-0" Mar 09 14:09:42 crc kubenswrapper[4764]: I0309 14:09:42.254210 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6a8f674-82eb-4474-973d-54a90e5fd1e0-combined-ca-bundle\") pod 
\"cinder-backup-0\" (UID: \"e6a8f674-82eb-4474-973d-54a90e5fd1e0\") " pod="openstack/cinder-backup-0" Mar 09 14:09:42 crc kubenswrapper[4764]: I0309 14:09:42.254253 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/e6a8f674-82eb-4474-973d-54a90e5fd1e0-lib-modules\") pod \"cinder-backup-0\" (UID: \"e6a8f674-82eb-4474-973d-54a90e5fd1e0\") " pod="openstack/cinder-backup-0" Mar 09 14:09:42 crc kubenswrapper[4764]: I0309 14:09:42.254290 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/e6a8f674-82eb-4474-973d-54a90e5fd1e0-sys\") pod \"cinder-backup-0\" (UID: \"e6a8f674-82eb-4474-973d-54a90e5fd1e0\") " pod="openstack/cinder-backup-0" Mar 09 14:09:42 crc kubenswrapper[4764]: I0309 14:09:42.287092 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 09 14:09:42 crc kubenswrapper[4764]: I0309 14:09:42.289584 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 09 14:09:42 crc kubenswrapper[4764]: I0309 14:09:42.312577 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-8p7c7" Mar 09 14:09:42 crc kubenswrapper[4764]: I0309 14:09:42.313016 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Mar 09 14:09:42 crc kubenswrapper[4764]: I0309 14:09:42.313212 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Mar 09 14:09:42 crc kubenswrapper[4764]: I0309 14:09:42.313406 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Mar 09 14:09:42 crc kubenswrapper[4764]: I0309 14:09:42.323307 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Mar 09 14:09:42 crc kubenswrapper[4764]: I0309 14:09:42.338432 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 09 14:09:42 crc kubenswrapper[4764]: I0309 14:09:42.339583 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 09 14:09:42 crc kubenswrapper[4764]: I0309 14:09:42.352489 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Mar 09 14:09:42 crc kubenswrapper[4764]: I0309 14:09:42.353037 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Mar 09 14:09:42 crc kubenswrapper[4764]: I0309 14:09:42.356549 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/e6a8f674-82eb-4474-973d-54a90e5fd1e0-etc-nvme\") pod \"cinder-backup-0\" (UID: \"e6a8f674-82eb-4474-973d-54a90e5fd1e0\") " pod="openstack/cinder-backup-0" Mar 09 14:09:42 crc kubenswrapper[4764]: I0309 14:09:42.356659 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e6a8f674-82eb-4474-973d-54a90e5fd1e0-etc-machine-id\") pod \"cinder-backup-0\" (UID: \"e6a8f674-82eb-4474-973d-54a90e5fd1e0\") " pod="openstack/cinder-backup-0" Mar 09 14:09:42 crc kubenswrapper[4764]: I0309 14:09:42.356695 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e6a8f674-82eb-4474-973d-54a90e5fd1e0-scripts\") pod \"cinder-backup-0\" (UID: \"e6a8f674-82eb-4474-973d-54a90e5fd1e0\") " pod="openstack/cinder-backup-0" Mar 09 14:09:42 crc kubenswrapper[4764]: I0309 14:09:42.356721 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/e6a8f674-82eb-4474-973d-54a90e5fd1e0-var-locks-cinder\") pod \"cinder-backup-0\" (UID: \"e6a8f674-82eb-4474-973d-54a90e5fd1e0\") " 
pod="openstack/cinder-backup-0" Mar 09 14:09:42 crc kubenswrapper[4764]: I0309 14:09:42.356788 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/e6a8f674-82eb-4474-973d-54a90e5fd1e0-etc-iscsi\") pod \"cinder-backup-0\" (UID: \"e6a8f674-82eb-4474-973d-54a90e5fd1e0\") " pod="openstack/cinder-backup-0" Mar 09 14:09:42 crc kubenswrapper[4764]: I0309 14:09:42.356818 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e6a8f674-82eb-4474-973d-54a90e5fd1e0-config-data-custom\") pod \"cinder-backup-0\" (UID: \"e6a8f674-82eb-4474-973d-54a90e5fd1e0\") " pod="openstack/cinder-backup-0" Mar 09 14:09:42 crc kubenswrapper[4764]: I0309 14:09:42.356849 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/e6a8f674-82eb-4474-973d-54a90e5fd1e0-var-locks-brick\") pod \"cinder-backup-0\" (UID: \"e6a8f674-82eb-4474-973d-54a90e5fd1e0\") " pod="openstack/cinder-backup-0" Mar 09 14:09:42 crc kubenswrapper[4764]: I0309 14:09:42.356874 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e6a8f674-82eb-4474-973d-54a90e5fd1e0-config-data\") pod \"cinder-backup-0\" (UID: \"e6a8f674-82eb-4474-973d-54a90e5fd1e0\") " pod="openstack/cinder-backup-0" Mar 09 14:09:42 crc kubenswrapper[4764]: I0309 14:09:42.356938 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/e6a8f674-82eb-4474-973d-54a90e5fd1e0-var-lib-cinder\") pod \"cinder-backup-0\" (UID: \"e6a8f674-82eb-4474-973d-54a90e5fd1e0\") " pod="openstack/cinder-backup-0" Mar 09 14:09:42 crc kubenswrapper[4764]: I0309 14:09:42.356976 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6a8f674-82eb-4474-973d-54a90e5fd1e0-combined-ca-bundle\") pod \"cinder-backup-0\" (UID: \"e6a8f674-82eb-4474-973d-54a90e5fd1e0\") " pod="openstack/cinder-backup-0" Mar 09 14:09:42 crc kubenswrapper[4764]: I0309 14:09:42.357009 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/e6a8f674-82eb-4474-973d-54a90e5fd1e0-lib-modules\") pod \"cinder-backup-0\" (UID: \"e6a8f674-82eb-4474-973d-54a90e5fd1e0\") " pod="openstack/cinder-backup-0" Mar 09 14:09:42 crc kubenswrapper[4764]: I0309 14:09:42.357788 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/e6a8f674-82eb-4474-973d-54a90e5fd1e0-sys\") pod \"cinder-backup-0\" (UID: \"e6a8f674-82eb-4474-973d-54a90e5fd1e0\") " pod="openstack/cinder-backup-0" Mar 09 14:09:42 crc kubenswrapper[4764]: I0309 14:09:42.357889 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/e6a8f674-82eb-4474-973d-54a90e5fd1e0-run\") pod \"cinder-backup-0\" (UID: \"e6a8f674-82eb-4474-973d-54a90e5fd1e0\") " pod="openstack/cinder-backup-0" Mar 09 14:09:42 crc kubenswrapper[4764]: I0309 14:09:42.357949 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nw5m9\" (UniqueName: \"kubernetes.io/projected/e6a8f674-82eb-4474-973d-54a90e5fd1e0-kube-api-access-nw5m9\") pod \"cinder-backup-0\" (UID: \"e6a8f674-82eb-4474-973d-54a90e5fd1e0\") " pod="openstack/cinder-backup-0" Mar 09 14:09:42 crc kubenswrapper[4764]: I0309 14:09:42.358020 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/e6a8f674-82eb-4474-973d-54a90e5fd1e0-dev\") pod \"cinder-backup-0\" (UID: \"e6a8f674-82eb-4474-973d-54a90e5fd1e0\") " pod="openstack/cinder-backup-0" Mar 09 14:09:42 
crc kubenswrapper[4764]: I0309 14:09:42.358081 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/e6a8f674-82eb-4474-973d-54a90e5fd1e0-ceph\") pod \"cinder-backup-0\" (UID: \"e6a8f674-82eb-4474-973d-54a90e5fd1e0\") " pod="openstack/cinder-backup-0" Mar 09 14:09:42 crc kubenswrapper[4764]: I0309 14:09:42.362108 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/e6a8f674-82eb-4474-973d-54a90e5fd1e0-var-lib-cinder\") pod \"cinder-backup-0\" (UID: \"e6a8f674-82eb-4474-973d-54a90e5fd1e0\") " pod="openstack/cinder-backup-0" Mar 09 14:09:42 crc kubenswrapper[4764]: I0309 14:09:42.362774 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/e6a8f674-82eb-4474-973d-54a90e5fd1e0-run\") pod \"cinder-backup-0\" (UID: \"e6a8f674-82eb-4474-973d-54a90e5fd1e0\") " pod="openstack/cinder-backup-0" Mar 09 14:09:42 crc kubenswrapper[4764]: I0309 14:09:42.368246 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/e6a8f674-82eb-4474-973d-54a90e5fd1e0-var-locks-brick\") pod \"cinder-backup-0\" (UID: \"e6a8f674-82eb-4474-973d-54a90e5fd1e0\") " pod="openstack/cinder-backup-0" Mar 09 14:09:42 crc kubenswrapper[4764]: I0309 14:09:42.368312 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/e6a8f674-82eb-4474-973d-54a90e5fd1e0-etc-iscsi\") pod \"cinder-backup-0\" (UID: \"e6a8f674-82eb-4474-973d-54a90e5fd1e0\") " pod="openstack/cinder-backup-0" Mar 09 14:09:42 crc kubenswrapper[4764]: I0309 14:09:42.368333 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/e6a8f674-82eb-4474-973d-54a90e5fd1e0-dev\") pod \"cinder-backup-0\" (UID: 
\"e6a8f674-82eb-4474-973d-54a90e5fd1e0\") " pod="openstack/cinder-backup-0" Mar 09 14:09:42 crc kubenswrapper[4764]: I0309 14:09:42.368527 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/e6a8f674-82eb-4474-973d-54a90e5fd1e0-lib-modules\") pod \"cinder-backup-0\" (UID: \"e6a8f674-82eb-4474-973d-54a90e5fd1e0\") " pod="openstack/cinder-backup-0" Mar 09 14:09:42 crc kubenswrapper[4764]: I0309 14:09:42.368558 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/e6a8f674-82eb-4474-973d-54a90e5fd1e0-sys\") pod \"cinder-backup-0\" (UID: \"e6a8f674-82eb-4474-973d-54a90e5fd1e0\") " pod="openstack/cinder-backup-0" Mar 09 14:09:42 crc kubenswrapper[4764]: I0309 14:09:42.368612 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/e6a8f674-82eb-4474-973d-54a90e5fd1e0-etc-nvme\") pod \"cinder-backup-0\" (UID: \"e6a8f674-82eb-4474-973d-54a90e5fd1e0\") " pod="openstack/cinder-backup-0" Mar 09 14:09:42 crc kubenswrapper[4764]: I0309 14:09:42.368674 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/e6a8f674-82eb-4474-973d-54a90e5fd1e0-var-locks-cinder\") pod \"cinder-backup-0\" (UID: \"e6a8f674-82eb-4474-973d-54a90e5fd1e0\") " pod="openstack/cinder-backup-0" Mar 09 14:09:42 crc kubenswrapper[4764]: I0309 14:09:42.368710 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e6a8f674-82eb-4474-973d-54a90e5fd1e0-etc-machine-id\") pod \"cinder-backup-0\" (UID: \"e6a8f674-82eb-4474-973d-54a90e5fd1e0\") " pod="openstack/cinder-backup-0" Mar 09 14:09:42 crc kubenswrapper[4764]: I0309 14:09:42.373118 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: 
\"kubernetes.io/projected/e6a8f674-82eb-4474-973d-54a90e5fd1e0-ceph\") pod \"cinder-backup-0\" (UID: \"e6a8f674-82eb-4474-973d-54a90e5fd1e0\") " pod="openstack/cinder-backup-0" Mar 09 14:09:42 crc kubenswrapper[4764]: I0309 14:09:42.387156 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e6a8f674-82eb-4474-973d-54a90e5fd1e0-config-data\") pod \"cinder-backup-0\" (UID: \"e6a8f674-82eb-4474-973d-54a90e5fd1e0\") " pod="openstack/cinder-backup-0" Mar 09 14:09:42 crc kubenswrapper[4764]: I0309 14:09:42.388060 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 09 14:09:42 crc kubenswrapper[4764]: I0309 14:09:42.409607 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nw5m9\" (UniqueName: \"kubernetes.io/projected/e6a8f674-82eb-4474-973d-54a90e5fd1e0-kube-api-access-nw5m9\") pod \"cinder-backup-0\" (UID: \"e6a8f674-82eb-4474-973d-54a90e5fd1e0\") " pod="openstack/cinder-backup-0" Mar 09 14:09:42 crc kubenswrapper[4764]: I0309 14:09:42.410299 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e6a8f674-82eb-4474-973d-54a90e5fd1e0-scripts\") pod \"cinder-backup-0\" (UID: \"e6a8f674-82eb-4474-973d-54a90e5fd1e0\") " pod="openstack/cinder-backup-0" Mar 09 14:09:42 crc kubenswrapper[4764]: I0309 14:09:42.422949 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e6a8f674-82eb-4474-973d-54a90e5fd1e0-config-data-custom\") pod \"cinder-backup-0\" (UID: \"e6a8f674-82eb-4474-973d-54a90e5fd1e0\") " pod="openstack/cinder-backup-0" Mar 09 14:09:42 crc kubenswrapper[4764]: I0309 14:09:42.426329 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/e6a8f674-82eb-4474-973d-54a90e5fd1e0-combined-ca-bundle\") pod \"cinder-backup-0\" (UID: \"e6a8f674-82eb-4474-973d-54a90e5fd1e0\") " pod="openstack/cinder-backup-0" Mar 09 14:09:42 crc kubenswrapper[4764]: I0309 14:09:42.463104 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/49ee7179-02d8-4e07-9bb8-fce22456e804-config-data\") pod \"glance-default-external-api-0\" (UID: \"49ee7179-02d8-4e07-9bb8-fce22456e804\") " pod="openstack/glance-default-external-api-0" Mar 09 14:09:42 crc kubenswrapper[4764]: I0309 14:09:42.463168 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4t465\" (UniqueName: \"kubernetes.io/projected/49ee7179-02d8-4e07-9bb8-fce22456e804-kube-api-access-4t465\") pod \"glance-default-external-api-0\" (UID: \"49ee7179-02d8-4e07-9bb8-fce22456e804\") " pod="openstack/glance-default-external-api-0" Mar 09 14:09:42 crc kubenswrapper[4764]: I0309 14:09:42.463205 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/49ee7179-02d8-4e07-9bb8-fce22456e804-ceph\") pod \"glance-default-external-api-0\" (UID: \"49ee7179-02d8-4e07-9bb8-fce22456e804\") " pod="openstack/glance-default-external-api-0" Mar 09 14:09:42 crc kubenswrapper[4764]: I0309 14:09:42.463264 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49ee7179-02d8-4e07-9bb8-fce22456e804-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"49ee7179-02d8-4e07-9bb8-fce22456e804\") " pod="openstack/glance-default-external-api-0" Mar 09 14:09:42 crc kubenswrapper[4764]: I0309 14:09:42.463284 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" 
(UniqueName: \"kubernetes.io/secret/49ee7179-02d8-4e07-9bb8-fce22456e804-scripts\") pod \"glance-default-external-api-0\" (UID: \"49ee7179-02d8-4e07-9bb8-fce22456e804\") " pod="openstack/glance-default-external-api-0" Mar 09 14:09:42 crc kubenswrapper[4764]: I0309 14:09:42.463332 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/349527e3-93e0-4342-845c-eb8775ab3e5a-config-data\") pod \"glance-default-internal-api-0\" (UID: \"349527e3-93e0-4342-845c-eb8775ab3e5a\") " pod="openstack/glance-default-internal-api-0" Mar 09 14:09:42 crc kubenswrapper[4764]: I0309 14:09:42.463379 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/349527e3-93e0-4342-845c-eb8775ab3e5a-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"349527e3-93e0-4342-845c-eb8775ab3e5a\") " pod="openstack/glance-default-internal-api-0" Mar 09 14:09:42 crc kubenswrapper[4764]: I0309 14:09:42.463408 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/349527e3-93e0-4342-845c-eb8775ab3e5a-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"349527e3-93e0-4342-845c-eb8775ab3e5a\") " pod="openstack/glance-default-internal-api-0" Mar 09 14:09:42 crc kubenswrapper[4764]: I0309 14:09:42.463430 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/349527e3-93e0-4342-845c-eb8775ab3e5a-ceph\") pod \"glance-default-internal-api-0\" (UID: \"349527e3-93e0-4342-845c-eb8775ab3e5a\") " pod="openstack/glance-default-internal-api-0" Mar 09 14:09:42 crc kubenswrapper[4764]: I0309 14:09:42.463473 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"49ee7179-02d8-4e07-9bb8-fce22456e804\") " pod="openstack/glance-default-external-api-0" Mar 09 14:09:42 crc kubenswrapper[4764]: I0309 14:09:42.463492 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: \"349527e3-93e0-4342-845c-eb8775ab3e5a\") " pod="openstack/glance-default-internal-api-0" Mar 09 14:09:42 crc kubenswrapper[4764]: I0309 14:09:42.463511 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/49ee7179-02d8-4e07-9bb8-fce22456e804-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"49ee7179-02d8-4e07-9bb8-fce22456e804\") " pod="openstack/glance-default-external-api-0" Mar 09 14:09:42 crc kubenswrapper[4764]: I0309 14:09:42.463535 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/349527e3-93e0-4342-845c-eb8775ab3e5a-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"349527e3-93e0-4342-845c-eb8775ab3e5a\") " pod="openstack/glance-default-internal-api-0" Mar 09 14:09:42 crc kubenswrapper[4764]: I0309 14:09:42.463570 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/349527e3-93e0-4342-845c-eb8775ab3e5a-scripts\") pod \"glance-default-internal-api-0\" (UID: \"349527e3-93e0-4342-845c-eb8775ab3e5a\") " pod="openstack/glance-default-internal-api-0" Mar 09 14:09:42 crc kubenswrapper[4764]: I0309 14:09:42.463590 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-kxdzv\" (UniqueName: \"kubernetes.io/projected/349527e3-93e0-4342-845c-eb8775ab3e5a-kube-api-access-kxdzv\") pod \"glance-default-internal-api-0\" (UID: \"349527e3-93e0-4342-845c-eb8775ab3e5a\") " pod="openstack/glance-default-internal-api-0" Mar 09 14:09:42 crc kubenswrapper[4764]: I0309 14:09:42.463619 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/349527e3-93e0-4342-845c-eb8775ab3e5a-logs\") pod \"glance-default-internal-api-0\" (UID: \"349527e3-93e0-4342-845c-eb8775ab3e5a\") " pod="openstack/glance-default-internal-api-0" Mar 09 14:09:42 crc kubenswrapper[4764]: I0309 14:09:42.463662 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/49ee7179-02d8-4e07-9bb8-fce22456e804-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"49ee7179-02d8-4e07-9bb8-fce22456e804\") " pod="openstack/glance-default-external-api-0" Mar 09 14:09:42 crc kubenswrapper[4764]: I0309 14:09:42.463677 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/49ee7179-02d8-4e07-9bb8-fce22456e804-logs\") pod \"glance-default-external-api-0\" (UID: \"49ee7179-02d8-4e07-9bb8-fce22456e804\") " pod="openstack/glance-default-external-api-0" Mar 09 14:09:42 crc kubenswrapper[4764]: I0309 14:09:42.572196 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"49ee7179-02d8-4e07-9bb8-fce22456e804\") " pod="openstack/glance-default-external-api-0" Mar 09 14:09:42 crc kubenswrapper[4764]: I0309 14:09:42.572283 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: \"349527e3-93e0-4342-845c-eb8775ab3e5a\") " pod="openstack/glance-default-internal-api-0" Mar 09 14:09:42 crc kubenswrapper[4764]: I0309 14:09:42.572310 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/49ee7179-02d8-4e07-9bb8-fce22456e804-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"49ee7179-02d8-4e07-9bb8-fce22456e804\") " pod="openstack/glance-default-external-api-0" Mar 09 14:09:42 crc kubenswrapper[4764]: I0309 14:09:42.572340 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/349527e3-93e0-4342-845c-eb8775ab3e5a-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"349527e3-93e0-4342-845c-eb8775ab3e5a\") " pod="openstack/glance-default-internal-api-0" Mar 09 14:09:42 crc kubenswrapper[4764]: I0309 14:09:42.572377 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/349527e3-93e0-4342-845c-eb8775ab3e5a-scripts\") pod \"glance-default-internal-api-0\" (UID: \"349527e3-93e0-4342-845c-eb8775ab3e5a\") " pod="openstack/glance-default-internal-api-0" Mar 09 14:09:42 crc kubenswrapper[4764]: I0309 14:09:42.572410 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kxdzv\" (UniqueName: \"kubernetes.io/projected/349527e3-93e0-4342-845c-eb8775ab3e5a-kube-api-access-kxdzv\") pod \"glance-default-internal-api-0\" (UID: \"349527e3-93e0-4342-845c-eb8775ab3e5a\") " pod="openstack/glance-default-internal-api-0" Mar 09 14:09:42 crc kubenswrapper[4764]: I0309 14:09:42.572449 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/349527e3-93e0-4342-845c-eb8775ab3e5a-logs\") pod 
\"glance-default-internal-api-0\" (UID: \"349527e3-93e0-4342-845c-eb8775ab3e5a\") " pod="openstack/glance-default-internal-api-0" Mar 09 14:09:42 crc kubenswrapper[4764]: I0309 14:09:42.572479 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/49ee7179-02d8-4e07-9bb8-fce22456e804-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"49ee7179-02d8-4e07-9bb8-fce22456e804\") " pod="openstack/glance-default-external-api-0" Mar 09 14:09:42 crc kubenswrapper[4764]: I0309 14:09:42.572498 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/49ee7179-02d8-4e07-9bb8-fce22456e804-logs\") pod \"glance-default-external-api-0\" (UID: \"49ee7179-02d8-4e07-9bb8-fce22456e804\") " pod="openstack/glance-default-external-api-0" Mar 09 14:09:42 crc kubenswrapper[4764]: I0309 14:09:42.572520 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/49ee7179-02d8-4e07-9bb8-fce22456e804-config-data\") pod \"glance-default-external-api-0\" (UID: \"49ee7179-02d8-4e07-9bb8-fce22456e804\") " pod="openstack/glance-default-external-api-0" Mar 09 14:09:42 crc kubenswrapper[4764]: I0309 14:09:42.572546 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4t465\" (UniqueName: \"kubernetes.io/projected/49ee7179-02d8-4e07-9bb8-fce22456e804-kube-api-access-4t465\") pod \"glance-default-external-api-0\" (UID: \"49ee7179-02d8-4e07-9bb8-fce22456e804\") " pod="openstack/glance-default-external-api-0" Mar 09 14:09:42 crc kubenswrapper[4764]: I0309 14:09:42.572581 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/49ee7179-02d8-4e07-9bb8-fce22456e804-ceph\") pod \"glance-default-external-api-0\" (UID: 
\"49ee7179-02d8-4e07-9bb8-fce22456e804\") " pod="openstack/glance-default-external-api-0" Mar 09 14:09:42 crc kubenswrapper[4764]: I0309 14:09:42.572676 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49ee7179-02d8-4e07-9bb8-fce22456e804-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"49ee7179-02d8-4e07-9bb8-fce22456e804\") " pod="openstack/glance-default-external-api-0" Mar 09 14:09:42 crc kubenswrapper[4764]: I0309 14:09:42.572703 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/49ee7179-02d8-4e07-9bb8-fce22456e804-scripts\") pod \"glance-default-external-api-0\" (UID: \"49ee7179-02d8-4e07-9bb8-fce22456e804\") " pod="openstack/glance-default-external-api-0" Mar 09 14:09:42 crc kubenswrapper[4764]: I0309 14:09:42.572764 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/349527e3-93e0-4342-845c-eb8775ab3e5a-config-data\") pod \"glance-default-internal-api-0\" (UID: \"349527e3-93e0-4342-845c-eb8775ab3e5a\") " pod="openstack/glance-default-internal-api-0" Mar 09 14:09:42 crc kubenswrapper[4764]: I0309 14:09:42.572814 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/349527e3-93e0-4342-845c-eb8775ab3e5a-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"349527e3-93e0-4342-845c-eb8775ab3e5a\") " pod="openstack/glance-default-internal-api-0" Mar 09 14:09:42 crc kubenswrapper[4764]: I0309 14:09:42.572848 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/349527e3-93e0-4342-845c-eb8775ab3e5a-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"349527e3-93e0-4342-845c-eb8775ab3e5a\") " 
pod="openstack/glance-default-internal-api-0" Mar 09 14:09:42 crc kubenswrapper[4764]: I0309 14:09:42.572880 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/349527e3-93e0-4342-845c-eb8775ab3e5a-ceph\") pod \"glance-default-internal-api-0\" (UID: \"349527e3-93e0-4342-845c-eb8775ab3e5a\") " pod="openstack/glance-default-internal-api-0" Mar 09 14:09:42 crc kubenswrapper[4764]: I0309 14:09:42.577512 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/349527e3-93e0-4342-845c-eb8775ab3e5a-ceph\") pod \"glance-default-internal-api-0\" (UID: \"349527e3-93e0-4342-845c-eb8775ab3e5a\") " pod="openstack/glance-default-internal-api-0" Mar 09 14:09:42 crc kubenswrapper[4764]: I0309 14:09:42.578022 4764 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"49ee7179-02d8-4e07-9bb8-fce22456e804\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/glance-default-external-api-0" Mar 09 14:09:42 crc kubenswrapper[4764]: I0309 14:09:42.583280 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/49ee7179-02d8-4e07-9bb8-fce22456e804-logs\") pod \"glance-default-external-api-0\" (UID: \"49ee7179-02d8-4e07-9bb8-fce22456e804\") " pod="openstack/glance-default-external-api-0" Mar 09 14:09:42 crc kubenswrapper[4764]: I0309 14:09:42.583444 4764 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: \"349527e3-93e0-4342-845c-eb8775ab3e5a\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/glance-default-internal-api-0" Mar 09 14:09:42 crc kubenswrapper[4764]: 
I0309 14:09:42.588220 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/49ee7179-02d8-4e07-9bb8-fce22456e804-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"49ee7179-02d8-4e07-9bb8-fce22456e804\") " pod="openstack/glance-default-external-api-0" Mar 09 14:09:42 crc kubenswrapper[4764]: I0309 14:09:42.589548 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-backup-0" Mar 09 14:09:42 crc kubenswrapper[4764]: I0309 14:09:42.592486 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/349527e3-93e0-4342-845c-eb8775ab3e5a-config-data\") pod \"glance-default-internal-api-0\" (UID: \"349527e3-93e0-4342-845c-eb8775ab3e5a\") " pod="openstack/glance-default-internal-api-0" Mar 09 14:09:42 crc kubenswrapper[4764]: I0309 14:09:42.599885 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/349527e3-93e0-4342-845c-eb8775ab3e5a-logs\") pod \"glance-default-internal-api-0\" (UID: \"349527e3-93e0-4342-845c-eb8775ab3e5a\") " pod="openstack/glance-default-internal-api-0" Mar 09 14:09:42 crc kubenswrapper[4764]: I0309 14:09:42.600538 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/349527e3-93e0-4342-845c-eb8775ab3e5a-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"349527e3-93e0-4342-845c-eb8775ab3e5a\") " pod="openstack/glance-default-internal-api-0" Mar 09 14:09:42 crc kubenswrapper[4764]: I0309 14:09:42.618031 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/349527e3-93e0-4342-845c-eb8775ab3e5a-scripts\") pod \"glance-default-internal-api-0\" (UID: \"349527e3-93e0-4342-845c-eb8775ab3e5a\") " pod="openstack/glance-default-internal-api-0" Mar 09 14:09:42 crc 
kubenswrapper[4764]: I0309 14:09:42.620285 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/349527e3-93e0-4342-845c-eb8775ab3e5a-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"349527e3-93e0-4342-845c-eb8775ab3e5a\") " pod="openstack/glance-default-internal-api-0" Mar 09 14:09:42 crc kubenswrapper[4764]: I0309 14:09:42.621504 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/49ee7179-02d8-4e07-9bb8-fce22456e804-config-data\") pod \"glance-default-external-api-0\" (UID: \"49ee7179-02d8-4e07-9bb8-fce22456e804\") " pod="openstack/glance-default-external-api-0" Mar 09 14:09:42 crc kubenswrapper[4764]: I0309 14:09:42.622374 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/49ee7179-02d8-4e07-9bb8-fce22456e804-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"49ee7179-02d8-4e07-9bb8-fce22456e804\") " pod="openstack/glance-default-external-api-0" Mar 09 14:09:42 crc kubenswrapper[4764]: I0309 14:09:42.623839 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4t465\" (UniqueName: \"kubernetes.io/projected/49ee7179-02d8-4e07-9bb8-fce22456e804-kube-api-access-4t465\") pod \"glance-default-external-api-0\" (UID: \"49ee7179-02d8-4e07-9bb8-fce22456e804\") " pod="openstack/glance-default-external-api-0" Mar 09 14:09:42 crc kubenswrapper[4764]: I0309 14:09:42.624439 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/49ee7179-02d8-4e07-9bb8-fce22456e804-ceph\") pod \"glance-default-external-api-0\" (UID: \"49ee7179-02d8-4e07-9bb8-fce22456e804\") " pod="openstack/glance-default-external-api-0" Mar 09 14:09:42 crc kubenswrapper[4764]: I0309 14:09:42.624630 4764 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/49ee7179-02d8-4e07-9bb8-fce22456e804-scripts\") pod \"glance-default-external-api-0\" (UID: \"49ee7179-02d8-4e07-9bb8-fce22456e804\") " pod="openstack/glance-default-external-api-0" Mar 09 14:09:42 crc kubenswrapper[4764]: I0309 14:09:42.627568 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49ee7179-02d8-4e07-9bb8-fce22456e804-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"49ee7179-02d8-4e07-9bb8-fce22456e804\") " pod="openstack/glance-default-external-api-0" Mar 09 14:09:42 crc kubenswrapper[4764]: I0309 14:09:42.633342 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/349527e3-93e0-4342-845c-eb8775ab3e5a-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"349527e3-93e0-4342-845c-eb8775ab3e5a\") " pod="openstack/glance-default-internal-api-0" Mar 09 14:09:42 crc kubenswrapper[4764]: I0309 14:09:42.705446 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kxdzv\" (UniqueName: \"kubernetes.io/projected/349527e3-93e0-4342-845c-eb8775ab3e5a-kube-api-access-kxdzv\") pod \"glance-default-internal-api-0\" (UID: \"349527e3-93e0-4342-845c-eb8775ab3e5a\") " pod="openstack/glance-default-internal-api-0" Mar 09 14:09:42 crc kubenswrapper[4764]: I0309 14:09:42.723930 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"49ee7179-02d8-4e07-9bb8-fce22456e804\") " pod="openstack/glance-default-external-api-0" Mar 09 14:09:42 crc kubenswrapper[4764]: I0309 14:09:42.772298 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 09 14:09:42 crc kubenswrapper[4764]: I0309 14:09:42.784779 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: \"349527e3-93e0-4342-845c-eb8775ab3e5a\") " pod="openstack/glance-default-internal-api-0" Mar 09 14:09:42 crc kubenswrapper[4764]: I0309 14:09:42.953438 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 09 14:09:42 crc kubenswrapper[4764]: I0309 14:09:42.976931 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-db-create-fdg68"] Mar 09 14:09:42 crc kubenswrapper[4764]: I0309 14:09:42.978370 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-db-create-fdg68" Mar 09 14:09:43 crc kubenswrapper[4764]: I0309 14:09:42.999230 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-db-create-fdg68"] Mar 09 14:09:43 crc kubenswrapper[4764]: I0309 14:09:43.093480 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5e1948a5-46f6-412d-91c3-bf9c255e02fc-operator-scripts\") pod \"manila-db-create-fdg68\" (UID: \"5e1948a5-46f6-412d-91c3-bf9c255e02fc\") " pod="openstack/manila-db-create-fdg68" Mar 09 14:09:43 crc kubenswrapper[4764]: I0309 14:09:43.094071 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cnpqp\" (UniqueName: \"kubernetes.io/projected/5e1948a5-46f6-412d-91c3-bf9c255e02fc-kube-api-access-cnpqp\") pod \"manila-db-create-fdg68\" (UID: \"5e1948a5-46f6-412d-91c3-bf9c255e02fc\") " pod="openstack/manila-db-create-fdg68" Mar 09 14:09:43 crc kubenswrapper[4764]: I0309 14:09:43.137579 4764 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-9d94c4c7-px8jh"] Mar 09 14:09:43 crc kubenswrapper[4764]: I0309 14:09:43.139534 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-9d94c4c7-px8jh" Mar 09 14:09:43 crc kubenswrapper[4764]: I0309 14:09:43.153863 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-scripts" Mar 09 14:09:43 crc kubenswrapper[4764]: I0309 14:09:43.154126 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-config-data" Mar 09 14:09:43 crc kubenswrapper[4764]: I0309 14:09:43.154294 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon" Mar 09 14:09:43 crc kubenswrapper[4764]: I0309 14:09:43.157381 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon-horizon-dockercfg-rwzsj" Mar 09 14:09:43 crc kubenswrapper[4764]: I0309 14:09:43.190810 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-9d94c4c7-px8jh"] Mar 09 14:09:43 crc kubenswrapper[4764]: I0309 14:09:43.198867 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ktc9m\" (UniqueName: \"kubernetes.io/projected/849dc0e6-d7a3-4745-8cc0-7b7af3e4d243-kube-api-access-ktc9m\") pod \"horizon-9d94c4c7-px8jh\" (UID: \"849dc0e6-d7a3-4745-8cc0-7b7af3e4d243\") " pod="openstack/horizon-9d94c4c7-px8jh" Mar 09 14:09:43 crc kubenswrapper[4764]: I0309 14:09:43.198939 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5e1948a5-46f6-412d-91c3-bf9c255e02fc-operator-scripts\") pod \"manila-db-create-fdg68\" (UID: \"5e1948a5-46f6-412d-91c3-bf9c255e02fc\") " pod="openstack/manila-db-create-fdg68" Mar 09 14:09:43 crc kubenswrapper[4764]: I0309 14:09:43.198969 4764 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-cnpqp\" (UniqueName: \"kubernetes.io/projected/5e1948a5-46f6-412d-91c3-bf9c255e02fc-kube-api-access-cnpqp\") pod \"manila-db-create-fdg68\" (UID: \"5e1948a5-46f6-412d-91c3-bf9c255e02fc\") " pod="openstack/manila-db-create-fdg68" Mar 09 14:09:43 crc kubenswrapper[4764]: I0309 14:09:43.198987 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/849dc0e6-d7a3-4745-8cc0-7b7af3e4d243-config-data\") pod \"horizon-9d94c4c7-px8jh\" (UID: \"849dc0e6-d7a3-4745-8cc0-7b7af3e4d243\") " pod="openstack/horizon-9d94c4c7-px8jh" Mar 09 14:09:43 crc kubenswrapper[4764]: I0309 14:09:43.199037 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/849dc0e6-d7a3-4745-8cc0-7b7af3e4d243-scripts\") pod \"horizon-9d94c4c7-px8jh\" (UID: \"849dc0e6-d7a3-4745-8cc0-7b7af3e4d243\") " pod="openstack/horizon-9d94c4c7-px8jh" Mar 09 14:09:43 crc kubenswrapper[4764]: I0309 14:09:43.199063 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/849dc0e6-d7a3-4745-8cc0-7b7af3e4d243-horizon-secret-key\") pod \"horizon-9d94c4c7-px8jh\" (UID: \"849dc0e6-d7a3-4745-8cc0-7b7af3e4d243\") " pod="openstack/horizon-9d94c4c7-px8jh" Mar 09 14:09:43 crc kubenswrapper[4764]: I0309 14:09:43.199081 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/849dc0e6-d7a3-4745-8cc0-7b7af3e4d243-logs\") pod \"horizon-9d94c4c7-px8jh\" (UID: \"849dc0e6-d7a3-4745-8cc0-7b7af3e4d243\") " pod="openstack/horizon-9d94c4c7-px8jh" Mar 09 14:09:43 crc kubenswrapper[4764]: I0309 14:09:43.207683 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/5e1948a5-46f6-412d-91c3-bf9c255e02fc-operator-scripts\") pod \"manila-db-create-fdg68\" (UID: \"5e1948a5-46f6-412d-91c3-bf9c255e02fc\") " pod="openstack/manila-db-create-fdg68" Mar 09 14:09:43 crc kubenswrapper[4764]: I0309 14:09:43.236544 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-volume-volume1-0"] Mar 09 14:09:43 crc kubenswrapper[4764]: I0309 14:09:43.264909 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cnpqp\" (UniqueName: \"kubernetes.io/projected/5e1948a5-46f6-412d-91c3-bf9c255e02fc-kube-api-access-cnpqp\") pod \"manila-db-create-fdg68\" (UID: \"5e1948a5-46f6-412d-91c3-bf9c255e02fc\") " pod="openstack/manila-db-create-fdg68" Mar 09 14:09:43 crc kubenswrapper[4764]: I0309 14:09:43.307898 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/849dc0e6-d7a3-4745-8cc0-7b7af3e4d243-scripts\") pod \"horizon-9d94c4c7-px8jh\" (UID: \"849dc0e6-d7a3-4745-8cc0-7b7af3e4d243\") " pod="openstack/horizon-9d94c4c7-px8jh" Mar 09 14:09:43 crc kubenswrapper[4764]: I0309 14:09:43.307978 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/849dc0e6-d7a3-4745-8cc0-7b7af3e4d243-horizon-secret-key\") pod \"horizon-9d94c4c7-px8jh\" (UID: \"849dc0e6-d7a3-4745-8cc0-7b7af3e4d243\") " pod="openstack/horizon-9d94c4c7-px8jh" Mar 09 14:09:43 crc kubenswrapper[4764]: I0309 14:09:43.308004 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/849dc0e6-d7a3-4745-8cc0-7b7af3e4d243-logs\") pod \"horizon-9d94c4c7-px8jh\" (UID: \"849dc0e6-d7a3-4745-8cc0-7b7af3e4d243\") " pod="openstack/horizon-9d94c4c7-px8jh" Mar 09 14:09:43 crc kubenswrapper[4764]: I0309 14:09:43.308201 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-ktc9m\" (UniqueName: \"kubernetes.io/projected/849dc0e6-d7a3-4745-8cc0-7b7af3e4d243-kube-api-access-ktc9m\") pod \"horizon-9d94c4c7-px8jh\" (UID: \"849dc0e6-d7a3-4745-8cc0-7b7af3e4d243\") " pod="openstack/horizon-9d94c4c7-px8jh" Mar 09 14:09:43 crc kubenswrapper[4764]: I0309 14:09:43.312077 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/849dc0e6-d7a3-4745-8cc0-7b7af3e4d243-scripts\") pod \"horizon-9d94c4c7-px8jh\" (UID: \"849dc0e6-d7a3-4745-8cc0-7b7af3e4d243\") " pod="openstack/horizon-9d94c4c7-px8jh" Mar 09 14:09:43 crc kubenswrapper[4764]: I0309 14:09:43.312168 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/849dc0e6-d7a3-4745-8cc0-7b7af3e4d243-logs\") pod \"horizon-9d94c4c7-px8jh\" (UID: \"849dc0e6-d7a3-4745-8cc0-7b7af3e4d243\") " pod="openstack/horizon-9d94c4c7-px8jh" Mar 09 14:09:43 crc kubenswrapper[4764]: I0309 14:09:43.314172 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/849dc0e6-d7a3-4745-8cc0-7b7af3e4d243-config-data\") pod \"horizon-9d94c4c7-px8jh\" (UID: \"849dc0e6-d7a3-4745-8cc0-7b7af3e4d243\") " pod="openstack/horizon-9d94c4c7-px8jh" Mar 09 14:09:43 crc kubenswrapper[4764]: I0309 14:09:43.315026 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/849dc0e6-d7a3-4745-8cc0-7b7af3e4d243-horizon-secret-key\") pod \"horizon-9d94c4c7-px8jh\" (UID: \"849dc0e6-d7a3-4745-8cc0-7b7af3e4d243\") " pod="openstack/horizon-9d94c4c7-px8jh" Mar 09 14:09:43 crc kubenswrapper[4764]: I0309 14:09:43.317566 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/849dc0e6-d7a3-4745-8cc0-7b7af3e4d243-config-data\") pod \"horizon-9d94c4c7-px8jh\" (UID: 
\"849dc0e6-d7a3-4745-8cc0-7b7af3e4d243\") " pod="openstack/horizon-9d94c4c7-px8jh" Mar 09 14:09:43 crc kubenswrapper[4764]: I0309 14:09:43.326802 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-db-create-fdg68" Mar 09 14:09:43 crc kubenswrapper[4764]: I0309 14:09:43.331726 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 09 14:09:43 crc kubenswrapper[4764]: I0309 14:09:43.391712 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ktc9m\" (UniqueName: \"kubernetes.io/projected/849dc0e6-d7a3-4745-8cc0-7b7af3e4d243-kube-api-access-ktc9m\") pod \"horizon-9d94c4c7-px8jh\" (UID: \"849dc0e6-d7a3-4745-8cc0-7b7af3e4d243\") " pod="openstack/horizon-9d94c4c7-px8jh" Mar 09 14:09:43 crc kubenswrapper[4764]: I0309 14:09:43.403814 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-1df0-account-create-update-bg564"] Mar 09 14:09:43 crc kubenswrapper[4764]: I0309 14:09:43.405750 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-1df0-account-create-update-bg564" Mar 09 14:09:43 crc kubenswrapper[4764]: I0309 14:09:43.411921 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-db-secret" Mar 09 14:09:43 crc kubenswrapper[4764]: I0309 14:09:43.435457 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-1df0-account-create-update-bg564"] Mar 09 14:09:43 crc kubenswrapper[4764]: I0309 14:09:43.450341 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-55f8b7fc4c-6rdd7"] Mar 09 14:09:43 crc kubenswrapper[4764]: I0309 14:09:43.466731 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-55f8b7fc4c-6rdd7" Mar 09 14:09:43 crc kubenswrapper[4764]: I0309 14:09:43.480317 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-55f8b7fc4c-6rdd7"] Mar 09 14:09:43 crc kubenswrapper[4764]: I0309 14:09:43.489599 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 09 14:09:43 crc kubenswrapper[4764]: I0309 14:09:43.519023 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-9d94c4c7-px8jh" Mar 09 14:09:43 crc kubenswrapper[4764]: I0309 14:09:43.524085 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nlm9t\" (UniqueName: \"kubernetes.io/projected/6b944cf9-8278-4b16-b09c-0da6a2519b2a-kube-api-access-nlm9t\") pod \"horizon-55f8b7fc4c-6rdd7\" (UID: \"6b944cf9-8278-4b16-b09c-0da6a2519b2a\") " pod="openstack/horizon-55f8b7fc4c-6rdd7" Mar 09 14:09:43 crc kubenswrapper[4764]: I0309 14:09:43.524177 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7ce9bce5-9c23-40ac-9683-6fb232e32c3c-operator-scripts\") pod \"manila-1df0-account-create-update-bg564\" (UID: \"7ce9bce5-9c23-40ac-9683-6fb232e32c3c\") " pod="openstack/manila-1df0-account-create-update-bg564" Mar 09 14:09:43 crc kubenswrapper[4764]: I0309 14:09:43.524221 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6b944cf9-8278-4b16-b09c-0da6a2519b2a-config-data\") pod \"horizon-55f8b7fc4c-6rdd7\" (UID: \"6b944cf9-8278-4b16-b09c-0da6a2519b2a\") " pod="openstack/horizon-55f8b7fc4c-6rdd7" Mar 09 14:09:43 crc kubenswrapper[4764]: I0309 14:09:43.524273 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/6b944cf9-8278-4b16-b09c-0da6a2519b2a-scripts\") pod \"horizon-55f8b7fc4c-6rdd7\" (UID: \"6b944cf9-8278-4b16-b09c-0da6a2519b2a\") " pod="openstack/horizon-55f8b7fc4c-6rdd7" Mar 09 14:09:43 crc kubenswrapper[4764]: I0309 14:09:43.524320 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pl52v\" (UniqueName: \"kubernetes.io/projected/7ce9bce5-9c23-40ac-9683-6fb232e32c3c-kube-api-access-pl52v\") pod \"manila-1df0-account-create-update-bg564\" (UID: \"7ce9bce5-9c23-40ac-9683-6fb232e32c3c\") " pod="openstack/manila-1df0-account-create-update-bg564" Mar 09 14:09:43 crc kubenswrapper[4764]: I0309 14:09:43.524352 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6b944cf9-8278-4b16-b09c-0da6a2519b2a-logs\") pod \"horizon-55f8b7fc4c-6rdd7\" (UID: \"6b944cf9-8278-4b16-b09c-0da6a2519b2a\") " pod="openstack/horizon-55f8b7fc4c-6rdd7" Mar 09 14:09:43 crc kubenswrapper[4764]: I0309 14:09:43.524390 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/6b944cf9-8278-4b16-b09c-0da6a2519b2a-horizon-secret-key\") pod \"horizon-55f8b7fc4c-6rdd7\" (UID: \"6b944cf9-8278-4b16-b09c-0da6a2519b2a\") " pod="openstack/horizon-55f8b7fc4c-6rdd7" Mar 09 14:09:43 crc kubenswrapper[4764]: I0309 14:09:43.626433 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pl52v\" (UniqueName: \"kubernetes.io/projected/7ce9bce5-9c23-40ac-9683-6fb232e32c3c-kube-api-access-pl52v\") pod \"manila-1df0-account-create-update-bg564\" (UID: \"7ce9bce5-9c23-40ac-9683-6fb232e32c3c\") " pod="openstack/manila-1df0-account-create-update-bg564" Mar 09 14:09:43 crc kubenswrapper[4764]: I0309 14:09:43.626504 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"logs\" (UniqueName: \"kubernetes.io/empty-dir/6b944cf9-8278-4b16-b09c-0da6a2519b2a-logs\") pod \"horizon-55f8b7fc4c-6rdd7\" (UID: \"6b944cf9-8278-4b16-b09c-0da6a2519b2a\") " pod="openstack/horizon-55f8b7fc4c-6rdd7" Mar 09 14:09:43 crc kubenswrapper[4764]: I0309 14:09:43.626541 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/6b944cf9-8278-4b16-b09c-0da6a2519b2a-horizon-secret-key\") pod \"horizon-55f8b7fc4c-6rdd7\" (UID: \"6b944cf9-8278-4b16-b09c-0da6a2519b2a\") " pod="openstack/horizon-55f8b7fc4c-6rdd7" Mar 09 14:09:43 crc kubenswrapper[4764]: I0309 14:09:43.626682 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nlm9t\" (UniqueName: \"kubernetes.io/projected/6b944cf9-8278-4b16-b09c-0da6a2519b2a-kube-api-access-nlm9t\") pod \"horizon-55f8b7fc4c-6rdd7\" (UID: \"6b944cf9-8278-4b16-b09c-0da6a2519b2a\") " pod="openstack/horizon-55f8b7fc4c-6rdd7" Mar 09 14:09:43 crc kubenswrapper[4764]: I0309 14:09:43.626807 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7ce9bce5-9c23-40ac-9683-6fb232e32c3c-operator-scripts\") pod \"manila-1df0-account-create-update-bg564\" (UID: \"7ce9bce5-9c23-40ac-9683-6fb232e32c3c\") " pod="openstack/manila-1df0-account-create-update-bg564" Mar 09 14:09:43 crc kubenswrapper[4764]: I0309 14:09:43.626839 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6b944cf9-8278-4b16-b09c-0da6a2519b2a-config-data\") pod \"horizon-55f8b7fc4c-6rdd7\" (UID: \"6b944cf9-8278-4b16-b09c-0da6a2519b2a\") " pod="openstack/horizon-55f8b7fc4c-6rdd7" Mar 09 14:09:43 crc kubenswrapper[4764]: I0309 14:09:43.626908 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/6b944cf9-8278-4b16-b09c-0da6a2519b2a-scripts\") pod \"horizon-55f8b7fc4c-6rdd7\" (UID: \"6b944cf9-8278-4b16-b09c-0da6a2519b2a\") " pod="openstack/horizon-55f8b7fc4c-6rdd7" Mar 09 14:09:43 crc kubenswrapper[4764]: I0309 14:09:43.627875 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6b944cf9-8278-4b16-b09c-0da6a2519b2a-scripts\") pod \"horizon-55f8b7fc4c-6rdd7\" (UID: \"6b944cf9-8278-4b16-b09c-0da6a2519b2a\") " pod="openstack/horizon-55f8b7fc4c-6rdd7" Mar 09 14:09:43 crc kubenswrapper[4764]: I0309 14:09:43.628510 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6b944cf9-8278-4b16-b09c-0da6a2519b2a-logs\") pod \"horizon-55f8b7fc4c-6rdd7\" (UID: \"6b944cf9-8278-4b16-b09c-0da6a2519b2a\") " pod="openstack/horizon-55f8b7fc4c-6rdd7" Mar 09 14:09:43 crc kubenswrapper[4764]: I0309 14:09:43.629391 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7ce9bce5-9c23-40ac-9683-6fb232e32c3c-operator-scripts\") pod \"manila-1df0-account-create-update-bg564\" (UID: \"7ce9bce5-9c23-40ac-9683-6fb232e32c3c\") " pod="openstack/manila-1df0-account-create-update-bg564" Mar 09 14:09:43 crc kubenswrapper[4764]: I0309 14:09:43.632411 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6b944cf9-8278-4b16-b09c-0da6a2519b2a-config-data\") pod \"horizon-55f8b7fc4c-6rdd7\" (UID: \"6b944cf9-8278-4b16-b09c-0da6a2519b2a\") " pod="openstack/horizon-55f8b7fc4c-6rdd7" Mar 09 14:09:43 crc kubenswrapper[4764]: I0309 14:09:43.643193 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/6b944cf9-8278-4b16-b09c-0da6a2519b2a-horizon-secret-key\") pod \"horizon-55f8b7fc4c-6rdd7\" (UID: 
\"6b944cf9-8278-4b16-b09c-0da6a2519b2a\") " pod="openstack/horizon-55f8b7fc4c-6rdd7" Mar 09 14:09:43 crc kubenswrapper[4764]: I0309 14:09:43.695504 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pl52v\" (UniqueName: \"kubernetes.io/projected/7ce9bce5-9c23-40ac-9683-6fb232e32c3c-kube-api-access-pl52v\") pod \"manila-1df0-account-create-update-bg564\" (UID: \"7ce9bce5-9c23-40ac-9683-6fb232e32c3c\") " pod="openstack/manila-1df0-account-create-update-bg564" Mar 09 14:09:43 crc kubenswrapper[4764]: I0309 14:09:43.695933 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nlm9t\" (UniqueName: \"kubernetes.io/projected/6b944cf9-8278-4b16-b09c-0da6a2519b2a-kube-api-access-nlm9t\") pod \"horizon-55f8b7fc4c-6rdd7\" (UID: \"6b944cf9-8278-4b16-b09c-0da6a2519b2a\") " pod="openstack/horizon-55f8b7fc4c-6rdd7" Mar 09 14:09:43 crc kubenswrapper[4764]: I0309 14:09:43.831944 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-55f8b7fc4c-6rdd7" Mar 09 14:09:43 crc kubenswrapper[4764]: I0309 14:09:43.897431 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-1df0-account-create-update-bg564" Mar 09 14:09:43 crc kubenswrapper[4764]: I0309 14:09:43.899332 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-backup-0"] Mar 09 14:09:43 crc kubenswrapper[4764]: I0309 14:09:43.924566 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 09 14:09:44 crc kubenswrapper[4764]: I0309 14:09:44.083823 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 09 14:09:44 crc kubenswrapper[4764]: I0309 14:09:44.206146 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-db-create-fdg68"] Mar 09 14:09:44 crc kubenswrapper[4764]: I0309 14:09:44.291564 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-create-fdg68" event={"ID":"5e1948a5-46f6-412d-91c3-bf9c255e02fc","Type":"ContainerStarted","Data":"2f209c4397604d7a0a9dbd934378982e63afb3fec9341073117892ee26e51dd6"} Mar 09 14:09:44 crc kubenswrapper[4764]: I0309 14:09:44.296715 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"349527e3-93e0-4342-845c-eb8775ab3e5a","Type":"ContainerStarted","Data":"6f8e9f0abe6ca5e7bff14c308c2db546c0b2c9b94e174f7fa80c547c31b16a26"} Mar 09 14:09:44 crc kubenswrapper[4764]: I0309 14:09:44.300820 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"49ee7179-02d8-4e07-9bb8-fce22456e804","Type":"ContainerStarted","Data":"1f84cee9a80ba50e44abce0cb213778729f5e3cb3d6103fdf6b0ebee436d5285"} Mar 09 14:09:44 crc kubenswrapper[4764]: I0309 14:09:44.302866 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-volume1-0" event={"ID":"9aaa370f-a3d5-4fce-9761-873aeb8d7b1f","Type":"ContainerStarted","Data":"1a5be9c37bf14ba0e2041bc44e4cd0834044e096c24171216c18d21b7820634e"} Mar 09 14:09:44 crc 
kubenswrapper[4764]: I0309 14:09:44.323291 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-backup-0" event={"ID":"e6a8f674-82eb-4474-973d-54a90e5fd1e0","Type":"ContainerStarted","Data":"c70ee790255d3544035eda73ecba123e00e43636af92765b71d3c5750bfab7a8"} Mar 09 14:09:44 crc kubenswrapper[4764]: I0309 14:09:44.485691 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-9d94c4c7-px8jh"] Mar 09 14:09:44 crc kubenswrapper[4764]: I0309 14:09:44.637623 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-1df0-account-create-update-bg564"] Mar 09 14:09:44 crc kubenswrapper[4764]: I0309 14:09:44.657304 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-55f8b7fc4c-6rdd7"] Mar 09 14:09:44 crc kubenswrapper[4764]: W0309 14:09:44.795190 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6b944cf9_8278_4b16_b09c_0da6a2519b2a.slice/crio-03eb026db669a1bbc8569cf6227719fa0346433ca9e38091ef450a1f5669648c WatchSource:0}: Error finding container 03eb026db669a1bbc8569cf6227719fa0346433ca9e38091ef450a1f5669648c: Status 404 returned error can't find the container with id 03eb026db669a1bbc8569cf6227719fa0346433ca9e38091ef450a1f5669648c Mar 09 14:09:45 crc kubenswrapper[4764]: I0309 14:09:45.344770 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-9d94c4c7-px8jh" event={"ID":"849dc0e6-d7a3-4745-8cc0-7b7af3e4d243","Type":"ContainerStarted","Data":"dc93d1c29f222a1602e24053b4830b3eddb16bbf1d9374aeb6068fdf3ba6030f"} Mar 09 14:09:45 crc kubenswrapper[4764]: I0309 14:09:45.349617 4764 generic.go:334] "Generic (PLEG): container finished" podID="5e1948a5-46f6-412d-91c3-bf9c255e02fc" containerID="89bd3940c282c4190fbd12e55d3b6572d76973664eb8e08171ebbd6ade38f50f" exitCode=0 Mar 09 14:09:45 crc kubenswrapper[4764]: I0309 14:09:45.349688 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/manila-db-create-fdg68" event={"ID":"5e1948a5-46f6-412d-91c3-bf9c255e02fc","Type":"ContainerDied","Data":"89bd3940c282c4190fbd12e55d3b6572d76973664eb8e08171ebbd6ade38f50f"} Mar 09 14:09:45 crc kubenswrapper[4764]: I0309 14:09:45.351724 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-55f8b7fc4c-6rdd7" event={"ID":"6b944cf9-8278-4b16-b09c-0da6a2519b2a","Type":"ContainerStarted","Data":"03eb026db669a1bbc8569cf6227719fa0346433ca9e38091ef450a1f5669648c"} Mar 09 14:09:45 crc kubenswrapper[4764]: I0309 14:09:45.355629 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"349527e3-93e0-4342-845c-eb8775ab3e5a","Type":"ContainerStarted","Data":"30776e46d343f3c10f72753cb59ea338812b698d41259546e8c5f506b57f2e53"} Mar 09 14:09:45 crc kubenswrapper[4764]: I0309 14:09:45.362674 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"49ee7179-02d8-4e07-9bb8-fce22456e804","Type":"ContainerStarted","Data":"07b51e4be4c9ba108c01cded8c1c80a772fa736483e292b6667b962df1606942"} Mar 09 14:09:45 crc kubenswrapper[4764]: I0309 14:09:45.367571 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-1df0-account-create-update-bg564" event={"ID":"7ce9bce5-9c23-40ac-9683-6fb232e32c3c","Type":"ContainerStarted","Data":"bdf07c58d276db116b39b558183c8a6af0bb01c84a705a0b57c15a4ac4f6b634"} Mar 09 14:09:45 crc kubenswrapper[4764]: I0309 14:09:45.367662 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-1df0-account-create-update-bg564" event={"ID":"7ce9bce5-9c23-40ac-9683-6fb232e32c3c","Type":"ContainerStarted","Data":"2002bb74ed38eb2c1f0e486f850b4e290555f1f76e89842042231f9e9946ffde"} Mar 09 14:09:45 crc kubenswrapper[4764]: I0309 14:09:45.411525 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-1df0-account-create-update-bg564" 
podStartSLOduration=2.411501615 podStartE2EDuration="2.411501615s" podCreationTimestamp="2026-03-09 14:09:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 14:09:45.401118068 +0000 UTC m=+2940.651289976" watchObservedRunningTime="2026-03-09 14:09:45.411501615 +0000 UTC m=+2940.661673523" Mar 09 14:09:46 crc kubenswrapper[4764]: I0309 14:09:46.386634 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-volume1-0" event={"ID":"9aaa370f-a3d5-4fce-9761-873aeb8d7b1f","Type":"ContainerStarted","Data":"c18f47a7c064950f846a245ffd438b68e0328a931476ffa1517c78de39cd79df"} Mar 09 14:09:46 crc kubenswrapper[4764]: I0309 14:09:46.389169 4764 generic.go:334] "Generic (PLEG): container finished" podID="7ce9bce5-9c23-40ac-9683-6fb232e32c3c" containerID="bdf07c58d276db116b39b558183c8a6af0bb01c84a705a0b57c15a4ac4f6b634" exitCode=0 Mar 09 14:09:46 crc kubenswrapper[4764]: I0309 14:09:46.389219 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-1df0-account-create-update-bg564" event={"ID":"7ce9bce5-9c23-40ac-9683-6fb232e32c3c","Type":"ContainerDied","Data":"bdf07c58d276db116b39b558183c8a6af0bb01c84a705a0b57c15a4ac4f6b634"} Mar 09 14:09:46 crc kubenswrapper[4764]: I0309 14:09:46.404125 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-backup-0" event={"ID":"e6a8f674-82eb-4474-973d-54a90e5fd1e0","Type":"ContainerStarted","Data":"fa40eb8f95c156ce0a45864c5ed3bf173bfb2745018c72edc9ef389478dae117"} Mar 09 14:09:46 crc kubenswrapper[4764]: I0309 14:09:46.759334 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-55f8b7fc4c-6rdd7"] Mar 09 14:09:46 crc kubenswrapper[4764]: I0309 14:09:46.851216 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-797d44c9b-wrlx7"] Mar 09 14:09:46 crc kubenswrapper[4764]: I0309 14:09:46.866955 4764 util.go:30] "No sandbox for pod can 
be found. Need to start a new one" pod="openstack/horizon-797d44c9b-wrlx7" Mar 09 14:09:46 crc kubenswrapper[4764]: I0309 14:09:46.876578 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-horizon-svc" Mar 09 14:09:46 crc kubenswrapper[4764]: I0309 14:09:46.893019 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-797d44c9b-wrlx7"] Mar 09 14:09:46 crc kubenswrapper[4764]: I0309 14:09:46.922764 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-9d94c4c7-px8jh"] Mar 09 14:09:46 crc kubenswrapper[4764]: I0309 14:09:46.954858 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e00fc104-ec73-4190-a598-86de7ca6cfa5-config-data\") pod \"horizon-797d44c9b-wrlx7\" (UID: \"e00fc104-ec73-4190-a598-86de7ca6cfa5\") " pod="openstack/horizon-797d44c9b-wrlx7" Mar 09 14:09:46 crc kubenswrapper[4764]: I0309 14:09:46.954940 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/e00fc104-ec73-4190-a598-86de7ca6cfa5-horizon-tls-certs\") pod \"horizon-797d44c9b-wrlx7\" (UID: \"e00fc104-ec73-4190-a598-86de7ca6cfa5\") " pod="openstack/horizon-797d44c9b-wrlx7" Mar 09 14:09:46 crc kubenswrapper[4764]: I0309 14:09:46.954988 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/e00fc104-ec73-4190-a598-86de7ca6cfa5-horizon-secret-key\") pod \"horizon-797d44c9b-wrlx7\" (UID: \"e00fc104-ec73-4190-a598-86de7ca6cfa5\") " pod="openstack/horizon-797d44c9b-wrlx7" Mar 09 14:09:46 crc kubenswrapper[4764]: I0309 14:09:46.955041 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/e00fc104-ec73-4190-a598-86de7ca6cfa5-scripts\") pod \"horizon-797d44c9b-wrlx7\" (UID: \"e00fc104-ec73-4190-a598-86de7ca6cfa5\") " pod="openstack/horizon-797d44c9b-wrlx7" Mar 09 14:09:46 crc kubenswrapper[4764]: I0309 14:09:46.955074 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e00fc104-ec73-4190-a598-86de7ca6cfa5-combined-ca-bundle\") pod \"horizon-797d44c9b-wrlx7\" (UID: \"e00fc104-ec73-4190-a598-86de7ca6cfa5\") " pod="openstack/horizon-797d44c9b-wrlx7" Mar 09 14:09:46 crc kubenswrapper[4764]: I0309 14:09:46.955117 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e00fc104-ec73-4190-a598-86de7ca6cfa5-logs\") pod \"horizon-797d44c9b-wrlx7\" (UID: \"e00fc104-ec73-4190-a598-86de7ca6cfa5\") " pod="openstack/horizon-797d44c9b-wrlx7" Mar 09 14:09:46 crc kubenswrapper[4764]: I0309 14:09:46.955158 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9l5bd\" (UniqueName: \"kubernetes.io/projected/e00fc104-ec73-4190-a598-86de7ca6cfa5-kube-api-access-9l5bd\") pod \"horizon-797d44c9b-wrlx7\" (UID: \"e00fc104-ec73-4190-a598-86de7ca6cfa5\") " pod="openstack/horizon-797d44c9b-wrlx7" Mar 09 14:09:46 crc kubenswrapper[4764]: I0309 14:09:46.970914 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-db-create-fdg68" Mar 09 14:09:47 crc kubenswrapper[4764]: I0309 14:09:47.000732 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-56bb55c768-vchmw"] Mar 09 14:09:47 crc kubenswrapper[4764]: E0309 14:09:47.001438 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5e1948a5-46f6-412d-91c3-bf9c255e02fc" containerName="mariadb-database-create" Mar 09 14:09:47 crc kubenswrapper[4764]: I0309 14:09:47.001466 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e1948a5-46f6-412d-91c3-bf9c255e02fc" containerName="mariadb-database-create" Mar 09 14:09:47 crc kubenswrapper[4764]: I0309 14:09:47.001728 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="5e1948a5-46f6-412d-91c3-bf9c255e02fc" containerName="mariadb-database-create" Mar 09 14:09:47 crc kubenswrapper[4764]: I0309 14:09:47.003209 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-56bb55c768-vchmw" Mar 09 14:09:47 crc kubenswrapper[4764]: I0309 14:09:47.029106 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-56bb55c768-vchmw"] Mar 09 14:09:47 crc kubenswrapper[4764]: I0309 14:09:47.057099 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cnpqp\" (UniqueName: \"kubernetes.io/projected/5e1948a5-46f6-412d-91c3-bf9c255e02fc-kube-api-access-cnpqp\") pod \"5e1948a5-46f6-412d-91c3-bf9c255e02fc\" (UID: \"5e1948a5-46f6-412d-91c3-bf9c255e02fc\") " Mar 09 14:09:47 crc kubenswrapper[4764]: I0309 14:09:47.057230 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5e1948a5-46f6-412d-91c3-bf9c255e02fc-operator-scripts\") pod \"5e1948a5-46f6-412d-91c3-bf9c255e02fc\" (UID: \"5e1948a5-46f6-412d-91c3-bf9c255e02fc\") " Mar 09 14:09:47 crc kubenswrapper[4764]: I0309 14:09:47.057554 4764 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9l5bd\" (UniqueName: \"kubernetes.io/projected/e00fc104-ec73-4190-a598-86de7ca6cfa5-kube-api-access-9l5bd\") pod \"horizon-797d44c9b-wrlx7\" (UID: \"e00fc104-ec73-4190-a598-86de7ca6cfa5\") " pod="openstack/horizon-797d44c9b-wrlx7" Mar 09 14:09:47 crc kubenswrapper[4764]: I0309 14:09:47.057609 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/47ef29f6-4627-4b84-968d-db9d7ed438da-config-data\") pod \"horizon-56bb55c768-vchmw\" (UID: \"47ef29f6-4627-4b84-968d-db9d7ed438da\") " pod="openstack/horizon-56bb55c768-vchmw" Mar 09 14:09:47 crc kubenswrapper[4764]: I0309 14:09:47.057694 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kzxxs\" (UniqueName: \"kubernetes.io/projected/47ef29f6-4627-4b84-968d-db9d7ed438da-kube-api-access-kzxxs\") pod \"horizon-56bb55c768-vchmw\" (UID: \"47ef29f6-4627-4b84-968d-db9d7ed438da\") " pod="openstack/horizon-56bb55c768-vchmw" Mar 09 14:09:47 crc kubenswrapper[4764]: I0309 14:09:47.057775 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/47ef29f6-4627-4b84-968d-db9d7ed438da-combined-ca-bundle\") pod \"horizon-56bb55c768-vchmw\" (UID: \"47ef29f6-4627-4b84-968d-db9d7ed438da\") " pod="openstack/horizon-56bb55c768-vchmw" Mar 09 14:09:47 crc kubenswrapper[4764]: I0309 14:09:47.057843 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/47ef29f6-4627-4b84-968d-db9d7ed438da-logs\") pod \"horizon-56bb55c768-vchmw\" (UID: \"47ef29f6-4627-4b84-968d-db9d7ed438da\") " pod="openstack/horizon-56bb55c768-vchmw" Mar 09 14:09:47 crc kubenswrapper[4764]: I0309 14:09:47.057913 4764 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e00fc104-ec73-4190-a598-86de7ca6cfa5-config-data\") pod \"horizon-797d44c9b-wrlx7\" (UID: \"e00fc104-ec73-4190-a598-86de7ca6cfa5\") " pod="openstack/horizon-797d44c9b-wrlx7" Mar 09 14:09:47 crc kubenswrapper[4764]: I0309 14:09:47.057933 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/e00fc104-ec73-4190-a598-86de7ca6cfa5-horizon-tls-certs\") pod \"horizon-797d44c9b-wrlx7\" (UID: \"e00fc104-ec73-4190-a598-86de7ca6cfa5\") " pod="openstack/horizon-797d44c9b-wrlx7" Mar 09 14:09:47 crc kubenswrapper[4764]: I0309 14:09:47.057968 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/47ef29f6-4627-4b84-968d-db9d7ed438da-scripts\") pod \"horizon-56bb55c768-vchmw\" (UID: \"47ef29f6-4627-4b84-968d-db9d7ed438da\") " pod="openstack/horizon-56bb55c768-vchmw" Mar 09 14:09:47 crc kubenswrapper[4764]: I0309 14:09:47.057998 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/e00fc104-ec73-4190-a598-86de7ca6cfa5-horizon-secret-key\") pod \"horizon-797d44c9b-wrlx7\" (UID: \"e00fc104-ec73-4190-a598-86de7ca6cfa5\") " pod="openstack/horizon-797d44c9b-wrlx7" Mar 09 14:09:47 crc kubenswrapper[4764]: I0309 14:09:47.058073 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e00fc104-ec73-4190-a598-86de7ca6cfa5-scripts\") pod \"horizon-797d44c9b-wrlx7\" (UID: \"e00fc104-ec73-4190-a598-86de7ca6cfa5\") " pod="openstack/horizon-797d44c9b-wrlx7" Mar 09 14:09:47 crc kubenswrapper[4764]: I0309 14:09:47.058103 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/47ef29f6-4627-4b84-968d-db9d7ed438da-horizon-secret-key\") pod \"horizon-56bb55c768-vchmw\" (UID: \"47ef29f6-4627-4b84-968d-db9d7ed438da\") " pod="openstack/horizon-56bb55c768-vchmw" Mar 09 14:09:47 crc kubenswrapper[4764]: I0309 14:09:47.058136 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e00fc104-ec73-4190-a598-86de7ca6cfa5-combined-ca-bundle\") pod \"horizon-797d44c9b-wrlx7\" (UID: \"e00fc104-ec73-4190-a598-86de7ca6cfa5\") " pod="openstack/horizon-797d44c9b-wrlx7" Mar 09 14:09:47 crc kubenswrapper[4764]: I0309 14:09:47.058168 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/47ef29f6-4627-4b84-968d-db9d7ed438da-horizon-tls-certs\") pod \"horizon-56bb55c768-vchmw\" (UID: \"47ef29f6-4627-4b84-968d-db9d7ed438da\") " pod="openstack/horizon-56bb55c768-vchmw" Mar 09 14:09:47 crc kubenswrapper[4764]: I0309 14:09:47.058214 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e00fc104-ec73-4190-a598-86de7ca6cfa5-logs\") pod \"horizon-797d44c9b-wrlx7\" (UID: \"e00fc104-ec73-4190-a598-86de7ca6cfa5\") " pod="openstack/horizon-797d44c9b-wrlx7" Mar 09 14:09:47 crc kubenswrapper[4764]: I0309 14:09:47.058810 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e00fc104-ec73-4190-a598-86de7ca6cfa5-logs\") pod \"horizon-797d44c9b-wrlx7\" (UID: \"e00fc104-ec73-4190-a598-86de7ca6cfa5\") " pod="openstack/horizon-797d44c9b-wrlx7" Mar 09 14:09:47 crc kubenswrapper[4764]: I0309 14:09:47.060250 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e00fc104-ec73-4190-a598-86de7ca6cfa5-config-data\") pod 
\"horizon-797d44c9b-wrlx7\" (UID: \"e00fc104-ec73-4190-a598-86de7ca6cfa5\") " pod="openstack/horizon-797d44c9b-wrlx7" Mar 09 14:09:47 crc kubenswrapper[4764]: I0309 14:09:47.060859 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5e1948a5-46f6-412d-91c3-bf9c255e02fc-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "5e1948a5-46f6-412d-91c3-bf9c255e02fc" (UID: "5e1948a5-46f6-412d-91c3-bf9c255e02fc"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 14:09:47 crc kubenswrapper[4764]: I0309 14:09:47.066053 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e00fc104-ec73-4190-a598-86de7ca6cfa5-scripts\") pod \"horizon-797d44c9b-wrlx7\" (UID: \"e00fc104-ec73-4190-a598-86de7ca6cfa5\") " pod="openstack/horizon-797d44c9b-wrlx7" Mar 09 14:09:47 crc kubenswrapper[4764]: I0309 14:09:47.073244 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/e00fc104-ec73-4190-a598-86de7ca6cfa5-horizon-secret-key\") pod \"horizon-797d44c9b-wrlx7\" (UID: \"e00fc104-ec73-4190-a598-86de7ca6cfa5\") " pod="openstack/horizon-797d44c9b-wrlx7" Mar 09 14:09:47 crc kubenswrapper[4764]: I0309 14:09:47.073669 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e00fc104-ec73-4190-a598-86de7ca6cfa5-combined-ca-bundle\") pod \"horizon-797d44c9b-wrlx7\" (UID: \"e00fc104-ec73-4190-a598-86de7ca6cfa5\") " pod="openstack/horizon-797d44c9b-wrlx7" Mar 09 14:09:47 crc kubenswrapper[4764]: I0309 14:09:47.080440 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9l5bd\" (UniqueName: \"kubernetes.io/projected/e00fc104-ec73-4190-a598-86de7ca6cfa5-kube-api-access-9l5bd\") pod \"horizon-797d44c9b-wrlx7\" (UID: 
\"e00fc104-ec73-4190-a598-86de7ca6cfa5\") " pod="openstack/horizon-797d44c9b-wrlx7" Mar 09 14:09:47 crc kubenswrapper[4764]: I0309 14:09:47.107242 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/e00fc104-ec73-4190-a598-86de7ca6cfa5-horizon-tls-certs\") pod \"horizon-797d44c9b-wrlx7\" (UID: \"e00fc104-ec73-4190-a598-86de7ca6cfa5\") " pod="openstack/horizon-797d44c9b-wrlx7" Mar 09 14:09:47 crc kubenswrapper[4764]: I0309 14:09:47.269203 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5e1948a5-46f6-412d-91c3-bf9c255e02fc-kube-api-access-cnpqp" (OuterVolumeSpecName: "kube-api-access-cnpqp") pod "5e1948a5-46f6-412d-91c3-bf9c255e02fc" (UID: "5e1948a5-46f6-412d-91c3-bf9c255e02fc"). InnerVolumeSpecName "kube-api-access-cnpqp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:09:47 crc kubenswrapper[4764]: I0309 14:09:47.271119 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-797d44c9b-wrlx7" Mar 09 14:09:47 crc kubenswrapper[4764]: I0309 14:09:47.272056 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/47ef29f6-4627-4b84-968d-db9d7ed438da-combined-ca-bundle\") pod \"horizon-56bb55c768-vchmw\" (UID: \"47ef29f6-4627-4b84-968d-db9d7ed438da\") " pod="openstack/horizon-56bb55c768-vchmw" Mar 09 14:09:47 crc kubenswrapper[4764]: I0309 14:09:47.272186 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/47ef29f6-4627-4b84-968d-db9d7ed438da-logs\") pod \"horizon-56bb55c768-vchmw\" (UID: \"47ef29f6-4627-4b84-968d-db9d7ed438da\") " pod="openstack/horizon-56bb55c768-vchmw" Mar 09 14:09:47 crc kubenswrapper[4764]: I0309 14:09:47.272318 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/47ef29f6-4627-4b84-968d-db9d7ed438da-scripts\") pod \"horizon-56bb55c768-vchmw\" (UID: \"47ef29f6-4627-4b84-968d-db9d7ed438da\") " pod="openstack/horizon-56bb55c768-vchmw" Mar 09 14:09:47 crc kubenswrapper[4764]: I0309 14:09:47.272429 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/47ef29f6-4627-4b84-968d-db9d7ed438da-horizon-secret-key\") pod \"horizon-56bb55c768-vchmw\" (UID: \"47ef29f6-4627-4b84-968d-db9d7ed438da\") " pod="openstack/horizon-56bb55c768-vchmw" Mar 09 14:09:47 crc kubenswrapper[4764]: I0309 14:09:47.272498 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/47ef29f6-4627-4b84-968d-db9d7ed438da-horizon-tls-certs\") pod \"horizon-56bb55c768-vchmw\" (UID: \"47ef29f6-4627-4b84-968d-db9d7ed438da\") " pod="openstack/horizon-56bb55c768-vchmw" Mar 09 14:09:47 crc kubenswrapper[4764]: I0309 
14:09:47.272678 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/47ef29f6-4627-4b84-968d-db9d7ed438da-config-data\") pod \"horizon-56bb55c768-vchmw\" (UID: \"47ef29f6-4627-4b84-968d-db9d7ed438da\") " pod="openstack/horizon-56bb55c768-vchmw" Mar 09 14:09:47 crc kubenswrapper[4764]: I0309 14:09:47.272778 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kzxxs\" (UniqueName: \"kubernetes.io/projected/47ef29f6-4627-4b84-968d-db9d7ed438da-kube-api-access-kzxxs\") pod \"horizon-56bb55c768-vchmw\" (UID: \"47ef29f6-4627-4b84-968d-db9d7ed438da\") " pod="openstack/horizon-56bb55c768-vchmw" Mar 09 14:09:47 crc kubenswrapper[4764]: I0309 14:09:47.273822 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cnpqp\" (UniqueName: \"kubernetes.io/projected/5e1948a5-46f6-412d-91c3-bf9c255e02fc-kube-api-access-cnpqp\") on node \"crc\" DevicePath \"\"" Mar 09 14:09:47 crc kubenswrapper[4764]: I0309 14:09:47.277568 4764 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5e1948a5-46f6-412d-91c3-bf9c255e02fc-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 14:09:47 crc kubenswrapper[4764]: I0309 14:09:47.281133 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/47ef29f6-4627-4b84-968d-db9d7ed438da-scripts\") pod \"horizon-56bb55c768-vchmw\" (UID: \"47ef29f6-4627-4b84-968d-db9d7ed438da\") " pod="openstack/horizon-56bb55c768-vchmw" Mar 09 14:09:47 crc kubenswrapper[4764]: I0309 14:09:47.283425 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/47ef29f6-4627-4b84-968d-db9d7ed438da-logs\") pod \"horizon-56bb55c768-vchmw\" (UID: \"47ef29f6-4627-4b84-968d-db9d7ed438da\") " pod="openstack/horizon-56bb55c768-vchmw" Mar 09 
14:09:47 crc kubenswrapper[4764]: I0309 14:09:47.288374 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/47ef29f6-4627-4b84-968d-db9d7ed438da-horizon-tls-certs\") pod \"horizon-56bb55c768-vchmw\" (UID: \"47ef29f6-4627-4b84-968d-db9d7ed438da\") " pod="openstack/horizon-56bb55c768-vchmw" Mar 09 14:09:47 crc kubenswrapper[4764]: I0309 14:09:47.289978 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/47ef29f6-4627-4b84-968d-db9d7ed438da-config-data\") pod \"horizon-56bb55c768-vchmw\" (UID: \"47ef29f6-4627-4b84-968d-db9d7ed438da\") " pod="openstack/horizon-56bb55c768-vchmw" Mar 09 14:09:47 crc kubenswrapper[4764]: I0309 14:09:47.300194 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/47ef29f6-4627-4b84-968d-db9d7ed438da-combined-ca-bundle\") pod \"horizon-56bb55c768-vchmw\" (UID: \"47ef29f6-4627-4b84-968d-db9d7ed438da\") " pod="openstack/horizon-56bb55c768-vchmw" Mar 09 14:09:47 crc kubenswrapper[4764]: I0309 14:09:47.314273 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kzxxs\" (UniqueName: \"kubernetes.io/projected/47ef29f6-4627-4b84-968d-db9d7ed438da-kube-api-access-kzxxs\") pod \"horizon-56bb55c768-vchmw\" (UID: \"47ef29f6-4627-4b84-968d-db9d7ed438da\") " pod="openstack/horizon-56bb55c768-vchmw" Mar 09 14:09:47 crc kubenswrapper[4764]: I0309 14:09:47.322320 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/47ef29f6-4627-4b84-968d-db9d7ed438da-horizon-secret-key\") pod \"horizon-56bb55c768-vchmw\" (UID: \"47ef29f6-4627-4b84-968d-db9d7ed438da\") " pod="openstack/horizon-56bb55c768-vchmw" Mar 09 14:09:47 crc kubenswrapper[4764]: I0309 14:09:47.588283 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-56bb55c768-vchmw" Mar 09 14:09:47 crc kubenswrapper[4764]: I0309 14:09:47.688398 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"349527e3-93e0-4342-845c-eb8775ab3e5a","Type":"ContainerStarted","Data":"ecd62f1b0eb5226209d3ecf6887c5a0f2d0feaa519397c875dd114ffc89d8b3b"} Mar 09 14:09:47 crc kubenswrapper[4764]: I0309 14:09:47.688664 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="349527e3-93e0-4342-845c-eb8775ab3e5a" containerName="glance-log" containerID="cri-o://30776e46d343f3c10f72753cb59ea338812b698d41259546e8c5f506b57f2e53" gracePeriod=30 Mar 09 14:09:47 crc kubenswrapper[4764]: I0309 14:09:47.688918 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="349527e3-93e0-4342-845c-eb8775ab3e5a" containerName="glance-httpd" containerID="cri-o://ecd62f1b0eb5226209d3ecf6887c5a0f2d0feaa519397c875dd114ffc89d8b3b" gracePeriod=30 Mar 09 14:09:47 crc kubenswrapper[4764]: I0309 14:09:47.697295 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"49ee7179-02d8-4e07-9bb8-fce22456e804","Type":"ContainerStarted","Data":"1d3f3c09a65ee0d96c5761cf37afd174b0dd657fc73b2eb650e087f0fb7b89e4"} Mar 09 14:09:47 crc kubenswrapper[4764]: I0309 14:09:47.697389 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="49ee7179-02d8-4e07-9bb8-fce22456e804" containerName="glance-log" containerID="cri-o://07b51e4be4c9ba108c01cded8c1c80a772fa736483e292b6667b962df1606942" gracePeriod=30 Mar 09 14:09:47 crc kubenswrapper[4764]: I0309 14:09:47.697413 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" 
podUID="49ee7179-02d8-4e07-9bb8-fce22456e804" containerName="glance-httpd" containerID="cri-o://1d3f3c09a65ee0d96c5761cf37afd174b0dd657fc73b2eb650e087f0fb7b89e4" gracePeriod=30
Mar 09 14:09:47 crc kubenswrapper[4764]: I0309 14:09:47.703100 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-volume1-0" event={"ID":"9aaa370f-a3d5-4fce-9761-873aeb8d7b1f","Type":"ContainerStarted","Data":"51fce251ae6f91898056967dc7598e9f180480171f8ac8dce0e6615b0fed1c2e"}
Mar 09 14:09:47 crc kubenswrapper[4764]: I0309 14:09:47.713926 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-backup-0" event={"ID":"e6a8f674-82eb-4474-973d-54a90e5fd1e0","Type":"ContainerStarted","Data":"129f5e2081595e346ca05bf1c0a5f39318c542efe9f77225ecf83f06ff7156ab"}
Mar 09 14:09:47 crc kubenswrapper[4764]: I0309 14:09:47.727951 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-db-create-fdg68"
Mar 09 14:09:47 crc kubenswrapper[4764]: I0309 14:09:47.728917 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-create-fdg68" event={"ID":"5e1948a5-46f6-412d-91c3-bf9c255e02fc","Type":"ContainerDied","Data":"2f209c4397604d7a0a9dbd934378982e63afb3fec9341073117892ee26e51dd6"}
Mar 09 14:09:47 crc kubenswrapper[4764]: I0309 14:09:47.728974 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2f209c4397604d7a0a9dbd934378982e63afb3fec9341073117892ee26e51dd6"
Mar 09 14:09:47 crc kubenswrapper[4764]: I0309 14:09:47.734989 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=7.734958993 podStartE2EDuration="7.734958993s" podCreationTimestamp="2026-03-09 14:09:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 14:09:47.725950413 +0000 UTC m=+2942.976122321" watchObservedRunningTime="2026-03-09 14:09:47.734958993 +0000 UTC m=+2942.985130901"
Mar 09 14:09:47 crc kubenswrapper[4764]: I0309 14:09:47.757804 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-volume-volume1-0" podStartSLOduration=6.006113753 podStartE2EDuration="7.757780743s" podCreationTimestamp="2026-03-09 14:09:40 +0000 UTC" firstStartedPulling="2026-03-09 14:09:43.267317795 +0000 UTC m=+2938.517489703" lastFinishedPulling="2026-03-09 14:09:45.018984785 +0000 UTC m=+2940.269156693" observedRunningTime="2026-03-09 14:09:47.753214111 +0000 UTC m=+2943.003386039" watchObservedRunningTime="2026-03-09 14:09:47.757780743 +0000 UTC m=+2943.007952651"
Mar 09 14:09:47 crc kubenswrapper[4764]: I0309 14:09:47.859192 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-backup-0" podStartSLOduration=5.288626447 podStartE2EDuration="6.85916377s" podCreationTimestamp="2026-03-09 14:09:41 +0000 UTC" firstStartedPulling="2026-03-09 14:09:43.898605171 +0000 UTC m=+2939.148777079" lastFinishedPulling="2026-03-09 14:09:45.469142494 +0000 UTC m=+2940.719314402" observedRunningTime="2026-03-09 14:09:47.801252973 +0000 UTC m=+2943.051424891" watchObservedRunningTime="2026-03-09 14:09:47.85916377 +0000 UTC m=+2943.109335678"
Mar 09 14:09:47 crc kubenswrapper[4764]: I0309 14:09:47.953133 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=7.953099468 podStartE2EDuration="7.953099468s" podCreationTimestamp="2026-03-09 14:09:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 14:09:47.823708463 +0000 UTC m=+2943.073880381" watchObservedRunningTime="2026-03-09 14:09:47.953099468 +0000 UTC m=+2943.203271386"
Mar 09 14:09:48 crc kubenswrapper[4764]: I0309 14:09:48.442503 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-1df0-account-create-update-bg564"
Mar 09 14:09:48 crc kubenswrapper[4764]: I0309 14:09:48.550980 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pl52v\" (UniqueName: \"kubernetes.io/projected/7ce9bce5-9c23-40ac-9683-6fb232e32c3c-kube-api-access-pl52v\") pod \"7ce9bce5-9c23-40ac-9683-6fb232e32c3c\" (UID: \"7ce9bce5-9c23-40ac-9683-6fb232e32c3c\") "
Mar 09 14:09:48 crc kubenswrapper[4764]: I0309 14:09:48.551573 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7ce9bce5-9c23-40ac-9683-6fb232e32c3c-operator-scripts\") pod \"7ce9bce5-9c23-40ac-9683-6fb232e32c3c\" (UID: \"7ce9bce5-9c23-40ac-9683-6fb232e32c3c\") "
Mar 09 14:09:48 crc kubenswrapper[4764]: I0309 14:09:48.554314 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7ce9bce5-9c23-40ac-9683-6fb232e32c3c-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "7ce9bce5-9c23-40ac-9683-6fb232e32c3c" (UID: "7ce9bce5-9c23-40ac-9683-6fb232e32c3c"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 14:09:48 crc kubenswrapper[4764]: I0309 14:09:48.563617 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7ce9bce5-9c23-40ac-9683-6fb232e32c3c-kube-api-access-pl52v" (OuterVolumeSpecName: "kube-api-access-pl52v") pod "7ce9bce5-9c23-40ac-9683-6fb232e32c3c" (UID: "7ce9bce5-9c23-40ac-9683-6fb232e32c3c"). InnerVolumeSpecName "kube-api-access-pl52v". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 14:09:48 crc kubenswrapper[4764]: I0309 14:09:48.566873 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-797d44c9b-wrlx7"]
Mar 09 14:09:48 crc kubenswrapper[4764]: I0309 14:09:48.577454 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-56bb55c768-vchmw"]
Mar 09 14:09:48 crc kubenswrapper[4764]: W0309 14:09:48.624215 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod47ef29f6_4627_4b84_968d_db9d7ed438da.slice/crio-99ac8673ac7f481cb7c33e55443030ac6b7a346a12cfd65d6c94f0f8452e36e3 WatchSource:0}: Error finding container 99ac8673ac7f481cb7c33e55443030ac6b7a346a12cfd65d6c94f0f8452e36e3: Status 404 returned error can't find the container with id 99ac8673ac7f481cb7c33e55443030ac6b7a346a12cfd65d6c94f0f8452e36e3
Mar 09 14:09:48 crc kubenswrapper[4764]: I0309 14:09:48.670507 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pl52v\" (UniqueName: \"kubernetes.io/projected/7ce9bce5-9c23-40ac-9683-6fb232e32c3c-kube-api-access-pl52v\") on node \"crc\" DevicePath \"\""
Mar 09 14:09:48 crc kubenswrapper[4764]: I0309 14:09:48.675492 4764 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7ce9bce5-9c23-40ac-9683-6fb232e32c3c-operator-scripts\") on node \"crc\" DevicePath \"\""
Mar 09 14:09:48 crc kubenswrapper[4764]: I0309 14:09:48.744423 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-797d44c9b-wrlx7" event={"ID":"e00fc104-ec73-4190-a598-86de7ca6cfa5","Type":"ContainerStarted","Data":"183964e4224d34ffefb68047495776b87e734441e831507181f1e3aafc498e51"}
Mar 09 14:09:48 crc kubenswrapper[4764]: I0309 14:09:48.752347 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-1df0-account-create-update-bg564" event={"ID":"7ce9bce5-9c23-40ac-9683-6fb232e32c3c","Type":"ContainerDied","Data":"2002bb74ed38eb2c1f0e486f850b4e290555f1f76e89842042231f9e9946ffde"}
Mar 09 14:09:48 crc kubenswrapper[4764]: I0309 14:09:48.752465 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2002bb74ed38eb2c1f0e486f850b4e290555f1f76e89842042231f9e9946ffde"
Mar 09 14:09:48 crc kubenswrapper[4764]: I0309 14:09:48.752538 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-1df0-account-create-update-bg564"
Mar 09 14:09:48 crc kubenswrapper[4764]: I0309 14:09:48.775193 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-56bb55c768-vchmw" event={"ID":"47ef29f6-4627-4b84-968d-db9d7ed438da","Type":"ContainerStarted","Data":"99ac8673ac7f481cb7c33e55443030ac6b7a346a12cfd65d6c94f0f8452e36e3"}
Mar 09 14:09:48 crc kubenswrapper[4764]: I0309 14:09:48.788065 4764 generic.go:334] "Generic (PLEG): container finished" podID="349527e3-93e0-4342-845c-eb8775ab3e5a" containerID="ecd62f1b0eb5226209d3ecf6887c5a0f2d0feaa519397c875dd114ffc89d8b3b" exitCode=143
Mar 09 14:09:48 crc kubenswrapper[4764]: I0309 14:09:48.788120 4764 generic.go:334] "Generic (PLEG): container finished" podID="349527e3-93e0-4342-845c-eb8775ab3e5a" containerID="30776e46d343f3c10f72753cb59ea338812b698d41259546e8c5f506b57f2e53" exitCode=143
Mar 09 14:09:48 crc kubenswrapper[4764]: I0309 14:09:48.788255 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"349527e3-93e0-4342-845c-eb8775ab3e5a","Type":"ContainerDied","Data":"ecd62f1b0eb5226209d3ecf6887c5a0f2d0feaa519397c875dd114ffc89d8b3b"}
Mar 09 14:09:48 crc kubenswrapper[4764]: I0309 14:09:48.788297 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"349527e3-93e0-4342-845c-eb8775ab3e5a","Type":"ContainerDied","Data":"30776e46d343f3c10f72753cb59ea338812b698d41259546e8c5f506b57f2e53"}
Mar 09 14:09:48 crc kubenswrapper[4764]: I0309 14:09:48.788332 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"349527e3-93e0-4342-845c-eb8775ab3e5a","Type":"ContainerDied","Data":"6f8e9f0abe6ca5e7bff14c308c2db546c0b2c9b94e174f7fa80c547c31b16a26"}
Mar 09 14:09:48 crc kubenswrapper[4764]: I0309 14:09:48.788346 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6f8e9f0abe6ca5e7bff14c308c2db546c0b2c9b94e174f7fa80c547c31b16a26"
Mar 09 14:09:48 crc kubenswrapper[4764]: I0309 14:09:48.791829 4764 generic.go:334] "Generic (PLEG): container finished" podID="49ee7179-02d8-4e07-9bb8-fce22456e804" containerID="1d3f3c09a65ee0d96c5761cf37afd174b0dd657fc73b2eb650e087f0fb7b89e4" exitCode=143
Mar 09 14:09:48 crc kubenswrapper[4764]: I0309 14:09:48.791863 4764 generic.go:334] "Generic (PLEG): container finished" podID="49ee7179-02d8-4e07-9bb8-fce22456e804" containerID="07b51e4be4c9ba108c01cded8c1c80a772fa736483e292b6667b962df1606942" exitCode=143
Mar 09 14:09:48 crc kubenswrapper[4764]: I0309 14:09:48.792086 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"49ee7179-02d8-4e07-9bb8-fce22456e804","Type":"ContainerDied","Data":"1d3f3c09a65ee0d96c5761cf37afd174b0dd657fc73b2eb650e087f0fb7b89e4"}
Mar 09 14:09:48 crc kubenswrapper[4764]: I0309 14:09:48.792184 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"49ee7179-02d8-4e07-9bb8-fce22456e804","Type":"ContainerDied","Data":"07b51e4be4c9ba108c01cded8c1c80a772fa736483e292b6667b962df1606942"}
Mar 09 14:09:48 crc kubenswrapper[4764]: I0309 14:09:48.830522 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Mar 09 14:09:48 crc kubenswrapper[4764]: I0309 14:09:48.879386 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/349527e3-93e0-4342-845c-eb8775ab3e5a-combined-ca-bundle\") pod \"349527e3-93e0-4342-845c-eb8775ab3e5a\" (UID: \"349527e3-93e0-4342-845c-eb8775ab3e5a\") "
Mar 09 14:09:48 crc kubenswrapper[4764]: I0309 14:09:48.881771 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/349527e3-93e0-4342-845c-eb8775ab3e5a-logs\") pod \"349527e3-93e0-4342-845c-eb8775ab3e5a\" (UID: \"349527e3-93e0-4342-845c-eb8775ab3e5a\") "
Mar 09 14:09:48 crc kubenswrapper[4764]: I0309 14:09:48.881832 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kxdzv\" (UniqueName: \"kubernetes.io/projected/349527e3-93e0-4342-845c-eb8775ab3e5a-kube-api-access-kxdzv\") pod \"349527e3-93e0-4342-845c-eb8775ab3e5a\" (UID: \"349527e3-93e0-4342-845c-eb8775ab3e5a\") "
Mar 09 14:09:48 crc kubenswrapper[4764]: I0309 14:09:48.881894 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/349527e3-93e0-4342-845c-eb8775ab3e5a-ceph\") pod \"349527e3-93e0-4342-845c-eb8775ab3e5a\" (UID: \"349527e3-93e0-4342-845c-eb8775ab3e5a\") "
Mar 09 14:09:48 crc kubenswrapper[4764]: I0309 14:09:48.882060 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/349527e3-93e0-4342-845c-eb8775ab3e5a-internal-tls-certs\") pod \"349527e3-93e0-4342-845c-eb8775ab3e5a\" (UID: \"349527e3-93e0-4342-845c-eb8775ab3e5a\") "
Mar 09 14:09:48 crc kubenswrapper[4764]: I0309 14:09:48.882162 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/349527e3-93e0-4342-845c-eb8775ab3e5a-config-data\") pod \"349527e3-93e0-4342-845c-eb8775ab3e5a\" (UID: \"349527e3-93e0-4342-845c-eb8775ab3e5a\") "
Mar 09 14:09:48 crc kubenswrapper[4764]: I0309 14:09:48.882233 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"349527e3-93e0-4342-845c-eb8775ab3e5a\" (UID: \"349527e3-93e0-4342-845c-eb8775ab3e5a\") "
Mar 09 14:09:48 crc kubenswrapper[4764]: I0309 14:09:48.882286 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/349527e3-93e0-4342-845c-eb8775ab3e5a-httpd-run\") pod \"349527e3-93e0-4342-845c-eb8775ab3e5a\" (UID: \"349527e3-93e0-4342-845c-eb8775ab3e5a\") "
Mar 09 14:09:48 crc kubenswrapper[4764]: I0309 14:09:48.882373 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/349527e3-93e0-4342-845c-eb8775ab3e5a-scripts\") pod \"349527e3-93e0-4342-845c-eb8775ab3e5a\" (UID: \"349527e3-93e0-4342-845c-eb8775ab3e5a\") "
Mar 09 14:09:48 crc kubenswrapper[4764]: I0309 14:09:48.887760 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/349527e3-93e0-4342-845c-eb8775ab3e5a-logs" (OuterVolumeSpecName: "logs") pod "349527e3-93e0-4342-845c-eb8775ab3e5a" (UID: "349527e3-93e0-4342-845c-eb8775ab3e5a"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 09 14:09:48 crc kubenswrapper[4764]: I0309 14:09:48.895350 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/349527e3-93e0-4342-845c-eb8775ab3e5a-scripts" (OuterVolumeSpecName: "scripts") pod "349527e3-93e0-4342-845c-eb8775ab3e5a" (UID: "349527e3-93e0-4342-845c-eb8775ab3e5a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 14:09:48 crc kubenswrapper[4764]: I0309 14:09:48.895820 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/349527e3-93e0-4342-845c-eb8775ab3e5a-kube-api-access-kxdzv" (OuterVolumeSpecName: "kube-api-access-kxdzv") pod "349527e3-93e0-4342-845c-eb8775ab3e5a" (UID: "349527e3-93e0-4342-845c-eb8775ab3e5a"). InnerVolumeSpecName "kube-api-access-kxdzv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 14:09:48 crc kubenswrapper[4764]: I0309 14:09:48.899154 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage06-crc" (OuterVolumeSpecName: "glance") pod "349527e3-93e0-4342-845c-eb8775ab3e5a" (UID: "349527e3-93e0-4342-845c-eb8775ab3e5a"). InnerVolumeSpecName "local-storage06-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue ""
Mar 09 14:09:48 crc kubenswrapper[4764]: I0309 14:09:48.899984 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/349527e3-93e0-4342-845c-eb8775ab3e5a-ceph" (OuterVolumeSpecName: "ceph") pod "349527e3-93e0-4342-845c-eb8775ab3e5a" (UID: "349527e3-93e0-4342-845c-eb8775ab3e5a"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 14:09:48 crc kubenswrapper[4764]: I0309 14:09:48.901004 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/349527e3-93e0-4342-845c-eb8775ab3e5a-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "349527e3-93e0-4342-845c-eb8775ab3e5a" (UID: "349527e3-93e0-4342-845c-eb8775ab3e5a"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 09 14:09:48 crc kubenswrapper[4764]: I0309 14:09:48.930263 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/349527e3-93e0-4342-845c-eb8775ab3e5a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "349527e3-93e0-4342-845c-eb8775ab3e5a" (UID: "349527e3-93e0-4342-845c-eb8775ab3e5a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 14:09:48 crc kubenswrapper[4764]: I0309 14:09:48.950828 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/349527e3-93e0-4342-845c-eb8775ab3e5a-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "349527e3-93e0-4342-845c-eb8775ab3e5a" (UID: "349527e3-93e0-4342-845c-eb8775ab3e5a"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 14:09:48 crc kubenswrapper[4764]: I0309 14:09:48.994690 4764 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/349527e3-93e0-4342-845c-eb8775ab3e5a-ceph\") on node \"crc\" DevicePath \"\""
Mar 09 14:09:48 crc kubenswrapper[4764]: I0309 14:09:48.994733 4764 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/349527e3-93e0-4342-845c-eb8775ab3e5a-internal-tls-certs\") on node \"crc\" DevicePath \"\""
Mar 09 14:09:48 crc kubenswrapper[4764]: I0309 14:09:48.994763 4764 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" "
Mar 09 14:09:48 crc kubenswrapper[4764]: I0309 14:09:48.994775 4764 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/349527e3-93e0-4342-845c-eb8775ab3e5a-httpd-run\") on node \"crc\" DevicePath \"\""
Mar 09 14:09:48 crc kubenswrapper[4764]: I0309 14:09:48.994785 4764 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/349527e3-93e0-4342-845c-eb8775ab3e5a-scripts\") on node \"crc\" DevicePath \"\""
Mar 09 14:09:48 crc kubenswrapper[4764]: I0309 14:09:48.994797 4764 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/349527e3-93e0-4342-845c-eb8775ab3e5a-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 09 14:09:48 crc kubenswrapper[4764]: I0309 14:09:48.994820 4764 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/349527e3-93e0-4342-845c-eb8775ab3e5a-logs\") on node \"crc\" DevicePath \"\""
Mar 09 14:09:48 crc kubenswrapper[4764]: I0309 14:09:48.994834 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kxdzv\" (UniqueName: \"kubernetes.io/projected/349527e3-93e0-4342-845c-eb8775ab3e5a-kube-api-access-kxdzv\") on node \"crc\" DevicePath \"\""
Mar 09 14:09:49 crc kubenswrapper[4764]: I0309 14:09:49.022663 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/349527e3-93e0-4342-845c-eb8775ab3e5a-config-data" (OuterVolumeSpecName: "config-data") pod "349527e3-93e0-4342-845c-eb8775ab3e5a" (UID: "349527e3-93e0-4342-845c-eb8775ab3e5a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 14:09:49 crc kubenswrapper[4764]: I0309 14:09:49.025145 4764 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage06-crc" (UniqueName: "kubernetes.io/local-volume/local-storage06-crc") on node "crc"
Mar 09 14:09:49 crc kubenswrapper[4764]: I0309 14:09:49.096933 4764 reconciler_common.go:293] "Volume detached for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" DevicePath \"\""
Mar 09 14:09:49 crc kubenswrapper[4764]: I0309 14:09:49.096968 4764 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/349527e3-93e0-4342-845c-eb8775ab3e5a-config-data\") on node \"crc\" DevicePath \"\""
Mar 09 14:09:49 crc kubenswrapper[4764]: I0309 14:09:49.808106 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Mar 09 14:09:49 crc kubenswrapper[4764]: I0309 14:09:49.954393 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Mar 09 14:09:49 crc kubenswrapper[4764]: I0309 14:09:49.981181 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"]
Mar 09 14:09:50 crc kubenswrapper[4764]: I0309 14:09:50.005580 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"]
Mar 09 14:09:50 crc kubenswrapper[4764]: I0309 14:09:50.020237 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"]
Mar 09 14:09:50 crc kubenswrapper[4764]: E0309 14:09:50.020924 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="49ee7179-02d8-4e07-9bb8-fce22456e804" containerName="glance-httpd"
Mar 09 14:09:50 crc kubenswrapper[4764]: I0309 14:09:50.020954 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="49ee7179-02d8-4e07-9bb8-fce22456e804" containerName="glance-httpd"
Mar 09 14:09:50 crc kubenswrapper[4764]: E0309 14:09:50.020982 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="349527e3-93e0-4342-845c-eb8775ab3e5a" containerName="glance-httpd"
Mar 09 14:09:50 crc kubenswrapper[4764]: I0309 14:09:50.020991 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="349527e3-93e0-4342-845c-eb8775ab3e5a" containerName="glance-httpd"
Mar 09 14:09:50 crc kubenswrapper[4764]: E0309 14:09:50.021019 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="349527e3-93e0-4342-845c-eb8775ab3e5a" containerName="glance-log"
Mar 09 14:09:50 crc kubenswrapper[4764]: I0309 14:09:50.021028 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="349527e3-93e0-4342-845c-eb8775ab3e5a" containerName="glance-log"
Mar 09 14:09:50 crc kubenswrapper[4764]: E0309 14:09:50.021039 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="49ee7179-02d8-4e07-9bb8-fce22456e804" containerName="glance-log"
Mar 09 14:09:50 crc kubenswrapper[4764]: I0309 14:09:50.021047 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="49ee7179-02d8-4e07-9bb8-fce22456e804" containerName="glance-log"
Mar 09 14:09:50 crc kubenswrapper[4764]: E0309 14:09:50.021062 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ce9bce5-9c23-40ac-9683-6fb232e32c3c" containerName="mariadb-account-create-update"
Mar 09 14:09:50 crc kubenswrapper[4764]: I0309 14:09:50.021071 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ce9bce5-9c23-40ac-9683-6fb232e32c3c" containerName="mariadb-account-create-update"
Mar 09 14:09:50 crc kubenswrapper[4764]: I0309 14:09:50.021288 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="49ee7179-02d8-4e07-9bb8-fce22456e804" containerName="glance-log"
Mar 09 14:09:50 crc kubenswrapper[4764]: I0309 14:09:50.021312 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="349527e3-93e0-4342-845c-eb8775ab3e5a" containerName="glance-log"
Mar 09 14:09:50 crc kubenswrapper[4764]: I0309 14:09:50.021324 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="7ce9bce5-9c23-40ac-9683-6fb232e32c3c" containerName="mariadb-account-create-update"
Mar 09 14:09:50 crc kubenswrapper[4764]: I0309 14:09:50.021338 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="49ee7179-02d8-4e07-9bb8-fce22456e804" containerName="glance-httpd"
Mar 09 14:09:50 crc kubenswrapper[4764]: I0309 14:09:50.021359 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="349527e3-93e0-4342-845c-eb8775ab3e5a" containerName="glance-httpd"
Mar 09 14:09:50 crc kubenswrapper[4764]: I0309 14:09:50.022735 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Mar 09 14:09:50 crc kubenswrapper[4764]: I0309 14:09:50.026226 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data"
Mar 09 14:09:50 crc kubenswrapper[4764]: I0309 14:09:50.026337 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc"
Mar 09 14:09:50 crc kubenswrapper[4764]: I0309 14:09:50.047857 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/49ee7179-02d8-4e07-9bb8-fce22456e804-httpd-run\") pod \"49ee7179-02d8-4e07-9bb8-fce22456e804\" (UID: \"49ee7179-02d8-4e07-9bb8-fce22456e804\") "
Mar 09 14:09:50 crc kubenswrapper[4764]: I0309 14:09:50.048117 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/49ee7179-02d8-4e07-9bb8-fce22456e804-config-data\") pod \"49ee7179-02d8-4e07-9bb8-fce22456e804\" (UID: \"49ee7179-02d8-4e07-9bb8-fce22456e804\") "
Mar 09 14:09:50 crc kubenswrapper[4764]: I0309 14:09:50.048175 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/49ee7179-02d8-4e07-9bb8-fce22456e804-public-tls-certs\") pod \"49ee7179-02d8-4e07-9bb8-fce22456e804\" (UID: \"49ee7179-02d8-4e07-9bb8-fce22456e804\") "
Mar 09 14:09:50 crc kubenswrapper[4764]: I0309 14:09:50.048228 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"49ee7179-02d8-4e07-9bb8-fce22456e804\" (UID: \"49ee7179-02d8-4e07-9bb8-fce22456e804\") "
Mar 09 14:09:50 crc kubenswrapper[4764]: I0309 14:09:50.049243 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/49ee7179-02d8-4e07-9bb8-fce22456e804-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "49ee7179-02d8-4e07-9bb8-fce22456e804" (UID: "49ee7179-02d8-4e07-9bb8-fce22456e804"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 09 14:09:50 crc kubenswrapper[4764]: I0309 14:09:50.049718 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/49ee7179-02d8-4e07-9bb8-fce22456e804-logs\") pod \"49ee7179-02d8-4e07-9bb8-fce22456e804\" (UID: \"49ee7179-02d8-4e07-9bb8-fce22456e804\") "
Mar 09 14:09:50 crc kubenswrapper[4764]: I0309 14:09:50.049791 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4t465\" (UniqueName: \"kubernetes.io/projected/49ee7179-02d8-4e07-9bb8-fce22456e804-kube-api-access-4t465\") pod \"49ee7179-02d8-4e07-9bb8-fce22456e804\" (UID: \"49ee7179-02d8-4e07-9bb8-fce22456e804\") "
Mar 09 14:09:50 crc kubenswrapper[4764]: I0309 14:09:50.049825 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/49ee7179-02d8-4e07-9bb8-fce22456e804-scripts\") pod \"49ee7179-02d8-4e07-9bb8-fce22456e804\" (UID: \"49ee7179-02d8-4e07-9bb8-fce22456e804\") "
Mar 09 14:09:50 crc kubenswrapper[4764]: I0309 14:09:50.049890 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/49ee7179-02d8-4e07-9bb8-fce22456e804-ceph\") pod \"49ee7179-02d8-4e07-9bb8-fce22456e804\" (UID: \"49ee7179-02d8-4e07-9bb8-fce22456e804\") "
Mar 09 14:09:50 crc kubenswrapper[4764]: I0309 14:09:50.050051 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49ee7179-02d8-4e07-9bb8-fce22456e804-combined-ca-bundle\") pod \"49ee7179-02d8-4e07-9bb8-fce22456e804\" (UID: \"49ee7179-02d8-4e07-9bb8-fce22456e804\") "
Mar 09 14:09:50 crc kubenswrapper[4764]: I0309 14:09:50.050189 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/49ee7179-02d8-4e07-9bb8-fce22456e804-logs" (OuterVolumeSpecName: "logs") pod "49ee7179-02d8-4e07-9bb8-fce22456e804" (UID: "49ee7179-02d8-4e07-9bb8-fce22456e804"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 09 14:09:50 crc kubenswrapper[4764]: I0309 14:09:50.051239 4764 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/49ee7179-02d8-4e07-9bb8-fce22456e804-httpd-run\") on node \"crc\" DevicePath \"\""
Mar 09 14:09:50 crc kubenswrapper[4764]: I0309 14:09:50.051261 4764 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/49ee7179-02d8-4e07-9bb8-fce22456e804-logs\") on node \"crc\" DevicePath \"\""
Mar 09 14:09:50 crc kubenswrapper[4764]: I0309 14:09:50.058321 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49ee7179-02d8-4e07-9bb8-fce22456e804-scripts" (OuterVolumeSpecName: "scripts") pod "49ee7179-02d8-4e07-9bb8-fce22456e804" (UID: "49ee7179-02d8-4e07-9bb8-fce22456e804"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 14:09:50 crc kubenswrapper[4764]: I0309 14:09:50.063753 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"]
Mar 09 14:09:50 crc kubenswrapper[4764]: I0309 14:09:50.069232 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ee7179-02d8-4e07-9bb8-fce22456e804-kube-api-access-4t465" (OuterVolumeSpecName: "kube-api-access-4t465") pod "49ee7179-02d8-4e07-9bb8-fce22456e804" (UID: "49ee7179-02d8-4e07-9bb8-fce22456e804"). InnerVolumeSpecName "kube-api-access-4t465". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 14:09:50 crc kubenswrapper[4764]: I0309 14:09:50.071243 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ee7179-02d8-4e07-9bb8-fce22456e804-ceph" (OuterVolumeSpecName: "ceph") pod "49ee7179-02d8-4e07-9bb8-fce22456e804" (UID: "49ee7179-02d8-4e07-9bb8-fce22456e804"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 14:09:50 crc kubenswrapper[4764]: I0309 14:09:50.076992 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage07-crc" (OuterVolumeSpecName: "glance") pod "49ee7179-02d8-4e07-9bb8-fce22456e804" (UID: "49ee7179-02d8-4e07-9bb8-fce22456e804"). InnerVolumeSpecName "local-storage07-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue ""
Mar 09 14:09:50 crc kubenswrapper[4764]: I0309 14:09:50.118620 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49ee7179-02d8-4e07-9bb8-fce22456e804-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "49ee7179-02d8-4e07-9bb8-fce22456e804" (UID: "49ee7179-02d8-4e07-9bb8-fce22456e804"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 14:09:50 crc kubenswrapper[4764]: I0309 14:09:50.123997 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49ee7179-02d8-4e07-9bb8-fce22456e804-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "49ee7179-02d8-4e07-9bb8-fce22456e804" (UID: "49ee7179-02d8-4e07-9bb8-fce22456e804"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 14:09:50 crc kubenswrapper[4764]: I0309 14:09:50.137884 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49ee7179-02d8-4e07-9bb8-fce22456e804-config-data" (OuterVolumeSpecName: "config-data") pod "49ee7179-02d8-4e07-9bb8-fce22456e804" (UID: "49ee7179-02d8-4e07-9bb8-fce22456e804"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 14:09:50 crc kubenswrapper[4764]: I0309 14:09:50.152952 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/66d58a1b-5d94-4d28-bcb3-0b20f0516eab-logs\") pod \"glance-default-internal-api-0\" (UID: \"66d58a1b-5d94-4d28-bcb3-0b20f0516eab\") " pod="openstack/glance-default-internal-api-0"
Mar 09 14:09:50 crc kubenswrapper[4764]: I0309 14:09:50.153049 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/66d58a1b-5d94-4d28-bcb3-0b20f0516eab-scripts\") pod \"glance-default-internal-api-0\" (UID: \"66d58a1b-5d94-4d28-bcb3-0b20f0516eab\") " pod="openstack/glance-default-internal-api-0"
Mar 09 14:09:50 crc kubenswrapper[4764]: I0309 14:09:50.153159 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/66d58a1b-5d94-4d28-bcb3-0b20f0516eab-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"66d58a1b-5d94-4d28-bcb3-0b20f0516eab\") " pod="openstack/glance-default-internal-api-0"
Mar 09 14:09:50 crc kubenswrapper[4764]: I0309 14:09:50.153185 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/66d58a1b-5d94-4d28-bcb3-0b20f0516eab-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"66d58a1b-5d94-4d28-bcb3-0b20f0516eab\") " pod="openstack/glance-default-internal-api-0"
Mar 09 14:09:50 crc kubenswrapper[4764]: I0309 14:09:50.153217 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/66d58a1b-5d94-4d28-bcb3-0b20f0516eab-config-data\") pod \"glance-default-internal-api-0\" (UID: \"66d58a1b-5d94-4d28-bcb3-0b20f0516eab\") " pod="openstack/glance-default-internal-api-0"
Mar 09 14:09:50 crc kubenswrapper[4764]: I0309 14:09:50.153244 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pgvwl\" (UniqueName: \"kubernetes.io/projected/66d58a1b-5d94-4d28-bcb3-0b20f0516eab-kube-api-access-pgvwl\") pod \"glance-default-internal-api-0\" (UID: \"66d58a1b-5d94-4d28-bcb3-0b20f0516eab\") " pod="openstack/glance-default-internal-api-0"
Mar 09 14:09:50 crc kubenswrapper[4764]: I0309 14:09:50.153289 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/66d58a1b-5d94-4d28-bcb3-0b20f0516eab-ceph\") pod \"glance-default-internal-api-0\" (UID: \"66d58a1b-5d94-4d28-bcb3-0b20f0516eab\") " pod="openstack/glance-default-internal-api-0"
Mar 09 14:09:50 crc kubenswrapper[4764]: I0309 14:09:50.153326 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: \"66d58a1b-5d94-4d28-bcb3-0b20f0516eab\") " pod="openstack/glance-default-internal-api-0"
Mar 09 14:09:50 crc kubenswrapper[4764]: I0309 14:09:50.153348 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/66d58a1b-5d94-4d28-bcb3-0b20f0516eab-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"66d58a1b-5d94-4d28-bcb3-0b20f0516eab\") " pod="openstack/glance-default-internal-api-0"
Mar 09 14:09:50 crc kubenswrapper[4764]: I0309 14:09:50.153419 4764 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/49ee7179-02d8-4e07-9bb8-fce22456e804-config-data\") on node \"crc\" DevicePath \"\""
Mar 09 14:09:50 crc kubenswrapper[4764]: I0309 14:09:50.153434 4764 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/49ee7179-02d8-4e07-9bb8-fce22456e804-public-tls-certs\") on node \"crc\" DevicePath \"\""
Mar 09 14:09:50 crc kubenswrapper[4764]: I0309 14:09:50.153460 4764 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" "
Mar 09 14:09:50 crc kubenswrapper[4764]: I0309 14:09:50.153473 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4t465\" (UniqueName: \"kubernetes.io/projected/49ee7179-02d8-4e07-9bb8-fce22456e804-kube-api-access-4t465\") on node \"crc\" DevicePath \"\""
Mar 09 14:09:50 crc kubenswrapper[4764]: I0309 14:09:50.153483 4764 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/49ee7179-02d8-4e07-9bb8-fce22456e804-scripts\") on node \"crc\" DevicePath \"\""
Mar 09 14:09:50 crc kubenswrapper[4764]: I0309 14:09:50.153493 4764 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/49ee7179-02d8-4e07-9bb8-fce22456e804-ceph\") on node \"crc\" DevicePath \"\""
Mar 09 14:09:50 crc kubenswrapper[4764]: I0309 14:09:50.153502 4764 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49ee7179-02d8-4e07-9bb8-fce22456e804-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 09 14:09:50 crc kubenswrapper[4764]: I0309 14:09:50.178853 4764 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage07-crc" (UniqueName: "kubernetes.io/local-volume/local-storage07-crc") on node "crc"
Mar 09 14:09:50 crc kubenswrapper[4764]: I0309 14:09:50.256078 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/66d58a1b-5d94-4d28-bcb3-0b20f0516eab-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"66d58a1b-5d94-4d28-bcb3-0b20f0516eab\") " pod="openstack/glance-default-internal-api-0"
Mar 09 14:09:50 crc kubenswrapper[4764]: I0309 14:09:50.256129 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/66d58a1b-5d94-4d28-bcb3-0b20f0516eab-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"66d58a1b-5d94-4d28-bcb3-0b20f0516eab\") " pod="openstack/glance-default-internal-api-0"
Mar 09 14:09:50 crc kubenswrapper[4764]: I0309 14:09:50.256164 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/66d58a1b-5d94-4d28-bcb3-0b20f0516eab-config-data\") pod \"glance-default-internal-api-0\" (UID: \"66d58a1b-5d94-4d28-bcb3-0b20f0516eab\") " pod="openstack/glance-default-internal-api-0"
Mar 09 14:09:50 crc kubenswrapper[4764]: I0309 14:09:50.256185 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pgvwl\" (UniqueName: \"kubernetes.io/projected/66d58a1b-5d94-4d28-bcb3-0b20f0516eab-kube-api-access-pgvwl\") pod \"glance-default-internal-api-0\" (UID: \"66d58a1b-5d94-4d28-bcb3-0b20f0516eab\") " pod="openstack/glance-default-internal-api-0"
Mar 09 14:09:50 crc kubenswrapper[4764]: I0309 14:09:50.256736 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName:
\"kubernetes.io/projected/66d58a1b-5d94-4d28-bcb3-0b20f0516eab-ceph\") pod \"glance-default-internal-api-0\" (UID: \"66d58a1b-5d94-4d28-bcb3-0b20f0516eab\") " pod="openstack/glance-default-internal-api-0" Mar 09 14:09:50 crc kubenswrapper[4764]: I0309 14:09:50.256799 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: \"66d58a1b-5d94-4d28-bcb3-0b20f0516eab\") " pod="openstack/glance-default-internal-api-0" Mar 09 14:09:50 crc kubenswrapper[4764]: I0309 14:09:50.256826 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/66d58a1b-5d94-4d28-bcb3-0b20f0516eab-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"66d58a1b-5d94-4d28-bcb3-0b20f0516eab\") " pod="openstack/glance-default-internal-api-0" Mar 09 14:09:50 crc kubenswrapper[4764]: I0309 14:09:50.256959 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/66d58a1b-5d94-4d28-bcb3-0b20f0516eab-logs\") pod \"glance-default-internal-api-0\" (UID: \"66d58a1b-5d94-4d28-bcb3-0b20f0516eab\") " pod="openstack/glance-default-internal-api-0" Mar 09 14:09:50 crc kubenswrapper[4764]: I0309 14:09:50.257030 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/66d58a1b-5d94-4d28-bcb3-0b20f0516eab-scripts\") pod \"glance-default-internal-api-0\" (UID: \"66d58a1b-5d94-4d28-bcb3-0b20f0516eab\") " pod="openstack/glance-default-internal-api-0" Mar 09 14:09:50 crc kubenswrapper[4764]: I0309 14:09:50.257094 4764 reconciler_common.go:293] "Volume detached for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" DevicePath \"\"" Mar 09 14:09:50 crc kubenswrapper[4764]: 
I0309 14:09:50.258117 4764 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: \"66d58a1b-5d94-4d28-bcb3-0b20f0516eab\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/glance-default-internal-api-0" Mar 09 14:09:50 crc kubenswrapper[4764]: I0309 14:09:50.259738 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/66d58a1b-5d94-4d28-bcb3-0b20f0516eab-logs\") pod \"glance-default-internal-api-0\" (UID: \"66d58a1b-5d94-4d28-bcb3-0b20f0516eab\") " pod="openstack/glance-default-internal-api-0" Mar 09 14:09:50 crc kubenswrapper[4764]: I0309 14:09:50.260252 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/66d58a1b-5d94-4d28-bcb3-0b20f0516eab-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"66d58a1b-5d94-4d28-bcb3-0b20f0516eab\") " pod="openstack/glance-default-internal-api-0" Mar 09 14:09:50 crc kubenswrapper[4764]: I0309 14:09:50.266000 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/66d58a1b-5d94-4d28-bcb3-0b20f0516eab-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"66d58a1b-5d94-4d28-bcb3-0b20f0516eab\") " pod="openstack/glance-default-internal-api-0" Mar 09 14:09:50 crc kubenswrapper[4764]: I0309 14:09:50.266183 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/66d58a1b-5d94-4d28-bcb3-0b20f0516eab-ceph\") pod \"glance-default-internal-api-0\" (UID: \"66d58a1b-5d94-4d28-bcb3-0b20f0516eab\") " pod="openstack/glance-default-internal-api-0" Mar 09 14:09:50 crc kubenswrapper[4764]: I0309 14:09:50.267585 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/66d58a1b-5d94-4d28-bcb3-0b20f0516eab-config-data\") pod \"glance-default-internal-api-0\" (UID: \"66d58a1b-5d94-4d28-bcb3-0b20f0516eab\") " pod="openstack/glance-default-internal-api-0" Mar 09 14:09:50 crc kubenswrapper[4764]: I0309 14:09:50.267625 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/66d58a1b-5d94-4d28-bcb3-0b20f0516eab-scripts\") pod \"glance-default-internal-api-0\" (UID: \"66d58a1b-5d94-4d28-bcb3-0b20f0516eab\") " pod="openstack/glance-default-internal-api-0" Mar 09 14:09:50 crc kubenswrapper[4764]: I0309 14:09:50.270536 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/66d58a1b-5d94-4d28-bcb3-0b20f0516eab-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"66d58a1b-5d94-4d28-bcb3-0b20f0516eab\") " pod="openstack/glance-default-internal-api-0" Mar 09 14:09:50 crc kubenswrapper[4764]: I0309 14:09:50.286767 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pgvwl\" (UniqueName: \"kubernetes.io/projected/66d58a1b-5d94-4d28-bcb3-0b20f0516eab-kube-api-access-pgvwl\") pod \"glance-default-internal-api-0\" (UID: \"66d58a1b-5d94-4d28-bcb3-0b20f0516eab\") " pod="openstack/glance-default-internal-api-0" Mar 09 14:09:50 crc kubenswrapper[4764]: I0309 14:09:50.333158 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: \"66d58a1b-5d94-4d28-bcb3-0b20f0516eab\") " pod="openstack/glance-default-internal-api-0" Mar 09 14:09:50 crc kubenswrapper[4764]: I0309 14:09:50.350163 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 09 14:09:50 crc kubenswrapper[4764]: I0309 14:09:50.823657 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"49ee7179-02d8-4e07-9bb8-fce22456e804","Type":"ContainerDied","Data":"1f84cee9a80ba50e44abce0cb213778729f5e3cb3d6103fdf6b0ebee436d5285"} Mar 09 14:09:50 crc kubenswrapper[4764]: I0309 14:09:50.824350 4764 scope.go:117] "RemoveContainer" containerID="1d3f3c09a65ee0d96c5761cf37afd174b0dd657fc73b2eb650e087f0fb7b89e4" Mar 09 14:09:50 crc kubenswrapper[4764]: I0309 14:09:50.824528 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 09 14:09:50 crc kubenswrapper[4764]: I0309 14:09:50.879439 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 09 14:09:50 crc kubenswrapper[4764]: I0309 14:09:50.903560 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 09 14:09:50 crc kubenswrapper[4764]: I0309 14:09:50.914288 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Mar 09 14:09:50 crc kubenswrapper[4764]: I0309 14:09:50.916619 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 09 14:09:50 crc kubenswrapper[4764]: I0309 14:09:50.920009 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Mar 09 14:09:50 crc kubenswrapper[4764]: I0309 14:09:50.921184 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Mar 09 14:09:50 crc kubenswrapper[4764]: I0309 14:09:50.930829 4764 scope.go:117] "RemoveContainer" containerID="07b51e4be4c9ba108c01cded8c1c80a772fa736483e292b6667b962df1606942" Mar 09 14:09:50 crc kubenswrapper[4764]: I0309 14:09:50.940075 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 09 14:09:50 crc kubenswrapper[4764]: I0309 14:09:50.982688 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xz2wk\" (UniqueName: \"kubernetes.io/projected/22563404-fb5a-4d95-bae1-dd24d6fcc8d1-kube-api-access-xz2wk\") pod \"glance-default-external-api-0\" (UID: \"22563404-fb5a-4d95-bae1-dd24d6fcc8d1\") " pod="openstack/glance-default-external-api-0" Mar 09 14:09:50 crc kubenswrapper[4764]: I0309 14:09:50.982891 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/22563404-fb5a-4d95-bae1-dd24d6fcc8d1-logs\") pod \"glance-default-external-api-0\" (UID: \"22563404-fb5a-4d95-bae1-dd24d6fcc8d1\") " pod="openstack/glance-default-external-api-0" Mar 09 14:09:50 crc kubenswrapper[4764]: I0309 14:09:50.982925 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/22563404-fb5a-4d95-bae1-dd24d6fcc8d1-ceph\") pod \"glance-default-external-api-0\" (UID: \"22563404-fb5a-4d95-bae1-dd24d6fcc8d1\") " pod="openstack/glance-default-external-api-0" Mar 09 14:09:50 crc 
kubenswrapper[4764]: I0309 14:09:50.982975 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/22563404-fb5a-4d95-bae1-dd24d6fcc8d1-config-data\") pod \"glance-default-external-api-0\" (UID: \"22563404-fb5a-4d95-bae1-dd24d6fcc8d1\") " pod="openstack/glance-default-external-api-0" Mar 09 14:09:50 crc kubenswrapper[4764]: I0309 14:09:50.983015 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/22563404-fb5a-4d95-bae1-dd24d6fcc8d1-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"22563404-fb5a-4d95-bae1-dd24d6fcc8d1\") " pod="openstack/glance-default-external-api-0" Mar 09 14:09:50 crc kubenswrapper[4764]: I0309 14:09:50.983088 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"22563404-fb5a-4d95-bae1-dd24d6fcc8d1\") " pod="openstack/glance-default-external-api-0" Mar 09 14:09:50 crc kubenswrapper[4764]: I0309 14:09:50.983262 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/22563404-fb5a-4d95-bae1-dd24d6fcc8d1-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"22563404-fb5a-4d95-bae1-dd24d6fcc8d1\") " pod="openstack/glance-default-external-api-0" Mar 09 14:09:50 crc kubenswrapper[4764]: I0309 14:09:50.983321 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/22563404-fb5a-4d95-bae1-dd24d6fcc8d1-scripts\") pod \"glance-default-external-api-0\" (UID: \"22563404-fb5a-4d95-bae1-dd24d6fcc8d1\") " pod="openstack/glance-default-external-api-0" Mar 09 14:09:50 crc 
kubenswrapper[4764]: I0309 14:09:50.983403 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/22563404-fb5a-4d95-bae1-dd24d6fcc8d1-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"22563404-fb5a-4d95-bae1-dd24d6fcc8d1\") " pod="openstack/glance-default-external-api-0" Mar 09 14:09:51 crc kubenswrapper[4764]: I0309 14:09:51.096266 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/22563404-fb5a-4d95-bae1-dd24d6fcc8d1-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"22563404-fb5a-4d95-bae1-dd24d6fcc8d1\") " pod="openstack/glance-default-external-api-0" Mar 09 14:09:51 crc kubenswrapper[4764]: I0309 14:09:51.096345 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"22563404-fb5a-4d95-bae1-dd24d6fcc8d1\") " pod="openstack/glance-default-external-api-0" Mar 09 14:09:51 crc kubenswrapper[4764]: I0309 14:09:51.096527 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/22563404-fb5a-4d95-bae1-dd24d6fcc8d1-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"22563404-fb5a-4d95-bae1-dd24d6fcc8d1\") " pod="openstack/glance-default-external-api-0" Mar 09 14:09:51 crc kubenswrapper[4764]: I0309 14:09:51.096568 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/22563404-fb5a-4d95-bae1-dd24d6fcc8d1-scripts\") pod \"glance-default-external-api-0\" (UID: \"22563404-fb5a-4d95-bae1-dd24d6fcc8d1\") " pod="openstack/glance-default-external-api-0" Mar 09 14:09:51 crc kubenswrapper[4764]: I0309 14:09:51.096659 4764 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/22563404-fb5a-4d95-bae1-dd24d6fcc8d1-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"22563404-fb5a-4d95-bae1-dd24d6fcc8d1\") " pod="openstack/glance-default-external-api-0" Mar 09 14:09:51 crc kubenswrapper[4764]: I0309 14:09:51.096785 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xz2wk\" (UniqueName: \"kubernetes.io/projected/22563404-fb5a-4d95-bae1-dd24d6fcc8d1-kube-api-access-xz2wk\") pod \"glance-default-external-api-0\" (UID: \"22563404-fb5a-4d95-bae1-dd24d6fcc8d1\") " pod="openstack/glance-default-external-api-0" Mar 09 14:09:51 crc kubenswrapper[4764]: I0309 14:09:51.096992 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/22563404-fb5a-4d95-bae1-dd24d6fcc8d1-logs\") pod \"glance-default-external-api-0\" (UID: \"22563404-fb5a-4d95-bae1-dd24d6fcc8d1\") " pod="openstack/glance-default-external-api-0" Mar 09 14:09:51 crc kubenswrapper[4764]: I0309 14:09:51.097025 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/22563404-fb5a-4d95-bae1-dd24d6fcc8d1-ceph\") pod \"glance-default-external-api-0\" (UID: \"22563404-fb5a-4d95-bae1-dd24d6fcc8d1\") " pod="openstack/glance-default-external-api-0" Mar 09 14:09:51 crc kubenswrapper[4764]: I0309 14:09:51.097047 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/22563404-fb5a-4d95-bae1-dd24d6fcc8d1-config-data\") pod \"glance-default-external-api-0\" (UID: \"22563404-fb5a-4d95-bae1-dd24d6fcc8d1\") " pod="openstack/glance-default-external-api-0" Mar 09 14:09:51 crc kubenswrapper[4764]: I0309 14:09:51.098827 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" 
(UniqueName: \"kubernetes.io/empty-dir/22563404-fb5a-4d95-bae1-dd24d6fcc8d1-logs\") pod \"glance-default-external-api-0\" (UID: \"22563404-fb5a-4d95-bae1-dd24d6fcc8d1\") " pod="openstack/glance-default-external-api-0" Mar 09 14:09:51 crc kubenswrapper[4764]: I0309 14:09:51.099139 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/22563404-fb5a-4d95-bae1-dd24d6fcc8d1-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"22563404-fb5a-4d95-bae1-dd24d6fcc8d1\") " pod="openstack/glance-default-external-api-0" Mar 09 14:09:51 crc kubenswrapper[4764]: I0309 14:09:51.099670 4764 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"22563404-fb5a-4d95-bae1-dd24d6fcc8d1\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/glance-default-external-api-0" Mar 09 14:09:51 crc kubenswrapper[4764]: I0309 14:09:51.102154 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 09 14:09:51 crc kubenswrapper[4764]: I0309 14:09:51.106638 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/22563404-fb5a-4d95-bae1-dd24d6fcc8d1-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"22563404-fb5a-4d95-bae1-dd24d6fcc8d1\") " pod="openstack/glance-default-external-api-0" Mar 09 14:09:51 crc kubenswrapper[4764]: I0309 14:09:51.109108 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/22563404-fb5a-4d95-bae1-dd24d6fcc8d1-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"22563404-fb5a-4d95-bae1-dd24d6fcc8d1\") " pod="openstack/glance-default-external-api-0" Mar 09 14:09:51 crc kubenswrapper[4764]: I0309 14:09:51.122534 
4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/22563404-fb5a-4d95-bae1-dd24d6fcc8d1-ceph\") pod \"glance-default-external-api-0\" (UID: \"22563404-fb5a-4d95-bae1-dd24d6fcc8d1\") " pod="openstack/glance-default-external-api-0" Mar 09 14:09:51 crc kubenswrapper[4764]: I0309 14:09:51.127004 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/22563404-fb5a-4d95-bae1-dd24d6fcc8d1-scripts\") pod \"glance-default-external-api-0\" (UID: \"22563404-fb5a-4d95-bae1-dd24d6fcc8d1\") " pod="openstack/glance-default-external-api-0" Mar 09 14:09:51 crc kubenswrapper[4764]: I0309 14:09:51.132432 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/22563404-fb5a-4d95-bae1-dd24d6fcc8d1-config-data\") pod \"glance-default-external-api-0\" (UID: \"22563404-fb5a-4d95-bae1-dd24d6fcc8d1\") " pod="openstack/glance-default-external-api-0" Mar 09 14:09:51 crc kubenswrapper[4764]: I0309 14:09:51.132949 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xz2wk\" (UniqueName: \"kubernetes.io/projected/22563404-fb5a-4d95-bae1-dd24d6fcc8d1-kube-api-access-xz2wk\") pod \"glance-default-external-api-0\" (UID: \"22563404-fb5a-4d95-bae1-dd24d6fcc8d1\") " pod="openstack/glance-default-external-api-0" Mar 09 14:09:51 crc kubenswrapper[4764]: I0309 14:09:51.168029 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"22563404-fb5a-4d95-bae1-dd24d6fcc8d1\") " pod="openstack/glance-default-external-api-0" Mar 09 14:09:51 crc kubenswrapper[4764]: I0309 14:09:51.261445 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 09 14:09:51 crc kubenswrapper[4764]: I0309 14:09:51.346679 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-volume-volume1-0" Mar 09 14:09:51 crc kubenswrapper[4764]: I0309 14:09:51.592706 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="349527e3-93e0-4342-845c-eb8775ab3e5a" path="/var/lib/kubelet/pods/349527e3-93e0-4342-845c-eb8775ab3e5a/volumes" Mar 09 14:09:51 crc kubenswrapper[4764]: I0309 14:09:51.594714 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ee7179-02d8-4e07-9bb8-fce22456e804" path="/var/lib/kubelet/pods/49ee7179-02d8-4e07-9bb8-fce22456e804/volumes" Mar 09 14:09:51 crc kubenswrapper[4764]: I0309 14:09:51.869765 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"66d58a1b-5d94-4d28-bcb3-0b20f0516eab","Type":"ContainerStarted","Data":"04f4bc8a503c5f3a059e9920c27317855087791ac437284180c815115a1d77ab"} Mar 09 14:09:51 crc kubenswrapper[4764]: I0309 14:09:51.933660 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-volume-volume1-0" Mar 09 14:09:52 crc kubenswrapper[4764]: I0309 14:09:52.101930 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 09 14:09:52 crc kubenswrapper[4764]: I0309 14:09:52.591860 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-backup-0" Mar 09 14:09:52 crc kubenswrapper[4764]: I0309 14:09:52.892733 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"66d58a1b-5d94-4d28-bcb3-0b20f0516eab","Type":"ContainerStarted","Data":"5efde14e4961a0c01b0d30dad46664f7ec8bdb3a095cf8f867dc1096e67930fe"} Mar 09 14:09:52 crc kubenswrapper[4764]: I0309 14:09:52.898012 4764 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"22563404-fb5a-4d95-bae1-dd24d6fcc8d1","Type":"ContainerStarted","Data":"945ab8a9c2f227bba0dd20eac0acb19c987f60d231f148f4ac3cb22c038524f8"} Mar 09 14:09:52 crc kubenswrapper[4764]: I0309 14:09:52.919496 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-backup-0" Mar 09 14:09:53 crc kubenswrapper[4764]: I0309 14:09:53.649799 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-db-sync-j6s45"] Mar 09 14:09:53 crc kubenswrapper[4764]: I0309 14:09:53.651726 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-db-sync-j6s45" Mar 09 14:09:53 crc kubenswrapper[4764]: I0309 14:09:53.655917 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-manila-dockercfg-692nx" Mar 09 14:09:53 crc kubenswrapper[4764]: I0309 14:09:53.677199 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-db-sync-j6s45"] Mar 09 14:09:53 crc kubenswrapper[4764]: I0309 14:09:53.677510 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-config-data" Mar 09 14:09:53 crc kubenswrapper[4764]: I0309 14:09:53.695424 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/6711cdff-410c-4d91-b172-c2065054c1be-job-config-data\") pod \"manila-db-sync-j6s45\" (UID: \"6711cdff-410c-4d91-b172-c2065054c1be\") " pod="openstack/manila-db-sync-j6s45" Mar 09 14:09:53 crc kubenswrapper[4764]: I0309 14:09:53.695764 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-svz5p\" (UniqueName: \"kubernetes.io/projected/6711cdff-410c-4d91-b172-c2065054c1be-kube-api-access-svz5p\") pod \"manila-db-sync-j6s45\" (UID: \"6711cdff-410c-4d91-b172-c2065054c1be\") " 
pod="openstack/manila-db-sync-j6s45" Mar 09 14:09:53 crc kubenswrapper[4764]: I0309 14:09:53.696142 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6711cdff-410c-4d91-b172-c2065054c1be-config-data\") pod \"manila-db-sync-j6s45\" (UID: \"6711cdff-410c-4d91-b172-c2065054c1be\") " pod="openstack/manila-db-sync-j6s45" Mar 09 14:09:53 crc kubenswrapper[4764]: I0309 14:09:53.696472 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6711cdff-410c-4d91-b172-c2065054c1be-combined-ca-bundle\") pod \"manila-db-sync-j6s45\" (UID: \"6711cdff-410c-4d91-b172-c2065054c1be\") " pod="openstack/manila-db-sync-j6s45" Mar 09 14:09:53 crc kubenswrapper[4764]: I0309 14:09:53.799136 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6711cdff-410c-4d91-b172-c2065054c1be-config-data\") pod \"manila-db-sync-j6s45\" (UID: \"6711cdff-410c-4d91-b172-c2065054c1be\") " pod="openstack/manila-db-sync-j6s45" Mar 09 14:09:53 crc kubenswrapper[4764]: I0309 14:09:53.800501 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6711cdff-410c-4d91-b172-c2065054c1be-combined-ca-bundle\") pod \"manila-db-sync-j6s45\" (UID: \"6711cdff-410c-4d91-b172-c2065054c1be\") " pod="openstack/manila-db-sync-j6s45" Mar 09 14:09:53 crc kubenswrapper[4764]: I0309 14:09:53.800617 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/6711cdff-410c-4d91-b172-c2065054c1be-job-config-data\") pod \"manila-db-sync-j6s45\" (UID: \"6711cdff-410c-4d91-b172-c2065054c1be\") " pod="openstack/manila-db-sync-j6s45" Mar 09 14:09:53 crc kubenswrapper[4764]: I0309 14:09:53.800702 4764 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-svz5p\" (UniqueName: \"kubernetes.io/projected/6711cdff-410c-4d91-b172-c2065054c1be-kube-api-access-svz5p\") pod \"manila-db-sync-j6s45\" (UID: \"6711cdff-410c-4d91-b172-c2065054c1be\") " pod="openstack/manila-db-sync-j6s45" Mar 09 14:09:53 crc kubenswrapper[4764]: I0309 14:09:53.812155 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6711cdff-410c-4d91-b172-c2065054c1be-combined-ca-bundle\") pod \"manila-db-sync-j6s45\" (UID: \"6711cdff-410c-4d91-b172-c2065054c1be\") " pod="openstack/manila-db-sync-j6s45" Mar 09 14:09:53 crc kubenswrapper[4764]: I0309 14:09:53.812297 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6711cdff-410c-4d91-b172-c2065054c1be-config-data\") pod \"manila-db-sync-j6s45\" (UID: \"6711cdff-410c-4d91-b172-c2065054c1be\") " pod="openstack/manila-db-sync-j6s45" Mar 09 14:09:53 crc kubenswrapper[4764]: I0309 14:09:53.826479 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-svz5p\" (UniqueName: \"kubernetes.io/projected/6711cdff-410c-4d91-b172-c2065054c1be-kube-api-access-svz5p\") pod \"manila-db-sync-j6s45\" (UID: \"6711cdff-410c-4d91-b172-c2065054c1be\") " pod="openstack/manila-db-sync-j6s45" Mar 09 14:09:53 crc kubenswrapper[4764]: I0309 14:09:53.830188 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/6711cdff-410c-4d91-b172-c2065054c1be-job-config-data\") pod \"manila-db-sync-j6s45\" (UID: \"6711cdff-410c-4d91-b172-c2065054c1be\") " pod="openstack/manila-db-sync-j6s45" Mar 09 14:09:53 crc kubenswrapper[4764]: I0309 14:09:53.920307 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" 
event={"ID":"22563404-fb5a-4d95-bae1-dd24d6fcc8d1","Type":"ContainerStarted","Data":"1a1e2836425112167dc3a353e0bfab1f8e138d9c7e0337213050fc6ff682af19"} Mar 09 14:09:53 crc kubenswrapper[4764]: I0309 14:09:53.924355 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"66d58a1b-5d94-4d28-bcb3-0b20f0516eab","Type":"ContainerStarted","Data":"bd4eb58d364d43b43f36b9f6d2633e35d7a9461a68f5aaca98bd22b9d851df9a"} Mar 09 14:09:53 crc kubenswrapper[4764]: I0309 14:09:53.962391 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=4.962361898 podStartE2EDuration="4.962361898s" podCreationTimestamp="2026-03-09 14:09:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 14:09:53.950526402 +0000 UTC m=+2949.200698320" watchObservedRunningTime="2026-03-09 14:09:53.962361898 +0000 UTC m=+2949.212533816" Mar 09 14:09:54 crc kubenswrapper[4764]: I0309 14:09:54.006517 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-db-sync-j6s45" Mar 09 14:09:58 crc kubenswrapper[4764]: I0309 14:09:58.370800 4764 patch_prober.go:28] interesting pod/machine-config-daemon-xxczl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 09 14:09:58 crc kubenswrapper[4764]: I0309 14:09:58.371718 4764 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 09 14:09:59 crc kubenswrapper[4764]: I0309 14:09:59.980757 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-db-sync-j6s45"] Mar 09 14:09:59 crc kubenswrapper[4764]: W0309 14:09:59.998718 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6711cdff_410c_4d91_b172_c2065054c1be.slice/crio-feee9d84b5899b620a2823aa6ab377cd98db0a958fd8bd41a85078ce9d01392c WatchSource:0}: Error finding container feee9d84b5899b620a2823aa6ab377cd98db0a958fd8bd41a85078ce9d01392c: Status 404 returned error can't find the container with id feee9d84b5899b620a2823aa6ab377cd98db0a958fd8bd41a85078ce9d01392c Mar 09 14:10:00 crc kubenswrapper[4764]: I0309 14:10:00.016722 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-55f8b7fc4c-6rdd7" event={"ID":"6b944cf9-8278-4b16-b09c-0da6a2519b2a","Type":"ContainerStarted","Data":"1f4d4a3494bb93e1fe6318e4f872bd8ce3d51c074d2638a3dabf2ca00148a1d1"} Mar 09 14:10:00 crc kubenswrapper[4764]: I0309 14:10:00.018938 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-56bb55c768-vchmw" 
event={"ID":"47ef29f6-4627-4b84-968d-db9d7ed438da","Type":"ContainerStarted","Data":"2cde99c478c4a636d0754a9a4064c4e5d79e6ed799daf627481c02df995be0c3"} Mar 09 14:10:00 crc kubenswrapper[4764]: I0309 14:10:00.023541 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-797d44c9b-wrlx7" event={"ID":"e00fc104-ec73-4190-a598-86de7ca6cfa5","Type":"ContainerStarted","Data":"ed7bd666dbaa56d7a77980118a8739a44cafc9a966ea8469e78e592a4a49ac96"} Mar 09 14:10:00 crc kubenswrapper[4764]: I0309 14:10:00.026267 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-9d94c4c7-px8jh" event={"ID":"849dc0e6-d7a3-4745-8cc0-7b7af3e4d243","Type":"ContainerStarted","Data":"fabc9d27abd14d14db86a85f4413de77c2fd101b8722a207f502ad9ee60b4405"} Mar 09 14:10:00 crc kubenswrapper[4764]: I0309 14:10:00.152279 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29551090-8wdlp"] Mar 09 14:10:00 crc kubenswrapper[4764]: I0309 14:10:00.154136 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551090-8wdlp" Mar 09 14:10:00 crc kubenswrapper[4764]: I0309 14:10:00.160084 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-4s7mr" Mar 09 14:10:00 crc kubenswrapper[4764]: I0309 14:10:00.161592 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 09 14:10:00 crc kubenswrapper[4764]: I0309 14:10:00.161774 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 09 14:10:00 crc kubenswrapper[4764]: I0309 14:10:00.164365 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551090-8wdlp"] Mar 09 14:10:00 crc kubenswrapper[4764]: I0309 14:10:00.302954 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nqk4g\" (UniqueName: \"kubernetes.io/projected/4b56c1fe-b4de-45ba-8ca2-7bae98a2e97e-kube-api-access-nqk4g\") pod \"auto-csr-approver-29551090-8wdlp\" (UID: \"4b56c1fe-b4de-45ba-8ca2-7bae98a2e97e\") " pod="openshift-infra/auto-csr-approver-29551090-8wdlp" Mar 09 14:10:00 crc kubenswrapper[4764]: I0309 14:10:00.351248 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Mar 09 14:10:00 crc kubenswrapper[4764]: I0309 14:10:00.353699 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Mar 09 14:10:00 crc kubenswrapper[4764]: I0309 14:10:00.386807 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Mar 09 14:10:00 crc kubenswrapper[4764]: I0309 14:10:00.398919 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Mar 09 14:10:00 crc kubenswrapper[4764]: 
I0309 14:10:00.405990 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nqk4g\" (UniqueName: \"kubernetes.io/projected/4b56c1fe-b4de-45ba-8ca2-7bae98a2e97e-kube-api-access-nqk4g\") pod \"auto-csr-approver-29551090-8wdlp\" (UID: \"4b56c1fe-b4de-45ba-8ca2-7bae98a2e97e\") " pod="openshift-infra/auto-csr-approver-29551090-8wdlp" Mar 09 14:10:00 crc kubenswrapper[4764]: I0309 14:10:00.428388 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nqk4g\" (UniqueName: \"kubernetes.io/projected/4b56c1fe-b4de-45ba-8ca2-7bae98a2e97e-kube-api-access-nqk4g\") pod \"auto-csr-approver-29551090-8wdlp\" (UID: \"4b56c1fe-b4de-45ba-8ca2-7bae98a2e97e\") " pod="openshift-infra/auto-csr-approver-29551090-8wdlp" Mar 09 14:10:00 crc kubenswrapper[4764]: I0309 14:10:00.474749 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551090-8wdlp" Mar 09 14:10:00 crc kubenswrapper[4764]: I0309 14:10:00.820764 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551090-8wdlp"] Mar 09 14:10:00 crc kubenswrapper[4764]: W0309 14:10:00.822794 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4b56c1fe_b4de_45ba_8ca2_7bae98a2e97e.slice/crio-67b8f22e1dd658bc6fc7c70355b7dbf66bd2db713618122623ec70d154f78240 WatchSource:0}: Error finding container 67b8f22e1dd658bc6fc7c70355b7dbf66bd2db713618122623ec70d154f78240: Status 404 returned error can't find the container with id 67b8f22e1dd658bc6fc7c70355b7dbf66bd2db713618122623ec70d154f78240 Mar 09 14:10:01 crc kubenswrapper[4764]: I0309 14:10:01.043144 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-sync-j6s45" 
event={"ID":"6711cdff-410c-4d91-b172-c2065054c1be","Type":"ContainerStarted","Data":"feee9d84b5899b620a2823aa6ab377cd98db0a958fd8bd41a85078ce9d01392c"} Mar 09 14:10:01 crc kubenswrapper[4764]: I0309 14:10:01.049330 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-55f8b7fc4c-6rdd7" event={"ID":"6b944cf9-8278-4b16-b09c-0da6a2519b2a","Type":"ContainerStarted","Data":"b27e84a67d9f2a8b29e994ee9d77770f5c7c484de1e6b73aa1c344c5608643bd"} Mar 09 14:10:01 crc kubenswrapper[4764]: I0309 14:10:01.049687 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-55f8b7fc4c-6rdd7" podUID="6b944cf9-8278-4b16-b09c-0da6a2519b2a" containerName="horizon" containerID="cri-o://b27e84a67d9f2a8b29e994ee9d77770f5c7c484de1e6b73aa1c344c5608643bd" gracePeriod=30 Mar 09 14:10:01 crc kubenswrapper[4764]: I0309 14:10:01.049627 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-55f8b7fc4c-6rdd7" podUID="6b944cf9-8278-4b16-b09c-0da6a2519b2a" containerName="horizon-log" containerID="cri-o://1f4d4a3494bb93e1fe6318e4f872bd8ce3d51c074d2638a3dabf2ca00148a1d1" gracePeriod=30 Mar 09 14:10:01 crc kubenswrapper[4764]: I0309 14:10:01.064952 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-56bb55c768-vchmw" event={"ID":"47ef29f6-4627-4b84-968d-db9d7ed438da","Type":"ContainerStarted","Data":"aaff32cc6e4babd2bdfeafec5fd5821c681fba782160c35443c5ff42d44cf549"} Mar 09 14:10:01 crc kubenswrapper[4764]: I0309 14:10:01.067314 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551090-8wdlp" event={"ID":"4b56c1fe-b4de-45ba-8ca2-7bae98a2e97e","Type":"ContainerStarted","Data":"67b8f22e1dd658bc6fc7c70355b7dbf66bd2db713618122623ec70d154f78240"} Mar 09 14:10:01 crc kubenswrapper[4764]: I0309 14:10:01.069833 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" 
event={"ID":"22563404-fb5a-4d95-bae1-dd24d6fcc8d1","Type":"ContainerStarted","Data":"42d1d334ebc93133ced56635a8f1065b36cde7d3488daab4ec92dcc8b0f826f6"} Mar 09 14:10:01 crc kubenswrapper[4764]: I0309 14:10:01.073447 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-797d44c9b-wrlx7" event={"ID":"e00fc104-ec73-4190-a598-86de7ca6cfa5","Type":"ContainerStarted","Data":"c791eb4afad3d3b7319a4ecbb9ad7314d04c8eb6a4954765b916e231a57d04c0"} Mar 09 14:10:01 crc kubenswrapper[4764]: I0309 14:10:01.082352 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-9d94c4c7-px8jh" podUID="849dc0e6-d7a3-4745-8cc0-7b7af3e4d243" containerName="horizon-log" containerID="cri-o://fabc9d27abd14d14db86a85f4413de77c2fd101b8722a207f502ad9ee60b4405" gracePeriod=30 Mar 09 14:10:01 crc kubenswrapper[4764]: I0309 14:10:01.082785 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-9d94c4c7-px8jh" event={"ID":"849dc0e6-d7a3-4745-8cc0-7b7af3e4d243","Type":"ContainerStarted","Data":"c5d8efe5920ca730c02900ccbfc9cdc3905256778569090d306fce8d9fecf051"} Mar 09 14:10:01 crc kubenswrapper[4764]: I0309 14:10:01.082832 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Mar 09 14:10:01 crc kubenswrapper[4764]: I0309 14:10:01.083392 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-9d94c4c7-px8jh" podUID="849dc0e6-d7a3-4745-8cc0-7b7af3e4d243" containerName="horizon" containerID="cri-o://c5d8efe5920ca730c02900ccbfc9cdc3905256778569090d306fce8d9fecf051" gracePeriod=30 Mar 09 14:10:01 crc kubenswrapper[4764]: I0309 14:10:01.083559 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Mar 09 14:10:01 crc kubenswrapper[4764]: I0309 14:10:01.085044 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/horizon-55f8b7fc4c-6rdd7" podStartSLOduration=3.460068893 podStartE2EDuration="18.085029656s" podCreationTimestamp="2026-03-09 14:09:43 +0000 UTC" firstStartedPulling="2026-03-09 14:09:44.817417983 +0000 UTC m=+2940.067589891" lastFinishedPulling="2026-03-09 14:09:59.442378746 +0000 UTC m=+2954.692550654" observedRunningTime="2026-03-09 14:10:01.082889519 +0000 UTC m=+2956.333061447" watchObservedRunningTime="2026-03-09 14:10:01.085029656 +0000 UTC m=+2956.335201554" Mar 09 14:10:01 crc kubenswrapper[4764]: I0309 14:10:01.140247 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=11.14022114 podStartE2EDuration="11.14022114s" podCreationTimestamp="2026-03-09 14:09:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 14:10:01.128278311 +0000 UTC m=+2956.378450219" watchObservedRunningTime="2026-03-09 14:10:01.14022114 +0000 UTC m=+2956.390393048" Mar 09 14:10:01 crc kubenswrapper[4764]: I0309 14:10:01.175942 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-56bb55c768-vchmw" podStartSLOduration=4.30296083 podStartE2EDuration="15.175910583s" podCreationTimestamp="2026-03-09 14:09:46 +0000 UTC" firstStartedPulling="2026-03-09 14:09:48.671023826 +0000 UTC m=+2943.921195914" lastFinishedPulling="2026-03-09 14:09:59.543973749 +0000 UTC m=+2954.794145667" observedRunningTime="2026-03-09 14:10:01.157256604 +0000 UTC m=+2956.407428512" watchObservedRunningTime="2026-03-09 14:10:01.175910583 +0000 UTC m=+2956.426082491" Mar 09 14:10:01 crc kubenswrapper[4764]: I0309 14:10:01.201368 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-9d94c4c7-px8jh" podStartSLOduration=3.584317391 podStartE2EDuration="18.201329481s" podCreationTimestamp="2026-03-09 14:09:43 +0000 UTC" 
firstStartedPulling="2026-03-09 14:09:44.825236682 +0000 UTC m=+2940.075408590" lastFinishedPulling="2026-03-09 14:09:59.442248782 +0000 UTC m=+2954.692420680" observedRunningTime="2026-03-09 14:10:01.188136729 +0000 UTC m=+2956.438308657" watchObservedRunningTime="2026-03-09 14:10:01.201329481 +0000 UTC m=+2956.451501409" Mar 09 14:10:01 crc kubenswrapper[4764]: I0309 14:10:01.218424 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-797d44c9b-wrlx7" podStartSLOduration=4.353830188 podStartE2EDuration="15.218392137s" podCreationTimestamp="2026-03-09 14:09:46 +0000 UTC" firstStartedPulling="2026-03-09 14:09:48.680010096 +0000 UTC m=+2943.930182004" lastFinishedPulling="2026-03-09 14:09:59.544572045 +0000 UTC m=+2954.794743953" observedRunningTime="2026-03-09 14:10:01.216341782 +0000 UTC m=+2956.466513700" watchObservedRunningTime="2026-03-09 14:10:01.218392137 +0000 UTC m=+2956.468564065" Mar 09 14:10:01 crc kubenswrapper[4764]: I0309 14:10:01.262477 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Mar 09 14:10:01 crc kubenswrapper[4764]: I0309 14:10:01.262534 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Mar 09 14:10:01 crc kubenswrapper[4764]: I0309 14:10:01.306110 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Mar 09 14:10:01 crc kubenswrapper[4764]: I0309 14:10:01.317590 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Mar 09 14:10:02 crc kubenswrapper[4764]: I0309 14:10:02.089917 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Mar 09 14:10:02 crc kubenswrapper[4764]: I0309 14:10:02.089987 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/glance-default-external-api-0" Mar 09 14:10:03 crc kubenswrapper[4764]: I0309 14:10:03.123720 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551090-8wdlp" event={"ID":"4b56c1fe-b4de-45ba-8ca2-7bae98a2e97e","Type":"ContainerStarted","Data":"c50fe2a841af25f201b2df55f17fb4354fb702de4d79adbf221b5ea06463110b"} Mar 09 14:10:03 crc kubenswrapper[4764]: I0309 14:10:03.160800 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29551090-8wdlp" podStartSLOduration=1.9157215650000001 podStartE2EDuration="3.160771329s" podCreationTimestamp="2026-03-09 14:10:00 +0000 UTC" firstStartedPulling="2026-03-09 14:10:00.826622196 +0000 UTC m=+2956.076794104" lastFinishedPulling="2026-03-09 14:10:02.07167196 +0000 UTC m=+2957.321843868" observedRunningTime="2026-03-09 14:10:03.152836737 +0000 UTC m=+2958.403008645" watchObservedRunningTime="2026-03-09 14:10:03.160771329 +0000 UTC m=+2958.410943257" Mar 09 14:10:03 crc kubenswrapper[4764]: I0309 14:10:03.528792 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-9d94c4c7-px8jh" Mar 09 14:10:03 crc kubenswrapper[4764]: I0309 14:10:03.834162 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-55f8b7fc4c-6rdd7" Mar 09 14:10:04 crc kubenswrapper[4764]: I0309 14:10:04.135928 4764 generic.go:334] "Generic (PLEG): container finished" podID="4b56c1fe-b4de-45ba-8ca2-7bae98a2e97e" containerID="c50fe2a841af25f201b2df55f17fb4354fb702de4d79adbf221b5ea06463110b" exitCode=0 Mar 09 14:10:04 crc kubenswrapper[4764]: I0309 14:10:04.135986 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551090-8wdlp" event={"ID":"4b56c1fe-b4de-45ba-8ca2-7bae98a2e97e","Type":"ContainerDied","Data":"c50fe2a841af25f201b2df55f17fb4354fb702de4d79adbf221b5ea06463110b"} Mar 09 14:10:05 crc kubenswrapper[4764]: I0309 14:10:05.321865 
4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Mar 09 14:10:05 crc kubenswrapper[4764]: I0309 14:10:05.345683 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Mar 09 14:10:05 crc kubenswrapper[4764]: I0309 14:10:05.346371 4764 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 09 14:10:05 crc kubenswrapper[4764]: I0309 14:10:05.494954 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Mar 09 14:10:07 crc kubenswrapper[4764]: I0309 14:10:07.276773 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-797d44c9b-wrlx7" Mar 09 14:10:07 crc kubenswrapper[4764]: I0309 14:10:07.277316 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-797d44c9b-wrlx7" Mar 09 14:10:07 crc kubenswrapper[4764]: I0309 14:10:07.580767 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551090-8wdlp" Mar 09 14:10:07 crc kubenswrapper[4764]: I0309 14:10:07.588445 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-56bb55c768-vchmw" Mar 09 14:10:07 crc kubenswrapper[4764]: I0309 14:10:07.588491 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-56bb55c768-vchmw" Mar 09 14:10:07 crc kubenswrapper[4764]: I0309 14:10:07.711479 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Mar 09 14:10:07 crc kubenswrapper[4764]: I0309 14:10:07.722163 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nqk4g\" (UniqueName: \"kubernetes.io/projected/4b56c1fe-b4de-45ba-8ca2-7bae98a2e97e-kube-api-access-nqk4g\") pod \"4b56c1fe-b4de-45ba-8ca2-7bae98a2e97e\" (UID: \"4b56c1fe-b4de-45ba-8ca2-7bae98a2e97e\") " Mar 09 14:10:07 crc kubenswrapper[4764]: I0309 14:10:07.743127 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4b56c1fe-b4de-45ba-8ca2-7bae98a2e97e-kube-api-access-nqk4g" (OuterVolumeSpecName: "kube-api-access-nqk4g") pod "4b56c1fe-b4de-45ba-8ca2-7bae98a2e97e" (UID: "4b56c1fe-b4de-45ba-8ca2-7bae98a2e97e"). InnerVolumeSpecName "kube-api-access-nqk4g". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:10:07 crc kubenswrapper[4764]: I0309 14:10:07.827387 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nqk4g\" (UniqueName: \"kubernetes.io/projected/4b56c1fe-b4de-45ba-8ca2-7bae98a2e97e-kube-api-access-nqk4g\") on node \"crc\" DevicePath \"\"" Mar 09 14:10:08 crc kubenswrapper[4764]: I0309 14:10:08.199807 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551090-8wdlp" Mar 09 14:10:08 crc kubenswrapper[4764]: I0309 14:10:08.216718 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551090-8wdlp" event={"ID":"4b56c1fe-b4de-45ba-8ca2-7bae98a2e97e","Type":"ContainerDied","Data":"67b8f22e1dd658bc6fc7c70355b7dbf66bd2db713618122623ec70d154f78240"} Mar 09 14:10:08 crc kubenswrapper[4764]: I0309 14:10:08.216810 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="67b8f22e1dd658bc6fc7c70355b7dbf66bd2db713618122623ec70d154f78240" Mar 09 14:10:08 crc kubenswrapper[4764]: I0309 14:10:08.677665 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29551084-pchq7"] Mar 09 14:10:08 crc kubenswrapper[4764]: I0309 14:10:08.689173 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29551084-pchq7"] Mar 09 14:10:09 crc kubenswrapper[4764]: I0309 14:10:09.572583 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="91d1d723-d4e8-40d8-9d17-3dfee51e7aef" path="/var/lib/kubelet/pods/91d1d723-d4e8-40d8-9d17-3dfee51e7aef/volumes" Mar 09 14:10:09 crc kubenswrapper[4764]: I0309 14:10:09.776083 4764 scope.go:117] "RemoveContainer" containerID="6e6b381c4b8297d66803da8d662836720642fff63afbcf8243cdcd4157213d11" Mar 09 14:10:10 crc kubenswrapper[4764]: I0309 14:10:10.221590 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-sync-j6s45" event={"ID":"6711cdff-410c-4d91-b172-c2065054c1be","Type":"ContainerStarted","Data":"8d377192006e66551cb0ea6108fb1f8c6b81bac19c99e99d5e21dffabe9e931d"} Mar 09 14:10:10 crc kubenswrapper[4764]: I0309 14:10:10.248615 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-db-sync-j6s45" podStartSLOduration=8.311472832 podStartE2EDuration="17.248585037s" podCreationTimestamp="2026-03-09 14:09:53 +0000 UTC" 
firstStartedPulling="2026-03-09 14:10:00.002203904 +0000 UTC m=+2955.252375812" lastFinishedPulling="2026-03-09 14:10:08.939316119 +0000 UTC m=+2964.189488017" observedRunningTime="2026-03-09 14:10:10.237229233 +0000 UTC m=+2965.487401151" watchObservedRunningTime="2026-03-09 14:10:10.248585037 +0000 UTC m=+2965.498756945" Mar 09 14:10:13 crc kubenswrapper[4764]: I0309 14:10:13.079711 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-vks4j"] Mar 09 14:10:13 crc kubenswrapper[4764]: E0309 14:10:13.080900 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b56c1fe-b4de-45ba-8ca2-7bae98a2e97e" containerName="oc" Mar 09 14:10:13 crc kubenswrapper[4764]: I0309 14:10:13.080920 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b56c1fe-b4de-45ba-8ca2-7bae98a2e97e" containerName="oc" Mar 09 14:10:13 crc kubenswrapper[4764]: I0309 14:10:13.081142 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="4b56c1fe-b4de-45ba-8ca2-7bae98a2e97e" containerName="oc" Mar 09 14:10:13 crc kubenswrapper[4764]: I0309 14:10:13.085535 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vks4j" Mar 09 14:10:13 crc kubenswrapper[4764]: I0309 14:10:13.124625 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-vks4j"] Mar 09 14:10:13 crc kubenswrapper[4764]: I0309 14:10:13.174044 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f9c5571a-71f1-42d9-8025-2f51e13a5f03-catalog-content\") pod \"redhat-marketplace-vks4j\" (UID: \"f9c5571a-71f1-42d9-8025-2f51e13a5f03\") " pod="openshift-marketplace/redhat-marketplace-vks4j" Mar 09 14:10:13 crc kubenswrapper[4764]: I0309 14:10:13.174088 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f9c5571a-71f1-42d9-8025-2f51e13a5f03-utilities\") pod \"redhat-marketplace-vks4j\" (UID: \"f9c5571a-71f1-42d9-8025-2f51e13a5f03\") " pod="openshift-marketplace/redhat-marketplace-vks4j" Mar 09 14:10:13 crc kubenswrapper[4764]: I0309 14:10:13.174127 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-52qrw\" (UniqueName: \"kubernetes.io/projected/f9c5571a-71f1-42d9-8025-2f51e13a5f03-kube-api-access-52qrw\") pod \"redhat-marketplace-vks4j\" (UID: \"f9c5571a-71f1-42d9-8025-2f51e13a5f03\") " pod="openshift-marketplace/redhat-marketplace-vks4j" Mar 09 14:10:13 crc kubenswrapper[4764]: I0309 14:10:13.276882 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f9c5571a-71f1-42d9-8025-2f51e13a5f03-catalog-content\") pod \"redhat-marketplace-vks4j\" (UID: \"f9c5571a-71f1-42d9-8025-2f51e13a5f03\") " pod="openshift-marketplace/redhat-marketplace-vks4j" Mar 09 14:10:13 crc kubenswrapper[4764]: I0309 14:10:13.276951 4764 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f9c5571a-71f1-42d9-8025-2f51e13a5f03-utilities\") pod \"redhat-marketplace-vks4j\" (UID: \"f9c5571a-71f1-42d9-8025-2f51e13a5f03\") " pod="openshift-marketplace/redhat-marketplace-vks4j" Mar 09 14:10:13 crc kubenswrapper[4764]: I0309 14:10:13.277004 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-52qrw\" (UniqueName: \"kubernetes.io/projected/f9c5571a-71f1-42d9-8025-2f51e13a5f03-kube-api-access-52qrw\") pod \"redhat-marketplace-vks4j\" (UID: \"f9c5571a-71f1-42d9-8025-2f51e13a5f03\") " pod="openshift-marketplace/redhat-marketplace-vks4j" Mar 09 14:10:13 crc kubenswrapper[4764]: I0309 14:10:13.277559 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f9c5571a-71f1-42d9-8025-2f51e13a5f03-catalog-content\") pod \"redhat-marketplace-vks4j\" (UID: \"f9c5571a-71f1-42d9-8025-2f51e13a5f03\") " pod="openshift-marketplace/redhat-marketplace-vks4j" Mar 09 14:10:13 crc kubenswrapper[4764]: I0309 14:10:13.277584 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f9c5571a-71f1-42d9-8025-2f51e13a5f03-utilities\") pod \"redhat-marketplace-vks4j\" (UID: \"f9c5571a-71f1-42d9-8025-2f51e13a5f03\") " pod="openshift-marketplace/redhat-marketplace-vks4j" Mar 09 14:10:13 crc kubenswrapper[4764]: I0309 14:10:13.310097 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-52qrw\" (UniqueName: \"kubernetes.io/projected/f9c5571a-71f1-42d9-8025-2f51e13a5f03-kube-api-access-52qrw\") pod \"redhat-marketplace-vks4j\" (UID: \"f9c5571a-71f1-42d9-8025-2f51e13a5f03\") " pod="openshift-marketplace/redhat-marketplace-vks4j" Mar 09 14:10:13 crc kubenswrapper[4764]: I0309 14:10:13.425617 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vks4j" Mar 09 14:10:14 crc kubenswrapper[4764]: I0309 14:10:14.016134 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-vks4j"] Mar 09 14:10:14 crc kubenswrapper[4764]: I0309 14:10:14.273857 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vks4j" event={"ID":"f9c5571a-71f1-42d9-8025-2f51e13a5f03","Type":"ContainerStarted","Data":"c4a8e2df77908e08ac398134c388212868113a4e069dbf9e9ca754666552eee8"} Mar 09 14:10:14 crc kubenswrapper[4764]: I0309 14:10:14.273909 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vks4j" event={"ID":"f9c5571a-71f1-42d9-8025-2f51e13a5f03","Type":"ContainerStarted","Data":"3870660a56f929c820c0d702e6958704a17b57192fa2a52ac26a15baa43c7554"} Mar 09 14:10:15 crc kubenswrapper[4764]: I0309 14:10:15.285218 4764 generic.go:334] "Generic (PLEG): container finished" podID="f9c5571a-71f1-42d9-8025-2f51e13a5f03" containerID="c4a8e2df77908e08ac398134c388212868113a4e069dbf9e9ca754666552eee8" exitCode=0 Mar 09 14:10:15 crc kubenswrapper[4764]: I0309 14:10:15.285299 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vks4j" event={"ID":"f9c5571a-71f1-42d9-8025-2f51e13a5f03","Type":"ContainerDied","Data":"c4a8e2df77908e08ac398134c388212868113a4e069dbf9e9ca754666552eee8"} Mar 09 14:10:15 crc kubenswrapper[4764]: I0309 14:10:15.286039 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vks4j" event={"ID":"f9c5571a-71f1-42d9-8025-2f51e13a5f03","Type":"ContainerStarted","Data":"5810019c18336df6ff05082f8f948492687897d783cdf12ed4c00b5fb1717462"} Mar 09 14:10:16 crc kubenswrapper[4764]: I0309 14:10:16.299269 4764 generic.go:334] "Generic (PLEG): container finished" podID="f9c5571a-71f1-42d9-8025-2f51e13a5f03" 
containerID="5810019c18336df6ff05082f8f948492687897d783cdf12ed4c00b5fb1717462" exitCode=0 Mar 09 14:10:16 crc kubenswrapper[4764]: I0309 14:10:16.299361 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vks4j" event={"ID":"f9c5571a-71f1-42d9-8025-2f51e13a5f03","Type":"ContainerDied","Data":"5810019c18336df6ff05082f8f948492687897d783cdf12ed4c00b5fb1717462"} Mar 09 14:10:17 crc kubenswrapper[4764]: I0309 14:10:17.283979 4764 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-797d44c9b-wrlx7" podUID="e00fc104-ec73-4190-a598-86de7ca6cfa5" containerName="horizon" probeResult="failure" output="Get \"https://10.217.1.5:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.5:8443: connect: connection refused" Mar 09 14:10:17 crc kubenswrapper[4764]: I0309 14:10:17.347851 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vks4j" event={"ID":"f9c5571a-71f1-42d9-8025-2f51e13a5f03","Type":"ContainerStarted","Data":"fa0246eba6b220102d13a010f48f2dada3c5eb73ad9e6ea15c1975fe88b59d3b"} Mar 09 14:10:17 crc kubenswrapper[4764]: I0309 14:10:17.393864 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-vks4j" podStartSLOduration=1.927187277 podStartE2EDuration="4.393838498s" podCreationTimestamp="2026-03-09 14:10:13 +0000 UTC" firstStartedPulling="2026-03-09 14:10:14.277155161 +0000 UTC m=+2969.527327069" lastFinishedPulling="2026-03-09 14:10:16.743806382 +0000 UTC m=+2971.993978290" observedRunningTime="2026-03-09 14:10:17.385081384 +0000 UTC m=+2972.635253332" watchObservedRunningTime="2026-03-09 14:10:17.393838498 +0000 UTC m=+2972.644010406" Mar 09 14:10:17 crc kubenswrapper[4764]: I0309 14:10:17.595040 4764 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-56bb55c768-vchmw" podUID="47ef29f6-4627-4b84-968d-db9d7ed438da" containerName="horizon" 
probeResult="failure" output="Get \"https://10.217.1.6:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.6:8443: connect: connection refused" Mar 09 14:10:21 crc kubenswrapper[4764]: I0309 14:10:21.392385 4764 generic.go:334] "Generic (PLEG): container finished" podID="6711cdff-410c-4d91-b172-c2065054c1be" containerID="8d377192006e66551cb0ea6108fb1f8c6b81bac19c99e99d5e21dffabe9e931d" exitCode=0 Mar 09 14:10:21 crc kubenswrapper[4764]: I0309 14:10:21.392466 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-sync-j6s45" event={"ID":"6711cdff-410c-4d91-b172-c2065054c1be","Type":"ContainerDied","Data":"8d377192006e66551cb0ea6108fb1f8c6b81bac19c99e99d5e21dffabe9e931d"} Mar 09 14:10:22 crc kubenswrapper[4764]: I0309 14:10:22.904142 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-db-sync-j6s45" Mar 09 14:10:23 crc kubenswrapper[4764]: I0309 14:10:23.060782 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6711cdff-410c-4d91-b172-c2065054c1be-config-data\") pod \"6711cdff-410c-4d91-b172-c2065054c1be\" (UID: \"6711cdff-410c-4d91-b172-c2065054c1be\") " Mar 09 14:10:23 crc kubenswrapper[4764]: I0309 14:10:23.060961 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-svz5p\" (UniqueName: \"kubernetes.io/projected/6711cdff-410c-4d91-b172-c2065054c1be-kube-api-access-svz5p\") pod \"6711cdff-410c-4d91-b172-c2065054c1be\" (UID: \"6711cdff-410c-4d91-b172-c2065054c1be\") " Mar 09 14:10:23 crc kubenswrapper[4764]: I0309 14:10:23.061022 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/6711cdff-410c-4d91-b172-c2065054c1be-job-config-data\") pod \"6711cdff-410c-4d91-b172-c2065054c1be\" (UID: \"6711cdff-410c-4d91-b172-c2065054c1be\") " Mar 09 14:10:23 crc 
kubenswrapper[4764]: I0309 14:10:23.061161 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6711cdff-410c-4d91-b172-c2065054c1be-combined-ca-bundle\") pod \"6711cdff-410c-4d91-b172-c2065054c1be\" (UID: \"6711cdff-410c-4d91-b172-c2065054c1be\") " Mar 09 14:10:23 crc kubenswrapper[4764]: I0309 14:10:23.090179 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6711cdff-410c-4d91-b172-c2065054c1be-job-config-data" (OuterVolumeSpecName: "job-config-data") pod "6711cdff-410c-4d91-b172-c2065054c1be" (UID: "6711cdff-410c-4d91-b172-c2065054c1be"). InnerVolumeSpecName "job-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:10:23 crc kubenswrapper[4764]: I0309 14:10:23.090323 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6711cdff-410c-4d91-b172-c2065054c1be-kube-api-access-svz5p" (OuterVolumeSpecName: "kube-api-access-svz5p") pod "6711cdff-410c-4d91-b172-c2065054c1be" (UID: "6711cdff-410c-4d91-b172-c2065054c1be"). InnerVolumeSpecName "kube-api-access-svz5p". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:10:23 crc kubenswrapper[4764]: I0309 14:10:23.103522 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6711cdff-410c-4d91-b172-c2065054c1be-config-data" (OuterVolumeSpecName: "config-data") pod "6711cdff-410c-4d91-b172-c2065054c1be" (UID: "6711cdff-410c-4d91-b172-c2065054c1be"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:10:23 crc kubenswrapper[4764]: I0309 14:10:23.110145 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6711cdff-410c-4d91-b172-c2065054c1be-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6711cdff-410c-4d91-b172-c2065054c1be" (UID: "6711cdff-410c-4d91-b172-c2065054c1be"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:10:23 crc kubenswrapper[4764]: I0309 14:10:23.164172 4764 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6711cdff-410c-4d91-b172-c2065054c1be-config-data\") on node \"crc\" DevicePath \"\"" Mar 09 14:10:23 crc kubenswrapper[4764]: I0309 14:10:23.164232 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-svz5p\" (UniqueName: \"kubernetes.io/projected/6711cdff-410c-4d91-b172-c2065054c1be-kube-api-access-svz5p\") on node \"crc\" DevicePath \"\"" Mar 09 14:10:23 crc kubenswrapper[4764]: I0309 14:10:23.164247 4764 reconciler_common.go:293] "Volume detached for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/6711cdff-410c-4d91-b172-c2065054c1be-job-config-data\") on node \"crc\" DevicePath \"\"" Mar 09 14:10:23 crc kubenswrapper[4764]: I0309 14:10:23.164257 4764 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6711cdff-410c-4d91-b172-c2065054c1be-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 14:10:23 crc kubenswrapper[4764]: I0309 14:10:23.426350 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-vks4j" Mar 09 14:10:23 crc kubenswrapper[4764]: I0309 14:10:23.426407 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-vks4j" Mar 09 14:10:23 crc kubenswrapper[4764]: 
I0309 14:10:23.450302 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-sync-j6s45" event={"ID":"6711cdff-410c-4d91-b172-c2065054c1be","Type":"ContainerDied","Data":"feee9d84b5899b620a2823aa6ab377cd98db0a958fd8bd41a85078ce9d01392c"} Mar 09 14:10:23 crc kubenswrapper[4764]: I0309 14:10:23.450357 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="feee9d84b5899b620a2823aa6ab377cd98db0a958fd8bd41a85078ce9d01392c" Mar 09 14:10:23 crc kubenswrapper[4764]: I0309 14:10:23.450433 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-db-sync-j6s45" Mar 09 14:10:23 crc kubenswrapper[4764]: I0309 14:10:23.495369 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-vks4j" Mar 09 14:10:23 crc kubenswrapper[4764]: I0309 14:10:23.549935 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-vks4j" Mar 09 14:10:23 crc kubenswrapper[4764]: I0309 14:10:23.748008 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-vks4j"] Mar 09 14:10:23 crc kubenswrapper[4764]: I0309 14:10:23.779453 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-scheduler-0"] Mar 09 14:10:23 crc kubenswrapper[4764]: E0309 14:10:23.780033 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6711cdff-410c-4d91-b172-c2065054c1be" containerName="manila-db-sync" Mar 09 14:10:23 crc kubenswrapper[4764]: I0309 14:10:23.780048 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="6711cdff-410c-4d91-b172-c2065054c1be" containerName="manila-db-sync" Mar 09 14:10:23 crc kubenswrapper[4764]: I0309 14:10:23.780281 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="6711cdff-410c-4d91-b172-c2065054c1be" containerName="manila-db-sync" Mar 09 14:10:23 crc 
kubenswrapper[4764]: I0309 14:10:23.782523 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-scheduler-0" Mar 09 14:10:23 crc kubenswrapper[4764]: I0309 14:10:23.789051 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-scripts" Mar 09 14:10:23 crc kubenswrapper[4764]: I0309 14:10:23.790252 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-manila-dockercfg-692nx" Mar 09 14:10:23 crc kubenswrapper[4764]: I0309 14:10:23.790597 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-scheduler-config-data" Mar 09 14:10:23 crc kubenswrapper[4764]: I0309 14:10:23.790853 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-config-data" Mar 09 14:10:23 crc kubenswrapper[4764]: I0309 14:10:23.815050 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-scheduler-0"] Mar 09 14:10:23 crc kubenswrapper[4764]: I0309 14:10:23.843148 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-share-share1-0"] Mar 09 14:10:23 crc kubenswrapper[4764]: I0309 14:10:23.845206 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-share-share1-0" Mar 09 14:10:23 crc kubenswrapper[4764]: I0309 14:10:23.851145 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-share-share1-config-data" Mar 09 14:10:23 crc kubenswrapper[4764]: I0309 14:10:23.857375 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-share-share1-0"] Mar 09 14:10:23 crc kubenswrapper[4764]: I0309 14:10:23.884254 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30f42bb2-6e26-4cbb-942b-a7d4ede4f128-combined-ca-bundle\") pod \"manila-scheduler-0\" (UID: \"30f42bb2-6e26-4cbb-942b-a7d4ede4f128\") " pod="openstack/manila-scheduler-0" Mar 09 14:10:23 crc kubenswrapper[4764]: I0309 14:10:23.884337 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/30f42bb2-6e26-4cbb-942b-a7d4ede4f128-config-data-custom\") pod \"manila-scheduler-0\" (UID: \"30f42bb2-6e26-4cbb-942b-a7d4ede4f128\") " pod="openstack/manila-scheduler-0" Mar 09 14:10:23 crc kubenswrapper[4764]: I0309 14:10:23.884364 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/30f42bb2-6e26-4cbb-942b-a7d4ede4f128-scripts\") pod \"manila-scheduler-0\" (UID: \"30f42bb2-6e26-4cbb-942b-a7d4ede4f128\") " pod="openstack/manila-scheduler-0" Mar 09 14:10:23 crc kubenswrapper[4764]: I0309 14:10:23.884395 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/30f42bb2-6e26-4cbb-942b-a7d4ede4f128-etc-machine-id\") pod \"manila-scheduler-0\" (UID: \"30f42bb2-6e26-4cbb-942b-a7d4ede4f128\") " pod="openstack/manila-scheduler-0" Mar 09 14:10:23 crc kubenswrapper[4764]: I0309 
14:10:23.884427 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/30f42bb2-6e26-4cbb-942b-a7d4ede4f128-config-data\") pod \"manila-scheduler-0\" (UID: \"30f42bb2-6e26-4cbb-942b-a7d4ede4f128\") " pod="openstack/manila-scheduler-0" Mar 09 14:10:23 crc kubenswrapper[4764]: I0309 14:10:23.884481 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lq2r9\" (UniqueName: \"kubernetes.io/projected/30f42bb2-6e26-4cbb-942b-a7d4ede4f128-kube-api-access-lq2r9\") pod \"manila-scheduler-0\" (UID: \"30f42bb2-6e26-4cbb-942b-a7d4ede4f128\") " pod="openstack/manila-scheduler-0" Mar 09 14:10:23 crc kubenswrapper[4764]: I0309 14:10:23.963516 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-69655fd4bf-qlzqn"] Mar 09 14:10:23 crc kubenswrapper[4764]: I0309 14:10:23.965320 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-69655fd4bf-qlzqn" Mar 09 14:10:23 crc kubenswrapper[4764]: I0309 14:10:23.987036 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/604c4e24-15e4-43e1-b08c-74d8337a2e71-ceph\") pod \"manila-share-share1-0\" (UID: \"604c4e24-15e4-43e1-b08c-74d8337a2e71\") " pod="openstack/manila-share-share1-0" Mar 09 14:10:23 crc kubenswrapper[4764]: I0309 14:10:23.987107 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/604c4e24-15e4-43e1-b08c-74d8337a2e71-config-data\") pod \"manila-share-share1-0\" (UID: \"604c4e24-15e4-43e1-b08c-74d8337a2e71\") " pod="openstack/manila-share-share1-0" Mar 09 14:10:23 crc kubenswrapper[4764]: I0309 14:10:23.987168 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30f42bb2-6e26-4cbb-942b-a7d4ede4f128-combined-ca-bundle\") pod \"manila-scheduler-0\" (UID: \"30f42bb2-6e26-4cbb-942b-a7d4ede4f128\") " pod="openstack/manila-scheduler-0" Mar 09 14:10:23 crc kubenswrapper[4764]: I0309 14:10:23.987203 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/604c4e24-15e4-43e1-b08c-74d8337a2e71-combined-ca-bundle\") pod \"manila-share-share1-0\" (UID: \"604c4e24-15e4-43e1-b08c-74d8337a2e71\") " pod="openstack/manila-share-share1-0" Mar 09 14:10:23 crc kubenswrapper[4764]: I0309 14:10:23.987237 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/30f42bb2-6e26-4cbb-942b-a7d4ede4f128-config-data-custom\") pod \"manila-scheduler-0\" (UID: \"30f42bb2-6e26-4cbb-942b-a7d4ede4f128\") " pod="openstack/manila-scheduler-0" Mar 09 14:10:23 crc 
kubenswrapper[4764]: I0309 14:10:23.987256 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/30f42bb2-6e26-4cbb-942b-a7d4ede4f128-scripts\") pod \"manila-scheduler-0\" (UID: \"30f42bb2-6e26-4cbb-942b-a7d4ede4f128\") " pod="openstack/manila-scheduler-0" Mar 09 14:10:23 crc kubenswrapper[4764]: I0309 14:10:23.987278 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wxzkm\" (UniqueName: \"kubernetes.io/projected/604c4e24-15e4-43e1-b08c-74d8337a2e71-kube-api-access-wxzkm\") pod \"manila-share-share1-0\" (UID: \"604c4e24-15e4-43e1-b08c-74d8337a2e71\") " pod="openstack/manila-share-share1-0" Mar 09 14:10:23 crc kubenswrapper[4764]: I0309 14:10:23.987304 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/30f42bb2-6e26-4cbb-942b-a7d4ede4f128-etc-machine-id\") pod \"manila-scheduler-0\" (UID: \"30f42bb2-6e26-4cbb-942b-a7d4ede4f128\") " pod="openstack/manila-scheduler-0" Mar 09 14:10:23 crc kubenswrapper[4764]: I0309 14:10:23.987340 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/30f42bb2-6e26-4cbb-942b-a7d4ede4f128-config-data\") pod \"manila-scheduler-0\" (UID: \"30f42bb2-6e26-4cbb-942b-a7d4ede4f128\") " pod="openstack/manila-scheduler-0" Mar 09 14:10:23 crc kubenswrapper[4764]: I0309 14:10:23.987357 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/604c4e24-15e4-43e1-b08c-74d8337a2e71-config-data-custom\") pod \"manila-share-share1-0\" (UID: \"604c4e24-15e4-43e1-b08c-74d8337a2e71\") " pod="openstack/manila-share-share1-0" Mar 09 14:10:23 crc kubenswrapper[4764]: I0309 14:10:23.987407 4764 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/604c4e24-15e4-43e1-b08c-74d8337a2e71-etc-machine-id\") pod \"manila-share-share1-0\" (UID: \"604c4e24-15e4-43e1-b08c-74d8337a2e71\") " pod="openstack/manila-share-share1-0" Mar 09 14:10:23 crc kubenswrapper[4764]: I0309 14:10:23.987439 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lq2r9\" (UniqueName: \"kubernetes.io/projected/30f42bb2-6e26-4cbb-942b-a7d4ede4f128-kube-api-access-lq2r9\") pod \"manila-scheduler-0\" (UID: \"30f42bb2-6e26-4cbb-942b-a7d4ede4f128\") " pod="openstack/manila-scheduler-0" Mar 09 14:10:23 crc kubenswrapper[4764]: I0309 14:10:23.987468 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/604c4e24-15e4-43e1-b08c-74d8337a2e71-scripts\") pod \"manila-share-share1-0\" (UID: \"604c4e24-15e4-43e1-b08c-74d8337a2e71\") " pod="openstack/manila-share-share1-0" Mar 09 14:10:23 crc kubenswrapper[4764]: I0309 14:10:23.987517 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/604c4e24-15e4-43e1-b08c-74d8337a2e71-var-lib-manila\") pod \"manila-share-share1-0\" (UID: \"604c4e24-15e4-43e1-b08c-74d8337a2e71\") " pod="openstack/manila-share-share1-0" Mar 09 14:10:23 crc kubenswrapper[4764]: I0309 14:10:23.987815 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/30f42bb2-6e26-4cbb-942b-a7d4ede4f128-etc-machine-id\") pod \"manila-scheduler-0\" (UID: \"30f42bb2-6e26-4cbb-942b-a7d4ede4f128\") " pod="openstack/manila-scheduler-0" Mar 09 14:10:23 crc kubenswrapper[4764]: I0309 14:10:23.995780 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/30f42bb2-6e26-4cbb-942b-a7d4ede4f128-config-data\") pod \"manila-scheduler-0\" (UID: \"30f42bb2-6e26-4cbb-942b-a7d4ede4f128\") " pod="openstack/manila-scheduler-0" Mar 09 14:10:24 crc kubenswrapper[4764]: I0309 14:10:23.999412 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/30f42bb2-6e26-4cbb-942b-a7d4ede4f128-config-data-custom\") pod \"manila-scheduler-0\" (UID: \"30f42bb2-6e26-4cbb-942b-a7d4ede4f128\") " pod="openstack/manila-scheduler-0" Mar 09 14:10:24 crc kubenswrapper[4764]: I0309 14:10:24.008856 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-69655fd4bf-qlzqn"] Mar 09 14:10:24 crc kubenswrapper[4764]: I0309 14:10:24.030291 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/30f42bb2-6e26-4cbb-942b-a7d4ede4f128-scripts\") pod \"manila-scheduler-0\" (UID: \"30f42bb2-6e26-4cbb-942b-a7d4ede4f128\") " pod="openstack/manila-scheduler-0" Mar 09 14:10:24 crc kubenswrapper[4764]: I0309 14:10:24.050535 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30f42bb2-6e26-4cbb-942b-a7d4ede4f128-combined-ca-bundle\") pod \"manila-scheduler-0\" (UID: \"30f42bb2-6e26-4cbb-942b-a7d4ede4f128\") " pod="openstack/manila-scheduler-0" Mar 09 14:10:24 crc kubenswrapper[4764]: I0309 14:10:24.051152 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lq2r9\" (UniqueName: \"kubernetes.io/projected/30f42bb2-6e26-4cbb-942b-a7d4ede4f128-kube-api-access-lq2r9\") pod \"manila-scheduler-0\" (UID: \"30f42bb2-6e26-4cbb-942b-a7d4ede4f128\") " pod="openstack/manila-scheduler-0" Mar 09 14:10:24 crc kubenswrapper[4764]: I0309 14:10:24.092014 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/604c4e24-15e4-43e1-b08c-74d8337a2e71-combined-ca-bundle\") pod \"manila-share-share1-0\" (UID: \"604c4e24-15e4-43e1-b08c-74d8337a2e71\") " pod="openstack/manila-share-share1-0" Mar 09 14:10:24 crc kubenswrapper[4764]: I0309 14:10:24.092081 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wxzkm\" (UniqueName: \"kubernetes.io/projected/604c4e24-15e4-43e1-b08c-74d8337a2e71-kube-api-access-wxzkm\") pod \"manila-share-share1-0\" (UID: \"604c4e24-15e4-43e1-b08c-74d8337a2e71\") " pod="openstack/manila-share-share1-0" Mar 09 14:10:24 crc kubenswrapper[4764]: I0309 14:10:24.092111 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1552c7db-c992-4b43-8f1e-2b752d718f36-config\") pod \"dnsmasq-dns-69655fd4bf-qlzqn\" (UID: \"1552c7db-c992-4b43-8f1e-2b752d718f36\") " pod="openstack/dnsmasq-dns-69655fd4bf-qlzqn" Mar 09 14:10:24 crc kubenswrapper[4764]: I0309 14:10:24.092148 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/604c4e24-15e4-43e1-b08c-74d8337a2e71-config-data-custom\") pod \"manila-share-share1-0\" (UID: \"604c4e24-15e4-43e1-b08c-74d8337a2e71\") " pod="openstack/manila-share-share1-0" Mar 09 14:10:24 crc kubenswrapper[4764]: I0309 14:10:24.092187 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/604c4e24-15e4-43e1-b08c-74d8337a2e71-etc-machine-id\") pod \"manila-share-share1-0\" (UID: \"604c4e24-15e4-43e1-b08c-74d8337a2e71\") " pod="openstack/manila-share-share1-0" Mar 09 14:10:24 crc kubenswrapper[4764]: I0309 14:10:24.092213 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/configmap/1552c7db-c992-4b43-8f1e-2b752d718f36-openstack-edpm-ipam\") pod \"dnsmasq-dns-69655fd4bf-qlzqn\" (UID: \"1552c7db-c992-4b43-8f1e-2b752d718f36\") " pod="openstack/dnsmasq-dns-69655fd4bf-qlzqn" Mar 09 14:10:24 crc kubenswrapper[4764]: I0309 14:10:24.092239 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/604c4e24-15e4-43e1-b08c-74d8337a2e71-scripts\") pod \"manila-share-share1-0\" (UID: \"604c4e24-15e4-43e1-b08c-74d8337a2e71\") " pod="openstack/manila-share-share1-0" Mar 09 14:10:24 crc kubenswrapper[4764]: I0309 14:10:24.092270 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4pnsh\" (UniqueName: \"kubernetes.io/projected/1552c7db-c992-4b43-8f1e-2b752d718f36-kube-api-access-4pnsh\") pod \"dnsmasq-dns-69655fd4bf-qlzqn\" (UID: \"1552c7db-c992-4b43-8f1e-2b752d718f36\") " pod="openstack/dnsmasq-dns-69655fd4bf-qlzqn" Mar 09 14:10:24 crc kubenswrapper[4764]: I0309 14:10:24.092295 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1552c7db-c992-4b43-8f1e-2b752d718f36-ovsdbserver-sb\") pod \"dnsmasq-dns-69655fd4bf-qlzqn\" (UID: \"1552c7db-c992-4b43-8f1e-2b752d718f36\") " pod="openstack/dnsmasq-dns-69655fd4bf-qlzqn" Mar 09 14:10:24 crc kubenswrapper[4764]: I0309 14:10:24.092310 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/604c4e24-15e4-43e1-b08c-74d8337a2e71-var-lib-manila\") pod \"manila-share-share1-0\" (UID: \"604c4e24-15e4-43e1-b08c-74d8337a2e71\") " pod="openstack/manila-share-share1-0" Mar 09 14:10:24 crc kubenswrapper[4764]: I0309 14:10:24.092355 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/1552c7db-c992-4b43-8f1e-2b752d718f36-ovsdbserver-nb\") pod \"dnsmasq-dns-69655fd4bf-qlzqn\" (UID: \"1552c7db-c992-4b43-8f1e-2b752d718f36\") " pod="openstack/dnsmasq-dns-69655fd4bf-qlzqn" Mar 09 14:10:24 crc kubenswrapper[4764]: I0309 14:10:24.092411 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/604c4e24-15e4-43e1-b08c-74d8337a2e71-ceph\") pod \"manila-share-share1-0\" (UID: \"604c4e24-15e4-43e1-b08c-74d8337a2e71\") " pod="openstack/manila-share-share1-0" Mar 09 14:10:24 crc kubenswrapper[4764]: I0309 14:10:24.092440 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/604c4e24-15e4-43e1-b08c-74d8337a2e71-config-data\") pod \"manila-share-share1-0\" (UID: \"604c4e24-15e4-43e1-b08c-74d8337a2e71\") " pod="openstack/manila-share-share1-0" Mar 09 14:10:24 crc kubenswrapper[4764]: I0309 14:10:24.092462 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1552c7db-c992-4b43-8f1e-2b752d718f36-dns-svc\") pod \"dnsmasq-dns-69655fd4bf-qlzqn\" (UID: \"1552c7db-c992-4b43-8f1e-2b752d718f36\") " pod="openstack/dnsmasq-dns-69655fd4bf-qlzqn" Mar 09 14:10:24 crc kubenswrapper[4764]: I0309 14:10:24.092922 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/604c4e24-15e4-43e1-b08c-74d8337a2e71-var-lib-manila\") pod \"manila-share-share1-0\" (UID: \"604c4e24-15e4-43e1-b08c-74d8337a2e71\") " pod="openstack/manila-share-share1-0" Mar 09 14:10:24 crc kubenswrapper[4764]: I0309 14:10:24.093413 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/604c4e24-15e4-43e1-b08c-74d8337a2e71-etc-machine-id\") pod \"manila-share-share1-0\" (UID: 
\"604c4e24-15e4-43e1-b08c-74d8337a2e71\") " pod="openstack/manila-share-share1-0" Mar 09 14:10:24 crc kubenswrapper[4764]: I0309 14:10:24.104843 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/604c4e24-15e4-43e1-b08c-74d8337a2e71-config-data\") pod \"manila-share-share1-0\" (UID: \"604c4e24-15e4-43e1-b08c-74d8337a2e71\") " pod="openstack/manila-share-share1-0" Mar 09 14:10:24 crc kubenswrapper[4764]: I0309 14:10:24.104942 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-api-0"] Mar 09 14:10:24 crc kubenswrapper[4764]: I0309 14:10:24.106584 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/604c4e24-15e4-43e1-b08c-74d8337a2e71-combined-ca-bundle\") pod \"manila-share-share1-0\" (UID: \"604c4e24-15e4-43e1-b08c-74d8337a2e71\") " pod="openstack/manila-share-share1-0" Mar 09 14:10:24 crc kubenswrapper[4764]: I0309 14:10:24.114162 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-api-0" Mar 09 14:10:24 crc kubenswrapper[4764]: I0309 14:10:24.115886 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-scheduler-0" Mar 09 14:10:24 crc kubenswrapper[4764]: I0309 14:10:24.117223 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/604c4e24-15e4-43e1-b08c-74d8337a2e71-config-data-custom\") pod \"manila-share-share1-0\" (UID: \"604c4e24-15e4-43e1-b08c-74d8337a2e71\") " pod="openstack/manila-share-share1-0" Mar 09 14:10:24 crc kubenswrapper[4764]: I0309 14:10:24.117892 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/604c4e24-15e4-43e1-b08c-74d8337a2e71-scripts\") pod \"manila-share-share1-0\" (UID: \"604c4e24-15e4-43e1-b08c-74d8337a2e71\") " pod="openstack/manila-share-share1-0" Mar 09 14:10:24 crc kubenswrapper[4764]: I0309 14:10:24.120090 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-api-config-data" Mar 09 14:10:24 crc kubenswrapper[4764]: I0309 14:10:24.129358 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wxzkm\" (UniqueName: \"kubernetes.io/projected/604c4e24-15e4-43e1-b08c-74d8337a2e71-kube-api-access-wxzkm\") pod \"manila-share-share1-0\" (UID: \"604c4e24-15e4-43e1-b08c-74d8337a2e71\") " pod="openstack/manila-share-share1-0" Mar 09 14:10:24 crc kubenswrapper[4764]: I0309 14:10:24.134627 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/604c4e24-15e4-43e1-b08c-74d8337a2e71-ceph\") pod \"manila-share-share1-0\" (UID: \"604c4e24-15e4-43e1-b08c-74d8337a2e71\") " pod="openstack/manila-share-share1-0" Mar 09 14:10:24 crc kubenswrapper[4764]: I0309 14:10:24.153128 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-api-0"] Mar 09 14:10:24 crc kubenswrapper[4764]: I0309 14:10:24.179127 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-share-share1-0" Mar 09 14:10:24 crc kubenswrapper[4764]: I0309 14:10:24.198906 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6b1ec63a-2933-4ca0-b695-d491eff9b77a-config-data-custom\") pod \"manila-api-0\" (UID: \"6b1ec63a-2933-4ca0-b695-d491eff9b77a\") " pod="openstack/manila-api-0" Mar 09 14:10:24 crc kubenswrapper[4764]: I0309 14:10:24.199544 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b1ec63a-2933-4ca0-b695-d491eff9b77a-combined-ca-bundle\") pod \"manila-api-0\" (UID: \"6b1ec63a-2933-4ca0-b695-d491eff9b77a\") " pod="openstack/manila-api-0" Mar 09 14:10:24 crc kubenswrapper[4764]: I0309 14:10:24.199718 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6b1ec63a-2933-4ca0-b695-d491eff9b77a-logs\") pod \"manila-api-0\" (UID: \"6b1ec63a-2933-4ca0-b695-d491eff9b77a\") " pod="openstack/manila-api-0" Mar 09 14:10:24 crc kubenswrapper[4764]: I0309 14:10:24.199792 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6b1ec63a-2933-4ca0-b695-d491eff9b77a-scripts\") pod \"manila-api-0\" (UID: \"6b1ec63a-2933-4ca0-b695-d491eff9b77a\") " pod="openstack/manila-api-0" Mar 09 14:10:24 crc kubenswrapper[4764]: I0309 14:10:24.199843 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/1552c7db-c992-4b43-8f1e-2b752d718f36-openstack-edpm-ipam\") pod \"dnsmasq-dns-69655fd4bf-qlzqn\" (UID: \"1552c7db-c992-4b43-8f1e-2b752d718f36\") " pod="openstack/dnsmasq-dns-69655fd4bf-qlzqn" Mar 09 14:10:24 crc kubenswrapper[4764]: I0309 
14:10:24.199940 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4pnsh\" (UniqueName: \"kubernetes.io/projected/1552c7db-c992-4b43-8f1e-2b752d718f36-kube-api-access-4pnsh\") pod \"dnsmasq-dns-69655fd4bf-qlzqn\" (UID: \"1552c7db-c992-4b43-8f1e-2b752d718f36\") " pod="openstack/dnsmasq-dns-69655fd4bf-qlzqn" Mar 09 14:10:24 crc kubenswrapper[4764]: I0309 14:10:24.199971 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6b1ec63a-2933-4ca0-b695-d491eff9b77a-etc-machine-id\") pod \"manila-api-0\" (UID: \"6b1ec63a-2933-4ca0-b695-d491eff9b77a\") " pod="openstack/manila-api-0" Mar 09 14:10:24 crc kubenswrapper[4764]: I0309 14:10:24.199996 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1552c7db-c992-4b43-8f1e-2b752d718f36-ovsdbserver-sb\") pod \"dnsmasq-dns-69655fd4bf-qlzqn\" (UID: \"1552c7db-c992-4b43-8f1e-2b752d718f36\") " pod="openstack/dnsmasq-dns-69655fd4bf-qlzqn" Mar 09 14:10:24 crc kubenswrapper[4764]: I0309 14:10:24.200028 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6b1ec63a-2933-4ca0-b695-d491eff9b77a-config-data\") pod \"manila-api-0\" (UID: \"6b1ec63a-2933-4ca0-b695-d491eff9b77a\") " pod="openstack/manila-api-0" Mar 09 14:10:24 crc kubenswrapper[4764]: I0309 14:10:24.200123 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1552c7db-c992-4b43-8f1e-2b752d718f36-ovsdbserver-nb\") pod \"dnsmasq-dns-69655fd4bf-qlzqn\" (UID: \"1552c7db-c992-4b43-8f1e-2b752d718f36\") " pod="openstack/dnsmasq-dns-69655fd4bf-qlzqn" Mar 09 14:10:24 crc kubenswrapper[4764]: I0309 14:10:24.201262 4764 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/1552c7db-c992-4b43-8f1e-2b752d718f36-openstack-edpm-ipam\") pod \"dnsmasq-dns-69655fd4bf-qlzqn\" (UID: \"1552c7db-c992-4b43-8f1e-2b752d718f36\") " pod="openstack/dnsmasq-dns-69655fd4bf-qlzqn" Mar 09 14:10:24 crc kubenswrapper[4764]: I0309 14:10:24.201848 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1552c7db-c992-4b43-8f1e-2b752d718f36-dns-svc\") pod \"dnsmasq-dns-69655fd4bf-qlzqn\" (UID: \"1552c7db-c992-4b43-8f1e-2b752d718f36\") " pod="openstack/dnsmasq-dns-69655fd4bf-qlzqn" Mar 09 14:10:24 crc kubenswrapper[4764]: I0309 14:10:24.201991 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1552c7db-c992-4b43-8f1e-2b752d718f36-config\") pod \"dnsmasq-dns-69655fd4bf-qlzqn\" (UID: \"1552c7db-c992-4b43-8f1e-2b752d718f36\") " pod="openstack/dnsmasq-dns-69655fd4bf-qlzqn" Mar 09 14:10:24 crc kubenswrapper[4764]: I0309 14:10:24.202024 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-59kpm\" (UniqueName: \"kubernetes.io/projected/6b1ec63a-2933-4ca0-b695-d491eff9b77a-kube-api-access-59kpm\") pod \"manila-api-0\" (UID: \"6b1ec63a-2933-4ca0-b695-d491eff9b77a\") " pod="openstack/manila-api-0" Mar 09 14:10:24 crc kubenswrapper[4764]: I0309 14:10:24.202170 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1552c7db-c992-4b43-8f1e-2b752d718f36-ovsdbserver-sb\") pod \"dnsmasq-dns-69655fd4bf-qlzqn\" (UID: \"1552c7db-c992-4b43-8f1e-2b752d718f36\") " pod="openstack/dnsmasq-dns-69655fd4bf-qlzqn" Mar 09 14:10:24 crc kubenswrapper[4764]: I0309 14:10:24.203003 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/1552c7db-c992-4b43-8f1e-2b752d718f36-dns-svc\") pod \"dnsmasq-dns-69655fd4bf-qlzqn\" (UID: \"1552c7db-c992-4b43-8f1e-2b752d718f36\") " pod="openstack/dnsmasq-dns-69655fd4bf-qlzqn" Mar 09 14:10:24 crc kubenswrapper[4764]: I0309 14:10:24.203766 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1552c7db-c992-4b43-8f1e-2b752d718f36-ovsdbserver-nb\") pod \"dnsmasq-dns-69655fd4bf-qlzqn\" (UID: \"1552c7db-c992-4b43-8f1e-2b752d718f36\") " pod="openstack/dnsmasq-dns-69655fd4bf-qlzqn" Mar 09 14:10:24 crc kubenswrapper[4764]: I0309 14:10:24.208786 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1552c7db-c992-4b43-8f1e-2b752d718f36-config\") pod \"dnsmasq-dns-69655fd4bf-qlzqn\" (UID: \"1552c7db-c992-4b43-8f1e-2b752d718f36\") " pod="openstack/dnsmasq-dns-69655fd4bf-qlzqn" Mar 09 14:10:24 crc kubenswrapper[4764]: I0309 14:10:24.235956 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4pnsh\" (UniqueName: \"kubernetes.io/projected/1552c7db-c992-4b43-8f1e-2b752d718f36-kube-api-access-4pnsh\") pod \"dnsmasq-dns-69655fd4bf-qlzqn\" (UID: \"1552c7db-c992-4b43-8f1e-2b752d718f36\") " pod="openstack/dnsmasq-dns-69655fd4bf-qlzqn" Mar 09 14:10:24 crc kubenswrapper[4764]: I0309 14:10:24.307451 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6b1ec63a-2933-4ca0-b695-d491eff9b77a-etc-machine-id\") pod \"manila-api-0\" (UID: \"6b1ec63a-2933-4ca0-b695-d491eff9b77a\") " pod="openstack/manila-api-0" Mar 09 14:10:24 crc kubenswrapper[4764]: I0309 14:10:24.307517 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6b1ec63a-2933-4ca0-b695-d491eff9b77a-config-data\") pod \"manila-api-0\" (UID: 
\"6b1ec63a-2933-4ca0-b695-d491eff9b77a\") " pod="openstack/manila-api-0" Mar 09 14:10:24 crc kubenswrapper[4764]: I0309 14:10:24.307711 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-59kpm\" (UniqueName: \"kubernetes.io/projected/6b1ec63a-2933-4ca0-b695-d491eff9b77a-kube-api-access-59kpm\") pod \"manila-api-0\" (UID: \"6b1ec63a-2933-4ca0-b695-d491eff9b77a\") " pod="openstack/manila-api-0" Mar 09 14:10:24 crc kubenswrapper[4764]: I0309 14:10:24.307724 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6b1ec63a-2933-4ca0-b695-d491eff9b77a-etc-machine-id\") pod \"manila-api-0\" (UID: \"6b1ec63a-2933-4ca0-b695-d491eff9b77a\") " pod="openstack/manila-api-0" Mar 09 14:10:24 crc kubenswrapper[4764]: I0309 14:10:24.307744 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6b1ec63a-2933-4ca0-b695-d491eff9b77a-config-data-custom\") pod \"manila-api-0\" (UID: \"6b1ec63a-2933-4ca0-b695-d491eff9b77a\") " pod="openstack/manila-api-0" Mar 09 14:10:24 crc kubenswrapper[4764]: I0309 14:10:24.307839 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b1ec63a-2933-4ca0-b695-d491eff9b77a-combined-ca-bundle\") pod \"manila-api-0\" (UID: \"6b1ec63a-2933-4ca0-b695-d491eff9b77a\") " pod="openstack/manila-api-0" Mar 09 14:10:24 crc kubenswrapper[4764]: I0309 14:10:24.307995 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6b1ec63a-2933-4ca0-b695-d491eff9b77a-logs\") pod \"manila-api-0\" (UID: \"6b1ec63a-2933-4ca0-b695-d491eff9b77a\") " pod="openstack/manila-api-0" Mar 09 14:10:24 crc kubenswrapper[4764]: I0309 14:10:24.308018 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"scripts\" (UniqueName: \"kubernetes.io/secret/6b1ec63a-2933-4ca0-b695-d491eff9b77a-scripts\") pod \"manila-api-0\" (UID: \"6b1ec63a-2933-4ca0-b695-d491eff9b77a\") " pod="openstack/manila-api-0" Mar 09 14:10:24 crc kubenswrapper[4764]: I0309 14:10:24.310086 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6b1ec63a-2933-4ca0-b695-d491eff9b77a-logs\") pod \"manila-api-0\" (UID: \"6b1ec63a-2933-4ca0-b695-d491eff9b77a\") " pod="openstack/manila-api-0" Mar 09 14:10:24 crc kubenswrapper[4764]: I0309 14:10:24.317263 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6b1ec63a-2933-4ca0-b695-d491eff9b77a-scripts\") pod \"manila-api-0\" (UID: \"6b1ec63a-2933-4ca0-b695-d491eff9b77a\") " pod="openstack/manila-api-0" Mar 09 14:10:24 crc kubenswrapper[4764]: I0309 14:10:24.317544 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6b1ec63a-2933-4ca0-b695-d491eff9b77a-config-data\") pod \"manila-api-0\" (UID: \"6b1ec63a-2933-4ca0-b695-d491eff9b77a\") " pod="openstack/manila-api-0" Mar 09 14:10:24 crc kubenswrapper[4764]: I0309 14:10:24.323614 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b1ec63a-2933-4ca0-b695-d491eff9b77a-combined-ca-bundle\") pod \"manila-api-0\" (UID: \"6b1ec63a-2933-4ca0-b695-d491eff9b77a\") " pod="openstack/manila-api-0" Mar 09 14:10:24 crc kubenswrapper[4764]: I0309 14:10:24.329905 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6b1ec63a-2933-4ca0-b695-d491eff9b77a-config-data-custom\") pod \"manila-api-0\" (UID: \"6b1ec63a-2933-4ca0-b695-d491eff9b77a\") " pod="openstack/manila-api-0" Mar 09 14:10:24 crc kubenswrapper[4764]: I0309 14:10:24.337413 4764 util.go:30] "No sandbox 
for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-69655fd4bf-qlzqn" Mar 09 14:10:24 crc kubenswrapper[4764]: I0309 14:10:24.352295 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-59kpm\" (UniqueName: \"kubernetes.io/projected/6b1ec63a-2933-4ca0-b695-d491eff9b77a-kube-api-access-59kpm\") pod \"manila-api-0\" (UID: \"6b1ec63a-2933-4ca0-b695-d491eff9b77a\") " pod="openstack/manila-api-0" Mar 09 14:10:24 crc kubenswrapper[4764]: I0309 14:10:24.353412 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-api-0" Mar 09 14:10:24 crc kubenswrapper[4764]: I0309 14:10:24.832942 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-scheduler-0"] Mar 09 14:10:25 crc kubenswrapper[4764]: I0309 14:10:25.024513 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-share-share1-0"] Mar 09 14:10:25 crc kubenswrapper[4764]: I0309 14:10:25.231451 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-69655fd4bf-qlzqn"] Mar 09 14:10:25 crc kubenswrapper[4764]: I0309 14:10:25.361180 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-api-0"] Mar 09 14:10:25 crc kubenswrapper[4764]: W0309 14:10:25.450270 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6b1ec63a_2933_4ca0_b695_d491eff9b77a.slice/crio-9f8093a4057728ba1ffbcdd3c65ef3993ffdc953d49b97b14351ed827aa851fa WatchSource:0}: Error finding container 9f8093a4057728ba1ffbcdd3c65ef3993ffdc953d49b97b14351ed827aa851fa: Status 404 returned error can't find the container with id 9f8093a4057728ba1ffbcdd3c65ef3993ffdc953d49b97b14351ed827aa851fa Mar 09 14:10:25 crc kubenswrapper[4764]: I0309 14:10:25.647727 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-vks4j" 
podUID="f9c5571a-71f1-42d9-8025-2f51e13a5f03" containerName="registry-server" containerID="cri-o://fa0246eba6b220102d13a010f48f2dada3c5eb73ad9e6ea15c1975fe88b59d3b" gracePeriod=2 Mar 09 14:10:25 crc kubenswrapper[4764]: I0309 14:10:25.656863 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-69655fd4bf-qlzqn" event={"ID":"1552c7db-c992-4b43-8f1e-2b752d718f36","Type":"ContainerStarted","Data":"6bd1eef56b70234d3056028f3c04208e36deaced74a2fb80c47865a011b7fdf5"} Mar 09 14:10:25 crc kubenswrapper[4764]: I0309 14:10:25.656998 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"604c4e24-15e4-43e1-b08c-74d8337a2e71","Type":"ContainerStarted","Data":"99bc2f9e9539dc58eaf52226466b27a3c1c886f9a53c573b32136461c86d2a63"} Mar 09 14:10:25 crc kubenswrapper[4764]: I0309 14:10:25.657019 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"6b1ec63a-2933-4ca0-b695-d491eff9b77a","Type":"ContainerStarted","Data":"9f8093a4057728ba1ffbcdd3c65ef3993ffdc953d49b97b14351ed827aa851fa"} Mar 09 14:10:25 crc kubenswrapper[4764]: I0309 14:10:25.657038 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"30f42bb2-6e26-4cbb-942b-a7d4ede4f128","Type":"ContainerStarted","Data":"69e9c4821e7bd2b8ea5bd5a2a4ce9a1fe30fca8c9b901cad13802f0443f55e38"} Mar 09 14:10:26 crc kubenswrapper[4764]: I0309 14:10:26.323194 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vks4j" Mar 09 14:10:26 crc kubenswrapper[4764]: I0309 14:10:26.404378 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f9c5571a-71f1-42d9-8025-2f51e13a5f03-catalog-content\") pod \"f9c5571a-71f1-42d9-8025-2f51e13a5f03\" (UID: \"f9c5571a-71f1-42d9-8025-2f51e13a5f03\") " Mar 09 14:10:26 crc kubenswrapper[4764]: I0309 14:10:26.404514 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-52qrw\" (UniqueName: \"kubernetes.io/projected/f9c5571a-71f1-42d9-8025-2f51e13a5f03-kube-api-access-52qrw\") pod \"f9c5571a-71f1-42d9-8025-2f51e13a5f03\" (UID: \"f9c5571a-71f1-42d9-8025-2f51e13a5f03\") " Mar 09 14:10:26 crc kubenswrapper[4764]: I0309 14:10:26.404748 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f9c5571a-71f1-42d9-8025-2f51e13a5f03-utilities\") pod \"f9c5571a-71f1-42d9-8025-2f51e13a5f03\" (UID: \"f9c5571a-71f1-42d9-8025-2f51e13a5f03\") " Mar 09 14:10:26 crc kubenswrapper[4764]: I0309 14:10:26.406137 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f9c5571a-71f1-42d9-8025-2f51e13a5f03-utilities" (OuterVolumeSpecName: "utilities") pod "f9c5571a-71f1-42d9-8025-2f51e13a5f03" (UID: "f9c5571a-71f1-42d9-8025-2f51e13a5f03"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 14:10:26 crc kubenswrapper[4764]: I0309 14:10:26.441380 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f9c5571a-71f1-42d9-8025-2f51e13a5f03-kube-api-access-52qrw" (OuterVolumeSpecName: "kube-api-access-52qrw") pod "f9c5571a-71f1-42d9-8025-2f51e13a5f03" (UID: "f9c5571a-71f1-42d9-8025-2f51e13a5f03"). InnerVolumeSpecName "kube-api-access-52qrw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:10:26 crc kubenswrapper[4764]: I0309 14:10:26.498443 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f9c5571a-71f1-42d9-8025-2f51e13a5f03-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f9c5571a-71f1-42d9-8025-2f51e13a5f03" (UID: "f9c5571a-71f1-42d9-8025-2f51e13a5f03"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 14:10:26 crc kubenswrapper[4764]: I0309 14:10:26.508014 4764 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f9c5571a-71f1-42d9-8025-2f51e13a5f03-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 09 14:10:26 crc kubenswrapper[4764]: I0309 14:10:26.508061 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-52qrw\" (UniqueName: \"kubernetes.io/projected/f9c5571a-71f1-42d9-8025-2f51e13a5f03-kube-api-access-52qrw\") on node \"crc\" DevicePath \"\"" Mar 09 14:10:26 crc kubenswrapper[4764]: I0309 14:10:26.508076 4764 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f9c5571a-71f1-42d9-8025-2f51e13a5f03-utilities\") on node \"crc\" DevicePath \"\"" Mar 09 14:10:26 crc kubenswrapper[4764]: I0309 14:10:26.672697 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"6b1ec63a-2933-4ca0-b695-d491eff9b77a","Type":"ContainerStarted","Data":"13ac0cacb1783ca93930b2f289e68734abb6dbe01c5928e15fc30410946d427b"} Mar 09 14:10:26 crc kubenswrapper[4764]: I0309 14:10:26.683162 4764 generic.go:334] "Generic (PLEG): container finished" podID="f9c5571a-71f1-42d9-8025-2f51e13a5f03" containerID="fa0246eba6b220102d13a010f48f2dada3c5eb73ad9e6ea15c1975fe88b59d3b" exitCode=0 Mar 09 14:10:26 crc kubenswrapper[4764]: I0309 14:10:26.683263 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-marketplace-vks4j" event={"ID":"f9c5571a-71f1-42d9-8025-2f51e13a5f03","Type":"ContainerDied","Data":"fa0246eba6b220102d13a010f48f2dada3c5eb73ad9e6ea15c1975fe88b59d3b"} Mar 09 14:10:26 crc kubenswrapper[4764]: I0309 14:10:26.683308 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vks4j" event={"ID":"f9c5571a-71f1-42d9-8025-2f51e13a5f03","Type":"ContainerDied","Data":"3870660a56f929c820c0d702e6958704a17b57192fa2a52ac26a15baa43c7554"} Mar 09 14:10:26 crc kubenswrapper[4764]: I0309 14:10:26.683333 4764 scope.go:117] "RemoveContainer" containerID="fa0246eba6b220102d13a010f48f2dada3c5eb73ad9e6ea15c1975fe88b59d3b" Mar 09 14:10:26 crc kubenswrapper[4764]: I0309 14:10:26.683551 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vks4j" Mar 09 14:10:26 crc kubenswrapper[4764]: I0309 14:10:26.697065 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"30f42bb2-6e26-4cbb-942b-a7d4ede4f128","Type":"ContainerStarted","Data":"b743badbf09f2d2b0dc2fef97dfa9da34473a9bbc60350880026711252d7f4f7"} Mar 09 14:10:26 crc kubenswrapper[4764]: I0309 14:10:26.704693 4764 generic.go:334] "Generic (PLEG): container finished" podID="1552c7db-c992-4b43-8f1e-2b752d718f36" containerID="69acb3fb37ca44376fdac26f2a551a3d9baa29888217269c37f66f9e543d7d8b" exitCode=0 Mar 09 14:10:26 crc kubenswrapper[4764]: I0309 14:10:26.704749 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-69655fd4bf-qlzqn" event={"ID":"1552c7db-c992-4b43-8f1e-2b752d718f36","Type":"ContainerDied","Data":"69acb3fb37ca44376fdac26f2a551a3d9baa29888217269c37f66f9e543d7d8b"} Mar 09 14:10:26 crc kubenswrapper[4764]: I0309 14:10:26.743163 4764 scope.go:117] "RemoveContainer" containerID="5810019c18336df6ff05082f8f948492687897d783cdf12ed4c00b5fb1717462" Mar 09 14:10:26 crc kubenswrapper[4764]: I0309 
14:10:26.769353 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-vks4j"] Mar 09 14:10:26 crc kubenswrapper[4764]: I0309 14:10:26.785731 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-vks4j"] Mar 09 14:10:26 crc kubenswrapper[4764]: I0309 14:10:26.900695 4764 scope.go:117] "RemoveContainer" containerID="c4a8e2df77908e08ac398134c388212868113a4e069dbf9e9ca754666552eee8" Mar 09 14:10:26 crc kubenswrapper[4764]: I0309 14:10:26.961444 4764 scope.go:117] "RemoveContainer" containerID="fa0246eba6b220102d13a010f48f2dada3c5eb73ad9e6ea15c1975fe88b59d3b" Mar 09 14:10:26 crc kubenswrapper[4764]: E0309 14:10:26.962226 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fa0246eba6b220102d13a010f48f2dada3c5eb73ad9e6ea15c1975fe88b59d3b\": container with ID starting with fa0246eba6b220102d13a010f48f2dada3c5eb73ad9e6ea15c1975fe88b59d3b not found: ID does not exist" containerID="fa0246eba6b220102d13a010f48f2dada3c5eb73ad9e6ea15c1975fe88b59d3b" Mar 09 14:10:26 crc kubenswrapper[4764]: I0309 14:10:26.962298 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fa0246eba6b220102d13a010f48f2dada3c5eb73ad9e6ea15c1975fe88b59d3b"} err="failed to get container status \"fa0246eba6b220102d13a010f48f2dada3c5eb73ad9e6ea15c1975fe88b59d3b\": rpc error: code = NotFound desc = could not find container \"fa0246eba6b220102d13a010f48f2dada3c5eb73ad9e6ea15c1975fe88b59d3b\": container with ID starting with fa0246eba6b220102d13a010f48f2dada3c5eb73ad9e6ea15c1975fe88b59d3b not found: ID does not exist" Mar 09 14:10:26 crc kubenswrapper[4764]: I0309 14:10:26.962331 4764 scope.go:117] "RemoveContainer" containerID="5810019c18336df6ff05082f8f948492687897d783cdf12ed4c00b5fb1717462" Mar 09 14:10:26 crc kubenswrapper[4764]: E0309 14:10:26.962895 4764 log.go:32] "ContainerStatus from 
runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5810019c18336df6ff05082f8f948492687897d783cdf12ed4c00b5fb1717462\": container with ID starting with 5810019c18336df6ff05082f8f948492687897d783cdf12ed4c00b5fb1717462 not found: ID does not exist" containerID="5810019c18336df6ff05082f8f948492687897d783cdf12ed4c00b5fb1717462" Mar 09 14:10:26 crc kubenswrapper[4764]: I0309 14:10:26.962946 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5810019c18336df6ff05082f8f948492687897d783cdf12ed4c00b5fb1717462"} err="failed to get container status \"5810019c18336df6ff05082f8f948492687897d783cdf12ed4c00b5fb1717462\": rpc error: code = NotFound desc = could not find container \"5810019c18336df6ff05082f8f948492687897d783cdf12ed4c00b5fb1717462\": container with ID starting with 5810019c18336df6ff05082f8f948492687897d783cdf12ed4c00b5fb1717462 not found: ID does not exist" Mar 09 14:10:26 crc kubenswrapper[4764]: I0309 14:10:26.962984 4764 scope.go:117] "RemoveContainer" containerID="c4a8e2df77908e08ac398134c388212868113a4e069dbf9e9ca754666552eee8" Mar 09 14:10:26 crc kubenswrapper[4764]: E0309 14:10:26.963536 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c4a8e2df77908e08ac398134c388212868113a4e069dbf9e9ca754666552eee8\": container with ID starting with c4a8e2df77908e08ac398134c388212868113a4e069dbf9e9ca754666552eee8 not found: ID does not exist" containerID="c4a8e2df77908e08ac398134c388212868113a4e069dbf9e9ca754666552eee8" Mar 09 14:10:26 crc kubenswrapper[4764]: I0309 14:10:26.963588 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c4a8e2df77908e08ac398134c388212868113a4e069dbf9e9ca754666552eee8"} err="failed to get container status \"c4a8e2df77908e08ac398134c388212868113a4e069dbf9e9ca754666552eee8\": rpc error: code = NotFound desc = could not find container 
\"c4a8e2df77908e08ac398134c388212868113a4e069dbf9e9ca754666552eee8\": container with ID starting with c4a8e2df77908e08ac398134c388212868113a4e069dbf9e9ca754666552eee8 not found: ID does not exist" Mar 09 14:10:27 crc kubenswrapper[4764]: I0309 14:10:27.264453 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-api-0"] Mar 09 14:10:27 crc kubenswrapper[4764]: I0309 14:10:27.576161 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f9c5571a-71f1-42d9-8025-2f51e13a5f03" path="/var/lib/kubelet/pods/f9c5571a-71f1-42d9-8025-2f51e13a5f03/volumes" Mar 09 14:10:27 crc kubenswrapper[4764]: I0309 14:10:27.723100 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"6b1ec63a-2933-4ca0-b695-d491eff9b77a","Type":"ContainerStarted","Data":"5233e0d8d749134147c23cffe783f6754b3b1078a5357cfc1ad88930b4c34b07"} Mar 09 14:10:27 crc kubenswrapper[4764]: I0309 14:10:27.723260 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/manila-api-0" podUID="6b1ec63a-2933-4ca0-b695-d491eff9b77a" containerName="manila-api-log" containerID="cri-o://13ac0cacb1783ca93930b2f289e68734abb6dbe01c5928e15fc30410946d427b" gracePeriod=30 Mar 09 14:10:27 crc kubenswrapper[4764]: I0309 14:10:27.723473 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/manila-api-0" podUID="6b1ec63a-2933-4ca0-b695-d491eff9b77a" containerName="manila-api" containerID="cri-o://5233e0d8d749134147c23cffe783f6754b3b1078a5357cfc1ad88930b4c34b07" gracePeriod=30 Mar 09 14:10:27 crc kubenswrapper[4764]: I0309 14:10:27.723795 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/manila-api-0" Mar 09 14:10:27 crc kubenswrapper[4764]: I0309 14:10:27.731905 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" 
event={"ID":"30f42bb2-6e26-4cbb-942b-a7d4ede4f128","Type":"ContainerStarted","Data":"cbc2d80e7e756e93ed514bd09529210c2d50987e0f38ca0f0f28ae42fe9a1fc9"} Mar 09 14:10:27 crc kubenswrapper[4764]: I0309 14:10:27.743973 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-69655fd4bf-qlzqn" event={"ID":"1552c7db-c992-4b43-8f1e-2b752d718f36","Type":"ContainerStarted","Data":"c90f8caec5bec2076fee7dbcaea582166bddda9aa7d92ec51426ab2db7e8b3a2"} Mar 09 14:10:27 crc kubenswrapper[4764]: I0309 14:10:27.744227 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-69655fd4bf-qlzqn" Mar 09 14:10:27 crc kubenswrapper[4764]: I0309 14:10:27.758014 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-api-0" podStartSLOduration=3.757979625 podStartE2EDuration="3.757979625s" podCreationTimestamp="2026-03-09 14:10:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 14:10:27.74880048 +0000 UTC m=+2982.998972398" watchObservedRunningTime="2026-03-09 14:10:27.757979625 +0000 UTC m=+2983.008151533" Mar 09 14:10:27 crc kubenswrapper[4764]: I0309 14:10:27.798015 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-69655fd4bf-qlzqn" podStartSLOduration=4.797975033 podStartE2EDuration="4.797975033s" podCreationTimestamp="2026-03-09 14:10:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 14:10:27.791542611 +0000 UTC m=+2983.041714529" watchObservedRunningTime="2026-03-09 14:10:27.797975033 +0000 UTC m=+2983.048146941" Mar 09 14:10:27 crc kubenswrapper[4764]: I0309 14:10:27.830765 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-scheduler-0" podStartSLOduration=4.131416666 
podStartE2EDuration="4.830730828s" podCreationTimestamp="2026-03-09 14:10:23 +0000 UTC" firstStartedPulling="2026-03-09 14:10:24.816075195 +0000 UTC m=+2980.066247103" lastFinishedPulling="2026-03-09 14:10:25.515389357 +0000 UTC m=+2980.765561265" observedRunningTime="2026-03-09 14:10:27.820832754 +0000 UTC m=+2983.071004662" watchObservedRunningTime="2026-03-09 14:10:27.830730828 +0000 UTC m=+2983.080902736" Mar 09 14:10:28 crc kubenswrapper[4764]: I0309 14:10:28.370247 4764 patch_prober.go:28] interesting pod/machine-config-daemon-xxczl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 09 14:10:28 crc kubenswrapper[4764]: I0309 14:10:28.370609 4764 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 09 14:10:28 crc kubenswrapper[4764]: I0309 14:10:28.370735 4764 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" Mar 09 14:10:28 crc kubenswrapper[4764]: I0309 14:10:28.371705 4764 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"085fb58e00f347ac089fd33d79299a3879c66993685cb2c39951f8d234eb8cfb"} pod="openshift-machine-config-operator/machine-config-daemon-xxczl" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 09 14:10:28 crc kubenswrapper[4764]: I0309 14:10:28.371756 4764 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922" containerName="machine-config-daemon" containerID="cri-o://085fb58e00f347ac089fd33d79299a3879c66993685cb2c39951f8d234eb8cfb" gracePeriod=600 Mar 09 14:10:28 crc kubenswrapper[4764]: E0309 14:10:28.536289 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xxczl_openshift-machine-config-operator(6bcdd179-43c2-427c-9fac-7155c122e922)\"" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922" Mar 09 14:10:28 crc kubenswrapper[4764]: I0309 14:10:28.771369 4764 generic.go:334] "Generic (PLEG): container finished" podID="6bcdd179-43c2-427c-9fac-7155c122e922" containerID="085fb58e00f347ac089fd33d79299a3879c66993685cb2c39951f8d234eb8cfb" exitCode=0 Mar 09 14:10:28 crc kubenswrapper[4764]: I0309 14:10:28.771848 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" event={"ID":"6bcdd179-43c2-427c-9fac-7155c122e922","Type":"ContainerDied","Data":"085fb58e00f347ac089fd33d79299a3879c66993685cb2c39951f8d234eb8cfb"} Mar 09 14:10:28 crc kubenswrapper[4764]: I0309 14:10:28.771893 4764 scope.go:117] "RemoveContainer" containerID="63776e57dad14b6a74a94432519b0b437cfc2082448e28e0a32706fa439569a0" Mar 09 14:10:28 crc kubenswrapper[4764]: I0309 14:10:28.772938 4764 scope.go:117] "RemoveContainer" containerID="085fb58e00f347ac089fd33d79299a3879c66993685cb2c39951f8d234eb8cfb" Mar 09 14:10:28 crc kubenswrapper[4764]: E0309 14:10:28.773228 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-xxczl_openshift-machine-config-operator(6bcdd179-43c2-427c-9fac-7155c122e922)\"" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922" Mar 09 14:10:28 crc kubenswrapper[4764]: I0309 14:10:28.781495 4764 generic.go:334] "Generic (PLEG): container finished" podID="6b1ec63a-2933-4ca0-b695-d491eff9b77a" containerID="5233e0d8d749134147c23cffe783f6754b3b1078a5357cfc1ad88930b4c34b07" exitCode=0 Mar 09 14:10:28 crc kubenswrapper[4764]: I0309 14:10:28.781537 4764 generic.go:334] "Generic (PLEG): container finished" podID="6b1ec63a-2933-4ca0-b695-d491eff9b77a" containerID="13ac0cacb1783ca93930b2f289e68734abb6dbe01c5928e15fc30410946d427b" exitCode=143 Mar 09 14:10:28 crc kubenswrapper[4764]: I0309 14:10:28.781591 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"6b1ec63a-2933-4ca0-b695-d491eff9b77a","Type":"ContainerDied","Data":"5233e0d8d749134147c23cffe783f6754b3b1078a5357cfc1ad88930b4c34b07"} Mar 09 14:10:28 crc kubenswrapper[4764]: I0309 14:10:28.781677 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"6b1ec63a-2933-4ca0-b695-d491eff9b77a","Type":"ContainerDied","Data":"13ac0cacb1783ca93930b2f289e68734abb6dbe01c5928e15fc30410946d427b"} Mar 09 14:10:28 crc kubenswrapper[4764]: I0309 14:10:28.874787 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-api-0" Mar 09 14:10:28 crc kubenswrapper[4764]: I0309 14:10:28.914689 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6b1ec63a-2933-4ca0-b695-d491eff9b77a-etc-machine-id\") pod \"6b1ec63a-2933-4ca0-b695-d491eff9b77a\" (UID: \"6b1ec63a-2933-4ca0-b695-d491eff9b77a\") " Mar 09 14:10:28 crc kubenswrapper[4764]: I0309 14:10:28.914787 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6b1ec63a-2933-4ca0-b695-d491eff9b77a-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "6b1ec63a-2933-4ca0-b695-d491eff9b77a" (UID: "6b1ec63a-2933-4ca0-b695-d491eff9b77a"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 09 14:10:28 crc kubenswrapper[4764]: I0309 14:10:28.914975 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6b1ec63a-2933-4ca0-b695-d491eff9b77a-logs\") pod \"6b1ec63a-2933-4ca0-b695-d491eff9b77a\" (UID: \"6b1ec63a-2933-4ca0-b695-d491eff9b77a\") " Mar 09 14:10:28 crc kubenswrapper[4764]: I0309 14:10:28.915612 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6b1ec63a-2933-4ca0-b695-d491eff9b77a-logs" (OuterVolumeSpecName: "logs") pod "6b1ec63a-2933-4ca0-b695-d491eff9b77a" (UID: "6b1ec63a-2933-4ca0-b695-d491eff9b77a"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 14:10:28 crc kubenswrapper[4764]: I0309 14:10:28.915732 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6b1ec63a-2933-4ca0-b695-d491eff9b77a-config-data-custom\") pod \"6b1ec63a-2933-4ca0-b695-d491eff9b77a\" (UID: \"6b1ec63a-2933-4ca0-b695-d491eff9b77a\") " Mar 09 14:10:28 crc kubenswrapper[4764]: I0309 14:10:28.917265 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-59kpm\" (UniqueName: \"kubernetes.io/projected/6b1ec63a-2933-4ca0-b695-d491eff9b77a-kube-api-access-59kpm\") pod \"6b1ec63a-2933-4ca0-b695-d491eff9b77a\" (UID: \"6b1ec63a-2933-4ca0-b695-d491eff9b77a\") " Mar 09 14:10:28 crc kubenswrapper[4764]: I0309 14:10:28.917438 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b1ec63a-2933-4ca0-b695-d491eff9b77a-combined-ca-bundle\") pod \"6b1ec63a-2933-4ca0-b695-d491eff9b77a\" (UID: \"6b1ec63a-2933-4ca0-b695-d491eff9b77a\") " Mar 09 14:10:28 crc kubenswrapper[4764]: I0309 14:10:28.917537 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6b1ec63a-2933-4ca0-b695-d491eff9b77a-scripts\") pod \"6b1ec63a-2933-4ca0-b695-d491eff9b77a\" (UID: \"6b1ec63a-2933-4ca0-b695-d491eff9b77a\") " Mar 09 14:10:28 crc kubenswrapper[4764]: I0309 14:10:28.917822 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6b1ec63a-2933-4ca0-b695-d491eff9b77a-config-data\") pod \"6b1ec63a-2933-4ca0-b695-d491eff9b77a\" (UID: \"6b1ec63a-2933-4ca0-b695-d491eff9b77a\") " Mar 09 14:10:28 crc kubenswrapper[4764]: I0309 14:10:28.920837 4764 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: 
\"kubernetes.io/host-path/6b1ec63a-2933-4ca0-b695-d491eff9b77a-etc-machine-id\") on node \"crc\" DevicePath \"\"" Mar 09 14:10:28 crc kubenswrapper[4764]: I0309 14:10:28.921075 4764 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6b1ec63a-2933-4ca0-b695-d491eff9b77a-logs\") on node \"crc\" DevicePath \"\"" Mar 09 14:10:28 crc kubenswrapper[4764]: I0309 14:10:28.929348 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6b1ec63a-2933-4ca0-b695-d491eff9b77a-kube-api-access-59kpm" (OuterVolumeSpecName: "kube-api-access-59kpm") pod "6b1ec63a-2933-4ca0-b695-d491eff9b77a" (UID: "6b1ec63a-2933-4ca0-b695-d491eff9b77a"). InnerVolumeSpecName "kube-api-access-59kpm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:10:28 crc kubenswrapper[4764]: I0309 14:10:28.940616 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6b1ec63a-2933-4ca0-b695-d491eff9b77a-scripts" (OuterVolumeSpecName: "scripts") pod "6b1ec63a-2933-4ca0-b695-d491eff9b77a" (UID: "6b1ec63a-2933-4ca0-b695-d491eff9b77a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:10:28 crc kubenswrapper[4764]: I0309 14:10:28.941871 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6b1ec63a-2933-4ca0-b695-d491eff9b77a-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "6b1ec63a-2933-4ca0-b695-d491eff9b77a" (UID: "6b1ec63a-2933-4ca0-b695-d491eff9b77a"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:10:28 crc kubenswrapper[4764]: I0309 14:10:28.979030 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6b1ec63a-2933-4ca0-b695-d491eff9b77a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6b1ec63a-2933-4ca0-b695-d491eff9b77a" (UID: "6b1ec63a-2933-4ca0-b695-d491eff9b77a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:10:29 crc kubenswrapper[4764]: I0309 14:10:29.023536 4764 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6b1ec63a-2933-4ca0-b695-d491eff9b77a-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 09 14:10:29 crc kubenswrapper[4764]: I0309 14:10:29.023585 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-59kpm\" (UniqueName: \"kubernetes.io/projected/6b1ec63a-2933-4ca0-b695-d491eff9b77a-kube-api-access-59kpm\") on node \"crc\" DevicePath \"\"" Mar 09 14:10:29 crc kubenswrapper[4764]: I0309 14:10:29.023610 4764 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b1ec63a-2933-4ca0-b695-d491eff9b77a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 14:10:29 crc kubenswrapper[4764]: I0309 14:10:29.023673 4764 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6b1ec63a-2933-4ca0-b695-d491eff9b77a-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 14:10:29 crc kubenswrapper[4764]: I0309 14:10:29.037474 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6b1ec63a-2933-4ca0-b695-d491eff9b77a-config-data" (OuterVolumeSpecName: "config-data") pod "6b1ec63a-2933-4ca0-b695-d491eff9b77a" (UID: "6b1ec63a-2933-4ca0-b695-d491eff9b77a"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:10:29 crc kubenswrapper[4764]: I0309 14:10:29.126499 4764 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6b1ec63a-2933-4ca0-b695-d491eff9b77a-config-data\") on node \"crc\" DevicePath \"\"" Mar 09 14:10:29 crc kubenswrapper[4764]: I0309 14:10:29.797893 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"6b1ec63a-2933-4ca0-b695-d491eff9b77a","Type":"ContainerDied","Data":"9f8093a4057728ba1ffbcdd3c65ef3993ffdc953d49b97b14351ed827aa851fa"} Mar 09 14:10:29 crc kubenswrapper[4764]: I0309 14:10:29.797953 4764 scope.go:117] "RemoveContainer" containerID="5233e0d8d749134147c23cffe783f6754b3b1078a5357cfc1ad88930b4c34b07" Mar 09 14:10:29 crc kubenswrapper[4764]: I0309 14:10:29.797979 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-api-0" Mar 09 14:10:29 crc kubenswrapper[4764]: I0309 14:10:29.830003 4764 scope.go:117] "RemoveContainer" containerID="13ac0cacb1783ca93930b2f289e68734abb6dbe01c5928e15fc30410946d427b" Mar 09 14:10:29 crc kubenswrapper[4764]: I0309 14:10:29.831759 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-api-0"] Mar 09 14:10:29 crc kubenswrapper[4764]: I0309 14:10:29.846774 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/manila-api-0"] Mar 09 14:10:29 crc kubenswrapper[4764]: I0309 14:10:29.874614 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-api-0"] Mar 09 14:10:29 crc kubenswrapper[4764]: E0309 14:10:29.875267 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9c5571a-71f1-42d9-8025-2f51e13a5f03" containerName="extract-content" Mar 09 14:10:29 crc kubenswrapper[4764]: I0309 14:10:29.875288 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9c5571a-71f1-42d9-8025-2f51e13a5f03" containerName="extract-content" Mar 09 14:10:29 crc 
kubenswrapper[4764]: E0309 14:10:29.875331 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6b1ec63a-2933-4ca0-b695-d491eff9b77a" containerName="manila-api" Mar 09 14:10:29 crc kubenswrapper[4764]: I0309 14:10:29.875338 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b1ec63a-2933-4ca0-b695-d491eff9b77a" containerName="manila-api" Mar 09 14:10:29 crc kubenswrapper[4764]: E0309 14:10:29.875364 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9c5571a-71f1-42d9-8025-2f51e13a5f03" containerName="extract-utilities" Mar 09 14:10:29 crc kubenswrapper[4764]: I0309 14:10:29.875375 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9c5571a-71f1-42d9-8025-2f51e13a5f03" containerName="extract-utilities" Mar 09 14:10:29 crc kubenswrapper[4764]: E0309 14:10:29.875384 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9c5571a-71f1-42d9-8025-2f51e13a5f03" containerName="registry-server" Mar 09 14:10:29 crc kubenswrapper[4764]: I0309 14:10:29.875391 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9c5571a-71f1-42d9-8025-2f51e13a5f03" containerName="registry-server" Mar 09 14:10:29 crc kubenswrapper[4764]: E0309 14:10:29.875402 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6b1ec63a-2933-4ca0-b695-d491eff9b77a" containerName="manila-api-log" Mar 09 14:10:29 crc kubenswrapper[4764]: I0309 14:10:29.875409 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b1ec63a-2933-4ca0-b695-d491eff9b77a" containerName="manila-api-log" Mar 09 14:10:29 crc kubenswrapper[4764]: I0309 14:10:29.875619 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="f9c5571a-71f1-42d9-8025-2f51e13a5f03" containerName="registry-server" Mar 09 14:10:29 crc kubenswrapper[4764]: I0309 14:10:29.875672 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="6b1ec63a-2933-4ca0-b695-d491eff9b77a" containerName="manila-api-log" Mar 09 14:10:29 crc 
kubenswrapper[4764]: I0309 14:10:29.875695 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="6b1ec63a-2933-4ca0-b695-d491eff9b77a" containerName="manila-api" Mar 09 14:10:29 crc kubenswrapper[4764]: I0309 14:10:29.876970 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-api-0" Mar 09 14:10:29 crc kubenswrapper[4764]: I0309 14:10:29.882218 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-api-config-data" Mar 09 14:10:29 crc kubenswrapper[4764]: I0309 14:10:29.882543 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-manila-internal-svc" Mar 09 14:10:29 crc kubenswrapper[4764]: I0309 14:10:29.882846 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-manila-public-svc" Mar 09 14:10:29 crc kubenswrapper[4764]: I0309 14:10:29.912776 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-api-0"] Mar 09 14:10:29 crc kubenswrapper[4764]: I0309 14:10:29.952424 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d2c7cfde-f7f3-47d1-ac62-f35f98d5b62d-config-data-custom\") pod \"manila-api-0\" (UID: \"d2c7cfde-f7f3-47d1-ac62-f35f98d5b62d\") " pod="openstack/manila-api-0" Mar 09 14:10:29 crc kubenswrapper[4764]: I0309 14:10:29.952503 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6tlj9\" (UniqueName: \"kubernetes.io/projected/d2c7cfde-f7f3-47d1-ac62-f35f98d5b62d-kube-api-access-6tlj9\") pod \"manila-api-0\" (UID: \"d2c7cfde-f7f3-47d1-ac62-f35f98d5b62d\") " pod="openstack/manila-api-0" Mar 09 14:10:29 crc kubenswrapper[4764]: I0309 14:10:29.952557 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/d2c7cfde-f7f3-47d1-ac62-f35f98d5b62d-combined-ca-bundle\") pod \"manila-api-0\" (UID: \"d2c7cfde-f7f3-47d1-ac62-f35f98d5b62d\") " pod="openstack/manila-api-0" Mar 09 14:10:29 crc kubenswrapper[4764]: I0309 14:10:29.952582 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d2c7cfde-f7f3-47d1-ac62-f35f98d5b62d-internal-tls-certs\") pod \"manila-api-0\" (UID: \"d2c7cfde-f7f3-47d1-ac62-f35f98d5b62d\") " pod="openstack/manila-api-0" Mar 09 14:10:29 crc kubenswrapper[4764]: I0309 14:10:29.952613 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d2c7cfde-f7f3-47d1-ac62-f35f98d5b62d-config-data\") pod \"manila-api-0\" (UID: \"d2c7cfde-f7f3-47d1-ac62-f35f98d5b62d\") " pod="openstack/manila-api-0" Mar 09 14:10:29 crc kubenswrapper[4764]: I0309 14:10:29.952711 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d2c7cfde-f7f3-47d1-ac62-f35f98d5b62d-scripts\") pod \"manila-api-0\" (UID: \"d2c7cfde-f7f3-47d1-ac62-f35f98d5b62d\") " pod="openstack/manila-api-0" Mar 09 14:10:29 crc kubenswrapper[4764]: I0309 14:10:29.952748 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d2c7cfde-f7f3-47d1-ac62-f35f98d5b62d-logs\") pod \"manila-api-0\" (UID: \"d2c7cfde-f7f3-47d1-ac62-f35f98d5b62d\") " pod="openstack/manila-api-0" Mar 09 14:10:29 crc kubenswrapper[4764]: I0309 14:10:29.954038 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d2c7cfde-f7f3-47d1-ac62-f35f98d5b62d-public-tls-certs\") pod \"manila-api-0\" (UID: \"d2c7cfde-f7f3-47d1-ac62-f35f98d5b62d\") " 
pod="openstack/manila-api-0" Mar 09 14:10:29 crc kubenswrapper[4764]: I0309 14:10:29.954128 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d2c7cfde-f7f3-47d1-ac62-f35f98d5b62d-etc-machine-id\") pod \"manila-api-0\" (UID: \"d2c7cfde-f7f3-47d1-ac62-f35f98d5b62d\") " pod="openstack/manila-api-0" Mar 09 14:10:30 crc kubenswrapper[4764]: I0309 14:10:30.058793 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d2c7cfde-f7f3-47d1-ac62-f35f98d5b62d-config-data-custom\") pod \"manila-api-0\" (UID: \"d2c7cfde-f7f3-47d1-ac62-f35f98d5b62d\") " pod="openstack/manila-api-0" Mar 09 14:10:30 crc kubenswrapper[4764]: I0309 14:10:30.058864 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6tlj9\" (UniqueName: \"kubernetes.io/projected/d2c7cfde-f7f3-47d1-ac62-f35f98d5b62d-kube-api-access-6tlj9\") pod \"manila-api-0\" (UID: \"d2c7cfde-f7f3-47d1-ac62-f35f98d5b62d\") " pod="openstack/manila-api-0" Mar 09 14:10:30 crc kubenswrapper[4764]: I0309 14:10:30.059804 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d2c7cfde-f7f3-47d1-ac62-f35f98d5b62d-combined-ca-bundle\") pod \"manila-api-0\" (UID: \"d2c7cfde-f7f3-47d1-ac62-f35f98d5b62d\") " pod="openstack/manila-api-0" Mar 09 14:10:30 crc kubenswrapper[4764]: I0309 14:10:30.059840 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d2c7cfde-f7f3-47d1-ac62-f35f98d5b62d-internal-tls-certs\") pod \"manila-api-0\" (UID: \"d2c7cfde-f7f3-47d1-ac62-f35f98d5b62d\") " pod="openstack/manila-api-0" Mar 09 14:10:30 crc kubenswrapper[4764]: I0309 14:10:30.059871 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d2c7cfde-f7f3-47d1-ac62-f35f98d5b62d-config-data\") pod \"manila-api-0\" (UID: \"d2c7cfde-f7f3-47d1-ac62-f35f98d5b62d\") " pod="openstack/manila-api-0" Mar 09 14:10:30 crc kubenswrapper[4764]: I0309 14:10:30.060818 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d2c7cfde-f7f3-47d1-ac62-f35f98d5b62d-scripts\") pod \"manila-api-0\" (UID: \"d2c7cfde-f7f3-47d1-ac62-f35f98d5b62d\") " pod="openstack/manila-api-0" Mar 09 14:10:30 crc kubenswrapper[4764]: I0309 14:10:30.060854 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d2c7cfde-f7f3-47d1-ac62-f35f98d5b62d-logs\") pod \"manila-api-0\" (UID: \"d2c7cfde-f7f3-47d1-ac62-f35f98d5b62d\") " pod="openstack/manila-api-0" Mar 09 14:10:30 crc kubenswrapper[4764]: I0309 14:10:30.060979 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d2c7cfde-f7f3-47d1-ac62-f35f98d5b62d-public-tls-certs\") pod \"manila-api-0\" (UID: \"d2c7cfde-f7f3-47d1-ac62-f35f98d5b62d\") " pod="openstack/manila-api-0" Mar 09 14:10:30 crc kubenswrapper[4764]: I0309 14:10:30.061025 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d2c7cfde-f7f3-47d1-ac62-f35f98d5b62d-etc-machine-id\") pod \"manila-api-0\" (UID: \"d2c7cfde-f7f3-47d1-ac62-f35f98d5b62d\") " pod="openstack/manila-api-0" Mar 09 14:10:30 crc kubenswrapper[4764]: I0309 14:10:30.061263 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d2c7cfde-f7f3-47d1-ac62-f35f98d5b62d-etc-machine-id\") pod \"manila-api-0\" (UID: \"d2c7cfde-f7f3-47d1-ac62-f35f98d5b62d\") " pod="openstack/manila-api-0" Mar 09 14:10:30 crc kubenswrapper[4764]: I0309 
14:10:30.062213 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d2c7cfde-f7f3-47d1-ac62-f35f98d5b62d-logs\") pod \"manila-api-0\" (UID: \"d2c7cfde-f7f3-47d1-ac62-f35f98d5b62d\") " pod="openstack/manila-api-0" Mar 09 14:10:30 crc kubenswrapper[4764]: I0309 14:10:30.066128 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d2c7cfde-f7f3-47d1-ac62-f35f98d5b62d-config-data-custom\") pod \"manila-api-0\" (UID: \"d2c7cfde-f7f3-47d1-ac62-f35f98d5b62d\") " pod="openstack/manila-api-0" Mar 09 14:10:30 crc kubenswrapper[4764]: I0309 14:10:30.066925 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d2c7cfde-f7f3-47d1-ac62-f35f98d5b62d-config-data\") pod \"manila-api-0\" (UID: \"d2c7cfde-f7f3-47d1-ac62-f35f98d5b62d\") " pod="openstack/manila-api-0" Mar 09 14:10:30 crc kubenswrapper[4764]: I0309 14:10:30.067481 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d2c7cfde-f7f3-47d1-ac62-f35f98d5b62d-combined-ca-bundle\") pod \"manila-api-0\" (UID: \"d2c7cfde-f7f3-47d1-ac62-f35f98d5b62d\") " pod="openstack/manila-api-0" Mar 09 14:10:30 crc kubenswrapper[4764]: I0309 14:10:30.081355 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d2c7cfde-f7f3-47d1-ac62-f35f98d5b62d-scripts\") pod \"manila-api-0\" (UID: \"d2c7cfde-f7f3-47d1-ac62-f35f98d5b62d\") " pod="openstack/manila-api-0" Mar 09 14:10:30 crc kubenswrapper[4764]: I0309 14:10:30.081476 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d2c7cfde-f7f3-47d1-ac62-f35f98d5b62d-internal-tls-certs\") pod \"manila-api-0\" (UID: \"d2c7cfde-f7f3-47d1-ac62-f35f98d5b62d\") " 
pod="openstack/manila-api-0" Mar 09 14:10:30 crc kubenswrapper[4764]: I0309 14:10:30.081747 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d2c7cfde-f7f3-47d1-ac62-f35f98d5b62d-public-tls-certs\") pod \"manila-api-0\" (UID: \"d2c7cfde-f7f3-47d1-ac62-f35f98d5b62d\") " pod="openstack/manila-api-0" Mar 09 14:10:30 crc kubenswrapper[4764]: I0309 14:10:30.085676 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6tlj9\" (UniqueName: \"kubernetes.io/projected/d2c7cfde-f7f3-47d1-ac62-f35f98d5b62d-kube-api-access-6tlj9\") pod \"manila-api-0\" (UID: \"d2c7cfde-f7f3-47d1-ac62-f35f98d5b62d\") " pod="openstack/manila-api-0" Mar 09 14:10:30 crc kubenswrapper[4764]: I0309 14:10:30.218980 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-api-0" Mar 09 14:10:30 crc kubenswrapper[4764]: I0309 14:10:30.803736 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 09 14:10:30 crc kubenswrapper[4764]: I0309 14:10:30.804344 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="7220ea42-daf2-4c41-85c9-0d2bda6d24eb" containerName="ceilometer-central-agent" containerID="cri-o://ff9bf0b1ebb34e175d58582b99ddb3f1097e979301d7cd95e80e6e75808835d6" gracePeriod=30 Mar 09 14:10:30 crc kubenswrapper[4764]: I0309 14:10:30.804479 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="7220ea42-daf2-4c41-85c9-0d2bda6d24eb" containerName="proxy-httpd" containerID="cri-o://5ee7ad3de1f9928c6b59a6676d88b6e84ba3c1fa18e74865387fac5e3b0fa60c" gracePeriod=30 Mar 09 14:10:30 crc kubenswrapper[4764]: I0309 14:10:30.804514 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="7220ea42-daf2-4c41-85c9-0d2bda6d24eb" containerName="sg-core" 
containerID="cri-o://c44f22885081c03e82b6817f6378545ad23fd40c3fd48004324856781c5d89d8" gracePeriod=30 Mar 09 14:10:30 crc kubenswrapper[4764]: I0309 14:10:30.804550 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="7220ea42-daf2-4c41-85c9-0d2bda6d24eb" containerName="ceilometer-notification-agent" containerID="cri-o://cb0403e3d98f6b11f541bb3ed05951b3aa36eed0ed52f4863be171ce5eaf63e2" gracePeriod=30 Mar 09 14:10:30 crc kubenswrapper[4764]: I0309 14:10:30.884169 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-797d44c9b-wrlx7" Mar 09 14:10:30 crc kubenswrapper[4764]: I0309 14:10:30.884831 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-56bb55c768-vchmw" Mar 09 14:10:30 crc kubenswrapper[4764]: I0309 14:10:30.917003 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-api-0"] Mar 09 14:10:31 crc kubenswrapper[4764]: I0309 14:10:31.606090 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6b1ec63a-2933-4ca0-b695-d491eff9b77a" path="/var/lib/kubelet/pods/6b1ec63a-2933-4ca0-b695-d491eff9b77a/volumes" Mar 09 14:10:31 crc kubenswrapper[4764]: I0309 14:10:31.930305 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-55f8b7fc4c-6rdd7" Mar 09 14:10:31 crc kubenswrapper[4764]: I0309 14:10:31.965147 4764 generic.go:334] "Generic (PLEG): container finished" podID="7220ea42-daf2-4c41-85c9-0d2bda6d24eb" containerID="5ee7ad3de1f9928c6b59a6676d88b6e84ba3c1fa18e74865387fac5e3b0fa60c" exitCode=0 Mar 09 14:10:31 crc kubenswrapper[4764]: I0309 14:10:31.965551 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7220ea42-daf2-4c41-85c9-0d2bda6d24eb","Type":"ContainerDied","Data":"5ee7ad3de1f9928c6b59a6676d88b6e84ba3c1fa18e74865387fac5e3b0fa60c"} Mar 09 14:10:31 crc kubenswrapper[4764]: I0309 14:10:31.965708 4764 generic.go:334] "Generic (PLEG): container finished" podID="7220ea42-daf2-4c41-85c9-0d2bda6d24eb" containerID="c44f22885081c03e82b6817f6378545ad23fd40c3fd48004324856781c5d89d8" exitCode=2 Mar 09 14:10:31 crc kubenswrapper[4764]: I0309 14:10:31.965724 4764 generic.go:334] "Generic (PLEG): container finished" podID="7220ea42-daf2-4c41-85c9-0d2bda6d24eb" containerID="cb0403e3d98f6b11f541bb3ed05951b3aa36eed0ed52f4863be171ce5eaf63e2" exitCode=0 Mar 09 14:10:31 crc kubenswrapper[4764]: I0309 14:10:31.965735 4764 generic.go:334] "Generic (PLEG): container finished" podID="7220ea42-daf2-4c41-85c9-0d2bda6d24eb" containerID="ff9bf0b1ebb34e175d58582b99ddb3f1097e979301d7cd95e80e6e75808835d6" exitCode=0 Mar 09 14:10:31 crc kubenswrapper[4764]: I0309 14:10:31.965904 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7220ea42-daf2-4c41-85c9-0d2bda6d24eb","Type":"ContainerDied","Data":"c44f22885081c03e82b6817f6378545ad23fd40c3fd48004324856781c5d89d8"} Mar 09 14:10:31 crc kubenswrapper[4764]: I0309 14:10:31.965928 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7220ea42-daf2-4c41-85c9-0d2bda6d24eb","Type":"ContainerDied","Data":"cb0403e3d98f6b11f541bb3ed05951b3aa36eed0ed52f4863be171ce5eaf63e2"} Mar 09 14:10:31 crc 
kubenswrapper[4764]: I0309 14:10:31.965947 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7220ea42-daf2-4c41-85c9-0d2bda6d24eb","Type":"ContainerDied","Data":"ff9bf0b1ebb34e175d58582b99ddb3f1097e979301d7cd95e80e6e75808835d6"} Mar 09 14:10:31 crc kubenswrapper[4764]: I0309 14:10:31.982945 4764 generic.go:334] "Generic (PLEG): container finished" podID="849dc0e6-d7a3-4745-8cc0-7b7af3e4d243" containerID="c5d8efe5920ca730c02900ccbfc9cdc3905256778569090d306fce8d9fecf051" exitCode=137 Mar 09 14:10:31 crc kubenswrapper[4764]: I0309 14:10:31.982984 4764 generic.go:334] "Generic (PLEG): container finished" podID="849dc0e6-d7a3-4745-8cc0-7b7af3e4d243" containerID="fabc9d27abd14d14db86a85f4413de77c2fd101b8722a207f502ad9ee60b4405" exitCode=137 Mar 09 14:10:31 crc kubenswrapper[4764]: I0309 14:10:31.983035 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-9d94c4c7-px8jh" event={"ID":"849dc0e6-d7a3-4745-8cc0-7b7af3e4d243","Type":"ContainerDied","Data":"c5d8efe5920ca730c02900ccbfc9cdc3905256778569090d306fce8d9fecf051"} Mar 09 14:10:31 crc kubenswrapper[4764]: I0309 14:10:31.983067 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-9d94c4c7-px8jh" event={"ID":"849dc0e6-d7a3-4745-8cc0-7b7af3e4d243","Type":"ContainerDied","Data":"fabc9d27abd14d14db86a85f4413de77c2fd101b8722a207f502ad9ee60b4405"} Mar 09 14:10:31 crc kubenswrapper[4764]: I0309 14:10:31.994491 4764 generic.go:334] "Generic (PLEG): container finished" podID="6b944cf9-8278-4b16-b09c-0da6a2519b2a" containerID="b27e84a67d9f2a8b29e994ee9d77770f5c7c484de1e6b73aa1c344c5608643bd" exitCode=137 Mar 09 14:10:31 crc kubenswrapper[4764]: I0309 14:10:31.994530 4764 generic.go:334] "Generic (PLEG): container finished" podID="6b944cf9-8278-4b16-b09c-0da6a2519b2a" containerID="1f4d4a3494bb93e1fe6318e4f872bd8ce3d51c074d2638a3dabf2ca00148a1d1" exitCode=137 Mar 09 14:10:31 crc kubenswrapper[4764]: I0309 14:10:31.994597 4764 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-55f8b7fc4c-6rdd7" event={"ID":"6b944cf9-8278-4b16-b09c-0da6a2519b2a","Type":"ContainerDied","Data":"b27e84a67d9f2a8b29e994ee9d77770f5c7c484de1e6b73aa1c344c5608643bd"} Mar 09 14:10:31 crc kubenswrapper[4764]: I0309 14:10:31.994627 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-55f8b7fc4c-6rdd7" event={"ID":"6b944cf9-8278-4b16-b09c-0da6a2519b2a","Type":"ContainerDied","Data":"1f4d4a3494bb93e1fe6318e4f872bd8ce3d51c074d2638a3dabf2ca00148a1d1"} Mar 09 14:10:31 crc kubenswrapper[4764]: I0309 14:10:31.994636 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-55f8b7fc4c-6rdd7" event={"ID":"6b944cf9-8278-4b16-b09c-0da6a2519b2a","Type":"ContainerDied","Data":"03eb026db669a1bbc8569cf6227719fa0346433ca9e38091ef450a1f5669648c"} Mar 09 14:10:31 crc kubenswrapper[4764]: I0309 14:10:31.994672 4764 scope.go:117] "RemoveContainer" containerID="b27e84a67d9f2a8b29e994ee9d77770f5c7c484de1e6b73aa1c344c5608643bd" Mar 09 14:10:31 crc kubenswrapper[4764]: I0309 14:10:31.994761 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-55f8b7fc4c-6rdd7" Mar 09 14:10:32 crc kubenswrapper[4764]: I0309 14:10:32.005559 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"d2c7cfde-f7f3-47d1-ac62-f35f98d5b62d","Type":"ContainerStarted","Data":"970d9fd4daa3fbe40f5c9a6a8b41d61d6de1c7fac4538a6256c7207ce935e71b"} Mar 09 14:10:32 crc kubenswrapper[4764]: I0309 14:10:32.005613 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"d2c7cfde-f7f3-47d1-ac62-f35f98d5b62d","Type":"ContainerStarted","Data":"fcf3d8e0199dcae7fca5124073c5e5842dc32d90dcd90cf221f911ec93541560"} Mar 09 14:10:32 crc kubenswrapper[4764]: I0309 14:10:32.021630 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6b944cf9-8278-4b16-b09c-0da6a2519b2a-scripts\") pod \"6b944cf9-8278-4b16-b09c-0da6a2519b2a\" (UID: \"6b944cf9-8278-4b16-b09c-0da6a2519b2a\") " Mar 09 14:10:32 crc kubenswrapper[4764]: I0309 14:10:32.021744 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nlm9t\" (UniqueName: \"kubernetes.io/projected/6b944cf9-8278-4b16-b09c-0da6a2519b2a-kube-api-access-nlm9t\") pod \"6b944cf9-8278-4b16-b09c-0da6a2519b2a\" (UID: \"6b944cf9-8278-4b16-b09c-0da6a2519b2a\") " Mar 09 14:10:32 crc kubenswrapper[4764]: I0309 14:10:32.021906 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6b944cf9-8278-4b16-b09c-0da6a2519b2a-config-data\") pod \"6b944cf9-8278-4b16-b09c-0da6a2519b2a\" (UID: \"6b944cf9-8278-4b16-b09c-0da6a2519b2a\") " Mar 09 14:10:32 crc kubenswrapper[4764]: I0309 14:10:32.022008 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6b944cf9-8278-4b16-b09c-0da6a2519b2a-logs\") pod 
\"6b944cf9-8278-4b16-b09c-0da6a2519b2a\" (UID: \"6b944cf9-8278-4b16-b09c-0da6a2519b2a\") " Mar 09 14:10:32 crc kubenswrapper[4764]: I0309 14:10:32.022055 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/6b944cf9-8278-4b16-b09c-0da6a2519b2a-horizon-secret-key\") pod \"6b944cf9-8278-4b16-b09c-0da6a2519b2a\" (UID: \"6b944cf9-8278-4b16-b09c-0da6a2519b2a\") " Mar 09 14:10:32 crc kubenswrapper[4764]: I0309 14:10:32.024175 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6b944cf9-8278-4b16-b09c-0da6a2519b2a-logs" (OuterVolumeSpecName: "logs") pod "6b944cf9-8278-4b16-b09c-0da6a2519b2a" (UID: "6b944cf9-8278-4b16-b09c-0da6a2519b2a"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 14:10:32 crc kubenswrapper[4764]: I0309 14:10:32.025106 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-9d94c4c7-px8jh" Mar 09 14:10:32 crc kubenswrapper[4764]: I0309 14:10:32.029288 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6b944cf9-8278-4b16-b09c-0da6a2519b2a-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "6b944cf9-8278-4b16-b09c-0da6a2519b2a" (UID: "6b944cf9-8278-4b16-b09c-0da6a2519b2a"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:10:32 crc kubenswrapper[4764]: I0309 14:10:32.035624 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6b944cf9-8278-4b16-b09c-0da6a2519b2a-kube-api-access-nlm9t" (OuterVolumeSpecName: "kube-api-access-nlm9t") pod "6b944cf9-8278-4b16-b09c-0da6a2519b2a" (UID: "6b944cf9-8278-4b16-b09c-0da6a2519b2a"). InnerVolumeSpecName "kube-api-access-nlm9t". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:10:32 crc kubenswrapper[4764]: I0309 14:10:32.117538 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6b944cf9-8278-4b16-b09c-0da6a2519b2a-config-data" (OuterVolumeSpecName: "config-data") pod "6b944cf9-8278-4b16-b09c-0da6a2519b2a" (UID: "6b944cf9-8278-4b16-b09c-0da6a2519b2a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 14:10:32 crc kubenswrapper[4764]: I0309 14:10:32.117474 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6b944cf9-8278-4b16-b09c-0da6a2519b2a-scripts" (OuterVolumeSpecName: "scripts") pod "6b944cf9-8278-4b16-b09c-0da6a2519b2a" (UID: "6b944cf9-8278-4b16-b09c-0da6a2519b2a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 14:10:32 crc kubenswrapper[4764]: I0309 14:10:32.124458 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/849dc0e6-d7a3-4745-8cc0-7b7af3e4d243-scripts\") pod \"849dc0e6-d7a3-4745-8cc0-7b7af3e4d243\" (UID: \"849dc0e6-d7a3-4745-8cc0-7b7af3e4d243\") " Mar 09 14:10:32 crc kubenswrapper[4764]: I0309 14:10:32.124609 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/849dc0e6-d7a3-4745-8cc0-7b7af3e4d243-horizon-secret-key\") pod \"849dc0e6-d7a3-4745-8cc0-7b7af3e4d243\" (UID: \"849dc0e6-d7a3-4745-8cc0-7b7af3e4d243\") " Mar 09 14:10:32 crc kubenswrapper[4764]: I0309 14:10:32.124726 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ktc9m\" (UniqueName: \"kubernetes.io/projected/849dc0e6-d7a3-4745-8cc0-7b7af3e4d243-kube-api-access-ktc9m\") pod \"849dc0e6-d7a3-4745-8cc0-7b7af3e4d243\" (UID: \"849dc0e6-d7a3-4745-8cc0-7b7af3e4d243\") " Mar 09 14:10:32 crc 
kubenswrapper[4764]: I0309 14:10:32.124799 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/849dc0e6-d7a3-4745-8cc0-7b7af3e4d243-logs\") pod \"849dc0e6-d7a3-4745-8cc0-7b7af3e4d243\" (UID: \"849dc0e6-d7a3-4745-8cc0-7b7af3e4d243\") " Mar 09 14:10:32 crc kubenswrapper[4764]: I0309 14:10:32.124855 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/849dc0e6-d7a3-4745-8cc0-7b7af3e4d243-config-data\") pod \"849dc0e6-d7a3-4745-8cc0-7b7af3e4d243\" (UID: \"849dc0e6-d7a3-4745-8cc0-7b7af3e4d243\") " Mar 09 14:10:32 crc kubenswrapper[4764]: I0309 14:10:32.125974 4764 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/6b944cf9-8278-4b16-b09c-0da6a2519b2a-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Mar 09 14:10:32 crc kubenswrapper[4764]: I0309 14:10:32.125995 4764 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6b944cf9-8278-4b16-b09c-0da6a2519b2a-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 14:10:32 crc kubenswrapper[4764]: I0309 14:10:32.126010 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nlm9t\" (UniqueName: \"kubernetes.io/projected/6b944cf9-8278-4b16-b09c-0da6a2519b2a-kube-api-access-nlm9t\") on node \"crc\" DevicePath \"\"" Mar 09 14:10:32 crc kubenswrapper[4764]: I0309 14:10:32.126026 4764 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6b944cf9-8278-4b16-b09c-0da6a2519b2a-config-data\") on node \"crc\" DevicePath \"\"" Mar 09 14:10:32 crc kubenswrapper[4764]: I0309 14:10:32.126038 4764 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6b944cf9-8278-4b16-b09c-0da6a2519b2a-logs\") on node \"crc\" DevicePath \"\"" Mar 09 14:10:32 
crc kubenswrapper[4764]: I0309 14:10:32.130425 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/849dc0e6-d7a3-4745-8cc0-7b7af3e4d243-logs" (OuterVolumeSpecName: "logs") pod "849dc0e6-d7a3-4745-8cc0-7b7af3e4d243" (UID: "849dc0e6-d7a3-4745-8cc0-7b7af3e4d243"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 14:10:32 crc kubenswrapper[4764]: I0309 14:10:32.134432 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/849dc0e6-d7a3-4745-8cc0-7b7af3e4d243-kube-api-access-ktc9m" (OuterVolumeSpecName: "kube-api-access-ktc9m") pod "849dc0e6-d7a3-4745-8cc0-7b7af3e4d243" (UID: "849dc0e6-d7a3-4745-8cc0-7b7af3e4d243"). InnerVolumeSpecName "kube-api-access-ktc9m". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:10:32 crc kubenswrapper[4764]: I0309 14:10:32.137526 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/849dc0e6-d7a3-4745-8cc0-7b7af3e4d243-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "849dc0e6-d7a3-4745-8cc0-7b7af3e4d243" (UID: "849dc0e6-d7a3-4745-8cc0-7b7af3e4d243"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:10:32 crc kubenswrapper[4764]: I0309 14:10:32.151978 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/849dc0e6-d7a3-4745-8cc0-7b7af3e4d243-scripts" (OuterVolumeSpecName: "scripts") pod "849dc0e6-d7a3-4745-8cc0-7b7af3e4d243" (UID: "849dc0e6-d7a3-4745-8cc0-7b7af3e4d243"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 14:10:32 crc kubenswrapper[4764]: I0309 14:10:32.168885 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/849dc0e6-d7a3-4745-8cc0-7b7af3e4d243-config-data" (OuterVolumeSpecName: "config-data") pod "849dc0e6-d7a3-4745-8cc0-7b7af3e4d243" (UID: "849dc0e6-d7a3-4745-8cc0-7b7af3e4d243"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 14:10:32 crc kubenswrapper[4764]: I0309 14:10:32.228494 4764 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/849dc0e6-d7a3-4745-8cc0-7b7af3e4d243-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Mar 09 14:10:32 crc kubenswrapper[4764]: I0309 14:10:32.228541 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ktc9m\" (UniqueName: \"kubernetes.io/projected/849dc0e6-d7a3-4745-8cc0-7b7af3e4d243-kube-api-access-ktc9m\") on node \"crc\" DevicePath \"\"" Mar 09 14:10:32 crc kubenswrapper[4764]: I0309 14:10:32.228556 4764 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/849dc0e6-d7a3-4745-8cc0-7b7af3e4d243-logs\") on node \"crc\" DevicePath \"\"" Mar 09 14:10:32 crc kubenswrapper[4764]: I0309 14:10:32.228569 4764 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/849dc0e6-d7a3-4745-8cc0-7b7af3e4d243-config-data\") on node \"crc\" DevicePath \"\"" Mar 09 14:10:32 crc kubenswrapper[4764]: I0309 14:10:32.228581 4764 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/849dc0e6-d7a3-4745-8cc0-7b7af3e4d243-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 14:10:32 crc kubenswrapper[4764]: I0309 14:10:32.306512 4764 scope.go:117] "RemoveContainer" containerID="1f4d4a3494bb93e1fe6318e4f872bd8ce3d51c074d2638a3dabf2ca00148a1d1" Mar 09 
14:10:32 crc kubenswrapper[4764]: I0309 14:10:32.488300 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 09 14:10:32 crc kubenswrapper[4764]: I0309 14:10:32.498582 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-55f8b7fc4c-6rdd7"] Mar 09 14:10:32 crc kubenswrapper[4764]: I0309 14:10:32.527335 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-55f8b7fc4c-6rdd7"] Mar 09 14:10:32 crc kubenswrapper[4764]: I0309 14:10:32.537533 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7220ea42-daf2-4c41-85c9-0d2bda6d24eb-sg-core-conf-yaml\") pod \"7220ea42-daf2-4c41-85c9-0d2bda6d24eb\" (UID: \"7220ea42-daf2-4c41-85c9-0d2bda6d24eb\") " Mar 09 14:10:32 crc kubenswrapper[4764]: I0309 14:10:32.537599 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7220ea42-daf2-4c41-85c9-0d2bda6d24eb-combined-ca-bundle\") pod \"7220ea42-daf2-4c41-85c9-0d2bda6d24eb\" (UID: \"7220ea42-daf2-4c41-85c9-0d2bda6d24eb\") " Mar 09 14:10:32 crc kubenswrapper[4764]: I0309 14:10:32.537689 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7220ea42-daf2-4c41-85c9-0d2bda6d24eb-run-httpd\") pod \"7220ea42-daf2-4c41-85c9-0d2bda6d24eb\" (UID: \"7220ea42-daf2-4c41-85c9-0d2bda6d24eb\") " Mar 09 14:10:32 crc kubenswrapper[4764]: I0309 14:10:32.537723 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7220ea42-daf2-4c41-85c9-0d2bda6d24eb-scripts\") pod \"7220ea42-daf2-4c41-85c9-0d2bda6d24eb\" (UID: \"7220ea42-daf2-4c41-85c9-0d2bda6d24eb\") " Mar 09 14:10:32 crc kubenswrapper[4764]: I0309 14:10:32.537812 4764 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7220ea42-daf2-4c41-85c9-0d2bda6d24eb-log-httpd\") pod \"7220ea42-daf2-4c41-85c9-0d2bda6d24eb\" (UID: \"7220ea42-daf2-4c41-85c9-0d2bda6d24eb\") " Mar 09 14:10:32 crc kubenswrapper[4764]: I0309 14:10:32.537853 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7220ea42-daf2-4c41-85c9-0d2bda6d24eb-config-data\") pod \"7220ea42-daf2-4c41-85c9-0d2bda6d24eb\" (UID: \"7220ea42-daf2-4c41-85c9-0d2bda6d24eb\") " Mar 09 14:10:32 crc kubenswrapper[4764]: I0309 14:10:32.537888 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/7220ea42-daf2-4c41-85c9-0d2bda6d24eb-ceilometer-tls-certs\") pod \"7220ea42-daf2-4c41-85c9-0d2bda6d24eb\" (UID: \"7220ea42-daf2-4c41-85c9-0d2bda6d24eb\") " Mar 09 14:10:32 crc kubenswrapper[4764]: I0309 14:10:32.538112 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kxjf8\" (UniqueName: \"kubernetes.io/projected/7220ea42-daf2-4c41-85c9-0d2bda6d24eb-kube-api-access-kxjf8\") pod \"7220ea42-daf2-4c41-85c9-0d2bda6d24eb\" (UID: \"7220ea42-daf2-4c41-85c9-0d2bda6d24eb\") " Mar 09 14:10:32 crc kubenswrapper[4764]: I0309 14:10:32.540235 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7220ea42-daf2-4c41-85c9-0d2bda6d24eb-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "7220ea42-daf2-4c41-85c9-0d2bda6d24eb" (UID: "7220ea42-daf2-4c41-85c9-0d2bda6d24eb"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 14:10:32 crc kubenswrapper[4764]: I0309 14:10:32.540336 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7220ea42-daf2-4c41-85c9-0d2bda6d24eb-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "7220ea42-daf2-4c41-85c9-0d2bda6d24eb" (UID: "7220ea42-daf2-4c41-85c9-0d2bda6d24eb"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 14:10:32 crc kubenswrapper[4764]: I0309 14:10:32.547518 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7220ea42-daf2-4c41-85c9-0d2bda6d24eb-scripts" (OuterVolumeSpecName: "scripts") pod "7220ea42-daf2-4c41-85c9-0d2bda6d24eb" (UID: "7220ea42-daf2-4c41-85c9-0d2bda6d24eb"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:10:32 crc kubenswrapper[4764]: I0309 14:10:32.575715 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7220ea42-daf2-4c41-85c9-0d2bda6d24eb-kube-api-access-kxjf8" (OuterVolumeSpecName: "kube-api-access-kxjf8") pod "7220ea42-daf2-4c41-85c9-0d2bda6d24eb" (UID: "7220ea42-daf2-4c41-85c9-0d2bda6d24eb"). InnerVolumeSpecName "kube-api-access-kxjf8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:10:32 crc kubenswrapper[4764]: I0309 14:10:32.611946 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7220ea42-daf2-4c41-85c9-0d2bda6d24eb-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "7220ea42-daf2-4c41-85c9-0d2bda6d24eb" (UID: "7220ea42-daf2-4c41-85c9-0d2bda6d24eb"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:10:32 crc kubenswrapper[4764]: I0309 14:10:32.647450 4764 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7220ea42-daf2-4c41-85c9-0d2bda6d24eb-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 09 14:10:32 crc kubenswrapper[4764]: I0309 14:10:32.647503 4764 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7220ea42-daf2-4c41-85c9-0d2bda6d24eb-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 14:10:32 crc kubenswrapper[4764]: I0309 14:10:32.647518 4764 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7220ea42-daf2-4c41-85c9-0d2bda6d24eb-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 09 14:10:32 crc kubenswrapper[4764]: I0309 14:10:32.647531 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kxjf8\" (UniqueName: \"kubernetes.io/projected/7220ea42-daf2-4c41-85c9-0d2bda6d24eb-kube-api-access-kxjf8\") on node \"crc\" DevicePath \"\"" Mar 09 14:10:32 crc kubenswrapper[4764]: I0309 14:10:32.647543 4764 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7220ea42-daf2-4c41-85c9-0d2bda6d24eb-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 09 14:10:32 crc kubenswrapper[4764]: I0309 14:10:32.703163 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7220ea42-daf2-4c41-85c9-0d2bda6d24eb-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "7220ea42-daf2-4c41-85c9-0d2bda6d24eb" (UID: "7220ea42-daf2-4c41-85c9-0d2bda6d24eb"). InnerVolumeSpecName "ceilometer-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:10:32 crc kubenswrapper[4764]: I0309 14:10:32.750495 4764 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/7220ea42-daf2-4c41-85c9-0d2bda6d24eb-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 09 14:10:32 crc kubenswrapper[4764]: I0309 14:10:32.787537 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7220ea42-daf2-4c41-85c9-0d2bda6d24eb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7220ea42-daf2-4c41-85c9-0d2bda6d24eb" (UID: "7220ea42-daf2-4c41-85c9-0d2bda6d24eb"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:10:32 crc kubenswrapper[4764]: I0309 14:10:32.799446 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7220ea42-daf2-4c41-85c9-0d2bda6d24eb-config-data" (OuterVolumeSpecName: "config-data") pod "7220ea42-daf2-4c41-85c9-0d2bda6d24eb" (UID: "7220ea42-daf2-4c41-85c9-0d2bda6d24eb"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:10:32 crc kubenswrapper[4764]: I0309 14:10:32.852736 4764 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7220ea42-daf2-4c41-85c9-0d2bda6d24eb-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 14:10:32 crc kubenswrapper[4764]: I0309 14:10:32.852798 4764 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7220ea42-daf2-4c41-85c9-0d2bda6d24eb-config-data\") on node \"crc\" DevicePath \"\"" Mar 09 14:10:33 crc kubenswrapper[4764]: I0309 14:10:33.039164 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-9d94c4c7-px8jh" Mar 09 14:10:33 crc kubenswrapper[4764]: I0309 14:10:33.039195 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-9d94c4c7-px8jh" event={"ID":"849dc0e6-d7a3-4745-8cc0-7b7af3e4d243","Type":"ContainerDied","Data":"dc93d1c29f222a1602e24053b4830b3eddb16bbf1d9374aeb6068fdf3ba6030f"} Mar 09 14:10:33 crc kubenswrapper[4764]: I0309 14:10:33.078774 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-9d94c4c7-px8jh"] Mar 09 14:10:33 crc kubenswrapper[4764]: I0309 14:10:33.081530 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"d2c7cfde-f7f3-47d1-ac62-f35f98d5b62d","Type":"ContainerStarted","Data":"8f2ad30ba8f80d63139adfaea98a7ecd975946f15c7c05e8f173a403c0e77b0a"} Mar 09 14:10:33 crc kubenswrapper[4764]: I0309 14:10:33.081823 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/manila-api-0" Mar 09 14:10:33 crc kubenswrapper[4764]: I0309 14:10:33.095674 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-9d94c4c7-px8jh"] Mar 09 14:10:33 crc kubenswrapper[4764]: I0309 14:10:33.100845 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7220ea42-daf2-4c41-85c9-0d2bda6d24eb","Type":"ContainerDied","Data":"83415e6b4960e541c9fc0ec3cd4865cce73b704e20a342572bf182a8978c8bc9"} Mar 09 14:10:33 crc kubenswrapper[4764]: I0309 14:10:33.101059 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 09 14:10:33 crc kubenswrapper[4764]: I0309 14:10:33.115027 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-api-0" podStartSLOduration=4.115007431 podStartE2EDuration="4.115007431s" podCreationTimestamp="2026-03-09 14:10:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 14:10:33.109446332 +0000 UTC m=+2988.359618270" watchObservedRunningTime="2026-03-09 14:10:33.115007431 +0000 UTC m=+2988.365179339" Mar 09 14:10:33 crc kubenswrapper[4764]: I0309 14:10:33.181815 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 09 14:10:33 crc kubenswrapper[4764]: I0309 14:10:33.252302 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 09 14:10:33 crc kubenswrapper[4764]: I0309 14:10:33.267980 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 09 14:10:33 crc kubenswrapper[4764]: E0309 14:10:33.268724 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6b944cf9-8278-4b16-b09c-0da6a2519b2a" containerName="horizon" Mar 09 14:10:33 crc kubenswrapper[4764]: I0309 14:10:33.268744 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b944cf9-8278-4b16-b09c-0da6a2519b2a" containerName="horizon" Mar 09 14:10:33 crc kubenswrapper[4764]: E0309 14:10:33.268765 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7220ea42-daf2-4c41-85c9-0d2bda6d24eb" containerName="ceilometer-central-agent" Mar 09 14:10:33 crc kubenswrapper[4764]: I0309 14:10:33.268773 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="7220ea42-daf2-4c41-85c9-0d2bda6d24eb" containerName="ceilometer-central-agent" Mar 09 14:10:33 crc kubenswrapper[4764]: E0309 14:10:33.268793 4764 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="849dc0e6-d7a3-4745-8cc0-7b7af3e4d243" containerName="horizon-log" Mar 09 14:10:33 crc kubenswrapper[4764]: I0309 14:10:33.268802 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="849dc0e6-d7a3-4745-8cc0-7b7af3e4d243" containerName="horizon-log" Mar 09 14:10:33 crc kubenswrapper[4764]: E0309 14:10:33.268817 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7220ea42-daf2-4c41-85c9-0d2bda6d24eb" containerName="ceilometer-notification-agent" Mar 09 14:10:33 crc kubenswrapper[4764]: I0309 14:10:33.268824 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="7220ea42-daf2-4c41-85c9-0d2bda6d24eb" containerName="ceilometer-notification-agent" Mar 09 14:10:33 crc kubenswrapper[4764]: E0309 14:10:33.268842 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6b944cf9-8278-4b16-b09c-0da6a2519b2a" containerName="horizon-log" Mar 09 14:10:33 crc kubenswrapper[4764]: I0309 14:10:33.268852 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b944cf9-8278-4b16-b09c-0da6a2519b2a" containerName="horizon-log" Mar 09 14:10:33 crc kubenswrapper[4764]: E0309 14:10:33.268870 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7220ea42-daf2-4c41-85c9-0d2bda6d24eb" containerName="proxy-httpd" Mar 09 14:10:33 crc kubenswrapper[4764]: I0309 14:10:33.268877 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="7220ea42-daf2-4c41-85c9-0d2bda6d24eb" containerName="proxy-httpd" Mar 09 14:10:33 crc kubenswrapper[4764]: E0309 14:10:33.268904 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="849dc0e6-d7a3-4745-8cc0-7b7af3e4d243" containerName="horizon" Mar 09 14:10:33 crc kubenswrapper[4764]: I0309 14:10:33.268912 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="849dc0e6-d7a3-4745-8cc0-7b7af3e4d243" containerName="horizon" Mar 09 14:10:33 crc kubenswrapper[4764]: E0309 14:10:33.268926 4764 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="7220ea42-daf2-4c41-85c9-0d2bda6d24eb" containerName="sg-core" Mar 09 14:10:33 crc kubenswrapper[4764]: I0309 14:10:33.268944 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="7220ea42-daf2-4c41-85c9-0d2bda6d24eb" containerName="sg-core" Mar 09 14:10:33 crc kubenswrapper[4764]: I0309 14:10:33.269162 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="7220ea42-daf2-4c41-85c9-0d2bda6d24eb" containerName="proxy-httpd" Mar 09 14:10:33 crc kubenswrapper[4764]: I0309 14:10:33.269176 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="7220ea42-daf2-4c41-85c9-0d2bda6d24eb" containerName="sg-core" Mar 09 14:10:33 crc kubenswrapper[4764]: I0309 14:10:33.269183 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="7220ea42-daf2-4c41-85c9-0d2bda6d24eb" containerName="ceilometer-notification-agent" Mar 09 14:10:33 crc kubenswrapper[4764]: I0309 14:10:33.269193 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="849dc0e6-d7a3-4745-8cc0-7b7af3e4d243" containerName="horizon-log" Mar 09 14:10:33 crc kubenswrapper[4764]: I0309 14:10:33.269204 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="6b944cf9-8278-4b16-b09c-0da6a2519b2a" containerName="horizon" Mar 09 14:10:33 crc kubenswrapper[4764]: I0309 14:10:33.269216 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="6b944cf9-8278-4b16-b09c-0da6a2519b2a" containerName="horizon-log" Mar 09 14:10:33 crc kubenswrapper[4764]: I0309 14:10:33.269225 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="7220ea42-daf2-4c41-85c9-0d2bda6d24eb" containerName="ceilometer-central-agent" Mar 09 14:10:33 crc kubenswrapper[4764]: I0309 14:10:33.269239 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="849dc0e6-d7a3-4745-8cc0-7b7af3e4d243" containerName="horizon" Mar 09 14:10:33 crc kubenswrapper[4764]: I0309 14:10:33.271545 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 09 14:10:33 crc kubenswrapper[4764]: I0309 14:10:33.279758 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 09 14:10:33 crc kubenswrapper[4764]: I0309 14:10:33.281289 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 09 14:10:33 crc kubenswrapper[4764]: I0309 14:10:33.281806 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 09 14:10:33 crc kubenswrapper[4764]: I0309 14:10:33.282163 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Mar 09 14:10:33 crc kubenswrapper[4764]: I0309 14:10:33.312395 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-797d44c9b-wrlx7" Mar 09 14:10:33 crc kubenswrapper[4764]: I0309 14:10:33.381564 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cba66048-6589-4bda-99bc-f2b62d5a16cd-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"cba66048-6589-4bda-99bc-f2b62d5a16cd\") " pod="openstack/ceilometer-0" Mar 09 14:10:33 crc kubenswrapper[4764]: I0309 14:10:33.381704 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cba66048-6589-4bda-99bc-f2b62d5a16cd-log-httpd\") pod \"ceilometer-0\" (UID: \"cba66048-6589-4bda-99bc-f2b62d5a16cd\") " pod="openstack/ceilometer-0" Mar 09 14:10:33 crc kubenswrapper[4764]: I0309 14:10:33.381772 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/cba66048-6589-4bda-99bc-f2b62d5a16cd-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"cba66048-6589-4bda-99bc-f2b62d5a16cd\") " 
pod="openstack/ceilometer-0" Mar 09 14:10:33 crc kubenswrapper[4764]: I0309 14:10:33.381841 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cba66048-6589-4bda-99bc-f2b62d5a16cd-scripts\") pod \"ceilometer-0\" (UID: \"cba66048-6589-4bda-99bc-f2b62d5a16cd\") " pod="openstack/ceilometer-0" Mar 09 14:10:33 crc kubenswrapper[4764]: I0309 14:10:33.381878 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cba66048-6589-4bda-99bc-f2b62d5a16cd-config-data\") pod \"ceilometer-0\" (UID: \"cba66048-6589-4bda-99bc-f2b62d5a16cd\") " pod="openstack/ceilometer-0" Mar 09 14:10:33 crc kubenswrapper[4764]: I0309 14:10:33.381944 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cba66048-6589-4bda-99bc-f2b62d5a16cd-run-httpd\") pod \"ceilometer-0\" (UID: \"cba66048-6589-4bda-99bc-f2b62d5a16cd\") " pod="openstack/ceilometer-0" Mar 09 14:10:33 crc kubenswrapper[4764]: I0309 14:10:33.382044 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ccg6p\" (UniqueName: \"kubernetes.io/projected/cba66048-6589-4bda-99bc-f2b62d5a16cd-kube-api-access-ccg6p\") pod \"ceilometer-0\" (UID: \"cba66048-6589-4bda-99bc-f2b62d5a16cd\") " pod="openstack/ceilometer-0" Mar 09 14:10:33 crc kubenswrapper[4764]: I0309 14:10:33.382103 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/cba66048-6589-4bda-99bc-f2b62d5a16cd-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"cba66048-6589-4bda-99bc-f2b62d5a16cd\") " pod="openstack/ceilometer-0" Mar 09 14:10:33 crc kubenswrapper[4764]: I0309 14:10:33.444282 4764 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openstack/horizon-56bb55c768-vchmw" Mar 09 14:10:33 crc kubenswrapper[4764]: I0309 14:10:33.485255 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cba66048-6589-4bda-99bc-f2b62d5a16cd-run-httpd\") pod \"ceilometer-0\" (UID: \"cba66048-6589-4bda-99bc-f2b62d5a16cd\") " pod="openstack/ceilometer-0" Mar 09 14:10:33 crc kubenswrapper[4764]: I0309 14:10:33.485343 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ccg6p\" (UniqueName: \"kubernetes.io/projected/cba66048-6589-4bda-99bc-f2b62d5a16cd-kube-api-access-ccg6p\") pod \"ceilometer-0\" (UID: \"cba66048-6589-4bda-99bc-f2b62d5a16cd\") " pod="openstack/ceilometer-0" Mar 09 14:10:33 crc kubenswrapper[4764]: I0309 14:10:33.485378 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/cba66048-6589-4bda-99bc-f2b62d5a16cd-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"cba66048-6589-4bda-99bc-f2b62d5a16cd\") " pod="openstack/ceilometer-0" Mar 09 14:10:33 crc kubenswrapper[4764]: I0309 14:10:33.485430 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cba66048-6589-4bda-99bc-f2b62d5a16cd-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"cba66048-6589-4bda-99bc-f2b62d5a16cd\") " pod="openstack/ceilometer-0" Mar 09 14:10:33 crc kubenswrapper[4764]: I0309 14:10:33.485457 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cba66048-6589-4bda-99bc-f2b62d5a16cd-log-httpd\") pod \"ceilometer-0\" (UID: \"cba66048-6589-4bda-99bc-f2b62d5a16cd\") " pod="openstack/ceilometer-0" Mar 09 14:10:33 crc kubenswrapper[4764]: I0309 14:10:33.485501 4764 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/cba66048-6589-4bda-99bc-f2b62d5a16cd-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"cba66048-6589-4bda-99bc-f2b62d5a16cd\") " pod="openstack/ceilometer-0" Mar 09 14:10:33 crc kubenswrapper[4764]: I0309 14:10:33.485546 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cba66048-6589-4bda-99bc-f2b62d5a16cd-scripts\") pod \"ceilometer-0\" (UID: \"cba66048-6589-4bda-99bc-f2b62d5a16cd\") " pod="openstack/ceilometer-0" Mar 09 14:10:33 crc kubenswrapper[4764]: I0309 14:10:33.485590 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cba66048-6589-4bda-99bc-f2b62d5a16cd-config-data\") pod \"ceilometer-0\" (UID: \"cba66048-6589-4bda-99bc-f2b62d5a16cd\") " pod="openstack/ceilometer-0" Mar 09 14:10:33 crc kubenswrapper[4764]: I0309 14:10:33.487610 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cba66048-6589-4bda-99bc-f2b62d5a16cd-run-httpd\") pod \"ceilometer-0\" (UID: \"cba66048-6589-4bda-99bc-f2b62d5a16cd\") " pod="openstack/ceilometer-0" Mar 09 14:10:33 crc kubenswrapper[4764]: I0309 14:10:33.488068 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cba66048-6589-4bda-99bc-f2b62d5a16cd-log-httpd\") pod \"ceilometer-0\" (UID: \"cba66048-6589-4bda-99bc-f2b62d5a16cd\") " pod="openstack/ceilometer-0" Mar 09 14:10:33 crc kubenswrapper[4764]: I0309 14:10:33.494496 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cba66048-6589-4bda-99bc-f2b62d5a16cd-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"cba66048-6589-4bda-99bc-f2b62d5a16cd\") " pod="openstack/ceilometer-0" Mar 09 14:10:33 crc kubenswrapper[4764]: I0309 
14:10:33.495165 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/cba66048-6589-4bda-99bc-f2b62d5a16cd-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"cba66048-6589-4bda-99bc-f2b62d5a16cd\") " pod="openstack/ceilometer-0"
Mar 09 14:10:33 crc kubenswrapper[4764]: I0309 14:10:33.510534 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/cba66048-6589-4bda-99bc-f2b62d5a16cd-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"cba66048-6589-4bda-99bc-f2b62d5a16cd\") " pod="openstack/ceilometer-0"
Mar 09 14:10:33 crc kubenswrapper[4764]: I0309 14:10:33.512599 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cba66048-6589-4bda-99bc-f2b62d5a16cd-config-data\") pod \"ceilometer-0\" (UID: \"cba66048-6589-4bda-99bc-f2b62d5a16cd\") " pod="openstack/ceilometer-0"
Mar 09 14:10:33 crc kubenswrapper[4764]: I0309 14:10:33.524944 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cba66048-6589-4bda-99bc-f2b62d5a16cd-scripts\") pod \"ceilometer-0\" (UID: \"cba66048-6589-4bda-99bc-f2b62d5a16cd\") " pod="openstack/ceilometer-0"
Mar 09 14:10:33 crc kubenswrapper[4764]: I0309 14:10:33.538405 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ccg6p\" (UniqueName: \"kubernetes.io/projected/cba66048-6589-4bda-99bc-f2b62d5a16cd-kube-api-access-ccg6p\") pod \"ceilometer-0\" (UID: \"cba66048-6589-4bda-99bc-f2b62d5a16cd\") " pod="openstack/ceilometer-0"
Mar 09 14:10:33 crc kubenswrapper[4764]: I0309 14:10:33.584272 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6b944cf9-8278-4b16-b09c-0da6a2519b2a" path="/var/lib/kubelet/pods/6b944cf9-8278-4b16-b09c-0da6a2519b2a/volumes"
Mar 09 14:10:33 crc kubenswrapper[4764]: I0309 14:10:33.585285 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7220ea42-daf2-4c41-85c9-0d2bda6d24eb" path="/var/lib/kubelet/pods/7220ea42-daf2-4c41-85c9-0d2bda6d24eb/volumes"
Mar 09 14:10:33 crc kubenswrapper[4764]: I0309 14:10:33.586798 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="849dc0e6-d7a3-4745-8cc0-7b7af3e4d243" path="/var/lib/kubelet/pods/849dc0e6-d7a3-4745-8cc0-7b7af3e4d243/volumes"
Mar 09 14:10:33 crc kubenswrapper[4764]: I0309 14:10:33.587526 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-797d44c9b-wrlx7"]
Mar 09 14:10:33 crc kubenswrapper[4764]: I0309 14:10:33.683403 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 09 14:10:34 crc kubenswrapper[4764]: I0309 14:10:34.113135 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-797d44c9b-wrlx7" podUID="e00fc104-ec73-4190-a598-86de7ca6cfa5" containerName="horizon-log" containerID="cri-o://ed7bd666dbaa56d7a77980118a8739a44cafc9a966ea8469e78e592a4a49ac96" gracePeriod=30
Mar 09 14:10:34 crc kubenswrapper[4764]: I0309 14:10:34.113291 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-797d44c9b-wrlx7" podUID="e00fc104-ec73-4190-a598-86de7ca6cfa5" containerName="horizon" containerID="cri-o://c791eb4afad3d3b7319a4ecbb9ad7314d04c8eb6a4954765b916e231a57d04c0" gracePeriod=30
Mar 09 14:10:34 crc kubenswrapper[4764]: I0309 14:10:34.120123 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/manila-scheduler-0"
Mar 09 14:10:34 crc kubenswrapper[4764]: I0309 14:10:34.339857 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-69655fd4bf-qlzqn"
Mar 09 14:10:34 crc kubenswrapper[4764]: I0309 14:10:34.477443 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-fbc59fbb7-688wh"]
Mar 09 14:10:34 crc kubenswrapper[4764]: I0309 14:10:34.478063 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-fbc59fbb7-688wh" podUID="6ed586ce-ac9a-4d0e-9686-6b526ab1c6ff" containerName="dnsmasq-dns" containerID="cri-o://0b92f70675fc36bf2d871b5244dabd98b74bc41c0324eee9ebebf35ece42f834" gracePeriod=10
Mar 09 14:10:35 crc kubenswrapper[4764]: I0309 14:10:35.134919 4764 generic.go:334] "Generic (PLEG): container finished" podID="6ed586ce-ac9a-4d0e-9686-6b526ab1c6ff" containerID="0b92f70675fc36bf2d871b5244dabd98b74bc41c0324eee9ebebf35ece42f834" exitCode=0
Mar 09 14:10:35 crc kubenswrapper[4764]: I0309 14:10:35.134982 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-fbc59fbb7-688wh" event={"ID":"6ed586ce-ac9a-4d0e-9686-6b526ab1c6ff","Type":"ContainerDied","Data":"0b92f70675fc36bf2d871b5244dabd98b74bc41c0324eee9ebebf35ece42f834"}
Mar 09 14:10:36 crc kubenswrapper[4764]: I0309 14:10:36.718632 4764 scope.go:117] "RemoveContainer" containerID="b27e84a67d9f2a8b29e994ee9d77770f5c7c484de1e6b73aa1c344c5608643bd"
Mar 09 14:10:36 crc kubenswrapper[4764]: E0309 14:10:36.719369 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b27e84a67d9f2a8b29e994ee9d77770f5c7c484de1e6b73aa1c344c5608643bd\": container with ID starting with b27e84a67d9f2a8b29e994ee9d77770f5c7c484de1e6b73aa1c344c5608643bd not found: ID does not exist" containerID="b27e84a67d9f2a8b29e994ee9d77770f5c7c484de1e6b73aa1c344c5608643bd"
Mar 09 14:10:36 crc kubenswrapper[4764]: I0309 14:10:36.719437 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b27e84a67d9f2a8b29e994ee9d77770f5c7c484de1e6b73aa1c344c5608643bd"} err="failed to get container status \"b27e84a67d9f2a8b29e994ee9d77770f5c7c484de1e6b73aa1c344c5608643bd\": rpc error: code = NotFound desc = could not find container \"b27e84a67d9f2a8b29e994ee9d77770f5c7c484de1e6b73aa1c344c5608643bd\": container with ID starting with b27e84a67d9f2a8b29e994ee9d77770f5c7c484de1e6b73aa1c344c5608643bd not found: ID does not exist"
Mar 09 14:10:36 crc kubenswrapper[4764]: I0309 14:10:36.719472 4764 scope.go:117] "RemoveContainer" containerID="1f4d4a3494bb93e1fe6318e4f872bd8ce3d51c074d2638a3dabf2ca00148a1d1"
Mar 09 14:10:36 crc kubenswrapper[4764]: E0309 14:10:36.719873 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1f4d4a3494bb93e1fe6318e4f872bd8ce3d51c074d2638a3dabf2ca00148a1d1\": container with ID starting with 1f4d4a3494bb93e1fe6318e4f872bd8ce3d51c074d2638a3dabf2ca00148a1d1 not found: ID does not exist" containerID="1f4d4a3494bb93e1fe6318e4f872bd8ce3d51c074d2638a3dabf2ca00148a1d1"
Mar 09 14:10:36 crc kubenswrapper[4764]: I0309 14:10:36.719942 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1f4d4a3494bb93e1fe6318e4f872bd8ce3d51c074d2638a3dabf2ca00148a1d1"} err="failed to get container status \"1f4d4a3494bb93e1fe6318e4f872bd8ce3d51c074d2638a3dabf2ca00148a1d1\": rpc error: code = NotFound desc = could not find container \"1f4d4a3494bb93e1fe6318e4f872bd8ce3d51c074d2638a3dabf2ca00148a1d1\": container with ID starting with 1f4d4a3494bb93e1fe6318e4f872bd8ce3d51c074d2638a3dabf2ca00148a1d1 not found: ID does not exist"
Mar 09 14:10:36 crc kubenswrapper[4764]: I0309 14:10:36.720004 4764 scope.go:117] "RemoveContainer" containerID="b27e84a67d9f2a8b29e994ee9d77770f5c7c484de1e6b73aa1c344c5608643bd"
Mar 09 14:10:36 crc kubenswrapper[4764]: I0309 14:10:36.720203 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b27e84a67d9f2a8b29e994ee9d77770f5c7c484de1e6b73aa1c344c5608643bd"} err="failed to get container status \"b27e84a67d9f2a8b29e994ee9d77770f5c7c484de1e6b73aa1c344c5608643bd\": rpc error: code = NotFound desc = could not find container \"b27e84a67d9f2a8b29e994ee9d77770f5c7c484de1e6b73aa1c344c5608643bd\": container with ID starting with b27e84a67d9f2a8b29e994ee9d77770f5c7c484de1e6b73aa1c344c5608643bd not found: ID does not exist"
Mar 09 14:10:36 crc kubenswrapper[4764]: I0309 14:10:36.720252 4764 scope.go:117] "RemoveContainer" containerID="1f4d4a3494bb93e1fe6318e4f872bd8ce3d51c074d2638a3dabf2ca00148a1d1"
Mar 09 14:10:36 crc kubenswrapper[4764]: I0309 14:10:36.720434 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1f4d4a3494bb93e1fe6318e4f872bd8ce3d51c074d2638a3dabf2ca00148a1d1"} err="failed to get container status \"1f4d4a3494bb93e1fe6318e4f872bd8ce3d51c074d2638a3dabf2ca00148a1d1\": rpc error: code = NotFound desc = could not find container \"1f4d4a3494bb93e1fe6318e4f872bd8ce3d51c074d2638a3dabf2ca00148a1d1\": container with ID starting with 1f4d4a3494bb93e1fe6318e4f872bd8ce3d51c074d2638a3dabf2ca00148a1d1 not found: ID does not exist"
Mar 09 14:10:36 crc kubenswrapper[4764]: I0309 14:10:36.720477 4764 scope.go:117] "RemoveContainer" containerID="c5d8efe5920ca730c02900ccbfc9cdc3905256778569090d306fce8d9fecf051"
Mar 09 14:10:37 crc kubenswrapper[4764]: I0309 14:10:37.018430 4764 scope.go:117] "RemoveContainer" containerID="fabc9d27abd14d14db86a85f4413de77c2fd101b8722a207f502ad9ee60b4405"
Mar 09 14:10:37 crc kubenswrapper[4764]: I0309 14:10:37.177339 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-fbc59fbb7-688wh" event={"ID":"6ed586ce-ac9a-4d0e-9686-6b526ab1c6ff","Type":"ContainerDied","Data":"7623e23ae56b8d47625b88ad80b059396febc09e926f11449c6996f010201d29"}
Mar 09 14:10:37 crc kubenswrapper[4764]: I0309 14:10:37.177761 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7623e23ae56b8d47625b88ad80b059396febc09e926f11449c6996f010201d29"
Mar 09 14:10:37 crc kubenswrapper[4764]: I0309 14:10:37.185354 4764 scope.go:117] "RemoveContainer" containerID="5ee7ad3de1f9928c6b59a6676d88b6e84ba3c1fa18e74865387fac5e3b0fa60c"
Mar 09 14:10:37 crc kubenswrapper[4764]: I0309 14:10:37.227929 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-fbc59fbb7-688wh"
Mar 09 14:10:37 crc kubenswrapper[4764]: I0309 14:10:37.256891 4764 scope.go:117] "RemoveContainer" containerID="c44f22885081c03e82b6817f6378545ad23fd40c3fd48004324856781c5d89d8"
Mar 09 14:10:37 crc kubenswrapper[4764]: I0309 14:10:37.292513 4764 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-797d44c9b-wrlx7" podUID="e00fc104-ec73-4190-a598-86de7ca6cfa5" containerName="horizon" probeResult="failure" output="Get \"https://10.217.1.5:8443/dashboard/auth/login/?next=/dashboard/\": read tcp 10.217.0.2:48566->10.217.1.5:8443: read: connection reset by peer"
Mar 09 14:10:37 crc kubenswrapper[4764]: I0309 14:10:37.293499 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j76rw\" (UniqueName: \"kubernetes.io/projected/6ed586ce-ac9a-4d0e-9686-6b526ab1c6ff-kube-api-access-j76rw\") pod \"6ed586ce-ac9a-4d0e-9686-6b526ab1c6ff\" (UID: \"6ed586ce-ac9a-4d0e-9686-6b526ab1c6ff\") "
Mar 09 14:10:37 crc kubenswrapper[4764]: I0309 14:10:37.293558 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6ed586ce-ac9a-4d0e-9686-6b526ab1c6ff-dns-svc\") pod \"6ed586ce-ac9a-4d0e-9686-6b526ab1c6ff\" (UID: \"6ed586ce-ac9a-4d0e-9686-6b526ab1c6ff\") "
Mar 09 14:10:37 crc kubenswrapper[4764]: I0309 14:10:37.293714 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/6ed586ce-ac9a-4d0e-9686-6b526ab1c6ff-openstack-edpm-ipam\") pod \"6ed586ce-ac9a-4d0e-9686-6b526ab1c6ff\" (UID: \"6ed586ce-ac9a-4d0e-9686-6b526ab1c6ff\") "
Mar 09 14:10:37 crc kubenswrapper[4764]: I0309 14:10:37.293850 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6ed586ce-ac9a-4d0e-9686-6b526ab1c6ff-config\") pod \"6ed586ce-ac9a-4d0e-9686-6b526ab1c6ff\" (UID: \"6ed586ce-ac9a-4d0e-9686-6b526ab1c6ff\") "
Mar 09 14:10:37 crc kubenswrapper[4764]: I0309 14:10:37.293890 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6ed586ce-ac9a-4d0e-9686-6b526ab1c6ff-ovsdbserver-sb\") pod \"6ed586ce-ac9a-4d0e-9686-6b526ab1c6ff\" (UID: \"6ed586ce-ac9a-4d0e-9686-6b526ab1c6ff\") "
Mar 09 14:10:37 crc kubenswrapper[4764]: I0309 14:10:37.294577 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6ed586ce-ac9a-4d0e-9686-6b526ab1c6ff-ovsdbserver-nb\") pod \"6ed586ce-ac9a-4d0e-9686-6b526ab1c6ff\" (UID: \"6ed586ce-ac9a-4d0e-9686-6b526ab1c6ff\") "
Mar 09 14:10:37 crc kubenswrapper[4764]: I0309 14:10:37.308088 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ed586ce-ac9a-4d0e-9686-6b526ab1c6ff-kube-api-access-j76rw" (OuterVolumeSpecName: "kube-api-access-j76rw") pod "6ed586ce-ac9a-4d0e-9686-6b526ab1c6ff" (UID: "6ed586ce-ac9a-4d0e-9686-6b526ab1c6ff"). InnerVolumeSpecName "kube-api-access-j76rw". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 14:10:37 crc kubenswrapper[4764]: I0309 14:10:37.311914 4764 scope.go:117] "RemoveContainer" containerID="cb0403e3d98f6b11f541bb3ed05951b3aa36eed0ed52f4863be171ce5eaf63e2"
Mar 09 14:10:37 crc kubenswrapper[4764]: I0309 14:10:37.377925 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ed586ce-ac9a-4d0e-9686-6b526ab1c6ff-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "6ed586ce-ac9a-4d0e-9686-6b526ab1c6ff" (UID: "6ed586ce-ac9a-4d0e-9686-6b526ab1c6ff"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 14:10:37 crc kubenswrapper[4764]: I0309 14:10:37.398718 4764 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6ed586ce-ac9a-4d0e-9686-6b526ab1c6ff-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Mar 09 14:10:37 crc kubenswrapper[4764]: I0309 14:10:37.399077 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j76rw\" (UniqueName: \"kubernetes.io/projected/6ed586ce-ac9a-4d0e-9686-6b526ab1c6ff-kube-api-access-j76rw\") on node \"crc\" DevicePath \"\""
Mar 09 14:10:37 crc kubenswrapper[4764]: I0309 14:10:37.430390 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ed586ce-ac9a-4d0e-9686-6b526ab1c6ff-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "6ed586ce-ac9a-4d0e-9686-6b526ab1c6ff" (UID: "6ed586ce-ac9a-4d0e-9686-6b526ab1c6ff"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 14:10:37 crc kubenswrapper[4764]: I0309 14:10:37.433240 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ed586ce-ac9a-4d0e-9686-6b526ab1c6ff-openstack-edpm-ipam" (OuterVolumeSpecName: "openstack-edpm-ipam") pod "6ed586ce-ac9a-4d0e-9686-6b526ab1c6ff" (UID: "6ed586ce-ac9a-4d0e-9686-6b526ab1c6ff"). InnerVolumeSpecName "openstack-edpm-ipam". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 14:10:37 crc kubenswrapper[4764]: I0309 14:10:37.460372 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ed586ce-ac9a-4d0e-9686-6b526ab1c6ff-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "6ed586ce-ac9a-4d0e-9686-6b526ab1c6ff" (UID: "6ed586ce-ac9a-4d0e-9686-6b526ab1c6ff"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 14:10:37 crc kubenswrapper[4764]: I0309 14:10:37.466699 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ed586ce-ac9a-4d0e-9686-6b526ab1c6ff-config" (OuterVolumeSpecName: "config") pod "6ed586ce-ac9a-4d0e-9686-6b526ab1c6ff" (UID: "6ed586ce-ac9a-4d0e-9686-6b526ab1c6ff"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 14:10:37 crc kubenswrapper[4764]: I0309 14:10:37.467288 4764 scope.go:117] "RemoveContainer" containerID="ff9bf0b1ebb34e175d58582b99ddb3f1097e979301d7cd95e80e6e75808835d6"
Mar 09 14:10:37 crc kubenswrapper[4764]: I0309 14:10:37.501828 4764 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6ed586ce-ac9a-4d0e-9686-6b526ab1c6ff-dns-svc\") on node \"crc\" DevicePath \"\""
Mar 09 14:10:37 crc kubenswrapper[4764]: I0309 14:10:37.501875 4764 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/6ed586ce-ac9a-4d0e-9686-6b526ab1c6ff-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\""
Mar 09 14:10:37 crc kubenswrapper[4764]: I0309 14:10:37.501894 4764 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6ed586ce-ac9a-4d0e-9686-6b526ab1c6ff-config\") on node \"crc\" DevicePath \"\""
Mar 09 14:10:37 crc kubenswrapper[4764]: I0309 14:10:37.501905 4764 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6ed586ce-ac9a-4d0e-9686-6b526ab1c6ff-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Mar 09 14:10:37 crc kubenswrapper[4764]: I0309 14:10:37.577417 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Mar 09 14:10:37 crc kubenswrapper[4764]: W0309 14:10:37.584290 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcba66048_6589_4bda_99bc_f2b62d5a16cd.slice/crio-214f8758c7bcbceff8c2118cea823d3832d509fe2a476aefc1476edcb74a1b81 WatchSource:0}: Error finding container 214f8758c7bcbceff8c2118cea823d3832d509fe2a476aefc1476edcb74a1b81: Status 404 returned error can't find the container with id 214f8758c7bcbceff8c2118cea823d3832d509fe2a476aefc1476edcb74a1b81
Mar 09 14:10:38 crc kubenswrapper[4764]: I0309 14:10:38.197270 4764 generic.go:334] "Generic (PLEG): container finished" podID="e00fc104-ec73-4190-a598-86de7ca6cfa5" containerID="c791eb4afad3d3b7319a4ecbb9ad7314d04c8eb6a4954765b916e231a57d04c0" exitCode=0
Mar 09 14:10:38 crc kubenswrapper[4764]: I0309 14:10:38.197491 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-797d44c9b-wrlx7" event={"ID":"e00fc104-ec73-4190-a598-86de7ca6cfa5","Type":"ContainerDied","Data":"c791eb4afad3d3b7319a4ecbb9ad7314d04c8eb6a4954765b916e231a57d04c0"}
Mar 09 14:10:38 crc kubenswrapper[4764]: I0309 14:10:38.209417 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cba66048-6589-4bda-99bc-f2b62d5a16cd","Type":"ContainerStarted","Data":"214f8758c7bcbceff8c2118cea823d3832d509fe2a476aefc1476edcb74a1b81"}
Mar 09 14:10:38 crc kubenswrapper[4764]: I0309 14:10:38.216325 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-fbc59fbb7-688wh"
Mar 09 14:10:38 crc kubenswrapper[4764]: I0309 14:10:38.217361 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"604c4e24-15e4-43e1-b08c-74d8337a2e71","Type":"ContainerStarted","Data":"824266a7cc9cb5d7239bcaba8f3fa368c333aac554ec8e5c7b1cc64368838f39"}
Mar 09 14:10:38 crc kubenswrapper[4764]: I0309 14:10:38.289466 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-fbc59fbb7-688wh"]
Mar 09 14:10:38 crc kubenswrapper[4764]: I0309 14:10:38.308084 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-fbc59fbb7-688wh"]
Mar 09 14:10:39 crc kubenswrapper[4764]: I0309 14:10:39.226900 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Mar 09 14:10:39 crc kubenswrapper[4764]: I0309 14:10:39.232713 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cba66048-6589-4bda-99bc-f2b62d5a16cd","Type":"ContainerStarted","Data":"9d47b74403609a167cdcccd2ced8710e8ea3efe2b025482fe77cefd1bd1cd0f0"}
Mar 09 14:10:39 crc kubenswrapper[4764]: I0309 14:10:39.232791 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cba66048-6589-4bda-99bc-f2b62d5a16cd","Type":"ContainerStarted","Data":"7dea23760bec929bafdb55f4e0c9793d4e53e06f14fc1f7a1f042bd745cebf46"}
Mar 09 14:10:39 crc kubenswrapper[4764]: I0309 14:10:39.241323 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"604c4e24-15e4-43e1-b08c-74d8337a2e71","Type":"ContainerStarted","Data":"03c13ad847d2a8fd02eb6ed9a1ace4039b7b02e90605b4014e12bc57ed3abe55"}
Mar 09 14:10:39 crc kubenswrapper[4764]: I0309 14:10:39.275405 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-share-share1-0" podStartSLOduration=4.27057249 podStartE2EDuration="16.275383565s" podCreationTimestamp="2026-03-09 14:10:23 +0000 UTC" firstStartedPulling="2026-03-09 14:10:25.024464499 +0000 UTC m=+2980.274636407" lastFinishedPulling="2026-03-09 14:10:37.029275574 +0000 UTC m=+2992.279447482" observedRunningTime="2026-03-09 14:10:39.267007172 +0000 UTC m=+2994.517179080" watchObservedRunningTime="2026-03-09 14:10:39.275383565 +0000 UTC m=+2994.525555473"
Mar 09 14:10:39 crc kubenswrapper[4764]: I0309 14:10:39.574118 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ed586ce-ac9a-4d0e-9686-6b526ab1c6ff" path="/var/lib/kubelet/pods/6ed586ce-ac9a-4d0e-9686-6b526ab1c6ff/volumes"
Mar 09 14:10:40 crc kubenswrapper[4764]: I0309 14:10:40.255363 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cba66048-6589-4bda-99bc-f2b62d5a16cd","Type":"ContainerStarted","Data":"9f9177e0fdd37283e97f8e4b49bcc207153bc286b11b7d77b8e0976f8e0cb4fa"}
Mar 09 14:10:43 crc kubenswrapper[4764]: I0309 14:10:43.287664 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cba66048-6589-4bda-99bc-f2b62d5a16cd","Type":"ContainerStarted","Data":"606b492dabd65291ff967eedeb1cc2a6599bea41f8828eecb6cc0ba85c71eb6f"}
Mar 09 14:10:43 crc kubenswrapper[4764]: I0309 14:10:43.287850 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="cba66048-6589-4bda-99bc-f2b62d5a16cd" containerName="ceilometer-central-agent" containerID="cri-o://7dea23760bec929bafdb55f4e0c9793d4e53e06f14fc1f7a1f042bd745cebf46" gracePeriod=30
Mar 09 14:10:43 crc kubenswrapper[4764]: I0309 14:10:43.287916 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="cba66048-6589-4bda-99bc-f2b62d5a16cd" containerName="proxy-httpd" containerID="cri-o://606b492dabd65291ff967eedeb1cc2a6599bea41f8828eecb6cc0ba85c71eb6f" gracePeriod=30
Mar 09 14:10:43 crc kubenswrapper[4764]: I0309 14:10:43.287930 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="cba66048-6589-4bda-99bc-f2b62d5a16cd" containerName="sg-core" containerID="cri-o://9f9177e0fdd37283e97f8e4b49bcc207153bc286b11b7d77b8e0976f8e0cb4fa" gracePeriod=30
Mar 09 14:10:43 crc kubenswrapper[4764]: I0309 14:10:43.287940 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="cba66048-6589-4bda-99bc-f2b62d5a16cd" containerName="ceilometer-notification-agent" containerID="cri-o://9d47b74403609a167cdcccd2ced8710e8ea3efe2b025482fe77cefd1bd1cd0f0" gracePeriod=30
Mar 09 14:10:43 crc kubenswrapper[4764]: I0309 14:10:43.290868 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0"
Mar 09 14:10:43 crc kubenswrapper[4764]: I0309 14:10:43.320067 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=5.767577018 podStartE2EDuration="10.32003947s" podCreationTimestamp="2026-03-09 14:10:33 +0000 UTC" firstStartedPulling="2026-03-09 14:10:37.58978513 +0000 UTC m=+2992.839957028" lastFinishedPulling="2026-03-09 14:10:42.142247572 +0000 UTC m=+2997.392419480" observedRunningTime="2026-03-09 14:10:43.313558037 +0000 UTC m=+2998.563729945" watchObservedRunningTime="2026-03-09 14:10:43.32003947 +0000 UTC m=+2998.570211408"
Mar 09 14:10:43 crc kubenswrapper[4764]: I0309 14:10:43.560757 4764 scope.go:117] "RemoveContainer" containerID="085fb58e00f347ac089fd33d79299a3879c66993685cb2c39951f8d234eb8cfb"
Mar 09 14:10:43 crc kubenswrapper[4764]: E0309 14:10:43.561735 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xxczl_openshift-machine-config-operator(6bcdd179-43c2-427c-9fac-7155c122e922)\"" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922"
Mar 09 14:10:44 crc kubenswrapper[4764]: I0309 14:10:44.101317 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 09 14:10:44 crc kubenswrapper[4764]: I0309 14:10:44.179935 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/manila-share-share1-0"
Mar 09 14:10:44 crc kubenswrapper[4764]: I0309 14:10:44.188078 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ccg6p\" (UniqueName: \"kubernetes.io/projected/cba66048-6589-4bda-99bc-f2b62d5a16cd-kube-api-access-ccg6p\") pod \"cba66048-6589-4bda-99bc-f2b62d5a16cd\" (UID: \"cba66048-6589-4bda-99bc-f2b62d5a16cd\") "
Mar 09 14:10:44 crc kubenswrapper[4764]: I0309 14:10:44.188196 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cba66048-6589-4bda-99bc-f2b62d5a16cd-run-httpd\") pod \"cba66048-6589-4bda-99bc-f2b62d5a16cd\" (UID: \"cba66048-6589-4bda-99bc-f2b62d5a16cd\") "
Mar 09 14:10:44 crc kubenswrapper[4764]: I0309 14:10:44.188309 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cba66048-6589-4bda-99bc-f2b62d5a16cd-combined-ca-bundle\") pod \"cba66048-6589-4bda-99bc-f2b62d5a16cd\" (UID: \"cba66048-6589-4bda-99bc-f2b62d5a16cd\") "
Mar 09 14:10:44 crc kubenswrapper[4764]: I0309 14:10:44.188369 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cba66048-6589-4bda-99bc-f2b62d5a16cd-log-httpd\") pod \"cba66048-6589-4bda-99bc-f2b62d5a16cd\" (UID: \"cba66048-6589-4bda-99bc-f2b62d5a16cd\") "
Mar 09 14:10:44 crc kubenswrapper[4764]: I0309 14:10:44.188400 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cba66048-6589-4bda-99bc-f2b62d5a16cd-config-data\") pod \"cba66048-6589-4bda-99bc-f2b62d5a16cd\" (UID: \"cba66048-6589-4bda-99bc-f2b62d5a16cd\") "
Mar 09 14:10:44 crc kubenswrapper[4764]: I0309 14:10:44.188541 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/cba66048-6589-4bda-99bc-f2b62d5a16cd-ceilometer-tls-certs\") pod \"cba66048-6589-4bda-99bc-f2b62d5a16cd\" (UID: \"cba66048-6589-4bda-99bc-f2b62d5a16cd\") "
Mar 09 14:10:44 crc kubenswrapper[4764]: I0309 14:10:44.188709 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/cba66048-6589-4bda-99bc-f2b62d5a16cd-sg-core-conf-yaml\") pod \"cba66048-6589-4bda-99bc-f2b62d5a16cd\" (UID: \"cba66048-6589-4bda-99bc-f2b62d5a16cd\") "
Mar 09 14:10:44 crc kubenswrapper[4764]: I0309 14:10:44.188773 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cba66048-6589-4bda-99bc-f2b62d5a16cd-scripts\") pod \"cba66048-6589-4bda-99bc-f2b62d5a16cd\" (UID: \"cba66048-6589-4bda-99bc-f2b62d5a16cd\") "
Mar 09 14:10:44 crc kubenswrapper[4764]: I0309 14:10:44.189036 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cba66048-6589-4bda-99bc-f2b62d5a16cd-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "cba66048-6589-4bda-99bc-f2b62d5a16cd" (UID: "cba66048-6589-4bda-99bc-f2b62d5a16cd"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 09 14:10:44 crc kubenswrapper[4764]: I0309 14:10:44.189456 4764 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cba66048-6589-4bda-99bc-f2b62d5a16cd-run-httpd\") on node \"crc\" DevicePath \"\""
Mar 09 14:10:44 crc kubenswrapper[4764]: I0309 14:10:44.190214 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cba66048-6589-4bda-99bc-f2b62d5a16cd-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "cba66048-6589-4bda-99bc-f2b62d5a16cd" (UID: "cba66048-6589-4bda-99bc-f2b62d5a16cd"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 09 14:10:44 crc kubenswrapper[4764]: I0309 14:10:44.195957 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cba66048-6589-4bda-99bc-f2b62d5a16cd-kube-api-access-ccg6p" (OuterVolumeSpecName: "kube-api-access-ccg6p") pod "cba66048-6589-4bda-99bc-f2b62d5a16cd" (UID: "cba66048-6589-4bda-99bc-f2b62d5a16cd"). InnerVolumeSpecName "kube-api-access-ccg6p". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 14:10:44 crc kubenswrapper[4764]: I0309 14:10:44.196011 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cba66048-6589-4bda-99bc-f2b62d5a16cd-scripts" (OuterVolumeSpecName: "scripts") pod "cba66048-6589-4bda-99bc-f2b62d5a16cd" (UID: "cba66048-6589-4bda-99bc-f2b62d5a16cd"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 14:10:44 crc kubenswrapper[4764]: I0309 14:10:44.231551 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cba66048-6589-4bda-99bc-f2b62d5a16cd-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "cba66048-6589-4bda-99bc-f2b62d5a16cd" (UID: "cba66048-6589-4bda-99bc-f2b62d5a16cd"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 14:10:44 crc kubenswrapper[4764]: I0309 14:10:44.247830 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cba66048-6589-4bda-99bc-f2b62d5a16cd-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "cba66048-6589-4bda-99bc-f2b62d5a16cd" (UID: "cba66048-6589-4bda-99bc-f2b62d5a16cd"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 14:10:44 crc kubenswrapper[4764]: I0309 14:10:44.288898 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cba66048-6589-4bda-99bc-f2b62d5a16cd-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cba66048-6589-4bda-99bc-f2b62d5a16cd" (UID: "cba66048-6589-4bda-99bc-f2b62d5a16cd"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 14:10:44 crc kubenswrapper[4764]: I0309 14:10:44.292762 4764 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/cba66048-6589-4bda-99bc-f2b62d5a16cd-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\""
Mar 09 14:10:44 crc kubenswrapper[4764]: I0309 14:10:44.292797 4764 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cba66048-6589-4bda-99bc-f2b62d5a16cd-scripts\") on node \"crc\" DevicePath \"\""
Mar 09 14:10:44 crc kubenswrapper[4764]: I0309 14:10:44.292811 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ccg6p\" (UniqueName: \"kubernetes.io/projected/cba66048-6589-4bda-99bc-f2b62d5a16cd-kube-api-access-ccg6p\") on node \"crc\" DevicePath \"\""
Mar 09 14:10:44 crc kubenswrapper[4764]: I0309 14:10:44.292826 4764 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cba66048-6589-4bda-99bc-f2b62d5a16cd-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 09 14:10:44 crc kubenswrapper[4764]: I0309 14:10:44.292838 4764 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cba66048-6589-4bda-99bc-f2b62d5a16cd-log-httpd\") on node \"crc\" DevicePath \"\""
Mar 09 14:10:44 crc kubenswrapper[4764]: I0309 14:10:44.292850 4764 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/cba66048-6589-4bda-99bc-f2b62d5a16cd-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\""
Mar 09 14:10:44 crc kubenswrapper[4764]: I0309 14:10:44.301961 4764 generic.go:334] "Generic (PLEG): container finished" podID="cba66048-6589-4bda-99bc-f2b62d5a16cd" containerID="606b492dabd65291ff967eedeb1cc2a6599bea41f8828eecb6cc0ba85c71eb6f" exitCode=0
Mar 09 14:10:44 crc kubenswrapper[4764]: I0309 14:10:44.302001 4764 generic.go:334] "Generic (PLEG): container finished" podID="cba66048-6589-4bda-99bc-f2b62d5a16cd" containerID="9f9177e0fdd37283e97f8e4b49bcc207153bc286b11b7d77b8e0976f8e0cb4fa" exitCode=2
Mar 09 14:10:44 crc kubenswrapper[4764]: I0309 14:10:44.302012 4764 generic.go:334] "Generic (PLEG): container finished" podID="cba66048-6589-4bda-99bc-f2b62d5a16cd" containerID="9d47b74403609a167cdcccd2ced8710e8ea3efe2b025482fe77cefd1bd1cd0f0" exitCode=0
Mar 09 14:10:44 crc kubenswrapper[4764]: I0309 14:10:44.302022 4764 generic.go:334] "Generic (PLEG): container finished" podID="cba66048-6589-4bda-99bc-f2b62d5a16cd" containerID="7dea23760bec929bafdb55f4e0c9793d4e53e06f14fc1f7a1f042bd745cebf46" exitCode=0
Mar 09 14:10:44 crc kubenswrapper[4764]: I0309 14:10:44.302054 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 09 14:10:44 crc kubenswrapper[4764]: I0309 14:10:44.302051 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cba66048-6589-4bda-99bc-f2b62d5a16cd","Type":"ContainerDied","Data":"606b492dabd65291ff967eedeb1cc2a6599bea41f8828eecb6cc0ba85c71eb6f"}
Mar 09 14:10:44 crc kubenswrapper[4764]: I0309 14:10:44.302123 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cba66048-6589-4bda-99bc-f2b62d5a16cd","Type":"ContainerDied","Data":"9f9177e0fdd37283e97f8e4b49bcc207153bc286b11b7d77b8e0976f8e0cb4fa"}
Mar 09 14:10:44 crc kubenswrapper[4764]: I0309 14:10:44.302138 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cba66048-6589-4bda-99bc-f2b62d5a16cd","Type":"ContainerDied","Data":"9d47b74403609a167cdcccd2ced8710e8ea3efe2b025482fe77cefd1bd1cd0f0"}
Mar 09 14:10:44 crc kubenswrapper[4764]: I0309 14:10:44.302177 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cba66048-6589-4bda-99bc-f2b62d5a16cd","Type":"ContainerDied","Data":"7dea23760bec929bafdb55f4e0c9793d4e53e06f14fc1f7a1f042bd745cebf46"}
Mar 09 14:10:44 crc kubenswrapper[4764]: I0309 14:10:44.302186 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cba66048-6589-4bda-99bc-f2b62d5a16cd","Type":"ContainerDied","Data":"214f8758c7bcbceff8c2118cea823d3832d509fe2a476aefc1476edcb74a1b81"}
Mar 09 14:10:44 crc kubenswrapper[4764]: I0309 14:10:44.302184 4764 scope.go:117] "RemoveContainer" containerID="606b492dabd65291ff967eedeb1cc2a6599bea41f8828eecb6cc0ba85c71eb6f"
Mar 09 14:10:44 crc kubenswrapper[4764]: I0309 14:10:44.328201 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cba66048-6589-4bda-99bc-f2b62d5a16cd-config-data" (OuterVolumeSpecName: "config-data") pod "cba66048-6589-4bda-99bc-f2b62d5a16cd" (UID: "cba66048-6589-4bda-99bc-f2b62d5a16cd"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 14:10:44 crc kubenswrapper[4764]: I0309 14:10:44.333328 4764 scope.go:117] "RemoveContainer" containerID="9f9177e0fdd37283e97f8e4b49bcc207153bc286b11b7d77b8e0976f8e0cb4fa"
Mar 09 14:10:44 crc kubenswrapper[4764]: I0309 14:10:44.355439 4764 scope.go:117] "RemoveContainer" containerID="9d47b74403609a167cdcccd2ced8710e8ea3efe2b025482fe77cefd1bd1cd0f0"
Mar 09 14:10:44 crc kubenswrapper[4764]: I0309 14:10:44.378860 4764 scope.go:117] "RemoveContainer" containerID="7dea23760bec929bafdb55f4e0c9793d4e53e06f14fc1f7a1f042bd745cebf46"
Mar 09 14:10:44 crc kubenswrapper[4764]: I0309 14:10:44.395274 4764 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cba66048-6589-4bda-99bc-f2b62d5a16cd-config-data\") on node \"crc\" DevicePath \"\""
Mar 09 14:10:44 crc kubenswrapper[4764]: I0309 14:10:44.399833 4764 scope.go:117] "RemoveContainer" containerID="606b492dabd65291ff967eedeb1cc2a6599bea41f8828eecb6cc0ba85c71eb6f"
Mar 09 14:10:44 crc kubenswrapper[4764]: E0309 14:10:44.410499 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"606b492dabd65291ff967eedeb1cc2a6599bea41f8828eecb6cc0ba85c71eb6f\": container with ID starting with 606b492dabd65291ff967eedeb1cc2a6599bea41f8828eecb6cc0ba85c71eb6f not found: ID does not exist" containerID="606b492dabd65291ff967eedeb1cc2a6599bea41f8828eecb6cc0ba85c71eb6f"
Mar 09 14:10:44 crc kubenswrapper[4764]: I0309 14:10:44.410544 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"606b492dabd65291ff967eedeb1cc2a6599bea41f8828eecb6cc0ba85c71eb6f"} err="failed to get container status \"606b492dabd65291ff967eedeb1cc2a6599bea41f8828eecb6cc0ba85c71eb6f\": rpc error: code = NotFound desc = could not find
container \"606b492dabd65291ff967eedeb1cc2a6599bea41f8828eecb6cc0ba85c71eb6f\": container with ID starting with 606b492dabd65291ff967eedeb1cc2a6599bea41f8828eecb6cc0ba85c71eb6f not found: ID does not exist" Mar 09 14:10:44 crc kubenswrapper[4764]: I0309 14:10:44.410573 4764 scope.go:117] "RemoveContainer" containerID="9f9177e0fdd37283e97f8e4b49bcc207153bc286b11b7d77b8e0976f8e0cb4fa" Mar 09 14:10:44 crc kubenswrapper[4764]: E0309 14:10:44.411167 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9f9177e0fdd37283e97f8e4b49bcc207153bc286b11b7d77b8e0976f8e0cb4fa\": container with ID starting with 9f9177e0fdd37283e97f8e4b49bcc207153bc286b11b7d77b8e0976f8e0cb4fa not found: ID does not exist" containerID="9f9177e0fdd37283e97f8e4b49bcc207153bc286b11b7d77b8e0976f8e0cb4fa" Mar 09 14:10:44 crc kubenswrapper[4764]: I0309 14:10:44.411188 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9f9177e0fdd37283e97f8e4b49bcc207153bc286b11b7d77b8e0976f8e0cb4fa"} err="failed to get container status \"9f9177e0fdd37283e97f8e4b49bcc207153bc286b11b7d77b8e0976f8e0cb4fa\": rpc error: code = NotFound desc = could not find container \"9f9177e0fdd37283e97f8e4b49bcc207153bc286b11b7d77b8e0976f8e0cb4fa\": container with ID starting with 9f9177e0fdd37283e97f8e4b49bcc207153bc286b11b7d77b8e0976f8e0cb4fa not found: ID does not exist" Mar 09 14:10:44 crc kubenswrapper[4764]: I0309 14:10:44.411202 4764 scope.go:117] "RemoveContainer" containerID="9d47b74403609a167cdcccd2ced8710e8ea3efe2b025482fe77cefd1bd1cd0f0" Mar 09 14:10:44 crc kubenswrapper[4764]: E0309 14:10:44.411465 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9d47b74403609a167cdcccd2ced8710e8ea3efe2b025482fe77cefd1bd1cd0f0\": container with ID starting with 9d47b74403609a167cdcccd2ced8710e8ea3efe2b025482fe77cefd1bd1cd0f0 not found: ID does 
not exist" containerID="9d47b74403609a167cdcccd2ced8710e8ea3efe2b025482fe77cefd1bd1cd0f0" Mar 09 14:10:44 crc kubenswrapper[4764]: I0309 14:10:44.411485 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9d47b74403609a167cdcccd2ced8710e8ea3efe2b025482fe77cefd1bd1cd0f0"} err="failed to get container status \"9d47b74403609a167cdcccd2ced8710e8ea3efe2b025482fe77cefd1bd1cd0f0\": rpc error: code = NotFound desc = could not find container \"9d47b74403609a167cdcccd2ced8710e8ea3efe2b025482fe77cefd1bd1cd0f0\": container with ID starting with 9d47b74403609a167cdcccd2ced8710e8ea3efe2b025482fe77cefd1bd1cd0f0 not found: ID does not exist" Mar 09 14:10:44 crc kubenswrapper[4764]: I0309 14:10:44.411498 4764 scope.go:117] "RemoveContainer" containerID="7dea23760bec929bafdb55f4e0c9793d4e53e06f14fc1f7a1f042bd745cebf46" Mar 09 14:10:44 crc kubenswrapper[4764]: E0309 14:10:44.411754 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7dea23760bec929bafdb55f4e0c9793d4e53e06f14fc1f7a1f042bd745cebf46\": container with ID starting with 7dea23760bec929bafdb55f4e0c9793d4e53e06f14fc1f7a1f042bd745cebf46 not found: ID does not exist" containerID="7dea23760bec929bafdb55f4e0c9793d4e53e06f14fc1f7a1f042bd745cebf46" Mar 09 14:10:44 crc kubenswrapper[4764]: I0309 14:10:44.411773 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7dea23760bec929bafdb55f4e0c9793d4e53e06f14fc1f7a1f042bd745cebf46"} err="failed to get container status \"7dea23760bec929bafdb55f4e0c9793d4e53e06f14fc1f7a1f042bd745cebf46\": rpc error: code = NotFound desc = could not find container \"7dea23760bec929bafdb55f4e0c9793d4e53e06f14fc1f7a1f042bd745cebf46\": container with ID starting with 7dea23760bec929bafdb55f4e0c9793d4e53e06f14fc1f7a1f042bd745cebf46 not found: ID does not exist" Mar 09 14:10:44 crc kubenswrapper[4764]: I0309 14:10:44.411785 4764 
scope.go:117] "RemoveContainer" containerID="606b492dabd65291ff967eedeb1cc2a6599bea41f8828eecb6cc0ba85c71eb6f" Mar 09 14:10:44 crc kubenswrapper[4764]: I0309 14:10:44.412122 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"606b492dabd65291ff967eedeb1cc2a6599bea41f8828eecb6cc0ba85c71eb6f"} err="failed to get container status \"606b492dabd65291ff967eedeb1cc2a6599bea41f8828eecb6cc0ba85c71eb6f\": rpc error: code = NotFound desc = could not find container \"606b492dabd65291ff967eedeb1cc2a6599bea41f8828eecb6cc0ba85c71eb6f\": container with ID starting with 606b492dabd65291ff967eedeb1cc2a6599bea41f8828eecb6cc0ba85c71eb6f not found: ID does not exist" Mar 09 14:10:44 crc kubenswrapper[4764]: I0309 14:10:44.412141 4764 scope.go:117] "RemoveContainer" containerID="9f9177e0fdd37283e97f8e4b49bcc207153bc286b11b7d77b8e0976f8e0cb4fa" Mar 09 14:10:44 crc kubenswrapper[4764]: I0309 14:10:44.412343 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9f9177e0fdd37283e97f8e4b49bcc207153bc286b11b7d77b8e0976f8e0cb4fa"} err="failed to get container status \"9f9177e0fdd37283e97f8e4b49bcc207153bc286b11b7d77b8e0976f8e0cb4fa\": rpc error: code = NotFound desc = could not find container \"9f9177e0fdd37283e97f8e4b49bcc207153bc286b11b7d77b8e0976f8e0cb4fa\": container with ID starting with 9f9177e0fdd37283e97f8e4b49bcc207153bc286b11b7d77b8e0976f8e0cb4fa not found: ID does not exist" Mar 09 14:10:44 crc kubenswrapper[4764]: I0309 14:10:44.412362 4764 scope.go:117] "RemoveContainer" containerID="9d47b74403609a167cdcccd2ced8710e8ea3efe2b025482fe77cefd1bd1cd0f0" Mar 09 14:10:44 crc kubenswrapper[4764]: I0309 14:10:44.412667 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9d47b74403609a167cdcccd2ced8710e8ea3efe2b025482fe77cefd1bd1cd0f0"} err="failed to get container status \"9d47b74403609a167cdcccd2ced8710e8ea3efe2b025482fe77cefd1bd1cd0f0\": rpc 
error: code = NotFound desc = could not find container \"9d47b74403609a167cdcccd2ced8710e8ea3efe2b025482fe77cefd1bd1cd0f0\": container with ID starting with 9d47b74403609a167cdcccd2ced8710e8ea3efe2b025482fe77cefd1bd1cd0f0 not found: ID does not exist" Mar 09 14:10:44 crc kubenswrapper[4764]: I0309 14:10:44.412686 4764 scope.go:117] "RemoveContainer" containerID="7dea23760bec929bafdb55f4e0c9793d4e53e06f14fc1f7a1f042bd745cebf46" Mar 09 14:10:44 crc kubenswrapper[4764]: I0309 14:10:44.412896 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7dea23760bec929bafdb55f4e0c9793d4e53e06f14fc1f7a1f042bd745cebf46"} err="failed to get container status \"7dea23760bec929bafdb55f4e0c9793d4e53e06f14fc1f7a1f042bd745cebf46\": rpc error: code = NotFound desc = could not find container \"7dea23760bec929bafdb55f4e0c9793d4e53e06f14fc1f7a1f042bd745cebf46\": container with ID starting with 7dea23760bec929bafdb55f4e0c9793d4e53e06f14fc1f7a1f042bd745cebf46 not found: ID does not exist" Mar 09 14:10:44 crc kubenswrapper[4764]: I0309 14:10:44.412918 4764 scope.go:117] "RemoveContainer" containerID="606b492dabd65291ff967eedeb1cc2a6599bea41f8828eecb6cc0ba85c71eb6f" Mar 09 14:10:44 crc kubenswrapper[4764]: I0309 14:10:44.413102 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"606b492dabd65291ff967eedeb1cc2a6599bea41f8828eecb6cc0ba85c71eb6f"} err="failed to get container status \"606b492dabd65291ff967eedeb1cc2a6599bea41f8828eecb6cc0ba85c71eb6f\": rpc error: code = NotFound desc = could not find container \"606b492dabd65291ff967eedeb1cc2a6599bea41f8828eecb6cc0ba85c71eb6f\": container with ID starting with 606b492dabd65291ff967eedeb1cc2a6599bea41f8828eecb6cc0ba85c71eb6f not found: ID does not exist" Mar 09 14:10:44 crc kubenswrapper[4764]: I0309 14:10:44.413121 4764 scope.go:117] "RemoveContainer" containerID="9f9177e0fdd37283e97f8e4b49bcc207153bc286b11b7d77b8e0976f8e0cb4fa" Mar 09 14:10:44 crc 
kubenswrapper[4764]: I0309 14:10:44.413374 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9f9177e0fdd37283e97f8e4b49bcc207153bc286b11b7d77b8e0976f8e0cb4fa"} err="failed to get container status \"9f9177e0fdd37283e97f8e4b49bcc207153bc286b11b7d77b8e0976f8e0cb4fa\": rpc error: code = NotFound desc = could not find container \"9f9177e0fdd37283e97f8e4b49bcc207153bc286b11b7d77b8e0976f8e0cb4fa\": container with ID starting with 9f9177e0fdd37283e97f8e4b49bcc207153bc286b11b7d77b8e0976f8e0cb4fa not found: ID does not exist" Mar 09 14:10:44 crc kubenswrapper[4764]: I0309 14:10:44.413394 4764 scope.go:117] "RemoveContainer" containerID="9d47b74403609a167cdcccd2ced8710e8ea3efe2b025482fe77cefd1bd1cd0f0" Mar 09 14:10:44 crc kubenswrapper[4764]: I0309 14:10:44.413662 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9d47b74403609a167cdcccd2ced8710e8ea3efe2b025482fe77cefd1bd1cd0f0"} err="failed to get container status \"9d47b74403609a167cdcccd2ced8710e8ea3efe2b025482fe77cefd1bd1cd0f0\": rpc error: code = NotFound desc = could not find container \"9d47b74403609a167cdcccd2ced8710e8ea3efe2b025482fe77cefd1bd1cd0f0\": container with ID starting with 9d47b74403609a167cdcccd2ced8710e8ea3efe2b025482fe77cefd1bd1cd0f0 not found: ID does not exist" Mar 09 14:10:44 crc kubenswrapper[4764]: I0309 14:10:44.413680 4764 scope.go:117] "RemoveContainer" containerID="7dea23760bec929bafdb55f4e0c9793d4e53e06f14fc1f7a1f042bd745cebf46" Mar 09 14:10:44 crc kubenswrapper[4764]: I0309 14:10:44.413928 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7dea23760bec929bafdb55f4e0c9793d4e53e06f14fc1f7a1f042bd745cebf46"} err="failed to get container status \"7dea23760bec929bafdb55f4e0c9793d4e53e06f14fc1f7a1f042bd745cebf46\": rpc error: code = NotFound desc = could not find container \"7dea23760bec929bafdb55f4e0c9793d4e53e06f14fc1f7a1f042bd745cebf46\": container 
with ID starting with 7dea23760bec929bafdb55f4e0c9793d4e53e06f14fc1f7a1f042bd745cebf46 not found: ID does not exist" Mar 09 14:10:44 crc kubenswrapper[4764]: I0309 14:10:44.413947 4764 scope.go:117] "RemoveContainer" containerID="606b492dabd65291ff967eedeb1cc2a6599bea41f8828eecb6cc0ba85c71eb6f" Mar 09 14:10:44 crc kubenswrapper[4764]: I0309 14:10:44.414185 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"606b492dabd65291ff967eedeb1cc2a6599bea41f8828eecb6cc0ba85c71eb6f"} err="failed to get container status \"606b492dabd65291ff967eedeb1cc2a6599bea41f8828eecb6cc0ba85c71eb6f\": rpc error: code = NotFound desc = could not find container \"606b492dabd65291ff967eedeb1cc2a6599bea41f8828eecb6cc0ba85c71eb6f\": container with ID starting with 606b492dabd65291ff967eedeb1cc2a6599bea41f8828eecb6cc0ba85c71eb6f not found: ID does not exist" Mar 09 14:10:44 crc kubenswrapper[4764]: I0309 14:10:44.414237 4764 scope.go:117] "RemoveContainer" containerID="9f9177e0fdd37283e97f8e4b49bcc207153bc286b11b7d77b8e0976f8e0cb4fa" Mar 09 14:10:44 crc kubenswrapper[4764]: I0309 14:10:44.414505 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9f9177e0fdd37283e97f8e4b49bcc207153bc286b11b7d77b8e0976f8e0cb4fa"} err="failed to get container status \"9f9177e0fdd37283e97f8e4b49bcc207153bc286b11b7d77b8e0976f8e0cb4fa\": rpc error: code = NotFound desc = could not find container \"9f9177e0fdd37283e97f8e4b49bcc207153bc286b11b7d77b8e0976f8e0cb4fa\": container with ID starting with 9f9177e0fdd37283e97f8e4b49bcc207153bc286b11b7d77b8e0976f8e0cb4fa not found: ID does not exist" Mar 09 14:10:44 crc kubenswrapper[4764]: I0309 14:10:44.414525 4764 scope.go:117] "RemoveContainer" containerID="9d47b74403609a167cdcccd2ced8710e8ea3efe2b025482fe77cefd1bd1cd0f0" Mar 09 14:10:44 crc kubenswrapper[4764]: I0309 14:10:44.414746 4764 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"9d47b74403609a167cdcccd2ced8710e8ea3efe2b025482fe77cefd1bd1cd0f0"} err="failed to get container status \"9d47b74403609a167cdcccd2ced8710e8ea3efe2b025482fe77cefd1bd1cd0f0\": rpc error: code = NotFound desc = could not find container \"9d47b74403609a167cdcccd2ced8710e8ea3efe2b025482fe77cefd1bd1cd0f0\": container with ID starting with 9d47b74403609a167cdcccd2ced8710e8ea3efe2b025482fe77cefd1bd1cd0f0 not found: ID does not exist" Mar 09 14:10:44 crc kubenswrapper[4764]: I0309 14:10:44.414770 4764 scope.go:117] "RemoveContainer" containerID="7dea23760bec929bafdb55f4e0c9793d4e53e06f14fc1f7a1f042bd745cebf46" Mar 09 14:10:44 crc kubenswrapper[4764]: I0309 14:10:44.415044 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7dea23760bec929bafdb55f4e0c9793d4e53e06f14fc1f7a1f042bd745cebf46"} err="failed to get container status \"7dea23760bec929bafdb55f4e0c9793d4e53e06f14fc1f7a1f042bd745cebf46\": rpc error: code = NotFound desc = could not find container \"7dea23760bec929bafdb55f4e0c9793d4e53e06f14fc1f7a1f042bd745cebf46\": container with ID starting with 7dea23760bec929bafdb55f4e0c9793d4e53e06f14fc1f7a1f042bd745cebf46 not found: ID does not exist" Mar 09 14:10:44 crc kubenswrapper[4764]: I0309 14:10:44.649358 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 09 14:10:44 crc kubenswrapper[4764]: I0309 14:10:44.663137 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 09 14:10:44 crc kubenswrapper[4764]: I0309 14:10:44.688277 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 09 14:10:44 crc kubenswrapper[4764]: E0309 14:10:44.689321 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cba66048-6589-4bda-99bc-f2b62d5a16cd" containerName="ceilometer-notification-agent" Mar 09 14:10:44 crc kubenswrapper[4764]: I0309 14:10:44.689422 4764 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="cba66048-6589-4bda-99bc-f2b62d5a16cd" containerName="ceilometer-notification-agent" Mar 09 14:10:44 crc kubenswrapper[4764]: E0309 14:10:44.689528 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ed586ce-ac9a-4d0e-9686-6b526ab1c6ff" containerName="init" Mar 09 14:10:44 crc kubenswrapper[4764]: I0309 14:10:44.690141 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ed586ce-ac9a-4d0e-9686-6b526ab1c6ff" containerName="init" Mar 09 14:10:44 crc kubenswrapper[4764]: E0309 14:10:44.690263 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ed586ce-ac9a-4d0e-9686-6b526ab1c6ff" containerName="dnsmasq-dns" Mar 09 14:10:44 crc kubenswrapper[4764]: I0309 14:10:44.690330 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ed586ce-ac9a-4d0e-9686-6b526ab1c6ff" containerName="dnsmasq-dns" Mar 09 14:10:44 crc kubenswrapper[4764]: E0309 14:10:44.690456 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cba66048-6589-4bda-99bc-f2b62d5a16cd" containerName="ceilometer-central-agent" Mar 09 14:10:44 crc kubenswrapper[4764]: I0309 14:10:44.690525 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="cba66048-6589-4bda-99bc-f2b62d5a16cd" containerName="ceilometer-central-agent" Mar 09 14:10:44 crc kubenswrapper[4764]: E0309 14:10:44.690608 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cba66048-6589-4bda-99bc-f2b62d5a16cd" containerName="sg-core" Mar 09 14:10:44 crc kubenswrapper[4764]: I0309 14:10:44.690701 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="cba66048-6589-4bda-99bc-f2b62d5a16cd" containerName="sg-core" Mar 09 14:10:44 crc kubenswrapper[4764]: E0309 14:10:44.690790 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cba66048-6589-4bda-99bc-f2b62d5a16cd" containerName="proxy-httpd" Mar 09 14:10:44 crc kubenswrapper[4764]: I0309 14:10:44.690955 4764 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="cba66048-6589-4bda-99bc-f2b62d5a16cd" containerName="proxy-httpd" Mar 09 14:10:44 crc kubenswrapper[4764]: I0309 14:10:44.691417 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="cba66048-6589-4bda-99bc-f2b62d5a16cd" containerName="ceilometer-central-agent" Mar 09 14:10:44 crc kubenswrapper[4764]: I0309 14:10:44.691504 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="6ed586ce-ac9a-4d0e-9686-6b526ab1c6ff" containerName="dnsmasq-dns" Mar 09 14:10:44 crc kubenswrapper[4764]: I0309 14:10:44.691574 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="cba66048-6589-4bda-99bc-f2b62d5a16cd" containerName="sg-core" Mar 09 14:10:44 crc kubenswrapper[4764]: I0309 14:10:44.691672 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="cba66048-6589-4bda-99bc-f2b62d5a16cd" containerName="ceilometer-notification-agent" Mar 09 14:10:44 crc kubenswrapper[4764]: I0309 14:10:44.691754 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="cba66048-6589-4bda-99bc-f2b62d5a16cd" containerName="proxy-httpd" Mar 09 14:10:44 crc kubenswrapper[4764]: I0309 14:10:44.694858 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 09 14:10:44 crc kubenswrapper[4764]: I0309 14:10:44.707668 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 09 14:10:44 crc kubenswrapper[4764]: I0309 14:10:44.710410 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 09 14:10:44 crc kubenswrapper[4764]: I0309 14:10:44.710801 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 09 14:10:44 crc kubenswrapper[4764]: I0309 14:10:44.710910 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Mar 09 14:10:44 crc kubenswrapper[4764]: I0309 14:10:44.816482 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/182886b1-5569-456a-aa1e-129021e95bfe-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"182886b1-5569-456a-aa1e-129021e95bfe\") " pod="openstack/ceilometer-0" Mar 09 14:10:44 crc kubenswrapper[4764]: I0309 14:10:44.816949 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/182886b1-5569-456a-aa1e-129021e95bfe-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"182886b1-5569-456a-aa1e-129021e95bfe\") " pod="openstack/ceilometer-0" Mar 09 14:10:44 crc kubenswrapper[4764]: I0309 14:10:44.817036 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/182886b1-5569-456a-aa1e-129021e95bfe-scripts\") pod \"ceilometer-0\" (UID: \"182886b1-5569-456a-aa1e-129021e95bfe\") " pod="openstack/ceilometer-0" Mar 09 14:10:44 crc kubenswrapper[4764]: I0309 14:10:44.817060 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/182886b1-5569-456a-aa1e-129021e95bfe-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"182886b1-5569-456a-aa1e-129021e95bfe\") " pod="openstack/ceilometer-0" Mar 09 14:10:44 crc kubenswrapper[4764]: I0309 14:10:44.817097 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/182886b1-5569-456a-aa1e-129021e95bfe-config-data\") pod \"ceilometer-0\" (UID: \"182886b1-5569-456a-aa1e-129021e95bfe\") " pod="openstack/ceilometer-0" Mar 09 14:10:44 crc kubenswrapper[4764]: I0309 14:10:44.817128 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n74nk\" (UniqueName: \"kubernetes.io/projected/182886b1-5569-456a-aa1e-129021e95bfe-kube-api-access-n74nk\") pod \"ceilometer-0\" (UID: \"182886b1-5569-456a-aa1e-129021e95bfe\") " pod="openstack/ceilometer-0" Mar 09 14:10:44 crc kubenswrapper[4764]: I0309 14:10:44.817199 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/182886b1-5569-456a-aa1e-129021e95bfe-run-httpd\") pod \"ceilometer-0\" (UID: \"182886b1-5569-456a-aa1e-129021e95bfe\") " pod="openstack/ceilometer-0" Mar 09 14:10:44 crc kubenswrapper[4764]: I0309 14:10:44.817224 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/182886b1-5569-456a-aa1e-129021e95bfe-log-httpd\") pod \"ceilometer-0\" (UID: \"182886b1-5569-456a-aa1e-129021e95bfe\") " pod="openstack/ceilometer-0" Mar 09 14:10:44 crc kubenswrapper[4764]: I0309 14:10:44.919911 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/182886b1-5569-456a-aa1e-129021e95bfe-combined-ca-bundle\") pod \"ceilometer-0\" (UID: 
\"182886b1-5569-456a-aa1e-129021e95bfe\") " pod="openstack/ceilometer-0" Mar 09 14:10:44 crc kubenswrapper[4764]: I0309 14:10:44.920014 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/182886b1-5569-456a-aa1e-129021e95bfe-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"182886b1-5569-456a-aa1e-129021e95bfe\") " pod="openstack/ceilometer-0" Mar 09 14:10:44 crc kubenswrapper[4764]: I0309 14:10:44.920216 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/182886b1-5569-456a-aa1e-129021e95bfe-scripts\") pod \"ceilometer-0\" (UID: \"182886b1-5569-456a-aa1e-129021e95bfe\") " pod="openstack/ceilometer-0" Mar 09 14:10:44 crc kubenswrapper[4764]: I0309 14:10:44.920251 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/182886b1-5569-456a-aa1e-129021e95bfe-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"182886b1-5569-456a-aa1e-129021e95bfe\") " pod="openstack/ceilometer-0" Mar 09 14:10:44 crc kubenswrapper[4764]: I0309 14:10:44.920315 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/182886b1-5569-456a-aa1e-129021e95bfe-config-data\") pod \"ceilometer-0\" (UID: \"182886b1-5569-456a-aa1e-129021e95bfe\") " pod="openstack/ceilometer-0" Mar 09 14:10:44 crc kubenswrapper[4764]: I0309 14:10:44.920373 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n74nk\" (UniqueName: \"kubernetes.io/projected/182886b1-5569-456a-aa1e-129021e95bfe-kube-api-access-n74nk\") pod \"ceilometer-0\" (UID: \"182886b1-5569-456a-aa1e-129021e95bfe\") " pod="openstack/ceilometer-0" Mar 09 14:10:44 crc kubenswrapper[4764]: I0309 14:10:44.920467 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/182886b1-5569-456a-aa1e-129021e95bfe-run-httpd\") pod \"ceilometer-0\" (UID: \"182886b1-5569-456a-aa1e-129021e95bfe\") " pod="openstack/ceilometer-0" Mar 09 14:10:44 crc kubenswrapper[4764]: I0309 14:10:44.920495 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/182886b1-5569-456a-aa1e-129021e95bfe-log-httpd\") pod \"ceilometer-0\" (UID: \"182886b1-5569-456a-aa1e-129021e95bfe\") " pod="openstack/ceilometer-0" Mar 09 14:10:44 crc kubenswrapper[4764]: I0309 14:10:44.921663 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/182886b1-5569-456a-aa1e-129021e95bfe-log-httpd\") pod \"ceilometer-0\" (UID: \"182886b1-5569-456a-aa1e-129021e95bfe\") " pod="openstack/ceilometer-0" Mar 09 14:10:44 crc kubenswrapper[4764]: I0309 14:10:44.922775 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/182886b1-5569-456a-aa1e-129021e95bfe-run-httpd\") pod \"ceilometer-0\" (UID: \"182886b1-5569-456a-aa1e-129021e95bfe\") " pod="openstack/ceilometer-0" Mar 09 14:10:44 crc kubenswrapper[4764]: I0309 14:10:44.929160 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/182886b1-5569-456a-aa1e-129021e95bfe-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"182886b1-5569-456a-aa1e-129021e95bfe\") " pod="openstack/ceilometer-0" Mar 09 14:10:44 crc kubenswrapper[4764]: I0309 14:10:44.931890 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/182886b1-5569-456a-aa1e-129021e95bfe-scripts\") pod \"ceilometer-0\" (UID: \"182886b1-5569-456a-aa1e-129021e95bfe\") " pod="openstack/ceilometer-0" Mar 09 14:10:44 crc kubenswrapper[4764]: I0309 14:10:44.932441 4764 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/182886b1-5569-456a-aa1e-129021e95bfe-config-data\") pod \"ceilometer-0\" (UID: \"182886b1-5569-456a-aa1e-129021e95bfe\") " pod="openstack/ceilometer-0" Mar 09 14:10:44 crc kubenswrapper[4764]: I0309 14:10:44.932524 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/182886b1-5569-456a-aa1e-129021e95bfe-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"182886b1-5569-456a-aa1e-129021e95bfe\") " pod="openstack/ceilometer-0" Mar 09 14:10:44 crc kubenswrapper[4764]: I0309 14:10:44.932836 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/182886b1-5569-456a-aa1e-129021e95bfe-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"182886b1-5569-456a-aa1e-129021e95bfe\") " pod="openstack/ceilometer-0" Mar 09 14:10:44 crc kubenswrapper[4764]: I0309 14:10:44.951210 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n74nk\" (UniqueName: \"kubernetes.io/projected/182886b1-5569-456a-aa1e-129021e95bfe-kube-api-access-n74nk\") pod \"ceilometer-0\" (UID: \"182886b1-5569-456a-aa1e-129021e95bfe\") " pod="openstack/ceilometer-0" Mar 09 14:10:45 crc kubenswrapper[4764]: I0309 14:10:45.037595 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0"
Mar 09 14:10:45 crc kubenswrapper[4764]: I0309 14:10:45.525787 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Mar 09 14:10:45 crc kubenswrapper[4764]: I0309 14:10:45.598911 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cba66048-6589-4bda-99bc-f2b62d5a16cd" path="/var/lib/kubelet/pods/cba66048-6589-4bda-99bc-f2b62d5a16cd/volumes"
Mar 09 14:10:46 crc kubenswrapper[4764]: I0309 14:10:46.085943 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/manila-scheduler-0"
Mar 09 14:10:46 crc kubenswrapper[4764]: I0309 14:10:46.135813 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-scheduler-0"]
Mar 09 14:10:46 crc kubenswrapper[4764]: I0309 14:10:46.334569 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"182886b1-5569-456a-aa1e-129021e95bfe","Type":"ContainerStarted","Data":"ede734f8333a2fa733a5fce350a7ca032ce48234ca605f47c897a4fb42ea2a53"}
Mar 09 14:10:46 crc kubenswrapper[4764]: I0309 14:10:46.335091 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/manila-scheduler-0" podUID="30f42bb2-6e26-4cbb-942b-a7d4ede4f128" containerName="manila-scheduler" containerID="cri-o://b743badbf09f2d2b0dc2fef97dfa9da34473a9bbc60350880026711252d7f4f7" gracePeriod=30
Mar 09 14:10:46 crc kubenswrapper[4764]: I0309 14:10:46.335164 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/manila-scheduler-0" podUID="30f42bb2-6e26-4cbb-942b-a7d4ede4f128" containerName="probe" containerID="cri-o://cbc2d80e7e756e93ed514bd09529210c2d50987e0f38ca0f0f28ae42fe9a1fc9" gracePeriod=30
Mar 09 14:10:47 crc kubenswrapper[4764]: I0309 14:10:47.273598 4764 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-797d44c9b-wrlx7" podUID="e00fc104-ec73-4190-a598-86de7ca6cfa5" containerName="horizon" probeResult="failure" output="Get \"https://10.217.1.5:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.5:8443: connect: connection refused"
Mar 09 14:10:47 crc kubenswrapper[4764]: I0309 14:10:47.346335 4764 generic.go:334] "Generic (PLEG): container finished" podID="30f42bb2-6e26-4cbb-942b-a7d4ede4f128" containerID="cbc2d80e7e756e93ed514bd09529210c2d50987e0f38ca0f0f28ae42fe9a1fc9" exitCode=0
Mar 09 14:10:47 crc kubenswrapper[4764]: I0309 14:10:47.346458 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"30f42bb2-6e26-4cbb-942b-a7d4ede4f128","Type":"ContainerDied","Data":"cbc2d80e7e756e93ed514bd09529210c2d50987e0f38ca0f0f28ae42fe9a1fc9"}
Mar 09 14:10:47 crc kubenswrapper[4764]: I0309 14:10:47.348238 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"182886b1-5569-456a-aa1e-129021e95bfe","Type":"ContainerStarted","Data":"c221b73ef18ea3e73a5026b139c9df4d59925e3fefafdd30786207edc3ef50b0"}
Mar 09 14:10:48 crc kubenswrapper[4764]: I0309 14:10:48.359670 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"182886b1-5569-456a-aa1e-129021e95bfe","Type":"ContainerStarted","Data":"4673bf510635fd12a6bc053c4b946a3d5cf647ce3ddfb9f8f5fa850ae40a11dc"}
Mar 09 14:10:49 crc kubenswrapper[4764]: I0309 14:10:49.150565 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-scheduler-0"
Mar 09 14:10:49 crc kubenswrapper[4764]: I0309 14:10:49.230183 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30f42bb2-6e26-4cbb-942b-a7d4ede4f128-combined-ca-bundle\") pod \"30f42bb2-6e26-4cbb-942b-a7d4ede4f128\" (UID: \"30f42bb2-6e26-4cbb-942b-a7d4ede4f128\") "
Mar 09 14:10:49 crc kubenswrapper[4764]: I0309 14:10:49.230348 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/30f42bb2-6e26-4cbb-942b-a7d4ede4f128-etc-machine-id\") pod \"30f42bb2-6e26-4cbb-942b-a7d4ede4f128\" (UID: \"30f42bb2-6e26-4cbb-942b-a7d4ede4f128\") "
Mar 09 14:10:49 crc kubenswrapper[4764]: I0309 14:10:49.230371 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/30f42bb2-6e26-4cbb-942b-a7d4ede4f128-scripts\") pod \"30f42bb2-6e26-4cbb-942b-a7d4ede4f128\" (UID: \"30f42bb2-6e26-4cbb-942b-a7d4ede4f128\") "
Mar 09 14:10:49 crc kubenswrapper[4764]: I0309 14:10:49.230403 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/30f42bb2-6e26-4cbb-942b-a7d4ede4f128-config-data\") pod \"30f42bb2-6e26-4cbb-942b-a7d4ede4f128\" (UID: \"30f42bb2-6e26-4cbb-942b-a7d4ede4f128\") "
Mar 09 14:10:49 crc kubenswrapper[4764]: I0309 14:10:49.230524 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lq2r9\" (UniqueName: \"kubernetes.io/projected/30f42bb2-6e26-4cbb-942b-a7d4ede4f128-kube-api-access-lq2r9\") pod \"30f42bb2-6e26-4cbb-942b-a7d4ede4f128\" (UID: \"30f42bb2-6e26-4cbb-942b-a7d4ede4f128\") "
Mar 09 14:10:49 crc kubenswrapper[4764]: I0309 14:10:49.230575 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/30f42bb2-6e26-4cbb-942b-a7d4ede4f128-config-data-custom\") pod \"30f42bb2-6e26-4cbb-942b-a7d4ede4f128\" (UID: \"30f42bb2-6e26-4cbb-942b-a7d4ede4f128\") "
Mar 09 14:10:49 crc kubenswrapper[4764]: I0309 14:10:49.231291 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/30f42bb2-6e26-4cbb-942b-a7d4ede4f128-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "30f42bb2-6e26-4cbb-942b-a7d4ede4f128" (UID: "30f42bb2-6e26-4cbb-942b-a7d4ede4f128"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 09 14:10:49 crc kubenswrapper[4764]: I0309 14:10:49.240880 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/30f42bb2-6e26-4cbb-942b-a7d4ede4f128-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "30f42bb2-6e26-4cbb-942b-a7d4ede4f128" (UID: "30f42bb2-6e26-4cbb-942b-a7d4ede4f128"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 14:10:49 crc kubenswrapper[4764]: I0309 14:10:49.241984 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/30f42bb2-6e26-4cbb-942b-a7d4ede4f128-kube-api-access-lq2r9" (OuterVolumeSpecName: "kube-api-access-lq2r9") pod "30f42bb2-6e26-4cbb-942b-a7d4ede4f128" (UID: "30f42bb2-6e26-4cbb-942b-a7d4ede4f128"). InnerVolumeSpecName "kube-api-access-lq2r9". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 14:10:49 crc kubenswrapper[4764]: I0309 14:10:49.252850 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/30f42bb2-6e26-4cbb-942b-a7d4ede4f128-scripts" (OuterVolumeSpecName: "scripts") pod "30f42bb2-6e26-4cbb-942b-a7d4ede4f128" (UID: "30f42bb2-6e26-4cbb-942b-a7d4ede4f128"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 14:10:49 crc kubenswrapper[4764]: I0309 14:10:49.334595 4764 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/30f42bb2-6e26-4cbb-942b-a7d4ede4f128-etc-machine-id\") on node \"crc\" DevicePath \"\""
Mar 09 14:10:49 crc kubenswrapper[4764]: I0309 14:10:49.334661 4764 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/30f42bb2-6e26-4cbb-942b-a7d4ede4f128-scripts\") on node \"crc\" DevicePath \"\""
Mar 09 14:10:49 crc kubenswrapper[4764]: I0309 14:10:49.334676 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lq2r9\" (UniqueName: \"kubernetes.io/projected/30f42bb2-6e26-4cbb-942b-a7d4ede4f128-kube-api-access-lq2r9\") on node \"crc\" DevicePath \"\""
Mar 09 14:10:49 crc kubenswrapper[4764]: I0309 14:10:49.334688 4764 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/30f42bb2-6e26-4cbb-942b-a7d4ede4f128-config-data-custom\") on node \"crc\" DevicePath \"\""
Mar 09 14:10:49 crc kubenswrapper[4764]: I0309 14:10:49.345899 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/30f42bb2-6e26-4cbb-942b-a7d4ede4f128-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "30f42bb2-6e26-4cbb-942b-a7d4ede4f128" (UID: "30f42bb2-6e26-4cbb-942b-a7d4ede4f128"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 14:10:49 crc kubenswrapper[4764]: I0309 14:10:49.373184 4764 generic.go:334] "Generic (PLEG): container finished" podID="30f42bb2-6e26-4cbb-942b-a7d4ede4f128" containerID="b743badbf09f2d2b0dc2fef97dfa9da34473a9bbc60350880026711252d7f4f7" exitCode=0
Mar 09 14:10:49 crc kubenswrapper[4764]: I0309 14:10:49.373280 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"30f42bb2-6e26-4cbb-942b-a7d4ede4f128","Type":"ContainerDied","Data":"b743badbf09f2d2b0dc2fef97dfa9da34473a9bbc60350880026711252d7f4f7"}
Mar 09 14:10:49 crc kubenswrapper[4764]: I0309 14:10:49.373299 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-scheduler-0"
Mar 09 14:10:49 crc kubenswrapper[4764]: I0309 14:10:49.373322 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"30f42bb2-6e26-4cbb-942b-a7d4ede4f128","Type":"ContainerDied","Data":"69e9c4821e7bd2b8ea5bd5a2a4ce9a1fe30fca8c9b901cad13802f0443f55e38"}
Mar 09 14:10:49 crc kubenswrapper[4764]: I0309 14:10:49.373348 4764 scope.go:117] "RemoveContainer" containerID="cbc2d80e7e756e93ed514bd09529210c2d50987e0f38ca0f0f28ae42fe9a1fc9"
Mar 09 14:10:49 crc kubenswrapper[4764]: I0309 14:10:49.379053 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"182886b1-5569-456a-aa1e-129021e95bfe","Type":"ContainerStarted","Data":"f7a1e9b9529e4d8555cc65fbd8d4b0c25e0e18ccde1cec73861b294fca116b0e"}
Mar 09 14:10:49 crc kubenswrapper[4764]: I0309 14:10:49.410019 4764 scope.go:117] "RemoveContainer" containerID="b743badbf09f2d2b0dc2fef97dfa9da34473a9bbc60350880026711252d7f4f7"
Mar 09 14:10:49 crc kubenswrapper[4764]: I0309 14:10:49.436682 4764 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30f42bb2-6e26-4cbb-942b-a7d4ede4f128-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 09 14:10:49 crc kubenswrapper[4764]: I0309 14:10:49.441576 4764 scope.go:117] "RemoveContainer" containerID="cbc2d80e7e756e93ed514bd09529210c2d50987e0f38ca0f0f28ae42fe9a1fc9"
Mar 09 14:10:49 crc kubenswrapper[4764]: E0309 14:10:49.442277 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cbc2d80e7e756e93ed514bd09529210c2d50987e0f38ca0f0f28ae42fe9a1fc9\": container with ID starting with cbc2d80e7e756e93ed514bd09529210c2d50987e0f38ca0f0f28ae42fe9a1fc9 not found: ID does not exist" containerID="cbc2d80e7e756e93ed514bd09529210c2d50987e0f38ca0f0f28ae42fe9a1fc9"
Mar 09 14:10:49 crc kubenswrapper[4764]: I0309 14:10:49.442323 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cbc2d80e7e756e93ed514bd09529210c2d50987e0f38ca0f0f28ae42fe9a1fc9"} err="failed to get container status \"cbc2d80e7e756e93ed514bd09529210c2d50987e0f38ca0f0f28ae42fe9a1fc9\": rpc error: code = NotFound desc = could not find container \"cbc2d80e7e756e93ed514bd09529210c2d50987e0f38ca0f0f28ae42fe9a1fc9\": container with ID starting with cbc2d80e7e756e93ed514bd09529210c2d50987e0f38ca0f0f28ae42fe9a1fc9 not found: ID does not exist"
Mar 09 14:10:49 crc kubenswrapper[4764]: I0309 14:10:49.442350 4764 scope.go:117] "RemoveContainer" containerID="b743badbf09f2d2b0dc2fef97dfa9da34473a9bbc60350880026711252d7f4f7"
Mar 09 14:10:49 crc kubenswrapper[4764]: E0309 14:10:49.442934 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b743badbf09f2d2b0dc2fef97dfa9da34473a9bbc60350880026711252d7f4f7\": container with ID starting with b743badbf09f2d2b0dc2fef97dfa9da34473a9bbc60350880026711252d7f4f7 not found: ID does not exist" containerID="b743badbf09f2d2b0dc2fef97dfa9da34473a9bbc60350880026711252d7f4f7"
Mar 09 14:10:49 crc kubenswrapper[4764]: I0309 14:10:49.443064 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b743badbf09f2d2b0dc2fef97dfa9da34473a9bbc60350880026711252d7f4f7"} err="failed to get container status \"b743badbf09f2d2b0dc2fef97dfa9da34473a9bbc60350880026711252d7f4f7\": rpc error: code = NotFound desc = could not find container \"b743badbf09f2d2b0dc2fef97dfa9da34473a9bbc60350880026711252d7f4f7\": container with ID starting with b743badbf09f2d2b0dc2fef97dfa9da34473a9bbc60350880026711252d7f4f7 not found: ID does not exist"
Mar 09 14:10:49 crc kubenswrapper[4764]: I0309 14:10:49.445905 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/30f42bb2-6e26-4cbb-942b-a7d4ede4f128-config-data" (OuterVolumeSpecName: "config-data") pod "30f42bb2-6e26-4cbb-942b-a7d4ede4f128" (UID: "30f42bb2-6e26-4cbb-942b-a7d4ede4f128"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 14:10:49 crc kubenswrapper[4764]: I0309 14:10:49.539174 4764 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/30f42bb2-6e26-4cbb-942b-a7d4ede4f128-config-data\") on node \"crc\" DevicePath \"\""
Mar 09 14:10:49 crc kubenswrapper[4764]: I0309 14:10:49.732531 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-scheduler-0"]
Mar 09 14:10:49 crc kubenswrapper[4764]: I0309 14:10:49.744783 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/manila-scheduler-0"]
Mar 09 14:10:49 crc kubenswrapper[4764]: I0309 14:10:49.755641 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-scheduler-0"]
Mar 09 14:10:49 crc kubenswrapper[4764]: E0309 14:10:49.756134 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="30f42bb2-6e26-4cbb-942b-a7d4ede4f128" containerName="probe"
Mar 09 14:10:49 crc kubenswrapper[4764]: I0309 14:10:49.756154 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="30f42bb2-6e26-4cbb-942b-a7d4ede4f128" containerName="probe"
Mar 09 14:10:49 crc kubenswrapper[4764]: E0309 14:10:49.756169 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="30f42bb2-6e26-4cbb-942b-a7d4ede4f128" containerName="manila-scheduler"
Mar 09 14:10:49 crc kubenswrapper[4764]: I0309 14:10:49.756176 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="30f42bb2-6e26-4cbb-942b-a7d4ede4f128" containerName="manila-scheduler"
Mar 09 14:10:49 crc kubenswrapper[4764]: I0309 14:10:49.756369 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="30f42bb2-6e26-4cbb-942b-a7d4ede4f128" containerName="manila-scheduler"
Mar 09 14:10:49 crc kubenswrapper[4764]: I0309 14:10:49.756384 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="30f42bb2-6e26-4cbb-942b-a7d4ede4f128" containerName="probe"
Mar 09 14:10:49 crc kubenswrapper[4764]: I0309 14:10:49.757515 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-scheduler-0"
Mar 09 14:10:49 crc kubenswrapper[4764]: I0309 14:10:49.761466 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-scheduler-config-data"
Mar 09 14:10:49 crc kubenswrapper[4764]: I0309 14:10:49.772123 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-scheduler-0"]
Mar 09 14:10:49 crc kubenswrapper[4764]: I0309 14:10:49.847334 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/492d78a8-09ea-4239-a53f-b8d0480fcf36-etc-machine-id\") pod \"manila-scheduler-0\" (UID: \"492d78a8-09ea-4239-a53f-b8d0480fcf36\") " pod="openstack/manila-scheduler-0"
Mar 09 14:10:49 crc kubenswrapper[4764]: I0309 14:10:49.847461 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/492d78a8-09ea-4239-a53f-b8d0480fcf36-combined-ca-bundle\") pod \"manila-scheduler-0\" (UID: \"492d78a8-09ea-4239-a53f-b8d0480fcf36\") " pod="openstack/manila-scheduler-0"
Mar 09 14:10:49 crc kubenswrapper[4764]: I0309 14:10:49.847687 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/492d78a8-09ea-4239-a53f-b8d0480fcf36-scripts\") pod \"manila-scheduler-0\" (UID: \"492d78a8-09ea-4239-a53f-b8d0480fcf36\") " pod="openstack/manila-scheduler-0"
Mar 09 14:10:49 crc kubenswrapper[4764]: I0309 14:10:49.847757 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/492d78a8-09ea-4239-a53f-b8d0480fcf36-config-data\") pod \"manila-scheduler-0\" (UID: \"492d78a8-09ea-4239-a53f-b8d0480fcf36\") " pod="openstack/manila-scheduler-0"
Mar 09 14:10:49 crc kubenswrapper[4764]: I0309 14:10:49.847957 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bsflw\" (UniqueName: \"kubernetes.io/projected/492d78a8-09ea-4239-a53f-b8d0480fcf36-kube-api-access-bsflw\") pod \"manila-scheduler-0\" (UID: \"492d78a8-09ea-4239-a53f-b8d0480fcf36\") " pod="openstack/manila-scheduler-0"
Mar 09 14:10:49 crc kubenswrapper[4764]: I0309 14:10:49.848120 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/492d78a8-09ea-4239-a53f-b8d0480fcf36-config-data-custom\") pod \"manila-scheduler-0\" (UID: \"492d78a8-09ea-4239-a53f-b8d0480fcf36\") " pod="openstack/manila-scheduler-0"
Mar 09 14:10:49 crc kubenswrapper[4764]: I0309 14:10:49.950762 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bsflw\" (UniqueName: \"kubernetes.io/projected/492d78a8-09ea-4239-a53f-b8d0480fcf36-kube-api-access-bsflw\") pod \"manila-scheduler-0\" (UID: \"492d78a8-09ea-4239-a53f-b8d0480fcf36\") " pod="openstack/manila-scheduler-0"
Mar 09 14:10:49 crc kubenswrapper[4764]: I0309 14:10:49.950951 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/492d78a8-09ea-4239-a53f-b8d0480fcf36-config-data-custom\") pod \"manila-scheduler-0\" (UID: \"492d78a8-09ea-4239-a53f-b8d0480fcf36\") " pod="openstack/manila-scheduler-0"
Mar 09 14:10:49 crc kubenswrapper[4764]: I0309 14:10:49.951022 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/492d78a8-09ea-4239-a53f-b8d0480fcf36-etc-machine-id\") pod \"manila-scheduler-0\" (UID: \"492d78a8-09ea-4239-a53f-b8d0480fcf36\") " pod="openstack/manila-scheduler-0"
Mar 09 14:10:49 crc kubenswrapper[4764]: I0309 14:10:49.951095 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/492d78a8-09ea-4239-a53f-b8d0480fcf36-combined-ca-bundle\") pod \"manila-scheduler-0\" (UID: \"492d78a8-09ea-4239-a53f-b8d0480fcf36\") " pod="openstack/manila-scheduler-0"
Mar 09 14:10:49 crc kubenswrapper[4764]: I0309 14:10:49.951153 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/492d78a8-09ea-4239-a53f-b8d0480fcf36-scripts\") pod \"manila-scheduler-0\" (UID: \"492d78a8-09ea-4239-a53f-b8d0480fcf36\") " pod="openstack/manila-scheduler-0"
Mar 09 14:10:49 crc kubenswrapper[4764]: I0309 14:10:49.951186 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/492d78a8-09ea-4239-a53f-b8d0480fcf36-config-data\") pod \"manila-scheduler-0\" (UID: \"492d78a8-09ea-4239-a53f-b8d0480fcf36\") " pod="openstack/manila-scheduler-0"
Mar 09 14:10:49 crc kubenswrapper[4764]: I0309 14:10:49.951961 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/492d78a8-09ea-4239-a53f-b8d0480fcf36-etc-machine-id\") pod \"manila-scheduler-0\" (UID: \"492d78a8-09ea-4239-a53f-b8d0480fcf36\") " pod="openstack/manila-scheduler-0"
Mar 09 14:10:49 crc kubenswrapper[4764]: I0309 14:10:49.956399 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/492d78a8-09ea-4239-a53f-b8d0480fcf36-combined-ca-bundle\") pod \"manila-scheduler-0\" (UID: \"492d78a8-09ea-4239-a53f-b8d0480fcf36\") " pod="openstack/manila-scheduler-0"
Mar 09 14:10:49 crc kubenswrapper[4764]: I0309 14:10:49.956430 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/492d78a8-09ea-4239-a53f-b8d0480fcf36-config-data-custom\") pod \"manila-scheduler-0\" (UID: \"492d78a8-09ea-4239-a53f-b8d0480fcf36\") " pod="openstack/manila-scheduler-0"
Mar 09 14:10:49 crc kubenswrapper[4764]: I0309 14:10:49.957541 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/492d78a8-09ea-4239-a53f-b8d0480fcf36-config-data\") pod \"manila-scheduler-0\" (UID: \"492d78a8-09ea-4239-a53f-b8d0480fcf36\") " pod="openstack/manila-scheduler-0"
Mar 09 14:10:49 crc kubenswrapper[4764]: I0309 14:10:49.961942 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/492d78a8-09ea-4239-a53f-b8d0480fcf36-scripts\") pod \"manila-scheduler-0\" (UID: \"492d78a8-09ea-4239-a53f-b8d0480fcf36\") " pod="openstack/manila-scheduler-0"
Mar 09 14:10:49 crc kubenswrapper[4764]: I0309 14:10:49.969178 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bsflw\" (UniqueName: \"kubernetes.io/projected/492d78a8-09ea-4239-a53f-b8d0480fcf36-kube-api-access-bsflw\") pod \"manila-scheduler-0\" (UID: \"492d78a8-09ea-4239-a53f-b8d0480fcf36\") " pod="openstack/manila-scheduler-0"
Mar 09 14:10:50 crc kubenswrapper[4764]: I0309 14:10:50.136102 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-scheduler-0"
Mar 09 14:10:50 crc kubenswrapper[4764]: I0309 14:10:50.473620 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-scheduler-0"]
Mar 09 14:10:50 crc kubenswrapper[4764]: W0309 14:10:50.481012 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod492d78a8_09ea_4239_a53f_b8d0480fcf36.slice/crio-df86a97d5a10593d6e7f1198c9631d358e374b4e5920b01ae0d48bd2b50454bb WatchSource:0}: Error finding container df86a97d5a10593d6e7f1198c9631d358e374b4e5920b01ae0d48bd2b50454bb: Status 404 returned error can't find the container with id df86a97d5a10593d6e7f1198c9631d358e374b4e5920b01ae0d48bd2b50454bb
Mar 09 14:10:51 crc kubenswrapper[4764]: I0309 14:10:51.408722 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"492d78a8-09ea-4239-a53f-b8d0480fcf36","Type":"ContainerStarted","Data":"bb4283f38719f6811bc261c4481786fa88f25ac364ff8b2ad91f4031e6d2e769"}
Mar 09 14:10:51 crc kubenswrapper[4764]: I0309 14:10:51.409560 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"492d78a8-09ea-4239-a53f-b8d0480fcf36","Type":"ContainerStarted","Data":"df86a97d5a10593d6e7f1198c9631d358e374b4e5920b01ae0d48bd2b50454bb"}
Mar 09 14:10:51 crc kubenswrapper[4764]: I0309 14:10:51.436457 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"182886b1-5569-456a-aa1e-129021e95bfe","Type":"ContainerStarted","Data":"962128e6d28446f61932e6679464215a3def7afa777dff5d7a16332a4480165a"}
Mar 09 14:10:51 crc kubenswrapper[4764]: I0309 14:10:51.438182 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0"
Mar 09 14:10:51 crc kubenswrapper[4764]: I0309 14:10:51.474030 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.7179550260000003 podStartE2EDuration="7.473998004s" podCreationTimestamp="2026-03-09 14:10:44 +0000 UTC" firstStartedPulling="2026-03-09 14:10:45.577503275 +0000 UTC m=+3000.827675183" lastFinishedPulling="2026-03-09 14:10:50.333546253 +0000 UTC m=+3005.583718161" observedRunningTime="2026-03-09 14:10:51.46035396 +0000 UTC m=+3006.710525878" watchObservedRunningTime="2026-03-09 14:10:51.473998004 +0000 UTC m=+3006.724169922"
Mar 09 14:10:51 crc kubenswrapper[4764]: I0309 14:10:51.573752 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="30f42bb2-6e26-4cbb-942b-a7d4ede4f128" path="/var/lib/kubelet/pods/30f42bb2-6e26-4cbb-942b-a7d4ede4f128/volumes"
Mar 09 14:10:51 crc kubenswrapper[4764]: I0309 14:10:51.795929 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/manila-api-0"
Mar 09 14:10:52 crc kubenswrapper[4764]: I0309 14:10:52.450790 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"492d78a8-09ea-4239-a53f-b8d0480fcf36","Type":"ContainerStarted","Data":"79b4c0474938cd83712d03dc98824c4915b148a402a45ba43f0f180e88656641"}
Mar 09 14:10:52 crc kubenswrapper[4764]: I0309 14:10:52.493711 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-scheduler-0" podStartSLOduration=3.493681621 podStartE2EDuration="3.493681621s" podCreationTimestamp="2026-03-09 14:10:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 14:10:52.478797703 +0000 UTC m=+3007.728969631" watchObservedRunningTime="2026-03-09 14:10:52.493681621 +0000 UTC m=+3007.743853559"
Mar 09 14:10:55 crc kubenswrapper[4764]: I0309 14:10:55.928246 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/manila-share-share1-0"
Mar 09 14:10:56 crc kubenswrapper[4764]: I0309 14:10:56.018765 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-share-share1-0"]
Mar 09 14:10:56 crc kubenswrapper[4764]: I0309 14:10:56.490755 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/manila-share-share1-0" podUID="604c4e24-15e4-43e1-b08c-74d8337a2e71" containerName="manila-share" containerID="cri-o://824266a7cc9cb5d7239bcaba8f3fa368c333aac554ec8e5c7b1cc64368838f39" gracePeriod=30
Mar 09 14:10:56 crc kubenswrapper[4764]: I0309 14:10:56.490807 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/manila-share-share1-0" podUID="604c4e24-15e4-43e1-b08c-74d8337a2e71" containerName="probe" containerID="cri-o://03c13ad847d2a8fd02eb6ed9a1ace4039b7b02e90605b4014e12bc57ed3abe55" gracePeriod=30
Mar 09 14:10:57 crc kubenswrapper[4764]: I0309 14:10:57.274126 4764 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-797d44c9b-wrlx7" podUID="e00fc104-ec73-4190-a598-86de7ca6cfa5" containerName="horizon" probeResult="failure" output="Get \"https://10.217.1.5:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.5:8443: connect: connection refused"
Mar 09 14:10:57 crc kubenswrapper[4764]: I0309 14:10:57.275057 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-797d44c9b-wrlx7"
Mar 09 14:10:57 crc kubenswrapper[4764]: I0309 14:10:57.525425 4764 generic.go:334] "Generic (PLEG): container finished" podID="604c4e24-15e4-43e1-b08c-74d8337a2e71" containerID="03c13ad847d2a8fd02eb6ed9a1ace4039b7b02e90605b4014e12bc57ed3abe55" exitCode=0
Mar 09 14:10:57 crc kubenswrapper[4764]: I0309 14:10:57.525466 4764 generic.go:334] "Generic (PLEG): container finished" podID="604c4e24-15e4-43e1-b08c-74d8337a2e71" containerID="824266a7cc9cb5d7239bcaba8f3fa368c333aac554ec8e5c7b1cc64368838f39" exitCode=1
Mar 09 14:10:57 crc kubenswrapper[4764]: I0309 14:10:57.525489 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"604c4e24-15e4-43e1-b08c-74d8337a2e71","Type":"ContainerDied","Data":"03c13ad847d2a8fd02eb6ed9a1ace4039b7b02e90605b4014e12bc57ed3abe55"}
Mar 09 14:10:57 crc kubenswrapper[4764]: I0309 14:10:57.525517 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"604c4e24-15e4-43e1-b08c-74d8337a2e71","Type":"ContainerDied","Data":"824266a7cc9cb5d7239bcaba8f3fa368c333aac554ec8e5c7b1cc64368838f39"}
Mar 09 14:10:57 crc kubenswrapper[4764]: I0309 14:10:57.560975 4764 scope.go:117] "RemoveContainer" containerID="085fb58e00f347ac089fd33d79299a3879c66993685cb2c39951f8d234eb8cfb"
Mar 09 14:10:57 crc kubenswrapper[4764]: E0309 14:10:57.561351 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xxczl_openshift-machine-config-operator(6bcdd179-43c2-427c-9fac-7155c122e922)\"" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922"
Mar 09 14:10:57 crc kubenswrapper[4764]: I0309 14:10:57.627005 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-share-share1-0"
Mar 09 14:10:57 crc kubenswrapper[4764]: I0309 14:10:57.752382 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/604c4e24-15e4-43e1-b08c-74d8337a2e71-config-data\") pod \"604c4e24-15e4-43e1-b08c-74d8337a2e71\" (UID: \"604c4e24-15e4-43e1-b08c-74d8337a2e71\") "
Mar 09 14:10:57 crc kubenswrapper[4764]: I0309 14:10:57.752469 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/604c4e24-15e4-43e1-b08c-74d8337a2e71-scripts\") pod \"604c4e24-15e4-43e1-b08c-74d8337a2e71\" (UID: \"604c4e24-15e4-43e1-b08c-74d8337a2e71\") "
Mar 09 14:10:57 crc kubenswrapper[4764]: I0309 14:10:57.752493 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxzkm\" (UniqueName: \"kubernetes.io/projected/604c4e24-15e4-43e1-b08c-74d8337a2e71-kube-api-access-wxzkm\") pod \"604c4e24-15e4-43e1-b08c-74d8337a2e71\" (UID: \"604c4e24-15e4-43e1-b08c-74d8337a2e71\") "
Mar 09 14:10:57 crc kubenswrapper[4764]: I0309 14:10:57.752593 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/604c4e24-15e4-43e1-b08c-74d8337a2e71-etc-machine-id\") pod \"604c4e24-15e4-43e1-b08c-74d8337a2e71\" (UID: \"604c4e24-15e4-43e1-b08c-74d8337a2e71\") "
Mar 09 14:10:57 crc kubenswrapper[4764]: I0309 14:10:57.752693 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/604c4e24-15e4-43e1-b08c-74d8337a2e71-ceph\") pod \"604c4e24-15e4-43e1-b08c-74d8337a2e71\" (UID: \"604c4e24-15e4-43e1-b08c-74d8337a2e71\") "
Mar 09 14:10:57 crc kubenswrapper[4764]: I0309 14:10:57.752860 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/604c4e24-15e4-43e1-b08c-74d8337a2e71-combined-ca-bundle\") pod \"604c4e24-15e4-43e1-b08c-74d8337a2e71\" (UID: \"604c4e24-15e4-43e1-b08c-74d8337a2e71\") "
Mar 09 14:10:57 crc kubenswrapper[4764]: I0309 14:10:57.752936 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/604c4e24-15e4-43e1-b08c-74d8337a2e71-config-data-custom\") pod \"604c4e24-15e4-43e1-b08c-74d8337a2e71\" (UID: \"604c4e24-15e4-43e1-b08c-74d8337a2e71\") "
Mar 09 14:10:57 crc kubenswrapper[4764]: I0309 14:10:57.752999 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/604c4e24-15e4-43e1-b08c-74d8337a2e71-var-lib-manila\") pod \"604c4e24-15e4-43e1-b08c-74d8337a2e71\" (UID: \"604c4e24-15e4-43e1-b08c-74d8337a2e71\") "
Mar 09 14:10:57 crc kubenswrapper[4764]: I0309 14:10:57.753784 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/604c4e24-15e4-43e1-b08c-74d8337a2e71-var-lib-manila" (OuterVolumeSpecName: "var-lib-manila") pod "604c4e24-15e4-43e1-b08c-74d8337a2e71" (UID: "604c4e24-15e4-43e1-b08c-74d8337a2e71"). InnerVolumeSpecName "var-lib-manila". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 09 14:10:57 crc kubenswrapper[4764]: I0309 14:10:57.754381 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/604c4e24-15e4-43e1-b08c-74d8337a2e71-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "604c4e24-15e4-43e1-b08c-74d8337a2e71" (UID: "604c4e24-15e4-43e1-b08c-74d8337a2e71"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 09 14:10:57 crc kubenswrapper[4764]: I0309 14:10:57.760832 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/604c4e24-15e4-43e1-b08c-74d8337a2e71-kube-api-access-wxzkm" (OuterVolumeSpecName: "kube-api-access-wxzkm") pod "604c4e24-15e4-43e1-b08c-74d8337a2e71" (UID: "604c4e24-15e4-43e1-b08c-74d8337a2e71"). InnerVolumeSpecName "kube-api-access-wxzkm". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 14:10:57 crc kubenswrapper[4764]: I0309 14:10:57.760908 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/604c4e24-15e4-43e1-b08c-74d8337a2e71-ceph" (OuterVolumeSpecName: "ceph") pod "604c4e24-15e4-43e1-b08c-74d8337a2e71" (UID: "604c4e24-15e4-43e1-b08c-74d8337a2e71"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 14:10:57 crc kubenswrapper[4764]: I0309 14:10:57.761421 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/604c4e24-15e4-43e1-b08c-74d8337a2e71-scripts" (OuterVolumeSpecName: "scripts") pod "604c4e24-15e4-43e1-b08c-74d8337a2e71" (UID: "604c4e24-15e4-43e1-b08c-74d8337a2e71"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 14:10:57 crc kubenswrapper[4764]: I0309 14:10:57.761531 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/604c4e24-15e4-43e1-b08c-74d8337a2e71-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "604c4e24-15e4-43e1-b08c-74d8337a2e71" (UID: "604c4e24-15e4-43e1-b08c-74d8337a2e71"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 14:10:57 crc kubenswrapper[4764]: I0309 14:10:57.816524 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/604c4e24-15e4-43e1-b08c-74d8337a2e71-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "604c4e24-15e4-43e1-b08c-74d8337a2e71" (UID: "604c4e24-15e4-43e1-b08c-74d8337a2e71"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 14:10:57 crc kubenswrapper[4764]: I0309 14:10:57.855001 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/604c4e24-15e4-43e1-b08c-74d8337a2e71-config-data" (OuterVolumeSpecName: "config-data") pod "604c4e24-15e4-43e1-b08c-74d8337a2e71" (UID: "604c4e24-15e4-43e1-b08c-74d8337a2e71"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 14:10:57 crc kubenswrapper[4764]: I0309 14:10:57.855676 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/604c4e24-15e4-43e1-b08c-74d8337a2e71-config-data\") pod \"604c4e24-15e4-43e1-b08c-74d8337a2e71\" (UID: \"604c4e24-15e4-43e1-b08c-74d8337a2e71\") "
Mar 09 14:10:57 crc kubenswrapper[4764]: W0309 14:10:57.855812 4764 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/604c4e24-15e4-43e1-b08c-74d8337a2e71/volumes/kubernetes.io~secret/config-data
Mar 09 14:10:57 crc kubenswrapper[4764]: I0309 14:10:57.855831 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/604c4e24-15e4-43e1-b08c-74d8337a2e71-config-data" (OuterVolumeSpecName: "config-data") pod "604c4e24-15e4-43e1-b08c-74d8337a2e71" (UID: "604c4e24-15e4-43e1-b08c-74d8337a2e71"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 14:10:57 crc kubenswrapper[4764]: I0309 14:10:57.858176 4764 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/604c4e24-15e4-43e1-b08c-74d8337a2e71-config-data\") on node \"crc\" DevicePath \"\""
Mar 09 14:10:57 crc kubenswrapper[4764]: I0309 14:10:57.858225 4764 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/604c4e24-15e4-43e1-b08c-74d8337a2e71-scripts\") on node \"crc\" DevicePath \"\""
Mar 09 14:10:57 crc kubenswrapper[4764]: I0309 14:10:57.858240 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxzkm\" (UniqueName: \"kubernetes.io/projected/604c4e24-15e4-43e1-b08c-74d8337a2e71-kube-api-access-wxzkm\") on node \"crc\" DevicePath \"\""
Mar 09 14:10:57 crc kubenswrapper[4764]: I0309 14:10:57.858254 4764 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/604c4e24-15e4-43e1-b08c-74d8337a2e71-etc-machine-id\") on node \"crc\" DevicePath \"\""
Mar 09 14:10:57 crc kubenswrapper[4764]: I0309 14:10:57.858264 4764 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/604c4e24-15e4-43e1-b08c-74d8337a2e71-ceph\") on node \"crc\" DevicePath \"\""
Mar 09 14:10:57 crc kubenswrapper[4764]: I0309 14:10:57.858274 4764 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/604c4e24-15e4-43e1-b08c-74d8337a2e71-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 09 14:10:57 crc kubenswrapper[4764]: I0309 14:10:57.858284 4764 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/604c4e24-15e4-43e1-b08c-74d8337a2e71-config-data-custom\") on node \"crc\" DevicePath \"\""
Mar 09 14:10:57 crc kubenswrapper[4764]: I0309 14:10:57.858294 4764 
reconciler_common.go:293] "Volume detached for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/604c4e24-15e4-43e1-b08c-74d8337a2e71-var-lib-manila\") on node \"crc\" DevicePath \"\"" Mar 09 14:10:58 crc kubenswrapper[4764]: I0309 14:10:58.542380 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"604c4e24-15e4-43e1-b08c-74d8337a2e71","Type":"ContainerDied","Data":"99bc2f9e9539dc58eaf52226466b27a3c1c886f9a53c573b32136461c86d2a63"} Mar 09 14:10:58 crc kubenswrapper[4764]: I0309 14:10:58.543769 4764 scope.go:117] "RemoveContainer" containerID="03c13ad847d2a8fd02eb6ed9a1ace4039b7b02e90605b4014e12bc57ed3abe55" Mar 09 14:10:58 crc kubenswrapper[4764]: I0309 14:10:58.542506 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-share-share1-0" Mar 09 14:10:58 crc kubenswrapper[4764]: I0309 14:10:58.584083 4764 scope.go:117] "RemoveContainer" containerID="824266a7cc9cb5d7239bcaba8f3fa368c333aac554ec8e5c7b1cc64368838f39" Mar 09 14:10:58 crc kubenswrapper[4764]: I0309 14:10:58.591298 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-share-share1-0"] Mar 09 14:10:58 crc kubenswrapper[4764]: I0309 14:10:58.600328 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/manila-share-share1-0"] Mar 09 14:10:58 crc kubenswrapper[4764]: I0309 14:10:58.623896 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-share-share1-0"] Mar 09 14:10:58 crc kubenswrapper[4764]: E0309 14:10:58.624370 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="604c4e24-15e4-43e1-b08c-74d8337a2e71" containerName="probe" Mar 09 14:10:58 crc kubenswrapper[4764]: I0309 14:10:58.624386 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="604c4e24-15e4-43e1-b08c-74d8337a2e71" containerName="probe" Mar 09 14:10:58 crc kubenswrapper[4764]: E0309 14:10:58.624420 4764 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="604c4e24-15e4-43e1-b08c-74d8337a2e71" containerName="manila-share" Mar 09 14:10:58 crc kubenswrapper[4764]: I0309 14:10:58.624426 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="604c4e24-15e4-43e1-b08c-74d8337a2e71" containerName="manila-share" Mar 09 14:10:58 crc kubenswrapper[4764]: I0309 14:10:58.624618 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="604c4e24-15e4-43e1-b08c-74d8337a2e71" containerName="probe" Mar 09 14:10:58 crc kubenswrapper[4764]: I0309 14:10:58.624665 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="604c4e24-15e4-43e1-b08c-74d8337a2e71" containerName="manila-share" Mar 09 14:10:58 crc kubenswrapper[4764]: I0309 14:10:58.626352 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-share-share1-0" Mar 09 14:10:58 crc kubenswrapper[4764]: I0309 14:10:58.632941 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-share-share1-config-data" Mar 09 14:10:58 crc kubenswrapper[4764]: I0309 14:10:58.656231 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-share-share1-0"] Mar 09 14:10:58 crc kubenswrapper[4764]: I0309 14:10:58.679375 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/2caafd00-b539-4f40-b1c6-af6957bcb458-var-lib-manila\") pod \"manila-share-share1-0\" (UID: \"2caafd00-b539-4f40-b1c6-af6957bcb458\") " pod="openstack/manila-share-share1-0" Mar 09 14:10:58 crc kubenswrapper[4764]: I0309 14:10:58.679446 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2caafd00-b539-4f40-b1c6-af6957bcb458-config-data\") pod \"manila-share-share1-0\" (UID: \"2caafd00-b539-4f40-b1c6-af6957bcb458\") " pod="openstack/manila-share-share1-0" Mar 09 14:10:58 crc 
kubenswrapper[4764]: I0309 14:10:58.679474 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2caafd00-b539-4f40-b1c6-af6957bcb458-scripts\") pod \"manila-share-share1-0\" (UID: \"2caafd00-b539-4f40-b1c6-af6957bcb458\") " pod="openstack/manila-share-share1-0" Mar 09 14:10:58 crc kubenswrapper[4764]: I0309 14:10:58.679559 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-crvf4\" (UniqueName: \"kubernetes.io/projected/2caafd00-b539-4f40-b1c6-af6957bcb458-kube-api-access-crvf4\") pod \"manila-share-share1-0\" (UID: \"2caafd00-b539-4f40-b1c6-af6957bcb458\") " pod="openstack/manila-share-share1-0" Mar 09 14:10:58 crc kubenswrapper[4764]: I0309 14:10:58.679593 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2caafd00-b539-4f40-b1c6-af6957bcb458-config-data-custom\") pod \"manila-share-share1-0\" (UID: \"2caafd00-b539-4f40-b1c6-af6957bcb458\") " pod="openstack/manila-share-share1-0" Mar 09 14:10:58 crc kubenswrapper[4764]: I0309 14:10:58.679627 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2caafd00-b539-4f40-b1c6-af6957bcb458-combined-ca-bundle\") pod \"manila-share-share1-0\" (UID: \"2caafd00-b539-4f40-b1c6-af6957bcb458\") " pod="openstack/manila-share-share1-0" Mar 09 14:10:58 crc kubenswrapper[4764]: I0309 14:10:58.679689 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/2caafd00-b539-4f40-b1c6-af6957bcb458-ceph\") pod \"manila-share-share1-0\" (UID: \"2caafd00-b539-4f40-b1c6-af6957bcb458\") " pod="openstack/manila-share-share1-0" Mar 09 14:10:58 crc kubenswrapper[4764]: I0309 14:10:58.679723 
4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2caafd00-b539-4f40-b1c6-af6957bcb458-etc-machine-id\") pod \"manila-share-share1-0\" (UID: \"2caafd00-b539-4f40-b1c6-af6957bcb458\") " pod="openstack/manila-share-share1-0" Mar 09 14:10:58 crc kubenswrapper[4764]: I0309 14:10:58.782166 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2caafd00-b539-4f40-b1c6-af6957bcb458-config-data\") pod \"manila-share-share1-0\" (UID: \"2caafd00-b539-4f40-b1c6-af6957bcb458\") " pod="openstack/manila-share-share1-0" Mar 09 14:10:58 crc kubenswrapper[4764]: I0309 14:10:58.782259 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2caafd00-b539-4f40-b1c6-af6957bcb458-scripts\") pod \"manila-share-share1-0\" (UID: \"2caafd00-b539-4f40-b1c6-af6957bcb458\") " pod="openstack/manila-share-share1-0" Mar 09 14:10:58 crc kubenswrapper[4764]: I0309 14:10:58.782361 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-crvf4\" (UniqueName: \"kubernetes.io/projected/2caafd00-b539-4f40-b1c6-af6957bcb458-kube-api-access-crvf4\") pod \"manila-share-share1-0\" (UID: \"2caafd00-b539-4f40-b1c6-af6957bcb458\") " pod="openstack/manila-share-share1-0" Mar 09 14:10:58 crc kubenswrapper[4764]: I0309 14:10:58.782409 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2caafd00-b539-4f40-b1c6-af6957bcb458-config-data-custom\") pod \"manila-share-share1-0\" (UID: \"2caafd00-b539-4f40-b1c6-af6957bcb458\") " pod="openstack/manila-share-share1-0" Mar 09 14:10:58 crc kubenswrapper[4764]: I0309 14:10:58.782453 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/2caafd00-b539-4f40-b1c6-af6957bcb458-combined-ca-bundle\") pod \"manila-share-share1-0\" (UID: \"2caafd00-b539-4f40-b1c6-af6957bcb458\") " pod="openstack/manila-share-share1-0" Mar 09 14:10:58 crc kubenswrapper[4764]: I0309 14:10:58.783416 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/2caafd00-b539-4f40-b1c6-af6957bcb458-ceph\") pod \"manila-share-share1-0\" (UID: \"2caafd00-b539-4f40-b1c6-af6957bcb458\") " pod="openstack/manila-share-share1-0" Mar 09 14:10:58 crc kubenswrapper[4764]: I0309 14:10:58.783459 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2caafd00-b539-4f40-b1c6-af6957bcb458-etc-machine-id\") pod \"manila-share-share1-0\" (UID: \"2caafd00-b539-4f40-b1c6-af6957bcb458\") " pod="openstack/manila-share-share1-0" Mar 09 14:10:58 crc kubenswrapper[4764]: I0309 14:10:58.783578 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/2caafd00-b539-4f40-b1c6-af6957bcb458-var-lib-manila\") pod \"manila-share-share1-0\" (UID: \"2caafd00-b539-4f40-b1c6-af6957bcb458\") " pod="openstack/manila-share-share1-0" Mar 09 14:10:58 crc kubenswrapper[4764]: I0309 14:10:58.783790 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/2caafd00-b539-4f40-b1c6-af6957bcb458-var-lib-manila\") pod \"manila-share-share1-0\" (UID: \"2caafd00-b539-4f40-b1c6-af6957bcb458\") " pod="openstack/manila-share-share1-0" Mar 09 14:10:58 crc kubenswrapper[4764]: I0309 14:10:58.783887 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2caafd00-b539-4f40-b1c6-af6957bcb458-etc-machine-id\") pod \"manila-share-share1-0\" (UID: 
\"2caafd00-b539-4f40-b1c6-af6957bcb458\") " pod="openstack/manila-share-share1-0" Mar 09 14:10:58 crc kubenswrapper[4764]: I0309 14:10:58.787338 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2caafd00-b539-4f40-b1c6-af6957bcb458-scripts\") pod \"manila-share-share1-0\" (UID: \"2caafd00-b539-4f40-b1c6-af6957bcb458\") " pod="openstack/manila-share-share1-0" Mar 09 14:10:58 crc kubenswrapper[4764]: I0309 14:10:58.787455 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/2caafd00-b539-4f40-b1c6-af6957bcb458-ceph\") pod \"manila-share-share1-0\" (UID: \"2caafd00-b539-4f40-b1c6-af6957bcb458\") " pod="openstack/manila-share-share1-0" Mar 09 14:10:58 crc kubenswrapper[4764]: I0309 14:10:58.789028 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2caafd00-b539-4f40-b1c6-af6957bcb458-combined-ca-bundle\") pod \"manila-share-share1-0\" (UID: \"2caafd00-b539-4f40-b1c6-af6957bcb458\") " pod="openstack/manila-share-share1-0" Mar 09 14:10:58 crc kubenswrapper[4764]: I0309 14:10:58.796614 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2caafd00-b539-4f40-b1c6-af6957bcb458-config-data\") pod \"manila-share-share1-0\" (UID: \"2caafd00-b539-4f40-b1c6-af6957bcb458\") " pod="openstack/manila-share-share1-0" Mar 09 14:10:58 crc kubenswrapper[4764]: I0309 14:10:58.805545 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2caafd00-b539-4f40-b1c6-af6957bcb458-config-data-custom\") pod \"manila-share-share1-0\" (UID: \"2caafd00-b539-4f40-b1c6-af6957bcb458\") " pod="openstack/manila-share-share1-0" Mar 09 14:10:58 crc kubenswrapper[4764]: I0309 14:10:58.805848 4764 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-crvf4\" (UniqueName: \"kubernetes.io/projected/2caafd00-b539-4f40-b1c6-af6957bcb458-kube-api-access-crvf4\") pod \"manila-share-share1-0\" (UID: \"2caafd00-b539-4f40-b1c6-af6957bcb458\") " pod="openstack/manila-share-share1-0" Mar 09 14:10:58 crc kubenswrapper[4764]: I0309 14:10:58.950835 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-share-share1-0" Mar 09 14:10:59 crc kubenswrapper[4764]: I0309 14:10:59.526265 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-share-share1-0"] Mar 09 14:10:59 crc kubenswrapper[4764]: I0309 14:10:59.601585 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="604c4e24-15e4-43e1-b08c-74d8337a2e71" path="/var/lib/kubelet/pods/604c4e24-15e4-43e1-b08c-74d8337a2e71/volumes" Mar 09 14:10:59 crc kubenswrapper[4764]: I0309 14:10:59.604221 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"2caafd00-b539-4f40-b1c6-af6957bcb458","Type":"ContainerStarted","Data":"f7fd9e8bd13abf9c5c2923dfa95c2915e4ea26dfb1c7b93494c3dd27e85dab7e"} Mar 09 14:11:00 crc kubenswrapper[4764]: I0309 14:11:00.137180 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/manila-scheduler-0" Mar 09 14:11:00 crc kubenswrapper[4764]: I0309 14:11:00.610390 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"2caafd00-b539-4f40-b1c6-af6957bcb458","Type":"ContainerStarted","Data":"6c988ca090ac7db06d24d10df0cb1762a1a1c8f2fe2b55bd53a6bf46db54f750"} Mar 09 14:11:01 crc kubenswrapper[4764]: I0309 14:11:01.624903 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"2caafd00-b539-4f40-b1c6-af6957bcb458","Type":"ContainerStarted","Data":"2cfe387aadda7534e64755323f99bba27961b51d18e37a4c5e9977c9fdc6d4ee"} Mar 09 14:11:01 crc kubenswrapper[4764]: I0309 
14:11:01.653545 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-share-share1-0" podStartSLOduration=3.65345204 podStartE2EDuration="3.65345204s" podCreationTimestamp="2026-03-09 14:10:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 14:11:01.652360871 +0000 UTC m=+3016.902532779" watchObservedRunningTime="2026-03-09 14:11:01.65345204 +0000 UTC m=+3016.903623948" Mar 09 14:11:01 crc kubenswrapper[4764]: I0309 14:11:01.839921 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/manila-scheduler-0" Mar 09 14:11:04 crc kubenswrapper[4764]: I0309 14:11:04.577811 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-797d44c9b-wrlx7" Mar 09 14:11:04 crc kubenswrapper[4764]: I0309 14:11:04.657515 4764 generic.go:334] "Generic (PLEG): container finished" podID="e00fc104-ec73-4190-a598-86de7ca6cfa5" containerID="ed7bd666dbaa56d7a77980118a8739a44cafc9a966ea8469e78e592a4a49ac96" exitCode=137 Mar 09 14:11:04 crc kubenswrapper[4764]: I0309 14:11:04.657577 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-797d44c9b-wrlx7" event={"ID":"e00fc104-ec73-4190-a598-86de7ca6cfa5","Type":"ContainerDied","Data":"ed7bd666dbaa56d7a77980118a8739a44cafc9a966ea8469e78e592a4a49ac96"} Mar 09 14:11:04 crc kubenswrapper[4764]: I0309 14:11:04.657617 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-797d44c9b-wrlx7" event={"ID":"e00fc104-ec73-4190-a598-86de7ca6cfa5","Type":"ContainerDied","Data":"183964e4224d34ffefb68047495776b87e734441e831507181f1e3aafc498e51"} Mar 09 14:11:04 crc kubenswrapper[4764]: I0309 14:11:04.657686 4764 scope.go:117] "RemoveContainer" containerID="c791eb4afad3d3b7319a4ecbb9ad7314d04c8eb6a4954765b916e231a57d04c0" Mar 09 14:11:04 crc kubenswrapper[4764]: I0309 14:11:04.657887 4764 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-797d44c9b-wrlx7" Mar 09 14:11:04 crc kubenswrapper[4764]: I0309 14:11:04.684714 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9l5bd\" (UniqueName: \"kubernetes.io/projected/e00fc104-ec73-4190-a598-86de7ca6cfa5-kube-api-access-9l5bd\") pod \"e00fc104-ec73-4190-a598-86de7ca6cfa5\" (UID: \"e00fc104-ec73-4190-a598-86de7ca6cfa5\") " Mar 09 14:11:04 crc kubenswrapper[4764]: I0309 14:11:04.684781 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e00fc104-ec73-4190-a598-86de7ca6cfa5-config-data\") pod \"e00fc104-ec73-4190-a598-86de7ca6cfa5\" (UID: \"e00fc104-ec73-4190-a598-86de7ca6cfa5\") " Mar 09 14:11:04 crc kubenswrapper[4764]: I0309 14:11:04.684847 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/e00fc104-ec73-4190-a598-86de7ca6cfa5-horizon-secret-key\") pod \"e00fc104-ec73-4190-a598-86de7ca6cfa5\" (UID: \"e00fc104-ec73-4190-a598-86de7ca6cfa5\") " Mar 09 14:11:04 crc kubenswrapper[4764]: I0309 14:11:04.684953 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e00fc104-ec73-4190-a598-86de7ca6cfa5-logs\") pod \"e00fc104-ec73-4190-a598-86de7ca6cfa5\" (UID: \"e00fc104-ec73-4190-a598-86de7ca6cfa5\") " Mar 09 14:11:04 crc kubenswrapper[4764]: I0309 14:11:04.685111 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e00fc104-ec73-4190-a598-86de7ca6cfa5-scripts\") pod \"e00fc104-ec73-4190-a598-86de7ca6cfa5\" (UID: \"e00fc104-ec73-4190-a598-86de7ca6cfa5\") " Mar 09 14:11:04 crc kubenswrapper[4764]: I0309 14:11:04.685132 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/e00fc104-ec73-4190-a598-86de7ca6cfa5-horizon-tls-certs\") pod \"e00fc104-ec73-4190-a598-86de7ca6cfa5\" (UID: \"e00fc104-ec73-4190-a598-86de7ca6cfa5\") " Mar 09 14:11:04 crc kubenswrapper[4764]: I0309 14:11:04.685172 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e00fc104-ec73-4190-a598-86de7ca6cfa5-combined-ca-bundle\") pod \"e00fc104-ec73-4190-a598-86de7ca6cfa5\" (UID: \"e00fc104-ec73-4190-a598-86de7ca6cfa5\") " Mar 09 14:11:04 crc kubenswrapper[4764]: I0309 14:11:04.687189 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e00fc104-ec73-4190-a598-86de7ca6cfa5-logs" (OuterVolumeSpecName: "logs") pod "e00fc104-ec73-4190-a598-86de7ca6cfa5" (UID: "e00fc104-ec73-4190-a598-86de7ca6cfa5"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 14:11:04 crc kubenswrapper[4764]: I0309 14:11:04.692315 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e00fc104-ec73-4190-a598-86de7ca6cfa5-kube-api-access-9l5bd" (OuterVolumeSpecName: "kube-api-access-9l5bd") pod "e00fc104-ec73-4190-a598-86de7ca6cfa5" (UID: "e00fc104-ec73-4190-a598-86de7ca6cfa5"). InnerVolumeSpecName "kube-api-access-9l5bd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:11:04 crc kubenswrapper[4764]: I0309 14:11:04.694690 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e00fc104-ec73-4190-a598-86de7ca6cfa5-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "e00fc104-ec73-4190-a598-86de7ca6cfa5" (UID: "e00fc104-ec73-4190-a598-86de7ca6cfa5"). InnerVolumeSpecName "horizon-secret-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:11:04 crc kubenswrapper[4764]: I0309 14:11:04.714302 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e00fc104-ec73-4190-a598-86de7ca6cfa5-scripts" (OuterVolumeSpecName: "scripts") pod "e00fc104-ec73-4190-a598-86de7ca6cfa5" (UID: "e00fc104-ec73-4190-a598-86de7ca6cfa5"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 14:11:04 crc kubenswrapper[4764]: I0309 14:11:04.715209 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e00fc104-ec73-4190-a598-86de7ca6cfa5-config-data" (OuterVolumeSpecName: "config-data") pod "e00fc104-ec73-4190-a598-86de7ca6cfa5" (UID: "e00fc104-ec73-4190-a598-86de7ca6cfa5"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 14:11:04 crc kubenswrapper[4764]: I0309 14:11:04.719932 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e00fc104-ec73-4190-a598-86de7ca6cfa5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e00fc104-ec73-4190-a598-86de7ca6cfa5" (UID: "e00fc104-ec73-4190-a598-86de7ca6cfa5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:11:04 crc kubenswrapper[4764]: I0309 14:11:04.744218 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e00fc104-ec73-4190-a598-86de7ca6cfa5-horizon-tls-certs" (OuterVolumeSpecName: "horizon-tls-certs") pod "e00fc104-ec73-4190-a598-86de7ca6cfa5" (UID: "e00fc104-ec73-4190-a598-86de7ca6cfa5"). InnerVolumeSpecName "horizon-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:11:04 crc kubenswrapper[4764]: I0309 14:11:04.795556 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9l5bd\" (UniqueName: \"kubernetes.io/projected/e00fc104-ec73-4190-a598-86de7ca6cfa5-kube-api-access-9l5bd\") on node \"crc\" DevicePath \"\"" Mar 09 14:11:04 crc kubenswrapper[4764]: I0309 14:11:04.795615 4764 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e00fc104-ec73-4190-a598-86de7ca6cfa5-config-data\") on node \"crc\" DevicePath \"\"" Mar 09 14:11:04 crc kubenswrapper[4764]: I0309 14:11:04.795633 4764 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/e00fc104-ec73-4190-a598-86de7ca6cfa5-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Mar 09 14:11:04 crc kubenswrapper[4764]: I0309 14:11:04.795669 4764 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e00fc104-ec73-4190-a598-86de7ca6cfa5-logs\") on node \"crc\" DevicePath \"\"" Mar 09 14:11:04 crc kubenswrapper[4764]: I0309 14:11:04.795685 4764 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e00fc104-ec73-4190-a598-86de7ca6cfa5-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 14:11:04 crc kubenswrapper[4764]: I0309 14:11:04.795695 4764 reconciler_common.go:293] "Volume detached for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/e00fc104-ec73-4190-a598-86de7ca6cfa5-horizon-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 09 14:11:04 crc kubenswrapper[4764]: I0309 14:11:04.795713 4764 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e00fc104-ec73-4190-a598-86de7ca6cfa5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 14:11:04 crc kubenswrapper[4764]: I0309 14:11:04.822001 4764 scope.go:117] 
"RemoveContainer" containerID="ed7bd666dbaa56d7a77980118a8739a44cafc9a966ea8469e78e592a4a49ac96" Mar 09 14:11:04 crc kubenswrapper[4764]: I0309 14:11:04.849428 4764 scope.go:117] "RemoveContainer" containerID="c791eb4afad3d3b7319a4ecbb9ad7314d04c8eb6a4954765b916e231a57d04c0" Mar 09 14:11:04 crc kubenswrapper[4764]: E0309 14:11:04.849888 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c791eb4afad3d3b7319a4ecbb9ad7314d04c8eb6a4954765b916e231a57d04c0\": container with ID starting with c791eb4afad3d3b7319a4ecbb9ad7314d04c8eb6a4954765b916e231a57d04c0 not found: ID does not exist" containerID="c791eb4afad3d3b7319a4ecbb9ad7314d04c8eb6a4954765b916e231a57d04c0" Mar 09 14:11:04 crc kubenswrapper[4764]: I0309 14:11:04.849928 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c791eb4afad3d3b7319a4ecbb9ad7314d04c8eb6a4954765b916e231a57d04c0"} err="failed to get container status \"c791eb4afad3d3b7319a4ecbb9ad7314d04c8eb6a4954765b916e231a57d04c0\": rpc error: code = NotFound desc = could not find container \"c791eb4afad3d3b7319a4ecbb9ad7314d04c8eb6a4954765b916e231a57d04c0\": container with ID starting with c791eb4afad3d3b7319a4ecbb9ad7314d04c8eb6a4954765b916e231a57d04c0 not found: ID does not exist" Mar 09 14:11:04 crc kubenswrapper[4764]: I0309 14:11:04.849952 4764 scope.go:117] "RemoveContainer" containerID="ed7bd666dbaa56d7a77980118a8739a44cafc9a966ea8469e78e592a4a49ac96" Mar 09 14:11:04 crc kubenswrapper[4764]: E0309 14:11:04.850226 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ed7bd666dbaa56d7a77980118a8739a44cafc9a966ea8469e78e592a4a49ac96\": container with ID starting with ed7bd666dbaa56d7a77980118a8739a44cafc9a966ea8469e78e592a4a49ac96 not found: ID does not exist" containerID="ed7bd666dbaa56d7a77980118a8739a44cafc9a966ea8469e78e592a4a49ac96" Mar 09 14:11:04 crc 
kubenswrapper[4764]: I0309 14:11:04.850253 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ed7bd666dbaa56d7a77980118a8739a44cafc9a966ea8469e78e592a4a49ac96"} err="failed to get container status \"ed7bd666dbaa56d7a77980118a8739a44cafc9a966ea8469e78e592a4a49ac96\": rpc error: code = NotFound desc = could not find container \"ed7bd666dbaa56d7a77980118a8739a44cafc9a966ea8469e78e592a4a49ac96\": container with ID starting with ed7bd666dbaa56d7a77980118a8739a44cafc9a966ea8469e78e592a4a49ac96 not found: ID does not exist" Mar 09 14:11:05 crc kubenswrapper[4764]: I0309 14:11:05.000673 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-797d44c9b-wrlx7"] Mar 09 14:11:05 crc kubenswrapper[4764]: I0309 14:11:05.010620 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-797d44c9b-wrlx7"] Mar 09 14:11:05 crc kubenswrapper[4764]: I0309 14:11:05.572999 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e00fc104-ec73-4190-a598-86de7ca6cfa5" path="/var/lib/kubelet/pods/e00fc104-ec73-4190-a598-86de7ca6cfa5/volumes" Mar 09 14:11:08 crc kubenswrapper[4764]: I0309 14:11:08.951842 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/manila-share-share1-0" Mar 09 14:11:09 crc kubenswrapper[4764]: I0309 14:11:09.929824 4764 scope.go:117] "RemoveContainer" containerID="0b92f70675fc36bf2d871b5244dabd98b74bc41c0324eee9ebebf35ece42f834" Mar 09 14:11:09 crc kubenswrapper[4764]: I0309 14:11:09.958363 4764 scope.go:117] "RemoveContainer" containerID="3b50e9b2a9f264bfcb22b787b33fb1bad8b4dde68292e87c3d306716bd5422e2" Mar 09 14:11:11 crc kubenswrapper[4764]: I0309 14:11:11.560622 4764 scope.go:117] "RemoveContainer" containerID="085fb58e00f347ac089fd33d79299a3879c66993685cb2c39951f8d234eb8cfb" Mar 09 14:11:11 crc kubenswrapper[4764]: E0309 14:11:11.561364 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xxczl_openshift-machine-config-operator(6bcdd179-43c2-427c-9fac-7155c122e922)\"" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922" Mar 09 14:11:15 crc kubenswrapper[4764]: I0309 14:11:15.047793 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Mar 09 14:11:20 crc kubenswrapper[4764]: I0309 14:11:20.544151 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/manila-share-share1-0" Mar 09 14:11:25 crc kubenswrapper[4764]: I0309 14:11:25.566405 4764 scope.go:117] "RemoveContainer" containerID="085fb58e00f347ac089fd33d79299a3879c66993685cb2c39951f8d234eb8cfb" Mar 09 14:11:25 crc kubenswrapper[4764]: E0309 14:11:25.568860 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xxczl_openshift-machine-config-operator(6bcdd179-43c2-427c-9fac-7155c122e922)\"" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922" Mar 09 14:11:40 crc kubenswrapper[4764]: I0309 14:11:40.560627 4764 scope.go:117] "RemoveContainer" containerID="085fb58e00f347ac089fd33d79299a3879c66993685cb2c39951f8d234eb8cfb" Mar 09 14:11:40 crc kubenswrapper[4764]: E0309 14:11:40.561954 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xxczl_openshift-machine-config-operator(6bcdd179-43c2-427c-9fac-7155c122e922)\"" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" 
podUID="6bcdd179-43c2-427c-9fac-7155c122e922" Mar 09 14:11:53 crc kubenswrapper[4764]: I0309 14:11:53.560116 4764 scope.go:117] "RemoveContainer" containerID="085fb58e00f347ac089fd33d79299a3879c66993685cb2c39951f8d234eb8cfb" Mar 09 14:11:53 crc kubenswrapper[4764]: E0309 14:11:53.561839 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xxczl_openshift-machine-config-operator(6bcdd179-43c2-427c-9fac-7155c122e922)\"" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922" Mar 09 14:12:00 crc kubenswrapper[4764]: I0309 14:12:00.158324 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29551092-qs44j"] Mar 09 14:12:00 crc kubenswrapper[4764]: E0309 14:12:00.159898 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e00fc104-ec73-4190-a598-86de7ca6cfa5" containerName="horizon" Mar 09 14:12:00 crc kubenswrapper[4764]: I0309 14:12:00.159930 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="e00fc104-ec73-4190-a598-86de7ca6cfa5" containerName="horizon" Mar 09 14:12:00 crc kubenswrapper[4764]: E0309 14:12:00.159969 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e00fc104-ec73-4190-a598-86de7ca6cfa5" containerName="horizon-log" Mar 09 14:12:00 crc kubenswrapper[4764]: I0309 14:12:00.159981 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="e00fc104-ec73-4190-a598-86de7ca6cfa5" containerName="horizon-log" Mar 09 14:12:00 crc kubenswrapper[4764]: I0309 14:12:00.160371 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="e00fc104-ec73-4190-a598-86de7ca6cfa5" containerName="horizon-log" Mar 09 14:12:00 crc kubenswrapper[4764]: I0309 14:12:00.160392 4764 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="e00fc104-ec73-4190-a598-86de7ca6cfa5" containerName="horizon" Mar 09 14:12:00 crc kubenswrapper[4764]: I0309 14:12:00.161607 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551092-qs44j" Mar 09 14:12:00 crc kubenswrapper[4764]: I0309 14:12:00.164345 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 09 14:12:00 crc kubenswrapper[4764]: I0309 14:12:00.164633 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-4s7mr" Mar 09 14:12:00 crc kubenswrapper[4764]: I0309 14:12:00.169467 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 09 14:12:00 crc kubenswrapper[4764]: I0309 14:12:00.173546 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551092-qs44j"] Mar 09 14:12:00 crc kubenswrapper[4764]: I0309 14:12:00.265501 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5vpzk\" (UniqueName: \"kubernetes.io/projected/81a7f588-07b1-4ef1-97ee-420e944ad16b-kube-api-access-5vpzk\") pod \"auto-csr-approver-29551092-qs44j\" (UID: \"81a7f588-07b1-4ef1-97ee-420e944ad16b\") " pod="openshift-infra/auto-csr-approver-29551092-qs44j" Mar 09 14:12:00 crc kubenswrapper[4764]: I0309 14:12:00.367874 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5vpzk\" (UniqueName: \"kubernetes.io/projected/81a7f588-07b1-4ef1-97ee-420e944ad16b-kube-api-access-5vpzk\") pod \"auto-csr-approver-29551092-qs44j\" (UID: \"81a7f588-07b1-4ef1-97ee-420e944ad16b\") " pod="openshift-infra/auto-csr-approver-29551092-qs44j" Mar 09 14:12:00 crc kubenswrapper[4764]: I0309 14:12:00.389878 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5vpzk\" (UniqueName: 
\"kubernetes.io/projected/81a7f588-07b1-4ef1-97ee-420e944ad16b-kube-api-access-5vpzk\") pod \"auto-csr-approver-29551092-qs44j\" (UID: \"81a7f588-07b1-4ef1-97ee-420e944ad16b\") " pod="openshift-infra/auto-csr-approver-29551092-qs44j" Mar 09 14:12:00 crc kubenswrapper[4764]: I0309 14:12:00.493250 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551092-qs44j" Mar 09 14:12:00 crc kubenswrapper[4764]: I0309 14:12:00.974293 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551092-qs44j"] Mar 09 14:12:01 crc kubenswrapper[4764]: I0309 14:12:01.270125 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551092-qs44j" event={"ID":"81a7f588-07b1-4ef1-97ee-420e944ad16b","Type":"ContainerStarted","Data":"0757244299baed248a05242137e929ee6d2c210e50d4f4fe38f70f73594886c2"} Mar 09 14:12:03 crc kubenswrapper[4764]: I0309 14:12:03.297250 4764 generic.go:334] "Generic (PLEG): container finished" podID="81a7f588-07b1-4ef1-97ee-420e944ad16b" containerID="4148e832dbd4302f41c6dbad3eab96d625f5a7299088cd33baf2acb47b5d3267" exitCode=0 Mar 09 14:12:03 crc kubenswrapper[4764]: I0309 14:12:03.297333 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551092-qs44j" event={"ID":"81a7f588-07b1-4ef1-97ee-420e944ad16b","Type":"ContainerDied","Data":"4148e832dbd4302f41c6dbad3eab96d625f5a7299088cd33baf2acb47b5d3267"} Mar 09 14:12:04 crc kubenswrapper[4764]: I0309 14:12:04.705769 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551092-qs44j" Mar 09 14:12:04 crc kubenswrapper[4764]: I0309 14:12:04.881581 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5vpzk\" (UniqueName: \"kubernetes.io/projected/81a7f588-07b1-4ef1-97ee-420e944ad16b-kube-api-access-5vpzk\") pod \"81a7f588-07b1-4ef1-97ee-420e944ad16b\" (UID: \"81a7f588-07b1-4ef1-97ee-420e944ad16b\") " Mar 09 14:12:04 crc kubenswrapper[4764]: I0309 14:12:04.889447 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/81a7f588-07b1-4ef1-97ee-420e944ad16b-kube-api-access-5vpzk" (OuterVolumeSpecName: "kube-api-access-5vpzk") pod "81a7f588-07b1-4ef1-97ee-420e944ad16b" (UID: "81a7f588-07b1-4ef1-97ee-420e944ad16b"). InnerVolumeSpecName "kube-api-access-5vpzk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:12:04 crc kubenswrapper[4764]: I0309 14:12:04.985786 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5vpzk\" (UniqueName: \"kubernetes.io/projected/81a7f588-07b1-4ef1-97ee-420e944ad16b-kube-api-access-5vpzk\") on node \"crc\" DevicePath \"\"" Mar 09 14:12:05 crc kubenswrapper[4764]: I0309 14:12:05.325091 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551092-qs44j" event={"ID":"81a7f588-07b1-4ef1-97ee-420e944ad16b","Type":"ContainerDied","Data":"0757244299baed248a05242137e929ee6d2c210e50d4f4fe38f70f73594886c2"} Mar 09 14:12:05 crc kubenswrapper[4764]: I0309 14:12:05.325558 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0757244299baed248a05242137e929ee6d2c210e50d4f4fe38f70f73594886c2" Mar 09 14:12:05 crc kubenswrapper[4764]: I0309 14:12:05.325225 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551092-qs44j" Mar 09 14:12:05 crc kubenswrapper[4764]: I0309 14:12:05.799203 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29551086-dn2gl"] Mar 09 14:12:05 crc kubenswrapper[4764]: I0309 14:12:05.812055 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29551086-dn2gl"] Mar 09 14:12:06 crc kubenswrapper[4764]: I0309 14:12:06.560637 4764 scope.go:117] "RemoveContainer" containerID="085fb58e00f347ac089fd33d79299a3879c66993685cb2c39951f8d234eb8cfb" Mar 09 14:12:06 crc kubenswrapper[4764]: E0309 14:12:06.561222 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xxczl_openshift-machine-config-operator(6bcdd179-43c2-427c-9fac-7155c122e922)\"" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922" Mar 09 14:12:07 crc kubenswrapper[4764]: I0309 14:12:07.570471 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="55c3951f-6e8b-46f4-9332-9c5d658862e4" path="/var/lib/kubelet/pods/55c3951f-6e8b-46f4-9332-9c5d658862e4/volumes" Mar 09 14:12:08 crc kubenswrapper[4764]: I0309 14:12:08.758923 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/tempest-tests-tempest"] Mar 09 14:12:08 crc kubenswrapper[4764]: E0309 14:12:08.759952 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="81a7f588-07b1-4ef1-97ee-420e944ad16b" containerName="oc" Mar 09 14:12:08 crc kubenswrapper[4764]: I0309 14:12:08.759971 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="81a7f588-07b1-4ef1-97ee-420e944ad16b" containerName="oc" Mar 09 14:12:08 crc kubenswrapper[4764]: I0309 14:12:08.760286 4764 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="81a7f588-07b1-4ef1-97ee-420e944ad16b" containerName="oc" Mar 09 14:12:08 crc kubenswrapper[4764]: I0309 14:12:08.761283 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Mar 09 14:12:08 crc kubenswrapper[4764]: I0309 14:12:08.765977 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-9bk55" Mar 09 14:12:08 crc kubenswrapper[4764]: I0309 14:12:08.766898 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-custom-data-s0" Mar 09 14:12:08 crc kubenswrapper[4764]: I0309 14:12:08.767005 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0" Mar 09 14:12:08 crc kubenswrapper[4764]: I0309 14:12:08.767237 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"test-operator-controller-priv-key" Mar 09 14:12:08 crc kubenswrapper[4764]: I0309 14:12:08.771616 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Mar 09 14:12:08 crc kubenswrapper[4764]: I0309 14:12:08.878591 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9db4k\" (UniqueName: \"kubernetes.io/projected/5db22a0e-ee1a-4b26-9e49-b26644266834-kube-api-access-9db4k\") pod \"tempest-tests-tempest\" (UID: \"5db22a0e-ee1a-4b26-9e49-b26644266834\") " pod="openstack/tempest-tests-tempest" Mar 09 14:12:08 crc kubenswrapper[4764]: I0309 14:12:08.879167 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/5db22a0e-ee1a-4b26-9e49-b26644266834-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"5db22a0e-ee1a-4b26-9e49-b26644266834\") " pod="openstack/tempest-tests-tempest" Mar 09 14:12:08 crc kubenswrapper[4764]: I0309 
14:12:08.879203 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/5db22a0e-ee1a-4b26-9e49-b26644266834-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"5db22a0e-ee1a-4b26-9e49-b26644266834\") " pod="openstack/tempest-tests-tempest" Mar 09 14:12:08 crc kubenswrapper[4764]: I0309 14:12:08.879234 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/5db22a0e-ee1a-4b26-9e49-b26644266834-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"5db22a0e-ee1a-4b26-9e49-b26644266834\") " pod="openstack/tempest-tests-tempest" Mar 09 14:12:08 crc kubenswrapper[4764]: I0309 14:12:08.879496 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5db22a0e-ee1a-4b26-9e49-b26644266834-config-data\") pod \"tempest-tests-tempest\" (UID: \"5db22a0e-ee1a-4b26-9e49-b26644266834\") " pod="openstack/tempest-tests-tempest" Mar 09 14:12:08 crc kubenswrapper[4764]: I0309 14:12:08.879864 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"tempest-tests-tempest\" (UID: \"5db22a0e-ee1a-4b26-9e49-b26644266834\") " pod="openstack/tempest-tests-tempest" Mar 09 14:12:08 crc kubenswrapper[4764]: I0309 14:12:08.880248 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/5db22a0e-ee1a-4b26-9e49-b26644266834-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"5db22a0e-ee1a-4b26-9e49-b26644266834\") " pod="openstack/tempest-tests-tempest" Mar 09 14:12:08 crc kubenswrapper[4764]: I0309 
14:12:08.880359 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5db22a0e-ee1a-4b26-9e49-b26644266834-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"5db22a0e-ee1a-4b26-9e49-b26644266834\") " pod="openstack/tempest-tests-tempest" Mar 09 14:12:08 crc kubenswrapper[4764]: I0309 14:12:08.880585 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/5db22a0e-ee1a-4b26-9e49-b26644266834-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"5db22a0e-ee1a-4b26-9e49-b26644266834\") " pod="openstack/tempest-tests-tempest" Mar 09 14:12:08 crc kubenswrapper[4764]: I0309 14:12:08.982390 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/5db22a0e-ee1a-4b26-9e49-b26644266834-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"5db22a0e-ee1a-4b26-9e49-b26644266834\") " pod="openstack/tempest-tests-tempest" Mar 09 14:12:08 crc kubenswrapper[4764]: I0309 14:12:08.982459 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/5db22a0e-ee1a-4b26-9e49-b26644266834-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"5db22a0e-ee1a-4b26-9e49-b26644266834\") " pod="openstack/tempest-tests-tempest" Mar 09 14:12:08 crc kubenswrapper[4764]: I0309 14:12:08.982497 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/5db22a0e-ee1a-4b26-9e49-b26644266834-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"5db22a0e-ee1a-4b26-9e49-b26644266834\") " pod="openstack/tempest-tests-tempest" Mar 09 14:12:08 crc kubenswrapper[4764]: I0309 14:12:08.982546 4764 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5db22a0e-ee1a-4b26-9e49-b26644266834-config-data\") pod \"tempest-tests-tempest\" (UID: \"5db22a0e-ee1a-4b26-9e49-b26644266834\") " pod="openstack/tempest-tests-tempest" Mar 09 14:12:08 crc kubenswrapper[4764]: I0309 14:12:08.982591 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"tempest-tests-tempest\" (UID: \"5db22a0e-ee1a-4b26-9e49-b26644266834\") " pod="openstack/tempest-tests-tempest" Mar 09 14:12:08 crc kubenswrapper[4764]: I0309 14:12:08.982722 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/5db22a0e-ee1a-4b26-9e49-b26644266834-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"5db22a0e-ee1a-4b26-9e49-b26644266834\") " pod="openstack/tempest-tests-tempest" Mar 09 14:12:08 crc kubenswrapper[4764]: I0309 14:12:08.982759 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5db22a0e-ee1a-4b26-9e49-b26644266834-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"5db22a0e-ee1a-4b26-9e49-b26644266834\") " pod="openstack/tempest-tests-tempest" Mar 09 14:12:08 crc kubenswrapper[4764]: I0309 14:12:08.982802 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/5db22a0e-ee1a-4b26-9e49-b26644266834-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"5db22a0e-ee1a-4b26-9e49-b26644266834\") " pod="openstack/tempest-tests-tempest" Mar 09 14:12:08 crc kubenswrapper[4764]: I0309 14:12:08.982833 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9db4k\" (UniqueName: 
\"kubernetes.io/projected/5db22a0e-ee1a-4b26-9e49-b26644266834-kube-api-access-9db4k\") pod \"tempest-tests-tempest\" (UID: \"5db22a0e-ee1a-4b26-9e49-b26644266834\") " pod="openstack/tempest-tests-tempest" Mar 09 14:12:08 crc kubenswrapper[4764]: I0309 14:12:08.983030 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/5db22a0e-ee1a-4b26-9e49-b26644266834-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"5db22a0e-ee1a-4b26-9e49-b26644266834\") " pod="openstack/tempest-tests-tempest" Mar 09 14:12:08 crc kubenswrapper[4764]: I0309 14:12:08.983405 4764 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"tempest-tests-tempest\" (UID: \"5db22a0e-ee1a-4b26-9e49-b26644266834\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/tempest-tests-tempest" Mar 09 14:12:08 crc kubenswrapper[4764]: I0309 14:12:08.983497 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/5db22a0e-ee1a-4b26-9e49-b26644266834-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"5db22a0e-ee1a-4b26-9e49-b26644266834\") " pod="openstack/tempest-tests-tempest" Mar 09 14:12:08 crc kubenswrapper[4764]: I0309 14:12:08.983817 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/5db22a0e-ee1a-4b26-9e49-b26644266834-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"5db22a0e-ee1a-4b26-9e49-b26644266834\") " pod="openstack/tempest-tests-tempest" Mar 09 14:12:08 crc kubenswrapper[4764]: I0309 14:12:08.984804 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/configmap/5db22a0e-ee1a-4b26-9e49-b26644266834-config-data\") pod \"tempest-tests-tempest\" (UID: \"5db22a0e-ee1a-4b26-9e49-b26644266834\") " pod="openstack/tempest-tests-tempest" Mar 09 14:12:08 crc kubenswrapper[4764]: I0309 14:12:08.992516 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5db22a0e-ee1a-4b26-9e49-b26644266834-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"5db22a0e-ee1a-4b26-9e49-b26644266834\") " pod="openstack/tempest-tests-tempest" Mar 09 14:12:08 crc kubenswrapper[4764]: I0309 14:12:08.993156 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/5db22a0e-ee1a-4b26-9e49-b26644266834-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"5db22a0e-ee1a-4b26-9e49-b26644266834\") " pod="openstack/tempest-tests-tempest" Mar 09 14:12:08 crc kubenswrapper[4764]: I0309 14:12:08.993918 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/5db22a0e-ee1a-4b26-9e49-b26644266834-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"5db22a0e-ee1a-4b26-9e49-b26644266834\") " pod="openstack/tempest-tests-tempest" Mar 09 14:12:09 crc kubenswrapper[4764]: I0309 14:12:09.006806 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9db4k\" (UniqueName: \"kubernetes.io/projected/5db22a0e-ee1a-4b26-9e49-b26644266834-kube-api-access-9db4k\") pod \"tempest-tests-tempest\" (UID: \"5db22a0e-ee1a-4b26-9e49-b26644266834\") " pod="openstack/tempest-tests-tempest" Mar 09 14:12:09 crc kubenswrapper[4764]: I0309 14:12:09.013150 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"tempest-tests-tempest\" (UID: \"5db22a0e-ee1a-4b26-9e49-b26644266834\") " 
pod="openstack/tempest-tests-tempest" Mar 09 14:12:09 crc kubenswrapper[4764]: I0309 14:12:09.090999 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Mar 09 14:12:09 crc kubenswrapper[4764]: I0309 14:12:09.616309 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Mar 09 14:12:09 crc kubenswrapper[4764]: I0309 14:12:09.624156 4764 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 09 14:12:10 crc kubenswrapper[4764]: I0309 14:12:10.176904 4764 scope.go:117] "RemoveContainer" containerID="4dfa06f7a78f4ddaf54d7700f7fde17ae280a4e75103177bb66105376bf4e6a2" Mar 09 14:12:10 crc kubenswrapper[4764]: I0309 14:12:10.377743 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"5db22a0e-ee1a-4b26-9e49-b26644266834","Type":"ContainerStarted","Data":"bb1050b512fa7c7108bc1149d279ca654a7191121729b646cce9a919a3a91226"} Mar 09 14:12:18 crc kubenswrapper[4764]: I0309 14:12:18.563933 4764 scope.go:117] "RemoveContainer" containerID="085fb58e00f347ac089fd33d79299a3879c66993685cb2c39951f8d234eb8cfb" Mar 09 14:12:18 crc kubenswrapper[4764]: E0309 14:12:18.566322 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xxczl_openshift-machine-config-operator(6bcdd179-43c2-427c-9fac-7155c122e922)\"" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922" Mar 09 14:12:31 crc kubenswrapper[4764]: I0309 14:12:31.566467 4764 scope.go:117] "RemoveContainer" containerID="085fb58e00f347ac089fd33d79299a3879c66993685cb2c39951f8d234eb8cfb" Mar 09 14:12:31 crc kubenswrapper[4764]: E0309 14:12:31.567261 4764 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xxczl_openshift-machine-config-operator(6bcdd179-43c2-427c-9fac-7155c122e922)\"" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922" Mar 09 14:12:40 crc kubenswrapper[4764]: E0309 14:12:40.555666 4764 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified" Mar 09 14:12:40 crc kubenswrapper[4764]: E0309 14:12:40.557966 4764 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:tempest-tests-tempest-tests-runner,Image:quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config-data,ReadOnly:false,MountPath:/etc/test_operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-ephemeral-workdir,ReadOnly:false,MountPath:/var/lib/tempest,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-ephemeral-temporary,ReadOnly:false,MountPath:/tmp,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-logs,ReadOnly:false,MountPath:/var/lib/tempest/external_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/etc/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/var/lib/tempest/.config/openstack/clouds.yaml,SubPath:clouds.yaml,
MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config-secret,ReadOnly:false,MountPath:/etc/openstack/secure.yaml,SubPath:secure.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ca-certs,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ssh-key,ReadOnly:false,MountPath:/var/lib/tempest/id_ecdsa,SubPath:ssh_key,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-9db4k,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42480,RunAsNonRoot:*false,ReadOnlyRootFilesystem:*false,AllowPrivilegeEscalation:*true,RunAsGroup:*42480,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-custom-data-s0,},Optional:nil,},SecretRef:nil,},EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-env-vars-s0,},Optional:nil,},SecretRef:nil,},},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod tempest-tests-tempest_openstack(5db22a0e-ee1a-4b26-9e49-b26644266834): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 09 14:12:40 crc 
kubenswrapper[4764]: E0309 14:12:40.561636 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tempest-tests-tempest-tests-runner\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/tempest-tests-tempest" podUID="5db22a0e-ee1a-4b26-9e49-b26644266834" Mar 09 14:12:40 crc kubenswrapper[4764]: E0309 14:12:40.754943 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tempest-tests-tempest-tests-runner\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified\\\"\"" pod="openstack/tempest-tests-tempest" podUID="5db22a0e-ee1a-4b26-9e49-b26644266834" Mar 09 14:12:45 crc kubenswrapper[4764]: I0309 14:12:45.571538 4764 scope.go:117] "RemoveContainer" containerID="085fb58e00f347ac089fd33d79299a3879c66993685cb2c39951f8d234eb8cfb" Mar 09 14:12:45 crc kubenswrapper[4764]: E0309 14:12:45.573074 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xxczl_openshift-machine-config-operator(6bcdd179-43c2-427c-9fac-7155c122e922)\"" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922" Mar 09 14:12:56 crc kubenswrapper[4764]: I0309 14:12:56.212541 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0" Mar 09 14:12:57 crc kubenswrapper[4764]: I0309 14:12:57.952585 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"5db22a0e-ee1a-4b26-9e49-b26644266834","Type":"ContainerStarted","Data":"88276b497758f3b5f0f6060dd845ad4b0ff499d7eeb52fb1af2c65524e2ceaba"} Mar 09 14:12:57 crc kubenswrapper[4764]: I0309 14:12:57.976546 4764 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/tempest-tests-tempest" podStartSLOduration=4.391193628 podStartE2EDuration="50.976520558s" podCreationTimestamp="2026-03-09 14:12:07 +0000 UTC" firstStartedPulling="2026-03-09 14:12:09.623880542 +0000 UTC m=+3084.874052450" lastFinishedPulling="2026-03-09 14:12:56.209207472 +0000 UTC m=+3131.459379380" observedRunningTime="2026-03-09 14:12:57.973540488 +0000 UTC m=+3133.223712416" watchObservedRunningTime="2026-03-09 14:12:57.976520558 +0000 UTC m=+3133.226692476"
Mar 09 14:12:59 crc kubenswrapper[4764]: I0309 14:12:59.560725 4764 scope.go:117] "RemoveContainer" containerID="085fb58e00f347ac089fd33d79299a3879c66993685cb2c39951f8d234eb8cfb"
Mar 09 14:12:59 crc kubenswrapper[4764]: E0309 14:12:59.561564 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xxczl_openshift-machine-config-operator(6bcdd179-43c2-427c-9fac-7155c122e922)\"" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922"
Mar 09 14:13:11 crc kubenswrapper[4764]: I0309 14:13:11.560050 4764 scope.go:117] "RemoveContainer" containerID="085fb58e00f347ac089fd33d79299a3879c66993685cb2c39951f8d234eb8cfb"
Mar 09 14:13:11 crc kubenswrapper[4764]: E0309 14:13:11.561360 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xxczl_openshift-machine-config-operator(6bcdd179-43c2-427c-9fac-7155c122e922)\"" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922"
Mar 09 14:13:24 crc kubenswrapper[4764]: I0309 14:13:24.560008 4764 scope.go:117] "RemoveContainer" containerID="085fb58e00f347ac089fd33d79299a3879c66993685cb2c39951f8d234eb8cfb"
Mar 09 14:13:24 crc kubenswrapper[4764]: E0309 14:13:24.561180 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xxczl_openshift-machine-config-operator(6bcdd179-43c2-427c-9fac-7155c122e922)\"" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922"
Mar 09 14:13:38 crc kubenswrapper[4764]: I0309 14:13:38.319292 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-9lzgr"]
Mar 09 14:13:38 crc kubenswrapper[4764]: I0309 14:13:38.323426 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-9lzgr"
Mar 09 14:13:38 crc kubenswrapper[4764]: I0309 14:13:38.333850 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-9lzgr"]
Mar 09 14:13:38 crc kubenswrapper[4764]: I0309 14:13:38.414308 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tpc6l\" (UniqueName: \"kubernetes.io/projected/8414d1b3-79f4-4eb4-b7fc-e85caf18e1db-kube-api-access-tpc6l\") pod \"certified-operators-9lzgr\" (UID: \"8414d1b3-79f4-4eb4-b7fc-e85caf18e1db\") " pod="openshift-marketplace/certified-operators-9lzgr"
Mar 09 14:13:38 crc kubenswrapper[4764]: I0309 14:13:38.414462 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8414d1b3-79f4-4eb4-b7fc-e85caf18e1db-utilities\") pod \"certified-operators-9lzgr\" (UID: \"8414d1b3-79f4-4eb4-b7fc-e85caf18e1db\") " pod="openshift-marketplace/certified-operators-9lzgr"
Mar 09 14:13:38 crc kubenswrapper[4764]: I0309 14:13:38.414842 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8414d1b3-79f4-4eb4-b7fc-e85caf18e1db-catalog-content\") pod \"certified-operators-9lzgr\" (UID: \"8414d1b3-79f4-4eb4-b7fc-e85caf18e1db\") " pod="openshift-marketplace/certified-operators-9lzgr"
Mar 09 14:13:38 crc kubenswrapper[4764]: I0309 14:13:38.517905 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8414d1b3-79f4-4eb4-b7fc-e85caf18e1db-utilities\") pod \"certified-operators-9lzgr\" (UID: \"8414d1b3-79f4-4eb4-b7fc-e85caf18e1db\") " pod="openshift-marketplace/certified-operators-9lzgr"
Mar 09 14:13:38 crc kubenswrapper[4764]: I0309 14:13:38.518144 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8414d1b3-79f4-4eb4-b7fc-e85caf18e1db-catalog-content\") pod \"certified-operators-9lzgr\" (UID: \"8414d1b3-79f4-4eb4-b7fc-e85caf18e1db\") " pod="openshift-marketplace/certified-operators-9lzgr"
Mar 09 14:13:38 crc kubenswrapper[4764]: I0309 14:13:38.518230 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tpc6l\" (UniqueName: \"kubernetes.io/projected/8414d1b3-79f4-4eb4-b7fc-e85caf18e1db-kube-api-access-tpc6l\") pod \"certified-operators-9lzgr\" (UID: \"8414d1b3-79f4-4eb4-b7fc-e85caf18e1db\") " pod="openshift-marketplace/certified-operators-9lzgr"
Mar 09 14:13:38 crc kubenswrapper[4764]: I0309 14:13:38.518525 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8414d1b3-79f4-4eb4-b7fc-e85caf18e1db-utilities\") pod \"certified-operators-9lzgr\" (UID: \"8414d1b3-79f4-4eb4-b7fc-e85caf18e1db\") " pod="openshift-marketplace/certified-operators-9lzgr"
Mar 09 14:13:38 crc kubenswrapper[4764]: I0309 14:13:38.518579 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8414d1b3-79f4-4eb4-b7fc-e85caf18e1db-catalog-content\") pod \"certified-operators-9lzgr\" (UID: \"8414d1b3-79f4-4eb4-b7fc-e85caf18e1db\") " pod="openshift-marketplace/certified-operators-9lzgr"
Mar 09 14:13:38 crc kubenswrapper[4764]: I0309 14:13:38.545320 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tpc6l\" (UniqueName: \"kubernetes.io/projected/8414d1b3-79f4-4eb4-b7fc-e85caf18e1db-kube-api-access-tpc6l\") pod \"certified-operators-9lzgr\" (UID: \"8414d1b3-79f4-4eb4-b7fc-e85caf18e1db\") " pod="openshift-marketplace/certified-operators-9lzgr"
Mar 09 14:13:38 crc kubenswrapper[4764]: I0309 14:13:38.560227 4764 scope.go:117] "RemoveContainer" containerID="085fb58e00f347ac089fd33d79299a3879c66993685cb2c39951f8d234eb8cfb"
Mar 09 14:13:38 crc kubenswrapper[4764]: E0309 14:13:38.560556 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xxczl_openshift-machine-config-operator(6bcdd179-43c2-427c-9fac-7155c122e922)\"" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922"
Mar 09 14:13:38 crc kubenswrapper[4764]: I0309 14:13:38.658583 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-9lzgr"
Mar 09 14:13:39 crc kubenswrapper[4764]: I0309 14:13:39.232391 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-9lzgr"]
Mar 09 14:13:39 crc kubenswrapper[4764]: I0309 14:13:39.587781 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9lzgr" event={"ID":"8414d1b3-79f4-4eb4-b7fc-e85caf18e1db","Type":"ContainerStarted","Data":"1d30d3bafe785b61035cfd6776bbf1fbd30eae865400a81c90fd508159791920"}
Mar 09 14:13:39 crc kubenswrapper[4764]: I0309 14:13:39.587848 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9lzgr" event={"ID":"8414d1b3-79f4-4eb4-b7fc-e85caf18e1db","Type":"ContainerStarted","Data":"8a4440269200c76a2fe2d685a91e0e74538a4b746197ce327947e0be241c133f"}
Mar 09 14:13:40 crc kubenswrapper[4764]: I0309 14:13:40.600001 4764 generic.go:334] "Generic (PLEG): container finished" podID="8414d1b3-79f4-4eb4-b7fc-e85caf18e1db" containerID="1d30d3bafe785b61035cfd6776bbf1fbd30eae865400a81c90fd508159791920" exitCode=0
Mar 09 14:13:40 crc kubenswrapper[4764]: I0309 14:13:40.600085 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9lzgr" event={"ID":"8414d1b3-79f4-4eb4-b7fc-e85caf18e1db","Type":"ContainerDied","Data":"1d30d3bafe785b61035cfd6776bbf1fbd30eae865400a81c90fd508159791920"}
Mar 09 14:13:41 crc kubenswrapper[4764]: I0309 14:13:41.615593 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9lzgr" event={"ID":"8414d1b3-79f4-4eb4-b7fc-e85caf18e1db","Type":"ContainerStarted","Data":"a83e9cbe22286a479af2a92b6e16b2800ac926082aaf40b98e4defaf7a33610b"}
Mar 09 14:13:44 crc kubenswrapper[4764]: I0309 14:13:44.652767 4764 generic.go:334] "Generic (PLEG): container finished" podID="8414d1b3-79f4-4eb4-b7fc-e85caf18e1db" containerID="a83e9cbe22286a479af2a92b6e16b2800ac926082aaf40b98e4defaf7a33610b" exitCode=0
Mar 09 14:13:44 crc kubenswrapper[4764]: I0309 14:13:44.652813 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9lzgr" event={"ID":"8414d1b3-79f4-4eb4-b7fc-e85caf18e1db","Type":"ContainerDied","Data":"a83e9cbe22286a479af2a92b6e16b2800ac926082aaf40b98e4defaf7a33610b"}
Mar 09 14:13:45 crc kubenswrapper[4764]: I0309 14:13:45.668348 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9lzgr" event={"ID":"8414d1b3-79f4-4eb4-b7fc-e85caf18e1db","Type":"ContainerStarted","Data":"25dd59f6d05d4fb8f17a37a81bd52fddb49fdd1d34f6ac322a9adb36613a0c6b"}
Mar 09 14:13:45 crc kubenswrapper[4764]: I0309 14:13:45.694726 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-9lzgr" podStartSLOduration=2.155137017 podStartE2EDuration="7.694689691s" podCreationTimestamp="2026-03-09 14:13:38 +0000 UTC" firstStartedPulling="2026-03-09 14:13:39.591257981 +0000 UTC m=+3174.841429889" lastFinishedPulling="2026-03-09 14:13:45.130810615 +0000 UTC m=+3180.380982563" observedRunningTime="2026-03-09 14:13:45.691210038 +0000 UTC m=+3180.941381956" watchObservedRunningTime="2026-03-09 14:13:45.694689691 +0000 UTC m=+3180.944861669"
Mar 09 14:13:48 crc kubenswrapper[4764]: I0309 14:13:48.659295 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-9lzgr"
Mar 09 14:13:48 crc kubenswrapper[4764]: I0309 14:13:48.660550 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-9lzgr"
Mar 09 14:13:48 crc kubenswrapper[4764]: I0309 14:13:48.750392 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-9lzgr"
Mar 09 14:13:52 crc kubenswrapper[4764]: I0309 14:13:52.559526 4764 scope.go:117] "RemoveContainer" containerID="085fb58e00f347ac089fd33d79299a3879c66993685cb2c39951f8d234eb8cfb"
Mar 09 14:13:52 crc kubenswrapper[4764]: E0309 14:13:52.560540 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xxczl_openshift-machine-config-operator(6bcdd179-43c2-427c-9fac-7155c122e922)\"" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922"
Mar 09 14:13:58 crc kubenswrapper[4764]: I0309 14:13:58.716770 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-9lzgr"
Mar 09 14:13:59 crc kubenswrapper[4764]: I0309 14:13:59.386867 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-9lzgr"]
Mar 09 14:13:59 crc kubenswrapper[4764]: I0309 14:13:59.387168 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-9lzgr" podUID="8414d1b3-79f4-4eb4-b7fc-e85caf18e1db" containerName="registry-server" containerID="cri-o://25dd59f6d05d4fb8f17a37a81bd52fddb49fdd1d34f6ac322a9adb36613a0c6b" gracePeriod=2
Mar 09 14:13:59 crc kubenswrapper[4764]: I0309 14:13:59.834045 4764 generic.go:334] "Generic (PLEG): container finished" podID="8414d1b3-79f4-4eb4-b7fc-e85caf18e1db" containerID="25dd59f6d05d4fb8f17a37a81bd52fddb49fdd1d34f6ac322a9adb36613a0c6b" exitCode=0
Mar 09 14:13:59 crc kubenswrapper[4764]: I0309 14:13:59.834598 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9lzgr" event={"ID":"8414d1b3-79f4-4eb4-b7fc-e85caf18e1db","Type":"ContainerDied","Data":"25dd59f6d05d4fb8f17a37a81bd52fddb49fdd1d34f6ac322a9adb36613a0c6b"}
Mar 09 14:13:59 crc kubenswrapper[4764]: I0309 14:13:59.834637 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9lzgr" event={"ID":"8414d1b3-79f4-4eb4-b7fc-e85caf18e1db","Type":"ContainerDied","Data":"8a4440269200c76a2fe2d685a91e0e74538a4b746197ce327947e0be241c133f"}
Mar 09 14:13:59 crc kubenswrapper[4764]: I0309 14:13:59.834738 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8a4440269200c76a2fe2d685a91e0e74538a4b746197ce327947e0be241c133f"
Mar 09 14:13:59 crc kubenswrapper[4764]: I0309 14:13:59.879136 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-9lzgr"
Mar 09 14:13:59 crc kubenswrapper[4764]: I0309 14:13:59.979424 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8414d1b3-79f4-4eb4-b7fc-e85caf18e1db-utilities\") pod \"8414d1b3-79f4-4eb4-b7fc-e85caf18e1db\" (UID: \"8414d1b3-79f4-4eb4-b7fc-e85caf18e1db\") "
Mar 09 14:13:59 crc kubenswrapper[4764]: I0309 14:13:59.979673 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tpc6l\" (UniqueName: \"kubernetes.io/projected/8414d1b3-79f4-4eb4-b7fc-e85caf18e1db-kube-api-access-tpc6l\") pod \"8414d1b3-79f4-4eb4-b7fc-e85caf18e1db\" (UID: \"8414d1b3-79f4-4eb4-b7fc-e85caf18e1db\") "
Mar 09 14:13:59 crc kubenswrapper[4764]: I0309 14:13:59.979830 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8414d1b3-79f4-4eb4-b7fc-e85caf18e1db-catalog-content\") pod \"8414d1b3-79f4-4eb4-b7fc-e85caf18e1db\" (UID: \"8414d1b3-79f4-4eb4-b7fc-e85caf18e1db\") "
Mar 09 14:13:59 crc kubenswrapper[4764]: I0309 14:13:59.980438 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8414d1b3-79f4-4eb4-b7fc-e85caf18e1db-utilities" (OuterVolumeSpecName: "utilities") pod "8414d1b3-79f4-4eb4-b7fc-e85caf18e1db" (UID: "8414d1b3-79f4-4eb4-b7fc-e85caf18e1db"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 09 14:13:59 crc kubenswrapper[4764]: I0309 14:13:59.981599 4764 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8414d1b3-79f4-4eb4-b7fc-e85caf18e1db-utilities\") on node \"crc\" DevicePath \"\""
Mar 09 14:13:59 crc kubenswrapper[4764]: I0309 14:13:59.988414 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8414d1b3-79f4-4eb4-b7fc-e85caf18e1db-kube-api-access-tpc6l" (OuterVolumeSpecName: "kube-api-access-tpc6l") pod "8414d1b3-79f4-4eb4-b7fc-e85caf18e1db" (UID: "8414d1b3-79f4-4eb4-b7fc-e85caf18e1db"). InnerVolumeSpecName "kube-api-access-tpc6l". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 14:14:00 crc kubenswrapper[4764]: I0309 14:14:00.063857 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8414d1b3-79f4-4eb4-b7fc-e85caf18e1db-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8414d1b3-79f4-4eb4-b7fc-e85caf18e1db" (UID: "8414d1b3-79f4-4eb4-b7fc-e85caf18e1db"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 09 14:14:00 crc kubenswrapper[4764]: I0309 14:14:00.084454 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tpc6l\" (UniqueName: \"kubernetes.io/projected/8414d1b3-79f4-4eb4-b7fc-e85caf18e1db-kube-api-access-tpc6l\") on node \"crc\" DevicePath \"\""
Mar 09 14:14:00 crc kubenswrapper[4764]: I0309 14:14:00.084497 4764 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8414d1b3-79f4-4eb4-b7fc-e85caf18e1db-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 09 14:14:00 crc kubenswrapper[4764]: I0309 14:14:00.155814 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29551094-w5qt4"]
Mar 09 14:14:00 crc kubenswrapper[4764]: E0309 14:14:00.156428 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8414d1b3-79f4-4eb4-b7fc-e85caf18e1db" containerName="extract-utilities"
Mar 09 14:14:00 crc kubenswrapper[4764]: I0309 14:14:00.156444 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="8414d1b3-79f4-4eb4-b7fc-e85caf18e1db" containerName="extract-utilities"
Mar 09 14:14:00 crc kubenswrapper[4764]: E0309 14:14:00.156460 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8414d1b3-79f4-4eb4-b7fc-e85caf18e1db" containerName="registry-server"
Mar 09 14:14:00 crc kubenswrapper[4764]: I0309 14:14:00.156466 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="8414d1b3-79f4-4eb4-b7fc-e85caf18e1db" containerName="registry-server"
Mar 09 14:14:00 crc kubenswrapper[4764]: E0309 14:14:00.156510 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8414d1b3-79f4-4eb4-b7fc-e85caf18e1db" containerName="extract-content"
Mar 09 14:14:00 crc kubenswrapper[4764]: I0309 14:14:00.156516 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="8414d1b3-79f4-4eb4-b7fc-e85caf18e1db" containerName="extract-content"
Mar 09 14:14:00 crc kubenswrapper[4764]: I0309 14:14:00.156800 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="8414d1b3-79f4-4eb4-b7fc-e85caf18e1db" containerName="registry-server"
Mar 09 14:14:00 crc kubenswrapper[4764]: I0309 14:14:00.157826 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551094-w5qt4"
Mar 09 14:14:00 crc kubenswrapper[4764]: I0309 14:14:00.161349 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 09 14:14:00 crc kubenswrapper[4764]: I0309 14:14:00.161410 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 09 14:14:00 crc kubenswrapper[4764]: I0309 14:14:00.161436 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-4s7mr"
Mar 09 14:14:00 crc kubenswrapper[4764]: I0309 14:14:00.170780 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551094-w5qt4"]
Mar 09 14:14:00 crc kubenswrapper[4764]: I0309 14:14:00.289042 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vrvgc\" (UniqueName: \"kubernetes.io/projected/d7d72c38-b071-4fcb-89b4-935542a1943e-kube-api-access-vrvgc\") pod \"auto-csr-approver-29551094-w5qt4\" (UID: \"d7d72c38-b071-4fcb-89b4-935542a1943e\") " pod="openshift-infra/auto-csr-approver-29551094-w5qt4"
Mar 09 14:14:00 crc kubenswrapper[4764]: I0309 14:14:00.390863 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vrvgc\" (UniqueName: \"kubernetes.io/projected/d7d72c38-b071-4fcb-89b4-935542a1943e-kube-api-access-vrvgc\") pod \"auto-csr-approver-29551094-w5qt4\" (UID: \"d7d72c38-b071-4fcb-89b4-935542a1943e\") " pod="openshift-infra/auto-csr-approver-29551094-w5qt4"
Mar 09 14:14:00 crc kubenswrapper[4764]: I0309 14:14:00.411592 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vrvgc\" (UniqueName: \"kubernetes.io/projected/d7d72c38-b071-4fcb-89b4-935542a1943e-kube-api-access-vrvgc\") pod \"auto-csr-approver-29551094-w5qt4\" (UID: \"d7d72c38-b071-4fcb-89b4-935542a1943e\") " pod="openshift-infra/auto-csr-approver-29551094-w5qt4"
Mar 09 14:14:00 crc kubenswrapper[4764]: I0309 14:14:00.485439 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551094-w5qt4"
Mar 09 14:14:00 crc kubenswrapper[4764]: I0309 14:14:00.843146 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-9lzgr"
Mar 09 14:14:00 crc kubenswrapper[4764]: I0309 14:14:00.892539 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-9lzgr"]
Mar 09 14:14:00 crc kubenswrapper[4764]: I0309 14:14:00.901749 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-9lzgr"]
Mar 09 14:14:00 crc kubenswrapper[4764]: E0309 14:14:00.987930 4764 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8414d1b3_79f4_4eb4_b7fc_e85caf18e1db.slice/crio-8a4440269200c76a2fe2d685a91e0e74538a4b746197ce327947e0be241c133f\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8414d1b3_79f4_4eb4_b7fc_e85caf18e1db.slice\": RecentStats: unable to find data in memory cache]"
Mar 09 14:14:00 crc kubenswrapper[4764]: I0309 14:14:00.996804 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551094-w5qt4"]
Mar 09 14:14:01 crc kubenswrapper[4764]: W0309 14:14:01.011424 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd7d72c38_b071_4fcb_89b4_935542a1943e.slice/crio-68241c662e00c69533831193ee003a89b7f9795f5fb6f4f7cd8b120dc63d0036 WatchSource:0}: Error finding container 68241c662e00c69533831193ee003a89b7f9795f5fb6f4f7cd8b120dc63d0036: Status 404 returned error can't find the container with id 68241c662e00c69533831193ee003a89b7f9795f5fb6f4f7cd8b120dc63d0036
Mar 09 14:14:01 crc kubenswrapper[4764]: I0309 14:14:01.574478 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8414d1b3-79f4-4eb4-b7fc-e85caf18e1db" path="/var/lib/kubelet/pods/8414d1b3-79f4-4eb4-b7fc-e85caf18e1db/volumes"
Mar 09 14:14:01 crc kubenswrapper[4764]: I0309 14:14:01.859988 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551094-w5qt4" event={"ID":"d7d72c38-b071-4fcb-89b4-935542a1943e","Type":"ContainerStarted","Data":"68241c662e00c69533831193ee003a89b7f9795f5fb6f4f7cd8b120dc63d0036"}
Mar 09 14:14:02 crc kubenswrapper[4764]: I0309 14:14:02.873881 4764 generic.go:334] "Generic (PLEG): container finished" podID="d7d72c38-b071-4fcb-89b4-935542a1943e" containerID="865365e28885b4a1f3c0039c8f24992ea422c88be0ea3e7581c981c450049a6c" exitCode=0
Mar 09 14:14:02 crc kubenswrapper[4764]: I0309 14:14:02.873975 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551094-w5qt4" event={"ID":"d7d72c38-b071-4fcb-89b4-935542a1943e","Type":"ContainerDied","Data":"865365e28885b4a1f3c0039c8f24992ea422c88be0ea3e7581c981c450049a6c"}
Mar 09 14:14:04 crc kubenswrapper[4764]: I0309 14:14:04.282236 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551094-w5qt4"
Mar 09 14:14:04 crc kubenswrapper[4764]: I0309 14:14:04.405557 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vrvgc\" (UniqueName: \"kubernetes.io/projected/d7d72c38-b071-4fcb-89b4-935542a1943e-kube-api-access-vrvgc\") pod \"d7d72c38-b071-4fcb-89b4-935542a1943e\" (UID: \"d7d72c38-b071-4fcb-89b4-935542a1943e\") "
Mar 09 14:14:04 crc kubenswrapper[4764]: I0309 14:14:04.414845 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d7d72c38-b071-4fcb-89b4-935542a1943e-kube-api-access-vrvgc" (OuterVolumeSpecName: "kube-api-access-vrvgc") pod "d7d72c38-b071-4fcb-89b4-935542a1943e" (UID: "d7d72c38-b071-4fcb-89b4-935542a1943e"). InnerVolumeSpecName "kube-api-access-vrvgc". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 14:14:04 crc kubenswrapper[4764]: I0309 14:14:04.510340 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vrvgc\" (UniqueName: \"kubernetes.io/projected/d7d72c38-b071-4fcb-89b4-935542a1943e-kube-api-access-vrvgc\") on node \"crc\" DevicePath \"\""
Mar 09 14:14:04 crc kubenswrapper[4764]: I0309 14:14:04.896469 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551094-w5qt4" event={"ID":"d7d72c38-b071-4fcb-89b4-935542a1943e","Type":"ContainerDied","Data":"68241c662e00c69533831193ee003a89b7f9795f5fb6f4f7cd8b120dc63d0036"}
Mar 09 14:14:04 crc kubenswrapper[4764]: I0309 14:14:04.896531 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="68241c662e00c69533831193ee003a89b7f9795f5fb6f4f7cd8b120dc63d0036"
Mar 09 14:14:04 crc kubenswrapper[4764]: I0309 14:14:04.896582 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551094-w5qt4"
Mar 09 14:14:05 crc kubenswrapper[4764]: I0309 14:14:05.361097 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29551088-wtbvc"]
Mar 09 14:14:05 crc kubenswrapper[4764]: I0309 14:14:05.373997 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29551088-wtbvc"]
Mar 09 14:14:05 crc kubenswrapper[4764]: I0309 14:14:05.567978 4764 scope.go:117] "RemoveContainer" containerID="085fb58e00f347ac089fd33d79299a3879c66993685cb2c39951f8d234eb8cfb"
Mar 09 14:14:05 crc kubenswrapper[4764]: E0309 14:14:05.570017 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xxczl_openshift-machine-config-operator(6bcdd179-43c2-427c-9fac-7155c122e922)\"" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922"
Mar 09 14:14:05 crc kubenswrapper[4764]: I0309 14:14:05.572780 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="902ae1d9-a43c-46c6-a492-10ee0242e721" path="/var/lib/kubelet/pods/902ae1d9-a43c-46c6-a492-10ee0242e721/volumes"
Mar 09 14:14:10 crc kubenswrapper[4764]: I0309 14:14:10.298012 4764 scope.go:117] "RemoveContainer" containerID="20854488fc265f9f6a154273e0d45920e3a458753547063958f0c25388b08a64"
Mar 09 14:14:16 crc kubenswrapper[4764]: I0309 14:14:16.559887 4764 scope.go:117] "RemoveContainer" containerID="085fb58e00f347ac089fd33d79299a3879c66993685cb2c39951f8d234eb8cfb"
Mar 09 14:14:16 crc kubenswrapper[4764]: E0309 14:14:16.560791 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xxczl_openshift-machine-config-operator(6bcdd179-43c2-427c-9fac-7155c122e922)\"" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922"
Mar 09 14:14:27 crc kubenswrapper[4764]: I0309 14:14:27.560960 4764 scope.go:117] "RemoveContainer" containerID="085fb58e00f347ac089fd33d79299a3879c66993685cb2c39951f8d234eb8cfb"
Mar 09 14:14:27 crc kubenswrapper[4764]: E0309 14:14:27.562138 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xxczl_openshift-machine-config-operator(6bcdd179-43c2-427c-9fac-7155c122e922)\"" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922"
Mar 09 14:14:39 crc kubenswrapper[4764]: I0309 14:14:39.560431 4764 scope.go:117] "RemoveContainer" containerID="085fb58e00f347ac089fd33d79299a3879c66993685cb2c39951f8d234eb8cfb"
Mar 09 14:14:39 crc kubenswrapper[4764]: E0309 14:14:39.561690 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xxczl_openshift-machine-config-operator(6bcdd179-43c2-427c-9fac-7155c122e922)\"" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922"
Mar 09 14:14:54 crc kubenswrapper[4764]: I0309 14:14:54.560425 4764 scope.go:117] "RemoveContainer" containerID="085fb58e00f347ac089fd33d79299a3879c66993685cb2c39951f8d234eb8cfb"
Mar 09 14:14:54 crc kubenswrapper[4764]: E0309 14:14:54.561430 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xxczl_openshift-machine-config-operator(6bcdd179-43c2-427c-9fac-7155c122e922)\"" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922"
Mar 09 14:15:00 crc kubenswrapper[4764]: I0309 14:15:00.158130 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29551095-z2m78"]
Mar 09 14:15:00 crc kubenswrapper[4764]: E0309 14:15:00.159576 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d7d72c38-b071-4fcb-89b4-935542a1943e" containerName="oc"
Mar 09 14:15:00 crc kubenswrapper[4764]: I0309 14:15:00.159596 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="d7d72c38-b071-4fcb-89b4-935542a1943e" containerName="oc"
Mar 09 14:15:00 crc kubenswrapper[4764]: I0309 14:15:00.159865 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="d7d72c38-b071-4fcb-89b4-935542a1943e" containerName="oc"
Mar 09 14:15:00 crc kubenswrapper[4764]: I0309 14:15:00.160693 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29551095-z2m78"
Mar 09 14:15:00 crc kubenswrapper[4764]: I0309 14:15:00.164149 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Mar 09 14:15:00 crc kubenswrapper[4764]: I0309 14:15:00.168692 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Mar 09 14:15:00 crc kubenswrapper[4764]: I0309 14:15:00.212225 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29551095-z2m78"]
Mar 09 14:15:00 crc kubenswrapper[4764]: I0309 14:15:00.293118 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n45lc\" (UniqueName: \"kubernetes.io/projected/697f22c2-daeb-4ae7-a0da-4e80bfad4cb0-kube-api-access-n45lc\") pod \"collect-profiles-29551095-z2m78\" (UID: \"697f22c2-daeb-4ae7-a0da-4e80bfad4cb0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551095-z2m78"
Mar 09 14:15:00 crc kubenswrapper[4764]: I0309 14:15:00.293198 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/697f22c2-daeb-4ae7-a0da-4e80bfad4cb0-config-volume\") pod \"collect-profiles-29551095-z2m78\" (UID: \"697f22c2-daeb-4ae7-a0da-4e80bfad4cb0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551095-z2m78"
Mar 09 14:15:00 crc kubenswrapper[4764]: I0309 14:15:00.293315 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/697f22c2-daeb-4ae7-a0da-4e80bfad4cb0-secret-volume\") pod \"collect-profiles-29551095-z2m78\" (UID: \"697f22c2-daeb-4ae7-a0da-4e80bfad4cb0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551095-z2m78"
Mar 09 14:15:00 crc kubenswrapper[4764]: I0309 14:15:00.395183 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/697f22c2-daeb-4ae7-a0da-4e80bfad4cb0-config-volume\") pod \"collect-profiles-29551095-z2m78\" (UID: \"697f22c2-daeb-4ae7-a0da-4e80bfad4cb0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551095-z2m78"
Mar 09 14:15:00 crc kubenswrapper[4764]: I0309 14:15:00.395417 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/697f22c2-daeb-4ae7-a0da-4e80bfad4cb0-secret-volume\") pod \"collect-profiles-29551095-z2m78\" (UID: \"697f22c2-daeb-4ae7-a0da-4e80bfad4cb0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551095-z2m78"
Mar 09 14:15:00 crc kubenswrapper[4764]: I0309 14:15:00.395600 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n45lc\" (UniqueName: \"kubernetes.io/projected/697f22c2-daeb-4ae7-a0da-4e80bfad4cb0-kube-api-access-n45lc\") pod \"collect-profiles-29551095-z2m78\" (UID: \"697f22c2-daeb-4ae7-a0da-4e80bfad4cb0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551095-z2m78"
Mar 09 14:15:00 crc kubenswrapper[4764]: I0309 14:15:00.397234 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/697f22c2-daeb-4ae7-a0da-4e80bfad4cb0-config-volume\") pod \"collect-profiles-29551095-z2m78\" (UID: \"697f22c2-daeb-4ae7-a0da-4e80bfad4cb0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551095-z2m78"
Mar 09 14:15:00 crc kubenswrapper[4764]: I0309 14:15:00.405164 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/697f22c2-daeb-4ae7-a0da-4e80bfad4cb0-secret-volume\") pod \"collect-profiles-29551095-z2m78\" (UID: \"697f22c2-daeb-4ae7-a0da-4e80bfad4cb0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551095-z2m78"
Mar 09 14:15:00 crc kubenswrapper[4764]: I0309 14:15:00.416046 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n45lc\" (UniqueName: \"kubernetes.io/projected/697f22c2-daeb-4ae7-a0da-4e80bfad4cb0-kube-api-access-n45lc\") pod \"collect-profiles-29551095-z2m78\" (UID: \"697f22c2-daeb-4ae7-a0da-4e80bfad4cb0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551095-z2m78"
Mar 09 14:15:00 crc kubenswrapper[4764]: I0309 14:15:00.489547 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29551095-z2m78"
Mar 09 14:15:00 crc kubenswrapper[4764]: I0309 14:15:00.987860 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29551095-z2m78"]
Mar 09 14:15:01 crc kubenswrapper[4764]: I0309 14:15:01.486390 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29551095-z2m78" event={"ID":"697f22c2-daeb-4ae7-a0da-4e80bfad4cb0","Type":"ContainerStarted","Data":"ab97d625c108341b13c2db5e1056def5231f61fcd681f959a64e961f2a15864d"}
Mar 09 14:15:01 crc kubenswrapper[4764]: I0309 14:15:01.486844 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29551095-z2m78" event={"ID":"697f22c2-daeb-4ae7-a0da-4e80bfad4cb0","Type":"ContainerStarted","Data":"2a4bac2ac8d89facbed3155024eab8ea4ab9e1aa36b9e90e3e95aabe36df67a2"}
Mar 09 14:15:02 crc kubenswrapper[4764]: I0309 14:15:02.496834 4764 generic.go:334] "Generic (PLEG): container finished" podID="697f22c2-daeb-4ae7-a0da-4e80bfad4cb0" containerID="ab97d625c108341b13c2db5e1056def5231f61fcd681f959a64e961f2a15864d" exitCode=0
Mar 09 14:15:02 crc kubenswrapper[4764]: I0309 14:15:02.496929 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29551095-z2m78" event={"ID":"697f22c2-daeb-4ae7-a0da-4e80bfad4cb0","Type":"ContainerDied","Data":"ab97d625c108341b13c2db5e1056def5231f61fcd681f959a64e961f2a15864d"}
Mar 09 14:15:03 crc kubenswrapper[4764]: I0309 14:15:03.877374 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29551095-z2m78"
Mar 09 14:15:04 crc kubenswrapper[4764]: I0309 14:15:04.012036 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n45lc\" (UniqueName: \"kubernetes.io/projected/697f22c2-daeb-4ae7-a0da-4e80bfad4cb0-kube-api-access-n45lc\") pod \"697f22c2-daeb-4ae7-a0da-4e80bfad4cb0\" (UID: \"697f22c2-daeb-4ae7-a0da-4e80bfad4cb0\") "
Mar 09 14:15:04 crc kubenswrapper[4764]: I0309 14:15:04.012142 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/697f22c2-daeb-4ae7-a0da-4e80bfad4cb0-secret-volume\") pod \"697f22c2-daeb-4ae7-a0da-4e80bfad4cb0\" (UID: \"697f22c2-daeb-4ae7-a0da-4e80bfad4cb0\") "
Mar 09 14:15:04 crc kubenswrapper[4764]: I0309 14:15:04.012323 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/697f22c2-daeb-4ae7-a0da-4e80bfad4cb0-config-volume\") pod \"697f22c2-daeb-4ae7-a0da-4e80bfad4cb0\" (UID: \"697f22c2-daeb-4ae7-a0da-4e80bfad4cb0\") "
Mar 09 14:15:04 crc kubenswrapper[4764]: I0309 14:15:04.013884 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/697f22c2-daeb-4ae7-a0da-4e80bfad4cb0-config-volume" (OuterVolumeSpecName: "config-volume") pod "697f22c2-daeb-4ae7-a0da-4e80bfad4cb0" (UID: "697f22c2-daeb-4ae7-a0da-4e80bfad4cb0"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 14:15:04 crc kubenswrapper[4764]: I0309 14:15:04.020117 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/697f22c2-daeb-4ae7-a0da-4e80bfad4cb0-kube-api-access-n45lc" (OuterVolumeSpecName: "kube-api-access-n45lc") pod "697f22c2-daeb-4ae7-a0da-4e80bfad4cb0" (UID: "697f22c2-daeb-4ae7-a0da-4e80bfad4cb0"). InnerVolumeSpecName "kube-api-access-n45lc". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 14:15:04 crc kubenswrapper[4764]: I0309 14:15:04.020837 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/697f22c2-daeb-4ae7-a0da-4e80bfad4cb0-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "697f22c2-daeb-4ae7-a0da-4e80bfad4cb0" (UID: "697f22c2-daeb-4ae7-a0da-4e80bfad4cb0"). InnerVolumeSpecName "secret-volume".
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:15:04 crc kubenswrapper[4764]: I0309 14:15:04.115621 4764 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/697f22c2-daeb-4ae7-a0da-4e80bfad4cb0-config-volume\") on node \"crc\" DevicePath \"\"" Mar 09 14:15:04 crc kubenswrapper[4764]: I0309 14:15:04.115950 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n45lc\" (UniqueName: \"kubernetes.io/projected/697f22c2-daeb-4ae7-a0da-4e80bfad4cb0-kube-api-access-n45lc\") on node \"crc\" DevicePath \"\"" Mar 09 14:15:04 crc kubenswrapper[4764]: I0309 14:15:04.116044 4764 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/697f22c2-daeb-4ae7-a0da-4e80bfad4cb0-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 09 14:15:04 crc kubenswrapper[4764]: I0309 14:15:04.524377 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29551095-z2m78" event={"ID":"697f22c2-daeb-4ae7-a0da-4e80bfad4cb0","Type":"ContainerDied","Data":"2a4bac2ac8d89facbed3155024eab8ea4ab9e1aa36b9e90e3e95aabe36df67a2"} Mar 09 14:15:04 crc kubenswrapper[4764]: I0309 14:15:04.525189 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2a4bac2ac8d89facbed3155024eab8ea4ab9e1aa36b9e90e3e95aabe36df67a2" Mar 09 14:15:04 crc kubenswrapper[4764]: I0309 14:15:04.524616 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29551095-z2m78" Mar 09 14:15:04 crc kubenswrapper[4764]: I0309 14:15:04.973761 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29551050-9wvsp"] Mar 09 14:15:04 crc kubenswrapper[4764]: I0309 14:15:04.982162 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29551050-9wvsp"] Mar 09 14:15:05 crc kubenswrapper[4764]: I0309 14:15:05.579338 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1a7784d6-384a-426a-8c7f-17738461c327" path="/var/lib/kubelet/pods/1a7784d6-384a-426a-8c7f-17738461c327/volumes" Mar 09 14:15:06 crc kubenswrapper[4764]: I0309 14:15:06.560494 4764 scope.go:117] "RemoveContainer" containerID="085fb58e00f347ac089fd33d79299a3879c66993685cb2c39951f8d234eb8cfb" Mar 09 14:15:06 crc kubenswrapper[4764]: E0309 14:15:06.561097 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xxczl_openshift-machine-config-operator(6bcdd179-43c2-427c-9fac-7155c122e922)\"" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922" Mar 09 14:15:10 crc kubenswrapper[4764]: I0309 14:15:10.397893 4764 scope.go:117] "RemoveContainer" containerID="aa846508ebc812b1aaea2ee2e48b6017bd527c31a06deb61dd1001786fb1b811" Mar 09 14:15:20 crc kubenswrapper[4764]: I0309 14:15:20.561571 4764 scope.go:117] "RemoveContainer" containerID="085fb58e00f347ac089fd33d79299a3879c66993685cb2c39951f8d234eb8cfb" Mar 09 14:15:20 crc kubenswrapper[4764]: E0309 14:15:20.565879 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-xxczl_openshift-machine-config-operator(6bcdd179-43c2-427c-9fac-7155c122e922)\"" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922" Mar 09 14:15:34 crc kubenswrapper[4764]: I0309 14:15:34.560717 4764 scope.go:117] "RemoveContainer" containerID="085fb58e00f347ac089fd33d79299a3879c66993685cb2c39951f8d234eb8cfb" Mar 09 14:15:34 crc kubenswrapper[4764]: I0309 14:15:34.857511 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" event={"ID":"6bcdd179-43c2-427c-9fac-7155c122e922","Type":"ContainerStarted","Data":"e0fd7c06957b904af3795728b54332eb3bc1c0b6a8df56e1a4bc3599bdedd8b5"} Mar 09 14:16:00 crc kubenswrapper[4764]: I0309 14:16:00.195778 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29551096-9bl56"] Mar 09 14:16:00 crc kubenswrapper[4764]: E0309 14:16:00.197078 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="697f22c2-daeb-4ae7-a0da-4e80bfad4cb0" containerName="collect-profiles" Mar 09 14:16:00 crc kubenswrapper[4764]: I0309 14:16:00.197098 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="697f22c2-daeb-4ae7-a0da-4e80bfad4cb0" containerName="collect-profiles" Mar 09 14:16:00 crc kubenswrapper[4764]: I0309 14:16:00.197378 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="697f22c2-daeb-4ae7-a0da-4e80bfad4cb0" containerName="collect-profiles" Mar 09 14:16:00 crc kubenswrapper[4764]: I0309 14:16:00.198262 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551096-9bl56" Mar 09 14:16:00 crc kubenswrapper[4764]: I0309 14:16:00.204028 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 09 14:16:00 crc kubenswrapper[4764]: I0309 14:16:00.204246 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 09 14:16:00 crc kubenswrapper[4764]: I0309 14:16:00.205059 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-4s7mr" Mar 09 14:16:00 crc kubenswrapper[4764]: I0309 14:16:00.212716 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551096-9bl56"] Mar 09 14:16:00 crc kubenswrapper[4764]: I0309 14:16:00.330179 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2pkrd\" (UniqueName: \"kubernetes.io/projected/40ff3557-e9fd-4c9a-bc94-b5a03f4e71c3-kube-api-access-2pkrd\") pod \"auto-csr-approver-29551096-9bl56\" (UID: \"40ff3557-e9fd-4c9a-bc94-b5a03f4e71c3\") " pod="openshift-infra/auto-csr-approver-29551096-9bl56" Mar 09 14:16:00 crc kubenswrapper[4764]: I0309 14:16:00.433116 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2pkrd\" (UniqueName: \"kubernetes.io/projected/40ff3557-e9fd-4c9a-bc94-b5a03f4e71c3-kube-api-access-2pkrd\") pod \"auto-csr-approver-29551096-9bl56\" (UID: \"40ff3557-e9fd-4c9a-bc94-b5a03f4e71c3\") " pod="openshift-infra/auto-csr-approver-29551096-9bl56" Mar 09 14:16:00 crc kubenswrapper[4764]: I0309 14:16:00.458383 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2pkrd\" (UniqueName: \"kubernetes.io/projected/40ff3557-e9fd-4c9a-bc94-b5a03f4e71c3-kube-api-access-2pkrd\") pod \"auto-csr-approver-29551096-9bl56\" (UID: \"40ff3557-e9fd-4c9a-bc94-b5a03f4e71c3\") " 
pod="openshift-infra/auto-csr-approver-29551096-9bl56" Mar 09 14:16:00 crc kubenswrapper[4764]: I0309 14:16:00.522264 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551096-9bl56" Mar 09 14:16:01 crc kubenswrapper[4764]: I0309 14:16:01.057706 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551096-9bl56"] Mar 09 14:16:01 crc kubenswrapper[4764]: I0309 14:16:01.113047 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551096-9bl56" event={"ID":"40ff3557-e9fd-4c9a-bc94-b5a03f4e71c3","Type":"ContainerStarted","Data":"ec35db2146bde758526b41276fc340a90033203c67b677f74b8aef8ab3a243b1"} Mar 09 14:16:03 crc kubenswrapper[4764]: I0309 14:16:03.134847 4764 generic.go:334] "Generic (PLEG): container finished" podID="40ff3557-e9fd-4c9a-bc94-b5a03f4e71c3" containerID="ec447d2d121a3c0de24bcb688dbed1ad219802d16c0a98f40138fd4ac8dc4b32" exitCode=0 Mar 09 14:16:03 crc kubenswrapper[4764]: I0309 14:16:03.134936 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551096-9bl56" event={"ID":"40ff3557-e9fd-4c9a-bc94-b5a03f4e71c3","Type":"ContainerDied","Data":"ec447d2d121a3c0de24bcb688dbed1ad219802d16c0a98f40138fd4ac8dc4b32"} Mar 09 14:16:04 crc kubenswrapper[4764]: I0309 14:16:04.657126 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551096-9bl56" Mar 09 14:16:04 crc kubenswrapper[4764]: I0309 14:16:04.736272 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2pkrd\" (UniqueName: \"kubernetes.io/projected/40ff3557-e9fd-4c9a-bc94-b5a03f4e71c3-kube-api-access-2pkrd\") pod \"40ff3557-e9fd-4c9a-bc94-b5a03f4e71c3\" (UID: \"40ff3557-e9fd-4c9a-bc94-b5a03f4e71c3\") " Mar 09 14:16:04 crc kubenswrapper[4764]: I0309 14:16:04.744387 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/40ff3557-e9fd-4c9a-bc94-b5a03f4e71c3-kube-api-access-2pkrd" (OuterVolumeSpecName: "kube-api-access-2pkrd") pod "40ff3557-e9fd-4c9a-bc94-b5a03f4e71c3" (UID: "40ff3557-e9fd-4c9a-bc94-b5a03f4e71c3"). InnerVolumeSpecName "kube-api-access-2pkrd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:16:04 crc kubenswrapper[4764]: I0309 14:16:04.840040 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2pkrd\" (UniqueName: \"kubernetes.io/projected/40ff3557-e9fd-4c9a-bc94-b5a03f4e71c3-kube-api-access-2pkrd\") on node \"crc\" DevicePath \"\"" Mar 09 14:16:05 crc kubenswrapper[4764]: I0309 14:16:05.176200 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551096-9bl56" event={"ID":"40ff3557-e9fd-4c9a-bc94-b5a03f4e71c3","Type":"ContainerDied","Data":"ec35db2146bde758526b41276fc340a90033203c67b677f74b8aef8ab3a243b1"} Mar 09 14:16:05 crc kubenswrapper[4764]: I0309 14:16:05.176253 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ec35db2146bde758526b41276fc340a90033203c67b677f74b8aef8ab3a243b1" Mar 09 14:16:05 crc kubenswrapper[4764]: I0309 14:16:05.176266 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551096-9bl56" Mar 09 14:16:05 crc kubenswrapper[4764]: I0309 14:16:05.758766 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29551090-8wdlp"] Mar 09 14:16:05 crc kubenswrapper[4764]: I0309 14:16:05.776211 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29551090-8wdlp"] Mar 09 14:16:07 crc kubenswrapper[4764]: I0309 14:16:07.571599 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4b56c1fe-b4de-45ba-8ca2-7bae98a2e97e" path="/var/lib/kubelet/pods/4b56c1fe-b4de-45ba-8ca2-7bae98a2e97e/volumes" Mar 09 14:16:10 crc kubenswrapper[4764]: I0309 14:16:10.495566 4764 scope.go:117] "RemoveContainer" containerID="30776e46d343f3c10f72753cb59ea338812b698d41259546e8c5f506b57f2e53" Mar 09 14:16:10 crc kubenswrapper[4764]: I0309 14:16:10.553759 4764 scope.go:117] "RemoveContainer" containerID="c50fe2a841af25f201b2df55f17fb4354fb702de4d79adbf221b5ea06463110b" Mar 09 14:16:10 crc kubenswrapper[4764]: I0309 14:16:10.609802 4764 scope.go:117] "RemoveContainer" containerID="ecd62f1b0eb5226209d3ecf6887c5a0f2d0feaa519397c875dd114ffc89d8b3b" Mar 09 14:17:58 crc kubenswrapper[4764]: I0309 14:17:58.369991 4764 patch_prober.go:28] interesting pod/machine-config-daemon-xxczl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 09 14:17:58 crc kubenswrapper[4764]: I0309 14:17:58.370737 4764 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 09 14:18:00 crc 
kubenswrapper[4764]: I0309 14:18:00.161501 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29551098-7clls"] Mar 09 14:18:00 crc kubenswrapper[4764]: E0309 14:18:00.162614 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="40ff3557-e9fd-4c9a-bc94-b5a03f4e71c3" containerName="oc" Mar 09 14:18:00 crc kubenswrapper[4764]: I0309 14:18:00.162631 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="40ff3557-e9fd-4c9a-bc94-b5a03f4e71c3" containerName="oc" Mar 09 14:18:00 crc kubenswrapper[4764]: I0309 14:18:00.162899 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="40ff3557-e9fd-4c9a-bc94-b5a03f4e71c3" containerName="oc" Mar 09 14:18:00 crc kubenswrapper[4764]: I0309 14:18:00.163851 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551098-7clls" Mar 09 14:18:00 crc kubenswrapper[4764]: I0309 14:18:00.166944 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-4s7mr" Mar 09 14:18:00 crc kubenswrapper[4764]: I0309 14:18:00.167027 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 09 14:18:00 crc kubenswrapper[4764]: I0309 14:18:00.167378 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 09 14:18:00 crc kubenswrapper[4764]: I0309 14:18:00.189907 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551098-7clls"] Mar 09 14:18:00 crc kubenswrapper[4764]: I0309 14:18:00.308008 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v5sck\" (UniqueName: \"kubernetes.io/projected/e86714ea-a59a-4955-b4a5-038ce0ce7bf6-kube-api-access-v5sck\") pod \"auto-csr-approver-29551098-7clls\" (UID: \"e86714ea-a59a-4955-b4a5-038ce0ce7bf6\") " 
pod="openshift-infra/auto-csr-approver-29551098-7clls" Mar 09 14:18:00 crc kubenswrapper[4764]: I0309 14:18:00.411438 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v5sck\" (UniqueName: \"kubernetes.io/projected/e86714ea-a59a-4955-b4a5-038ce0ce7bf6-kube-api-access-v5sck\") pod \"auto-csr-approver-29551098-7clls\" (UID: \"e86714ea-a59a-4955-b4a5-038ce0ce7bf6\") " pod="openshift-infra/auto-csr-approver-29551098-7clls" Mar 09 14:18:00 crc kubenswrapper[4764]: I0309 14:18:00.451805 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v5sck\" (UniqueName: \"kubernetes.io/projected/e86714ea-a59a-4955-b4a5-038ce0ce7bf6-kube-api-access-v5sck\") pod \"auto-csr-approver-29551098-7clls\" (UID: \"e86714ea-a59a-4955-b4a5-038ce0ce7bf6\") " pod="openshift-infra/auto-csr-approver-29551098-7clls" Mar 09 14:18:00 crc kubenswrapper[4764]: I0309 14:18:00.490684 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551098-7clls" Mar 09 14:18:01 crc kubenswrapper[4764]: I0309 14:18:01.009981 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551098-7clls"] Mar 09 14:18:01 crc kubenswrapper[4764]: I0309 14:18:01.041241 4764 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 09 14:18:01 crc kubenswrapper[4764]: I0309 14:18:01.104125 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551098-7clls" event={"ID":"e86714ea-a59a-4955-b4a5-038ce0ce7bf6","Type":"ContainerStarted","Data":"a22ac519a9a0cc5b0000c00fd0bf97e4d6fdb2c0405f4eabe925989232e16a69"} Mar 09 14:18:03 crc kubenswrapper[4764]: I0309 14:18:03.125905 4764 generic.go:334] "Generic (PLEG): container finished" podID="e86714ea-a59a-4955-b4a5-038ce0ce7bf6" containerID="c9de23699d45475689268b26fadafce561fa4c6da1922b49b79542286bd5c95d" exitCode=0 Mar 
09 14:18:03 crc kubenswrapper[4764]: I0309 14:18:03.126515 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551098-7clls" event={"ID":"e86714ea-a59a-4955-b4a5-038ce0ce7bf6","Type":"ContainerDied","Data":"c9de23699d45475689268b26fadafce561fa4c6da1922b49b79542286bd5c95d"} Mar 09 14:18:04 crc kubenswrapper[4764]: I0309 14:18:04.640579 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551098-7clls" Mar 09 14:18:04 crc kubenswrapper[4764]: I0309 14:18:04.837195 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v5sck\" (UniqueName: \"kubernetes.io/projected/e86714ea-a59a-4955-b4a5-038ce0ce7bf6-kube-api-access-v5sck\") pod \"e86714ea-a59a-4955-b4a5-038ce0ce7bf6\" (UID: \"e86714ea-a59a-4955-b4a5-038ce0ce7bf6\") " Mar 09 14:18:04 crc kubenswrapper[4764]: I0309 14:18:04.843736 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e86714ea-a59a-4955-b4a5-038ce0ce7bf6-kube-api-access-v5sck" (OuterVolumeSpecName: "kube-api-access-v5sck") pod "e86714ea-a59a-4955-b4a5-038ce0ce7bf6" (UID: "e86714ea-a59a-4955-b4a5-038ce0ce7bf6"). InnerVolumeSpecName "kube-api-access-v5sck". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:18:04 crc kubenswrapper[4764]: I0309 14:18:04.940118 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v5sck\" (UniqueName: \"kubernetes.io/projected/e86714ea-a59a-4955-b4a5-038ce0ce7bf6-kube-api-access-v5sck\") on node \"crc\" DevicePath \"\"" Mar 09 14:18:05 crc kubenswrapper[4764]: I0309 14:18:05.148600 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551098-7clls" event={"ID":"e86714ea-a59a-4955-b4a5-038ce0ce7bf6","Type":"ContainerDied","Data":"a22ac519a9a0cc5b0000c00fd0bf97e4d6fdb2c0405f4eabe925989232e16a69"} Mar 09 14:18:05 crc kubenswrapper[4764]: I0309 14:18:05.149827 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a22ac519a9a0cc5b0000c00fd0bf97e4d6fdb2c0405f4eabe925989232e16a69" Mar 09 14:18:05 crc kubenswrapper[4764]: I0309 14:18:05.149967 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551098-7clls" Mar 09 14:18:05 crc kubenswrapper[4764]: I0309 14:18:05.748726 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29551092-qs44j"] Mar 09 14:18:05 crc kubenswrapper[4764]: I0309 14:18:05.754455 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29551092-qs44j"] Mar 09 14:18:07 crc kubenswrapper[4764]: I0309 14:18:07.572009 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="81a7f588-07b1-4ef1-97ee-420e944ad16b" path="/var/lib/kubelet/pods/81a7f588-07b1-4ef1-97ee-420e944ad16b/volumes" Mar 09 14:18:10 crc kubenswrapper[4764]: I0309 14:18:10.753930 4764 scope.go:117] "RemoveContainer" containerID="4148e832dbd4302f41c6dbad3eab96d625f5a7299088cd33baf2acb47b5d3267" Mar 09 14:18:28 crc kubenswrapper[4764]: I0309 14:18:28.370356 4764 patch_prober.go:28] interesting pod/machine-config-daemon-xxczl 
container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 09 14:18:28 crc kubenswrapper[4764]: I0309 14:18:28.373037 4764 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 09 14:18:32 crc kubenswrapper[4764]: I0309 14:18:32.041518 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-qrmxw"] Mar 09 14:18:32 crc kubenswrapper[4764]: E0309 14:18:32.043917 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e86714ea-a59a-4955-b4a5-038ce0ce7bf6" containerName="oc" Mar 09 14:18:32 crc kubenswrapper[4764]: I0309 14:18:32.043942 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="e86714ea-a59a-4955-b4a5-038ce0ce7bf6" containerName="oc" Mar 09 14:18:32 crc kubenswrapper[4764]: I0309 14:18:32.044346 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="e86714ea-a59a-4955-b4a5-038ce0ce7bf6" containerName="oc" Mar 09 14:18:32 crc kubenswrapper[4764]: I0309 14:18:32.046935 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-qrmxw" Mar 09 14:18:32 crc kubenswrapper[4764]: I0309 14:18:32.059804 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-qrmxw"] Mar 09 14:18:32 crc kubenswrapper[4764]: I0309 14:18:32.172818 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e756ebeb-906e-4f9e-8abc-d254ffed03b7-utilities\") pod \"community-operators-qrmxw\" (UID: \"e756ebeb-906e-4f9e-8abc-d254ffed03b7\") " pod="openshift-marketplace/community-operators-qrmxw" Mar 09 14:18:32 crc kubenswrapper[4764]: I0309 14:18:32.172896 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e756ebeb-906e-4f9e-8abc-d254ffed03b7-catalog-content\") pod \"community-operators-qrmxw\" (UID: \"e756ebeb-906e-4f9e-8abc-d254ffed03b7\") " pod="openshift-marketplace/community-operators-qrmxw" Mar 09 14:18:32 crc kubenswrapper[4764]: I0309 14:18:32.172934 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d9zcl\" (UniqueName: \"kubernetes.io/projected/e756ebeb-906e-4f9e-8abc-d254ffed03b7-kube-api-access-d9zcl\") pod \"community-operators-qrmxw\" (UID: \"e756ebeb-906e-4f9e-8abc-d254ffed03b7\") " pod="openshift-marketplace/community-operators-qrmxw" Mar 09 14:18:32 crc kubenswrapper[4764]: I0309 14:18:32.275859 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e756ebeb-906e-4f9e-8abc-d254ffed03b7-utilities\") pod \"community-operators-qrmxw\" (UID: \"e756ebeb-906e-4f9e-8abc-d254ffed03b7\") " pod="openshift-marketplace/community-operators-qrmxw" Mar 09 14:18:32 crc kubenswrapper[4764]: I0309 14:18:32.275958 4764 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e756ebeb-906e-4f9e-8abc-d254ffed03b7-catalog-content\") pod \"community-operators-qrmxw\" (UID: \"e756ebeb-906e-4f9e-8abc-d254ffed03b7\") " pod="openshift-marketplace/community-operators-qrmxw" Mar 09 14:18:32 crc kubenswrapper[4764]: I0309 14:18:32.276021 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d9zcl\" (UniqueName: \"kubernetes.io/projected/e756ebeb-906e-4f9e-8abc-d254ffed03b7-kube-api-access-d9zcl\") pod \"community-operators-qrmxw\" (UID: \"e756ebeb-906e-4f9e-8abc-d254ffed03b7\") " pod="openshift-marketplace/community-operators-qrmxw" Mar 09 14:18:32 crc kubenswrapper[4764]: I0309 14:18:32.276749 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e756ebeb-906e-4f9e-8abc-d254ffed03b7-utilities\") pod \"community-operators-qrmxw\" (UID: \"e756ebeb-906e-4f9e-8abc-d254ffed03b7\") " pod="openshift-marketplace/community-operators-qrmxw" Mar 09 14:18:32 crc kubenswrapper[4764]: I0309 14:18:32.277065 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e756ebeb-906e-4f9e-8abc-d254ffed03b7-catalog-content\") pod \"community-operators-qrmxw\" (UID: \"e756ebeb-906e-4f9e-8abc-d254ffed03b7\") " pod="openshift-marketplace/community-operators-qrmxw" Mar 09 14:18:32 crc kubenswrapper[4764]: I0309 14:18:32.297290 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d9zcl\" (UniqueName: \"kubernetes.io/projected/e756ebeb-906e-4f9e-8abc-d254ffed03b7-kube-api-access-d9zcl\") pod \"community-operators-qrmxw\" (UID: \"e756ebeb-906e-4f9e-8abc-d254ffed03b7\") " pod="openshift-marketplace/community-operators-qrmxw" Mar 09 14:18:32 crc kubenswrapper[4764]: I0309 14:18:32.377404 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-qrmxw"
Mar 09 14:18:33 crc kubenswrapper[4764]: I0309 14:18:33.069372 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-qrmxw"]
Mar 09 14:18:33 crc kubenswrapper[4764]: I0309 14:18:33.420905 4764 generic.go:334] "Generic (PLEG): container finished" podID="e756ebeb-906e-4f9e-8abc-d254ffed03b7" containerID="b2cca4f97913cf737df9850825be5c4968dfd8f52401c34f636f9062d699d84c" exitCode=0
Mar 09 14:18:33 crc kubenswrapper[4764]: I0309 14:18:33.421027 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qrmxw" event={"ID":"e756ebeb-906e-4f9e-8abc-d254ffed03b7","Type":"ContainerDied","Data":"b2cca4f97913cf737df9850825be5c4968dfd8f52401c34f636f9062d699d84c"}
Mar 09 14:18:33 crc kubenswrapper[4764]: I0309 14:18:33.421303 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qrmxw" event={"ID":"e756ebeb-906e-4f9e-8abc-d254ffed03b7","Type":"ContainerStarted","Data":"262fcd6db8b7bee48513ef2e76ec7c52b7eea78484a12e2ae46265c79cd504ba"}
Mar 09 14:18:34 crc kubenswrapper[4764]: I0309 14:18:34.435868 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qrmxw" event={"ID":"e756ebeb-906e-4f9e-8abc-d254ffed03b7","Type":"ContainerStarted","Data":"ca4f23dac650d5b34aa704ef04fde382995e9921c39239b2af5cbeb3d6f9c134"}
Mar 09 14:18:35 crc kubenswrapper[4764]: I0309 14:18:35.448208 4764 generic.go:334] "Generic (PLEG): container finished" podID="e756ebeb-906e-4f9e-8abc-d254ffed03b7" containerID="ca4f23dac650d5b34aa704ef04fde382995e9921c39239b2af5cbeb3d6f9c134" exitCode=0
Mar 09 14:18:35 crc kubenswrapper[4764]: I0309 14:18:35.448423 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qrmxw" event={"ID":"e756ebeb-906e-4f9e-8abc-d254ffed03b7","Type":"ContainerDied","Data":"ca4f23dac650d5b34aa704ef04fde382995e9921c39239b2af5cbeb3d6f9c134"}
Mar 09 14:18:36 crc kubenswrapper[4764]: I0309 14:18:36.458968 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qrmxw" event={"ID":"e756ebeb-906e-4f9e-8abc-d254ffed03b7","Type":"ContainerStarted","Data":"4887c69c88d174633ecaac7efec228f692e197680549737c11953b3bdc179e85"}
Mar 09 14:18:42 crc kubenswrapper[4764]: I0309 14:18:42.378633 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-qrmxw"
Mar 09 14:18:42 crc kubenswrapper[4764]: I0309 14:18:42.379699 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-qrmxw"
Mar 09 14:18:42 crc kubenswrapper[4764]: I0309 14:18:42.447443 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-qrmxw"
Mar 09 14:18:42 crc kubenswrapper[4764]: I0309 14:18:42.472451 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-qrmxw" podStartSLOduration=8.057431336 podStartE2EDuration="10.472413701s" podCreationTimestamp="2026-03-09 14:18:32 +0000 UTC" firstStartedPulling="2026-03-09 14:18:33.422914774 +0000 UTC m=+3468.673086672" lastFinishedPulling="2026-03-09 14:18:35.837897129 +0000 UTC m=+3471.088069037" observedRunningTime="2026-03-09 14:18:36.495860996 +0000 UTC m=+3471.746032904" watchObservedRunningTime="2026-03-09 14:18:42.472413701 +0000 UTC m=+3477.722585619"
Mar 09 14:18:42 crc kubenswrapper[4764]: I0309 14:18:42.583111 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-qrmxw"
Mar 09 14:18:42 crc kubenswrapper[4764]: I0309 14:18:42.686945 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-qrmxw"]
Mar 09 14:18:44 crc kubenswrapper[4764]: I0309 14:18:44.550133 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-qrmxw" podUID="e756ebeb-906e-4f9e-8abc-d254ffed03b7" containerName="registry-server" containerID="cri-o://4887c69c88d174633ecaac7efec228f692e197680549737c11953b3bdc179e85" gracePeriod=2
Mar 09 14:18:45 crc kubenswrapper[4764]: I0309 14:18:45.265294 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-qrmxw"
Mar 09 14:18:45 crc kubenswrapper[4764]: I0309 14:18:45.316763 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d9zcl\" (UniqueName: \"kubernetes.io/projected/e756ebeb-906e-4f9e-8abc-d254ffed03b7-kube-api-access-d9zcl\") pod \"e756ebeb-906e-4f9e-8abc-d254ffed03b7\" (UID: \"e756ebeb-906e-4f9e-8abc-d254ffed03b7\") "
Mar 09 14:18:45 crc kubenswrapper[4764]: I0309 14:18:45.317133 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e756ebeb-906e-4f9e-8abc-d254ffed03b7-catalog-content\") pod \"e756ebeb-906e-4f9e-8abc-d254ffed03b7\" (UID: \"e756ebeb-906e-4f9e-8abc-d254ffed03b7\") "
Mar 09 14:18:45 crc kubenswrapper[4764]: I0309 14:18:45.317221 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e756ebeb-906e-4f9e-8abc-d254ffed03b7-utilities\") pod \"e756ebeb-906e-4f9e-8abc-d254ffed03b7\" (UID: \"e756ebeb-906e-4f9e-8abc-d254ffed03b7\") "
Mar 09 14:18:45 crc kubenswrapper[4764]: I0309 14:18:45.319824 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e756ebeb-906e-4f9e-8abc-d254ffed03b7-utilities" (OuterVolumeSpecName: "utilities") pod "e756ebeb-906e-4f9e-8abc-d254ffed03b7" (UID: "e756ebeb-906e-4f9e-8abc-d254ffed03b7"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 09 14:18:45 crc kubenswrapper[4764]: I0309 14:18:45.324486 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e756ebeb-906e-4f9e-8abc-d254ffed03b7-kube-api-access-d9zcl" (OuterVolumeSpecName: "kube-api-access-d9zcl") pod "e756ebeb-906e-4f9e-8abc-d254ffed03b7" (UID: "e756ebeb-906e-4f9e-8abc-d254ffed03b7"). InnerVolumeSpecName "kube-api-access-d9zcl". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 14:18:45 crc kubenswrapper[4764]: I0309 14:18:45.371889 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e756ebeb-906e-4f9e-8abc-d254ffed03b7-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e756ebeb-906e-4f9e-8abc-d254ffed03b7" (UID: "e756ebeb-906e-4f9e-8abc-d254ffed03b7"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 09 14:18:45 crc kubenswrapper[4764]: I0309 14:18:45.420803 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d9zcl\" (UniqueName: \"kubernetes.io/projected/e756ebeb-906e-4f9e-8abc-d254ffed03b7-kube-api-access-d9zcl\") on node \"crc\" DevicePath \"\""
Mar 09 14:18:45 crc kubenswrapper[4764]: I0309 14:18:45.420851 4764 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e756ebeb-906e-4f9e-8abc-d254ffed03b7-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 09 14:18:45 crc kubenswrapper[4764]: I0309 14:18:45.420863 4764 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e756ebeb-906e-4f9e-8abc-d254ffed03b7-utilities\") on node \"crc\" DevicePath \"\""
Mar 09 14:18:45 crc kubenswrapper[4764]: I0309 14:18:45.582255 4764 generic.go:334] "Generic (PLEG): container finished" podID="e756ebeb-906e-4f9e-8abc-d254ffed03b7" containerID="4887c69c88d174633ecaac7efec228f692e197680549737c11953b3bdc179e85" exitCode=0
Mar 09 14:18:45 crc kubenswrapper[4764]: I0309 14:18:45.582365 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-qrmxw"
Mar 09 14:18:45 crc kubenswrapper[4764]: I0309 14:18:45.583962 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qrmxw" event={"ID":"e756ebeb-906e-4f9e-8abc-d254ffed03b7","Type":"ContainerDied","Data":"4887c69c88d174633ecaac7efec228f692e197680549737c11953b3bdc179e85"}
Mar 09 14:18:45 crc kubenswrapper[4764]: I0309 14:18:45.584008 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qrmxw" event={"ID":"e756ebeb-906e-4f9e-8abc-d254ffed03b7","Type":"ContainerDied","Data":"262fcd6db8b7bee48513ef2e76ec7c52b7eea78484a12e2ae46265c79cd504ba"}
Mar 09 14:18:45 crc kubenswrapper[4764]: I0309 14:18:45.584696 4764 scope.go:117] "RemoveContainer" containerID="4887c69c88d174633ecaac7efec228f692e197680549737c11953b3bdc179e85"
Mar 09 14:18:45 crc kubenswrapper[4764]: I0309 14:18:45.639430 4764 scope.go:117] "RemoveContainer" containerID="ca4f23dac650d5b34aa704ef04fde382995e9921c39239b2af5cbeb3d6f9c134"
Mar 09 14:18:45 crc kubenswrapper[4764]: I0309 14:18:45.655092 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-qrmxw"]
Mar 09 14:18:45 crc kubenswrapper[4764]: I0309 14:18:45.667231 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-qrmxw"]
Mar 09 14:18:45 crc kubenswrapper[4764]: I0309 14:18:45.668761 4764 scope.go:117] "RemoveContainer" containerID="b2cca4f97913cf737df9850825be5c4968dfd8f52401c34f636f9062d699d84c"
Mar 09 14:18:45 crc kubenswrapper[4764]: I0309 14:18:45.715890 4764 scope.go:117] "RemoveContainer" containerID="4887c69c88d174633ecaac7efec228f692e197680549737c11953b3bdc179e85"
Mar 09 14:18:45 crc kubenswrapper[4764]: E0309 14:18:45.716827 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4887c69c88d174633ecaac7efec228f692e197680549737c11953b3bdc179e85\": container with ID starting with 4887c69c88d174633ecaac7efec228f692e197680549737c11953b3bdc179e85 not found: ID does not exist" containerID="4887c69c88d174633ecaac7efec228f692e197680549737c11953b3bdc179e85"
Mar 09 14:18:45 crc kubenswrapper[4764]: I0309 14:18:45.716870 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4887c69c88d174633ecaac7efec228f692e197680549737c11953b3bdc179e85"} err="failed to get container status \"4887c69c88d174633ecaac7efec228f692e197680549737c11953b3bdc179e85\": rpc error: code = NotFound desc = could not find container \"4887c69c88d174633ecaac7efec228f692e197680549737c11953b3bdc179e85\": container with ID starting with 4887c69c88d174633ecaac7efec228f692e197680549737c11953b3bdc179e85 not found: ID does not exist"
Mar 09 14:18:45 crc kubenswrapper[4764]: I0309 14:18:45.716898 4764 scope.go:117] "RemoveContainer" containerID="ca4f23dac650d5b34aa704ef04fde382995e9921c39239b2af5cbeb3d6f9c134"
Mar 09 14:18:45 crc kubenswrapper[4764]: E0309 14:18:45.717222 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ca4f23dac650d5b34aa704ef04fde382995e9921c39239b2af5cbeb3d6f9c134\": container with ID starting with ca4f23dac650d5b34aa704ef04fde382995e9921c39239b2af5cbeb3d6f9c134 not found: ID does not exist" containerID="ca4f23dac650d5b34aa704ef04fde382995e9921c39239b2af5cbeb3d6f9c134"
Mar 09 14:18:45 crc kubenswrapper[4764]: I0309 14:18:45.717303 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ca4f23dac650d5b34aa704ef04fde382995e9921c39239b2af5cbeb3d6f9c134"} err="failed to get container status \"ca4f23dac650d5b34aa704ef04fde382995e9921c39239b2af5cbeb3d6f9c134\": rpc error: code = NotFound desc = could not find container \"ca4f23dac650d5b34aa704ef04fde382995e9921c39239b2af5cbeb3d6f9c134\": container with ID starting with ca4f23dac650d5b34aa704ef04fde382995e9921c39239b2af5cbeb3d6f9c134 not found: ID does not exist"
Mar 09 14:18:45 crc kubenswrapper[4764]: I0309 14:18:45.717372 4764 scope.go:117] "RemoveContainer" containerID="b2cca4f97913cf737df9850825be5c4968dfd8f52401c34f636f9062d699d84c"
Mar 09 14:18:45 crc kubenswrapper[4764]: E0309 14:18:45.718014 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b2cca4f97913cf737df9850825be5c4968dfd8f52401c34f636f9062d699d84c\": container with ID starting with b2cca4f97913cf737df9850825be5c4968dfd8f52401c34f636f9062d699d84c not found: ID does not exist" containerID="b2cca4f97913cf737df9850825be5c4968dfd8f52401c34f636f9062d699d84c"
Mar 09 14:18:45 crc kubenswrapper[4764]: I0309 14:18:45.718042 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b2cca4f97913cf737df9850825be5c4968dfd8f52401c34f636f9062d699d84c"} err="failed to get container status \"b2cca4f97913cf737df9850825be5c4968dfd8f52401c34f636f9062d699d84c\": rpc error: code = NotFound desc = could not find container \"b2cca4f97913cf737df9850825be5c4968dfd8f52401c34f636f9062d699d84c\": container with ID starting with b2cca4f97913cf737df9850825be5c4968dfd8f52401c34f636f9062d699d84c not found: ID does not exist"
Mar 09 14:18:47 crc kubenswrapper[4764]: I0309 14:18:47.573172 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e756ebeb-906e-4f9e-8abc-d254ffed03b7" path="/var/lib/kubelet/pods/e756ebeb-906e-4f9e-8abc-d254ffed03b7/volumes"
Mar 09 14:18:58 crc kubenswrapper[4764]: I0309 14:18:58.370834 4764 patch_prober.go:28] interesting pod/machine-config-daemon-xxczl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 09 14:18:58 crc kubenswrapper[4764]: I0309 14:18:58.371473 4764 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 09 14:18:58 crc kubenswrapper[4764]: I0309 14:18:58.371533 4764 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-xxczl"
Mar 09 14:18:58 crc kubenswrapper[4764]: I0309 14:18:58.372513 4764 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"e0fd7c06957b904af3795728b54332eb3bc1c0b6a8df56e1a4bc3599bdedd8b5"} pod="openshift-machine-config-operator/machine-config-daemon-xxczl" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Mar 09 14:18:58 crc kubenswrapper[4764]: I0309 14:18:58.372572 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922" containerName="machine-config-daemon" containerID="cri-o://e0fd7c06957b904af3795728b54332eb3bc1c0b6a8df56e1a4bc3599bdedd8b5" gracePeriod=600
Mar 09 14:18:58 crc kubenswrapper[4764]: I0309 14:18:58.731137 4764 generic.go:334] "Generic (PLEG): container finished" podID="6bcdd179-43c2-427c-9fac-7155c122e922" containerID="e0fd7c06957b904af3795728b54332eb3bc1c0b6a8df56e1a4bc3599bdedd8b5" exitCode=0
Mar 09 14:18:58 crc kubenswrapper[4764]: I0309 14:18:58.731859 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" event={"ID":"6bcdd179-43c2-427c-9fac-7155c122e922","Type":"ContainerDied","Data":"e0fd7c06957b904af3795728b54332eb3bc1c0b6a8df56e1a4bc3599bdedd8b5"}
Mar 09 14:18:58 crc kubenswrapper[4764]: I0309 14:18:58.731907 4764 scope.go:117] "RemoveContainer" containerID="085fb58e00f347ac089fd33d79299a3879c66993685cb2c39951f8d234eb8cfb"
Mar 09 14:18:59 crc kubenswrapper[4764]: I0309 14:18:59.744554 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" event={"ID":"6bcdd179-43c2-427c-9fac-7155c122e922","Type":"ContainerStarted","Data":"c10d86bbf5c5d5ea5875f026de2e5f7abb576968d3567040b8020dcc93a08096"}
Mar 09 14:19:48 crc kubenswrapper[4764]: I0309 14:19:48.046253 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-db-create-fdg68"]
Mar 09 14:19:48 crc kubenswrapper[4764]: I0309 14:19:48.055358 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/manila-db-create-fdg68"]
Mar 09 14:19:49 crc kubenswrapper[4764]: I0309 14:19:49.034612 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-1df0-account-create-update-bg564"]
Mar 09 14:19:49 crc kubenswrapper[4764]: I0309 14:19:49.049523 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/manila-1df0-account-create-update-bg564"]
Mar 09 14:19:49 crc kubenswrapper[4764]: I0309 14:19:49.572114 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5e1948a5-46f6-412d-91c3-bf9c255e02fc" path="/var/lib/kubelet/pods/5e1948a5-46f6-412d-91c3-bf9c255e02fc/volumes"
Mar 09 14:19:49 crc kubenswrapper[4764]: I0309 14:19:49.573879 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7ce9bce5-9c23-40ac-9683-6fb232e32c3c" path="/var/lib/kubelet/pods/7ce9bce5-9c23-40ac-9683-6fb232e32c3c/volumes"
Mar 09 14:20:00 crc kubenswrapper[4764]: I0309 14:20:00.170891 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29551100-l6b4v"]
Mar 09 14:20:00 crc kubenswrapper[4764]: E0309 14:20:00.172303 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e756ebeb-906e-4f9e-8abc-d254ffed03b7" containerName="extract-content"
Mar 09 14:20:00 crc kubenswrapper[4764]: I0309 14:20:00.172325 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="e756ebeb-906e-4f9e-8abc-d254ffed03b7" containerName="extract-content"
Mar 09 14:20:00 crc kubenswrapper[4764]: E0309 14:20:00.172343 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e756ebeb-906e-4f9e-8abc-d254ffed03b7" containerName="extract-utilities"
Mar 09 14:20:00 crc kubenswrapper[4764]: I0309 14:20:00.172350 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="e756ebeb-906e-4f9e-8abc-d254ffed03b7" containerName="extract-utilities"
Mar 09 14:20:00 crc kubenswrapper[4764]: E0309 14:20:00.172369 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e756ebeb-906e-4f9e-8abc-d254ffed03b7" containerName="registry-server"
Mar 09 14:20:00 crc kubenswrapper[4764]: I0309 14:20:00.172376 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="e756ebeb-906e-4f9e-8abc-d254ffed03b7" containerName="registry-server"
Mar 09 14:20:00 crc kubenswrapper[4764]: I0309 14:20:00.172721 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="e756ebeb-906e-4f9e-8abc-d254ffed03b7" containerName="registry-server"
Mar 09 14:20:00 crc kubenswrapper[4764]: I0309 14:20:00.173811 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551100-l6b4v"
Mar 09 14:20:00 crc kubenswrapper[4764]: I0309 14:20:00.177133 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-4s7mr"
Mar 09 14:20:00 crc kubenswrapper[4764]: I0309 14:20:00.177509 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 09 14:20:00 crc kubenswrapper[4764]: I0309 14:20:00.178699 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 09 14:20:00 crc kubenswrapper[4764]: I0309 14:20:00.202262 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551100-l6b4v"]
Mar 09 14:20:00 crc kubenswrapper[4764]: I0309 14:20:00.272670 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xl5lx\" (UniqueName: \"kubernetes.io/projected/ac3d8a45-0030-433c-a813-fa93811b952f-kube-api-access-xl5lx\") pod \"auto-csr-approver-29551100-l6b4v\" (UID: \"ac3d8a45-0030-433c-a813-fa93811b952f\") " pod="openshift-infra/auto-csr-approver-29551100-l6b4v"
Mar 09 14:20:00 crc kubenswrapper[4764]: I0309 14:20:00.375347 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xl5lx\" (UniqueName: \"kubernetes.io/projected/ac3d8a45-0030-433c-a813-fa93811b952f-kube-api-access-xl5lx\") pod \"auto-csr-approver-29551100-l6b4v\" (UID: \"ac3d8a45-0030-433c-a813-fa93811b952f\") " pod="openshift-infra/auto-csr-approver-29551100-l6b4v"
Mar 09 14:20:00 crc kubenswrapper[4764]: I0309 14:20:00.396352 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xl5lx\" (UniqueName: \"kubernetes.io/projected/ac3d8a45-0030-433c-a813-fa93811b952f-kube-api-access-xl5lx\") pod \"auto-csr-approver-29551100-l6b4v\" (UID: \"ac3d8a45-0030-433c-a813-fa93811b952f\") " pod="openshift-infra/auto-csr-approver-29551100-l6b4v"
Mar 09 14:20:00 crc kubenswrapper[4764]: I0309 14:20:00.498608 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551100-l6b4v"
Mar 09 14:20:00 crc kubenswrapper[4764]: I0309 14:20:00.986562 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551100-l6b4v"]
Mar 09 14:20:01 crc kubenswrapper[4764]: I0309 14:20:01.389080 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551100-l6b4v" event={"ID":"ac3d8a45-0030-433c-a813-fa93811b952f","Type":"ContainerStarted","Data":"2f39d774e1aa1eb9ee3e30642b6b31806b792b18668b4e93d0df0af150d71706"}
Mar 09 14:20:03 crc kubenswrapper[4764]: I0309 14:20:03.416113 4764 generic.go:334] "Generic (PLEG): container finished" podID="ac3d8a45-0030-433c-a813-fa93811b952f" containerID="0555c912480a032e6e1167209d44998b6306e07cd673abce316913bd071bd80f" exitCode=0
Mar 09 14:20:03 crc kubenswrapper[4764]: I0309 14:20:03.416236 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551100-l6b4v" event={"ID":"ac3d8a45-0030-433c-a813-fa93811b952f","Type":"ContainerDied","Data":"0555c912480a032e6e1167209d44998b6306e07cd673abce316913bd071bd80f"}
Mar 09 14:20:04 crc kubenswrapper[4764]: I0309 14:20:04.818711 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551100-l6b4v"
Mar 09 14:20:04 crc kubenswrapper[4764]: I0309 14:20:04.887316 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xl5lx\" (UniqueName: \"kubernetes.io/projected/ac3d8a45-0030-433c-a813-fa93811b952f-kube-api-access-xl5lx\") pod \"ac3d8a45-0030-433c-a813-fa93811b952f\" (UID: \"ac3d8a45-0030-433c-a813-fa93811b952f\") "
Mar 09 14:20:04 crc kubenswrapper[4764]: I0309 14:20:04.895720 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ac3d8a45-0030-433c-a813-fa93811b952f-kube-api-access-xl5lx" (OuterVolumeSpecName: "kube-api-access-xl5lx") pod "ac3d8a45-0030-433c-a813-fa93811b952f" (UID: "ac3d8a45-0030-433c-a813-fa93811b952f"). InnerVolumeSpecName "kube-api-access-xl5lx". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 14:20:04 crc kubenswrapper[4764]: I0309 14:20:04.990902 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xl5lx\" (UniqueName: \"kubernetes.io/projected/ac3d8a45-0030-433c-a813-fa93811b952f-kube-api-access-xl5lx\") on node \"crc\" DevicePath \"\""
Mar 09 14:20:05 crc kubenswrapper[4764]: I0309 14:20:05.438496 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551100-l6b4v" event={"ID":"ac3d8a45-0030-433c-a813-fa93811b952f","Type":"ContainerDied","Data":"2f39d774e1aa1eb9ee3e30642b6b31806b792b18668b4e93d0df0af150d71706"}
Mar 09 14:20:05 crc kubenswrapper[4764]: I0309 14:20:05.438549 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2f39d774e1aa1eb9ee3e30642b6b31806b792b18668b4e93d0df0af150d71706"
Mar 09 14:20:05 crc kubenswrapper[4764]: I0309 14:20:05.438708 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551100-l6b4v"
Mar 09 14:20:05 crc kubenswrapper[4764]: I0309 14:20:05.895832 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29551094-w5qt4"]
Mar 09 14:20:05 crc kubenswrapper[4764]: I0309 14:20:05.905863 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29551094-w5qt4"]
Mar 09 14:20:07 crc kubenswrapper[4764]: I0309 14:20:07.574870 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d7d72c38-b071-4fcb-89b4-935542a1943e" path="/var/lib/kubelet/pods/d7d72c38-b071-4fcb-89b4-935542a1943e/volumes"
Mar 09 14:20:10 crc kubenswrapper[4764]: I0309 14:20:10.903667 4764 scope.go:117] "RemoveContainer" containerID="25dd59f6d05d4fb8f17a37a81bd52fddb49fdd1d34f6ac322a9adb36613a0c6b"
Mar 09 14:20:10 crc kubenswrapper[4764]: I0309 14:20:10.935473 4764 scope.go:117] "RemoveContainer" containerID="1d30d3bafe785b61035cfd6776bbf1fbd30eae865400a81c90fd508159791920"
Mar 09 14:20:10 crc kubenswrapper[4764]: I0309 14:20:10.966924 4764 scope.go:117] "RemoveContainer" containerID="bdf07c58d276db116b39b558183c8a6af0bb01c84a705a0b57c15a4ac4f6b634"
Mar 09 14:20:11 crc kubenswrapper[4764]: I0309 14:20:11.029941 4764 scope.go:117] "RemoveContainer" containerID="a83e9cbe22286a479af2a92b6e16b2800ac926082aaf40b98e4defaf7a33610b"
Mar 09 14:20:11 crc kubenswrapper[4764]: I0309 14:20:11.072791 4764 scope.go:117] "RemoveContainer" containerID="865365e28885b4a1f3c0039c8f24992ea422c88be0ea3e7581c981c450049a6c"
Mar 09 14:20:11 crc kubenswrapper[4764]: I0309 14:20:11.160555 4764 scope.go:117] "RemoveContainer" containerID="89bd3940c282c4190fbd12e55d3b6572d76973664eb8e08171ebbd6ade38f50f"
Mar 09 14:20:23 crc kubenswrapper[4764]: I0309 14:20:23.049040 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-db-sync-j6s45"]
Mar 09 14:20:23 crc kubenswrapper[4764]: I0309 14:20:23.062488 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/manila-db-sync-j6s45"]
Mar 09 14:20:23 crc kubenswrapper[4764]: I0309 14:20:23.571928 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6711cdff-410c-4d91-b172-c2065054c1be" path="/var/lib/kubelet/pods/6711cdff-410c-4d91-b172-c2065054c1be/volumes"
Mar 09 14:20:40 crc kubenswrapper[4764]: I0309 14:20:40.926489 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-rbppf"]
Mar 09 14:20:40 crc kubenswrapper[4764]: E0309 14:20:40.929620 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac3d8a45-0030-433c-a813-fa93811b952f" containerName="oc"
Mar 09 14:20:40 crc kubenswrapper[4764]: I0309 14:20:40.929792 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac3d8a45-0030-433c-a813-fa93811b952f" containerName="oc"
Mar 09 14:20:40 crc kubenswrapper[4764]: I0309 14:20:40.930091 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="ac3d8a45-0030-433c-a813-fa93811b952f" containerName="oc"
Mar 09 14:20:40 crc kubenswrapper[4764]: I0309 14:20:40.932230 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rbppf"
Mar 09 14:20:40 crc kubenswrapper[4764]: I0309 14:20:40.953971 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-rbppf"]
Mar 09 14:20:41 crc kubenswrapper[4764]: I0309 14:20:41.071945 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/888fc58f-9a2b-4586-bc38-d645cae21425-utilities\") pod \"redhat-marketplace-rbppf\" (UID: \"888fc58f-9a2b-4586-bc38-d645cae21425\") " pod="openshift-marketplace/redhat-marketplace-rbppf"
Mar 09 14:20:41 crc kubenswrapper[4764]: I0309 14:20:41.072150 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tq8tf\" (UniqueName: \"kubernetes.io/projected/888fc58f-9a2b-4586-bc38-d645cae21425-kube-api-access-tq8tf\") pod \"redhat-marketplace-rbppf\" (UID: \"888fc58f-9a2b-4586-bc38-d645cae21425\") " pod="openshift-marketplace/redhat-marketplace-rbppf"
Mar 09 14:20:41 crc kubenswrapper[4764]: I0309 14:20:41.072557 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/888fc58f-9a2b-4586-bc38-d645cae21425-catalog-content\") pod \"redhat-marketplace-rbppf\" (UID: \"888fc58f-9a2b-4586-bc38-d645cae21425\") " pod="openshift-marketplace/redhat-marketplace-rbppf"
Mar 09 14:20:41 crc kubenswrapper[4764]: I0309 14:20:41.175599 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/888fc58f-9a2b-4586-bc38-d645cae21425-catalog-content\") pod \"redhat-marketplace-rbppf\" (UID: \"888fc58f-9a2b-4586-bc38-d645cae21425\") " pod="openshift-marketplace/redhat-marketplace-rbppf"
Mar 09 14:20:41 crc kubenswrapper[4764]: I0309 14:20:41.175848 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/888fc58f-9a2b-4586-bc38-d645cae21425-utilities\") pod \"redhat-marketplace-rbppf\" (UID: \"888fc58f-9a2b-4586-bc38-d645cae21425\") " pod="openshift-marketplace/redhat-marketplace-rbppf"
Mar 09 14:20:41 crc kubenswrapper[4764]: I0309 14:20:41.175902 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tq8tf\" (UniqueName: \"kubernetes.io/projected/888fc58f-9a2b-4586-bc38-d645cae21425-kube-api-access-tq8tf\") pod \"redhat-marketplace-rbppf\" (UID: \"888fc58f-9a2b-4586-bc38-d645cae21425\") " pod="openshift-marketplace/redhat-marketplace-rbppf"
Mar 09 14:20:41 crc kubenswrapper[4764]: I0309 14:20:41.176352 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/888fc58f-9a2b-4586-bc38-d645cae21425-catalog-content\") pod \"redhat-marketplace-rbppf\" (UID: \"888fc58f-9a2b-4586-bc38-d645cae21425\") " pod="openshift-marketplace/redhat-marketplace-rbppf"
Mar 09 14:20:41 crc kubenswrapper[4764]: I0309 14:20:41.176467 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/888fc58f-9a2b-4586-bc38-d645cae21425-utilities\") pod \"redhat-marketplace-rbppf\" (UID: \"888fc58f-9a2b-4586-bc38-d645cae21425\") " pod="openshift-marketplace/redhat-marketplace-rbppf"
Mar 09 14:20:41 crc kubenswrapper[4764]: I0309 14:20:41.201592 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tq8tf\" (UniqueName: \"kubernetes.io/projected/888fc58f-9a2b-4586-bc38-d645cae21425-kube-api-access-tq8tf\") pod \"redhat-marketplace-rbppf\" (UID: \"888fc58f-9a2b-4586-bc38-d645cae21425\") " pod="openshift-marketplace/redhat-marketplace-rbppf"
Mar 09 14:20:41 crc kubenswrapper[4764]: I0309 14:20:41.255325 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rbppf"
Mar 09 14:20:41 crc kubenswrapper[4764]: I0309 14:20:41.847667 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-rbppf"]
Mar 09 14:20:42 crc kubenswrapper[4764]: I0309 14:20:42.858161 4764 generic.go:334] "Generic (PLEG): container finished" podID="888fc58f-9a2b-4586-bc38-d645cae21425" containerID="b626cc702ddc013282f6ebfcbde013693e80a5808ff3a9fa5b4069a2ac0da90d" exitCode=0
Mar 09 14:20:42 crc kubenswrapper[4764]: I0309 14:20:42.858239 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rbppf" event={"ID":"888fc58f-9a2b-4586-bc38-d645cae21425","Type":"ContainerDied","Data":"b626cc702ddc013282f6ebfcbde013693e80a5808ff3a9fa5b4069a2ac0da90d"}
Mar 09 14:20:42 crc kubenswrapper[4764]: I0309 14:20:42.858887 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rbppf" event={"ID":"888fc58f-9a2b-4586-bc38-d645cae21425","Type":"ContainerStarted","Data":"54048bcbd0e67426acff49fbd46b2a61d8208845b90d713b788118d239a1246a"}
Mar 09 14:20:43 crc kubenswrapper[4764]: I0309 14:20:43.879311 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rbppf" event={"ID":"888fc58f-9a2b-4586-bc38-d645cae21425","Type":"ContainerStarted","Data":"fb3683c2f5b994d9bf1b4476b7c14f9500475e2cfc2431fdcb9a1a8f8933c048"}
Mar 09 14:20:44 crc kubenswrapper[4764]: I0309 14:20:44.892421 4764 generic.go:334] "Generic (PLEG): container finished" podID="888fc58f-9a2b-4586-bc38-d645cae21425" containerID="fb3683c2f5b994d9bf1b4476b7c14f9500475e2cfc2431fdcb9a1a8f8933c048" exitCode=0
Mar 09 14:20:44 crc kubenswrapper[4764]: I0309 14:20:44.892473 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rbppf" event={"ID":"888fc58f-9a2b-4586-bc38-d645cae21425","Type":"ContainerDied","Data":"fb3683c2f5b994d9bf1b4476b7c14f9500475e2cfc2431fdcb9a1a8f8933c048"}
Mar 09 14:20:45 crc kubenswrapper[4764]: I0309 14:20:45.911341 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rbppf" event={"ID":"888fc58f-9a2b-4586-bc38-d645cae21425","Type":"ContainerStarted","Data":"dee20c84fe6a88a99746d593b2b47a68cf15d7855d8604d9f74434aafdb9bf89"}
Mar 09 14:20:45 crc kubenswrapper[4764]: I0309 14:20:45.946008 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-rbppf" podStartSLOduration=3.49453088 podStartE2EDuration="5.945972442s" podCreationTimestamp="2026-03-09 14:20:40 +0000 UTC" firstStartedPulling="2026-03-09 14:20:42.861100001 +0000 UTC m=+3598.111271909" lastFinishedPulling="2026-03-09 14:20:45.312541563 +0000 UTC m=+3600.562713471" observedRunningTime="2026-03-09 14:20:45.932989764 +0000 UTC m=+3601.183161672" watchObservedRunningTime="2026-03-09 14:20:45.945972442 +0000 UTC m=+3601.196144350"
Mar 09 14:20:51 crc kubenswrapper[4764]: I0309 14:20:51.255713 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-rbppf"
Mar 09 14:20:51 crc kubenswrapper[4764]: I0309 14:20:51.256612 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-rbppf"
Mar 09 14:20:51 crc kubenswrapper[4764]: I0309 14:20:51.310823 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-rbppf"
Mar 09 14:20:52 crc kubenswrapper[4764]: I0309 14:20:52.017442 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-rbppf"
Mar 09 14:20:52 crc kubenswrapper[4764]: I0309 14:20:52.082809 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-rbppf"]
Mar 09 14:20:53 crc kubenswrapper[4764]: I0309 14:20:53.986429 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-rbppf" podUID="888fc58f-9a2b-4586-bc38-d645cae21425" containerName="registry-server" containerID="cri-o://dee20c84fe6a88a99746d593b2b47a68cf15d7855d8604d9f74434aafdb9bf89" gracePeriod=2
Mar 09 14:20:54 crc kubenswrapper[4764]: I0309 14:20:54.528198 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rbppf"
Mar 09 14:20:54 crc kubenswrapper[4764]: I0309 14:20:54.725046 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tq8tf\" (UniqueName: \"kubernetes.io/projected/888fc58f-9a2b-4586-bc38-d645cae21425-kube-api-access-tq8tf\") pod \"888fc58f-9a2b-4586-bc38-d645cae21425\" (UID: \"888fc58f-9a2b-4586-bc38-d645cae21425\") "
Mar 09 14:20:54 crc kubenswrapper[4764]: I0309 14:20:54.725522 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/888fc58f-9a2b-4586-bc38-d645cae21425-utilities\") pod \"888fc58f-9a2b-4586-bc38-d645cae21425\" (UID: \"888fc58f-9a2b-4586-bc38-d645cae21425\") "
Mar 09 14:20:54 crc kubenswrapper[4764]: I0309 14:20:54.725778 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/888fc58f-9a2b-4586-bc38-d645cae21425-catalog-content\") pod \"888fc58f-9a2b-4586-bc38-d645cae21425\" (UID: \"888fc58f-9a2b-4586-bc38-d645cae21425\") "
Mar 09 14:20:54 crc kubenswrapper[4764]: I0309 14:20:54.726309 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/888fc58f-9a2b-4586-bc38-d645cae21425-utilities" (OuterVolumeSpecName: "utilities") pod "888fc58f-9a2b-4586-bc38-d645cae21425" (UID: "888fc58f-9a2b-4586-bc38-d645cae21425"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 09 14:20:54 crc kubenswrapper[4764]: I0309 14:20:54.726801 4764 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/888fc58f-9a2b-4586-bc38-d645cae21425-utilities\") on node \"crc\" DevicePath \"\""
Mar 09 14:20:54 crc kubenswrapper[4764]: I0309 14:20:54.733298 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/888fc58f-9a2b-4586-bc38-d645cae21425-kube-api-access-tq8tf" (OuterVolumeSpecName: "kube-api-access-tq8tf") pod "888fc58f-9a2b-4586-bc38-d645cae21425" (UID: "888fc58f-9a2b-4586-bc38-d645cae21425"). InnerVolumeSpecName "kube-api-access-tq8tf". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 14:20:54 crc kubenswrapper[4764]: I0309 14:20:54.755110 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/888fc58f-9a2b-4586-bc38-d645cae21425-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "888fc58f-9a2b-4586-bc38-d645cae21425" (UID: "888fc58f-9a2b-4586-bc38-d645cae21425"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 09 14:20:54 crc kubenswrapper[4764]: I0309 14:20:54.828715 4764 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/888fc58f-9a2b-4586-bc38-d645cae21425-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 09 14:20:54 crc kubenswrapper[4764]: I0309 14:20:54.828765 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tq8tf\" (UniqueName: \"kubernetes.io/projected/888fc58f-9a2b-4586-bc38-d645cae21425-kube-api-access-tq8tf\") on node \"crc\" DevicePath \"\""
Mar 09 14:20:55 crc kubenswrapper[4764]: I0309 14:20:55.000777 4764 generic.go:334] "Generic (PLEG): container finished" podID="888fc58f-9a2b-4586-bc38-d645cae21425" containerID="dee20c84fe6a88a99746d593b2b47a68cf15d7855d8604d9f74434aafdb9bf89" exitCode=0
Mar 09 14:20:55 crc kubenswrapper[4764]: I0309 14:20:55.000883 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rbppf"
Mar 09 14:20:55 crc kubenswrapper[4764]: I0309 14:20:55.000880 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rbppf" event={"ID":"888fc58f-9a2b-4586-bc38-d645cae21425","Type":"ContainerDied","Data":"dee20c84fe6a88a99746d593b2b47a68cf15d7855d8604d9f74434aafdb9bf89"}
Mar 09 14:20:55 crc kubenswrapper[4764]: I0309 14:20:55.002635 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rbppf" event={"ID":"888fc58f-9a2b-4586-bc38-d645cae21425","Type":"ContainerDied","Data":"54048bcbd0e67426acff49fbd46b2a61d8208845b90d713b788118d239a1246a"}
Mar 09 14:20:55 crc kubenswrapper[4764]: I0309 14:20:55.002683 4764 scope.go:117] "RemoveContainer" containerID="dee20c84fe6a88a99746d593b2b47a68cf15d7855d8604d9f74434aafdb9bf89"
Mar 09 14:20:55 crc kubenswrapper[4764]: I0309 14:20:55.046755 4764 kubelet.go:2437] "SyncLoop DELETE" source="api"
pods=["openshift-marketplace/redhat-marketplace-rbppf"] Mar 09 14:20:55 crc kubenswrapper[4764]: I0309 14:20:55.053178 4764 scope.go:117] "RemoveContainer" containerID="fb3683c2f5b994d9bf1b4476b7c14f9500475e2cfc2431fdcb9a1a8f8933c048" Mar 09 14:20:55 crc kubenswrapper[4764]: I0309 14:20:55.059897 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-rbppf"] Mar 09 14:20:55 crc kubenswrapper[4764]: I0309 14:20:55.076818 4764 scope.go:117] "RemoveContainer" containerID="b626cc702ddc013282f6ebfcbde013693e80a5808ff3a9fa5b4069a2ac0da90d" Mar 09 14:20:55 crc kubenswrapper[4764]: I0309 14:20:55.123844 4764 scope.go:117] "RemoveContainer" containerID="dee20c84fe6a88a99746d593b2b47a68cf15d7855d8604d9f74434aafdb9bf89" Mar 09 14:20:55 crc kubenswrapper[4764]: E0309 14:20:55.124292 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dee20c84fe6a88a99746d593b2b47a68cf15d7855d8604d9f74434aafdb9bf89\": container with ID starting with dee20c84fe6a88a99746d593b2b47a68cf15d7855d8604d9f74434aafdb9bf89 not found: ID does not exist" containerID="dee20c84fe6a88a99746d593b2b47a68cf15d7855d8604d9f74434aafdb9bf89" Mar 09 14:20:55 crc kubenswrapper[4764]: I0309 14:20:55.124329 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dee20c84fe6a88a99746d593b2b47a68cf15d7855d8604d9f74434aafdb9bf89"} err="failed to get container status \"dee20c84fe6a88a99746d593b2b47a68cf15d7855d8604d9f74434aafdb9bf89\": rpc error: code = NotFound desc = could not find container \"dee20c84fe6a88a99746d593b2b47a68cf15d7855d8604d9f74434aafdb9bf89\": container with ID starting with dee20c84fe6a88a99746d593b2b47a68cf15d7855d8604d9f74434aafdb9bf89 not found: ID does not exist" Mar 09 14:20:55 crc kubenswrapper[4764]: I0309 14:20:55.124354 4764 scope.go:117] "RemoveContainer" 
containerID="fb3683c2f5b994d9bf1b4476b7c14f9500475e2cfc2431fdcb9a1a8f8933c048" Mar 09 14:20:55 crc kubenswrapper[4764]: E0309 14:20:55.124692 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fb3683c2f5b994d9bf1b4476b7c14f9500475e2cfc2431fdcb9a1a8f8933c048\": container with ID starting with fb3683c2f5b994d9bf1b4476b7c14f9500475e2cfc2431fdcb9a1a8f8933c048 not found: ID does not exist" containerID="fb3683c2f5b994d9bf1b4476b7c14f9500475e2cfc2431fdcb9a1a8f8933c048" Mar 09 14:20:55 crc kubenswrapper[4764]: I0309 14:20:55.124715 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fb3683c2f5b994d9bf1b4476b7c14f9500475e2cfc2431fdcb9a1a8f8933c048"} err="failed to get container status \"fb3683c2f5b994d9bf1b4476b7c14f9500475e2cfc2431fdcb9a1a8f8933c048\": rpc error: code = NotFound desc = could not find container \"fb3683c2f5b994d9bf1b4476b7c14f9500475e2cfc2431fdcb9a1a8f8933c048\": container with ID starting with fb3683c2f5b994d9bf1b4476b7c14f9500475e2cfc2431fdcb9a1a8f8933c048 not found: ID does not exist" Mar 09 14:20:55 crc kubenswrapper[4764]: I0309 14:20:55.124728 4764 scope.go:117] "RemoveContainer" containerID="b626cc702ddc013282f6ebfcbde013693e80a5808ff3a9fa5b4069a2ac0da90d" Mar 09 14:20:55 crc kubenswrapper[4764]: E0309 14:20:55.124973 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b626cc702ddc013282f6ebfcbde013693e80a5808ff3a9fa5b4069a2ac0da90d\": container with ID starting with b626cc702ddc013282f6ebfcbde013693e80a5808ff3a9fa5b4069a2ac0da90d not found: ID does not exist" containerID="b626cc702ddc013282f6ebfcbde013693e80a5808ff3a9fa5b4069a2ac0da90d" Mar 09 14:20:55 crc kubenswrapper[4764]: I0309 14:20:55.124995 4764 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"b626cc702ddc013282f6ebfcbde013693e80a5808ff3a9fa5b4069a2ac0da90d"} err="failed to get container status \"b626cc702ddc013282f6ebfcbde013693e80a5808ff3a9fa5b4069a2ac0da90d\": rpc error: code = NotFound desc = could not find container \"b626cc702ddc013282f6ebfcbde013693e80a5808ff3a9fa5b4069a2ac0da90d\": container with ID starting with b626cc702ddc013282f6ebfcbde013693e80a5808ff3a9fa5b4069a2ac0da90d not found: ID does not exist" Mar 09 14:20:55 crc kubenswrapper[4764]: I0309 14:20:55.573467 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="888fc58f-9a2b-4586-bc38-d645cae21425" path="/var/lib/kubelet/pods/888fc58f-9a2b-4586-bc38-d645cae21425/volumes" Mar 09 14:20:58 crc kubenswrapper[4764]: I0309 14:20:58.370321 4764 patch_prober.go:28] interesting pod/machine-config-daemon-xxczl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 09 14:20:58 crc kubenswrapper[4764]: I0309 14:20:58.371153 4764 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 09 14:21:11 crc kubenswrapper[4764]: I0309 14:21:11.318333 4764 scope.go:117] "RemoveContainer" containerID="8d377192006e66551cb0ea6108fb1f8c6b81bac19c99e99d5e21dffabe9e931d" Mar 09 14:21:28 crc kubenswrapper[4764]: I0309 14:21:28.370756 4764 patch_prober.go:28] interesting pod/machine-config-daemon-xxczl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 09 
14:21:28 crc kubenswrapper[4764]: I0309 14:21:28.371572 4764 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 09 14:21:58 crc kubenswrapper[4764]: I0309 14:21:58.370335 4764 patch_prober.go:28] interesting pod/machine-config-daemon-xxczl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 09 14:21:58 crc kubenswrapper[4764]: I0309 14:21:58.371187 4764 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 09 14:21:58 crc kubenswrapper[4764]: I0309 14:21:58.371256 4764 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" Mar 09 14:21:58 crc kubenswrapper[4764]: I0309 14:21:58.372270 4764 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"c10d86bbf5c5d5ea5875f026de2e5f7abb576968d3567040b8020dcc93a08096"} pod="openshift-machine-config-operator/machine-config-daemon-xxczl" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 09 14:21:58 crc kubenswrapper[4764]: I0309 14:21:58.372324 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" 
podUID="6bcdd179-43c2-427c-9fac-7155c122e922" containerName="machine-config-daemon" containerID="cri-o://c10d86bbf5c5d5ea5875f026de2e5f7abb576968d3567040b8020dcc93a08096" gracePeriod=600 Mar 09 14:21:58 crc kubenswrapper[4764]: E0309 14:21:58.494027 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xxczl_openshift-machine-config-operator(6bcdd179-43c2-427c-9fac-7155c122e922)\"" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922" Mar 09 14:21:58 crc kubenswrapper[4764]: I0309 14:21:58.680685 4764 generic.go:334] "Generic (PLEG): container finished" podID="6bcdd179-43c2-427c-9fac-7155c122e922" containerID="c10d86bbf5c5d5ea5875f026de2e5f7abb576968d3567040b8020dcc93a08096" exitCode=0 Mar 09 14:21:58 crc kubenswrapper[4764]: I0309 14:21:58.680757 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" event={"ID":"6bcdd179-43c2-427c-9fac-7155c122e922","Type":"ContainerDied","Data":"c10d86bbf5c5d5ea5875f026de2e5f7abb576968d3567040b8020dcc93a08096"} Mar 09 14:21:58 crc kubenswrapper[4764]: I0309 14:21:58.681107 4764 scope.go:117] "RemoveContainer" containerID="e0fd7c06957b904af3795728b54332eb3bc1c0b6a8df56e1a4bc3599bdedd8b5" Mar 09 14:21:58 crc kubenswrapper[4764]: I0309 14:21:58.681663 4764 scope.go:117] "RemoveContainer" containerID="c10d86bbf5c5d5ea5875f026de2e5f7abb576968d3567040b8020dcc93a08096" Mar 09 14:21:58 crc kubenswrapper[4764]: E0309 14:21:58.682010 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-xxczl_openshift-machine-config-operator(6bcdd179-43c2-427c-9fac-7155c122e922)\"" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922" Mar 09 14:22:00 crc kubenswrapper[4764]: I0309 14:22:00.159047 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29551102-gnwjq"] Mar 09 14:22:00 crc kubenswrapper[4764]: E0309 14:22:00.160103 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="888fc58f-9a2b-4586-bc38-d645cae21425" containerName="extract-content" Mar 09 14:22:00 crc kubenswrapper[4764]: I0309 14:22:00.160121 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="888fc58f-9a2b-4586-bc38-d645cae21425" containerName="extract-content" Mar 09 14:22:00 crc kubenswrapper[4764]: E0309 14:22:00.160154 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="888fc58f-9a2b-4586-bc38-d645cae21425" containerName="extract-utilities" Mar 09 14:22:00 crc kubenswrapper[4764]: I0309 14:22:00.160167 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="888fc58f-9a2b-4586-bc38-d645cae21425" containerName="extract-utilities" Mar 09 14:22:00 crc kubenswrapper[4764]: E0309 14:22:00.160194 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="888fc58f-9a2b-4586-bc38-d645cae21425" containerName="registry-server" Mar 09 14:22:00 crc kubenswrapper[4764]: I0309 14:22:00.160201 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="888fc58f-9a2b-4586-bc38-d645cae21425" containerName="registry-server" Mar 09 14:22:00 crc kubenswrapper[4764]: I0309 14:22:00.160390 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="888fc58f-9a2b-4586-bc38-d645cae21425" containerName="registry-server" Mar 09 14:22:00 crc kubenswrapper[4764]: I0309 14:22:00.161457 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551102-gnwjq" Mar 09 14:22:00 crc kubenswrapper[4764]: I0309 14:22:00.166700 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 09 14:22:00 crc kubenswrapper[4764]: I0309 14:22:00.166873 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-4s7mr" Mar 09 14:22:00 crc kubenswrapper[4764]: I0309 14:22:00.167036 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 09 14:22:00 crc kubenswrapper[4764]: I0309 14:22:00.171637 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551102-gnwjq"] Mar 09 14:22:00 crc kubenswrapper[4764]: I0309 14:22:00.239010 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4tfnd\" (UniqueName: \"kubernetes.io/projected/8f976f5d-f876-491a-8557-f6755b9641a3-kube-api-access-4tfnd\") pod \"auto-csr-approver-29551102-gnwjq\" (UID: \"8f976f5d-f876-491a-8557-f6755b9641a3\") " pod="openshift-infra/auto-csr-approver-29551102-gnwjq" Mar 09 14:22:00 crc kubenswrapper[4764]: I0309 14:22:00.341589 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4tfnd\" (UniqueName: \"kubernetes.io/projected/8f976f5d-f876-491a-8557-f6755b9641a3-kube-api-access-4tfnd\") pod \"auto-csr-approver-29551102-gnwjq\" (UID: \"8f976f5d-f876-491a-8557-f6755b9641a3\") " pod="openshift-infra/auto-csr-approver-29551102-gnwjq" Mar 09 14:22:00 crc kubenswrapper[4764]: I0309 14:22:00.366567 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4tfnd\" (UniqueName: \"kubernetes.io/projected/8f976f5d-f876-491a-8557-f6755b9641a3-kube-api-access-4tfnd\") pod \"auto-csr-approver-29551102-gnwjq\" (UID: \"8f976f5d-f876-491a-8557-f6755b9641a3\") " 
pod="openshift-infra/auto-csr-approver-29551102-gnwjq" Mar 09 14:22:00 crc kubenswrapper[4764]: I0309 14:22:00.514479 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551102-gnwjq" Mar 09 14:22:01 crc kubenswrapper[4764]: I0309 14:22:01.041904 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551102-gnwjq"] Mar 09 14:22:01 crc kubenswrapper[4764]: I0309 14:22:01.725287 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551102-gnwjq" event={"ID":"8f976f5d-f876-491a-8557-f6755b9641a3","Type":"ContainerStarted","Data":"6ce2883cdfe320e4d50c01ec1293e26b7d633f3bddc2555890a1b8545691ee5e"} Mar 09 14:22:02 crc kubenswrapper[4764]: I0309 14:22:02.741669 4764 generic.go:334] "Generic (PLEG): container finished" podID="8f976f5d-f876-491a-8557-f6755b9641a3" containerID="469e856eb9ee2a8bcc4f55156a6acb6b99b858dcd6fc4bec4aaa66e7dc6669a8" exitCode=0 Mar 09 14:22:02 crc kubenswrapper[4764]: I0309 14:22:02.741768 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551102-gnwjq" event={"ID":"8f976f5d-f876-491a-8557-f6755b9641a3","Type":"ContainerDied","Data":"469e856eb9ee2a8bcc4f55156a6acb6b99b858dcd6fc4bec4aaa66e7dc6669a8"} Mar 09 14:22:04 crc kubenswrapper[4764]: I0309 14:22:04.216307 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551102-gnwjq" Mar 09 14:22:04 crc kubenswrapper[4764]: I0309 14:22:04.335377 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4tfnd\" (UniqueName: \"kubernetes.io/projected/8f976f5d-f876-491a-8557-f6755b9641a3-kube-api-access-4tfnd\") pod \"8f976f5d-f876-491a-8557-f6755b9641a3\" (UID: \"8f976f5d-f876-491a-8557-f6755b9641a3\") " Mar 09 14:22:04 crc kubenswrapper[4764]: I0309 14:22:04.342681 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f976f5d-f876-491a-8557-f6755b9641a3-kube-api-access-4tfnd" (OuterVolumeSpecName: "kube-api-access-4tfnd") pod "8f976f5d-f876-491a-8557-f6755b9641a3" (UID: "8f976f5d-f876-491a-8557-f6755b9641a3"). InnerVolumeSpecName "kube-api-access-4tfnd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:22:04 crc kubenswrapper[4764]: I0309 14:22:04.437997 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4tfnd\" (UniqueName: \"kubernetes.io/projected/8f976f5d-f876-491a-8557-f6755b9641a3-kube-api-access-4tfnd\") on node \"crc\" DevicePath \"\"" Mar 09 14:22:04 crc kubenswrapper[4764]: I0309 14:22:04.765152 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551102-gnwjq" event={"ID":"8f976f5d-f876-491a-8557-f6755b9641a3","Type":"ContainerDied","Data":"6ce2883cdfe320e4d50c01ec1293e26b7d633f3bddc2555890a1b8545691ee5e"} Mar 09 14:22:04 crc kubenswrapper[4764]: I0309 14:22:04.765208 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6ce2883cdfe320e4d50c01ec1293e26b7d633f3bddc2555890a1b8545691ee5e" Mar 09 14:22:04 crc kubenswrapper[4764]: I0309 14:22:04.765220 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551102-gnwjq" Mar 09 14:22:05 crc kubenswrapper[4764]: I0309 14:22:05.295515 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29551096-9bl56"] Mar 09 14:22:05 crc kubenswrapper[4764]: I0309 14:22:05.308111 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29551096-9bl56"] Mar 09 14:22:05 crc kubenswrapper[4764]: I0309 14:22:05.577400 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="40ff3557-e9fd-4c9a-bc94-b5a03f4e71c3" path="/var/lib/kubelet/pods/40ff3557-e9fd-4c9a-bc94-b5a03f4e71c3/volumes" Mar 09 14:22:11 crc kubenswrapper[4764]: I0309 14:22:11.427084 4764 scope.go:117] "RemoveContainer" containerID="ec447d2d121a3c0de24bcb688dbed1ad219802d16c0a98f40138fd4ac8dc4b32" Mar 09 14:22:11 crc kubenswrapper[4764]: I0309 14:22:11.559736 4764 scope.go:117] "RemoveContainer" containerID="c10d86bbf5c5d5ea5875f026de2e5f7abb576968d3567040b8020dcc93a08096" Mar 09 14:22:11 crc kubenswrapper[4764]: E0309 14:22:11.560135 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xxczl_openshift-machine-config-operator(6bcdd179-43c2-427c-9fac-7155c122e922)\"" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922" Mar 09 14:22:24 crc kubenswrapper[4764]: I0309 14:22:24.560284 4764 scope.go:117] "RemoveContainer" containerID="c10d86bbf5c5d5ea5875f026de2e5f7abb576968d3567040b8020dcc93a08096" Mar 09 14:22:24 crc kubenswrapper[4764]: E0309 14:22:24.561453 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-xxczl_openshift-machine-config-operator(6bcdd179-43c2-427c-9fac-7155c122e922)\"" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922" Mar 09 14:22:26 crc kubenswrapper[4764]: I0309 14:22:26.186116 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-fcpdb"] Mar 09 14:22:26 crc kubenswrapper[4764]: E0309 14:22:26.187229 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8f976f5d-f876-491a-8557-f6755b9641a3" containerName="oc" Mar 09 14:22:26 crc kubenswrapper[4764]: I0309 14:22:26.187252 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f976f5d-f876-491a-8557-f6755b9641a3" containerName="oc" Mar 09 14:22:26 crc kubenswrapper[4764]: I0309 14:22:26.187521 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="8f976f5d-f876-491a-8557-f6755b9641a3" containerName="oc" Mar 09 14:22:26 crc kubenswrapper[4764]: I0309 14:22:26.189376 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-fcpdb" Mar 09 14:22:26 crc kubenswrapper[4764]: I0309 14:22:26.198795 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-fcpdb"] Mar 09 14:22:26 crc kubenswrapper[4764]: I0309 14:22:26.281074 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4638c73f-5adb-4e39-b7d3-b1d6627b7705-utilities\") pod \"redhat-operators-fcpdb\" (UID: \"4638c73f-5adb-4e39-b7d3-b1d6627b7705\") " pod="openshift-marketplace/redhat-operators-fcpdb" Mar 09 14:22:26 crc kubenswrapper[4764]: I0309 14:22:26.281188 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j2bwd\" (UniqueName: \"kubernetes.io/projected/4638c73f-5adb-4e39-b7d3-b1d6627b7705-kube-api-access-j2bwd\") pod \"redhat-operators-fcpdb\" (UID: \"4638c73f-5adb-4e39-b7d3-b1d6627b7705\") " pod="openshift-marketplace/redhat-operators-fcpdb" Mar 09 14:22:26 crc kubenswrapper[4764]: I0309 14:22:26.281863 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4638c73f-5adb-4e39-b7d3-b1d6627b7705-catalog-content\") pod \"redhat-operators-fcpdb\" (UID: \"4638c73f-5adb-4e39-b7d3-b1d6627b7705\") " pod="openshift-marketplace/redhat-operators-fcpdb" Mar 09 14:22:26 crc kubenswrapper[4764]: I0309 14:22:26.384826 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4638c73f-5adb-4e39-b7d3-b1d6627b7705-utilities\") pod \"redhat-operators-fcpdb\" (UID: \"4638c73f-5adb-4e39-b7d3-b1d6627b7705\") " pod="openshift-marketplace/redhat-operators-fcpdb" Mar 09 14:22:26 crc kubenswrapper[4764]: I0309 14:22:26.384952 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-j2bwd\" (UniqueName: \"kubernetes.io/projected/4638c73f-5adb-4e39-b7d3-b1d6627b7705-kube-api-access-j2bwd\") pod \"redhat-operators-fcpdb\" (UID: \"4638c73f-5adb-4e39-b7d3-b1d6627b7705\") " pod="openshift-marketplace/redhat-operators-fcpdb" Mar 09 14:22:26 crc kubenswrapper[4764]: I0309 14:22:26.385120 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4638c73f-5adb-4e39-b7d3-b1d6627b7705-catalog-content\") pod \"redhat-operators-fcpdb\" (UID: \"4638c73f-5adb-4e39-b7d3-b1d6627b7705\") " pod="openshift-marketplace/redhat-operators-fcpdb" Mar 09 14:22:26 crc kubenswrapper[4764]: I0309 14:22:26.385607 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4638c73f-5adb-4e39-b7d3-b1d6627b7705-utilities\") pod \"redhat-operators-fcpdb\" (UID: \"4638c73f-5adb-4e39-b7d3-b1d6627b7705\") " pod="openshift-marketplace/redhat-operators-fcpdb" Mar 09 14:22:26 crc kubenswrapper[4764]: I0309 14:22:26.385822 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4638c73f-5adb-4e39-b7d3-b1d6627b7705-catalog-content\") pod \"redhat-operators-fcpdb\" (UID: \"4638c73f-5adb-4e39-b7d3-b1d6627b7705\") " pod="openshift-marketplace/redhat-operators-fcpdb" Mar 09 14:22:26 crc kubenswrapper[4764]: I0309 14:22:26.416694 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j2bwd\" (UniqueName: \"kubernetes.io/projected/4638c73f-5adb-4e39-b7d3-b1d6627b7705-kube-api-access-j2bwd\") pod \"redhat-operators-fcpdb\" (UID: \"4638c73f-5adb-4e39-b7d3-b1d6627b7705\") " pod="openshift-marketplace/redhat-operators-fcpdb" Mar 09 14:22:26 crc kubenswrapper[4764]: I0309 14:22:26.524251 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-fcpdb" Mar 09 14:22:27 crc kubenswrapper[4764]: I0309 14:22:27.082075 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-fcpdb"] Mar 09 14:22:28 crc kubenswrapper[4764]: I0309 14:22:28.051316 4764 generic.go:334] "Generic (PLEG): container finished" podID="4638c73f-5adb-4e39-b7d3-b1d6627b7705" containerID="56d76f8c6bb58626d2529263241264f7ef0049500bf7d5bd6760d3c4cee8e3c0" exitCode=0 Mar 09 14:22:28 crc kubenswrapper[4764]: I0309 14:22:28.051730 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fcpdb" event={"ID":"4638c73f-5adb-4e39-b7d3-b1d6627b7705","Type":"ContainerDied","Data":"56d76f8c6bb58626d2529263241264f7ef0049500bf7d5bd6760d3c4cee8e3c0"} Mar 09 14:22:28 crc kubenswrapper[4764]: I0309 14:22:28.051772 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fcpdb" event={"ID":"4638c73f-5adb-4e39-b7d3-b1d6627b7705","Type":"ContainerStarted","Data":"d22ecbeedc703150c2c29bd0be02199d91646441b7ef06e81064110c6b81f83e"} Mar 09 14:22:29 crc kubenswrapper[4764]: I0309 14:22:29.066103 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fcpdb" event={"ID":"4638c73f-5adb-4e39-b7d3-b1d6627b7705","Type":"ContainerStarted","Data":"4eae2ce540cdd8401ae00543317b875985d0ae3eeaf5bcbc0a26c17c98f2b329"} Mar 09 14:22:30 crc kubenswrapper[4764]: I0309 14:22:30.078411 4764 generic.go:334] "Generic (PLEG): container finished" podID="4638c73f-5adb-4e39-b7d3-b1d6627b7705" containerID="4eae2ce540cdd8401ae00543317b875985d0ae3eeaf5bcbc0a26c17c98f2b329" exitCode=0 Mar 09 14:22:30 crc kubenswrapper[4764]: I0309 14:22:30.078500 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fcpdb" 
event={"ID":"4638c73f-5adb-4e39-b7d3-b1d6627b7705","Type":"ContainerDied","Data":"4eae2ce540cdd8401ae00543317b875985d0ae3eeaf5bcbc0a26c17c98f2b329"} Mar 09 14:22:31 crc kubenswrapper[4764]: I0309 14:22:31.092958 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fcpdb" event={"ID":"4638c73f-5adb-4e39-b7d3-b1d6627b7705","Type":"ContainerStarted","Data":"ad80cf2b70762a65d9c7f26342ac7f2b7bc1aeeb502716f1f6efa154db3dcdb4"} Mar 09 14:22:31 crc kubenswrapper[4764]: I0309 14:22:31.124887 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-fcpdb" podStartSLOduration=2.665947062 podStartE2EDuration="5.124865415s" podCreationTimestamp="2026-03-09 14:22:26 +0000 UTC" firstStartedPulling="2026-03-09 14:22:28.063672063 +0000 UTC m=+3703.313843971" lastFinishedPulling="2026-03-09 14:22:30.522590416 +0000 UTC m=+3705.772762324" observedRunningTime="2026-03-09 14:22:31.119710987 +0000 UTC m=+3706.369882905" watchObservedRunningTime="2026-03-09 14:22:31.124865415 +0000 UTC m=+3706.375037323" Mar 09 14:22:36 crc kubenswrapper[4764]: I0309 14:22:36.525171 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-fcpdb" Mar 09 14:22:36 crc kubenswrapper[4764]: I0309 14:22:36.526514 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-fcpdb" Mar 09 14:22:37 crc kubenswrapper[4764]: I0309 14:22:37.572738 4764 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-fcpdb" podUID="4638c73f-5adb-4e39-b7d3-b1d6627b7705" containerName="registry-server" probeResult="failure" output=< Mar 09 14:22:37 crc kubenswrapper[4764]: timeout: failed to connect service ":50051" within 1s Mar 09 14:22:37 crc kubenswrapper[4764]: > Mar 09 14:22:39 crc kubenswrapper[4764]: I0309 14:22:39.560787 4764 scope.go:117] "RemoveContainer" 
containerID="c10d86bbf5c5d5ea5875f026de2e5f7abb576968d3567040b8020dcc93a08096" Mar 09 14:22:39 crc kubenswrapper[4764]: E0309 14:22:39.561792 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xxczl_openshift-machine-config-operator(6bcdd179-43c2-427c-9fac-7155c122e922)\"" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922" Mar 09 14:22:46 crc kubenswrapper[4764]: I0309 14:22:46.583123 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-fcpdb" Mar 09 14:22:46 crc kubenswrapper[4764]: I0309 14:22:46.636577 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-fcpdb" Mar 09 14:22:46 crc kubenswrapper[4764]: I0309 14:22:46.829138 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-fcpdb"] Mar 09 14:22:48 crc kubenswrapper[4764]: I0309 14:22:48.310298 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-fcpdb" podUID="4638c73f-5adb-4e39-b7d3-b1d6627b7705" containerName="registry-server" containerID="cri-o://ad80cf2b70762a65d9c7f26342ac7f2b7bc1aeeb502716f1f6efa154db3dcdb4" gracePeriod=2 Mar 09 14:22:48 crc kubenswrapper[4764]: I0309 14:22:48.925942 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-fcpdb" Mar 09 14:22:49 crc kubenswrapper[4764]: I0309 14:22:49.092305 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j2bwd\" (UniqueName: \"kubernetes.io/projected/4638c73f-5adb-4e39-b7d3-b1d6627b7705-kube-api-access-j2bwd\") pod \"4638c73f-5adb-4e39-b7d3-b1d6627b7705\" (UID: \"4638c73f-5adb-4e39-b7d3-b1d6627b7705\") " Mar 09 14:22:49 crc kubenswrapper[4764]: I0309 14:22:49.092667 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4638c73f-5adb-4e39-b7d3-b1d6627b7705-catalog-content\") pod \"4638c73f-5adb-4e39-b7d3-b1d6627b7705\" (UID: \"4638c73f-5adb-4e39-b7d3-b1d6627b7705\") " Mar 09 14:22:49 crc kubenswrapper[4764]: I0309 14:22:49.092767 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4638c73f-5adb-4e39-b7d3-b1d6627b7705-utilities\") pod \"4638c73f-5adb-4e39-b7d3-b1d6627b7705\" (UID: \"4638c73f-5adb-4e39-b7d3-b1d6627b7705\") " Mar 09 14:22:49 crc kubenswrapper[4764]: I0309 14:22:49.095213 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4638c73f-5adb-4e39-b7d3-b1d6627b7705-utilities" (OuterVolumeSpecName: "utilities") pod "4638c73f-5adb-4e39-b7d3-b1d6627b7705" (UID: "4638c73f-5adb-4e39-b7d3-b1d6627b7705"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 14:22:49 crc kubenswrapper[4764]: I0309 14:22:49.101023 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4638c73f-5adb-4e39-b7d3-b1d6627b7705-kube-api-access-j2bwd" (OuterVolumeSpecName: "kube-api-access-j2bwd") pod "4638c73f-5adb-4e39-b7d3-b1d6627b7705" (UID: "4638c73f-5adb-4e39-b7d3-b1d6627b7705"). InnerVolumeSpecName "kube-api-access-j2bwd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:22:49 crc kubenswrapper[4764]: I0309 14:22:49.195702 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j2bwd\" (UniqueName: \"kubernetes.io/projected/4638c73f-5adb-4e39-b7d3-b1d6627b7705-kube-api-access-j2bwd\") on node \"crc\" DevicePath \"\"" Mar 09 14:22:49 crc kubenswrapper[4764]: I0309 14:22:49.195742 4764 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4638c73f-5adb-4e39-b7d3-b1d6627b7705-utilities\") on node \"crc\" DevicePath \"\"" Mar 09 14:22:49 crc kubenswrapper[4764]: I0309 14:22:49.242357 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4638c73f-5adb-4e39-b7d3-b1d6627b7705-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4638c73f-5adb-4e39-b7d3-b1d6627b7705" (UID: "4638c73f-5adb-4e39-b7d3-b1d6627b7705"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 14:22:49 crc kubenswrapper[4764]: I0309 14:22:49.298840 4764 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4638c73f-5adb-4e39-b7d3-b1d6627b7705-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 09 14:22:49 crc kubenswrapper[4764]: I0309 14:22:49.324569 4764 generic.go:334] "Generic (PLEG): container finished" podID="4638c73f-5adb-4e39-b7d3-b1d6627b7705" containerID="ad80cf2b70762a65d9c7f26342ac7f2b7bc1aeeb502716f1f6efa154db3dcdb4" exitCode=0 Mar 09 14:22:49 crc kubenswrapper[4764]: I0309 14:22:49.324629 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fcpdb" event={"ID":"4638c73f-5adb-4e39-b7d3-b1d6627b7705","Type":"ContainerDied","Data":"ad80cf2b70762a65d9c7f26342ac7f2b7bc1aeeb502716f1f6efa154db3dcdb4"} Mar 09 14:22:49 crc kubenswrapper[4764]: I0309 14:22:49.324682 4764 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-operators-fcpdb" event={"ID":"4638c73f-5adb-4e39-b7d3-b1d6627b7705","Type":"ContainerDied","Data":"d22ecbeedc703150c2c29bd0be02199d91646441b7ef06e81064110c6b81f83e"} Mar 09 14:22:49 crc kubenswrapper[4764]: I0309 14:22:49.324709 4764 scope.go:117] "RemoveContainer" containerID="ad80cf2b70762a65d9c7f26342ac7f2b7bc1aeeb502716f1f6efa154db3dcdb4" Mar 09 14:22:49 crc kubenswrapper[4764]: I0309 14:22:49.324955 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-fcpdb" Mar 09 14:22:49 crc kubenswrapper[4764]: I0309 14:22:49.367904 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-fcpdb"] Mar 09 14:22:49 crc kubenswrapper[4764]: I0309 14:22:49.367973 4764 scope.go:117] "RemoveContainer" containerID="4eae2ce540cdd8401ae00543317b875985d0ae3eeaf5bcbc0a26c17c98f2b329" Mar 09 14:22:49 crc kubenswrapper[4764]: I0309 14:22:49.379099 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-fcpdb"] Mar 09 14:22:49 crc kubenswrapper[4764]: I0309 14:22:49.410385 4764 scope.go:117] "RemoveContainer" containerID="56d76f8c6bb58626d2529263241264f7ef0049500bf7d5bd6760d3c4cee8e3c0" Mar 09 14:22:49 crc kubenswrapper[4764]: I0309 14:22:49.462086 4764 scope.go:117] "RemoveContainer" containerID="ad80cf2b70762a65d9c7f26342ac7f2b7bc1aeeb502716f1f6efa154db3dcdb4" Mar 09 14:22:49 crc kubenswrapper[4764]: E0309 14:22:49.462791 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ad80cf2b70762a65d9c7f26342ac7f2b7bc1aeeb502716f1f6efa154db3dcdb4\": container with ID starting with ad80cf2b70762a65d9c7f26342ac7f2b7bc1aeeb502716f1f6efa154db3dcdb4 not found: ID does not exist" containerID="ad80cf2b70762a65d9c7f26342ac7f2b7bc1aeeb502716f1f6efa154db3dcdb4" Mar 09 14:22:49 crc kubenswrapper[4764]: I0309 14:22:49.462856 4764 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ad80cf2b70762a65d9c7f26342ac7f2b7bc1aeeb502716f1f6efa154db3dcdb4"} err="failed to get container status \"ad80cf2b70762a65d9c7f26342ac7f2b7bc1aeeb502716f1f6efa154db3dcdb4\": rpc error: code = NotFound desc = could not find container \"ad80cf2b70762a65d9c7f26342ac7f2b7bc1aeeb502716f1f6efa154db3dcdb4\": container with ID starting with ad80cf2b70762a65d9c7f26342ac7f2b7bc1aeeb502716f1f6efa154db3dcdb4 not found: ID does not exist" Mar 09 14:22:49 crc kubenswrapper[4764]: I0309 14:22:49.462893 4764 scope.go:117] "RemoveContainer" containerID="4eae2ce540cdd8401ae00543317b875985d0ae3eeaf5bcbc0a26c17c98f2b329" Mar 09 14:22:49 crc kubenswrapper[4764]: E0309 14:22:49.463330 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4eae2ce540cdd8401ae00543317b875985d0ae3eeaf5bcbc0a26c17c98f2b329\": container with ID starting with 4eae2ce540cdd8401ae00543317b875985d0ae3eeaf5bcbc0a26c17c98f2b329 not found: ID does not exist" containerID="4eae2ce540cdd8401ae00543317b875985d0ae3eeaf5bcbc0a26c17c98f2b329" Mar 09 14:22:49 crc kubenswrapper[4764]: I0309 14:22:49.463362 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4eae2ce540cdd8401ae00543317b875985d0ae3eeaf5bcbc0a26c17c98f2b329"} err="failed to get container status \"4eae2ce540cdd8401ae00543317b875985d0ae3eeaf5bcbc0a26c17c98f2b329\": rpc error: code = NotFound desc = could not find container \"4eae2ce540cdd8401ae00543317b875985d0ae3eeaf5bcbc0a26c17c98f2b329\": container with ID starting with 4eae2ce540cdd8401ae00543317b875985d0ae3eeaf5bcbc0a26c17c98f2b329 not found: ID does not exist" Mar 09 14:22:49 crc kubenswrapper[4764]: I0309 14:22:49.463378 4764 scope.go:117] "RemoveContainer" containerID="56d76f8c6bb58626d2529263241264f7ef0049500bf7d5bd6760d3c4cee8e3c0" Mar 09 14:22:49 crc kubenswrapper[4764]: E0309 
14:22:49.463640 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"56d76f8c6bb58626d2529263241264f7ef0049500bf7d5bd6760d3c4cee8e3c0\": container with ID starting with 56d76f8c6bb58626d2529263241264f7ef0049500bf7d5bd6760d3c4cee8e3c0 not found: ID does not exist" containerID="56d76f8c6bb58626d2529263241264f7ef0049500bf7d5bd6760d3c4cee8e3c0" Mar 09 14:22:49 crc kubenswrapper[4764]: I0309 14:22:49.463688 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"56d76f8c6bb58626d2529263241264f7ef0049500bf7d5bd6760d3c4cee8e3c0"} err="failed to get container status \"56d76f8c6bb58626d2529263241264f7ef0049500bf7d5bd6760d3c4cee8e3c0\": rpc error: code = NotFound desc = could not find container \"56d76f8c6bb58626d2529263241264f7ef0049500bf7d5bd6760d3c4cee8e3c0\": container with ID starting with 56d76f8c6bb58626d2529263241264f7ef0049500bf7d5bd6760d3c4cee8e3c0 not found: ID does not exist" Mar 09 14:22:49 crc kubenswrapper[4764]: I0309 14:22:49.572386 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4638c73f-5adb-4e39-b7d3-b1d6627b7705" path="/var/lib/kubelet/pods/4638c73f-5adb-4e39-b7d3-b1d6627b7705/volumes" Mar 09 14:22:50 crc kubenswrapper[4764]: I0309 14:22:50.561436 4764 scope.go:117] "RemoveContainer" containerID="c10d86bbf5c5d5ea5875f026de2e5f7abb576968d3567040b8020dcc93a08096" Mar 09 14:22:50 crc kubenswrapper[4764]: E0309 14:22:50.561824 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xxczl_openshift-machine-config-operator(6bcdd179-43c2-427c-9fac-7155c122e922)\"" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922" Mar 09 14:23:02 crc kubenswrapper[4764]: I0309 14:23:02.560622 
4764 scope.go:117] "RemoveContainer" containerID="c10d86bbf5c5d5ea5875f026de2e5f7abb576968d3567040b8020dcc93a08096" Mar 09 14:23:02 crc kubenswrapper[4764]: E0309 14:23:02.562021 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xxczl_openshift-machine-config-operator(6bcdd179-43c2-427c-9fac-7155c122e922)\"" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922" Mar 09 14:23:16 crc kubenswrapper[4764]: I0309 14:23:16.560536 4764 scope.go:117] "RemoveContainer" containerID="c10d86bbf5c5d5ea5875f026de2e5f7abb576968d3567040b8020dcc93a08096" Mar 09 14:23:16 crc kubenswrapper[4764]: E0309 14:23:16.561571 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xxczl_openshift-machine-config-operator(6bcdd179-43c2-427c-9fac-7155c122e922)\"" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922" Mar 09 14:23:31 crc kubenswrapper[4764]: I0309 14:23:31.560197 4764 scope.go:117] "RemoveContainer" containerID="c10d86bbf5c5d5ea5875f026de2e5f7abb576968d3567040b8020dcc93a08096" Mar 09 14:23:31 crc kubenswrapper[4764]: E0309 14:23:31.561336 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xxczl_openshift-machine-config-operator(6bcdd179-43c2-427c-9fac-7155c122e922)\"" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922" Mar 09 14:23:46 crc kubenswrapper[4764]: I0309 
14:23:46.560942 4764 scope.go:117] "RemoveContainer" containerID="c10d86bbf5c5d5ea5875f026de2e5f7abb576968d3567040b8020dcc93a08096" Mar 09 14:23:46 crc kubenswrapper[4764]: E0309 14:23:46.562209 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xxczl_openshift-machine-config-operator(6bcdd179-43c2-427c-9fac-7155c122e922)\"" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922" Mar 09 14:23:58 crc kubenswrapper[4764]: I0309 14:23:58.560040 4764 scope.go:117] "RemoveContainer" containerID="c10d86bbf5c5d5ea5875f026de2e5f7abb576968d3567040b8020dcc93a08096" Mar 09 14:23:58 crc kubenswrapper[4764]: E0309 14:23:58.561258 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xxczl_openshift-machine-config-operator(6bcdd179-43c2-427c-9fac-7155c122e922)\"" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922" Mar 09 14:24:00 crc kubenswrapper[4764]: I0309 14:24:00.174265 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29551104-pc4h6"] Mar 09 14:24:00 crc kubenswrapper[4764]: E0309 14:24:00.175267 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4638c73f-5adb-4e39-b7d3-b1d6627b7705" containerName="registry-server" Mar 09 14:24:00 crc kubenswrapper[4764]: I0309 14:24:00.175288 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="4638c73f-5adb-4e39-b7d3-b1d6627b7705" containerName="registry-server" Mar 09 14:24:00 crc kubenswrapper[4764]: E0309 14:24:00.175354 4764 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="4638c73f-5adb-4e39-b7d3-b1d6627b7705" containerName="extract-utilities" Mar 09 14:24:00 crc kubenswrapper[4764]: I0309 14:24:00.175363 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="4638c73f-5adb-4e39-b7d3-b1d6627b7705" containerName="extract-utilities" Mar 09 14:24:00 crc kubenswrapper[4764]: E0309 14:24:00.175377 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4638c73f-5adb-4e39-b7d3-b1d6627b7705" containerName="extract-content" Mar 09 14:24:00 crc kubenswrapper[4764]: I0309 14:24:00.175386 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="4638c73f-5adb-4e39-b7d3-b1d6627b7705" containerName="extract-content" Mar 09 14:24:00 crc kubenswrapper[4764]: I0309 14:24:00.175619 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="4638c73f-5adb-4e39-b7d3-b1d6627b7705" containerName="registry-server" Mar 09 14:24:00 crc kubenswrapper[4764]: I0309 14:24:00.176584 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551104-pc4h6" Mar 09 14:24:00 crc kubenswrapper[4764]: I0309 14:24:00.181324 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 09 14:24:00 crc kubenswrapper[4764]: I0309 14:24:00.181324 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 09 14:24:00 crc kubenswrapper[4764]: I0309 14:24:00.184532 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-4s7mr" Mar 09 14:24:00 crc kubenswrapper[4764]: I0309 14:24:00.202762 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551104-pc4h6"] Mar 09 14:24:00 crc kubenswrapper[4764]: I0309 14:24:00.214312 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x5hdp\" (UniqueName: 
\"kubernetes.io/projected/b4e97252-1933-4b92-ab28-f9713db14afb-kube-api-access-x5hdp\") pod \"auto-csr-approver-29551104-pc4h6\" (UID: \"b4e97252-1933-4b92-ab28-f9713db14afb\") " pod="openshift-infra/auto-csr-approver-29551104-pc4h6" Mar 09 14:24:00 crc kubenswrapper[4764]: I0309 14:24:00.317165 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x5hdp\" (UniqueName: \"kubernetes.io/projected/b4e97252-1933-4b92-ab28-f9713db14afb-kube-api-access-x5hdp\") pod \"auto-csr-approver-29551104-pc4h6\" (UID: \"b4e97252-1933-4b92-ab28-f9713db14afb\") " pod="openshift-infra/auto-csr-approver-29551104-pc4h6" Mar 09 14:24:00 crc kubenswrapper[4764]: I0309 14:24:00.343179 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x5hdp\" (UniqueName: \"kubernetes.io/projected/b4e97252-1933-4b92-ab28-f9713db14afb-kube-api-access-x5hdp\") pod \"auto-csr-approver-29551104-pc4h6\" (UID: \"b4e97252-1933-4b92-ab28-f9713db14afb\") " pod="openshift-infra/auto-csr-approver-29551104-pc4h6" Mar 09 14:24:00 crc kubenswrapper[4764]: I0309 14:24:00.499867 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551104-pc4h6" Mar 09 14:24:01 crc kubenswrapper[4764]: I0309 14:24:00.999944 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551104-pc4h6"] Mar 09 14:24:01 crc kubenswrapper[4764]: I0309 14:24:01.017130 4764 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 09 14:24:01 crc kubenswrapper[4764]: I0309 14:24:01.161438 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551104-pc4h6" event={"ID":"b4e97252-1933-4b92-ab28-f9713db14afb","Type":"ContainerStarted","Data":"9bdfcd300abbbc3598563c8f1975cdc69a41fc3ac9b5564bdbcb92bcbc4b9f52"} Mar 09 14:24:03 crc kubenswrapper[4764]: I0309 14:24:03.185758 4764 generic.go:334] "Generic (PLEG): container finished" podID="b4e97252-1933-4b92-ab28-f9713db14afb" containerID="fbe599b37c79d3976b2347daa88e4c2b8bc846bfc99bce86dadf53c1a9ea4b91" exitCode=0 Mar 09 14:24:03 crc kubenswrapper[4764]: I0309 14:24:03.185827 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551104-pc4h6" event={"ID":"b4e97252-1933-4b92-ab28-f9713db14afb","Type":"ContainerDied","Data":"fbe599b37c79d3976b2347daa88e4c2b8bc846bfc99bce86dadf53c1a9ea4b91"} Mar 09 14:24:04 crc kubenswrapper[4764]: I0309 14:24:04.579347 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551104-pc4h6" Mar 09 14:24:04 crc kubenswrapper[4764]: I0309 14:24:04.628949 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x5hdp\" (UniqueName: \"kubernetes.io/projected/b4e97252-1933-4b92-ab28-f9713db14afb-kube-api-access-x5hdp\") pod \"b4e97252-1933-4b92-ab28-f9713db14afb\" (UID: \"b4e97252-1933-4b92-ab28-f9713db14afb\") " Mar 09 14:24:04 crc kubenswrapper[4764]: I0309 14:24:04.644193 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b4e97252-1933-4b92-ab28-f9713db14afb-kube-api-access-x5hdp" (OuterVolumeSpecName: "kube-api-access-x5hdp") pod "b4e97252-1933-4b92-ab28-f9713db14afb" (UID: "b4e97252-1933-4b92-ab28-f9713db14afb"). InnerVolumeSpecName "kube-api-access-x5hdp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:24:04 crc kubenswrapper[4764]: I0309 14:24:04.732443 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x5hdp\" (UniqueName: \"kubernetes.io/projected/b4e97252-1933-4b92-ab28-f9713db14afb-kube-api-access-x5hdp\") on node \"crc\" DevicePath \"\"" Mar 09 14:24:05 crc kubenswrapper[4764]: I0309 14:24:05.206717 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551104-pc4h6" event={"ID":"b4e97252-1933-4b92-ab28-f9713db14afb","Type":"ContainerDied","Data":"9bdfcd300abbbc3598563c8f1975cdc69a41fc3ac9b5564bdbcb92bcbc4b9f52"} Mar 09 14:24:05 crc kubenswrapper[4764]: I0309 14:24:05.207090 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9bdfcd300abbbc3598563c8f1975cdc69a41fc3ac9b5564bdbcb92bcbc4b9f52" Mar 09 14:24:05 crc kubenswrapper[4764]: I0309 14:24:05.206800 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551104-pc4h6" Mar 09 14:24:05 crc kubenswrapper[4764]: I0309 14:24:05.669416 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29551098-7clls"] Mar 09 14:24:05 crc kubenswrapper[4764]: I0309 14:24:05.680869 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29551098-7clls"] Mar 09 14:24:07 crc kubenswrapper[4764]: I0309 14:24:07.572450 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e86714ea-a59a-4955-b4a5-038ce0ce7bf6" path="/var/lib/kubelet/pods/e86714ea-a59a-4955-b4a5-038ce0ce7bf6/volumes" Mar 09 14:24:11 crc kubenswrapper[4764]: I0309 14:24:11.549859 4764 scope.go:117] "RemoveContainer" containerID="c9de23699d45475689268b26fadafce561fa4c6da1922b49b79542286bd5c95d" Mar 09 14:24:13 crc kubenswrapper[4764]: I0309 14:24:13.564796 4764 scope.go:117] "RemoveContainer" containerID="c10d86bbf5c5d5ea5875f026de2e5f7abb576968d3567040b8020dcc93a08096" Mar 09 14:24:13 crc kubenswrapper[4764]: E0309 14:24:13.565726 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xxczl_openshift-machine-config-operator(6bcdd179-43c2-427c-9fac-7155c122e922)\"" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922" Mar 09 14:24:26 crc kubenswrapper[4764]: I0309 14:24:26.560033 4764 scope.go:117] "RemoveContainer" containerID="c10d86bbf5c5d5ea5875f026de2e5f7abb576968d3567040b8020dcc93a08096" Mar 09 14:24:26 crc kubenswrapper[4764]: E0309 14:24:26.561143 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-xxczl_openshift-machine-config-operator(6bcdd179-43c2-427c-9fac-7155c122e922)\"" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922" Mar 09 14:24:41 crc kubenswrapper[4764]: I0309 14:24:41.559921 4764 scope.go:117] "RemoveContainer" containerID="c10d86bbf5c5d5ea5875f026de2e5f7abb576968d3567040b8020dcc93a08096" Mar 09 14:24:41 crc kubenswrapper[4764]: E0309 14:24:41.561025 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xxczl_openshift-machine-config-operator(6bcdd179-43c2-427c-9fac-7155c122e922)\"" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922" Mar 09 14:24:55 crc kubenswrapper[4764]: I0309 14:24:55.568080 4764 scope.go:117] "RemoveContainer" containerID="c10d86bbf5c5d5ea5875f026de2e5f7abb576968d3567040b8020dcc93a08096" Mar 09 14:24:55 crc kubenswrapper[4764]: E0309 14:24:55.569360 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xxczl_openshift-machine-config-operator(6bcdd179-43c2-427c-9fac-7155c122e922)\"" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922" Mar 09 14:25:07 crc kubenswrapper[4764]: I0309 14:25:07.559767 4764 scope.go:117] "RemoveContainer" containerID="c10d86bbf5c5d5ea5875f026de2e5f7abb576968d3567040b8020dcc93a08096" Mar 09 14:25:07 crc kubenswrapper[4764]: E0309 14:25:07.564136 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-xxczl_openshift-machine-config-operator(6bcdd179-43c2-427c-9fac-7155c122e922)\"" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922" Mar 09 14:25:19 crc kubenswrapper[4764]: I0309 14:25:19.560716 4764 scope.go:117] "RemoveContainer" containerID="c10d86bbf5c5d5ea5875f026de2e5f7abb576968d3567040b8020dcc93a08096" Mar 09 14:25:19 crc kubenswrapper[4764]: E0309 14:25:19.561877 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xxczl_openshift-machine-config-operator(6bcdd179-43c2-427c-9fac-7155c122e922)\"" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922" Mar 09 14:25:31 crc kubenswrapper[4764]: I0309 14:25:31.560555 4764 scope.go:117] "RemoveContainer" containerID="c10d86bbf5c5d5ea5875f026de2e5f7abb576968d3567040b8020dcc93a08096" Mar 09 14:25:31 crc kubenswrapper[4764]: E0309 14:25:31.561685 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xxczl_openshift-machine-config-operator(6bcdd179-43c2-427c-9fac-7155c122e922)\"" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922" Mar 09 14:25:43 crc kubenswrapper[4764]: I0309 14:25:43.560410 4764 scope.go:117] "RemoveContainer" containerID="c10d86bbf5c5d5ea5875f026de2e5f7abb576968d3567040b8020dcc93a08096" Mar 09 14:25:43 crc kubenswrapper[4764]: E0309 14:25:43.561473 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-xxczl_openshift-machine-config-operator(6bcdd179-43c2-427c-9fac-7155c122e922)\"" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922" Mar 09 14:25:58 crc kubenswrapper[4764]: I0309 14:25:58.560803 4764 scope.go:117] "RemoveContainer" containerID="c10d86bbf5c5d5ea5875f026de2e5f7abb576968d3567040b8020dcc93a08096" Mar 09 14:25:58 crc kubenswrapper[4764]: E0309 14:25:58.561996 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xxczl_openshift-machine-config-operator(6bcdd179-43c2-427c-9fac-7155c122e922)\"" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922" Mar 09 14:26:00 crc kubenswrapper[4764]: I0309 14:26:00.152556 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29551106-dnhln"] Mar 09 14:26:00 crc kubenswrapper[4764]: E0309 14:26:00.153536 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b4e97252-1933-4b92-ab28-f9713db14afb" containerName="oc" Mar 09 14:26:00 crc kubenswrapper[4764]: I0309 14:26:00.153556 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="b4e97252-1933-4b92-ab28-f9713db14afb" containerName="oc" Mar 09 14:26:00 crc kubenswrapper[4764]: I0309 14:26:00.153828 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="b4e97252-1933-4b92-ab28-f9713db14afb" containerName="oc" Mar 09 14:26:00 crc kubenswrapper[4764]: I0309 14:26:00.158332 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551106-dnhln" Mar 09 14:26:00 crc kubenswrapper[4764]: I0309 14:26:00.171352 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 09 14:26:00 crc kubenswrapper[4764]: I0309 14:26:00.171519 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-4s7mr" Mar 09 14:26:00 crc kubenswrapper[4764]: I0309 14:26:00.176484 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 09 14:26:00 crc kubenswrapper[4764]: I0309 14:26:00.179042 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551106-dnhln"] Mar 09 14:26:00 crc kubenswrapper[4764]: I0309 14:26:00.277939 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vcxjc\" (UniqueName: \"kubernetes.io/projected/d33fc9b0-e440-4f1b-9522-1abec06eca2a-kube-api-access-vcxjc\") pod \"auto-csr-approver-29551106-dnhln\" (UID: \"d33fc9b0-e440-4f1b-9522-1abec06eca2a\") " pod="openshift-infra/auto-csr-approver-29551106-dnhln" Mar 09 14:26:00 crc kubenswrapper[4764]: I0309 14:26:00.381196 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vcxjc\" (UniqueName: \"kubernetes.io/projected/d33fc9b0-e440-4f1b-9522-1abec06eca2a-kube-api-access-vcxjc\") pod \"auto-csr-approver-29551106-dnhln\" (UID: \"d33fc9b0-e440-4f1b-9522-1abec06eca2a\") " pod="openshift-infra/auto-csr-approver-29551106-dnhln" Mar 09 14:26:00 crc kubenswrapper[4764]: I0309 14:26:00.794192 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vcxjc\" (UniqueName: \"kubernetes.io/projected/d33fc9b0-e440-4f1b-9522-1abec06eca2a-kube-api-access-vcxjc\") pod \"auto-csr-approver-29551106-dnhln\" (UID: \"d33fc9b0-e440-4f1b-9522-1abec06eca2a\") " 
pod="openshift-infra/auto-csr-approver-29551106-dnhln" Mar 09 14:26:00 crc kubenswrapper[4764]: I0309 14:26:00.830732 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551106-dnhln" Mar 09 14:26:01 crc kubenswrapper[4764]: I0309 14:26:01.343092 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551106-dnhln"] Mar 09 14:26:01 crc kubenswrapper[4764]: I0309 14:26:01.386244 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551106-dnhln" event={"ID":"d33fc9b0-e440-4f1b-9522-1abec06eca2a","Type":"ContainerStarted","Data":"e9244a9303f048aa1f4d232fd72882791bc82d03a9aa22f780d5079aac920863"} Mar 09 14:26:03 crc kubenswrapper[4764]: I0309 14:26:03.408160 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551106-dnhln" event={"ID":"d33fc9b0-e440-4f1b-9522-1abec06eca2a","Type":"ContainerStarted","Data":"8f9d999651aa510edfbd48cd8433fc728bf24d4c32451ce578a41c04f581a328"} Mar 09 14:26:03 crc kubenswrapper[4764]: I0309 14:26:03.433281 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29551106-dnhln" podStartSLOduration=2.299167453 podStartE2EDuration="3.433242913s" podCreationTimestamp="2026-03-09 14:26:00 +0000 UTC" firstStartedPulling="2026-03-09 14:26:01.349812092 +0000 UTC m=+3916.599984000" lastFinishedPulling="2026-03-09 14:26:02.483887542 +0000 UTC m=+3917.734059460" observedRunningTime="2026-03-09 14:26:03.423746928 +0000 UTC m=+3918.673918846" watchObservedRunningTime="2026-03-09 14:26:03.433242913 +0000 UTC m=+3918.683414831" Mar 09 14:26:04 crc kubenswrapper[4764]: I0309 14:26:04.421061 4764 generic.go:334] "Generic (PLEG): container finished" podID="d33fc9b0-e440-4f1b-9522-1abec06eca2a" containerID="8f9d999651aa510edfbd48cd8433fc728bf24d4c32451ce578a41c04f581a328" exitCode=0 Mar 09 14:26:04 crc 
kubenswrapper[4764]: I0309 14:26:04.421108 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551106-dnhln" event={"ID":"d33fc9b0-e440-4f1b-9522-1abec06eca2a","Type":"ContainerDied","Data":"8f9d999651aa510edfbd48cd8433fc728bf24d4c32451ce578a41c04f581a328"} Mar 09 14:26:05 crc kubenswrapper[4764]: I0309 14:26:05.847613 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551106-dnhln" Mar 09 14:26:05 crc kubenswrapper[4764]: I0309 14:26:05.929637 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vcxjc\" (UniqueName: \"kubernetes.io/projected/d33fc9b0-e440-4f1b-9522-1abec06eca2a-kube-api-access-vcxjc\") pod \"d33fc9b0-e440-4f1b-9522-1abec06eca2a\" (UID: \"d33fc9b0-e440-4f1b-9522-1abec06eca2a\") " Mar 09 14:26:05 crc kubenswrapper[4764]: I0309 14:26:05.937328 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d33fc9b0-e440-4f1b-9522-1abec06eca2a-kube-api-access-vcxjc" (OuterVolumeSpecName: "kube-api-access-vcxjc") pod "d33fc9b0-e440-4f1b-9522-1abec06eca2a" (UID: "d33fc9b0-e440-4f1b-9522-1abec06eca2a"). InnerVolumeSpecName "kube-api-access-vcxjc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:26:06 crc kubenswrapper[4764]: I0309 14:26:06.033843 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vcxjc\" (UniqueName: \"kubernetes.io/projected/d33fc9b0-e440-4f1b-9522-1abec06eca2a-kube-api-access-vcxjc\") on node \"crc\" DevicePath \"\"" Mar 09 14:26:06 crc kubenswrapper[4764]: I0309 14:26:06.444454 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551106-dnhln" event={"ID":"d33fc9b0-e440-4f1b-9522-1abec06eca2a","Type":"ContainerDied","Data":"e9244a9303f048aa1f4d232fd72882791bc82d03a9aa22f780d5079aac920863"} Mar 09 14:26:06 crc kubenswrapper[4764]: I0309 14:26:06.444951 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e9244a9303f048aa1f4d232fd72882791bc82d03a9aa22f780d5079aac920863" Mar 09 14:26:06 crc kubenswrapper[4764]: I0309 14:26:06.444613 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551106-dnhln" Mar 09 14:26:06 crc kubenswrapper[4764]: I0309 14:26:06.516028 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29551100-l6b4v"] Mar 09 14:26:06 crc kubenswrapper[4764]: I0309 14:26:06.525201 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29551100-l6b4v"] Mar 09 14:26:07 crc kubenswrapper[4764]: I0309 14:26:07.571845 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ac3d8a45-0030-433c-a813-fa93811b952f" path="/var/lib/kubelet/pods/ac3d8a45-0030-433c-a813-fa93811b952f/volumes" Mar 09 14:26:11 crc kubenswrapper[4764]: I0309 14:26:11.667961 4764 scope.go:117] "RemoveContainer" containerID="0555c912480a032e6e1167209d44998b6306e07cd673abce316913bd071bd80f" Mar 09 14:26:13 crc kubenswrapper[4764]: I0309 14:26:13.560699 4764 scope.go:117] "RemoveContainer" 
containerID="c10d86bbf5c5d5ea5875f026de2e5f7abb576968d3567040b8020dcc93a08096" Mar 09 14:26:13 crc kubenswrapper[4764]: E0309 14:26:13.561483 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xxczl_openshift-machine-config-operator(6bcdd179-43c2-427c-9fac-7155c122e922)\"" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922" Mar 09 14:26:27 crc kubenswrapper[4764]: I0309 14:26:27.561045 4764 scope.go:117] "RemoveContainer" containerID="c10d86bbf5c5d5ea5875f026de2e5f7abb576968d3567040b8020dcc93a08096" Mar 09 14:26:27 crc kubenswrapper[4764]: E0309 14:26:27.562842 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xxczl_openshift-machine-config-operator(6bcdd179-43c2-427c-9fac-7155c122e922)\"" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922" Mar 09 14:26:38 crc kubenswrapper[4764]: I0309 14:26:38.560290 4764 scope.go:117] "RemoveContainer" containerID="c10d86bbf5c5d5ea5875f026de2e5f7abb576968d3567040b8020dcc93a08096" Mar 09 14:26:38 crc kubenswrapper[4764]: E0309 14:26:38.561493 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xxczl_openshift-machine-config-operator(6bcdd179-43c2-427c-9fac-7155c122e922)\"" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922" Mar 09 14:26:53 crc kubenswrapper[4764]: I0309 14:26:53.561142 4764 scope.go:117] 
"RemoveContainer" containerID="c10d86bbf5c5d5ea5875f026de2e5f7abb576968d3567040b8020dcc93a08096" Mar 09 14:26:53 crc kubenswrapper[4764]: E0309 14:26:53.562223 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xxczl_openshift-machine-config-operator(6bcdd179-43c2-427c-9fac-7155c122e922)\"" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922" Mar 09 14:27:04 crc kubenswrapper[4764]: I0309 14:27:04.560741 4764 scope.go:117] "RemoveContainer" containerID="c10d86bbf5c5d5ea5875f026de2e5f7abb576968d3567040b8020dcc93a08096" Mar 09 14:27:05 crc kubenswrapper[4764]: I0309 14:27:05.017435 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" event={"ID":"6bcdd179-43c2-427c-9fac-7155c122e922","Type":"ContainerStarted","Data":"956798d3640b6af2f25ed443a7b5b3c26e0da670aff756c574b1a30cc50879dd"} Mar 09 14:28:00 crc kubenswrapper[4764]: I0309 14:28:00.153992 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29551108-8tqjz"] Mar 09 14:28:00 crc kubenswrapper[4764]: E0309 14:28:00.155521 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d33fc9b0-e440-4f1b-9522-1abec06eca2a" containerName="oc" Mar 09 14:28:00 crc kubenswrapper[4764]: I0309 14:28:00.155541 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="d33fc9b0-e440-4f1b-9522-1abec06eca2a" containerName="oc" Mar 09 14:28:00 crc kubenswrapper[4764]: I0309 14:28:00.155873 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="d33fc9b0-e440-4f1b-9522-1abec06eca2a" containerName="oc" Mar 09 14:28:00 crc kubenswrapper[4764]: I0309 14:28:00.156905 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551108-8tqjz" Mar 09 14:28:00 crc kubenswrapper[4764]: I0309 14:28:00.159583 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 09 14:28:00 crc kubenswrapper[4764]: I0309 14:28:00.159757 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-4s7mr" Mar 09 14:28:00 crc kubenswrapper[4764]: I0309 14:28:00.160069 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 09 14:28:00 crc kubenswrapper[4764]: I0309 14:28:00.166883 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551108-8tqjz"] Mar 09 14:28:00 crc kubenswrapper[4764]: I0309 14:28:00.258948 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fs8d7\" (UniqueName: \"kubernetes.io/projected/1f49d558-1317-4099-abb0-bb57895b3917-kube-api-access-fs8d7\") pod \"auto-csr-approver-29551108-8tqjz\" (UID: \"1f49d558-1317-4099-abb0-bb57895b3917\") " pod="openshift-infra/auto-csr-approver-29551108-8tqjz" Mar 09 14:28:00 crc kubenswrapper[4764]: I0309 14:28:00.361220 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fs8d7\" (UniqueName: \"kubernetes.io/projected/1f49d558-1317-4099-abb0-bb57895b3917-kube-api-access-fs8d7\") pod \"auto-csr-approver-29551108-8tqjz\" (UID: \"1f49d558-1317-4099-abb0-bb57895b3917\") " pod="openshift-infra/auto-csr-approver-29551108-8tqjz" Mar 09 14:28:00 crc kubenswrapper[4764]: I0309 14:28:00.386226 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fs8d7\" (UniqueName: \"kubernetes.io/projected/1f49d558-1317-4099-abb0-bb57895b3917-kube-api-access-fs8d7\") pod \"auto-csr-approver-29551108-8tqjz\" (UID: \"1f49d558-1317-4099-abb0-bb57895b3917\") " 
pod="openshift-infra/auto-csr-approver-29551108-8tqjz" Mar 09 14:28:00 crc kubenswrapper[4764]: I0309 14:28:00.487817 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551108-8tqjz" Mar 09 14:28:00 crc kubenswrapper[4764]: I0309 14:28:00.998605 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551108-8tqjz"] Mar 09 14:28:01 crc kubenswrapper[4764]: I0309 14:28:01.686805 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551108-8tqjz" event={"ID":"1f49d558-1317-4099-abb0-bb57895b3917","Type":"ContainerStarted","Data":"4bf524a7497f95a2615264ebe34d165b98a41502eaf492b05240e40372c8f8f6"} Mar 09 14:28:02 crc kubenswrapper[4764]: E0309 14:28:02.634605 4764 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1f49d558_1317_4099_abb0_bb57895b3917.slice/crio-conmon-3d8746bde1498da01929d78c37701a3fb1698a0044a00e71ddbcca0d4465fb3c.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1f49d558_1317_4099_abb0_bb57895b3917.slice/crio-3d8746bde1498da01929d78c37701a3fb1698a0044a00e71ddbcca0d4465fb3c.scope\": RecentStats: unable to find data in memory cache]" Mar 09 14:28:02 crc kubenswrapper[4764]: I0309 14:28:02.699998 4764 generic.go:334] "Generic (PLEG): container finished" podID="1f49d558-1317-4099-abb0-bb57895b3917" containerID="3d8746bde1498da01929d78c37701a3fb1698a0044a00e71ddbcca0d4465fb3c" exitCode=0 Mar 09 14:28:02 crc kubenswrapper[4764]: I0309 14:28:02.700415 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551108-8tqjz" event={"ID":"1f49d558-1317-4099-abb0-bb57895b3917","Type":"ContainerDied","Data":"3d8746bde1498da01929d78c37701a3fb1698a0044a00e71ddbcca0d4465fb3c"} Mar 09 
14:28:04 crc kubenswrapper[4764]: I0309 14:28:04.208673 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551108-8tqjz" Mar 09 14:28:04 crc kubenswrapper[4764]: I0309 14:28:04.385364 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fs8d7\" (UniqueName: \"kubernetes.io/projected/1f49d558-1317-4099-abb0-bb57895b3917-kube-api-access-fs8d7\") pod \"1f49d558-1317-4099-abb0-bb57895b3917\" (UID: \"1f49d558-1317-4099-abb0-bb57895b3917\") " Mar 09 14:28:04 crc kubenswrapper[4764]: I0309 14:28:04.391741 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1f49d558-1317-4099-abb0-bb57895b3917-kube-api-access-fs8d7" (OuterVolumeSpecName: "kube-api-access-fs8d7") pod "1f49d558-1317-4099-abb0-bb57895b3917" (UID: "1f49d558-1317-4099-abb0-bb57895b3917"). InnerVolumeSpecName "kube-api-access-fs8d7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:28:04 crc kubenswrapper[4764]: I0309 14:28:04.490858 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fs8d7\" (UniqueName: \"kubernetes.io/projected/1f49d558-1317-4099-abb0-bb57895b3917-kube-api-access-fs8d7\") on node \"crc\" DevicePath \"\"" Mar 09 14:28:04 crc kubenswrapper[4764]: I0309 14:28:04.723130 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551108-8tqjz" event={"ID":"1f49d558-1317-4099-abb0-bb57895b3917","Type":"ContainerDied","Data":"4bf524a7497f95a2615264ebe34d165b98a41502eaf492b05240e40372c8f8f6"} Mar 09 14:28:04 crc kubenswrapper[4764]: I0309 14:28:04.723198 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4bf524a7497f95a2615264ebe34d165b98a41502eaf492b05240e40372c8f8f6" Mar 09 14:28:04 crc kubenswrapper[4764]: I0309 14:28:04.723243 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551108-8tqjz" Mar 09 14:28:05 crc kubenswrapper[4764]: I0309 14:28:05.289446 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29551102-gnwjq"] Mar 09 14:28:05 crc kubenswrapper[4764]: I0309 14:28:05.303954 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29551102-gnwjq"] Mar 09 14:28:05 crc kubenswrapper[4764]: I0309 14:28:05.575280 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f976f5d-f876-491a-8557-f6755b9641a3" path="/var/lib/kubelet/pods/8f976f5d-f876-491a-8557-f6755b9641a3/volumes" Mar 09 14:28:11 crc kubenswrapper[4764]: I0309 14:28:11.803054 4764 scope.go:117] "RemoveContainer" containerID="469e856eb9ee2a8bcc4f55156a6acb6b99b858dcd6fc4bec4aaa66e7dc6669a8" Mar 09 14:28:51 crc kubenswrapper[4764]: I0309 14:28:51.194020 4764 generic.go:334] "Generic (PLEG): container finished" podID="5db22a0e-ee1a-4b26-9e49-b26644266834" containerID="88276b497758f3b5f0f6060dd845ad4b0ff499d7eeb52fb1af2c65524e2ceaba" exitCode=1 Mar 09 14:28:51 crc kubenswrapper[4764]: I0309 14:28:51.194114 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"5db22a0e-ee1a-4b26-9e49-b26644266834","Type":"ContainerDied","Data":"88276b497758f3b5f0f6060dd845ad4b0ff499d7eeb52fb1af2c65524e2ceaba"} Mar 09 14:28:52 crc kubenswrapper[4764]: I0309 14:28:52.591941 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Mar 09 14:28:52 crc kubenswrapper[4764]: I0309 14:28:52.678928 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/5db22a0e-ee1a-4b26-9e49-b26644266834-test-operator-ephemeral-temporary\") pod \"5db22a0e-ee1a-4b26-9e49-b26644266834\" (UID: \"5db22a0e-ee1a-4b26-9e49-b26644266834\") " Mar 09 14:28:52 crc kubenswrapper[4764]: I0309 14:28:52.679141 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/5db22a0e-ee1a-4b26-9e49-b26644266834-openstack-config-secret\") pod \"5db22a0e-ee1a-4b26-9e49-b26644266834\" (UID: \"5db22a0e-ee1a-4b26-9e49-b26644266834\") " Mar 09 14:28:52 crc kubenswrapper[4764]: I0309 14:28:52.679183 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/5db22a0e-ee1a-4b26-9e49-b26644266834-test-operator-ephemeral-workdir\") pod \"5db22a0e-ee1a-4b26-9e49-b26644266834\" (UID: \"5db22a0e-ee1a-4b26-9e49-b26644266834\") " Mar 09 14:28:52 crc kubenswrapper[4764]: I0309 14:28:52.679213 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-logs\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"5db22a0e-ee1a-4b26-9e49-b26644266834\" (UID: \"5db22a0e-ee1a-4b26-9e49-b26644266834\") " Mar 09 14:28:52 crc kubenswrapper[4764]: I0309 14:28:52.679277 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5db22a0e-ee1a-4b26-9e49-b26644266834-ssh-key\") pod \"5db22a0e-ee1a-4b26-9e49-b26644266834\" (UID: \"5db22a0e-ee1a-4b26-9e49-b26644266834\") " Mar 09 14:28:52 crc kubenswrapper[4764]: I0309 14:28:52.679358 4764 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-9db4k\" (UniqueName: \"kubernetes.io/projected/5db22a0e-ee1a-4b26-9e49-b26644266834-kube-api-access-9db4k\") pod \"5db22a0e-ee1a-4b26-9e49-b26644266834\" (UID: \"5db22a0e-ee1a-4b26-9e49-b26644266834\") " Mar 09 14:28:52 crc kubenswrapper[4764]: I0309 14:28:52.679414 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/5db22a0e-ee1a-4b26-9e49-b26644266834-openstack-config\") pod \"5db22a0e-ee1a-4b26-9e49-b26644266834\" (UID: \"5db22a0e-ee1a-4b26-9e49-b26644266834\") " Mar 09 14:28:52 crc kubenswrapper[4764]: I0309 14:28:52.679479 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/5db22a0e-ee1a-4b26-9e49-b26644266834-ca-certs\") pod \"5db22a0e-ee1a-4b26-9e49-b26644266834\" (UID: \"5db22a0e-ee1a-4b26-9e49-b26644266834\") " Mar 09 14:28:52 crc kubenswrapper[4764]: I0309 14:28:52.679554 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5db22a0e-ee1a-4b26-9e49-b26644266834-config-data\") pod \"5db22a0e-ee1a-4b26-9e49-b26644266834\" (UID: \"5db22a0e-ee1a-4b26-9e49-b26644266834\") " Mar 09 14:28:52 crc kubenswrapper[4764]: I0309 14:28:52.679805 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5db22a0e-ee1a-4b26-9e49-b26644266834-test-operator-ephemeral-temporary" (OuterVolumeSpecName: "test-operator-ephemeral-temporary") pod "5db22a0e-ee1a-4b26-9e49-b26644266834" (UID: "5db22a0e-ee1a-4b26-9e49-b26644266834"). InnerVolumeSpecName "test-operator-ephemeral-temporary". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 14:28:52 crc kubenswrapper[4764]: I0309 14:28:52.680380 4764 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/5db22a0e-ee1a-4b26-9e49-b26644266834-test-operator-ephemeral-temporary\") on node \"crc\" DevicePath \"\"" Mar 09 14:28:52 crc kubenswrapper[4764]: I0309 14:28:52.680673 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5db22a0e-ee1a-4b26-9e49-b26644266834-config-data" (OuterVolumeSpecName: "config-data") pod "5db22a0e-ee1a-4b26-9e49-b26644266834" (UID: "5db22a0e-ee1a-4b26-9e49-b26644266834"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 14:28:52 crc kubenswrapper[4764]: I0309 14:28:52.686807 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5db22a0e-ee1a-4b26-9e49-b26644266834-kube-api-access-9db4k" (OuterVolumeSpecName: "kube-api-access-9db4k") pod "5db22a0e-ee1a-4b26-9e49-b26644266834" (UID: "5db22a0e-ee1a-4b26-9e49-b26644266834"). InnerVolumeSpecName "kube-api-access-9db4k". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:28:52 crc kubenswrapper[4764]: I0309 14:28:52.686842 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5db22a0e-ee1a-4b26-9e49-b26644266834-test-operator-ephemeral-workdir" (OuterVolumeSpecName: "test-operator-ephemeral-workdir") pod "5db22a0e-ee1a-4b26-9e49-b26644266834" (UID: "5db22a0e-ee1a-4b26-9e49-b26644266834"). InnerVolumeSpecName "test-operator-ephemeral-workdir". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 14:28:52 crc kubenswrapper[4764]: I0309 14:28:52.686870 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage02-crc" (OuterVolumeSpecName: "test-operator-logs") pod "5db22a0e-ee1a-4b26-9e49-b26644266834" (UID: "5db22a0e-ee1a-4b26-9e49-b26644266834"). InnerVolumeSpecName "local-storage02-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 09 14:28:52 crc kubenswrapper[4764]: I0309 14:28:52.711914 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5db22a0e-ee1a-4b26-9e49-b26644266834-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "5db22a0e-ee1a-4b26-9e49-b26644266834" (UID: "5db22a0e-ee1a-4b26-9e49-b26644266834"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:28:52 crc kubenswrapper[4764]: I0309 14:28:52.712522 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5db22a0e-ee1a-4b26-9e49-b26644266834-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "5db22a0e-ee1a-4b26-9e49-b26644266834" (UID: "5db22a0e-ee1a-4b26-9e49-b26644266834"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:28:52 crc kubenswrapper[4764]: I0309 14:28:52.714853 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5db22a0e-ee1a-4b26-9e49-b26644266834-ca-certs" (OuterVolumeSpecName: "ca-certs") pod "5db22a0e-ee1a-4b26-9e49-b26644266834" (UID: "5db22a0e-ee1a-4b26-9e49-b26644266834"). InnerVolumeSpecName "ca-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:28:52 crc kubenswrapper[4764]: I0309 14:28:52.741121 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5db22a0e-ee1a-4b26-9e49-b26644266834-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "5db22a0e-ee1a-4b26-9e49-b26644266834" (UID: "5db22a0e-ee1a-4b26-9e49-b26644266834"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 14:28:52 crc kubenswrapper[4764]: I0309 14:28:52.782320 4764 reconciler_common.go:293] "Volume detached for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/5db22a0e-ee1a-4b26-9e49-b26644266834-ca-certs\") on node \"crc\" DevicePath \"\"" Mar 09 14:28:52 crc kubenswrapper[4764]: I0309 14:28:52.782366 4764 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5db22a0e-ee1a-4b26-9e49-b26644266834-config-data\") on node \"crc\" DevicePath \"\"" Mar 09 14:28:52 crc kubenswrapper[4764]: I0309 14:28:52.782381 4764 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/5db22a0e-ee1a-4b26-9e49-b26644266834-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Mar 09 14:28:52 crc kubenswrapper[4764]: I0309 14:28:52.782396 4764 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/5db22a0e-ee1a-4b26-9e49-b26644266834-test-operator-ephemeral-workdir\") on node \"crc\" DevicePath \"\"" Mar 09 14:28:52 crc kubenswrapper[4764]: I0309 14:28:52.782441 4764 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" " Mar 09 14:28:52 crc kubenswrapper[4764]: I0309 14:28:52.782452 4764 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" 
(UniqueName: \"kubernetes.io/secret/5db22a0e-ee1a-4b26-9e49-b26644266834-ssh-key\") on node \"crc\" DevicePath \"\"" Mar 09 14:28:52 crc kubenswrapper[4764]: I0309 14:28:52.782462 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9db4k\" (UniqueName: \"kubernetes.io/projected/5db22a0e-ee1a-4b26-9e49-b26644266834-kube-api-access-9db4k\") on node \"crc\" DevicePath \"\"" Mar 09 14:28:52 crc kubenswrapper[4764]: I0309 14:28:52.782470 4764 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/5db22a0e-ee1a-4b26-9e49-b26644266834-openstack-config\") on node \"crc\" DevicePath \"\"" Mar 09 14:28:52 crc kubenswrapper[4764]: I0309 14:28:52.815351 4764 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage02-crc" (UniqueName: "kubernetes.io/local-volume/local-storage02-crc") on node "crc" Mar 09 14:28:52 crc kubenswrapper[4764]: I0309 14:28:52.884887 4764 reconciler_common.go:293] "Volume detached for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" DevicePath \"\"" Mar 09 14:28:53 crc kubenswrapper[4764]: I0309 14:28:53.214633 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"5db22a0e-ee1a-4b26-9e49-b26644266834","Type":"ContainerDied","Data":"bb1050b512fa7c7108bc1149d279ca654a7191121729b646cce9a919a3a91226"} Mar 09 14:28:53 crc kubenswrapper[4764]: I0309 14:28:53.214721 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bb1050b512fa7c7108bc1149d279ca654a7191121729b646cce9a919a3a91226" Mar 09 14:28:53 crc kubenswrapper[4764]: I0309 14:28:53.214775 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Mar 09 14:28:55 crc kubenswrapper[4764]: I0309 14:28:55.914325 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Mar 09 14:28:55 crc kubenswrapper[4764]: E0309 14:28:55.916951 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5db22a0e-ee1a-4b26-9e49-b26644266834" containerName="tempest-tests-tempest-tests-runner" Mar 09 14:28:55 crc kubenswrapper[4764]: I0309 14:28:55.917057 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="5db22a0e-ee1a-4b26-9e49-b26644266834" containerName="tempest-tests-tempest-tests-runner" Mar 09 14:28:55 crc kubenswrapper[4764]: E0309 14:28:55.917162 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f49d558-1317-4099-abb0-bb57895b3917" containerName="oc" Mar 09 14:28:55 crc kubenswrapper[4764]: I0309 14:28:55.917245 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f49d558-1317-4099-abb0-bb57895b3917" containerName="oc" Mar 09 14:28:55 crc kubenswrapper[4764]: I0309 14:28:55.917588 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="1f49d558-1317-4099-abb0-bb57895b3917" containerName="oc" Mar 09 14:28:55 crc kubenswrapper[4764]: I0309 14:28:55.917713 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="5db22a0e-ee1a-4b26-9e49-b26644266834" containerName="tempest-tests-tempest-tests-runner" Mar 09 14:28:55 crc kubenswrapper[4764]: I0309 14:28:55.918722 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 09 14:28:55 crc kubenswrapper[4764]: I0309 14:28:55.921175 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-9bk55" Mar 09 14:28:55 crc kubenswrapper[4764]: I0309 14:28:55.928390 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Mar 09 14:28:56 crc kubenswrapper[4764]: I0309 14:28:56.060709 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"3b233056-629a-4653-8726-76e6b231e58b\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 09 14:28:56 crc kubenswrapper[4764]: I0309 14:28:56.061140 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2pbmq\" (UniqueName: \"kubernetes.io/projected/3b233056-629a-4653-8726-76e6b231e58b-kube-api-access-2pbmq\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"3b233056-629a-4653-8726-76e6b231e58b\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 09 14:28:56 crc kubenswrapper[4764]: I0309 14:28:56.163719 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"3b233056-629a-4653-8726-76e6b231e58b\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 09 14:28:56 crc kubenswrapper[4764]: I0309 14:28:56.163844 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2pbmq\" (UniqueName: 
\"kubernetes.io/projected/3b233056-629a-4653-8726-76e6b231e58b-kube-api-access-2pbmq\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"3b233056-629a-4653-8726-76e6b231e58b\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 09 14:28:56 crc kubenswrapper[4764]: I0309 14:28:56.164491 4764 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"3b233056-629a-4653-8726-76e6b231e58b\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 09 14:28:56 crc kubenswrapper[4764]: I0309 14:28:56.186458 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2pbmq\" (UniqueName: \"kubernetes.io/projected/3b233056-629a-4653-8726-76e6b231e58b-kube-api-access-2pbmq\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"3b233056-629a-4653-8726-76e6b231e58b\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 09 14:28:56 crc kubenswrapper[4764]: I0309 14:28:56.193855 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"3b233056-629a-4653-8726-76e6b231e58b\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 09 14:28:56 crc kubenswrapper[4764]: I0309 14:28:56.236465 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 09 14:28:56 crc kubenswrapper[4764]: I0309 14:28:56.719293 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Mar 09 14:28:57 crc kubenswrapper[4764]: I0309 14:28:57.271617 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"3b233056-629a-4653-8726-76e6b231e58b","Type":"ContainerStarted","Data":"038f360b2c61c4b55ed7bc17478170c2792e08704ea7f96f87972b775e992e5a"} Mar 09 14:28:58 crc kubenswrapper[4764]: I0309 14:28:58.287544 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"3b233056-629a-4653-8726-76e6b231e58b","Type":"ContainerStarted","Data":"ad51648649dfbb2f41e1f67e4374471c83a02d3db0063e492c33bb98690bf63c"} Mar 09 14:29:03 crc kubenswrapper[4764]: I0309 14:29:03.666461 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" podStartSLOduration=7.83190881 podStartE2EDuration="8.666434833s" podCreationTimestamp="2026-03-09 14:28:55 +0000 UTC" firstStartedPulling="2026-03-09 14:28:56.731603636 +0000 UTC m=+4091.981775544" lastFinishedPulling="2026-03-09 14:28:57.566129659 +0000 UTC m=+4092.816301567" observedRunningTime="2026-03-09 14:28:58.307147627 +0000 UTC m=+4093.557319555" watchObservedRunningTime="2026-03-09 14:29:03.666434833 +0000 UTC m=+4098.916606741" Mar 09 14:29:03 crc kubenswrapper[4764]: I0309 14:29:03.673086 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-ggs99"] Mar 09 14:29:03 crc kubenswrapper[4764]: I0309 14:29:03.675721 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-ggs99" Mar 09 14:29:03 crc kubenswrapper[4764]: I0309 14:29:03.690002 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-ggs99"] Mar 09 14:29:03 crc kubenswrapper[4764]: I0309 14:29:03.764408 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gxvvn\" (UniqueName: \"kubernetes.io/projected/176982f0-3e86-471c-8054-13490ee485bb-kube-api-access-gxvvn\") pod \"community-operators-ggs99\" (UID: \"176982f0-3e86-471c-8054-13490ee485bb\") " pod="openshift-marketplace/community-operators-ggs99" Mar 09 14:29:03 crc kubenswrapper[4764]: I0309 14:29:03.765219 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/176982f0-3e86-471c-8054-13490ee485bb-catalog-content\") pod \"community-operators-ggs99\" (UID: \"176982f0-3e86-471c-8054-13490ee485bb\") " pod="openshift-marketplace/community-operators-ggs99" Mar 09 14:29:03 crc kubenswrapper[4764]: I0309 14:29:03.765351 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/176982f0-3e86-471c-8054-13490ee485bb-utilities\") pod \"community-operators-ggs99\" (UID: \"176982f0-3e86-471c-8054-13490ee485bb\") " pod="openshift-marketplace/community-operators-ggs99" Mar 09 14:29:03 crc kubenswrapper[4764]: I0309 14:29:03.868014 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/176982f0-3e86-471c-8054-13490ee485bb-catalog-content\") pod \"community-operators-ggs99\" (UID: \"176982f0-3e86-471c-8054-13490ee485bb\") " pod="openshift-marketplace/community-operators-ggs99" Mar 09 14:29:03 crc kubenswrapper[4764]: I0309 14:29:03.868078 4764 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/176982f0-3e86-471c-8054-13490ee485bb-utilities\") pod \"community-operators-ggs99\" (UID: \"176982f0-3e86-471c-8054-13490ee485bb\") " pod="openshift-marketplace/community-operators-ggs99" Mar 09 14:29:03 crc kubenswrapper[4764]: I0309 14:29:03.868141 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gxvvn\" (UniqueName: \"kubernetes.io/projected/176982f0-3e86-471c-8054-13490ee485bb-kube-api-access-gxvvn\") pod \"community-operators-ggs99\" (UID: \"176982f0-3e86-471c-8054-13490ee485bb\") " pod="openshift-marketplace/community-operators-ggs99" Mar 09 14:29:03 crc kubenswrapper[4764]: I0309 14:29:03.868719 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/176982f0-3e86-471c-8054-13490ee485bb-catalog-content\") pod \"community-operators-ggs99\" (UID: \"176982f0-3e86-471c-8054-13490ee485bb\") " pod="openshift-marketplace/community-operators-ggs99" Mar 09 14:29:03 crc kubenswrapper[4764]: I0309 14:29:03.868879 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/176982f0-3e86-471c-8054-13490ee485bb-utilities\") pod \"community-operators-ggs99\" (UID: \"176982f0-3e86-471c-8054-13490ee485bb\") " pod="openshift-marketplace/community-operators-ggs99" Mar 09 14:29:03 crc kubenswrapper[4764]: I0309 14:29:03.889909 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gxvvn\" (UniqueName: \"kubernetes.io/projected/176982f0-3e86-471c-8054-13490ee485bb-kube-api-access-gxvvn\") pod \"community-operators-ggs99\" (UID: \"176982f0-3e86-471c-8054-13490ee485bb\") " pod="openshift-marketplace/community-operators-ggs99" Mar 09 14:29:04 crc kubenswrapper[4764]: I0309 14:29:04.003719 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-ggs99" Mar 09 14:29:04 crc kubenswrapper[4764]: I0309 14:29:04.103110 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-8g2gm"] Mar 09 14:29:04 crc kubenswrapper[4764]: I0309 14:29:04.105528 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-8g2gm" Mar 09 14:29:04 crc kubenswrapper[4764]: I0309 14:29:04.115673 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-8g2gm"] Mar 09 14:29:04 crc kubenswrapper[4764]: I0309 14:29:04.175515 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/07b9f49e-cd4b-44ac-ba0b-22764e3372b3-catalog-content\") pod \"certified-operators-8g2gm\" (UID: \"07b9f49e-cd4b-44ac-ba0b-22764e3372b3\") " pod="openshift-marketplace/certified-operators-8g2gm" Mar 09 14:29:04 crc kubenswrapper[4764]: I0309 14:29:04.175886 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8mlds\" (UniqueName: \"kubernetes.io/projected/07b9f49e-cd4b-44ac-ba0b-22764e3372b3-kube-api-access-8mlds\") pod \"certified-operators-8g2gm\" (UID: \"07b9f49e-cd4b-44ac-ba0b-22764e3372b3\") " pod="openshift-marketplace/certified-operators-8g2gm" Mar 09 14:29:04 crc kubenswrapper[4764]: I0309 14:29:04.176177 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/07b9f49e-cd4b-44ac-ba0b-22764e3372b3-utilities\") pod \"certified-operators-8g2gm\" (UID: \"07b9f49e-cd4b-44ac-ba0b-22764e3372b3\") " pod="openshift-marketplace/certified-operators-8g2gm" Mar 09 14:29:04 crc kubenswrapper[4764]: I0309 14:29:04.285098 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/07b9f49e-cd4b-44ac-ba0b-22764e3372b3-catalog-content\") pod \"certified-operators-8g2gm\" (UID: \"07b9f49e-cd4b-44ac-ba0b-22764e3372b3\") " pod="openshift-marketplace/certified-operators-8g2gm" Mar 09 14:29:04 crc kubenswrapper[4764]: I0309 14:29:04.285197 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8mlds\" (UniqueName: \"kubernetes.io/projected/07b9f49e-cd4b-44ac-ba0b-22764e3372b3-kube-api-access-8mlds\") pod \"certified-operators-8g2gm\" (UID: \"07b9f49e-cd4b-44ac-ba0b-22764e3372b3\") " pod="openshift-marketplace/certified-operators-8g2gm" Mar 09 14:29:04 crc kubenswrapper[4764]: I0309 14:29:04.285259 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/07b9f49e-cd4b-44ac-ba0b-22764e3372b3-utilities\") pod \"certified-operators-8g2gm\" (UID: \"07b9f49e-cd4b-44ac-ba0b-22764e3372b3\") " pod="openshift-marketplace/certified-operators-8g2gm" Mar 09 14:29:04 crc kubenswrapper[4764]: I0309 14:29:04.285833 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/07b9f49e-cd4b-44ac-ba0b-22764e3372b3-utilities\") pod \"certified-operators-8g2gm\" (UID: \"07b9f49e-cd4b-44ac-ba0b-22764e3372b3\") " pod="openshift-marketplace/certified-operators-8g2gm" Mar 09 14:29:04 crc kubenswrapper[4764]: I0309 14:29:04.286063 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/07b9f49e-cd4b-44ac-ba0b-22764e3372b3-catalog-content\") pod \"certified-operators-8g2gm\" (UID: \"07b9f49e-cd4b-44ac-ba0b-22764e3372b3\") " pod="openshift-marketplace/certified-operators-8g2gm" Mar 09 14:29:04 crc kubenswrapper[4764]: I0309 14:29:04.337071 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8mlds\" (UniqueName: 
\"kubernetes.io/projected/07b9f49e-cd4b-44ac-ba0b-22764e3372b3-kube-api-access-8mlds\") pod \"certified-operators-8g2gm\" (UID: \"07b9f49e-cd4b-44ac-ba0b-22764e3372b3\") " pod="openshift-marketplace/certified-operators-8g2gm" Mar 09 14:29:04 crc kubenswrapper[4764]: I0309 14:29:04.542001 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-8g2gm" Mar 09 14:29:04 crc kubenswrapper[4764]: I0309 14:29:04.702778 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-ggs99"] Mar 09 14:29:05 crc kubenswrapper[4764]: I0309 14:29:05.091232 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-8g2gm"] Mar 09 14:29:05 crc kubenswrapper[4764]: W0309 14:29:05.100049 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod07b9f49e_cd4b_44ac_ba0b_22764e3372b3.slice/crio-b93b6586819dc583657fbe5fb0e96fd38cad070aa30246cf99d71f1f6cb66d31 WatchSource:0}: Error finding container b93b6586819dc583657fbe5fb0e96fd38cad070aa30246cf99d71f1f6cb66d31: Status 404 returned error can't find the container with id b93b6586819dc583657fbe5fb0e96fd38cad070aa30246cf99d71f1f6cb66d31 Mar 09 14:29:05 crc kubenswrapper[4764]: I0309 14:29:05.455216 4764 generic.go:334] "Generic (PLEG): container finished" podID="176982f0-3e86-471c-8054-13490ee485bb" containerID="d8c80c11cf0e967a1dfd8c4b93c815a24716ae93a258544f33ea5e567ff6734c" exitCode=0 Mar 09 14:29:05 crc kubenswrapper[4764]: I0309 14:29:05.455360 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ggs99" event={"ID":"176982f0-3e86-471c-8054-13490ee485bb","Type":"ContainerDied","Data":"d8c80c11cf0e967a1dfd8c4b93c815a24716ae93a258544f33ea5e567ff6734c"} Mar 09 14:29:05 crc kubenswrapper[4764]: I0309 14:29:05.455451 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/community-operators-ggs99" event={"ID":"176982f0-3e86-471c-8054-13490ee485bb","Type":"ContainerStarted","Data":"166077c5e519763979f01840287cd8d08640b9365b4dc9e96e36e1f7da6e7ee8"} Mar 09 14:29:05 crc kubenswrapper[4764]: I0309 14:29:05.457427 4764 generic.go:334] "Generic (PLEG): container finished" podID="07b9f49e-cd4b-44ac-ba0b-22764e3372b3" containerID="1d44c6a6f1ca51015449d9711ca6e1ee57f52a1f6f76c5f60c6f3d34e343f24f" exitCode=0 Mar 09 14:29:05 crc kubenswrapper[4764]: I0309 14:29:05.457549 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8g2gm" event={"ID":"07b9f49e-cd4b-44ac-ba0b-22764e3372b3","Type":"ContainerDied","Data":"1d44c6a6f1ca51015449d9711ca6e1ee57f52a1f6f76c5f60c6f3d34e343f24f"} Mar 09 14:29:05 crc kubenswrapper[4764]: I0309 14:29:05.457664 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8g2gm" event={"ID":"07b9f49e-cd4b-44ac-ba0b-22764e3372b3","Type":"ContainerStarted","Data":"b93b6586819dc583657fbe5fb0e96fd38cad070aa30246cf99d71f1f6cb66d31"} Mar 09 14:29:05 crc kubenswrapper[4764]: I0309 14:29:05.457602 4764 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 09 14:29:06 crc kubenswrapper[4764]: I0309 14:29:06.470463 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8g2gm" event={"ID":"07b9f49e-cd4b-44ac-ba0b-22764e3372b3","Type":"ContainerStarted","Data":"300cab71666fbf2679ea5e68d0f3e8d76de51a643d44d801f0ca2059b05ed1fa"} Mar 09 14:29:08 crc kubenswrapper[4764]: I0309 14:29:08.502375 4764 generic.go:334] "Generic (PLEG): container finished" podID="07b9f49e-cd4b-44ac-ba0b-22764e3372b3" containerID="300cab71666fbf2679ea5e68d0f3e8d76de51a643d44d801f0ca2059b05ed1fa" exitCode=0 Mar 09 14:29:08 crc kubenswrapper[4764]: I0309 14:29:08.502459 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/certified-operators-8g2gm" event={"ID":"07b9f49e-cd4b-44ac-ba0b-22764e3372b3","Type":"ContainerDied","Data":"300cab71666fbf2679ea5e68d0f3e8d76de51a643d44d801f0ca2059b05ed1fa"} Mar 09 14:29:11 crc kubenswrapper[4764]: I0309 14:29:11.541804 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8g2gm" event={"ID":"07b9f49e-cd4b-44ac-ba0b-22764e3372b3","Type":"ContainerStarted","Data":"ed793f34ea4538da0c31464a86c25bec86e493f91e44da4d2e6815b6221e542b"} Mar 09 14:29:11 crc kubenswrapper[4764]: I0309 14:29:11.544616 4764 generic.go:334] "Generic (PLEG): container finished" podID="176982f0-3e86-471c-8054-13490ee485bb" containerID="4a341a4b28fa7bb07af4dc3bc024fa42c3d793bbdd626124ab77730a3005a5c8" exitCode=0 Mar 09 14:29:11 crc kubenswrapper[4764]: I0309 14:29:11.544691 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ggs99" event={"ID":"176982f0-3e86-471c-8054-13490ee485bb","Type":"ContainerDied","Data":"4a341a4b28fa7bb07af4dc3bc024fa42c3d793bbdd626124ab77730a3005a5c8"} Mar 09 14:29:11 crc kubenswrapper[4764]: I0309 14:29:11.600974 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-8g2gm" podStartSLOduration=2.53528242 podStartE2EDuration="7.600949309s" podCreationTimestamp="2026-03-09 14:29:04 +0000 UTC" firstStartedPulling="2026-03-09 14:29:05.459101042 +0000 UTC m=+4100.709272950" lastFinishedPulling="2026-03-09 14:29:10.524767931 +0000 UTC m=+4105.774939839" observedRunningTime="2026-03-09 14:29:11.575114587 +0000 UTC m=+4106.825286505" watchObservedRunningTime="2026-03-09 14:29:11.600949309 +0000 UTC m=+4106.851121217" Mar 09 14:29:12 crc kubenswrapper[4764]: I0309 14:29:12.557210 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ggs99" 
event={"ID":"176982f0-3e86-471c-8054-13490ee485bb","Type":"ContainerStarted","Data":"179e4497a5b2a885ab4606b5d2d90dd9f1fdbd0ba61ca5cc3671d4047b8df2cc"} Mar 09 14:29:12 crc kubenswrapper[4764]: I0309 14:29:12.586773 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-ggs99" podStartSLOduration=3.055209586 podStartE2EDuration="9.586741066s" podCreationTimestamp="2026-03-09 14:29:03 +0000 UTC" firstStartedPulling="2026-03-09 14:29:05.457377586 +0000 UTC m=+4100.707549494" lastFinishedPulling="2026-03-09 14:29:11.988909066 +0000 UTC m=+4107.239080974" observedRunningTime="2026-03-09 14:29:12.577629882 +0000 UTC m=+4107.827801790" watchObservedRunningTime="2026-03-09 14:29:12.586741066 +0000 UTC m=+4107.836912974" Mar 09 14:29:14 crc kubenswrapper[4764]: I0309 14:29:14.004577 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-ggs99" Mar 09 14:29:14 crc kubenswrapper[4764]: I0309 14:29:14.005042 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-ggs99" Mar 09 14:29:14 crc kubenswrapper[4764]: I0309 14:29:14.542335 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-8g2gm" Mar 09 14:29:14 crc kubenswrapper[4764]: I0309 14:29:14.542848 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-8g2gm" Mar 09 14:29:14 crc kubenswrapper[4764]: I0309 14:29:14.596082 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-8g2gm" Mar 09 14:29:15 crc kubenswrapper[4764]: I0309 14:29:15.059194 4764 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-ggs99" podUID="176982f0-3e86-471c-8054-13490ee485bb" containerName="registry-server" 
probeResult="failure" output=< Mar 09 14:29:15 crc kubenswrapper[4764]: timeout: failed to connect service ":50051" within 1s Mar 09 14:29:15 crc kubenswrapper[4764]: > Mar 09 14:29:24 crc kubenswrapper[4764]: I0309 14:29:24.054853 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-ggs99" Mar 09 14:29:24 crc kubenswrapper[4764]: I0309 14:29:24.122509 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-ggs99" Mar 09 14:29:24 crc kubenswrapper[4764]: I0309 14:29:24.595322 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-8g2gm" Mar 09 14:29:26 crc kubenswrapper[4764]: I0309 14:29:26.973429 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-ggs99"] Mar 09 14:29:27 crc kubenswrapper[4764]: I0309 14:29:27.345887 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-4sxc8"] Mar 09 14:29:27 crc kubenswrapper[4764]: I0309 14:29:27.346632 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-4sxc8" podUID="fa6ff5f6-9328-419b-a996-05bcf478b446" containerName="registry-server" containerID="cri-o://cbaaee31f94d5b978452176039ab59d2520f7f3d0d136e8695a1528c41a1e96e" gracePeriod=2 Mar 09 14:29:27 crc kubenswrapper[4764]: I0309 14:29:27.722261 4764 generic.go:334] "Generic (PLEG): container finished" podID="fa6ff5f6-9328-419b-a996-05bcf478b446" containerID="cbaaee31f94d5b978452176039ab59d2520f7f3d0d136e8695a1528c41a1e96e" exitCode=0 Mar 09 14:29:27 crc kubenswrapper[4764]: I0309 14:29:27.722855 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4sxc8" 
event={"ID":"fa6ff5f6-9328-419b-a996-05bcf478b446","Type":"ContainerDied","Data":"cbaaee31f94d5b978452176039ab59d2520f7f3d0d136e8695a1528c41a1e96e"} Mar 09 14:29:27 crc kubenswrapper[4764]: I0309 14:29:27.918083 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-4sxc8" Mar 09 14:29:27 crc kubenswrapper[4764]: I0309 14:29:27.959400 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-8g2gm"] Mar 09 14:29:27 crc kubenswrapper[4764]: I0309 14:29:27.959784 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-8g2gm" podUID="07b9f49e-cd4b-44ac-ba0b-22764e3372b3" containerName="registry-server" containerID="cri-o://ed793f34ea4538da0c31464a86c25bec86e493f91e44da4d2e6815b6221e542b" gracePeriod=2 Mar 09 14:29:28 crc kubenswrapper[4764]: I0309 14:29:28.109671 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p26kd\" (UniqueName: \"kubernetes.io/projected/fa6ff5f6-9328-419b-a996-05bcf478b446-kube-api-access-p26kd\") pod \"fa6ff5f6-9328-419b-a996-05bcf478b446\" (UID: \"fa6ff5f6-9328-419b-a996-05bcf478b446\") " Mar 09 14:29:28 crc kubenswrapper[4764]: I0309 14:29:28.110003 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fa6ff5f6-9328-419b-a996-05bcf478b446-catalog-content\") pod \"fa6ff5f6-9328-419b-a996-05bcf478b446\" (UID: \"fa6ff5f6-9328-419b-a996-05bcf478b446\") " Mar 09 14:29:28 crc kubenswrapper[4764]: I0309 14:29:28.110263 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fa6ff5f6-9328-419b-a996-05bcf478b446-utilities\") pod \"fa6ff5f6-9328-419b-a996-05bcf478b446\" (UID: \"fa6ff5f6-9328-419b-a996-05bcf478b446\") " Mar 09 14:29:28 crc 
kubenswrapper[4764]: I0309 14:29:28.112448 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fa6ff5f6-9328-419b-a996-05bcf478b446-utilities" (OuterVolumeSpecName: "utilities") pod "fa6ff5f6-9328-419b-a996-05bcf478b446" (UID: "fa6ff5f6-9328-419b-a996-05bcf478b446"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 14:29:28 crc kubenswrapper[4764]: I0309 14:29:28.123631 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fa6ff5f6-9328-419b-a996-05bcf478b446-kube-api-access-p26kd" (OuterVolumeSpecName: "kube-api-access-p26kd") pod "fa6ff5f6-9328-419b-a996-05bcf478b446" (UID: "fa6ff5f6-9328-419b-a996-05bcf478b446"). InnerVolumeSpecName "kube-api-access-p26kd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:29:28 crc kubenswrapper[4764]: I0309 14:29:28.200827 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fa6ff5f6-9328-419b-a996-05bcf478b446-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "fa6ff5f6-9328-419b-a996-05bcf478b446" (UID: "fa6ff5f6-9328-419b-a996-05bcf478b446"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 14:29:28 crc kubenswrapper[4764]: I0309 14:29:28.219012 4764 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fa6ff5f6-9328-419b-a996-05bcf478b446-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 09 14:29:28 crc kubenswrapper[4764]: I0309 14:29:28.219058 4764 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fa6ff5f6-9328-419b-a996-05bcf478b446-utilities\") on node \"crc\" DevicePath \"\"" Mar 09 14:29:28 crc kubenswrapper[4764]: I0309 14:29:28.219073 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p26kd\" (UniqueName: \"kubernetes.io/projected/fa6ff5f6-9328-419b-a996-05bcf478b446-kube-api-access-p26kd\") on node \"crc\" DevicePath \"\"" Mar 09 14:29:28 crc kubenswrapper[4764]: I0309 14:29:28.370753 4764 patch_prober.go:28] interesting pod/machine-config-daemon-xxczl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 09 14:29:28 crc kubenswrapper[4764]: I0309 14:29:28.370960 4764 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 09 14:29:28 crc kubenswrapper[4764]: I0309 14:29:28.400241 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-8g2gm" Mar 09 14:29:28 crc kubenswrapper[4764]: I0309 14:29:28.422030 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/07b9f49e-cd4b-44ac-ba0b-22764e3372b3-catalog-content\") pod \"07b9f49e-cd4b-44ac-ba0b-22764e3372b3\" (UID: \"07b9f49e-cd4b-44ac-ba0b-22764e3372b3\") " Mar 09 14:29:28 crc kubenswrapper[4764]: I0309 14:29:28.422097 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/07b9f49e-cd4b-44ac-ba0b-22764e3372b3-utilities\") pod \"07b9f49e-cd4b-44ac-ba0b-22764e3372b3\" (UID: \"07b9f49e-cd4b-44ac-ba0b-22764e3372b3\") " Mar 09 14:29:28 crc kubenswrapper[4764]: I0309 14:29:28.422274 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8mlds\" (UniqueName: \"kubernetes.io/projected/07b9f49e-cd4b-44ac-ba0b-22764e3372b3-kube-api-access-8mlds\") pod \"07b9f49e-cd4b-44ac-ba0b-22764e3372b3\" (UID: \"07b9f49e-cd4b-44ac-ba0b-22764e3372b3\") " Mar 09 14:29:28 crc kubenswrapper[4764]: I0309 14:29:28.426109 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/07b9f49e-cd4b-44ac-ba0b-22764e3372b3-utilities" (OuterVolumeSpecName: "utilities") pod "07b9f49e-cd4b-44ac-ba0b-22764e3372b3" (UID: "07b9f49e-cd4b-44ac-ba0b-22764e3372b3"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 14:29:28 crc kubenswrapper[4764]: I0309 14:29:28.429609 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/07b9f49e-cd4b-44ac-ba0b-22764e3372b3-kube-api-access-8mlds" (OuterVolumeSpecName: "kube-api-access-8mlds") pod "07b9f49e-cd4b-44ac-ba0b-22764e3372b3" (UID: "07b9f49e-cd4b-44ac-ba0b-22764e3372b3"). InnerVolumeSpecName "kube-api-access-8mlds". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:29:28 crc kubenswrapper[4764]: I0309 14:29:28.511321 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/07b9f49e-cd4b-44ac-ba0b-22764e3372b3-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "07b9f49e-cd4b-44ac-ba0b-22764e3372b3" (UID: "07b9f49e-cd4b-44ac-ba0b-22764e3372b3"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 14:29:28 crc kubenswrapper[4764]: I0309 14:29:28.525897 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8mlds\" (UniqueName: \"kubernetes.io/projected/07b9f49e-cd4b-44ac-ba0b-22764e3372b3-kube-api-access-8mlds\") on node \"crc\" DevicePath \"\"" Mar 09 14:29:28 crc kubenswrapper[4764]: I0309 14:29:28.525943 4764 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/07b9f49e-cd4b-44ac-ba0b-22764e3372b3-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 09 14:29:28 crc kubenswrapper[4764]: I0309 14:29:28.525955 4764 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/07b9f49e-cd4b-44ac-ba0b-22764e3372b3-utilities\") on node \"crc\" DevicePath \"\"" Mar 09 14:29:28 crc kubenswrapper[4764]: I0309 14:29:28.735212 4764 generic.go:334] "Generic (PLEG): container finished" podID="07b9f49e-cd4b-44ac-ba0b-22764e3372b3" containerID="ed793f34ea4538da0c31464a86c25bec86e493f91e44da4d2e6815b6221e542b" exitCode=0 Mar 09 14:29:28 crc kubenswrapper[4764]: I0309 14:29:28.735281 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8g2gm" event={"ID":"07b9f49e-cd4b-44ac-ba0b-22764e3372b3","Type":"ContainerDied","Data":"ed793f34ea4538da0c31464a86c25bec86e493f91e44da4d2e6815b6221e542b"} Mar 09 14:29:28 crc kubenswrapper[4764]: I0309 14:29:28.735317 4764 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/certified-operators-8g2gm" event={"ID":"07b9f49e-cd4b-44ac-ba0b-22764e3372b3","Type":"ContainerDied","Data":"b93b6586819dc583657fbe5fb0e96fd38cad070aa30246cf99d71f1f6cb66d31"} Mar 09 14:29:28 crc kubenswrapper[4764]: I0309 14:29:28.735338 4764 scope.go:117] "RemoveContainer" containerID="ed793f34ea4538da0c31464a86c25bec86e493f91e44da4d2e6815b6221e542b" Mar 09 14:29:28 crc kubenswrapper[4764]: I0309 14:29:28.735477 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-8g2gm" Mar 09 14:29:28 crc kubenswrapper[4764]: I0309 14:29:28.745086 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4sxc8" event={"ID":"fa6ff5f6-9328-419b-a996-05bcf478b446","Type":"ContainerDied","Data":"fa2faca2b6f361aab53dbf1e7c047f8fa427577595f01228a4ee0b2b2c128cfa"} Mar 09 14:29:28 crc kubenswrapper[4764]: I0309 14:29:28.745171 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-4sxc8" Mar 09 14:29:28 crc kubenswrapper[4764]: I0309 14:29:28.789936 4764 scope.go:117] "RemoveContainer" containerID="300cab71666fbf2679ea5e68d0f3e8d76de51a643d44d801f0ca2059b05ed1fa" Mar 09 14:29:28 crc kubenswrapper[4764]: I0309 14:29:28.807821 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-8g2gm"] Mar 09 14:29:28 crc kubenswrapper[4764]: I0309 14:29:28.819040 4764 scope.go:117] "RemoveContainer" containerID="1d44c6a6f1ca51015449d9711ca6e1ee57f52a1f6f76c5f60c6f3d34e343f24f" Mar 09 14:29:28 crc kubenswrapper[4764]: I0309 14:29:28.827580 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-8g2gm"] Mar 09 14:29:28 crc kubenswrapper[4764]: I0309 14:29:28.840712 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-4sxc8"] Mar 09 14:29:28 crc kubenswrapper[4764]: I0309 14:29:28.863553 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-4sxc8"] Mar 09 14:29:28 crc kubenswrapper[4764]: I0309 14:29:28.873530 4764 scope.go:117] "RemoveContainer" containerID="ed793f34ea4538da0c31464a86c25bec86e493f91e44da4d2e6815b6221e542b" Mar 09 14:29:28 crc kubenswrapper[4764]: E0309 14:29:28.874151 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ed793f34ea4538da0c31464a86c25bec86e493f91e44da4d2e6815b6221e542b\": container with ID starting with ed793f34ea4538da0c31464a86c25bec86e493f91e44da4d2e6815b6221e542b not found: ID does not exist" containerID="ed793f34ea4538da0c31464a86c25bec86e493f91e44da4d2e6815b6221e542b" Mar 09 14:29:28 crc kubenswrapper[4764]: I0309 14:29:28.874190 4764 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"ed793f34ea4538da0c31464a86c25bec86e493f91e44da4d2e6815b6221e542b"} err="failed to get container status \"ed793f34ea4538da0c31464a86c25bec86e493f91e44da4d2e6815b6221e542b\": rpc error: code = NotFound desc = could not find container \"ed793f34ea4538da0c31464a86c25bec86e493f91e44da4d2e6815b6221e542b\": container with ID starting with ed793f34ea4538da0c31464a86c25bec86e493f91e44da4d2e6815b6221e542b not found: ID does not exist" Mar 09 14:29:28 crc kubenswrapper[4764]: I0309 14:29:28.874220 4764 scope.go:117] "RemoveContainer" containerID="300cab71666fbf2679ea5e68d0f3e8d76de51a643d44d801f0ca2059b05ed1fa" Mar 09 14:29:28 crc kubenswrapper[4764]: E0309 14:29:28.874463 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"300cab71666fbf2679ea5e68d0f3e8d76de51a643d44d801f0ca2059b05ed1fa\": container with ID starting with 300cab71666fbf2679ea5e68d0f3e8d76de51a643d44d801f0ca2059b05ed1fa not found: ID does not exist" containerID="300cab71666fbf2679ea5e68d0f3e8d76de51a643d44d801f0ca2059b05ed1fa" Mar 09 14:29:28 crc kubenswrapper[4764]: I0309 14:29:28.874488 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"300cab71666fbf2679ea5e68d0f3e8d76de51a643d44d801f0ca2059b05ed1fa"} err="failed to get container status \"300cab71666fbf2679ea5e68d0f3e8d76de51a643d44d801f0ca2059b05ed1fa\": rpc error: code = NotFound desc = could not find container \"300cab71666fbf2679ea5e68d0f3e8d76de51a643d44d801f0ca2059b05ed1fa\": container with ID starting with 300cab71666fbf2679ea5e68d0f3e8d76de51a643d44d801f0ca2059b05ed1fa not found: ID does not exist" Mar 09 14:29:28 crc kubenswrapper[4764]: I0309 14:29:28.874503 4764 scope.go:117] "RemoveContainer" containerID="1d44c6a6f1ca51015449d9711ca6e1ee57f52a1f6f76c5f60c6f3d34e343f24f" Mar 09 14:29:28 crc kubenswrapper[4764]: E0309 14:29:28.874797 4764 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"1d44c6a6f1ca51015449d9711ca6e1ee57f52a1f6f76c5f60c6f3d34e343f24f\": container with ID starting with 1d44c6a6f1ca51015449d9711ca6e1ee57f52a1f6f76c5f60c6f3d34e343f24f not found: ID does not exist" containerID="1d44c6a6f1ca51015449d9711ca6e1ee57f52a1f6f76c5f60c6f3d34e343f24f" Mar 09 14:29:28 crc kubenswrapper[4764]: I0309 14:29:28.874822 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1d44c6a6f1ca51015449d9711ca6e1ee57f52a1f6f76c5f60c6f3d34e343f24f"} err="failed to get container status \"1d44c6a6f1ca51015449d9711ca6e1ee57f52a1f6f76c5f60c6f3d34e343f24f\": rpc error: code = NotFound desc = could not find container \"1d44c6a6f1ca51015449d9711ca6e1ee57f52a1f6f76c5f60c6f3d34e343f24f\": container with ID starting with 1d44c6a6f1ca51015449d9711ca6e1ee57f52a1f6f76c5f60c6f3d34e343f24f not found: ID does not exist" Mar 09 14:29:28 crc kubenswrapper[4764]: I0309 14:29:28.874838 4764 scope.go:117] "RemoveContainer" containerID="cbaaee31f94d5b978452176039ab59d2520f7f3d0d136e8695a1528c41a1e96e" Mar 09 14:29:28 crc kubenswrapper[4764]: I0309 14:29:28.918565 4764 scope.go:117] "RemoveContainer" containerID="81f2426c617949ba2e886b6a6713e69e1e2306d5a032ede7615f1cde56b297b4" Mar 09 14:29:28 crc kubenswrapper[4764]: I0309 14:29:28.941299 4764 scope.go:117] "RemoveContainer" containerID="dab29cc25a32a534e61bf039add40e1f06d5ae10ca8ef07579cb0bf91b124a9d" Mar 09 14:29:29 crc kubenswrapper[4764]: I0309 14:29:29.572012 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="07b9f49e-cd4b-44ac-ba0b-22764e3372b3" path="/var/lib/kubelet/pods/07b9f49e-cd4b-44ac-ba0b-22764e3372b3/volumes" Mar 09 14:29:29 crc kubenswrapper[4764]: I0309 14:29:29.573502 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fa6ff5f6-9328-419b-a996-05bcf478b446" path="/var/lib/kubelet/pods/fa6ff5f6-9328-419b-a996-05bcf478b446/volumes" Mar 09 14:29:36 crc 
kubenswrapper[4764]: I0309 14:29:36.329389 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-2n85d/must-gather-hmzh8"] Mar 09 14:29:36 crc kubenswrapper[4764]: E0309 14:29:36.331765 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="07b9f49e-cd4b-44ac-ba0b-22764e3372b3" containerName="extract-content" Mar 09 14:29:36 crc kubenswrapper[4764]: I0309 14:29:36.331887 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="07b9f49e-cd4b-44ac-ba0b-22764e3372b3" containerName="extract-content" Mar 09 14:29:36 crc kubenswrapper[4764]: E0309 14:29:36.331954 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa6ff5f6-9328-419b-a996-05bcf478b446" containerName="extract-content" Mar 09 14:29:36 crc kubenswrapper[4764]: I0309 14:29:36.332016 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa6ff5f6-9328-419b-a996-05bcf478b446" containerName="extract-content" Mar 09 14:29:36 crc kubenswrapper[4764]: E0309 14:29:36.332082 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa6ff5f6-9328-419b-a996-05bcf478b446" containerName="registry-server" Mar 09 14:29:36 crc kubenswrapper[4764]: I0309 14:29:36.332141 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa6ff5f6-9328-419b-a996-05bcf478b446" containerName="registry-server" Mar 09 14:29:36 crc kubenswrapper[4764]: E0309 14:29:36.332234 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa6ff5f6-9328-419b-a996-05bcf478b446" containerName="extract-utilities" Mar 09 14:29:36 crc kubenswrapper[4764]: I0309 14:29:36.332313 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa6ff5f6-9328-419b-a996-05bcf478b446" containerName="extract-utilities" Mar 09 14:29:36 crc kubenswrapper[4764]: E0309 14:29:36.332387 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="07b9f49e-cd4b-44ac-ba0b-22764e3372b3" containerName="extract-utilities" Mar 09 14:29:36 crc kubenswrapper[4764]: I0309 
14:29:36.332441 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="07b9f49e-cd4b-44ac-ba0b-22764e3372b3" containerName="extract-utilities" Mar 09 14:29:36 crc kubenswrapper[4764]: E0309 14:29:36.332498 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="07b9f49e-cd4b-44ac-ba0b-22764e3372b3" containerName="registry-server" Mar 09 14:29:36 crc kubenswrapper[4764]: I0309 14:29:36.332549 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="07b9f49e-cd4b-44ac-ba0b-22764e3372b3" containerName="registry-server" Mar 09 14:29:36 crc kubenswrapper[4764]: I0309 14:29:36.332841 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="fa6ff5f6-9328-419b-a996-05bcf478b446" containerName="registry-server" Mar 09 14:29:36 crc kubenswrapper[4764]: I0309 14:29:36.332952 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="07b9f49e-cd4b-44ac-ba0b-22764e3372b3" containerName="registry-server" Mar 09 14:29:36 crc kubenswrapper[4764]: I0309 14:29:36.334380 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-2n85d/must-gather-hmzh8" Mar 09 14:29:36 crc kubenswrapper[4764]: I0309 14:29:36.338446 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-2n85d"/"openshift-service-ca.crt" Mar 09 14:29:36 crc kubenswrapper[4764]: I0309 14:29:36.338842 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-2n85d"/"kube-root-ca.crt" Mar 09 14:29:36 crc kubenswrapper[4764]: I0309 14:29:36.348358 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-2n85d"/"default-dockercfg-fjlv5" Mar 09 14:29:36 crc kubenswrapper[4764]: I0309 14:29:36.360552 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-2n85d/must-gather-hmzh8"] Mar 09 14:29:36 crc kubenswrapper[4764]: I0309 14:29:36.445636 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/f4069cd4-c4ea-4c35-a8e3-231f40655d27-must-gather-output\") pod \"must-gather-hmzh8\" (UID: \"f4069cd4-c4ea-4c35-a8e3-231f40655d27\") " pod="openshift-must-gather-2n85d/must-gather-hmzh8" Mar 09 14:29:36 crc kubenswrapper[4764]: I0309 14:29:36.445802 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8bslj\" (UniqueName: \"kubernetes.io/projected/f4069cd4-c4ea-4c35-a8e3-231f40655d27-kube-api-access-8bslj\") pod \"must-gather-hmzh8\" (UID: \"f4069cd4-c4ea-4c35-a8e3-231f40655d27\") " pod="openshift-must-gather-2n85d/must-gather-hmzh8" Mar 09 14:29:36 crc kubenswrapper[4764]: I0309 14:29:36.548614 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/f4069cd4-c4ea-4c35-a8e3-231f40655d27-must-gather-output\") pod \"must-gather-hmzh8\" (UID: \"f4069cd4-c4ea-4c35-a8e3-231f40655d27\") " 
pod="openshift-must-gather-2n85d/must-gather-hmzh8" Mar 09 14:29:36 crc kubenswrapper[4764]: I0309 14:29:36.548783 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8bslj\" (UniqueName: \"kubernetes.io/projected/f4069cd4-c4ea-4c35-a8e3-231f40655d27-kube-api-access-8bslj\") pod \"must-gather-hmzh8\" (UID: \"f4069cd4-c4ea-4c35-a8e3-231f40655d27\") " pod="openshift-must-gather-2n85d/must-gather-hmzh8" Mar 09 14:29:36 crc kubenswrapper[4764]: I0309 14:29:36.549031 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/f4069cd4-c4ea-4c35-a8e3-231f40655d27-must-gather-output\") pod \"must-gather-hmzh8\" (UID: \"f4069cd4-c4ea-4c35-a8e3-231f40655d27\") " pod="openshift-must-gather-2n85d/must-gather-hmzh8" Mar 09 14:29:36 crc kubenswrapper[4764]: I0309 14:29:36.598277 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8bslj\" (UniqueName: \"kubernetes.io/projected/f4069cd4-c4ea-4c35-a8e3-231f40655d27-kube-api-access-8bslj\") pod \"must-gather-hmzh8\" (UID: \"f4069cd4-c4ea-4c35-a8e3-231f40655d27\") " pod="openshift-must-gather-2n85d/must-gather-hmzh8" Mar 09 14:29:36 crc kubenswrapper[4764]: I0309 14:29:36.655157 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-2n85d/must-gather-hmzh8" Mar 09 14:29:37 crc kubenswrapper[4764]: I0309 14:29:37.256721 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-2n85d/must-gather-hmzh8"] Mar 09 14:29:37 crc kubenswrapper[4764]: I0309 14:29:37.860965 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-2n85d/must-gather-hmzh8" event={"ID":"f4069cd4-c4ea-4c35-a8e3-231f40655d27","Type":"ContainerStarted","Data":"5c35375c12dfdebcd5a99ee269599da2c2da6b1470aa36489d7414f993910904"} Mar 09 14:29:44 crc kubenswrapper[4764]: I0309 14:29:44.961923 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-2n85d/must-gather-hmzh8" event={"ID":"f4069cd4-c4ea-4c35-a8e3-231f40655d27","Type":"ContainerStarted","Data":"7a1d05f1a79149bd8bced16f5f4fb67221a689028074b5bf7fc3e37c2411e2d9"} Mar 09 14:29:44 crc kubenswrapper[4764]: I0309 14:29:44.962614 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-2n85d/must-gather-hmzh8" event={"ID":"f4069cd4-c4ea-4c35-a8e3-231f40655d27","Type":"ContainerStarted","Data":"3751c9a7f60cedf52e5109a5d05ced7856abcae8f6d88733462034a07970747a"} Mar 09 14:29:44 crc kubenswrapper[4764]: I0309 14:29:44.990013 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-2n85d/must-gather-hmzh8" podStartSLOduration=2.271023773 podStartE2EDuration="8.989983499s" podCreationTimestamp="2026-03-09 14:29:36 +0000 UTC" firstStartedPulling="2026-03-09 14:29:37.263912014 +0000 UTC m=+4132.514083922" lastFinishedPulling="2026-03-09 14:29:43.98287174 +0000 UTC m=+4139.233043648" observedRunningTime="2026-03-09 14:29:44.982809017 +0000 UTC m=+4140.232980935" watchObservedRunningTime="2026-03-09 14:29:44.989983499 +0000 UTC m=+4140.240155417" Mar 09 14:29:49 crc kubenswrapper[4764]: I0309 14:29:49.155456 4764 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-must-gather-2n85d/crc-debug-9qlpp"] Mar 09 14:29:49 crc kubenswrapper[4764]: I0309 14:29:49.157760 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-2n85d/crc-debug-9qlpp" Mar 09 14:29:49 crc kubenswrapper[4764]: I0309 14:29:49.183115 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/25af9291-c5a5-4dff-9eb7-960615c614c1-host\") pod \"crc-debug-9qlpp\" (UID: \"25af9291-c5a5-4dff-9eb7-960615c614c1\") " pod="openshift-must-gather-2n85d/crc-debug-9qlpp" Mar 09 14:29:49 crc kubenswrapper[4764]: I0309 14:29:49.183405 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kp82d\" (UniqueName: \"kubernetes.io/projected/25af9291-c5a5-4dff-9eb7-960615c614c1-kube-api-access-kp82d\") pod \"crc-debug-9qlpp\" (UID: \"25af9291-c5a5-4dff-9eb7-960615c614c1\") " pod="openshift-must-gather-2n85d/crc-debug-9qlpp" Mar 09 14:29:49 crc kubenswrapper[4764]: I0309 14:29:49.285903 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/25af9291-c5a5-4dff-9eb7-960615c614c1-host\") pod \"crc-debug-9qlpp\" (UID: \"25af9291-c5a5-4dff-9eb7-960615c614c1\") " pod="openshift-must-gather-2n85d/crc-debug-9qlpp" Mar 09 14:29:49 crc kubenswrapper[4764]: I0309 14:29:49.286110 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/25af9291-c5a5-4dff-9eb7-960615c614c1-host\") pod \"crc-debug-9qlpp\" (UID: \"25af9291-c5a5-4dff-9eb7-960615c614c1\") " pod="openshift-must-gather-2n85d/crc-debug-9qlpp" Mar 09 14:29:49 crc kubenswrapper[4764]: I0309 14:29:49.286709 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kp82d\" (UniqueName: 
\"kubernetes.io/projected/25af9291-c5a5-4dff-9eb7-960615c614c1-kube-api-access-kp82d\") pod \"crc-debug-9qlpp\" (UID: \"25af9291-c5a5-4dff-9eb7-960615c614c1\") " pod="openshift-must-gather-2n85d/crc-debug-9qlpp" Mar 09 14:29:49 crc kubenswrapper[4764]: I0309 14:29:49.312348 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kp82d\" (UniqueName: \"kubernetes.io/projected/25af9291-c5a5-4dff-9eb7-960615c614c1-kube-api-access-kp82d\") pod \"crc-debug-9qlpp\" (UID: \"25af9291-c5a5-4dff-9eb7-960615c614c1\") " pod="openshift-must-gather-2n85d/crc-debug-9qlpp" Mar 09 14:29:49 crc kubenswrapper[4764]: I0309 14:29:49.492192 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-2n85d/crc-debug-9qlpp" Mar 09 14:29:49 crc kubenswrapper[4764]: W0309 14:29:49.542472 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod25af9291_c5a5_4dff_9eb7_960615c614c1.slice/crio-881e8e6e7900e3f6e6c5bcf4497258776402b46797e6c19f23effb9b0d8b76ef WatchSource:0}: Error finding container 881e8e6e7900e3f6e6c5bcf4497258776402b46797e6c19f23effb9b0d8b76ef: Status 404 returned error can't find the container with id 881e8e6e7900e3f6e6c5bcf4497258776402b46797e6c19f23effb9b0d8b76ef Mar 09 14:29:50 crc kubenswrapper[4764]: I0309 14:29:50.011196 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-2n85d/crc-debug-9qlpp" event={"ID":"25af9291-c5a5-4dff-9eb7-960615c614c1","Type":"ContainerStarted","Data":"881e8e6e7900e3f6e6c5bcf4497258776402b46797e6c19f23effb9b0d8b76ef"} Mar 09 14:29:58 crc kubenswrapper[4764]: I0309 14:29:58.370934 4764 patch_prober.go:28] interesting pod/machine-config-daemon-xxczl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" 
start-of-body= Mar 09 14:29:58 crc kubenswrapper[4764]: I0309 14:29:58.371948 4764 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 09 14:30:00 crc kubenswrapper[4764]: I0309 14:30:00.160211 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29551110-sjw8t"] Mar 09 14:30:00 crc kubenswrapper[4764]: I0309 14:30:00.166545 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551110-sjw8t" Mar 09 14:30:00 crc kubenswrapper[4764]: I0309 14:30:00.170452 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 09 14:30:00 crc kubenswrapper[4764]: I0309 14:30:00.170517 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 09 14:30:00 crc kubenswrapper[4764]: I0309 14:30:00.172347 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-4s7mr" Mar 09 14:30:00 crc kubenswrapper[4764]: I0309 14:30:00.179879 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551110-sjw8t"] Mar 09 14:30:00 crc kubenswrapper[4764]: I0309 14:30:00.181958 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wqvzx\" (UniqueName: \"kubernetes.io/projected/44063b01-0b96-488c-98af-43cdb752467e-kube-api-access-wqvzx\") pod \"auto-csr-approver-29551110-sjw8t\" (UID: \"44063b01-0b96-488c-98af-43cdb752467e\") " pod="openshift-infra/auto-csr-approver-29551110-sjw8t" Mar 09 14:30:00 crc kubenswrapper[4764]: I0309 14:30:00.270688 4764 kubelet.go:2421] 
"SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29551110-jq7ff"] Mar 09 14:30:00 crc kubenswrapper[4764]: I0309 14:30:00.272518 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29551110-jq7ff" Mar 09 14:30:00 crc kubenswrapper[4764]: I0309 14:30:00.275588 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 09 14:30:00 crc kubenswrapper[4764]: I0309 14:30:00.275815 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 09 14:30:00 crc kubenswrapper[4764]: I0309 14:30:00.283837 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v6ndb\" (UniqueName: \"kubernetes.io/projected/8ef8fd3b-c34b-4bb4-9726-f9f7f0e659c1-kube-api-access-v6ndb\") pod \"collect-profiles-29551110-jq7ff\" (UID: \"8ef8fd3b-c34b-4bb4-9726-f9f7f0e659c1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551110-jq7ff" Mar 09 14:30:00 crc kubenswrapper[4764]: I0309 14:30:00.283940 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wqvzx\" (UniqueName: \"kubernetes.io/projected/44063b01-0b96-488c-98af-43cdb752467e-kube-api-access-wqvzx\") pod \"auto-csr-approver-29551110-sjw8t\" (UID: \"44063b01-0b96-488c-98af-43cdb752467e\") " pod="openshift-infra/auto-csr-approver-29551110-sjw8t" Mar 09 14:30:00 crc kubenswrapper[4764]: I0309 14:30:00.283999 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8ef8fd3b-c34b-4bb4-9726-f9f7f0e659c1-secret-volume\") pod \"collect-profiles-29551110-jq7ff\" (UID: \"8ef8fd3b-c34b-4bb4-9726-f9f7f0e659c1\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29551110-jq7ff" Mar 09 14:30:00 crc kubenswrapper[4764]: I0309 14:30:00.284289 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8ef8fd3b-c34b-4bb4-9726-f9f7f0e659c1-config-volume\") pod \"collect-profiles-29551110-jq7ff\" (UID: \"8ef8fd3b-c34b-4bb4-9726-f9f7f0e659c1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551110-jq7ff" Mar 09 14:30:00 crc kubenswrapper[4764]: I0309 14:30:00.286246 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29551110-jq7ff"] Mar 09 14:30:00 crc kubenswrapper[4764]: I0309 14:30:00.315606 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wqvzx\" (UniqueName: \"kubernetes.io/projected/44063b01-0b96-488c-98af-43cdb752467e-kube-api-access-wqvzx\") pod \"auto-csr-approver-29551110-sjw8t\" (UID: \"44063b01-0b96-488c-98af-43cdb752467e\") " pod="openshift-infra/auto-csr-approver-29551110-sjw8t" Mar 09 14:30:00 crc kubenswrapper[4764]: I0309 14:30:00.386809 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v6ndb\" (UniqueName: \"kubernetes.io/projected/8ef8fd3b-c34b-4bb4-9726-f9f7f0e659c1-kube-api-access-v6ndb\") pod \"collect-profiles-29551110-jq7ff\" (UID: \"8ef8fd3b-c34b-4bb4-9726-f9f7f0e659c1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551110-jq7ff" Mar 09 14:30:00 crc kubenswrapper[4764]: I0309 14:30:00.386963 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8ef8fd3b-c34b-4bb4-9726-f9f7f0e659c1-secret-volume\") pod \"collect-profiles-29551110-jq7ff\" (UID: \"8ef8fd3b-c34b-4bb4-9726-f9f7f0e659c1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551110-jq7ff" Mar 09 14:30:00 crc 
kubenswrapper[4764]: I0309 14:30:00.387054 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8ef8fd3b-c34b-4bb4-9726-f9f7f0e659c1-config-volume\") pod \"collect-profiles-29551110-jq7ff\" (UID: \"8ef8fd3b-c34b-4bb4-9726-f9f7f0e659c1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551110-jq7ff" Mar 09 14:30:00 crc kubenswrapper[4764]: I0309 14:30:00.388315 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8ef8fd3b-c34b-4bb4-9726-f9f7f0e659c1-config-volume\") pod \"collect-profiles-29551110-jq7ff\" (UID: \"8ef8fd3b-c34b-4bb4-9726-f9f7f0e659c1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551110-jq7ff" Mar 09 14:30:00 crc kubenswrapper[4764]: I0309 14:30:00.393136 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8ef8fd3b-c34b-4bb4-9726-f9f7f0e659c1-secret-volume\") pod \"collect-profiles-29551110-jq7ff\" (UID: \"8ef8fd3b-c34b-4bb4-9726-f9f7f0e659c1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551110-jq7ff" Mar 09 14:30:00 crc kubenswrapper[4764]: I0309 14:30:00.407252 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v6ndb\" (UniqueName: \"kubernetes.io/projected/8ef8fd3b-c34b-4bb4-9726-f9f7f0e659c1-kube-api-access-v6ndb\") pod \"collect-profiles-29551110-jq7ff\" (UID: \"8ef8fd3b-c34b-4bb4-9726-f9f7f0e659c1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551110-jq7ff" Mar 09 14:30:00 crc kubenswrapper[4764]: I0309 14:30:00.688255 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29551110-jq7ff" Mar 09 14:30:00 crc kubenswrapper[4764]: I0309 14:30:00.689791 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551110-sjw8t" Mar 09 14:30:04 crc kubenswrapper[4764]: I0309 14:30:04.436057 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29551110-jq7ff"] Mar 09 14:30:04 crc kubenswrapper[4764]: W0309 14:30:04.451036 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod44063b01_0b96_488c_98af_43cdb752467e.slice/crio-9f463b037f6748b8413203836384045220979ce2e648b64c69d1e25e30dfec60 WatchSource:0}: Error finding container 9f463b037f6748b8413203836384045220979ce2e648b64c69d1e25e30dfec60: Status 404 returned error can't find the container with id 9f463b037f6748b8413203836384045220979ce2e648b64c69d1e25e30dfec60 Mar 09 14:30:04 crc kubenswrapper[4764]: I0309 14:30:04.451372 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551110-sjw8t"] Mar 09 14:30:05 crc kubenswrapper[4764]: I0309 14:30:05.180960 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-2n85d/crc-debug-9qlpp" event={"ID":"25af9291-c5a5-4dff-9eb7-960615c614c1","Type":"ContainerStarted","Data":"48d1ac59fa8fdaa23f3a6ee1c8dc95eb7e7c19f15c77263002430602ffce1989"} Mar 09 14:30:05 crc kubenswrapper[4764]: I0309 14:30:05.184842 4764 generic.go:334] "Generic (PLEG): container finished" podID="8ef8fd3b-c34b-4bb4-9726-f9f7f0e659c1" containerID="7c8b928d8a9177ef5b9323c3811d5e0b452dba966aeb1e937b8cab94ebc62fd3" exitCode=0 Mar 09 14:30:05 crc kubenswrapper[4764]: I0309 14:30:05.184941 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29551110-jq7ff" event={"ID":"8ef8fd3b-c34b-4bb4-9726-f9f7f0e659c1","Type":"ContainerDied","Data":"7c8b928d8a9177ef5b9323c3811d5e0b452dba966aeb1e937b8cab94ebc62fd3"} Mar 09 14:30:05 crc kubenswrapper[4764]: I0309 14:30:05.184978 4764 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29551110-jq7ff" event={"ID":"8ef8fd3b-c34b-4bb4-9726-f9f7f0e659c1","Type":"ContainerStarted","Data":"51fea13555bc7275650d667289bdcb28261d28171823fe98cc0700e0815f1df9"} Mar 09 14:30:05 crc kubenswrapper[4764]: I0309 14:30:05.192347 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551110-sjw8t" event={"ID":"44063b01-0b96-488c-98af-43cdb752467e","Type":"ContainerStarted","Data":"9f463b037f6748b8413203836384045220979ce2e648b64c69d1e25e30dfec60"} Mar 09 14:30:05 crc kubenswrapper[4764]: I0309 14:30:05.206452 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-2n85d/crc-debug-9qlpp" podStartSLOduration=1.794139531 podStartE2EDuration="16.20642686s" podCreationTimestamp="2026-03-09 14:29:49 +0000 UTC" firstStartedPulling="2026-03-09 14:29:49.545538496 +0000 UTC m=+4144.795710404" lastFinishedPulling="2026-03-09 14:30:03.957825825 +0000 UTC m=+4159.207997733" observedRunningTime="2026-03-09 14:30:05.198509139 +0000 UTC m=+4160.448681047" watchObservedRunningTime="2026-03-09 14:30:05.20642686 +0000 UTC m=+4160.456598768" Mar 09 14:30:06 crc kubenswrapper[4764]: I0309 14:30:06.583781 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29551110-jq7ff" Mar 09 14:30:06 crc kubenswrapper[4764]: I0309 14:30:06.647830 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8ef8fd3b-c34b-4bb4-9726-f9f7f0e659c1-config-volume\") pod \"8ef8fd3b-c34b-4bb4-9726-f9f7f0e659c1\" (UID: \"8ef8fd3b-c34b-4bb4-9726-f9f7f0e659c1\") " Mar 09 14:30:06 crc kubenswrapper[4764]: I0309 14:30:06.647971 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v6ndb\" (UniqueName: \"kubernetes.io/projected/8ef8fd3b-c34b-4bb4-9726-f9f7f0e659c1-kube-api-access-v6ndb\") pod \"8ef8fd3b-c34b-4bb4-9726-f9f7f0e659c1\" (UID: \"8ef8fd3b-c34b-4bb4-9726-f9f7f0e659c1\") " Mar 09 14:30:06 crc kubenswrapper[4764]: I0309 14:30:06.648199 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8ef8fd3b-c34b-4bb4-9726-f9f7f0e659c1-secret-volume\") pod \"8ef8fd3b-c34b-4bb4-9726-f9f7f0e659c1\" (UID: \"8ef8fd3b-c34b-4bb4-9726-f9f7f0e659c1\") " Mar 09 14:30:06 crc kubenswrapper[4764]: I0309 14:30:06.651769 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8ef8fd3b-c34b-4bb4-9726-f9f7f0e659c1-config-volume" (OuterVolumeSpecName: "config-volume") pod "8ef8fd3b-c34b-4bb4-9726-f9f7f0e659c1" (UID: "8ef8fd3b-c34b-4bb4-9726-f9f7f0e659c1"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 14:30:06 crc kubenswrapper[4764]: I0309 14:30:06.662961 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8ef8fd3b-c34b-4bb4-9726-f9f7f0e659c1-kube-api-access-v6ndb" (OuterVolumeSpecName: "kube-api-access-v6ndb") pod "8ef8fd3b-c34b-4bb4-9726-f9f7f0e659c1" (UID: "8ef8fd3b-c34b-4bb4-9726-f9f7f0e659c1"). 
InnerVolumeSpecName "kube-api-access-v6ndb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:30:06 crc kubenswrapper[4764]: I0309 14:30:06.663216 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8ef8fd3b-c34b-4bb4-9726-f9f7f0e659c1-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "8ef8fd3b-c34b-4bb4-9726-f9f7f0e659c1" (UID: "8ef8fd3b-c34b-4bb4-9726-f9f7f0e659c1"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:30:06 crc kubenswrapper[4764]: I0309 14:30:06.751672 4764 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8ef8fd3b-c34b-4bb4-9726-f9f7f0e659c1-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 09 14:30:06 crc kubenswrapper[4764]: I0309 14:30:06.751726 4764 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8ef8fd3b-c34b-4bb4-9726-f9f7f0e659c1-config-volume\") on node \"crc\" DevicePath \"\"" Mar 09 14:30:06 crc kubenswrapper[4764]: I0309 14:30:06.751740 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v6ndb\" (UniqueName: \"kubernetes.io/projected/8ef8fd3b-c34b-4bb4-9726-f9f7f0e659c1-kube-api-access-v6ndb\") on node \"crc\" DevicePath \"\"" Mar 09 14:30:07 crc kubenswrapper[4764]: I0309 14:30:07.213268 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29551110-jq7ff" event={"ID":"8ef8fd3b-c34b-4bb4-9726-f9f7f0e659c1","Type":"ContainerDied","Data":"51fea13555bc7275650d667289bdcb28261d28171823fe98cc0700e0815f1df9"} Mar 09 14:30:07 crc kubenswrapper[4764]: I0309 14:30:07.214130 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="51fea13555bc7275650d667289bdcb28261d28171823fe98cc0700e0815f1df9" Mar 09 14:30:07 crc kubenswrapper[4764]: I0309 14:30:07.213515 4764 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29551110-jq7ff" Mar 09 14:30:07 crc kubenswrapper[4764]: I0309 14:30:07.682902 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29551065-gdnvt"] Mar 09 14:30:07 crc kubenswrapper[4764]: I0309 14:30:07.697374 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29551065-gdnvt"] Mar 09 14:30:08 crc kubenswrapper[4764]: I0309 14:30:08.226152 4764 generic.go:334] "Generic (PLEG): container finished" podID="44063b01-0b96-488c-98af-43cdb752467e" containerID="b7f1a91e8b51914a12830dcf15f240318b51a9141210e70d3ec57af5cc977455" exitCode=0 Mar 09 14:30:08 crc kubenswrapper[4764]: I0309 14:30:08.226254 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551110-sjw8t" event={"ID":"44063b01-0b96-488c-98af-43cdb752467e","Type":"ContainerDied","Data":"b7f1a91e8b51914a12830dcf15f240318b51a9141210e70d3ec57af5cc977455"} Mar 09 14:30:09 crc kubenswrapper[4764]: I0309 14:30:09.574390 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5eb53f8f-cfd0-4061-b4ac-741d77ed6f0d" path="/var/lib/kubelet/pods/5eb53f8f-cfd0-4061-b4ac-741d77ed6f0d/volumes" Mar 09 14:30:09 crc kubenswrapper[4764]: I0309 14:30:09.661908 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551110-sjw8t" Mar 09 14:30:09 crc kubenswrapper[4764]: I0309 14:30:09.725974 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wqvzx\" (UniqueName: \"kubernetes.io/projected/44063b01-0b96-488c-98af-43cdb752467e-kube-api-access-wqvzx\") pod \"44063b01-0b96-488c-98af-43cdb752467e\" (UID: \"44063b01-0b96-488c-98af-43cdb752467e\") " Mar 09 14:30:09 crc kubenswrapper[4764]: I0309 14:30:09.737427 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44063b01-0b96-488c-98af-43cdb752467e-kube-api-access-wqvzx" (OuterVolumeSpecName: "kube-api-access-wqvzx") pod "44063b01-0b96-488c-98af-43cdb752467e" (UID: "44063b01-0b96-488c-98af-43cdb752467e"). InnerVolumeSpecName "kube-api-access-wqvzx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:30:09 crc kubenswrapper[4764]: I0309 14:30:09.828823 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wqvzx\" (UniqueName: \"kubernetes.io/projected/44063b01-0b96-488c-98af-43cdb752467e-kube-api-access-wqvzx\") on node \"crc\" DevicePath \"\"" Mar 09 14:30:10 crc kubenswrapper[4764]: I0309 14:30:10.249476 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551110-sjw8t" event={"ID":"44063b01-0b96-488c-98af-43cdb752467e","Type":"ContainerDied","Data":"9f463b037f6748b8413203836384045220979ce2e648b64c69d1e25e30dfec60"} Mar 09 14:30:10 crc kubenswrapper[4764]: I0309 14:30:10.249916 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9f463b037f6748b8413203836384045220979ce2e648b64c69d1e25e30dfec60" Mar 09 14:30:10 crc kubenswrapper[4764]: I0309 14:30:10.249570 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551110-sjw8t" Mar 09 14:30:10 crc kubenswrapper[4764]: I0309 14:30:10.743552 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29551104-pc4h6"] Mar 09 14:30:10 crc kubenswrapper[4764]: I0309 14:30:10.755764 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29551104-pc4h6"] Mar 09 14:30:11 crc kubenswrapper[4764]: I0309 14:30:11.571884 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b4e97252-1933-4b92-ab28-f9713db14afb" path="/var/lib/kubelet/pods/b4e97252-1933-4b92-ab28-f9713db14afb/volumes" Mar 09 14:30:11 crc kubenswrapper[4764]: I0309 14:30:11.928100 4764 scope.go:117] "RemoveContainer" containerID="fbe599b37c79d3976b2347daa88e4c2b8bc846bfc99bce86dadf53c1a9ea4b91" Mar 09 14:30:12 crc kubenswrapper[4764]: I0309 14:30:12.000359 4764 scope.go:117] "RemoveContainer" containerID="797b0150f56f98282613dd9aa20b75479aae0c1306bab3b67c8a86168757b198" Mar 09 14:30:28 crc kubenswrapper[4764]: I0309 14:30:28.370263 4764 patch_prober.go:28] interesting pod/machine-config-daemon-xxczl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 09 14:30:28 crc kubenswrapper[4764]: I0309 14:30:28.371285 4764 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 09 14:30:28 crc kubenswrapper[4764]: I0309 14:30:28.371362 4764 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" Mar 09 
14:30:28 crc kubenswrapper[4764]: I0309 14:30:28.372687 4764 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"956798d3640b6af2f25ed443a7b5b3c26e0da670aff756c574b1a30cc50879dd"} pod="openshift-machine-config-operator/machine-config-daemon-xxczl" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 09 14:30:28 crc kubenswrapper[4764]: I0309 14:30:28.372776 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922" containerName="machine-config-daemon" containerID="cri-o://956798d3640b6af2f25ed443a7b5b3c26e0da670aff756c574b1a30cc50879dd" gracePeriod=600 Mar 09 14:30:29 crc kubenswrapper[4764]: I0309 14:30:29.450108 4764 generic.go:334] "Generic (PLEG): container finished" podID="6bcdd179-43c2-427c-9fac-7155c122e922" containerID="956798d3640b6af2f25ed443a7b5b3c26e0da670aff756c574b1a30cc50879dd" exitCode=0 Mar 09 14:30:29 crc kubenswrapper[4764]: I0309 14:30:29.450197 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" event={"ID":"6bcdd179-43c2-427c-9fac-7155c122e922","Type":"ContainerDied","Data":"956798d3640b6af2f25ed443a7b5b3c26e0da670aff756c574b1a30cc50879dd"} Mar 09 14:30:29 crc kubenswrapper[4764]: I0309 14:30:29.452071 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" event={"ID":"6bcdd179-43c2-427c-9fac-7155c122e922","Type":"ContainerStarted","Data":"8dbfa76f047a7530b5728e4d5ecc2633e67095c7831f8ebe288b219074642ce2"} Mar 09 14:30:29 crc kubenswrapper[4764]: I0309 14:30:29.452119 4764 scope.go:117] "RemoveContainer" containerID="c10d86bbf5c5d5ea5875f026de2e5f7abb576968d3567040b8020dcc93a08096" Mar 09 14:30:44 crc kubenswrapper[4764]: I0309 14:30:44.433075 4764 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-mdqd7"] Mar 09 14:30:44 crc kubenswrapper[4764]: E0309 14:30:44.438973 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="44063b01-0b96-488c-98af-43cdb752467e" containerName="oc" Mar 09 14:30:44 crc kubenswrapper[4764]: I0309 14:30:44.439283 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="44063b01-0b96-488c-98af-43cdb752467e" containerName="oc" Mar 09 14:30:44 crc kubenswrapper[4764]: E0309 14:30:44.439381 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8ef8fd3b-c34b-4bb4-9726-f9f7f0e659c1" containerName="collect-profiles" Mar 09 14:30:44 crc kubenswrapper[4764]: I0309 14:30:44.439453 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="8ef8fd3b-c34b-4bb4-9726-f9f7f0e659c1" containerName="collect-profiles" Mar 09 14:30:44 crc kubenswrapper[4764]: I0309 14:30:44.439845 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="8ef8fd3b-c34b-4bb4-9726-f9f7f0e659c1" containerName="collect-profiles" Mar 09 14:30:44 crc kubenswrapper[4764]: I0309 14:30:44.439946 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="44063b01-0b96-488c-98af-43cdb752467e" containerName="oc" Mar 09 14:30:44 crc kubenswrapper[4764]: I0309 14:30:44.441777 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mdqd7" Mar 09 14:30:44 crc kubenswrapper[4764]: I0309 14:30:44.454919 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-mdqd7"] Mar 09 14:30:44 crc kubenswrapper[4764]: I0309 14:30:44.581772 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cbf30af3-6be9-48e2-8bf6-9fe8a0d4e1cd-utilities\") pod \"redhat-marketplace-mdqd7\" (UID: \"cbf30af3-6be9-48e2-8bf6-9fe8a0d4e1cd\") " pod="openshift-marketplace/redhat-marketplace-mdqd7" Mar 09 14:30:44 crc kubenswrapper[4764]: I0309 14:30:44.582285 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9zz5t\" (UniqueName: \"kubernetes.io/projected/cbf30af3-6be9-48e2-8bf6-9fe8a0d4e1cd-kube-api-access-9zz5t\") pod \"redhat-marketplace-mdqd7\" (UID: \"cbf30af3-6be9-48e2-8bf6-9fe8a0d4e1cd\") " pod="openshift-marketplace/redhat-marketplace-mdqd7" Mar 09 14:30:44 crc kubenswrapper[4764]: I0309 14:30:44.582355 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cbf30af3-6be9-48e2-8bf6-9fe8a0d4e1cd-catalog-content\") pod \"redhat-marketplace-mdqd7\" (UID: \"cbf30af3-6be9-48e2-8bf6-9fe8a0d4e1cd\") " pod="openshift-marketplace/redhat-marketplace-mdqd7" Mar 09 14:30:44 crc kubenswrapper[4764]: I0309 14:30:44.685227 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cbf30af3-6be9-48e2-8bf6-9fe8a0d4e1cd-utilities\") pod \"redhat-marketplace-mdqd7\" (UID: \"cbf30af3-6be9-48e2-8bf6-9fe8a0d4e1cd\") " pod="openshift-marketplace/redhat-marketplace-mdqd7" Mar 09 14:30:44 crc kubenswrapper[4764]: I0309 14:30:44.685370 4764 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-9zz5t\" (UniqueName: \"kubernetes.io/projected/cbf30af3-6be9-48e2-8bf6-9fe8a0d4e1cd-kube-api-access-9zz5t\") pod \"redhat-marketplace-mdqd7\" (UID: \"cbf30af3-6be9-48e2-8bf6-9fe8a0d4e1cd\") " pod="openshift-marketplace/redhat-marketplace-mdqd7" Mar 09 14:30:44 crc kubenswrapper[4764]: I0309 14:30:44.685458 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cbf30af3-6be9-48e2-8bf6-9fe8a0d4e1cd-catalog-content\") pod \"redhat-marketplace-mdqd7\" (UID: \"cbf30af3-6be9-48e2-8bf6-9fe8a0d4e1cd\") " pod="openshift-marketplace/redhat-marketplace-mdqd7" Mar 09 14:30:44 crc kubenswrapper[4764]: I0309 14:30:44.688193 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cbf30af3-6be9-48e2-8bf6-9fe8a0d4e1cd-catalog-content\") pod \"redhat-marketplace-mdqd7\" (UID: \"cbf30af3-6be9-48e2-8bf6-9fe8a0d4e1cd\") " pod="openshift-marketplace/redhat-marketplace-mdqd7" Mar 09 14:30:44 crc kubenswrapper[4764]: I0309 14:30:44.688473 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cbf30af3-6be9-48e2-8bf6-9fe8a0d4e1cd-utilities\") pod \"redhat-marketplace-mdqd7\" (UID: \"cbf30af3-6be9-48e2-8bf6-9fe8a0d4e1cd\") " pod="openshift-marketplace/redhat-marketplace-mdqd7" Mar 09 14:30:44 crc kubenswrapper[4764]: I0309 14:30:44.718848 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9zz5t\" (UniqueName: \"kubernetes.io/projected/cbf30af3-6be9-48e2-8bf6-9fe8a0d4e1cd-kube-api-access-9zz5t\") pod \"redhat-marketplace-mdqd7\" (UID: \"cbf30af3-6be9-48e2-8bf6-9fe8a0d4e1cd\") " pod="openshift-marketplace/redhat-marketplace-mdqd7" Mar 09 14:30:44 crc kubenswrapper[4764]: I0309 14:30:44.771030 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mdqd7" Mar 09 14:30:45 crc kubenswrapper[4764]: W0309 14:30:45.306227 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcbf30af3_6be9_48e2_8bf6_9fe8a0d4e1cd.slice/crio-e2faedc1de519d3df1b83f7d036261c9bb6514dba61e55a96fdcfbdf4aa98b4d WatchSource:0}: Error finding container e2faedc1de519d3df1b83f7d036261c9bb6514dba61e55a96fdcfbdf4aa98b4d: Status 404 returned error can't find the container with id e2faedc1de519d3df1b83f7d036261c9bb6514dba61e55a96fdcfbdf4aa98b4d Mar 09 14:30:45 crc kubenswrapper[4764]: I0309 14:30:45.312952 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-mdqd7"] Mar 09 14:30:45 crc kubenswrapper[4764]: I0309 14:30:45.673801 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mdqd7" event={"ID":"cbf30af3-6be9-48e2-8bf6-9fe8a0d4e1cd","Type":"ContainerStarted","Data":"e2faedc1de519d3df1b83f7d036261c9bb6514dba61e55a96fdcfbdf4aa98b4d"} Mar 09 14:30:46 crc kubenswrapper[4764]: I0309 14:30:46.705388 4764 generic.go:334] "Generic (PLEG): container finished" podID="cbf30af3-6be9-48e2-8bf6-9fe8a0d4e1cd" containerID="52e00525d2fe3223b29c4d510f146a1dfebeea02e1ec48c4d73131eba5cfe1e3" exitCode=0 Mar 09 14:30:46 crc kubenswrapper[4764]: I0309 14:30:46.705451 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mdqd7" event={"ID":"cbf30af3-6be9-48e2-8bf6-9fe8a0d4e1cd","Type":"ContainerDied","Data":"52e00525d2fe3223b29c4d510f146a1dfebeea02e1ec48c4d73131eba5cfe1e3"} Mar 09 14:30:48 crc kubenswrapper[4764]: I0309 14:30:48.730361 4764 generic.go:334] "Generic (PLEG): container finished" podID="cbf30af3-6be9-48e2-8bf6-9fe8a0d4e1cd" containerID="4ea2b80f206544876b4477dfc25fe84c0def83b59496eb5d6d5b4a2d5a2bad74" exitCode=0 Mar 09 14:30:48 crc kubenswrapper[4764]: I0309 
14:30:48.730899 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mdqd7" event={"ID":"cbf30af3-6be9-48e2-8bf6-9fe8a0d4e1cd","Type":"ContainerDied","Data":"4ea2b80f206544876b4477dfc25fe84c0def83b59496eb5d6d5b4a2d5a2bad74"} Mar 09 14:30:49 crc kubenswrapper[4764]: I0309 14:30:49.745581 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mdqd7" event={"ID":"cbf30af3-6be9-48e2-8bf6-9fe8a0d4e1cd","Type":"ContainerStarted","Data":"6f25afdbffc91821633a9c1aac68545587801eecb36671bddb6b3a225f8a4b4a"} Mar 09 14:30:54 crc kubenswrapper[4764]: I0309 14:30:54.771813 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-mdqd7" Mar 09 14:30:54 crc kubenswrapper[4764]: I0309 14:30:54.772712 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-mdqd7" Mar 09 14:30:54 crc kubenswrapper[4764]: I0309 14:30:54.824746 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-mdqd7" Mar 09 14:30:54 crc kubenswrapper[4764]: I0309 14:30:54.865523 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-mdqd7" podStartSLOduration=8.396383339 podStartE2EDuration="10.865491691s" podCreationTimestamp="2026-03-09 14:30:44 +0000 UTC" firstStartedPulling="2026-03-09 14:30:46.711159631 +0000 UTC m=+4201.961331539" lastFinishedPulling="2026-03-09 14:30:49.180267983 +0000 UTC m=+4204.430439891" observedRunningTime="2026-03-09 14:30:49.790158296 +0000 UTC m=+4205.040330204" watchObservedRunningTime="2026-03-09 14:30:54.865491691 +0000 UTC m=+4210.115663609" Mar 09 14:30:54 crc kubenswrapper[4764]: I0309 14:30:54.884431 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-mdqd7" Mar 09 
14:30:55 crc kubenswrapper[4764]: I0309 14:30:55.073588 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-mdqd7"] Mar 09 14:30:55 crc kubenswrapper[4764]: I0309 14:30:55.823152 4764 generic.go:334] "Generic (PLEG): container finished" podID="25af9291-c5a5-4dff-9eb7-960615c614c1" containerID="48d1ac59fa8fdaa23f3a6ee1c8dc95eb7e7c19f15c77263002430602ffce1989" exitCode=0 Mar 09 14:30:55 crc kubenswrapper[4764]: I0309 14:30:55.823268 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-2n85d/crc-debug-9qlpp" event={"ID":"25af9291-c5a5-4dff-9eb7-960615c614c1","Type":"ContainerDied","Data":"48d1ac59fa8fdaa23f3a6ee1c8dc95eb7e7c19f15c77263002430602ffce1989"} Mar 09 14:30:56 crc kubenswrapper[4764]: I0309 14:30:56.835724 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-mdqd7" podUID="cbf30af3-6be9-48e2-8bf6-9fe8a0d4e1cd" containerName="registry-server" containerID="cri-o://6f25afdbffc91821633a9c1aac68545587801eecb36671bddb6b3a225f8a4b4a" gracePeriod=2 Mar 09 14:30:57 crc kubenswrapper[4764]: I0309 14:30:57.060054 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-2n85d/crc-debug-9qlpp" Mar 09 14:30:57 crc kubenswrapper[4764]: I0309 14:30:57.127463 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-2n85d/crc-debug-9qlpp"] Mar 09 14:30:57 crc kubenswrapper[4764]: I0309 14:30:57.130497 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kp82d\" (UniqueName: \"kubernetes.io/projected/25af9291-c5a5-4dff-9eb7-960615c614c1-kube-api-access-kp82d\") pod \"25af9291-c5a5-4dff-9eb7-960615c614c1\" (UID: \"25af9291-c5a5-4dff-9eb7-960615c614c1\") " Mar 09 14:30:57 crc kubenswrapper[4764]: I0309 14:30:57.130804 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/25af9291-c5a5-4dff-9eb7-960615c614c1-host\") pod \"25af9291-c5a5-4dff-9eb7-960615c614c1\" (UID: \"25af9291-c5a5-4dff-9eb7-960615c614c1\") " Mar 09 14:30:57 crc kubenswrapper[4764]: I0309 14:30:57.130895 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/25af9291-c5a5-4dff-9eb7-960615c614c1-host" (OuterVolumeSpecName: "host") pod "25af9291-c5a5-4dff-9eb7-960615c614c1" (UID: "25af9291-c5a5-4dff-9eb7-960615c614c1"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 09 14:30:57 crc kubenswrapper[4764]: I0309 14:30:57.132086 4764 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/25af9291-c5a5-4dff-9eb7-960615c614c1-host\") on node \"crc\" DevicePath \"\"" Mar 09 14:30:57 crc kubenswrapper[4764]: I0309 14:30:57.141978 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-2n85d/crc-debug-9qlpp"] Mar 09 14:30:57 crc kubenswrapper[4764]: I0309 14:30:57.143945 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25af9291-c5a5-4dff-9eb7-960615c614c1-kube-api-access-kp82d" (OuterVolumeSpecName: "kube-api-access-kp82d") pod "25af9291-c5a5-4dff-9eb7-960615c614c1" (UID: "25af9291-c5a5-4dff-9eb7-960615c614c1"). InnerVolumeSpecName "kube-api-access-kp82d". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:30:57 crc kubenswrapper[4764]: I0309 14:30:57.234296 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kp82d\" (UniqueName: \"kubernetes.io/projected/25af9291-c5a5-4dff-9eb7-960615c614c1-kube-api-access-kp82d\") on node \"crc\" DevicePath \"\"" Mar 09 14:30:57 crc kubenswrapper[4764]: I0309 14:30:57.292637 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mdqd7" Mar 09 14:30:57 crc kubenswrapper[4764]: I0309 14:30:57.437278 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cbf30af3-6be9-48e2-8bf6-9fe8a0d4e1cd-utilities\") pod \"cbf30af3-6be9-48e2-8bf6-9fe8a0d4e1cd\" (UID: \"cbf30af3-6be9-48e2-8bf6-9fe8a0d4e1cd\") " Mar 09 14:30:57 crc kubenswrapper[4764]: I0309 14:30:57.437521 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cbf30af3-6be9-48e2-8bf6-9fe8a0d4e1cd-catalog-content\") pod \"cbf30af3-6be9-48e2-8bf6-9fe8a0d4e1cd\" (UID: \"cbf30af3-6be9-48e2-8bf6-9fe8a0d4e1cd\") " Mar 09 14:30:57 crc kubenswrapper[4764]: I0309 14:30:57.438462 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cbf30af3-6be9-48e2-8bf6-9fe8a0d4e1cd-utilities" (OuterVolumeSpecName: "utilities") pod "cbf30af3-6be9-48e2-8bf6-9fe8a0d4e1cd" (UID: "cbf30af3-6be9-48e2-8bf6-9fe8a0d4e1cd"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 14:30:57 crc kubenswrapper[4764]: I0309 14:30:57.443799 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9zz5t\" (UniqueName: \"kubernetes.io/projected/cbf30af3-6be9-48e2-8bf6-9fe8a0d4e1cd-kube-api-access-9zz5t\") pod \"cbf30af3-6be9-48e2-8bf6-9fe8a0d4e1cd\" (UID: \"cbf30af3-6be9-48e2-8bf6-9fe8a0d4e1cd\") " Mar 09 14:30:57 crc kubenswrapper[4764]: I0309 14:30:57.448081 4764 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cbf30af3-6be9-48e2-8bf6-9fe8a0d4e1cd-utilities\") on node \"crc\" DevicePath \"\"" Mar 09 14:30:57 crc kubenswrapper[4764]: I0309 14:30:57.449194 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cbf30af3-6be9-48e2-8bf6-9fe8a0d4e1cd-kube-api-access-9zz5t" (OuterVolumeSpecName: "kube-api-access-9zz5t") pod "cbf30af3-6be9-48e2-8bf6-9fe8a0d4e1cd" (UID: "cbf30af3-6be9-48e2-8bf6-9fe8a0d4e1cd"). InnerVolumeSpecName "kube-api-access-9zz5t". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:30:57 crc kubenswrapper[4764]: I0309 14:30:57.470830 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cbf30af3-6be9-48e2-8bf6-9fe8a0d4e1cd-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "cbf30af3-6be9-48e2-8bf6-9fe8a0d4e1cd" (UID: "cbf30af3-6be9-48e2-8bf6-9fe8a0d4e1cd"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 14:30:57 crc kubenswrapper[4764]: I0309 14:30:57.550786 4764 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cbf30af3-6be9-48e2-8bf6-9fe8a0d4e1cd-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 09 14:30:57 crc kubenswrapper[4764]: I0309 14:30:57.550833 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9zz5t\" (UniqueName: \"kubernetes.io/projected/cbf30af3-6be9-48e2-8bf6-9fe8a0d4e1cd-kube-api-access-9zz5t\") on node \"crc\" DevicePath \"\"" Mar 09 14:30:57 crc kubenswrapper[4764]: I0309 14:30:57.576339 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25af9291-c5a5-4dff-9eb7-960615c614c1" path="/var/lib/kubelet/pods/25af9291-c5a5-4dff-9eb7-960615c614c1/volumes" Mar 09 14:30:57 crc kubenswrapper[4764]: I0309 14:30:57.849120 4764 scope.go:117] "RemoveContainer" containerID="48d1ac59fa8fdaa23f3a6ee1c8dc95eb7e7c19f15c77263002430602ffce1989" Mar 09 14:30:57 crc kubenswrapper[4764]: I0309 14:30:57.849136 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-2n85d/crc-debug-9qlpp" Mar 09 14:30:57 crc kubenswrapper[4764]: I0309 14:30:57.852369 4764 generic.go:334] "Generic (PLEG): container finished" podID="cbf30af3-6be9-48e2-8bf6-9fe8a0d4e1cd" containerID="6f25afdbffc91821633a9c1aac68545587801eecb36671bddb6b3a225f8a4b4a" exitCode=0 Mar 09 14:30:57 crc kubenswrapper[4764]: I0309 14:30:57.852407 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mdqd7" event={"ID":"cbf30af3-6be9-48e2-8bf6-9fe8a0d4e1cd","Type":"ContainerDied","Data":"6f25afdbffc91821633a9c1aac68545587801eecb36671bddb6b3a225f8a4b4a"} Mar 09 14:30:57 crc kubenswrapper[4764]: I0309 14:30:57.852444 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mdqd7" event={"ID":"cbf30af3-6be9-48e2-8bf6-9fe8a0d4e1cd","Type":"ContainerDied","Data":"e2faedc1de519d3df1b83f7d036261c9bb6514dba61e55a96fdcfbdf4aa98b4d"} Mar 09 14:30:57 crc kubenswrapper[4764]: I0309 14:30:57.852540 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mdqd7" Mar 09 14:30:57 crc kubenswrapper[4764]: I0309 14:30:57.908327 4764 scope.go:117] "RemoveContainer" containerID="6f25afdbffc91821633a9c1aac68545587801eecb36671bddb6b3a225f8a4b4a" Mar 09 14:30:57 crc kubenswrapper[4764]: I0309 14:30:57.970885 4764 scope.go:117] "RemoveContainer" containerID="4ea2b80f206544876b4477dfc25fe84c0def83b59496eb5d6d5b4a2d5a2bad74" Mar 09 14:30:57 crc kubenswrapper[4764]: I0309 14:30:57.975856 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-mdqd7"] Mar 09 14:30:57 crc kubenswrapper[4764]: I0309 14:30:57.996813 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-mdqd7"] Mar 09 14:30:58 crc kubenswrapper[4764]: I0309 14:30:58.001326 4764 scope.go:117] "RemoveContainer" containerID="52e00525d2fe3223b29c4d510f146a1dfebeea02e1ec48c4d73131eba5cfe1e3" Mar 09 14:30:58 crc kubenswrapper[4764]: I0309 14:30:58.052573 4764 scope.go:117] "RemoveContainer" containerID="6f25afdbffc91821633a9c1aac68545587801eecb36671bddb6b3a225f8a4b4a" Mar 09 14:30:58 crc kubenswrapper[4764]: E0309 14:30:58.053460 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6f25afdbffc91821633a9c1aac68545587801eecb36671bddb6b3a225f8a4b4a\": container with ID starting with 6f25afdbffc91821633a9c1aac68545587801eecb36671bddb6b3a225f8a4b4a not found: ID does not exist" containerID="6f25afdbffc91821633a9c1aac68545587801eecb36671bddb6b3a225f8a4b4a" Mar 09 14:30:58 crc kubenswrapper[4764]: I0309 14:30:58.053527 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6f25afdbffc91821633a9c1aac68545587801eecb36671bddb6b3a225f8a4b4a"} err="failed to get container status \"6f25afdbffc91821633a9c1aac68545587801eecb36671bddb6b3a225f8a4b4a\": rpc error: code = NotFound desc = could not find container 
\"6f25afdbffc91821633a9c1aac68545587801eecb36671bddb6b3a225f8a4b4a\": container with ID starting with 6f25afdbffc91821633a9c1aac68545587801eecb36671bddb6b3a225f8a4b4a not found: ID does not exist" Mar 09 14:30:58 crc kubenswrapper[4764]: I0309 14:30:58.053570 4764 scope.go:117] "RemoveContainer" containerID="4ea2b80f206544876b4477dfc25fe84c0def83b59496eb5d6d5b4a2d5a2bad74" Mar 09 14:30:58 crc kubenswrapper[4764]: E0309 14:30:58.054018 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4ea2b80f206544876b4477dfc25fe84c0def83b59496eb5d6d5b4a2d5a2bad74\": container with ID starting with 4ea2b80f206544876b4477dfc25fe84c0def83b59496eb5d6d5b4a2d5a2bad74 not found: ID does not exist" containerID="4ea2b80f206544876b4477dfc25fe84c0def83b59496eb5d6d5b4a2d5a2bad74" Mar 09 14:30:58 crc kubenswrapper[4764]: I0309 14:30:58.054079 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4ea2b80f206544876b4477dfc25fe84c0def83b59496eb5d6d5b4a2d5a2bad74"} err="failed to get container status \"4ea2b80f206544876b4477dfc25fe84c0def83b59496eb5d6d5b4a2d5a2bad74\": rpc error: code = NotFound desc = could not find container \"4ea2b80f206544876b4477dfc25fe84c0def83b59496eb5d6d5b4a2d5a2bad74\": container with ID starting with 4ea2b80f206544876b4477dfc25fe84c0def83b59496eb5d6d5b4a2d5a2bad74 not found: ID does not exist" Mar 09 14:30:58 crc kubenswrapper[4764]: I0309 14:30:58.054132 4764 scope.go:117] "RemoveContainer" containerID="52e00525d2fe3223b29c4d510f146a1dfebeea02e1ec48c4d73131eba5cfe1e3" Mar 09 14:30:58 crc kubenswrapper[4764]: E0309 14:30:58.055196 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"52e00525d2fe3223b29c4d510f146a1dfebeea02e1ec48c4d73131eba5cfe1e3\": container with ID starting with 52e00525d2fe3223b29c4d510f146a1dfebeea02e1ec48c4d73131eba5cfe1e3 not found: ID does not exist" 
containerID="52e00525d2fe3223b29c4d510f146a1dfebeea02e1ec48c4d73131eba5cfe1e3" Mar 09 14:30:58 crc kubenswrapper[4764]: I0309 14:30:58.055258 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"52e00525d2fe3223b29c4d510f146a1dfebeea02e1ec48c4d73131eba5cfe1e3"} err="failed to get container status \"52e00525d2fe3223b29c4d510f146a1dfebeea02e1ec48c4d73131eba5cfe1e3\": rpc error: code = NotFound desc = could not find container \"52e00525d2fe3223b29c4d510f146a1dfebeea02e1ec48c4d73131eba5cfe1e3\": container with ID starting with 52e00525d2fe3223b29c4d510f146a1dfebeea02e1ec48c4d73131eba5cfe1e3 not found: ID does not exist" Mar 09 14:30:58 crc kubenswrapper[4764]: I0309 14:30:58.314202 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-2n85d/crc-debug-qhjgj"] Mar 09 14:30:58 crc kubenswrapper[4764]: E0309 14:30:58.315001 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cbf30af3-6be9-48e2-8bf6-9fe8a0d4e1cd" containerName="registry-server" Mar 09 14:30:58 crc kubenswrapper[4764]: I0309 14:30:58.315027 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="cbf30af3-6be9-48e2-8bf6-9fe8a0d4e1cd" containerName="registry-server" Mar 09 14:30:58 crc kubenswrapper[4764]: E0309 14:30:58.315044 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cbf30af3-6be9-48e2-8bf6-9fe8a0d4e1cd" containerName="extract-utilities" Mar 09 14:30:58 crc kubenswrapper[4764]: I0309 14:30:58.315053 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="cbf30af3-6be9-48e2-8bf6-9fe8a0d4e1cd" containerName="extract-utilities" Mar 09 14:30:58 crc kubenswrapper[4764]: E0309 14:30:58.315075 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="25af9291-c5a5-4dff-9eb7-960615c614c1" containerName="container-00" Mar 09 14:30:58 crc kubenswrapper[4764]: I0309 14:30:58.315085 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="25af9291-c5a5-4dff-9eb7-960615c614c1" 
containerName="container-00" Mar 09 14:30:58 crc kubenswrapper[4764]: E0309 14:30:58.315104 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cbf30af3-6be9-48e2-8bf6-9fe8a0d4e1cd" containerName="extract-content" Mar 09 14:30:58 crc kubenswrapper[4764]: I0309 14:30:58.315112 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="cbf30af3-6be9-48e2-8bf6-9fe8a0d4e1cd" containerName="extract-content" Mar 09 14:30:58 crc kubenswrapper[4764]: I0309 14:30:58.315384 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="cbf30af3-6be9-48e2-8bf6-9fe8a0d4e1cd" containerName="registry-server" Mar 09 14:30:58 crc kubenswrapper[4764]: I0309 14:30:58.315412 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="25af9291-c5a5-4dff-9eb7-960615c614c1" containerName="container-00" Mar 09 14:30:58 crc kubenswrapper[4764]: I0309 14:30:58.316579 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-2n85d/crc-debug-qhjgj" Mar 09 14:30:58 crc kubenswrapper[4764]: I0309 14:30:58.478665 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8t58w\" (UniqueName: \"kubernetes.io/projected/e08fcbdc-9242-4581-82ee-a3692f6d0d03-kube-api-access-8t58w\") pod \"crc-debug-qhjgj\" (UID: \"e08fcbdc-9242-4581-82ee-a3692f6d0d03\") " pod="openshift-must-gather-2n85d/crc-debug-qhjgj" Mar 09 14:30:58 crc kubenswrapper[4764]: I0309 14:30:58.479121 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e08fcbdc-9242-4581-82ee-a3692f6d0d03-host\") pod \"crc-debug-qhjgj\" (UID: \"e08fcbdc-9242-4581-82ee-a3692f6d0d03\") " pod="openshift-must-gather-2n85d/crc-debug-qhjgj" Mar 09 14:30:58 crc kubenswrapper[4764]: I0309 14:30:58.581558 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8t58w\" (UniqueName: 
\"kubernetes.io/projected/e08fcbdc-9242-4581-82ee-a3692f6d0d03-kube-api-access-8t58w\") pod \"crc-debug-qhjgj\" (UID: \"e08fcbdc-9242-4581-82ee-a3692f6d0d03\") " pod="openshift-must-gather-2n85d/crc-debug-qhjgj" Mar 09 14:30:58 crc kubenswrapper[4764]: I0309 14:30:58.581788 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e08fcbdc-9242-4581-82ee-a3692f6d0d03-host\") pod \"crc-debug-qhjgj\" (UID: \"e08fcbdc-9242-4581-82ee-a3692f6d0d03\") " pod="openshift-must-gather-2n85d/crc-debug-qhjgj" Mar 09 14:30:58 crc kubenswrapper[4764]: I0309 14:30:58.582017 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e08fcbdc-9242-4581-82ee-a3692f6d0d03-host\") pod \"crc-debug-qhjgj\" (UID: \"e08fcbdc-9242-4581-82ee-a3692f6d0d03\") " pod="openshift-must-gather-2n85d/crc-debug-qhjgj" Mar 09 14:30:59 crc kubenswrapper[4764]: I0309 14:30:58.998165 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8t58w\" (UniqueName: \"kubernetes.io/projected/e08fcbdc-9242-4581-82ee-a3692f6d0d03-kube-api-access-8t58w\") pod \"crc-debug-qhjgj\" (UID: \"e08fcbdc-9242-4581-82ee-a3692f6d0d03\") " pod="openshift-must-gather-2n85d/crc-debug-qhjgj" Mar 09 14:30:59 crc kubenswrapper[4764]: I0309 14:30:59.237958 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-2n85d/crc-debug-qhjgj"
Mar 09 14:30:59 crc kubenswrapper[4764]: I0309 14:30:59.572669 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cbf30af3-6be9-48e2-8bf6-9fe8a0d4e1cd" path="/var/lib/kubelet/pods/cbf30af3-6be9-48e2-8bf6-9fe8a0d4e1cd/volumes"
Mar 09 14:30:59 crc kubenswrapper[4764]: I0309 14:30:59.880240 4764 generic.go:334] "Generic (PLEG): container finished" podID="e08fcbdc-9242-4581-82ee-a3692f6d0d03" containerID="35306a339ac014f00b9490fd62d260c20edfb15b36c977f1b4e50e10596530a4" exitCode=0
Mar 09 14:30:59 crc kubenswrapper[4764]: I0309 14:30:59.880310 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-2n85d/crc-debug-qhjgj" event={"ID":"e08fcbdc-9242-4581-82ee-a3692f6d0d03","Type":"ContainerDied","Data":"35306a339ac014f00b9490fd62d260c20edfb15b36c977f1b4e50e10596530a4"}
Mar 09 14:30:59 crc kubenswrapper[4764]: I0309 14:30:59.880345 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-2n85d/crc-debug-qhjgj" event={"ID":"e08fcbdc-9242-4581-82ee-a3692f6d0d03","Type":"ContainerStarted","Data":"13cbfd7cca97d0bf58dc315609fa12973b5144274df00b90524a64d7a54d7be7"}
Mar 09 14:31:01 crc kubenswrapper[4764]: I0309 14:31:01.492239 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-2n85d/crc-debug-qhjgj"
Mar 09 14:31:01 crc kubenswrapper[4764]: I0309 14:31:01.558526 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e08fcbdc-9242-4581-82ee-a3692f6d0d03-host\") pod \"e08fcbdc-9242-4581-82ee-a3692f6d0d03\" (UID: \"e08fcbdc-9242-4581-82ee-a3692f6d0d03\") "
Mar 09 14:31:01 crc kubenswrapper[4764]: I0309 14:31:01.558687 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e08fcbdc-9242-4581-82ee-a3692f6d0d03-host" (OuterVolumeSpecName: "host") pod "e08fcbdc-9242-4581-82ee-a3692f6d0d03" (UID: "e08fcbdc-9242-4581-82ee-a3692f6d0d03"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 09 14:31:01 crc kubenswrapper[4764]: I0309 14:31:01.559172 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8t58w\" (UniqueName: \"kubernetes.io/projected/e08fcbdc-9242-4581-82ee-a3692f6d0d03-kube-api-access-8t58w\") pod \"e08fcbdc-9242-4581-82ee-a3692f6d0d03\" (UID: \"e08fcbdc-9242-4581-82ee-a3692f6d0d03\") "
Mar 09 14:31:01 crc kubenswrapper[4764]: I0309 14:31:01.559840 4764 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e08fcbdc-9242-4581-82ee-a3692f6d0d03-host\") on node \"crc\" DevicePath \"\""
Mar 09 14:31:01 crc kubenswrapper[4764]: I0309 14:31:01.567165 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e08fcbdc-9242-4581-82ee-a3692f6d0d03-kube-api-access-8t58w" (OuterVolumeSpecName: "kube-api-access-8t58w") pod "e08fcbdc-9242-4581-82ee-a3692f6d0d03" (UID: "e08fcbdc-9242-4581-82ee-a3692f6d0d03"). InnerVolumeSpecName "kube-api-access-8t58w". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 14:31:01 crc kubenswrapper[4764]: I0309 14:31:01.662032 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8t58w\" (UniqueName: \"kubernetes.io/projected/e08fcbdc-9242-4581-82ee-a3692f6d0d03-kube-api-access-8t58w\") on node \"crc\" DevicePath \"\""
Mar 09 14:31:01 crc kubenswrapper[4764]: I0309 14:31:01.913598 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-2n85d/crc-debug-qhjgj" event={"ID":"e08fcbdc-9242-4581-82ee-a3692f6d0d03","Type":"ContainerDied","Data":"13cbfd7cca97d0bf58dc315609fa12973b5144274df00b90524a64d7a54d7be7"}
Mar 09 14:31:01 crc kubenswrapper[4764]: I0309 14:31:01.913993 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="13cbfd7cca97d0bf58dc315609fa12973b5144274df00b90524a64d7a54d7be7"
Mar 09 14:31:01 crc kubenswrapper[4764]: I0309 14:31:01.915140 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-2n85d/crc-debug-qhjgj"
Mar 09 14:31:01 crc kubenswrapper[4764]: I0309 14:31:01.968280 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-2n85d/crc-debug-qhjgj"]
Mar 09 14:31:01 crc kubenswrapper[4764]: I0309 14:31:01.977745 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-2n85d/crc-debug-qhjgj"]
Mar 09 14:31:03 crc kubenswrapper[4764]: I0309 14:31:03.168906 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-2n85d/crc-debug-x4r2v"]
Mar 09 14:31:03 crc kubenswrapper[4764]: E0309 14:31:03.169811 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e08fcbdc-9242-4581-82ee-a3692f6d0d03" containerName="container-00"
Mar 09 14:31:03 crc kubenswrapper[4764]: I0309 14:31:03.169827 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="e08fcbdc-9242-4581-82ee-a3692f6d0d03" containerName="container-00"
Mar 09 14:31:03 crc kubenswrapper[4764]: I0309 14:31:03.170089 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="e08fcbdc-9242-4581-82ee-a3692f6d0d03" containerName="container-00"
Mar 09 14:31:03 crc kubenswrapper[4764]: I0309 14:31:03.170974 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-2n85d/crc-debug-x4r2v"
Mar 09 14:31:03 crc kubenswrapper[4764]: I0309 14:31:03.299673 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-275qw\" (UniqueName: \"kubernetes.io/projected/30f041da-e1c7-4f92-b9b0-7c7b0495ecb6-kube-api-access-275qw\") pod \"crc-debug-x4r2v\" (UID: \"30f041da-e1c7-4f92-b9b0-7c7b0495ecb6\") " pod="openshift-must-gather-2n85d/crc-debug-x4r2v"
Mar 09 14:31:03 crc kubenswrapper[4764]: I0309 14:31:03.299926 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/30f041da-e1c7-4f92-b9b0-7c7b0495ecb6-host\") pod \"crc-debug-x4r2v\" (UID: \"30f041da-e1c7-4f92-b9b0-7c7b0495ecb6\") " pod="openshift-must-gather-2n85d/crc-debug-x4r2v"
Mar 09 14:31:03 crc kubenswrapper[4764]: I0309 14:31:03.402243 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/30f041da-e1c7-4f92-b9b0-7c7b0495ecb6-host\") pod \"crc-debug-x4r2v\" (UID: \"30f041da-e1c7-4f92-b9b0-7c7b0495ecb6\") " pod="openshift-must-gather-2n85d/crc-debug-x4r2v"
Mar 09 14:31:03 crc kubenswrapper[4764]: I0309 14:31:03.402448 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-275qw\" (UniqueName: \"kubernetes.io/projected/30f041da-e1c7-4f92-b9b0-7c7b0495ecb6-kube-api-access-275qw\") pod \"crc-debug-x4r2v\" (UID: \"30f041da-e1c7-4f92-b9b0-7c7b0495ecb6\") " pod="openshift-must-gather-2n85d/crc-debug-x4r2v"
Mar 09 14:31:03 crc kubenswrapper[4764]: I0309 14:31:03.402488 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/30f041da-e1c7-4f92-b9b0-7c7b0495ecb6-host\") pod \"crc-debug-x4r2v\" (UID: \"30f041da-e1c7-4f92-b9b0-7c7b0495ecb6\") " pod="openshift-must-gather-2n85d/crc-debug-x4r2v"
Mar 09 14:31:03 crc kubenswrapper[4764]: I0309 14:31:03.424365 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-275qw\" (UniqueName: \"kubernetes.io/projected/30f041da-e1c7-4f92-b9b0-7c7b0495ecb6-kube-api-access-275qw\") pod \"crc-debug-x4r2v\" (UID: \"30f041da-e1c7-4f92-b9b0-7c7b0495ecb6\") " pod="openshift-must-gather-2n85d/crc-debug-x4r2v"
Mar 09 14:31:03 crc kubenswrapper[4764]: I0309 14:31:03.491283 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-2n85d/crc-debug-x4r2v"
Mar 09 14:31:03 crc kubenswrapper[4764]: W0309 14:31:03.546512 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod30f041da_e1c7_4f92_b9b0_7c7b0495ecb6.slice/crio-15f0cbbd2a71e81bfebe228c5291dae19bace1e24e6d6ea39032fc0c8d21a940 WatchSource:0}: Error finding container 15f0cbbd2a71e81bfebe228c5291dae19bace1e24e6d6ea39032fc0c8d21a940: Status 404 returned error can't find the container with id 15f0cbbd2a71e81bfebe228c5291dae19bace1e24e6d6ea39032fc0c8d21a940
Mar 09 14:31:03 crc kubenswrapper[4764]: I0309 14:31:03.571973 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e08fcbdc-9242-4581-82ee-a3692f6d0d03" path="/var/lib/kubelet/pods/e08fcbdc-9242-4581-82ee-a3692f6d0d03/volumes"
Mar 09 14:31:03 crc kubenswrapper[4764]: I0309 14:31:03.938072 4764 generic.go:334] "Generic (PLEG): container finished" podID="30f041da-e1c7-4f92-b9b0-7c7b0495ecb6" containerID="3e7d6c7da6785672953242f3e84bfab6bab032496236e65977dca3d2ae3192cd" exitCode=0
Mar 09 14:31:03 crc kubenswrapper[4764]: I0309 14:31:03.938139 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-2n85d/crc-debug-x4r2v" event={"ID":"30f041da-e1c7-4f92-b9b0-7c7b0495ecb6","Type":"ContainerDied","Data":"3e7d6c7da6785672953242f3e84bfab6bab032496236e65977dca3d2ae3192cd"}
Mar 09 14:31:03 crc kubenswrapper[4764]: I0309 14:31:03.938540 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-2n85d/crc-debug-x4r2v" event={"ID":"30f041da-e1c7-4f92-b9b0-7c7b0495ecb6","Type":"ContainerStarted","Data":"15f0cbbd2a71e81bfebe228c5291dae19bace1e24e6d6ea39032fc0c8d21a940"}
Mar 09 14:31:03 crc kubenswrapper[4764]: I0309 14:31:03.986974 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-2n85d/crc-debug-x4r2v"]
Mar 09 14:31:03 crc kubenswrapper[4764]: I0309 14:31:03.999838 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-2n85d/crc-debug-x4r2v"]
Mar 09 14:31:05 crc kubenswrapper[4764]: I0309 14:31:05.053497 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-2n85d/crc-debug-x4r2v"
Mar 09 14:31:05 crc kubenswrapper[4764]: I0309 14:31:05.147697 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/30f041da-e1c7-4f92-b9b0-7c7b0495ecb6-host\") pod \"30f041da-e1c7-4f92-b9b0-7c7b0495ecb6\" (UID: \"30f041da-e1c7-4f92-b9b0-7c7b0495ecb6\") "
Mar 09 14:31:05 crc kubenswrapper[4764]: I0309 14:31:05.147842 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-275qw\" (UniqueName: \"kubernetes.io/projected/30f041da-e1c7-4f92-b9b0-7c7b0495ecb6-kube-api-access-275qw\") pod \"30f041da-e1c7-4f92-b9b0-7c7b0495ecb6\" (UID: \"30f041da-e1c7-4f92-b9b0-7c7b0495ecb6\") "
Mar 09 14:31:05 crc kubenswrapper[4764]: I0309 14:31:05.148082 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/30f041da-e1c7-4f92-b9b0-7c7b0495ecb6-host" (OuterVolumeSpecName: "host") pod "30f041da-e1c7-4f92-b9b0-7c7b0495ecb6" (UID: "30f041da-e1c7-4f92-b9b0-7c7b0495ecb6"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 09 14:31:05 crc kubenswrapper[4764]: I0309 14:31:05.148499 4764 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/30f041da-e1c7-4f92-b9b0-7c7b0495ecb6-host\") on node \"crc\" DevicePath \"\""
Mar 09 14:31:05 crc kubenswrapper[4764]: I0309 14:31:05.157111 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/30f041da-e1c7-4f92-b9b0-7c7b0495ecb6-kube-api-access-275qw" (OuterVolumeSpecName: "kube-api-access-275qw") pod "30f041da-e1c7-4f92-b9b0-7c7b0495ecb6" (UID: "30f041da-e1c7-4f92-b9b0-7c7b0495ecb6"). InnerVolumeSpecName "kube-api-access-275qw". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 14:31:05 crc kubenswrapper[4764]: I0309 14:31:05.250679 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-275qw\" (UniqueName: \"kubernetes.io/projected/30f041da-e1c7-4f92-b9b0-7c7b0495ecb6-kube-api-access-275qw\") on node \"crc\" DevicePath \"\""
Mar 09 14:31:05 crc kubenswrapper[4764]: I0309 14:31:05.574755 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="30f041da-e1c7-4f92-b9b0-7c7b0495ecb6" path="/var/lib/kubelet/pods/30f041da-e1c7-4f92-b9b0-7c7b0495ecb6/volumes"
Mar 09 14:31:05 crc kubenswrapper[4764]: I0309 14:31:05.960174 4764 scope.go:117] "RemoveContainer" containerID="3e7d6c7da6785672953242f3e84bfab6bab032496236e65977dca3d2ae3192cd"
Mar 09 14:31:05 crc kubenswrapper[4764]: I0309 14:31:05.960510 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-2n85d/crc-debug-x4r2v"
Mar 09 14:31:36 crc kubenswrapper[4764]: I0309 14:31:36.010539 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-94887676d-fp9dl_b8389bcb-fcb2-48b4-a1c2-3ae7427ecc19/barbican-api/0.log"
Mar 09 14:31:36 crc kubenswrapper[4764]: I0309 14:31:36.901408 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-7c6f54974-hws5g_154490f8-97ab-4703-a96c-16b6d5f7a178/barbican-keystone-listener/0.log"
Mar 09 14:31:36 crc kubenswrapper[4764]: I0309 14:31:36.925118 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-94887676d-fp9dl_b8389bcb-fcb2-48b4-a1c2-3ae7427ecc19/barbican-api-log/0.log"
Mar 09 14:31:37 crc kubenswrapper[4764]: I0309 14:31:37.198676 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-789c56cf69-2dj2c_a18071d3-1164-4080-9095-919bb5349bb8/barbican-worker/0.log"
Mar 09 14:31:37 crc kubenswrapper[4764]: I0309 14:31:37.260138 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-789c56cf69-2dj2c_a18071d3-1164-4080-9095-919bb5349bb8/barbican-worker-log/0.log"
Mar 09 14:31:37 crc kubenswrapper[4764]: I0309 14:31:37.280161 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-7c6f54974-hws5g_154490f8-97ab-4703-a96c-16b6d5f7a178/barbican-keystone-listener-log/0.log"
Mar 09 14:31:37 crc kubenswrapper[4764]: I0309 14:31:37.514561 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-7t2tk_7bd401e1-1592-4b49-8eb2-b6dcba296b36/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log"
Mar 09 14:31:37 crc kubenswrapper[4764]: I0309 14:31:37.526679 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_182886b1-5569-456a-aa1e-129021e95bfe/ceilometer-central-agent/0.log"
Mar 09 14:31:37 crc kubenswrapper[4764]: I0309 14:31:37.746861 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_182886b1-5569-456a-aa1e-129021e95bfe/ceilometer-notification-agent/0.log"
Mar 09 14:31:37 crc kubenswrapper[4764]: I0309 14:31:37.804245 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_182886b1-5569-456a-aa1e-129021e95bfe/sg-core/0.log"
Mar 09 14:31:37 crc kubenswrapper[4764]: I0309 14:31:37.807082 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_182886b1-5569-456a-aa1e-129021e95bfe/proxy-httpd/0.log"
Mar 09 14:31:37 crc kubenswrapper[4764]: I0309 14:31:37.978780 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceph-client-edpm-deployment-openstack-edpm-ipam-lc4lp_e8ac27d6-e52e-4d38-b772-6ada493e746f/ceph-client-edpm-deployment-openstack-edpm-ipam/0.log"
Mar 09 14:31:38 crc kubenswrapper[4764]: I0309 14:31:38.106096 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-4njtv_cbffc6a1-81df-479c-b40e-3f865c187a73/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam/0.log"
Mar 09 14:31:38 crc kubenswrapper[4764]: I0309 14:31:38.282142 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_9cf43ab7-e625-4ffa-9af4-9f810a43d270/cinder-api/0.log"
Mar 09 14:31:38 crc kubenswrapper[4764]: I0309 14:31:38.297784 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_9cf43ab7-e625-4ffa-9af4-9f810a43d270/cinder-api-log/0.log"
Mar 09 14:31:38 crc kubenswrapper[4764]: I0309 14:31:38.617259 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-backup-0_e6a8f674-82eb-4474-973d-54a90e5fd1e0/cinder-backup/0.log"
Mar 09 14:31:38 crc kubenswrapper[4764]: I0309 14:31:38.623288 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-backup-0_e6a8f674-82eb-4474-973d-54a90e5fd1e0/probe/0.log"
Mar 09 14:31:38 crc kubenswrapper[4764]: I0309 14:31:38.695518 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_05d58314-31c8-4b6a-8c8c-1dc211d9f424/cinder-scheduler/0.log"
Mar 09 14:31:39 crc kubenswrapper[4764]: I0309 14:31:39.429962 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_05d58314-31c8-4b6a-8c8c-1dc211d9f424/probe/0.log"
Mar 09 14:31:39 crc kubenswrapper[4764]: I0309 14:31:39.489802 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-volume-volume1-0_9aaa370f-a3d5-4fce-9761-873aeb8d7b1f/cinder-volume/0.log"
Mar 09 14:31:39 crc kubenswrapper[4764]: I0309 14:31:39.531823 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-volume-volume1-0_9aaa370f-a3d5-4fce-9761-873aeb8d7b1f/probe/0.log"
Mar 09 14:31:39 crc kubenswrapper[4764]: I0309 14:31:39.975626 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-trkg2_5141dc8e-4ab7-4488-8ad4-8af7f8c66dcd/configure-network-edpm-deployment-openstack-edpm-ipam/0.log"
Mar 09 14:31:40 crc kubenswrapper[4764]: I0309 14:31:40.162008 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-5b2nl_ea3a2b04-e009-4dcd-8eca-543cc084b329/configure-os-edpm-deployment-openstack-edpm-ipam/0.log"
Mar 09 14:31:40 crc kubenswrapper[4764]: I0309 14:31:40.342597 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-69655fd4bf-qlzqn_1552c7db-c992-4b43-8f1e-2b752d718f36/init/0.log"
Mar 09 14:31:40 crc kubenswrapper[4764]: I0309 14:31:40.576202 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-69655fd4bf-qlzqn_1552c7db-c992-4b43-8f1e-2b752d718f36/init/0.log"
Mar 09 14:31:40 crc kubenswrapper[4764]: I0309 14:31:40.604002 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-69655fd4bf-qlzqn_1552c7db-c992-4b43-8f1e-2b752d718f36/dnsmasq-dns/0.log"
Mar 09 14:31:40 crc kubenswrapper[4764]: I0309 14:31:40.636322 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_22563404-fb5a-4d95-bae1-dd24d6fcc8d1/glance-httpd/0.log"
Mar 09 14:31:40 crc kubenswrapper[4764]: I0309 14:31:40.761215 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_22563404-fb5a-4d95-bae1-dd24d6fcc8d1/glance-log/0.log"
Mar 09 14:31:40 crc kubenswrapper[4764]: I0309 14:31:40.858280 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_66d58a1b-5d94-4d28-bcb3-0b20f0516eab/glance-httpd/0.log"
Mar 09 14:31:40 crc kubenswrapper[4764]: I0309 14:31:40.862502 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_66d58a1b-5d94-4d28-bcb3-0b20f0516eab/glance-log/0.log"
Mar 09 14:31:41 crc kubenswrapper[4764]: I0309 14:31:41.151500 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-5qt8s_949d7512-b3be-4068-b05a-20589fbc2b52/install-certs-edpm-deployment-openstack-edpm-ipam/0.log"
Mar 09 14:31:41 crc kubenswrapper[4764]: I0309 14:31:41.156879 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-56bb55c768-vchmw_47ef29f6-4627-4b84-968d-db9d7ed438da/horizon/0.log"
Mar 09 14:31:41 crc kubenswrapper[4764]: I0309 14:31:41.264958 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-56bb55c768-vchmw_47ef29f6-4627-4b84-968d-db9d7ed438da/horizon-log/0.log"
Mar 09 14:31:41 crc kubenswrapper[4764]: I0309 14:31:41.434814 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-bhp5p_2d2ddcdd-77bf-4dc5-8170-02d297378dcb/install-os-edpm-deployment-openstack-edpm-ipam/0.log"
Mar 09 14:31:41 crc kubenswrapper[4764]: I0309 14:31:41.609231 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29551081-wz9hv_6ec256c5-cf20-4b12-bb84-0f5d3e02460a/keystone-cron/0.log"
Mar 09 14:31:41 crc kubenswrapper[4764]: I0309 14:31:41.688948 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_179736ec-4215-4ad8-9800-a186978a767f/kube-state-metrics/0.log"
Mar 09 14:31:41 crc kubenswrapper[4764]: I0309 14:31:41.898147 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-9pl98_0a9ed7f5-c296-41ac-ae0d-5845c66a385a/libvirt-edpm-deployment-openstack-edpm-ipam/0.log"
Mar 09 14:31:42 crc kubenswrapper[4764]: I0309 14:31:42.439006 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-759c9c64fb-nwls6_48b871c4-f2e8-44e9-9268-54920414c084/keystone-api/0.log"
Mar 09 14:31:42 crc kubenswrapper[4764]: I0309 14:31:42.527675 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-scheduler-0_492d78a8-09ea-4239-a53f-b8d0480fcf36/probe/0.log"
Mar 09 14:31:42 crc kubenswrapper[4764]: I0309 14:31:42.641073 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-api-0_d2c7cfde-f7f3-47d1-ac62-f35f98d5b62d/manila-api/0.log"
Mar 09 14:31:42 crc kubenswrapper[4764]: I0309 14:31:42.667340 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-scheduler-0_492d78a8-09ea-4239-a53f-b8d0480fcf36/manila-scheduler/0.log"
Mar 09 14:31:42 crc kubenswrapper[4764]: I0309 14:31:42.935448 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-share-share1-0_2caafd00-b539-4f40-b1c6-af6957bcb458/probe/0.log"
Mar 09 14:31:43 crc kubenswrapper[4764]: I0309 14:31:43.112288 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-share-share1-0_2caafd00-b539-4f40-b1c6-af6957bcb458/manila-share/0.log"
Mar 09 14:31:43 crc kubenswrapper[4764]: I0309 14:31:43.345268 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-api-0_d2c7cfde-f7f3-47d1-ac62-f35f98d5b62d/manila-api-log/0.log"
Mar 09 14:31:43 crc kubenswrapper[4764]: I0309 14:31:43.367088 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-6b7bfdfd5-56dnz_fd7dadfc-b8e4-479f-8880-4ffeec051d30/neutron-api/0.log"
Mar 09 14:31:43 crc kubenswrapper[4764]: I0309 14:31:43.440187 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-6b7bfdfd5-56dnz_fd7dadfc-b8e4-479f-8880-4ffeec051d30/neutron-httpd/0.log"
Mar 09 14:31:43 crc kubenswrapper[4764]: I0309 14:31:43.592276 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-zd8bj_8a38f1e2-ce88-47d9-883d-4d95c781d181/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log"
Mar 09 14:31:43 crc kubenswrapper[4764]: I0309 14:31:43.982950 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_ec790643-05dd-4f21-82f8-ad1586087d85/nova-api-log/0.log"
Mar 09 14:31:44 crc kubenswrapper[4764]: I0309 14:31:44.123673 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_64bc45ce-7cc3-4d3a-97d7-9e73bfcb4fe9/nova-cell0-conductor-conductor/0.log"
Mar 09 14:31:44 crc kubenswrapper[4764]: I0309 14:31:44.255369 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_ec790643-05dd-4f21-82f8-ad1586087d85/nova-api-api/0.log"
Mar 09 14:31:44 crc kubenswrapper[4764]: I0309 14:31:44.327925 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_959eb23f-c4b4-4f35-b284-38212848a084/nova-cell1-conductor-conductor/0.log"
Mar 09 14:31:44 crc kubenswrapper[4764]: I0309 14:31:44.532434 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_6932dd15-578a-4965-bcb9-b506d4e3cd2f/nova-cell1-novncproxy-novncproxy/0.log"
Mar 09 14:31:44 crc kubenswrapper[4764]: I0309 14:31:44.672714 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-5kztq_eab144b6-e27c-4ffc-9dd5-6236ca12719f/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam/0.log"
Mar 09 14:31:44 crc kubenswrapper[4764]: I0309 14:31:44.883524 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_d9226790-b0dc-460b-8c06-127effde8c19/nova-metadata-log/0.log"
Mar 09 14:31:45 crc kubenswrapper[4764]: I0309 14:31:45.140877 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_7d26ba33-e370-4bc8-bb15-b727c0c9c97f/nova-scheduler-scheduler/0.log"
Mar 09 14:31:45 crc kubenswrapper[4764]: I0309 14:31:45.263397 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_103cd40b-aa84-4973-8e47-8a67e5994c80/mysql-bootstrap/0.log"
Mar 09 14:31:45 crc kubenswrapper[4764]: I0309 14:31:45.467191 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_103cd40b-aa84-4973-8e47-8a67e5994c80/galera/0.log"
Mar 09 14:31:45 crc kubenswrapper[4764]: I0309 14:31:45.473418 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_103cd40b-aa84-4973-8e47-8a67e5994c80/mysql-bootstrap/0.log"
Mar 09 14:31:45 crc kubenswrapper[4764]: I0309 14:31:45.746893 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_0c87ed75-4285-4084-bce3-ee8dba7671c0/mysql-bootstrap/0.log"
Mar 09 14:31:45 crc kubenswrapper[4764]: I0309 14:31:45.919388 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_0c87ed75-4285-4084-bce3-ee8dba7671c0/mysql-bootstrap/0.log"
Mar 09 14:31:45 crc kubenswrapper[4764]: I0309 14:31:45.962100 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_0c87ed75-4285-4084-bce3-ee8dba7671c0/galera/0.log"
Mar 09 14:31:46 crc kubenswrapper[4764]: I0309 14:31:46.107806 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_d82ed357-9f4c-478b-b893-ab6ff10fc83c/openstackclient/0.log"
Mar 09 14:31:46 crc kubenswrapper[4764]: I0309 14:31:46.295505 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-8ctgr_4db14a6b-d372-48be-86a1-bf651618b4a4/openstack-network-exporter/0.log"
Mar 09 14:31:46 crc kubenswrapper[4764]: I0309 14:31:46.474369 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-2zkzm_05f9485e-b683-481d-87d3-fb86ebb4a832/ovsdb-server-init/0.log"
Mar 09 14:31:46 crc kubenswrapper[4764]: I0309 14:31:46.608802 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_d9226790-b0dc-460b-8c06-127effde8c19/nova-metadata-metadata/0.log"
Mar 09 14:31:46 crc kubenswrapper[4764]: I0309 14:31:46.675168 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-2zkzm_05f9485e-b683-481d-87d3-fb86ebb4a832/ovs-vswitchd/0.log"
Mar 09 14:31:46 crc kubenswrapper[4764]: I0309 14:31:46.689710 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-2zkzm_05f9485e-b683-481d-87d3-fb86ebb4a832/ovsdb-server-init/0.log"
Mar 09 14:31:46 crc kubenswrapper[4764]: I0309 14:31:46.729293 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-2zkzm_05f9485e-b683-481d-87d3-fb86ebb4a832/ovsdb-server/0.log"
Mar 09 14:31:46 crc kubenswrapper[4764]: I0309 14:31:46.902743 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-qm7vs_9bbe03cf-76d5-440a-903f-50c382aa3a4e/ovn-controller/0.log"
Mar 09 14:31:47 crc kubenswrapper[4764]: I0309 14:31:47.098800 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-bkvx8_ede2526d-593a-4258-9ec2-172270be638a/ovn-edpm-deployment-openstack-edpm-ipam/0.log"
Mar 09 14:31:47 crc kubenswrapper[4764]: I0309 14:31:47.152088 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_142a1ef0-f024-4a81-85de-72435cd72d9e/openstack-network-exporter/0.log"
Mar 09 14:31:47 crc kubenswrapper[4764]: I0309 14:31:47.336772 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_142a1ef0-f024-4a81-85de-72435cd72d9e/ovn-northd/0.log"
Mar 09 14:31:47 crc kubenswrapper[4764]: I0309 14:31:47.394281 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_e54bd06b-1ee2-452d-80fb-12fd4fb61c7b/openstack-network-exporter/0.log"
Mar 09 14:31:47 crc kubenswrapper[4764]: I0309 14:31:47.551886 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_e54bd06b-1ee2-452d-80fb-12fd4fb61c7b/ovsdbserver-nb/0.log"
Mar 09 14:31:47 crc kubenswrapper[4764]: I0309 14:31:47.651631 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_047aa387-9e35-4ec6-89a9-3be60e47610b/openstack-network-exporter/0.log"
Mar 09 14:31:47 crc kubenswrapper[4764]: I0309 14:31:47.689329 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_047aa387-9e35-4ec6-89a9-3be60e47610b/ovsdbserver-sb/0.log"
Mar 09 14:31:48 crc kubenswrapper[4764]: I0309 14:31:48.013508 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-f85c59cb-gm4df_2a26a533-a42a-4553-96b3-922ad860ca7a/placement-log/0.log"
Mar 09 14:31:48 crc kubenswrapper[4764]: I0309 14:31:48.064335 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-f85c59cb-gm4df_2a26a533-a42a-4553-96b3-922ad860ca7a/placement-api/0.log"
Mar 09 14:31:48 crc kubenswrapper[4764]: I0309 14:31:48.191227 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_c5579dd7-5380-4042-8c78-c6837d841d5e/setup-container/0.log"
Mar 09 14:31:48 crc kubenswrapper[4764]: I0309 14:31:48.399850 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_c5579dd7-5380-4042-8c78-c6837d841d5e/setup-container/0.log"
Mar 09 14:31:48 crc kubenswrapper[4764]: I0309 14:31:48.422992 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_c5579dd7-5380-4042-8c78-c6837d841d5e/rabbitmq/0.log"
Mar 09 14:31:48 crc kubenswrapper[4764]: I0309 14:31:48.463604 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_b19144b6-cc4c-41d6-ad2e-409c021f657c/setup-container/0.log"
Mar 09 14:31:49 crc kubenswrapper[4764]: I0309 14:31:49.113942 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_b19144b6-cc4c-41d6-ad2e-409c021f657c/setup-container/0.log"
Mar 09 14:31:49 crc kubenswrapper[4764]: I0309 14:31:49.171884 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-6hxsg_6ee5c8cc-9f2b-42f8-aed5-37c3540bd300/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log"
Mar 09 14:31:49 crc kubenswrapper[4764]: I0309 14:31:49.256998 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_b19144b6-cc4c-41d6-ad2e-409c021f657c/rabbitmq/0.log"
Mar 09 14:31:49 crc kubenswrapper[4764]: I0309 14:31:49.492406 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-d46h5_de1bf125-47e1-499c-9cfe-ffbd5c03d194/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log"
Mar 09 14:31:49 crc kubenswrapper[4764]: I0309 14:31:49.587573 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-c4rw6_942b7017-cdda-4d7a-8be8-521111f4fcd1/run-os-edpm-deployment-openstack-edpm-ipam/0.log"
Mar 09 14:31:49 crc kubenswrapper[4764]: I0309 14:31:49.704322 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-x85q2_23319545-4107-4a83-b7e1-955e4648bf7b/ssh-known-hosts-edpm-deployment/0.log"
Mar 09 14:31:49 crc kubenswrapper[4764]: I0309 14:31:49.939340 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest_5db22a0e-ee1a-4b26-9e49-b26644266834/tempest-tests-tempest-tests-runner/0.log"
Mar 09 14:31:50 crc kubenswrapper[4764]: I0309 14:31:50.040818 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_3b233056-629a-4653-8726-76e6b231e58b/test-operator-logs-container/0.log"
Mar 09 14:31:50 crc kubenswrapper[4764]: I0309 14:31:50.234598 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-jhs5m_93b0ad6c-7720-4b43-b65c-83b7b7a8c3ab/validate-network-edpm-deployment-openstack-edpm-ipam/0.log"
Mar 09 14:32:00 crc kubenswrapper[4764]: I0309 14:32:00.162124 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29551112-lcwnw"]
Mar 09 14:32:00 crc kubenswrapper[4764]: E0309 14:32:00.163661 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="30f041da-e1c7-4f92-b9b0-7c7b0495ecb6" containerName="container-00"
Mar 09 14:32:00 crc kubenswrapper[4764]: I0309 14:32:00.163680 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="30f041da-e1c7-4f92-b9b0-7c7b0495ecb6" containerName="container-00"
Mar 09 14:32:00 crc kubenswrapper[4764]: I0309 14:32:00.164840 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="30f041da-e1c7-4f92-b9b0-7c7b0495ecb6" containerName="container-00"
Mar 09 14:32:00 crc kubenswrapper[4764]: I0309 14:32:00.165826 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551112-lcwnw"
Mar 09 14:32:00 crc kubenswrapper[4764]: I0309 14:32:00.172936 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-4s7mr"
Mar 09 14:32:00 crc kubenswrapper[4764]: I0309 14:32:00.173114 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 09 14:32:00 crc kubenswrapper[4764]: I0309 14:32:00.180911 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 09 14:32:00 crc kubenswrapper[4764]: I0309 14:32:00.194813 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551112-lcwnw"]
Mar 09 14:32:00 crc kubenswrapper[4764]: I0309 14:32:00.282084 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qvspg\" (UniqueName: \"kubernetes.io/projected/3078e21d-b42c-45f0-94c0-d3980ec27f1f-kube-api-access-qvspg\") pod \"auto-csr-approver-29551112-lcwnw\" (UID: \"3078e21d-b42c-45f0-94c0-d3980ec27f1f\") " pod="openshift-infra/auto-csr-approver-29551112-lcwnw"
Mar 09 14:32:00 crc kubenswrapper[4764]: I0309 14:32:00.384364 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qvspg\" (UniqueName: \"kubernetes.io/projected/3078e21d-b42c-45f0-94c0-d3980ec27f1f-kube-api-access-qvspg\") pod \"auto-csr-approver-29551112-lcwnw\" (UID: \"3078e21d-b42c-45f0-94c0-d3980ec27f1f\") " pod="openshift-infra/auto-csr-approver-29551112-lcwnw"
Mar 09 14:32:00 crc kubenswrapper[4764]: I0309 14:32:00.593816 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qvspg\" (UniqueName: \"kubernetes.io/projected/3078e21d-b42c-45f0-94c0-d3980ec27f1f-kube-api-access-qvspg\") pod \"auto-csr-approver-29551112-lcwnw\" (UID: \"3078e21d-b42c-45f0-94c0-d3980ec27f1f\") " pod="openshift-infra/auto-csr-approver-29551112-lcwnw"
Mar 09 14:32:00 crc kubenswrapper[4764]: I0309 14:32:00.808040 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551112-lcwnw"
Mar 09 14:32:01 crc kubenswrapper[4764]: I0309 14:32:01.372212 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551112-lcwnw"]
Mar 09 14:32:01 crc kubenswrapper[4764]: I0309 14:32:01.610148 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551112-lcwnw" event={"ID":"3078e21d-b42c-45f0-94c0-d3980ec27f1f","Type":"ContainerStarted","Data":"7d6114e8bf85a2fac5ce54e9eb8aceb55e8157c40da54ae189ea5e91258cd946"}
Mar 09 14:32:04 crc kubenswrapper[4764]: I0309 14:32:04.651133 4764 generic.go:334] "Generic (PLEG): container finished" podID="3078e21d-b42c-45f0-94c0-d3980ec27f1f" containerID="6874ccb9508c9920d5940aa912a69fc71adf07efd3d1ffdbd80a9373286c8c70" exitCode=0
Mar 09 14:32:04 crc kubenswrapper[4764]: I0309 14:32:04.651249 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551112-lcwnw" event={"ID":"3078e21d-b42c-45f0-94c0-d3980ec27f1f","Type":"ContainerDied","Data":"6874ccb9508c9920d5940aa912a69fc71adf07efd3d1ffdbd80a9373286c8c70"}
Mar 09 14:32:06 crc kubenswrapper[4764]: I0309 14:32:06.111204 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551112-lcwnw"
Mar 09 14:32:06 crc kubenswrapper[4764]: I0309 14:32:06.266238 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qvspg\" (UniqueName: \"kubernetes.io/projected/3078e21d-b42c-45f0-94c0-d3980ec27f1f-kube-api-access-qvspg\") pod \"3078e21d-b42c-45f0-94c0-d3980ec27f1f\" (UID: \"3078e21d-b42c-45f0-94c0-d3980ec27f1f\") "
Mar 09 14:32:06 crc kubenswrapper[4764]: I0309 14:32:06.274400 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3078e21d-b42c-45f0-94c0-d3980ec27f1f-kube-api-access-qvspg" (OuterVolumeSpecName: "kube-api-access-qvspg") pod "3078e21d-b42c-45f0-94c0-d3980ec27f1f" (UID: "3078e21d-b42c-45f0-94c0-d3980ec27f1f"). InnerVolumeSpecName "kube-api-access-qvspg". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 14:32:06 crc kubenswrapper[4764]: I0309 14:32:06.371574 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qvspg\" (UniqueName: \"kubernetes.io/projected/3078e21d-b42c-45f0-94c0-d3980ec27f1f-kube-api-access-qvspg\") on node \"crc\" DevicePath \"\""
Mar 09 14:32:06 crc kubenswrapper[4764]: I0309 14:32:06.716186 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551112-lcwnw" event={"ID":"3078e21d-b42c-45f0-94c0-d3980ec27f1f","Type":"ContainerDied","Data":"7d6114e8bf85a2fac5ce54e9eb8aceb55e8157c40da54ae189ea5e91258cd946"}
Mar 09 14:32:06 crc kubenswrapper[4764]: I0309 14:32:06.716251 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7d6114e8bf85a2fac5ce54e9eb8aceb55e8157c40da54ae189ea5e91258cd946"
Mar 09 14:32:06 crc kubenswrapper[4764]: I0309 14:32:06.716346 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551112-lcwnw" Mar 09 14:32:07 crc kubenswrapper[4764]: I0309 14:32:07.229806 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29551106-dnhln"] Mar 09 14:32:07 crc kubenswrapper[4764]: I0309 14:32:07.244213 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29551106-dnhln"] Mar 09 14:32:07 crc kubenswrapper[4764]: I0309 14:32:07.572907 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d33fc9b0-e440-4f1b-9522-1abec06eca2a" path="/var/lib/kubelet/pods/d33fc9b0-e440-4f1b-9522-1abec06eca2a/volumes" Mar 09 14:32:07 crc kubenswrapper[4764]: I0309 14:32:07.609702 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_519ac270-ea24-47c1-b4f3-d94b0add96d1/memcached/0.log" Mar 09 14:32:12 crc kubenswrapper[4764]: I0309 14:32:12.253634 4764 scope.go:117] "RemoveContainer" containerID="8f9d999651aa510edfbd48cd8433fc728bf24d4c32451ce578a41c04f581a328" Mar 09 14:32:23 crc kubenswrapper[4764]: I0309 14:32:23.044888 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_bb5e27b1881e557677c06c4f4def14a4e9a93f970d90508463799e20a9xpql5_3f89e888-fc0d-48c0-ad4c-978e058ffebd/util/0.log" Mar 09 14:32:23 crc kubenswrapper[4764]: I0309 14:32:23.307228 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_bb5e27b1881e557677c06c4f4def14a4e9a93f970d90508463799e20a9xpql5_3f89e888-fc0d-48c0-ad4c-978e058ffebd/pull/0.log" Mar 09 14:32:23 crc kubenswrapper[4764]: I0309 14:32:23.352898 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_bb5e27b1881e557677c06c4f4def14a4e9a93f970d90508463799e20a9xpql5_3f89e888-fc0d-48c0-ad4c-978e058ffebd/util/0.log" Mar 09 14:32:23 crc kubenswrapper[4764]: I0309 14:32:23.388266 4764 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_bb5e27b1881e557677c06c4f4def14a4e9a93f970d90508463799e20a9xpql5_3f89e888-fc0d-48c0-ad4c-978e058ffebd/pull/0.log" Mar 09 14:32:23 crc kubenswrapper[4764]: I0309 14:32:23.584738 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_bb5e27b1881e557677c06c4f4def14a4e9a93f970d90508463799e20a9xpql5_3f89e888-fc0d-48c0-ad4c-978e058ffebd/pull/0.log" Mar 09 14:32:23 crc kubenswrapper[4764]: I0309 14:32:23.626984 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_bb5e27b1881e557677c06c4f4def14a4e9a93f970d90508463799e20a9xpql5_3f89e888-fc0d-48c0-ad4c-978e058ffebd/util/0.log" Mar 09 14:32:23 crc kubenswrapper[4764]: I0309 14:32:23.689084 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_bb5e27b1881e557677c06c4f4def14a4e9a93f970d90508463799e20a9xpql5_3f89e888-fc0d-48c0-ad4c-978e058ffebd/extract/0.log" Mar 09 14:32:24 crc kubenswrapper[4764]: I0309 14:32:24.733833 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-5d87c9d997-cmtpc_725c0dd0-07d1-4a1c-b223-e8bec76cc7ff/manager/0.log" Mar 09 14:32:25 crc kubenswrapper[4764]: I0309 14:32:25.151952 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-64db6967f8-mjf6m_488ff419-d889-4778-96cf-a11006c49507/manager/0.log" Mar 09 14:32:25 crc kubenswrapper[4764]: I0309 14:32:25.331219 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-cf99c678f-jnmbv_7295db10-1c36-4c17-bf1e-4c4a702c201b/manager/0.log" Mar 09 14:32:25 crc kubenswrapper[4764]: I0309 14:32:25.624774 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-78bc7f9bd9-5xc2s_3da43711-be34-4189-b686-e8e9bc9e7265/manager/0.log" Mar 09 14:32:26 crc kubenswrapper[4764]: I0309 
14:32:26.245765 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-545456dc4-hvpbz_5bd11a3c-79f3-4aa2-9d5a-78ec3c18cb31/manager/0.log" Mar 09 14:32:26 crc kubenswrapper[4764]: I0309 14:32:26.519317 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-5995f4446f-m58s9_bfda7896-83e3-407c-9eb5-74fbc11104f0/manager/0.log" Mar 09 14:32:26 crc kubenswrapper[4764]: I0309 14:32:26.745123 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-55d77d7b5c-nppjq_4c271ca0-0c25-46d1-b730-e94f68397e29/manager/0.log" Mar 09 14:32:27 crc kubenswrapper[4764]: I0309 14:32:27.265495 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-7c789f89c6-wv2rp_32eb5815-c566-4177-8b47-f756807d4a30/manager/0.log" Mar 09 14:32:27 crc kubenswrapper[4764]: I0309 14:32:27.393103 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-7c7bcbc569-qhpvs_5cd7eb92-2fae-4978-a5e9-58fa87c63e84/manager/0.log" Mar 09 14:32:27 crc kubenswrapper[4764]: I0309 14:32:27.580720 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-7b6bfb6475-vkns5_da851ddd-2b27-45f0-b149-de32ae21ad91/manager/0.log" Mar 09 14:32:27 crc kubenswrapper[4764]: I0309 14:32:27.771231 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-54688575f-cgv66_2ddf1e89-9c89-4052-aa1b-6fb84438b86d/manager/0.log" Mar 09 14:32:27 crc kubenswrapper[4764]: I0309 14:32:27.903282 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-74b6b5dc96-dm7rn_26535a82-8d70-4623-b2b4-7dd1546d48d6/manager/0.log" Mar 09 14:32:28 crc 
kubenswrapper[4764]: I0309 14:32:28.053601 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-5d86c7ddb7-gv2sm_b54e2237-603a-44ad-a129-04736cf749b2/manager/0.log" Mar 09 14:32:28 crc kubenswrapper[4764]: I0309 14:32:28.149941 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-7c6767dc9c94wvm_47bd7072-a414-4ce8-800b-753b7054be23/manager/0.log" Mar 09 14:32:28 crc kubenswrapper[4764]: I0309 14:32:28.370390 4764 patch_prober.go:28] interesting pod/machine-config-daemon-xxczl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 09 14:32:28 crc kubenswrapper[4764]: I0309 14:32:28.370473 4764 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 09 14:32:28 crc kubenswrapper[4764]: I0309 14:32:28.540790 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-init-6754b7f846-ns9zn_67c57635-59f1-48a2-9823-c86732eabbf6/operator/0.log" Mar 09 14:32:28 crc kubenswrapper[4764]: I0309 14:32:28.624999 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-lvrg9_8e6c087a-8aaa-427c-822b-a274e19cc440/registry-server/0.log" Mar 09 14:32:28 crc kubenswrapper[4764]: I0309 14:32:28.913364 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-75684d597f-jfgzw_615473d3-072e-4685-8f32-73a44badf1e2/manager/0.log" Mar 09 14:32:28 crc 
kubenswrapper[4764]: I0309 14:32:28.940257 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-648564c9fc-8ms5w_c44e76b2-0de9-4a5b-93ee-536c6300157f/manager/0.log" Mar 09 14:32:29 crc kubenswrapper[4764]: I0309 14:32:29.288827 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-6v2sq_01ea99aa-eb21-4799-9557-42c3fb55945a/operator/0.log" Mar 09 14:32:29 crc kubenswrapper[4764]: I0309 14:32:29.389102 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-9b9ff9f4d-bf8w8_003210d3-5572-44bd-aae5-d5e24aac16a5/manager/0.log" Mar 09 14:32:29 crc kubenswrapper[4764]: I0309 14:32:29.696022 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-5fdb694969-4cpsz_c6a51c61-770b-4b46-9dd3-1f61e6f5ebc8/manager/0.log" Mar 09 14:32:29 crc kubenswrapper[4764]: I0309 14:32:29.766173 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-55b5ff4dbb-d65xp_867908a2-f085-4f3d-b569-84c915f730b1/manager/0.log" Mar 09 14:32:29 crc kubenswrapper[4764]: I0309 14:32:29.973321 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-bccc79885-7f8nr_f705ec78-e960-4200-b5a6-f3d4310f1bd5/manager/0.log" Mar 09 14:32:30 crc kubenswrapper[4764]: I0309 14:32:30.608935 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-6746d697b-lr6nx_e11f44d8-58a5-4fc7-b05b-e2e688647d01/manager/0.log" Mar 09 14:32:34 crc kubenswrapper[4764]: I0309 14:32:34.306442 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-6db6876945-82cg8_e220a3f1-4dbe-4ee6-9b19-26985fa998cf/manager/0.log" Mar 
09 14:32:54 crc kubenswrapper[4764]: I0309 14:32:54.525064 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-9k28f_4ba55602-0e3f-4722-b437-546732351bc4/control-plane-machine-set-operator/0.log" Mar 09 14:32:54 crc kubenswrapper[4764]: I0309 14:32:54.776547 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-t8ft9_4125448d-5832-43c2-8dba-d95adde7458a/kube-rbac-proxy/0.log" Mar 09 14:32:54 crc kubenswrapper[4764]: I0309 14:32:54.776846 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-t8ft9_4125448d-5832-43c2-8dba-d95adde7458a/machine-api-operator/0.log" Mar 09 14:32:58 crc kubenswrapper[4764]: I0309 14:32:58.370781 4764 patch_prober.go:28] interesting pod/machine-config-daemon-xxczl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 09 14:32:58 crc kubenswrapper[4764]: I0309 14:32:58.371736 4764 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 09 14:33:10 crc kubenswrapper[4764]: I0309 14:33:10.823750 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-858654f9db-lhpfw_2eef62f2-5973-47e2-b921-9e1a05b9f8fb/cert-manager-controller/0.log" Mar 09 14:33:11 crc kubenswrapper[4764]: I0309 14:33:11.485510 4764 log.go:25] "Finished parsing log file" 
path="/var/log/pods/cert-manager_cert-manager-cainjector-cf98fcc89-qr7fv_0a35f012-3965-4680-aa01-9fa97f956c68/cert-manager-cainjector/0.log" Mar 09 14:33:11 crc kubenswrapper[4764]: I0309 14:33:11.655431 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-687f57d79b-j8rlp_09aeffa2-590d-4062-95ff-40dbdda54df7/cert-manager-webhook/0.log" Mar 09 14:33:27 crc kubenswrapper[4764]: I0309 14:33:27.493788 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-5dcbbd79cf-qpjz4_fc521772-06d5-47ec-85d0-6162bb98af30/nmstate-console-plugin/0.log" Mar 09 14:33:27 crc kubenswrapper[4764]: I0309 14:33:27.808469 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-sl5hn_6dc5759c-db8c-4025-bc16-a07e4dc6278a/nmstate-handler/0.log" Mar 09 14:33:28 crc kubenswrapper[4764]: I0309 14:33:28.179816 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-69594cc75-jschs_abca721f-d47f-4e38-ab9e-0832de2c70e6/nmstate-metrics/0.log" Mar 09 14:33:28 crc kubenswrapper[4764]: I0309 14:33:28.222304 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-69594cc75-jschs_abca721f-d47f-4e38-ab9e-0832de2c70e6/kube-rbac-proxy/0.log" Mar 09 14:33:28 crc kubenswrapper[4764]: I0309 14:33:28.333320 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-75c5dccd6c-q5kvf_33e9b814-6368-46c6-aae2-5a3df1839d29/nmstate-operator/0.log" Mar 09 14:33:28 crc kubenswrapper[4764]: I0309 14:33:28.397694 4764 patch_prober.go:28] interesting pod/machine-config-daemon-xxczl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 09 14:33:28 crc kubenswrapper[4764]: I0309 14:33:28.398069 4764 
prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 09 14:33:28 crc kubenswrapper[4764]: I0309 14:33:28.398221 4764 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" Mar 09 14:33:28 crc kubenswrapper[4764]: I0309 14:33:28.399302 4764 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"8dbfa76f047a7530b5728e4d5ecc2633e67095c7831f8ebe288b219074642ce2"} pod="openshift-machine-config-operator/machine-config-daemon-xxczl" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 09 14:33:28 crc kubenswrapper[4764]: I0309 14:33:28.399449 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922" containerName="machine-config-daemon" containerID="cri-o://8dbfa76f047a7530b5728e4d5ecc2633e67095c7831f8ebe288b219074642ce2" gracePeriod=600 Mar 09 14:33:28 crc kubenswrapper[4764]: I0309 14:33:28.492839 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-786f45cff4-wv755_f339e495-f347-45b8-b9da-2cd832ac4300/nmstate-webhook/0.log" Mar 09 14:33:28 crc kubenswrapper[4764]: E0309 14:33:28.523754 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xxczl_openshift-machine-config-operator(6bcdd179-43c2-427c-9fac-7155c122e922)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922" Mar 09 14:33:28 crc kubenswrapper[4764]: I0309 14:33:28.562655 4764 generic.go:334] "Generic (PLEG): container finished" podID="6bcdd179-43c2-427c-9fac-7155c122e922" containerID="8dbfa76f047a7530b5728e4d5ecc2633e67095c7831f8ebe288b219074642ce2" exitCode=0 Mar 09 14:33:28 crc kubenswrapper[4764]: I0309 14:33:28.562729 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" event={"ID":"6bcdd179-43c2-427c-9fac-7155c122e922","Type":"ContainerDied","Data":"8dbfa76f047a7530b5728e4d5ecc2633e67095c7831f8ebe288b219074642ce2"} Mar 09 14:33:28 crc kubenswrapper[4764]: I0309 14:33:28.562879 4764 scope.go:117] "RemoveContainer" containerID="956798d3640b6af2f25ed443a7b5b3c26e0da670aff756c574b1a30cc50879dd" Mar 09 14:33:28 crc kubenswrapper[4764]: I0309 14:33:28.564085 4764 scope.go:117] "RemoveContainer" containerID="8dbfa76f047a7530b5728e4d5ecc2633e67095c7831f8ebe288b219074642ce2" Mar 09 14:33:28 crc kubenswrapper[4764]: E0309 14:33:28.564462 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xxczl_openshift-machine-config-operator(6bcdd179-43c2-427c-9fac-7155c122e922)\"" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922" Mar 09 14:33:42 crc kubenswrapper[4764]: I0309 14:33:42.560407 4764 scope.go:117] "RemoveContainer" containerID="8dbfa76f047a7530b5728e4d5ecc2633e67095c7831f8ebe288b219074642ce2" Mar 09 14:33:42 crc kubenswrapper[4764]: E0309 14:33:42.561520 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-xxczl_openshift-machine-config-operator(6bcdd179-43c2-427c-9fac-7155c122e922)\"" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922" Mar 09 14:33:43 crc kubenswrapper[4764]: I0309 14:33:43.776479 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-qzlmx"] Mar 09 14:33:43 crc kubenswrapper[4764]: E0309 14:33:43.777470 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3078e21d-b42c-45f0-94c0-d3980ec27f1f" containerName="oc" Mar 09 14:33:43 crc kubenswrapper[4764]: I0309 14:33:43.777489 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="3078e21d-b42c-45f0-94c0-d3980ec27f1f" containerName="oc" Mar 09 14:33:43 crc kubenswrapper[4764]: I0309 14:33:43.777725 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="3078e21d-b42c-45f0-94c0-d3980ec27f1f" containerName="oc" Mar 09 14:33:43 crc kubenswrapper[4764]: I0309 14:33:43.805828 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-qzlmx"] Mar 09 14:33:43 crc kubenswrapper[4764]: I0309 14:33:43.805994 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-qzlmx" Mar 09 14:33:43 crc kubenswrapper[4764]: I0309 14:33:43.825637 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f2efcd9d-6a11-4afd-8903-0f280284cdaa-catalog-content\") pod \"redhat-operators-qzlmx\" (UID: \"f2efcd9d-6a11-4afd-8903-0f280284cdaa\") " pod="openshift-marketplace/redhat-operators-qzlmx" Mar 09 14:33:43 crc kubenswrapper[4764]: I0309 14:33:43.825820 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f2efcd9d-6a11-4afd-8903-0f280284cdaa-utilities\") pod \"redhat-operators-qzlmx\" (UID: \"f2efcd9d-6a11-4afd-8903-0f280284cdaa\") " pod="openshift-marketplace/redhat-operators-qzlmx" Mar 09 14:33:43 crc kubenswrapper[4764]: I0309 14:33:43.828035 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lv8kk\" (UniqueName: \"kubernetes.io/projected/f2efcd9d-6a11-4afd-8903-0f280284cdaa-kube-api-access-lv8kk\") pod \"redhat-operators-qzlmx\" (UID: \"f2efcd9d-6a11-4afd-8903-0f280284cdaa\") " pod="openshift-marketplace/redhat-operators-qzlmx" Mar 09 14:33:43 crc kubenswrapper[4764]: I0309 14:33:43.930448 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f2efcd9d-6a11-4afd-8903-0f280284cdaa-catalog-content\") pod \"redhat-operators-qzlmx\" (UID: \"f2efcd9d-6a11-4afd-8903-0f280284cdaa\") " pod="openshift-marketplace/redhat-operators-qzlmx" Mar 09 14:33:43 crc kubenswrapper[4764]: I0309 14:33:43.930838 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f2efcd9d-6a11-4afd-8903-0f280284cdaa-utilities\") pod \"redhat-operators-qzlmx\" (UID: 
\"f2efcd9d-6a11-4afd-8903-0f280284cdaa\") " pod="openshift-marketplace/redhat-operators-qzlmx" Mar 09 14:33:43 crc kubenswrapper[4764]: I0309 14:33:43.930950 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lv8kk\" (UniqueName: \"kubernetes.io/projected/f2efcd9d-6a11-4afd-8903-0f280284cdaa-kube-api-access-lv8kk\") pod \"redhat-operators-qzlmx\" (UID: \"f2efcd9d-6a11-4afd-8903-0f280284cdaa\") " pod="openshift-marketplace/redhat-operators-qzlmx" Mar 09 14:33:43 crc kubenswrapper[4764]: I0309 14:33:43.931411 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f2efcd9d-6a11-4afd-8903-0f280284cdaa-catalog-content\") pod \"redhat-operators-qzlmx\" (UID: \"f2efcd9d-6a11-4afd-8903-0f280284cdaa\") " pod="openshift-marketplace/redhat-operators-qzlmx" Mar 09 14:33:43 crc kubenswrapper[4764]: I0309 14:33:43.931620 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f2efcd9d-6a11-4afd-8903-0f280284cdaa-utilities\") pod \"redhat-operators-qzlmx\" (UID: \"f2efcd9d-6a11-4afd-8903-0f280284cdaa\") " pod="openshift-marketplace/redhat-operators-qzlmx" Mar 09 14:33:43 crc kubenswrapper[4764]: I0309 14:33:43.955237 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lv8kk\" (UniqueName: \"kubernetes.io/projected/f2efcd9d-6a11-4afd-8903-0f280284cdaa-kube-api-access-lv8kk\") pod \"redhat-operators-qzlmx\" (UID: \"f2efcd9d-6a11-4afd-8903-0f280284cdaa\") " pod="openshift-marketplace/redhat-operators-qzlmx" Mar 09 14:33:44 crc kubenswrapper[4764]: I0309 14:33:44.157685 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-qzlmx" Mar 09 14:33:44 crc kubenswrapper[4764]: I0309 14:33:44.775672 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-qzlmx"] Mar 09 14:33:45 crc kubenswrapper[4764]: I0309 14:33:45.740530 4764 generic.go:334] "Generic (PLEG): container finished" podID="f2efcd9d-6a11-4afd-8903-0f280284cdaa" containerID="edfafef8d0234c71f87bec73e4123a65431ec22582183b56a7ee1a6dea625511" exitCode=0 Mar 09 14:33:45 crc kubenswrapper[4764]: I0309 14:33:45.740918 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qzlmx" event={"ID":"f2efcd9d-6a11-4afd-8903-0f280284cdaa","Type":"ContainerDied","Data":"edfafef8d0234c71f87bec73e4123a65431ec22582183b56a7ee1a6dea625511"} Mar 09 14:33:45 crc kubenswrapper[4764]: I0309 14:33:45.740953 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qzlmx" event={"ID":"f2efcd9d-6a11-4afd-8903-0f280284cdaa","Type":"ContainerStarted","Data":"e9d31a776ca5ada6ff61eb8dd1dad9ae7a8028937181ba856028bc7288265bfe"} Mar 09 14:33:47 crc kubenswrapper[4764]: I0309 14:33:47.765891 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qzlmx" event={"ID":"f2efcd9d-6a11-4afd-8903-0f280284cdaa","Type":"ContainerStarted","Data":"86c4ef2cdaf5c0b3f5f3860085ce0371469edd4c05bde54fbade9ae91ce3c8fe"} Mar 09 14:33:49 crc kubenswrapper[4764]: I0309 14:33:49.786322 4764 generic.go:334] "Generic (PLEG): container finished" podID="f2efcd9d-6a11-4afd-8903-0f280284cdaa" containerID="86c4ef2cdaf5c0b3f5f3860085ce0371469edd4c05bde54fbade9ae91ce3c8fe" exitCode=0 Mar 09 14:33:49 crc kubenswrapper[4764]: I0309 14:33:49.786408 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qzlmx" 
event={"ID":"f2efcd9d-6a11-4afd-8903-0f280284cdaa","Type":"ContainerDied","Data":"86c4ef2cdaf5c0b3f5f3860085ce0371469edd4c05bde54fbade9ae91ce3c8fe"} Mar 09 14:33:50 crc kubenswrapper[4764]: I0309 14:33:50.803440 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qzlmx" event={"ID":"f2efcd9d-6a11-4afd-8903-0f280284cdaa","Type":"ContainerStarted","Data":"6bd50070f5a28319990ff792d02341394570cf6f918cac77f2bd667b387c03b7"} Mar 09 14:33:53 crc kubenswrapper[4764]: I0309 14:33:53.560501 4764 scope.go:117] "RemoveContainer" containerID="8dbfa76f047a7530b5728e4d5ecc2633e67095c7831f8ebe288b219074642ce2" Mar 09 14:33:53 crc kubenswrapper[4764]: E0309 14:33:53.561411 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xxczl_openshift-machine-config-operator(6bcdd179-43c2-427c-9fac-7155c122e922)\"" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922" Mar 09 14:33:54 crc kubenswrapper[4764]: I0309 14:33:54.157858 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-qzlmx" Mar 09 14:33:54 crc kubenswrapper[4764]: I0309 14:33:54.158363 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-qzlmx" Mar 09 14:33:55 crc kubenswrapper[4764]: I0309 14:33:55.215694 4764 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-qzlmx" podUID="f2efcd9d-6a11-4afd-8903-0f280284cdaa" containerName="registry-server" probeResult="failure" output=< Mar 09 14:33:55 crc kubenswrapper[4764]: timeout: failed to connect service ":50051" within 1s Mar 09 14:33:55 crc kubenswrapper[4764]: > Mar 09 14:34:00 crc kubenswrapper[4764]: I0309 14:34:00.144741 4764 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-qzlmx" podStartSLOduration=12.665717086 podStartE2EDuration="17.1447196s" podCreationTimestamp="2026-03-09 14:33:43 +0000 UTC" firstStartedPulling="2026-03-09 14:33:45.743481696 +0000 UTC m=+4380.993653604" lastFinishedPulling="2026-03-09 14:33:50.22248421 +0000 UTC m=+4385.472656118" observedRunningTime="2026-03-09 14:33:50.833136723 +0000 UTC m=+4386.083308631" watchObservedRunningTime="2026-03-09 14:34:00.1447196 +0000 UTC m=+4395.394891508" Mar 09 14:34:00 crc kubenswrapper[4764]: I0309 14:34:00.150042 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29551114-9w2wl"] Mar 09 14:34:00 crc kubenswrapper[4764]: I0309 14:34:00.151581 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551114-9w2wl" Mar 09 14:34:00 crc kubenswrapper[4764]: I0309 14:34:00.154092 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-4s7mr" Mar 09 14:34:00 crc kubenswrapper[4764]: I0309 14:34:00.155287 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 09 14:34:00 crc kubenswrapper[4764]: I0309 14:34:00.159780 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 09 14:34:00 crc kubenswrapper[4764]: I0309 14:34:00.162771 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551114-9w2wl"] Mar 09 14:34:00 crc kubenswrapper[4764]: I0309 14:34:00.247532 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ggjnq\" (UniqueName: \"kubernetes.io/projected/3b063d56-a0eb-4b2d-8b53-2c63feead99e-kube-api-access-ggjnq\") pod \"auto-csr-approver-29551114-9w2wl\" (UID: 
\"3b063d56-a0eb-4b2d-8b53-2c63feead99e\") " pod="openshift-infra/auto-csr-approver-29551114-9w2wl" Mar 09 14:34:00 crc kubenswrapper[4764]: I0309 14:34:00.350330 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ggjnq\" (UniqueName: \"kubernetes.io/projected/3b063d56-a0eb-4b2d-8b53-2c63feead99e-kube-api-access-ggjnq\") pod \"auto-csr-approver-29551114-9w2wl\" (UID: \"3b063d56-a0eb-4b2d-8b53-2c63feead99e\") " pod="openshift-infra/auto-csr-approver-29551114-9w2wl" Mar 09 14:34:00 crc kubenswrapper[4764]: I0309 14:34:00.373849 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ggjnq\" (UniqueName: \"kubernetes.io/projected/3b063d56-a0eb-4b2d-8b53-2c63feead99e-kube-api-access-ggjnq\") pod \"auto-csr-approver-29551114-9w2wl\" (UID: \"3b063d56-a0eb-4b2d-8b53-2c63feead99e\") " pod="openshift-infra/auto-csr-approver-29551114-9w2wl" Mar 09 14:34:00 crc kubenswrapper[4764]: I0309 14:34:00.475222 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551114-9w2wl" Mar 09 14:34:00 crc kubenswrapper[4764]: I0309 14:34:00.992454 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551114-9w2wl"] Mar 09 14:34:01 crc kubenswrapper[4764]: W0309 14:34:01.002826 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3b063d56_a0eb_4b2d_8b53_2c63feead99e.slice/crio-4af4ce420cb2f651ee10eebaabb6884f6f65c51443c9a04bcf3b4f43c4049c40 WatchSource:0}: Error finding container 4af4ce420cb2f651ee10eebaabb6884f6f65c51443c9a04bcf3b4f43c4049c40: Status 404 returned error can't find the container with id 4af4ce420cb2f651ee10eebaabb6884f6f65c51443c9a04bcf3b4f43c4049c40 Mar 09 14:34:01 crc kubenswrapper[4764]: I0309 14:34:01.921262 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551114-9w2wl" event={"ID":"3b063d56-a0eb-4b2d-8b53-2c63feead99e","Type":"ContainerStarted","Data":"4af4ce420cb2f651ee10eebaabb6884f6f65c51443c9a04bcf3b4f43c4049c40"} Mar 09 14:34:03 crc kubenswrapper[4764]: I0309 14:34:03.685262 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-86ddb6bd46-lgrkv_709e786e-5c7d-45d3-ac38-78351dfbec81/kube-rbac-proxy/0.log" Mar 09 14:34:03 crc kubenswrapper[4764]: I0309 14:34:03.817628 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-86ddb6bd46-lgrkv_709e786e-5c7d-45d3-ac38-78351dfbec81/controller/0.log" Mar 09 14:34:03 crc kubenswrapper[4764]: I0309 14:34:03.946216 4764 generic.go:334] "Generic (PLEG): container finished" podID="3b063d56-a0eb-4b2d-8b53-2c63feead99e" containerID="ea38c17635079a211c8b307953c91fb9e37a06f4db03c9c46e225e504f02afec" exitCode=0 Mar 09 14:34:03 crc kubenswrapper[4764]: I0309 14:34:03.946267 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-infra/auto-csr-approver-29551114-9w2wl" event={"ID":"3b063d56-a0eb-4b2d-8b53-2c63feead99e","Type":"ContainerDied","Data":"ea38c17635079a211c8b307953c91fb9e37a06f4db03c9c46e225e504f02afec"} Mar 09 14:34:04 crc kubenswrapper[4764]: I0309 14:34:04.020250 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-kl47c_9333a95c-85e4-4e7d-a142-ae2dd06b4146/cp-frr-files/0.log" Mar 09 14:34:04 crc kubenswrapper[4764]: I0309 14:34:04.206534 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-kl47c_9333a95c-85e4-4e7d-a142-ae2dd06b4146/cp-metrics/0.log" Mar 09 14:34:04 crc kubenswrapper[4764]: I0309 14:34:04.216533 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-kl47c_9333a95c-85e4-4e7d-a142-ae2dd06b4146/cp-frr-files/0.log" Mar 09 14:34:04 crc kubenswrapper[4764]: I0309 14:34:04.224430 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-qzlmx" Mar 09 14:34:04 crc kubenswrapper[4764]: I0309 14:34:04.252060 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-kl47c_9333a95c-85e4-4e7d-a142-ae2dd06b4146/cp-reloader/0.log" Mar 09 14:34:04 crc kubenswrapper[4764]: I0309 14:34:04.283302 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-qzlmx" Mar 09 14:34:04 crc kubenswrapper[4764]: I0309 14:34:04.304876 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-kl47c_9333a95c-85e4-4e7d-a142-ae2dd06b4146/cp-reloader/0.log" Mar 09 14:34:04 crc kubenswrapper[4764]: I0309 14:34:04.475049 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-qzlmx"] Mar 09 14:34:04 crc kubenswrapper[4764]: I0309 14:34:04.775917 4764 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-kl47c_9333a95c-85e4-4e7d-a142-ae2dd06b4146/cp-reloader/0.log" Mar 09 14:34:04 crc kubenswrapper[4764]: I0309 14:34:04.850736 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-kl47c_9333a95c-85e4-4e7d-a142-ae2dd06b4146/cp-frr-files/0.log" Mar 09 14:34:04 crc kubenswrapper[4764]: I0309 14:34:04.863356 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-kl47c_9333a95c-85e4-4e7d-a142-ae2dd06b4146/cp-metrics/0.log" Mar 09 14:34:04 crc kubenswrapper[4764]: I0309 14:34:04.877933 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-kl47c_9333a95c-85e4-4e7d-a142-ae2dd06b4146/cp-metrics/0.log" Mar 09 14:34:05 crc kubenswrapper[4764]: I0309 14:34:05.141590 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-kl47c_9333a95c-85e4-4e7d-a142-ae2dd06b4146/cp-reloader/0.log" Mar 09 14:34:05 crc kubenswrapper[4764]: I0309 14:34:05.206398 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-kl47c_9333a95c-85e4-4e7d-a142-ae2dd06b4146/cp-metrics/0.log" Mar 09 14:34:05 crc kubenswrapper[4764]: I0309 14:34:05.210441 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-kl47c_9333a95c-85e4-4e7d-a142-ae2dd06b4146/controller/0.log" Mar 09 14:34:05 crc kubenswrapper[4764]: I0309 14:34:05.217692 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-kl47c_9333a95c-85e4-4e7d-a142-ae2dd06b4146/cp-frr-files/0.log" Mar 09 14:34:05 crc kubenswrapper[4764]: I0309 14:34:05.381477 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551114-9w2wl" Mar 09 14:34:05 crc kubenswrapper[4764]: I0309 14:34:05.488200 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ggjnq\" (UniqueName: \"kubernetes.io/projected/3b063d56-a0eb-4b2d-8b53-2c63feead99e-kube-api-access-ggjnq\") pod \"3b063d56-a0eb-4b2d-8b53-2c63feead99e\" (UID: \"3b063d56-a0eb-4b2d-8b53-2c63feead99e\") " Mar 09 14:34:05 crc kubenswrapper[4764]: I0309 14:34:05.525931 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3b063d56-a0eb-4b2d-8b53-2c63feead99e-kube-api-access-ggjnq" (OuterVolumeSpecName: "kube-api-access-ggjnq") pod "3b063d56-a0eb-4b2d-8b53-2c63feead99e" (UID: "3b063d56-a0eb-4b2d-8b53-2c63feead99e"). InnerVolumeSpecName "kube-api-access-ggjnq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:34:05 crc kubenswrapper[4764]: I0309 14:34:05.580752 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-kl47c_9333a95c-85e4-4e7d-a142-ae2dd06b4146/kube-rbac-proxy/0.log" Mar 09 14:34:05 crc kubenswrapper[4764]: I0309 14:34:05.590240 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-kl47c_9333a95c-85e4-4e7d-a142-ae2dd06b4146/kube-rbac-proxy-frr/0.log" Mar 09 14:34:05 crc kubenswrapper[4764]: I0309 14:34:05.593060 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ggjnq\" (UniqueName: \"kubernetes.io/projected/3b063d56-a0eb-4b2d-8b53-2c63feead99e-kube-api-access-ggjnq\") on node \"crc\" DevicePath \"\"" Mar 09 14:34:05 crc kubenswrapper[4764]: I0309 14:34:05.600541 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-kl47c_9333a95c-85e4-4e7d-a142-ae2dd06b4146/frr-metrics/0.log" Mar 09 14:34:05 crc kubenswrapper[4764]: I0309 14:34:05.813653 4764 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-kl47c_9333a95c-85e4-4e7d-a142-ae2dd06b4146/reloader/0.log" Mar 09 14:34:05 crc kubenswrapper[4764]: I0309 14:34:05.883703 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-7f989f654f-wqd8z_72efa175-2568-4c62-a97e-35893887fe82/frr-k8s-webhook-server/0.log" Mar 09 14:34:05 crc kubenswrapper[4764]: I0309 14:34:05.973118 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-qzlmx" podUID="f2efcd9d-6a11-4afd-8903-0f280284cdaa" containerName="registry-server" containerID="cri-o://6bd50070f5a28319990ff792d02341394570cf6f918cac77f2bd667b387c03b7" gracePeriod=2 Mar 09 14:34:05 crc kubenswrapper[4764]: I0309 14:34:05.973620 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551114-9w2wl" Mar 09 14:34:05 crc kubenswrapper[4764]: I0309 14:34:05.973674 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551114-9w2wl" event={"ID":"3b063d56-a0eb-4b2d-8b53-2c63feead99e","Type":"ContainerDied","Data":"4af4ce420cb2f651ee10eebaabb6884f6f65c51443c9a04bcf3b4f43c4049c40"} Mar 09 14:34:05 crc kubenswrapper[4764]: I0309 14:34:05.973729 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4af4ce420cb2f651ee10eebaabb6884f6f65c51443c9a04bcf3b4f43c4049c40" Mar 09 14:34:06 crc kubenswrapper[4764]: I0309 14:34:06.285109 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-659b995f59-8s255_b89770ec-e502-4b3a-8233-8c9aa76d55de/manager/0.log" Mar 09 14:34:06 crc kubenswrapper[4764]: I0309 14:34:06.467243 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-7bccb4c96-wqdrv_ed37a5d1-5d4b-41fb-8476-189def32c909/webhook-server/0.log" Mar 09 14:34:06 crc kubenswrapper[4764]: 
I0309 14:34:06.468001 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29551108-8tqjz"] Mar 09 14:34:06 crc kubenswrapper[4764]: I0309 14:34:06.483326 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29551108-8tqjz"] Mar 09 14:34:06 crc kubenswrapper[4764]: I0309 14:34:06.536818 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-qzlmx" Mar 09 14:34:06 crc kubenswrapper[4764]: I0309 14:34:06.573714 4764 scope.go:117] "RemoveContainer" containerID="8dbfa76f047a7530b5728e4d5ecc2633e67095c7831f8ebe288b219074642ce2" Mar 09 14:34:06 crc kubenswrapper[4764]: E0309 14:34:06.574533 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xxczl_openshift-machine-config-operator(6bcdd179-43c2-427c-9fac-7155c122e922)\"" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922" Mar 09 14:34:06 crc kubenswrapper[4764]: I0309 14:34:06.631465 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lv8kk\" (UniqueName: \"kubernetes.io/projected/f2efcd9d-6a11-4afd-8903-0f280284cdaa-kube-api-access-lv8kk\") pod \"f2efcd9d-6a11-4afd-8903-0f280284cdaa\" (UID: \"f2efcd9d-6a11-4afd-8903-0f280284cdaa\") " Mar 09 14:34:06 crc kubenswrapper[4764]: I0309 14:34:06.631615 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f2efcd9d-6a11-4afd-8903-0f280284cdaa-utilities\") pod \"f2efcd9d-6a11-4afd-8903-0f280284cdaa\" (UID: \"f2efcd9d-6a11-4afd-8903-0f280284cdaa\") " Mar 09 14:34:06 crc kubenswrapper[4764]: I0309 14:34:06.631897 4764 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f2efcd9d-6a11-4afd-8903-0f280284cdaa-catalog-content\") pod \"f2efcd9d-6a11-4afd-8903-0f280284cdaa\" (UID: \"f2efcd9d-6a11-4afd-8903-0f280284cdaa\") " Mar 09 14:34:06 crc kubenswrapper[4764]: I0309 14:34:06.634274 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f2efcd9d-6a11-4afd-8903-0f280284cdaa-utilities" (OuterVolumeSpecName: "utilities") pod "f2efcd9d-6a11-4afd-8903-0f280284cdaa" (UID: "f2efcd9d-6a11-4afd-8903-0f280284cdaa"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 14:34:06 crc kubenswrapper[4764]: I0309 14:34:06.642036 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f2efcd9d-6a11-4afd-8903-0f280284cdaa-kube-api-access-lv8kk" (OuterVolumeSpecName: "kube-api-access-lv8kk") pod "f2efcd9d-6a11-4afd-8903-0f280284cdaa" (UID: "f2efcd9d-6a11-4afd-8903-0f280284cdaa"). InnerVolumeSpecName "kube-api-access-lv8kk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:34:06 crc kubenswrapper[4764]: I0309 14:34:06.734109 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lv8kk\" (UniqueName: \"kubernetes.io/projected/f2efcd9d-6a11-4afd-8903-0f280284cdaa-kube-api-access-lv8kk\") on node \"crc\" DevicePath \"\"" Mar 09 14:34:06 crc kubenswrapper[4764]: I0309 14:34:06.734559 4764 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f2efcd9d-6a11-4afd-8903-0f280284cdaa-utilities\") on node \"crc\" DevicePath \"\"" Mar 09 14:34:06 crc kubenswrapper[4764]: I0309 14:34:06.757617 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-2z5wp_bfd899d4-a0df-47e3-aa36-1cf690235c45/kube-rbac-proxy/0.log" Mar 09 14:34:06 crc kubenswrapper[4764]: I0309 14:34:06.784310 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f2efcd9d-6a11-4afd-8903-0f280284cdaa-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f2efcd9d-6a11-4afd-8903-0f280284cdaa" (UID: "f2efcd9d-6a11-4afd-8903-0f280284cdaa"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 14:34:06 crc kubenswrapper[4764]: I0309 14:34:06.838424 4764 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f2efcd9d-6a11-4afd-8903-0f280284cdaa-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 09 14:34:06 crc kubenswrapper[4764]: I0309 14:34:06.986759 4764 generic.go:334] "Generic (PLEG): container finished" podID="f2efcd9d-6a11-4afd-8903-0f280284cdaa" containerID="6bd50070f5a28319990ff792d02341394570cf6f918cac77f2bd667b387c03b7" exitCode=0 Mar 09 14:34:06 crc kubenswrapper[4764]: I0309 14:34:06.986819 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qzlmx" event={"ID":"f2efcd9d-6a11-4afd-8903-0f280284cdaa","Type":"ContainerDied","Data":"6bd50070f5a28319990ff792d02341394570cf6f918cac77f2bd667b387c03b7"} Mar 09 14:34:06 crc kubenswrapper[4764]: I0309 14:34:06.986858 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qzlmx" event={"ID":"f2efcd9d-6a11-4afd-8903-0f280284cdaa","Type":"ContainerDied","Data":"e9d31a776ca5ada6ff61eb8dd1dad9ae7a8028937181ba856028bc7288265bfe"} Mar 09 14:34:06 crc kubenswrapper[4764]: I0309 14:34:06.986881 4764 scope.go:117] "RemoveContainer" containerID="6bd50070f5a28319990ff792d02341394570cf6f918cac77f2bd667b387c03b7" Mar 09 14:34:06 crc kubenswrapper[4764]: I0309 14:34:06.986901 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-qzlmx" Mar 09 14:34:07 crc kubenswrapper[4764]: I0309 14:34:07.013928 4764 scope.go:117] "RemoveContainer" containerID="86c4ef2cdaf5c0b3f5f3860085ce0371469edd4c05bde54fbade9ae91ce3c8fe" Mar 09 14:34:07 crc kubenswrapper[4764]: I0309 14:34:07.045028 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-qzlmx"] Mar 09 14:34:07 crc kubenswrapper[4764]: I0309 14:34:07.058934 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-qzlmx"] Mar 09 14:34:07 crc kubenswrapper[4764]: I0309 14:34:07.060054 4764 scope.go:117] "RemoveContainer" containerID="edfafef8d0234c71f87bec73e4123a65431ec22582183b56a7ee1a6dea625511" Mar 09 14:34:07 crc kubenswrapper[4764]: I0309 14:34:07.121456 4764 scope.go:117] "RemoveContainer" containerID="6bd50070f5a28319990ff792d02341394570cf6f918cac77f2bd667b387c03b7" Mar 09 14:34:07 crc kubenswrapper[4764]: E0309 14:34:07.122072 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6bd50070f5a28319990ff792d02341394570cf6f918cac77f2bd667b387c03b7\": container with ID starting with 6bd50070f5a28319990ff792d02341394570cf6f918cac77f2bd667b387c03b7 not found: ID does not exist" containerID="6bd50070f5a28319990ff792d02341394570cf6f918cac77f2bd667b387c03b7" Mar 09 14:34:07 crc kubenswrapper[4764]: I0309 14:34:07.122122 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6bd50070f5a28319990ff792d02341394570cf6f918cac77f2bd667b387c03b7"} err="failed to get container status \"6bd50070f5a28319990ff792d02341394570cf6f918cac77f2bd667b387c03b7\": rpc error: code = NotFound desc = could not find container \"6bd50070f5a28319990ff792d02341394570cf6f918cac77f2bd667b387c03b7\": container with ID starting with 6bd50070f5a28319990ff792d02341394570cf6f918cac77f2bd667b387c03b7 not found: ID does 
not exist" Mar 09 14:34:07 crc kubenswrapper[4764]: I0309 14:34:07.122156 4764 scope.go:117] "RemoveContainer" containerID="86c4ef2cdaf5c0b3f5f3860085ce0371469edd4c05bde54fbade9ae91ce3c8fe" Mar 09 14:34:07 crc kubenswrapper[4764]: E0309 14:34:07.122877 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"86c4ef2cdaf5c0b3f5f3860085ce0371469edd4c05bde54fbade9ae91ce3c8fe\": container with ID starting with 86c4ef2cdaf5c0b3f5f3860085ce0371469edd4c05bde54fbade9ae91ce3c8fe not found: ID does not exist" containerID="86c4ef2cdaf5c0b3f5f3860085ce0371469edd4c05bde54fbade9ae91ce3c8fe" Mar 09 14:34:07 crc kubenswrapper[4764]: I0309 14:34:07.122907 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"86c4ef2cdaf5c0b3f5f3860085ce0371469edd4c05bde54fbade9ae91ce3c8fe"} err="failed to get container status \"86c4ef2cdaf5c0b3f5f3860085ce0371469edd4c05bde54fbade9ae91ce3c8fe\": rpc error: code = NotFound desc = could not find container \"86c4ef2cdaf5c0b3f5f3860085ce0371469edd4c05bde54fbade9ae91ce3c8fe\": container with ID starting with 86c4ef2cdaf5c0b3f5f3860085ce0371469edd4c05bde54fbade9ae91ce3c8fe not found: ID does not exist" Mar 09 14:34:07 crc kubenswrapper[4764]: I0309 14:34:07.122928 4764 scope.go:117] "RemoveContainer" containerID="edfafef8d0234c71f87bec73e4123a65431ec22582183b56a7ee1a6dea625511" Mar 09 14:34:07 crc kubenswrapper[4764]: E0309 14:34:07.123194 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"edfafef8d0234c71f87bec73e4123a65431ec22582183b56a7ee1a6dea625511\": container with ID starting with edfafef8d0234c71f87bec73e4123a65431ec22582183b56a7ee1a6dea625511 not found: ID does not exist" containerID="edfafef8d0234c71f87bec73e4123a65431ec22582183b56a7ee1a6dea625511" Mar 09 14:34:07 crc kubenswrapper[4764]: I0309 14:34:07.123223 4764 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"edfafef8d0234c71f87bec73e4123a65431ec22582183b56a7ee1a6dea625511"} err="failed to get container status \"edfafef8d0234c71f87bec73e4123a65431ec22582183b56a7ee1a6dea625511\": rpc error: code = NotFound desc = could not find container \"edfafef8d0234c71f87bec73e4123a65431ec22582183b56a7ee1a6dea625511\": container with ID starting with edfafef8d0234c71f87bec73e4123a65431ec22582183b56a7ee1a6dea625511 not found: ID does not exist" Mar 09 14:34:07 crc kubenswrapper[4764]: I0309 14:34:07.312266 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-kl47c_9333a95c-85e4-4e7d-a142-ae2dd06b4146/frr/0.log" Mar 09 14:34:07 crc kubenswrapper[4764]: I0309 14:34:07.359591 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-2z5wp_bfd899d4-a0df-47e3-aa36-1cf690235c45/speaker/0.log" Mar 09 14:34:07 crc kubenswrapper[4764]: I0309 14:34:07.575474 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1f49d558-1317-4099-abb0-bb57895b3917" path="/var/lib/kubelet/pods/1f49d558-1317-4099-abb0-bb57895b3917/volumes" Mar 09 14:34:07 crc kubenswrapper[4764]: I0309 14:34:07.576291 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f2efcd9d-6a11-4afd-8903-0f280284cdaa" path="/var/lib/kubelet/pods/f2efcd9d-6a11-4afd-8903-0f280284cdaa/volumes" Mar 09 14:34:12 crc kubenswrapper[4764]: I0309 14:34:12.794400 4764 scope.go:117] "RemoveContainer" containerID="3d8746bde1498da01929d78c37701a3fb1698a0044a00e71ddbcca0d4465fb3c" Mar 09 14:34:17 crc kubenswrapper[4764]: I0309 14:34:17.560763 4764 scope.go:117] "RemoveContainer" containerID="8dbfa76f047a7530b5728e4d5ecc2633e67095c7831f8ebe288b219074642ce2" Mar 09 14:34:17 crc kubenswrapper[4764]: E0309 14:34:17.561860 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting 
failed container=machine-config-daemon pod=machine-config-daemon-xxczl_openshift-machine-config-operator(6bcdd179-43c2-427c-9fac-7155c122e922)\"" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922" Mar 09 14:34:22 crc kubenswrapper[4764]: I0309 14:34:22.007282 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a825xk5v_704ddae7-42eb-4609-b4a3-64d5078c2126/util/0.log" Mar 09 14:34:22 crc kubenswrapper[4764]: I0309 14:34:22.307040 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a825xk5v_704ddae7-42eb-4609-b4a3-64d5078c2126/util/0.log" Mar 09 14:34:22 crc kubenswrapper[4764]: I0309 14:34:22.323239 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a825xk5v_704ddae7-42eb-4609-b4a3-64d5078c2126/pull/0.log" Mar 09 14:34:22 crc kubenswrapper[4764]: I0309 14:34:22.369751 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a825xk5v_704ddae7-42eb-4609-b4a3-64d5078c2126/pull/0.log" Mar 09 14:34:22 crc kubenswrapper[4764]: I0309 14:34:22.495513 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a825xk5v_704ddae7-42eb-4609-b4a3-64d5078c2126/util/0.log" Mar 09 14:34:22 crc kubenswrapper[4764]: I0309 14:34:22.529663 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a825xk5v_704ddae7-42eb-4609-b4a3-64d5078c2126/pull/0.log" Mar 09 14:34:22 crc kubenswrapper[4764]: I0309 14:34:22.544817 4764 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a825xk5v_704ddae7-42eb-4609-b4a3-64d5078c2126/extract/0.log" Mar 09 14:34:22 crc kubenswrapper[4764]: I0309 14:34:22.718040 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-bbwx5_a6d68e16-c0d2-4f98-9b3f-d1d392bf67fa/extract-utilities/0.log" Mar 09 14:34:22 crc kubenswrapper[4764]: I0309 14:34:22.929680 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-bbwx5_a6d68e16-c0d2-4f98-9b3f-d1d392bf67fa/extract-utilities/0.log" Mar 09 14:34:22 crc kubenswrapper[4764]: I0309 14:34:22.959789 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-bbwx5_a6d68e16-c0d2-4f98-9b3f-d1d392bf67fa/extract-content/0.log" Mar 09 14:34:22 crc kubenswrapper[4764]: I0309 14:34:22.965424 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-bbwx5_a6d68e16-c0d2-4f98-9b3f-d1d392bf67fa/extract-content/0.log" Mar 09 14:34:23 crc kubenswrapper[4764]: I0309 14:34:23.149920 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-bbwx5_a6d68e16-c0d2-4f98-9b3f-d1d392bf67fa/extract-content/0.log" Mar 09 14:34:23 crc kubenswrapper[4764]: I0309 14:34:23.150765 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-bbwx5_a6d68e16-c0d2-4f98-9b3f-d1d392bf67fa/extract-utilities/0.log" Mar 09 14:34:23 crc kubenswrapper[4764]: I0309 14:34:23.437170 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-ggs99_176982f0-3e86-471c-8054-13490ee485bb/extract-utilities/0.log" Mar 09 14:34:23 crc kubenswrapper[4764]: I0309 14:34:23.672680 4764 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_community-operators-ggs99_176982f0-3e86-471c-8054-13490ee485bb/extract-content/0.log" Mar 09 14:34:23 crc kubenswrapper[4764]: I0309 14:34:23.691895 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-ggs99_176982f0-3e86-471c-8054-13490ee485bb/extract-utilities/0.log" Mar 09 14:34:23 crc kubenswrapper[4764]: I0309 14:34:23.731984 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-ggs99_176982f0-3e86-471c-8054-13490ee485bb/extract-content/0.log" Mar 09 14:34:23 crc kubenswrapper[4764]: I0309 14:34:23.984036 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-ggs99_176982f0-3e86-471c-8054-13490ee485bb/extract-utilities/0.log" Mar 09 14:34:23 crc kubenswrapper[4764]: I0309 14:34:23.991825 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-ggs99_176982f0-3e86-471c-8054-13490ee485bb/extract-content/0.log" Mar 09 14:34:24 crc kubenswrapper[4764]: I0309 14:34:24.031295 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-bbwx5_a6d68e16-c0d2-4f98-9b3f-d1d392bf67fa/registry-server/0.log" Mar 09 14:34:24 crc kubenswrapper[4764]: I0309 14:34:24.192468 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-ggs99_176982f0-3e86-471c-8054-13490ee485bb/registry-server/0.log" Mar 09 14:34:24 crc kubenswrapper[4764]: I0309 14:34:24.216282 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4gl4pg_413f45cc-5916-45bb-a2a2-7b33029445af/util/0.log" Mar 09 14:34:24 crc kubenswrapper[4764]: I0309 14:34:24.505434 4764 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4gl4pg_413f45cc-5916-45bb-a2a2-7b33029445af/pull/0.log" Mar 09 14:34:24 crc kubenswrapper[4764]: I0309 14:34:24.515452 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4gl4pg_413f45cc-5916-45bb-a2a2-7b33029445af/util/0.log" Mar 09 14:34:24 crc kubenswrapper[4764]: I0309 14:34:24.561829 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4gl4pg_413f45cc-5916-45bb-a2a2-7b33029445af/pull/0.log" Mar 09 14:34:24 crc kubenswrapper[4764]: I0309 14:34:24.716375 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4gl4pg_413f45cc-5916-45bb-a2a2-7b33029445af/util/0.log" Mar 09 14:34:24 crc kubenswrapper[4764]: I0309 14:34:24.762337 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4gl4pg_413f45cc-5916-45bb-a2a2-7b33029445af/pull/0.log" Mar 09 14:34:24 crc kubenswrapper[4764]: I0309 14:34:24.762584 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4gl4pg_413f45cc-5916-45bb-a2a2-7b33029445af/extract/0.log" Mar 09 14:34:24 crc kubenswrapper[4764]: I0309 14:34:24.916448 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-m2gc7_4351c9fc-c207-4d15-b8a6-f51c0651fe83/marketplace-operator/0.log" Mar 09 14:34:25 crc kubenswrapper[4764]: I0309 14:34:25.029715 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-whs64_26dd13d2-9d2e-4f59-97a6-e31b76ccf74c/extract-utilities/0.log" Mar 09 14:34:25 crc kubenswrapper[4764]: 
I0309 14:34:25.198422 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-whs64_26dd13d2-9d2e-4f59-97a6-e31b76ccf74c/extract-utilities/0.log"
Mar 09 14:34:25 crc kubenswrapper[4764]: I0309 14:34:25.225337 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-whs64_26dd13d2-9d2e-4f59-97a6-e31b76ccf74c/extract-content/0.log"
Mar 09 14:34:25 crc kubenswrapper[4764]: I0309 14:34:25.263220 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-whs64_26dd13d2-9d2e-4f59-97a6-e31b76ccf74c/extract-content/0.log"
Mar 09 14:34:25 crc kubenswrapper[4764]: I0309 14:34:25.975386 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-whs64_26dd13d2-9d2e-4f59-97a6-e31b76ccf74c/extract-utilities/0.log"
Mar 09 14:34:25 crc kubenswrapper[4764]: I0309 14:34:25.995523 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-whs64_26dd13d2-9d2e-4f59-97a6-e31b76ccf74c/extract-content/0.log"
Mar 09 14:34:26 crc kubenswrapper[4764]: I0309 14:34:26.158684 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-whs64_26dd13d2-9d2e-4f59-97a6-e31b76ccf74c/registry-server/0.log"
Mar 09 14:34:26 crc kubenswrapper[4764]: I0309 14:34:26.219155 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-xn8sz_621cdc4e-d896-4775-b654-2d6606097cb9/extract-utilities/0.log"
Mar 09 14:34:26 crc kubenswrapper[4764]: I0309 14:34:26.493037 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-xn8sz_621cdc4e-d896-4775-b654-2d6606097cb9/extract-content/0.log"
Mar 09 14:34:26 crc kubenswrapper[4764]: I0309 14:34:26.502054 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-xn8sz_621cdc4e-d896-4775-b654-2d6606097cb9/extract-utilities/0.log"
Mar 09 14:34:26 crc kubenswrapper[4764]: I0309 14:34:26.508129 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-xn8sz_621cdc4e-d896-4775-b654-2d6606097cb9/extract-content/0.log"
Mar 09 14:34:26 crc kubenswrapper[4764]: I0309 14:34:26.683133 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-xn8sz_621cdc4e-d896-4775-b654-2d6606097cb9/extract-utilities/0.log"
Mar 09 14:34:26 crc kubenswrapper[4764]: I0309 14:34:26.711810 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-xn8sz_621cdc4e-d896-4775-b654-2d6606097cb9/extract-content/0.log"
Mar 09 14:34:27 crc kubenswrapper[4764]: I0309 14:34:27.519059 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-xn8sz_621cdc4e-d896-4775-b654-2d6606097cb9/registry-server/0.log"
Mar 09 14:34:28 crc kubenswrapper[4764]: I0309 14:34:28.560239 4764 scope.go:117] "RemoveContainer" containerID="8dbfa76f047a7530b5728e4d5ecc2633e67095c7831f8ebe288b219074642ce2"
Mar 09 14:34:28 crc kubenswrapper[4764]: E0309 14:34:28.560679 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xxczl_openshift-machine-config-operator(6bcdd179-43c2-427c-9fac-7155c122e922)\"" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922"
Mar 09 14:34:42 crc kubenswrapper[4764]: I0309 14:34:42.560015 4764 scope.go:117] "RemoveContainer" containerID="8dbfa76f047a7530b5728e4d5ecc2633e67095c7831f8ebe288b219074642ce2"
Mar 09 14:34:42 crc kubenswrapper[4764]: E0309 14:34:42.561312 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xxczl_openshift-machine-config-operator(6bcdd179-43c2-427c-9fac-7155c122e922)\"" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922"
Mar 09 14:34:57 crc kubenswrapper[4764]: I0309 14:34:57.560198 4764 scope.go:117] "RemoveContainer" containerID="8dbfa76f047a7530b5728e4d5ecc2633e67095c7831f8ebe288b219074642ce2"
Mar 09 14:34:57 crc kubenswrapper[4764]: E0309 14:34:57.561149 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xxczl_openshift-machine-config-operator(6bcdd179-43c2-427c-9fac-7155c122e922)\"" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922"
Mar 09 14:35:09 crc kubenswrapper[4764]: I0309 14:35:09.560561 4764 scope.go:117] "RemoveContainer" containerID="8dbfa76f047a7530b5728e4d5ecc2633e67095c7831f8ebe288b219074642ce2"
Mar 09 14:35:09 crc kubenswrapper[4764]: E0309 14:35:09.561830 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xxczl_openshift-machine-config-operator(6bcdd179-43c2-427c-9fac-7155c122e922)\"" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922"
Mar 09 14:35:21 crc kubenswrapper[4764]: I0309 14:35:21.561026 4764 scope.go:117] "RemoveContainer" containerID="8dbfa76f047a7530b5728e4d5ecc2633e67095c7831f8ebe288b219074642ce2"
Mar 09 14:35:21 crc kubenswrapper[4764]: E0309 14:35:21.562289 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xxczl_openshift-machine-config-operator(6bcdd179-43c2-427c-9fac-7155c122e922)\"" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922"
Mar 09 14:35:34 crc kubenswrapper[4764]: I0309 14:35:34.560361 4764 scope.go:117] "RemoveContainer" containerID="8dbfa76f047a7530b5728e4d5ecc2633e67095c7831f8ebe288b219074642ce2"
Mar 09 14:35:34 crc kubenswrapper[4764]: E0309 14:35:34.561586 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xxczl_openshift-machine-config-operator(6bcdd179-43c2-427c-9fac-7155c122e922)\"" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922"
Mar 09 14:35:45 crc kubenswrapper[4764]: I0309 14:35:45.569695 4764 scope.go:117] "RemoveContainer" containerID="8dbfa76f047a7530b5728e4d5ecc2633e67095c7831f8ebe288b219074642ce2"
Mar 09 14:35:45 crc kubenswrapper[4764]: E0309 14:35:45.570801 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xxczl_openshift-machine-config-operator(6bcdd179-43c2-427c-9fac-7155c122e922)\"" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922"
Mar 09 14:35:57 crc kubenswrapper[4764]: I0309 14:35:57.560950 4764 scope.go:117] "RemoveContainer" containerID="8dbfa76f047a7530b5728e4d5ecc2633e67095c7831f8ebe288b219074642ce2"
Mar 09 14:35:57 crc kubenswrapper[4764]: E0309 14:35:57.562031 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xxczl_openshift-machine-config-operator(6bcdd179-43c2-427c-9fac-7155c122e922)\"" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922"
Mar 09 14:36:00 crc kubenswrapper[4764]: I0309 14:36:00.157308 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29551116-plhlc"]
Mar 09 14:36:00 crc kubenswrapper[4764]: E0309 14:36:00.158780 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3b063d56-a0eb-4b2d-8b53-2c63feead99e" containerName="oc"
Mar 09 14:36:00 crc kubenswrapper[4764]: I0309 14:36:00.158803 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b063d56-a0eb-4b2d-8b53-2c63feead99e" containerName="oc"
Mar 09 14:36:00 crc kubenswrapper[4764]: E0309 14:36:00.158828 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f2efcd9d-6a11-4afd-8903-0f280284cdaa" containerName="extract-utilities"
Mar 09 14:36:00 crc kubenswrapper[4764]: I0309 14:36:00.158840 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="f2efcd9d-6a11-4afd-8903-0f280284cdaa" containerName="extract-utilities"
Mar 09 14:36:00 crc kubenswrapper[4764]: E0309 14:36:00.158858 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f2efcd9d-6a11-4afd-8903-0f280284cdaa" containerName="registry-server"
Mar 09 14:36:00 crc kubenswrapper[4764]: I0309 14:36:00.158866 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="f2efcd9d-6a11-4afd-8903-0f280284cdaa" containerName="registry-server"
Mar 09 14:36:00 crc kubenswrapper[4764]: E0309 14:36:00.158884 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f2efcd9d-6a11-4afd-8903-0f280284cdaa" containerName="extract-content"
Mar 09 14:36:00 crc kubenswrapper[4764]: I0309 14:36:00.158894 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="f2efcd9d-6a11-4afd-8903-0f280284cdaa" containerName="extract-content"
Mar 09 14:36:00 crc kubenswrapper[4764]: I0309 14:36:00.159251 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="f2efcd9d-6a11-4afd-8903-0f280284cdaa" containerName="registry-server"
Mar 09 14:36:00 crc kubenswrapper[4764]: I0309 14:36:00.159276 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="3b063d56-a0eb-4b2d-8b53-2c63feead99e" containerName="oc"
Mar 09 14:36:00 crc kubenswrapper[4764]: I0309 14:36:00.160391 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551116-plhlc"
Mar 09 14:36:00 crc kubenswrapper[4764]: I0309 14:36:00.163021 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-4s7mr"
Mar 09 14:36:00 crc kubenswrapper[4764]: I0309 14:36:00.163501 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 09 14:36:00 crc kubenswrapper[4764]: I0309 14:36:00.163612 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 09 14:36:00 crc kubenswrapper[4764]: I0309 14:36:00.174899 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551116-plhlc"]
Mar 09 14:36:00 crc kubenswrapper[4764]: I0309 14:36:00.256165 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-88bdr\" (UniqueName: \"kubernetes.io/projected/25d1b986-e3dd-4d1d-9c3f-1c7c3a5c22f6-kube-api-access-88bdr\") pod \"auto-csr-approver-29551116-plhlc\" (UID: \"25d1b986-e3dd-4d1d-9c3f-1c7c3a5c22f6\") " pod="openshift-infra/auto-csr-approver-29551116-plhlc"
Mar 09 14:36:00 crc kubenswrapper[4764]: I0309 14:36:00.359017 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-88bdr\" (UniqueName: \"kubernetes.io/projected/25d1b986-e3dd-4d1d-9c3f-1c7c3a5c22f6-kube-api-access-88bdr\") pod \"auto-csr-approver-29551116-plhlc\" (UID: \"25d1b986-e3dd-4d1d-9c3f-1c7c3a5c22f6\") " pod="openshift-infra/auto-csr-approver-29551116-plhlc"
Mar 09 14:36:00 crc kubenswrapper[4764]: I0309 14:36:00.388933 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-88bdr\" (UniqueName: \"kubernetes.io/projected/25d1b986-e3dd-4d1d-9c3f-1c7c3a5c22f6-kube-api-access-88bdr\") pod \"auto-csr-approver-29551116-plhlc\" (UID: \"25d1b986-e3dd-4d1d-9c3f-1c7c3a5c22f6\") " pod="openshift-infra/auto-csr-approver-29551116-plhlc"
Mar 09 14:36:00 crc kubenswrapper[4764]: I0309 14:36:00.487057 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551116-plhlc"
Mar 09 14:36:01 crc kubenswrapper[4764]: I0309 14:36:01.005047 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551116-plhlc"]
Mar 09 14:36:01 crc kubenswrapper[4764]: I0309 14:36:01.028603 4764 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Mar 09 14:36:01 crc kubenswrapper[4764]: I0309 14:36:01.199051 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551116-plhlc" event={"ID":"25d1b986-e3dd-4d1d-9c3f-1c7c3a5c22f6","Type":"ContainerStarted","Data":"0839267fab67af7136fa26619fbb2672ecbc2c656e8177e7e69640966981f45a"}
Mar 09 14:36:03 crc kubenswrapper[4764]: I0309 14:36:03.227918 4764 generic.go:334] "Generic (PLEG): container finished" podID="25d1b986-e3dd-4d1d-9c3f-1c7c3a5c22f6" containerID="ea7fd0d4a7f1c688c8ce67675688ffbac7999aee93247d1046a8ee856d2e349c" exitCode=0
Mar 09 14:36:03 crc kubenswrapper[4764]: I0309 14:36:03.227993 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551116-plhlc" event={"ID":"25d1b986-e3dd-4d1d-9c3f-1c7c3a5c22f6","Type":"ContainerDied","Data":"ea7fd0d4a7f1c688c8ce67675688ffbac7999aee93247d1046a8ee856d2e349c"}
Mar 09 14:36:04 crc kubenswrapper[4764]: I0309 14:36:04.602127 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551116-plhlc"
Mar 09 14:36:04 crc kubenswrapper[4764]: I0309 14:36:04.677552 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-88bdr\" (UniqueName: \"kubernetes.io/projected/25d1b986-e3dd-4d1d-9c3f-1c7c3a5c22f6-kube-api-access-88bdr\") pod \"25d1b986-e3dd-4d1d-9c3f-1c7c3a5c22f6\" (UID: \"25d1b986-e3dd-4d1d-9c3f-1c7c3a5c22f6\") "
Mar 09 14:36:05 crc kubenswrapper[4764]: I0309 14:36:05.193005 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25d1b986-e3dd-4d1d-9c3f-1c7c3a5c22f6-kube-api-access-88bdr" (OuterVolumeSpecName: "kube-api-access-88bdr") pod "25d1b986-e3dd-4d1d-9c3f-1c7c3a5c22f6" (UID: "25d1b986-e3dd-4d1d-9c3f-1c7c3a5c22f6"). InnerVolumeSpecName "kube-api-access-88bdr". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 14:36:05 crc kubenswrapper[4764]: I0309 14:36:05.249599 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551116-plhlc"
Mar 09 14:36:05 crc kubenswrapper[4764]: I0309 14:36:05.249510 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551116-plhlc" event={"ID":"25d1b986-e3dd-4d1d-9c3f-1c7c3a5c22f6","Type":"ContainerDied","Data":"0839267fab67af7136fa26619fbb2672ecbc2c656e8177e7e69640966981f45a"}
Mar 09 14:36:05 crc kubenswrapper[4764]: I0309 14:36:05.261160 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0839267fab67af7136fa26619fbb2672ecbc2c656e8177e7e69640966981f45a"
Mar 09 14:36:05 crc kubenswrapper[4764]: I0309 14:36:05.292108 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-88bdr\" (UniqueName: \"kubernetes.io/projected/25d1b986-e3dd-4d1d-9c3f-1c7c3a5c22f6-kube-api-access-88bdr\") on node \"crc\" DevicePath \"\""
Mar 09 14:36:05 crc kubenswrapper[4764]: I0309 14:36:05.685599 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29551110-sjw8t"]
Mar 09 14:36:05 crc kubenswrapper[4764]: I0309 14:36:05.694409 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29551110-sjw8t"]
Mar 09 14:36:07 crc kubenswrapper[4764]: I0309 14:36:07.571277 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44063b01-0b96-488c-98af-43cdb752467e" path="/var/lib/kubelet/pods/44063b01-0b96-488c-98af-43cdb752467e/volumes"
Mar 09 14:36:12 crc kubenswrapper[4764]: I0309 14:36:12.560472 4764 scope.go:117] "RemoveContainer" containerID="8dbfa76f047a7530b5728e4d5ecc2633e67095c7831f8ebe288b219074642ce2"
Mar 09 14:36:12 crc kubenswrapper[4764]: E0309 14:36:12.561708 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xxczl_openshift-machine-config-operator(6bcdd179-43c2-427c-9fac-7155c122e922)\"" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922"
Mar 09 14:36:12 crc kubenswrapper[4764]: I0309 14:36:12.935194 4764 scope.go:117] "RemoveContainer" containerID="b7f1a91e8b51914a12830dcf15f240318b51a9141210e70d3ec57af5cc977455"
Mar 09 14:36:24 crc kubenswrapper[4764]: I0309 14:36:24.563295 4764 scope.go:117] "RemoveContainer" containerID="8dbfa76f047a7530b5728e4d5ecc2633e67095c7831f8ebe288b219074642ce2"
Mar 09 14:36:24 crc kubenswrapper[4764]: E0309 14:36:24.564365 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xxczl_openshift-machine-config-operator(6bcdd179-43c2-427c-9fac-7155c122e922)\"" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922"
Mar 09 14:36:39 crc kubenswrapper[4764]: I0309 14:36:39.560835 4764 scope.go:117] "RemoveContainer" containerID="8dbfa76f047a7530b5728e4d5ecc2633e67095c7831f8ebe288b219074642ce2"
Mar 09 14:36:39 crc kubenswrapper[4764]: E0309 14:36:39.562172 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xxczl_openshift-machine-config-operator(6bcdd179-43c2-427c-9fac-7155c122e922)\"" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922"
Mar 09 14:36:42 crc kubenswrapper[4764]: I0309 14:36:42.649533 4764 generic.go:334] "Generic (PLEG): container finished" podID="f4069cd4-c4ea-4c35-a8e3-231f40655d27" containerID="3751c9a7f60cedf52e5109a5d05ced7856abcae8f6d88733462034a07970747a" exitCode=0
Mar 09 14:36:42 crc kubenswrapper[4764]: I0309 14:36:42.649721 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-2n85d/must-gather-hmzh8" event={"ID":"f4069cd4-c4ea-4c35-a8e3-231f40655d27","Type":"ContainerDied","Data":"3751c9a7f60cedf52e5109a5d05ced7856abcae8f6d88733462034a07970747a"}
Mar 09 14:36:42 crc kubenswrapper[4764]: I0309 14:36:42.651310 4764 scope.go:117] "RemoveContainer" containerID="3751c9a7f60cedf52e5109a5d05ced7856abcae8f6d88733462034a07970747a"
Mar 09 14:36:42 crc kubenswrapper[4764]: I0309 14:36:42.798005 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-2n85d_must-gather-hmzh8_f4069cd4-c4ea-4c35-a8e3-231f40655d27/gather/0.log"
Mar 09 14:36:51 crc kubenswrapper[4764]: I0309 14:36:51.560346 4764 scope.go:117] "RemoveContainer" containerID="8dbfa76f047a7530b5728e4d5ecc2633e67095c7831f8ebe288b219074642ce2"
Mar 09 14:36:51 crc kubenswrapper[4764]: E0309 14:36:51.561592 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xxczl_openshift-machine-config-operator(6bcdd179-43c2-427c-9fac-7155c122e922)\"" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922"
Mar 09 14:36:51 crc kubenswrapper[4764]: I0309 14:36:51.589063 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-2n85d/must-gather-hmzh8"]
Mar 09 14:36:51 crc kubenswrapper[4764]: I0309 14:36:51.589402 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-2n85d/must-gather-hmzh8" podUID="f4069cd4-c4ea-4c35-a8e3-231f40655d27" containerName="copy" containerID="cri-o://7a1d05f1a79149bd8bced16f5f4fb67221a689028074b5bf7fc3e37c2411e2d9" gracePeriod=2
Mar 09 14:36:51 crc kubenswrapper[4764]: I0309 14:36:51.605606 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-2n85d/must-gather-hmzh8"]
Mar 09 14:36:51 crc kubenswrapper[4764]: I0309 14:36:51.746887 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-2n85d_must-gather-hmzh8_f4069cd4-c4ea-4c35-a8e3-231f40655d27/copy/0.log"
Mar 09 14:36:51 crc kubenswrapper[4764]: I0309 14:36:51.747766 4764 generic.go:334] "Generic (PLEG): container finished" podID="f4069cd4-c4ea-4c35-a8e3-231f40655d27" containerID="7a1d05f1a79149bd8bced16f5f4fb67221a689028074b5bf7fc3e37c2411e2d9" exitCode=143
Mar 09 14:36:52 crc kubenswrapper[4764]: I0309 14:36:52.049522 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-2n85d_must-gather-hmzh8_f4069cd4-c4ea-4c35-a8e3-231f40655d27/copy/0.log"
Mar 09 14:36:52 crc kubenswrapper[4764]: I0309 14:36:52.049980 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-2n85d/must-gather-hmzh8"
Mar 09 14:36:52 crc kubenswrapper[4764]: I0309 14:36:52.138124 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/f4069cd4-c4ea-4c35-a8e3-231f40655d27-must-gather-output\") pod \"f4069cd4-c4ea-4c35-a8e3-231f40655d27\" (UID: \"f4069cd4-c4ea-4c35-a8e3-231f40655d27\") "
Mar 09 14:36:52 crc kubenswrapper[4764]: I0309 14:36:52.138544 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8bslj\" (UniqueName: \"kubernetes.io/projected/f4069cd4-c4ea-4c35-a8e3-231f40655d27-kube-api-access-8bslj\") pod \"f4069cd4-c4ea-4c35-a8e3-231f40655d27\" (UID: \"f4069cd4-c4ea-4c35-a8e3-231f40655d27\") "
Mar 09 14:36:52 crc kubenswrapper[4764]: I0309 14:36:52.191657 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f4069cd4-c4ea-4c35-a8e3-231f40655d27-kube-api-access-8bslj" (OuterVolumeSpecName: "kube-api-access-8bslj") pod "f4069cd4-c4ea-4c35-a8e3-231f40655d27" (UID: "f4069cd4-c4ea-4c35-a8e3-231f40655d27"). InnerVolumeSpecName "kube-api-access-8bslj". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 14:36:52 crc kubenswrapper[4764]: I0309 14:36:52.242532 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8bslj\" (UniqueName: \"kubernetes.io/projected/f4069cd4-c4ea-4c35-a8e3-231f40655d27-kube-api-access-8bslj\") on node \"crc\" DevicePath \"\""
Mar 09 14:36:52 crc kubenswrapper[4764]: I0309 14:36:52.381446 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f4069cd4-c4ea-4c35-a8e3-231f40655d27-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "f4069cd4-c4ea-4c35-a8e3-231f40655d27" (UID: "f4069cd4-c4ea-4c35-a8e3-231f40655d27"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 09 14:36:52 crc kubenswrapper[4764]: I0309 14:36:52.447007 4764 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/f4069cd4-c4ea-4c35-a8e3-231f40655d27-must-gather-output\") on node \"crc\" DevicePath \"\""
Mar 09 14:36:52 crc kubenswrapper[4764]: I0309 14:36:52.759014 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-2n85d_must-gather-hmzh8_f4069cd4-c4ea-4c35-a8e3-231f40655d27/copy/0.log"
Mar 09 14:36:52 crc kubenswrapper[4764]: I0309 14:36:52.759425 4764 scope.go:117] "RemoveContainer" containerID="7a1d05f1a79149bd8bced16f5f4fb67221a689028074b5bf7fc3e37c2411e2d9"
Mar 09 14:36:52 crc kubenswrapper[4764]: I0309 14:36:52.759461 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-2n85d/must-gather-hmzh8"
Mar 09 14:36:52 crc kubenswrapper[4764]: I0309 14:36:52.788005 4764 scope.go:117] "RemoveContainer" containerID="3751c9a7f60cedf52e5109a5d05ced7856abcae8f6d88733462034a07970747a"
Mar 09 14:36:53 crc kubenswrapper[4764]: I0309 14:36:53.572883 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4069cd4-c4ea-4c35-a8e3-231f40655d27" path="/var/lib/kubelet/pods/f4069cd4-c4ea-4c35-a8e3-231f40655d27/volumes"
Mar 09 14:37:03 crc kubenswrapper[4764]: I0309 14:37:03.560772 4764 scope.go:117] "RemoveContainer" containerID="8dbfa76f047a7530b5728e4d5ecc2633e67095c7831f8ebe288b219074642ce2"
Mar 09 14:37:03 crc kubenswrapper[4764]: E0309 14:37:03.564012 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xxczl_openshift-machine-config-operator(6bcdd179-43c2-427c-9fac-7155c122e922)\"" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922"
Mar 09 14:37:13 crc kubenswrapper[4764]: I0309 14:37:13.009509 4764 scope.go:117] "RemoveContainer" containerID="35306a339ac014f00b9490fd62d260c20edfb15b36c977f1b4e50e10596530a4"
Mar 09 14:37:18 crc kubenswrapper[4764]: I0309 14:37:18.560670 4764 scope.go:117] "RemoveContainer" containerID="8dbfa76f047a7530b5728e4d5ecc2633e67095c7831f8ebe288b219074642ce2"
Mar 09 14:37:18 crc kubenswrapper[4764]: E0309 14:37:18.562994 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xxczl_openshift-machine-config-operator(6bcdd179-43c2-427c-9fac-7155c122e922)\"" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922"
Mar 09 14:37:33 crc kubenswrapper[4764]: I0309 14:37:33.560855 4764 scope.go:117] "RemoveContainer" containerID="8dbfa76f047a7530b5728e4d5ecc2633e67095c7831f8ebe288b219074642ce2"
Mar 09 14:37:33 crc kubenswrapper[4764]: E0309 14:37:33.561925 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xxczl_openshift-machine-config-operator(6bcdd179-43c2-427c-9fac-7155c122e922)\"" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922"
Mar 09 14:37:45 crc kubenswrapper[4764]: I0309 14:37:45.567181 4764 scope.go:117] "RemoveContainer" containerID="8dbfa76f047a7530b5728e4d5ecc2633e67095c7831f8ebe288b219074642ce2"
Mar 09 14:37:45 crc kubenswrapper[4764]: E0309 14:37:45.568313 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xxczl_openshift-machine-config-operator(6bcdd179-43c2-427c-9fac-7155c122e922)\"" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922"
Mar 09 14:37:58 crc kubenswrapper[4764]: I0309 14:37:58.560193 4764 scope.go:117] "RemoveContainer" containerID="8dbfa76f047a7530b5728e4d5ecc2633e67095c7831f8ebe288b219074642ce2"
Mar 09 14:37:58 crc kubenswrapper[4764]: E0309 14:37:58.561279 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xxczl_openshift-machine-config-operator(6bcdd179-43c2-427c-9fac-7155c122e922)\"" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922"
Mar 09 14:38:00 crc kubenswrapper[4764]: I0309 14:38:00.154320 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29551118-kjz4g"]
Mar 09 14:38:00 crc kubenswrapper[4764]: E0309 14:38:00.155778 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4069cd4-c4ea-4c35-a8e3-231f40655d27" containerName="copy"
Mar 09 14:38:00 crc kubenswrapper[4764]: I0309 14:38:00.155797 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4069cd4-c4ea-4c35-a8e3-231f40655d27" containerName="copy"
Mar 09 14:38:00 crc kubenswrapper[4764]: E0309 14:38:00.155815 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="25d1b986-e3dd-4d1d-9c3f-1c7c3a5c22f6" containerName="oc"
Mar 09 14:38:00 crc kubenswrapper[4764]: I0309 14:38:00.155824 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="25d1b986-e3dd-4d1d-9c3f-1c7c3a5c22f6" containerName="oc"
Mar 09 14:38:00 crc kubenswrapper[4764]: E0309 14:38:00.155853 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4069cd4-c4ea-4c35-a8e3-231f40655d27" containerName="gather"
Mar 09 14:38:00 crc kubenswrapper[4764]: I0309 14:38:00.156001 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4069cd4-c4ea-4c35-a8e3-231f40655d27" containerName="gather"
Mar 09 14:38:00 crc kubenswrapper[4764]: I0309 14:38:00.156308 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4069cd4-c4ea-4c35-a8e3-231f40655d27" containerName="copy"
Mar 09 14:38:00 crc kubenswrapper[4764]: I0309 14:38:00.156340 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4069cd4-c4ea-4c35-a8e3-231f40655d27" containerName="gather"
Mar 09 14:38:00 crc kubenswrapper[4764]: I0309 14:38:00.156363 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="25d1b986-e3dd-4d1d-9c3f-1c7c3a5c22f6" containerName="oc"
Mar 09 14:38:00 crc kubenswrapper[4764]: I0309 14:38:00.157396 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551118-kjz4g"
Mar 09 14:38:00 crc kubenswrapper[4764]: I0309 14:38:00.160255 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-4s7mr"
Mar 09 14:38:00 crc kubenswrapper[4764]: I0309 14:38:00.160262 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 09 14:38:00 crc kubenswrapper[4764]: I0309 14:38:00.161685 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 09 14:38:00 crc kubenswrapper[4764]: I0309 14:38:00.164076 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551118-kjz4g"]
Mar 09 14:38:00 crc kubenswrapper[4764]: I0309 14:38:00.331539 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5pjvr\" (UniqueName: \"kubernetes.io/projected/f04448f5-694b-46ae-9599-546c7bbe0c14-kube-api-access-5pjvr\") pod \"auto-csr-approver-29551118-kjz4g\" (UID: \"f04448f5-694b-46ae-9599-546c7bbe0c14\") " pod="openshift-infra/auto-csr-approver-29551118-kjz4g"
Mar 09 14:38:00 crc kubenswrapper[4764]: I0309 14:38:00.433343 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5pjvr\" (UniqueName: \"kubernetes.io/projected/f04448f5-694b-46ae-9599-546c7bbe0c14-kube-api-access-5pjvr\") pod \"auto-csr-approver-29551118-kjz4g\" (UID: \"f04448f5-694b-46ae-9599-546c7bbe0c14\") " pod="openshift-infra/auto-csr-approver-29551118-kjz4g"
Mar 09 14:38:00 crc kubenswrapper[4764]: I0309 14:38:00.885711 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5pjvr\" (UniqueName: \"kubernetes.io/projected/f04448f5-694b-46ae-9599-546c7bbe0c14-kube-api-access-5pjvr\") pod \"auto-csr-approver-29551118-kjz4g\" (UID: \"f04448f5-694b-46ae-9599-546c7bbe0c14\") " pod="openshift-infra/auto-csr-approver-29551118-kjz4g"
Mar 09 14:38:01 crc kubenswrapper[4764]: I0309 14:38:01.078703 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551118-kjz4g"
Mar 09 14:38:01 crc kubenswrapper[4764]: I0309 14:38:01.575667 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551118-kjz4g"]
Mar 09 14:38:02 crc kubenswrapper[4764]: I0309 14:38:02.571554 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551118-kjz4g" event={"ID":"f04448f5-694b-46ae-9599-546c7bbe0c14","Type":"ContainerStarted","Data":"0ed8f0ec55a294187ea8be7f43ebadf8b14bdebd77fa0465ade1a0a2c4234e1a"}
Mar 09 14:38:03 crc kubenswrapper[4764]: I0309 14:38:03.584701 4764 generic.go:334] "Generic (PLEG): container finished" podID="f04448f5-694b-46ae-9599-546c7bbe0c14" containerID="71c2dc76a892ba46131f8abbf7343dc29824edff40c7100aa8bae5d80219ca9b" exitCode=0
Mar 09 14:38:03 crc kubenswrapper[4764]: I0309 14:38:03.584815 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551118-kjz4g" event={"ID":"f04448f5-694b-46ae-9599-546c7bbe0c14","Type":"ContainerDied","Data":"71c2dc76a892ba46131f8abbf7343dc29824edff40c7100aa8bae5d80219ca9b"}
Mar 09 14:38:04 crc kubenswrapper[4764]: I0309 14:38:04.998012 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551118-kjz4g"
Mar 09 14:38:05 crc kubenswrapper[4764]: I0309 14:38:05.151512 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5pjvr\" (UniqueName: \"kubernetes.io/projected/f04448f5-694b-46ae-9599-546c7bbe0c14-kube-api-access-5pjvr\") pod \"f04448f5-694b-46ae-9599-546c7bbe0c14\" (UID: \"f04448f5-694b-46ae-9599-546c7bbe0c14\") "
Mar 09 14:38:05 crc kubenswrapper[4764]: I0309 14:38:05.161046 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f04448f5-694b-46ae-9599-546c7bbe0c14-kube-api-access-5pjvr" (OuterVolumeSpecName: "kube-api-access-5pjvr") pod "f04448f5-694b-46ae-9599-546c7bbe0c14" (UID: "f04448f5-694b-46ae-9599-546c7bbe0c14"). InnerVolumeSpecName "kube-api-access-5pjvr". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 14:38:05 crc kubenswrapper[4764]: I0309 14:38:05.255157 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5pjvr\" (UniqueName: \"kubernetes.io/projected/f04448f5-694b-46ae-9599-546c7bbe0c14-kube-api-access-5pjvr\") on node \"crc\" DevicePath \"\""
Mar 09 14:38:05 crc kubenswrapper[4764]: I0309 14:38:05.613432 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551118-kjz4g" event={"ID":"f04448f5-694b-46ae-9599-546c7bbe0c14","Type":"ContainerDied","Data":"0ed8f0ec55a294187ea8be7f43ebadf8b14bdebd77fa0465ade1a0a2c4234e1a"}
Mar 09 14:38:05 crc kubenswrapper[4764]: I0309 14:38:05.614113 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0ed8f0ec55a294187ea8be7f43ebadf8b14bdebd77fa0465ade1a0a2c4234e1a"
Mar 09 14:38:05 crc kubenswrapper[4764]: I0309 14:38:05.613799 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551118-kjz4g"
Mar 09 14:38:06 crc kubenswrapper[4764]: I0309 14:38:06.075067 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29551112-lcwnw"]
Mar 09 14:38:06 crc kubenswrapper[4764]: I0309 14:38:06.084590 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29551112-lcwnw"]
Mar 09 14:38:07 crc kubenswrapper[4764]: I0309 14:38:07.574412 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3078e21d-b42c-45f0-94c0-d3980ec27f1f" path="/var/lib/kubelet/pods/3078e21d-b42c-45f0-94c0-d3980ec27f1f/volumes"
Mar 09 14:38:12 crc kubenswrapper[4764]: I0309 14:38:12.560132 4764 scope.go:117] "RemoveContainer" containerID="8dbfa76f047a7530b5728e4d5ecc2633e67095c7831f8ebe288b219074642ce2"
Mar 09 14:38:12 crc kubenswrapper[4764]: E0309 14:38:12.561372 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xxczl_openshift-machine-config-operator(6bcdd179-43c2-427c-9fac-7155c122e922)\"" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922"
Mar 09 14:38:13 crc kubenswrapper[4764]: I0309 14:38:13.142054 4764 scope.go:117] "RemoveContainer" containerID="6874ccb9508c9920d5940aa912a69fc71adf07efd3d1ffdbd80a9373286c8c70"
Mar 09 14:38:25 crc kubenswrapper[4764]: I0309 14:38:25.567108 4764 scope.go:117] "RemoveContainer" containerID="8dbfa76f047a7530b5728e4d5ecc2633e67095c7831f8ebe288b219074642ce2"
Mar 09 14:38:25 crc kubenswrapper[4764]: E0309 14:38:25.568486 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xxczl_openshift-machine-config-operator(6bcdd179-43c2-427c-9fac-7155c122e922)\"" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922"
Mar 09 14:38:38 crc kubenswrapper[4764]: I0309 14:38:38.559669 4764 scope.go:117] "RemoveContainer" containerID="8dbfa76f047a7530b5728e4d5ecc2633e67095c7831f8ebe288b219074642ce2"
Mar 09 14:38:38 crc kubenswrapper[4764]: I0309 14:38:38.953859 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" event={"ID":"6bcdd179-43c2-427c-9fac-7155c122e922","Type":"ContainerStarted","Data":"ad419096bfe6f5c82a80213c43472ce393de4b635e5b73b0e84a78e63ff05edb"}
Mar 09 14:40:00 crc kubenswrapper[4764]: I0309 14:40:00.163567 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29551120-m5thp"]
Mar 09 14:40:00 crc kubenswrapper[4764]: E0309 14:40:00.165105 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f04448f5-694b-46ae-9599-546c7bbe0c14" containerName="oc"
Mar 09 14:40:00 crc kubenswrapper[4764]: I0309 14:40:00.165125 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="f04448f5-694b-46ae-9599-546c7bbe0c14" containerName="oc"
Mar 09 14:40:00 crc kubenswrapper[4764]: I0309 14:40:00.165490 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="f04448f5-694b-46ae-9599-546c7bbe0c14" containerName="oc"
Mar 09 14:40:00 crc kubenswrapper[4764]: I0309 14:40:00.166516 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551120-m5thp" Mar 09 14:40:00 crc kubenswrapper[4764]: I0309 14:40:00.170019 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-4s7mr" Mar 09 14:40:00 crc kubenswrapper[4764]: I0309 14:40:00.170095 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 09 14:40:00 crc kubenswrapper[4764]: I0309 14:40:00.170394 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 09 14:40:00 crc kubenswrapper[4764]: I0309 14:40:00.189390 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551120-m5thp"] Mar 09 14:40:00 crc kubenswrapper[4764]: I0309 14:40:00.269497 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xh8gd\" (UniqueName: \"kubernetes.io/projected/0816f954-d7d8-485c-80c2-f37396ccc846-kube-api-access-xh8gd\") pod \"auto-csr-approver-29551120-m5thp\" (UID: \"0816f954-d7d8-485c-80c2-f37396ccc846\") " pod="openshift-infra/auto-csr-approver-29551120-m5thp" Mar 09 14:40:00 crc kubenswrapper[4764]: I0309 14:40:00.371719 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xh8gd\" (UniqueName: \"kubernetes.io/projected/0816f954-d7d8-485c-80c2-f37396ccc846-kube-api-access-xh8gd\") pod \"auto-csr-approver-29551120-m5thp\" (UID: \"0816f954-d7d8-485c-80c2-f37396ccc846\") " pod="openshift-infra/auto-csr-approver-29551120-m5thp" Mar 09 14:40:00 crc kubenswrapper[4764]: I0309 14:40:00.393523 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xh8gd\" (UniqueName: \"kubernetes.io/projected/0816f954-d7d8-485c-80c2-f37396ccc846-kube-api-access-xh8gd\") pod \"auto-csr-approver-29551120-m5thp\" (UID: \"0816f954-d7d8-485c-80c2-f37396ccc846\") " 
pod="openshift-infra/auto-csr-approver-29551120-m5thp" Mar 09 14:40:00 crc kubenswrapper[4764]: I0309 14:40:00.491735 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551120-m5thp" Mar 09 14:40:01 crc kubenswrapper[4764]: I0309 14:40:01.006936 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551120-m5thp"] Mar 09 14:40:01 crc kubenswrapper[4764]: I0309 14:40:01.829778 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551120-m5thp" event={"ID":"0816f954-d7d8-485c-80c2-f37396ccc846","Type":"ContainerStarted","Data":"418d95bf9922ee6c68fc6ee54bf92d132a0fa80cbe02cb056aa7e9af26613f17"} Mar 09 14:40:02 crc kubenswrapper[4764]: I0309 14:40:02.454368 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-fwqd6"] Mar 09 14:40:02 crc kubenswrapper[4764]: I0309 14:40:02.457178 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-fwqd6" Mar 09 14:40:02 crc kubenswrapper[4764]: I0309 14:40:02.465447 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-fwqd6"] Mar 09 14:40:02 crc kubenswrapper[4764]: I0309 14:40:02.531152 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a9f449aa-58d0-4541-b02d-f7240113d330-utilities\") pod \"certified-operators-fwqd6\" (UID: \"a9f449aa-58d0-4541-b02d-f7240113d330\") " pod="openshift-marketplace/certified-operators-fwqd6" Mar 09 14:40:02 crc kubenswrapper[4764]: I0309 14:40:02.531528 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a9f449aa-58d0-4541-b02d-f7240113d330-catalog-content\") pod \"certified-operators-fwqd6\" (UID: \"a9f449aa-58d0-4541-b02d-f7240113d330\") " pod="openshift-marketplace/certified-operators-fwqd6" Mar 09 14:40:02 crc kubenswrapper[4764]: I0309 14:40:02.531750 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g6dqh\" (UniqueName: \"kubernetes.io/projected/a9f449aa-58d0-4541-b02d-f7240113d330-kube-api-access-g6dqh\") pod \"certified-operators-fwqd6\" (UID: \"a9f449aa-58d0-4541-b02d-f7240113d330\") " pod="openshift-marketplace/certified-operators-fwqd6" Mar 09 14:40:02 crc kubenswrapper[4764]: I0309 14:40:02.634052 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a9f449aa-58d0-4541-b02d-f7240113d330-utilities\") pod \"certified-operators-fwqd6\" (UID: \"a9f449aa-58d0-4541-b02d-f7240113d330\") " pod="openshift-marketplace/certified-operators-fwqd6" Mar 09 14:40:02 crc kubenswrapper[4764]: I0309 14:40:02.634120 4764 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a9f449aa-58d0-4541-b02d-f7240113d330-catalog-content\") pod \"certified-operators-fwqd6\" (UID: \"a9f449aa-58d0-4541-b02d-f7240113d330\") " pod="openshift-marketplace/certified-operators-fwqd6" Mar 09 14:40:02 crc kubenswrapper[4764]: I0309 14:40:02.634153 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g6dqh\" (UniqueName: \"kubernetes.io/projected/a9f449aa-58d0-4541-b02d-f7240113d330-kube-api-access-g6dqh\") pod \"certified-operators-fwqd6\" (UID: \"a9f449aa-58d0-4541-b02d-f7240113d330\") " pod="openshift-marketplace/certified-operators-fwqd6" Mar 09 14:40:02 crc kubenswrapper[4764]: I0309 14:40:02.635508 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a9f449aa-58d0-4541-b02d-f7240113d330-catalog-content\") pod \"certified-operators-fwqd6\" (UID: \"a9f449aa-58d0-4541-b02d-f7240113d330\") " pod="openshift-marketplace/certified-operators-fwqd6" Mar 09 14:40:02 crc kubenswrapper[4764]: I0309 14:40:02.636097 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a9f449aa-58d0-4541-b02d-f7240113d330-utilities\") pod \"certified-operators-fwqd6\" (UID: \"a9f449aa-58d0-4541-b02d-f7240113d330\") " pod="openshift-marketplace/certified-operators-fwqd6" Mar 09 14:40:02 crc kubenswrapper[4764]: I0309 14:40:02.665495 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g6dqh\" (UniqueName: \"kubernetes.io/projected/a9f449aa-58d0-4541-b02d-f7240113d330-kube-api-access-g6dqh\") pod \"certified-operators-fwqd6\" (UID: \"a9f449aa-58d0-4541-b02d-f7240113d330\") " pod="openshift-marketplace/certified-operators-fwqd6" Mar 09 14:40:02 crc kubenswrapper[4764]: I0309 14:40:02.817168 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-fwqd6" Mar 09 14:40:02 crc kubenswrapper[4764]: I0309 14:40:02.848869 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551120-m5thp" event={"ID":"0816f954-d7d8-485c-80c2-f37396ccc846","Type":"ContainerStarted","Data":"47825136b03ce00e9b8dbc1f9567349d8325a39be5d441fbdcc3a705bf988cac"} Mar 09 14:40:03 crc kubenswrapper[4764]: I0309 14:40:03.165415 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29551120-m5thp" podStartSLOduration=1.892993332 podStartE2EDuration="3.165391616s" podCreationTimestamp="2026-03-09 14:40:00 +0000 UTC" firstStartedPulling="2026-03-09 14:40:01.014022091 +0000 UTC m=+4756.264194159" lastFinishedPulling="2026-03-09 14:40:02.286420535 +0000 UTC m=+4757.536592443" observedRunningTime="2026-03-09 14:40:02.874172296 +0000 UTC m=+4758.124344204" watchObservedRunningTime="2026-03-09 14:40:03.165391616 +0000 UTC m=+4758.415563544" Mar 09 14:40:03 crc kubenswrapper[4764]: I0309 14:40:03.173788 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-fwqd6"] Mar 09 14:40:03 crc kubenswrapper[4764]: W0309 14:40:03.191454 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda9f449aa_58d0_4541_b02d_f7240113d330.slice/crio-c486688c64fb082ba7d8532410236dfd36e9116f3c410bc3d5d352e3a5c715f8 WatchSource:0}: Error finding container c486688c64fb082ba7d8532410236dfd36e9116f3c410bc3d5d352e3a5c715f8: Status 404 returned error can't find the container with id c486688c64fb082ba7d8532410236dfd36e9116f3c410bc3d5d352e3a5c715f8 Mar 09 14:40:03 crc kubenswrapper[4764]: I0309 14:40:03.862198 4764 generic.go:334] "Generic (PLEG): container finished" podID="0816f954-d7d8-485c-80c2-f37396ccc846" containerID="47825136b03ce00e9b8dbc1f9567349d8325a39be5d441fbdcc3a705bf988cac" 
exitCode=0 Mar 09 14:40:03 crc kubenswrapper[4764]: I0309 14:40:03.862325 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551120-m5thp" event={"ID":"0816f954-d7d8-485c-80c2-f37396ccc846","Type":"ContainerDied","Data":"47825136b03ce00e9b8dbc1f9567349d8325a39be5d441fbdcc3a705bf988cac"} Mar 09 14:40:03 crc kubenswrapper[4764]: I0309 14:40:03.866765 4764 generic.go:334] "Generic (PLEG): container finished" podID="a9f449aa-58d0-4541-b02d-f7240113d330" containerID="25d756717148db03276e3bb732fb1351545554969bae0b614cc5f1920c670594" exitCode=0 Mar 09 14:40:03 crc kubenswrapper[4764]: I0309 14:40:03.866801 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fwqd6" event={"ID":"a9f449aa-58d0-4541-b02d-f7240113d330","Type":"ContainerDied","Data":"25d756717148db03276e3bb732fb1351545554969bae0b614cc5f1920c670594"} Mar 09 14:40:03 crc kubenswrapper[4764]: I0309 14:40:03.866825 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fwqd6" event={"ID":"a9f449aa-58d0-4541-b02d-f7240113d330","Type":"ContainerStarted","Data":"c486688c64fb082ba7d8532410236dfd36e9116f3c410bc3d5d352e3a5c715f8"} Mar 09 14:40:05 crc kubenswrapper[4764]: I0309 14:40:05.328764 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551120-m5thp" Mar 09 14:40:05 crc kubenswrapper[4764]: I0309 14:40:05.421948 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xh8gd\" (UniqueName: \"kubernetes.io/projected/0816f954-d7d8-485c-80c2-f37396ccc846-kube-api-access-xh8gd\") pod \"0816f954-d7d8-485c-80c2-f37396ccc846\" (UID: \"0816f954-d7d8-485c-80c2-f37396ccc846\") " Mar 09 14:40:05 crc kubenswrapper[4764]: I0309 14:40:05.430105 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0816f954-d7d8-485c-80c2-f37396ccc846-kube-api-access-xh8gd" (OuterVolumeSpecName: "kube-api-access-xh8gd") pod "0816f954-d7d8-485c-80c2-f37396ccc846" (UID: "0816f954-d7d8-485c-80c2-f37396ccc846"). InnerVolumeSpecName "kube-api-access-xh8gd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:40:05 crc kubenswrapper[4764]: I0309 14:40:05.524818 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xh8gd\" (UniqueName: \"kubernetes.io/projected/0816f954-d7d8-485c-80c2-f37396ccc846-kube-api-access-xh8gd\") on node \"crc\" DevicePath \"\"" Mar 09 14:40:05 crc kubenswrapper[4764]: I0309 14:40:05.894306 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fwqd6" event={"ID":"a9f449aa-58d0-4541-b02d-f7240113d330","Type":"ContainerStarted","Data":"72f615c2613239454787d89a4736ccba1f726574b352ae01d064ac1a12aa73a8"} Mar 09 14:40:05 crc kubenswrapper[4764]: I0309 14:40:05.897725 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551120-m5thp" event={"ID":"0816f954-d7d8-485c-80c2-f37396ccc846","Type":"ContainerDied","Data":"418d95bf9922ee6c68fc6ee54bf92d132a0fa80cbe02cb056aa7e9af26613f17"} Mar 09 14:40:05 crc kubenswrapper[4764]: I0309 14:40:05.897798 4764 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="418d95bf9922ee6c68fc6ee54bf92d132a0fa80cbe02cb056aa7e9af26613f17" Mar 09 14:40:05 crc kubenswrapper[4764]: I0309 14:40:05.897847 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551120-m5thp" Mar 09 14:40:05 crc kubenswrapper[4764]: I0309 14:40:05.983088 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29551114-9w2wl"] Mar 09 14:40:05 crc kubenswrapper[4764]: I0309 14:40:05.996293 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29551114-9w2wl"] Mar 09 14:40:06 crc kubenswrapper[4764]: I0309 14:40:06.911196 4764 generic.go:334] "Generic (PLEG): container finished" podID="a9f449aa-58d0-4541-b02d-f7240113d330" containerID="72f615c2613239454787d89a4736ccba1f726574b352ae01d064ac1a12aa73a8" exitCode=0 Mar 09 14:40:06 crc kubenswrapper[4764]: I0309 14:40:06.911256 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fwqd6" event={"ID":"a9f449aa-58d0-4541-b02d-f7240113d330","Type":"ContainerDied","Data":"72f615c2613239454787d89a4736ccba1f726574b352ae01d064ac1a12aa73a8"} Mar 09 14:40:07 crc kubenswrapper[4764]: I0309 14:40:07.577776 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3b063d56-a0eb-4b2d-8b53-2c63feead99e" path="/var/lib/kubelet/pods/3b063d56-a0eb-4b2d-8b53-2c63feead99e/volumes" Mar 09 14:40:07 crc kubenswrapper[4764]: I0309 14:40:07.928983 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fwqd6" event={"ID":"a9f449aa-58d0-4541-b02d-f7240113d330","Type":"ContainerStarted","Data":"974be5246029785143e51ade81934acd7da98f7f6ac1d264e3c5b0a2924db155"} Mar 09 14:40:07 crc kubenswrapper[4764]: I0309 14:40:07.965130 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-fwqd6" podStartSLOduration=2.443424111 
podStartE2EDuration="5.965101926s" podCreationTimestamp="2026-03-09 14:40:02 +0000 UTC" firstStartedPulling="2026-03-09 14:40:03.869787755 +0000 UTC m=+4759.119959673" lastFinishedPulling="2026-03-09 14:40:07.39146558 +0000 UTC m=+4762.641637488" observedRunningTime="2026-03-09 14:40:07.949386146 +0000 UTC m=+4763.199558054" watchObservedRunningTime="2026-03-09 14:40:07.965101926 +0000 UTC m=+4763.215273834" Mar 09 14:40:12 crc kubenswrapper[4764]: I0309 14:40:12.818876 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-fwqd6" Mar 09 14:40:12 crc kubenswrapper[4764]: I0309 14:40:12.819990 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-fwqd6" Mar 09 14:40:12 crc kubenswrapper[4764]: I0309 14:40:12.880691 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-fwqd6" Mar 09 14:40:13 crc kubenswrapper[4764]: I0309 14:40:13.036043 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-fwqd6" Mar 09 14:40:13 crc kubenswrapper[4764]: I0309 14:40:13.125407 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-fwqd6"] Mar 09 14:40:13 crc kubenswrapper[4764]: I0309 14:40:13.269340 4764 scope.go:117] "RemoveContainer" containerID="ea38c17635079a211c8b307953c91fb9e37a06f4db03c9c46e225e504f02afec" Mar 09 14:40:15 crc kubenswrapper[4764]: I0309 14:40:15.003705 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-fwqd6" podUID="a9f449aa-58d0-4541-b02d-f7240113d330" containerName="registry-server" containerID="cri-o://974be5246029785143e51ade81934acd7da98f7f6ac1d264e3c5b0a2924db155" gracePeriod=2 Mar 09 14:40:15 crc kubenswrapper[4764]: I0309 14:40:15.474877 4764 util.go:48] "No ready sandbox for 
pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-fwqd6" Mar 09 14:40:15 crc kubenswrapper[4764]: I0309 14:40:15.584796 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a9f449aa-58d0-4541-b02d-f7240113d330-catalog-content\") pod \"a9f449aa-58d0-4541-b02d-f7240113d330\" (UID: \"a9f449aa-58d0-4541-b02d-f7240113d330\") " Mar 09 14:40:15 crc kubenswrapper[4764]: I0309 14:40:15.585070 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a9f449aa-58d0-4541-b02d-f7240113d330-utilities\") pod \"a9f449aa-58d0-4541-b02d-f7240113d330\" (UID: \"a9f449aa-58d0-4541-b02d-f7240113d330\") " Mar 09 14:40:15 crc kubenswrapper[4764]: I0309 14:40:15.585188 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g6dqh\" (UniqueName: \"kubernetes.io/projected/a9f449aa-58d0-4541-b02d-f7240113d330-kube-api-access-g6dqh\") pod \"a9f449aa-58d0-4541-b02d-f7240113d330\" (UID: \"a9f449aa-58d0-4541-b02d-f7240113d330\") " Mar 09 14:40:15 crc kubenswrapper[4764]: I0309 14:40:15.587989 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a9f449aa-58d0-4541-b02d-f7240113d330-utilities" (OuterVolumeSpecName: "utilities") pod "a9f449aa-58d0-4541-b02d-f7240113d330" (UID: "a9f449aa-58d0-4541-b02d-f7240113d330"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 14:40:15 crc kubenswrapper[4764]: I0309 14:40:15.595024 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a9f449aa-58d0-4541-b02d-f7240113d330-kube-api-access-g6dqh" (OuterVolumeSpecName: "kube-api-access-g6dqh") pod "a9f449aa-58d0-4541-b02d-f7240113d330" (UID: "a9f449aa-58d0-4541-b02d-f7240113d330"). InnerVolumeSpecName "kube-api-access-g6dqh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:40:15 crc kubenswrapper[4764]: I0309 14:40:15.687331 4764 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a9f449aa-58d0-4541-b02d-f7240113d330-utilities\") on node \"crc\" DevicePath \"\"" Mar 09 14:40:15 crc kubenswrapper[4764]: I0309 14:40:15.687369 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g6dqh\" (UniqueName: \"kubernetes.io/projected/a9f449aa-58d0-4541-b02d-f7240113d330-kube-api-access-g6dqh\") on node \"crc\" DevicePath \"\"" Mar 09 14:40:16 crc kubenswrapper[4764]: I0309 14:40:16.020940 4764 generic.go:334] "Generic (PLEG): container finished" podID="a9f449aa-58d0-4541-b02d-f7240113d330" containerID="974be5246029785143e51ade81934acd7da98f7f6ac1d264e3c5b0a2924db155" exitCode=0 Mar 09 14:40:16 crc kubenswrapper[4764]: I0309 14:40:16.021039 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-fwqd6" Mar 09 14:40:16 crc kubenswrapper[4764]: I0309 14:40:16.021054 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fwqd6" event={"ID":"a9f449aa-58d0-4541-b02d-f7240113d330","Type":"ContainerDied","Data":"974be5246029785143e51ade81934acd7da98f7f6ac1d264e3c5b0a2924db155"} Mar 09 14:40:16 crc kubenswrapper[4764]: I0309 14:40:16.021750 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fwqd6" event={"ID":"a9f449aa-58d0-4541-b02d-f7240113d330","Type":"ContainerDied","Data":"c486688c64fb082ba7d8532410236dfd36e9116f3c410bc3d5d352e3a5c715f8"} Mar 09 14:40:16 crc kubenswrapper[4764]: I0309 14:40:16.021800 4764 scope.go:117] "RemoveContainer" containerID="974be5246029785143e51ade81934acd7da98f7f6ac1d264e3c5b0a2924db155" Mar 09 14:40:16 crc kubenswrapper[4764]: I0309 14:40:16.050501 4764 scope.go:117] "RemoveContainer" 
containerID="72f615c2613239454787d89a4736ccba1f726574b352ae01d064ac1a12aa73a8" Mar 09 14:40:16 crc kubenswrapper[4764]: I0309 14:40:16.081994 4764 scope.go:117] "RemoveContainer" containerID="25d756717148db03276e3bb732fb1351545554969bae0b614cc5f1920c670594" Mar 09 14:40:16 crc kubenswrapper[4764]: I0309 14:40:16.134440 4764 scope.go:117] "RemoveContainer" containerID="974be5246029785143e51ade81934acd7da98f7f6ac1d264e3c5b0a2924db155" Mar 09 14:40:16 crc kubenswrapper[4764]: E0309 14:40:16.135463 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"974be5246029785143e51ade81934acd7da98f7f6ac1d264e3c5b0a2924db155\": container with ID starting with 974be5246029785143e51ade81934acd7da98f7f6ac1d264e3c5b0a2924db155 not found: ID does not exist" containerID="974be5246029785143e51ade81934acd7da98f7f6ac1d264e3c5b0a2924db155" Mar 09 14:40:16 crc kubenswrapper[4764]: I0309 14:40:16.135501 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"974be5246029785143e51ade81934acd7da98f7f6ac1d264e3c5b0a2924db155"} err="failed to get container status \"974be5246029785143e51ade81934acd7da98f7f6ac1d264e3c5b0a2924db155\": rpc error: code = NotFound desc = could not find container \"974be5246029785143e51ade81934acd7da98f7f6ac1d264e3c5b0a2924db155\": container with ID starting with 974be5246029785143e51ade81934acd7da98f7f6ac1d264e3c5b0a2924db155 not found: ID does not exist" Mar 09 14:40:16 crc kubenswrapper[4764]: I0309 14:40:16.135528 4764 scope.go:117] "RemoveContainer" containerID="72f615c2613239454787d89a4736ccba1f726574b352ae01d064ac1a12aa73a8" Mar 09 14:40:16 crc kubenswrapper[4764]: E0309 14:40:16.136159 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"72f615c2613239454787d89a4736ccba1f726574b352ae01d064ac1a12aa73a8\": container with ID starting with 
72f615c2613239454787d89a4736ccba1f726574b352ae01d064ac1a12aa73a8 not found: ID does not exist" containerID="72f615c2613239454787d89a4736ccba1f726574b352ae01d064ac1a12aa73a8" Mar 09 14:40:16 crc kubenswrapper[4764]: I0309 14:40:16.136215 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"72f615c2613239454787d89a4736ccba1f726574b352ae01d064ac1a12aa73a8"} err="failed to get container status \"72f615c2613239454787d89a4736ccba1f726574b352ae01d064ac1a12aa73a8\": rpc error: code = NotFound desc = could not find container \"72f615c2613239454787d89a4736ccba1f726574b352ae01d064ac1a12aa73a8\": container with ID starting with 72f615c2613239454787d89a4736ccba1f726574b352ae01d064ac1a12aa73a8 not found: ID does not exist" Mar 09 14:40:16 crc kubenswrapper[4764]: I0309 14:40:16.136252 4764 scope.go:117] "RemoveContainer" containerID="25d756717148db03276e3bb732fb1351545554969bae0b614cc5f1920c670594" Mar 09 14:40:16 crc kubenswrapper[4764]: E0309 14:40:16.136580 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"25d756717148db03276e3bb732fb1351545554969bae0b614cc5f1920c670594\": container with ID starting with 25d756717148db03276e3bb732fb1351545554969bae0b614cc5f1920c670594 not found: ID does not exist" containerID="25d756717148db03276e3bb732fb1351545554969bae0b614cc5f1920c670594" Mar 09 14:40:16 crc kubenswrapper[4764]: I0309 14:40:16.136609 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"25d756717148db03276e3bb732fb1351545554969bae0b614cc5f1920c670594"} err="failed to get container status \"25d756717148db03276e3bb732fb1351545554969bae0b614cc5f1920c670594\": rpc error: code = NotFound desc = could not find container \"25d756717148db03276e3bb732fb1351545554969bae0b614cc5f1920c670594\": container with ID starting with 25d756717148db03276e3bb732fb1351545554969bae0b614cc5f1920c670594 not found: ID does not 
exist" Mar 09 14:40:16 crc kubenswrapper[4764]: I0309 14:40:16.518109 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a9f449aa-58d0-4541-b02d-f7240113d330-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a9f449aa-58d0-4541-b02d-f7240113d330" (UID: "a9f449aa-58d0-4541-b02d-f7240113d330"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 14:40:16 crc kubenswrapper[4764]: I0309 14:40:16.609543 4764 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a9f449aa-58d0-4541-b02d-f7240113d330-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 09 14:40:16 crc kubenswrapper[4764]: I0309 14:40:16.661495 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-fwqd6"] Mar 09 14:40:16 crc kubenswrapper[4764]: I0309 14:40:16.671554 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-fwqd6"] Mar 09 14:40:17 crc kubenswrapper[4764]: I0309 14:40:17.572541 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a9f449aa-58d0-4541-b02d-f7240113d330" path="/var/lib/kubelet/pods/a9f449aa-58d0-4541-b02d-f7240113d330/volumes" Mar 09 14:40:36 crc kubenswrapper[4764]: I0309 14:40:36.786191 4764 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/openstack-galera-0" podUID="0c87ed75-4285-4084-bce3-ee8dba7671c0" containerName="galera" probeResult="failure" output="command timed out" Mar 09 14:40:36 crc kubenswrapper[4764]: I0309 14:40:36.787555 4764 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/openstack-galera-0" podUID="0c87ed75-4285-4084-bce3-ee8dba7671c0" containerName="galera" probeResult="failure" output="command timed out" Mar 09 14:40:58 crc kubenswrapper[4764]: I0309 14:40:58.371056 4764 patch_prober.go:28] interesting pod/machine-config-daemon-xxczl 
container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 09 14:40:58 crc kubenswrapper[4764]: I0309 14:40:58.371666    4764 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 09 14:41:28 crc kubenswrapper[4764]: I0309 14:41:28.370432    4764 patch_prober.go:28] interesting pod/machine-config-daemon-xxczl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 09 14:41:28 crc kubenswrapper[4764]: I0309 14:41:28.371120    4764 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 09 14:41:58 crc kubenswrapper[4764]: I0309 14:41:58.369940    4764 patch_prober.go:28] interesting pod/machine-config-daemon-xxczl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 09 14:41:58 crc kubenswrapper[4764]: I0309 14:41:58.370723    4764 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 09 14:41:58 crc kubenswrapper[4764]: I0309 14:41:58.370800    4764 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-xxczl"
Mar 09 14:41:58 crc kubenswrapper[4764]: I0309 14:41:58.371999    4764 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"ad419096bfe6f5c82a80213c43472ce393de4b635e5b73b0e84a78e63ff05edb"} pod="openshift-machine-config-operator/machine-config-daemon-xxczl" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Mar 09 14:41:58 crc kubenswrapper[4764]: I0309 14:41:58.372066    4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" podUID="6bcdd179-43c2-427c-9fac-7155c122e922" containerName="machine-config-daemon" containerID="cri-o://ad419096bfe6f5c82a80213c43472ce393de4b635e5b73b0e84a78e63ff05edb" gracePeriod=600
Mar 09 14:41:59 crc kubenswrapper[4764]: I0309 14:41:59.140170    4764 generic.go:334] "Generic (PLEG): container finished" podID="6bcdd179-43c2-427c-9fac-7155c122e922" containerID="ad419096bfe6f5c82a80213c43472ce393de4b635e5b73b0e84a78e63ff05edb" exitCode=0
Mar 09 14:41:59 crc kubenswrapper[4764]: I0309 14:41:59.140248    4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" event={"ID":"6bcdd179-43c2-427c-9fac-7155c122e922","Type":"ContainerDied","Data":"ad419096bfe6f5c82a80213c43472ce393de4b635e5b73b0e84a78e63ff05edb"}
Mar 09 14:41:59 crc kubenswrapper[4764]: I0309 14:41:59.140742    4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xxczl" event={"ID":"6bcdd179-43c2-427c-9fac-7155c122e922","Type":"ContainerStarted","Data":"6f849a80c39140014ef86b4ec21be0dd7b8adce14a39bc297e27a838dcb61209"}
Mar 09 14:41:59 crc kubenswrapper[4764]: I0309 14:41:59.140777    4764 scope.go:117] "RemoveContainer" containerID="8dbfa76f047a7530b5728e4d5ecc2633e67095c7831f8ebe288b219074642ce2"
Mar 09 14:42:00 crc kubenswrapper[4764]: I0309 14:42:00.179547    4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29551122-mch6k"]
Mar 09 14:42:00 crc kubenswrapper[4764]: E0309 14:42:00.180486    4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a9f449aa-58d0-4541-b02d-f7240113d330" containerName="extract-utilities"
Mar 09 14:42:00 crc kubenswrapper[4764]: I0309 14:42:00.180503    4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="a9f449aa-58d0-4541-b02d-f7240113d330" containerName="extract-utilities"
Mar 09 14:42:00 crc kubenswrapper[4764]: E0309 14:42:00.180512    4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a9f449aa-58d0-4541-b02d-f7240113d330" containerName="extract-content"
Mar 09 14:42:00 crc kubenswrapper[4764]: I0309 14:42:00.180519    4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="a9f449aa-58d0-4541-b02d-f7240113d330" containerName="extract-content"
Mar 09 14:42:00 crc kubenswrapper[4764]: E0309 14:42:00.180533    4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a9f449aa-58d0-4541-b02d-f7240113d330" containerName="registry-server"
Mar 09 14:42:00 crc kubenswrapper[4764]: I0309 14:42:00.180542    4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="a9f449aa-58d0-4541-b02d-f7240113d330" containerName="registry-server"
Mar 09 14:42:00 crc kubenswrapper[4764]: E0309 14:42:00.180570    4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0816f954-d7d8-485c-80c2-f37396ccc846" containerName="oc"
Mar 09 14:42:00 crc kubenswrapper[4764]: I0309 14:42:00.180576    4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="0816f954-d7d8-485c-80c2-f37396ccc846" containerName="oc"
Mar 09 14:42:00 crc kubenswrapper[4764]: I0309 14:42:00.180787    4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="0816f954-d7d8-485c-80c2-f37396ccc846" containerName="oc"
Mar 09 14:42:00 crc kubenswrapper[4764]: I0309 14:42:00.180818    4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="a9f449aa-58d0-4541-b02d-f7240113d330" containerName="registry-server"
Mar 09 14:42:00 crc kubenswrapper[4764]: I0309 14:42:00.181614    4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551122-mch6k"
Mar 09 14:42:00 crc kubenswrapper[4764]: I0309 14:42:00.186093    4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 09 14:42:00 crc kubenswrapper[4764]: I0309 14:42:00.186363    4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 09 14:42:00 crc kubenswrapper[4764]: I0309 14:42:00.186542    4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-4s7mr"
Mar 09 14:42:00 crc kubenswrapper[4764]: I0309 14:42:00.189822    4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551122-mch6k"]
Mar 09 14:42:00 crc kubenswrapper[4764]: I0309 14:42:00.252731    4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6npdp\" (UniqueName: \"kubernetes.io/projected/0e8d0c58-5954-4429-80ac-c71005261407-kube-api-access-6npdp\") pod \"auto-csr-approver-29551122-mch6k\" (UID: \"0e8d0c58-5954-4429-80ac-c71005261407\") " pod="openshift-infra/auto-csr-approver-29551122-mch6k"
Mar 09 14:42:00 crc kubenswrapper[4764]: I0309 14:42:00.355126    4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6npdp\" (UniqueName: \"kubernetes.io/projected/0e8d0c58-5954-4429-80ac-c71005261407-kube-api-access-6npdp\") pod \"auto-csr-approver-29551122-mch6k\" (UID: \"0e8d0c58-5954-4429-80ac-c71005261407\") " pod="openshift-infra/auto-csr-approver-29551122-mch6k"
Mar 09 14:42:00 crc kubenswrapper[4764]: I0309 14:42:00.390812    4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6npdp\" (UniqueName: \"kubernetes.io/projected/0e8d0c58-5954-4429-80ac-c71005261407-kube-api-access-6npdp\") pod \"auto-csr-approver-29551122-mch6k\" (UID: \"0e8d0c58-5954-4429-80ac-c71005261407\") " pod="openshift-infra/auto-csr-approver-29551122-mch6k"
Mar 09 14:42:00 crc kubenswrapper[4764]: I0309 14:42:00.514926    4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551122-mch6k"
Mar 09 14:42:01 crc kubenswrapper[4764]: I0309 14:42:01.029998    4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551122-mch6k"]
Mar 09 14:42:01 crc kubenswrapper[4764]: W0309 14:42:01.037550    4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0e8d0c58_5954_4429_80ac_c71005261407.slice/crio-66f9afd69aa99c6bfb609c0142357e7708988173a330210de7ab6fd5a143142f WatchSource:0}: Error finding container 66f9afd69aa99c6bfb609c0142357e7708988173a330210de7ab6fd5a143142f: Status 404 returned error can't find the container with id 66f9afd69aa99c6bfb609c0142357e7708988173a330210de7ab6fd5a143142f
Mar 09 14:42:01 crc kubenswrapper[4764]: I0309 14:42:01.041545    4764 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Mar 09 14:42:01 crc kubenswrapper[4764]: I0309 14:42:01.174302    4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551122-mch6k" event={"ID":"0e8d0c58-5954-4429-80ac-c71005261407","Type":"ContainerStarted","Data":"66f9afd69aa99c6bfb609c0142357e7708988173a330210de7ab6fd5a143142f"}
Mar 09 14:42:03 crc kubenswrapper[4764]: I0309 14:42:03.200148    4764 generic.go:334] "Generic (PLEG): container finished" podID="0e8d0c58-5954-4429-80ac-c71005261407" containerID="82e217f9c3ee76f8d80ca2ba24326c1f5ee571a8453e5e8852a52d9a48ac11ad" exitCode=0
Mar 09 14:42:03 crc kubenswrapper[4764]: I0309 14:42:03.200216    4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551122-mch6k" event={"ID":"0e8d0c58-5954-4429-80ac-c71005261407","Type":"ContainerDied","Data":"82e217f9c3ee76f8d80ca2ba24326c1f5ee571a8453e5e8852a52d9a48ac11ad"}
Mar 09 14:42:04 crc kubenswrapper[4764]: I0309 14:42:04.622843    4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551122-mch6k"
Mar 09 14:42:04 crc kubenswrapper[4764]: I0309 14:42:04.679229    4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6npdp\" (UniqueName: \"kubernetes.io/projected/0e8d0c58-5954-4429-80ac-c71005261407-kube-api-access-6npdp\") pod \"0e8d0c58-5954-4429-80ac-c71005261407\" (UID: \"0e8d0c58-5954-4429-80ac-c71005261407\") "
Mar 09 14:42:04 crc kubenswrapper[4764]: I0309 14:42:04.690976    4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0e8d0c58-5954-4429-80ac-c71005261407-kube-api-access-6npdp" (OuterVolumeSpecName: "kube-api-access-6npdp") pod "0e8d0c58-5954-4429-80ac-c71005261407" (UID: "0e8d0c58-5954-4429-80ac-c71005261407"). InnerVolumeSpecName "kube-api-access-6npdp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 14:42:04 crc kubenswrapper[4764]: I0309 14:42:04.781804    4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6npdp\" (UniqueName: \"kubernetes.io/projected/0e8d0c58-5954-4429-80ac-c71005261407-kube-api-access-6npdp\") on node \"crc\" DevicePath \"\""
Mar 09 14:42:05 crc kubenswrapper[4764]: I0309 14:42:05.224151    4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551122-mch6k" event={"ID":"0e8d0c58-5954-4429-80ac-c71005261407","Type":"ContainerDied","Data":"66f9afd69aa99c6bfb609c0142357e7708988173a330210de7ab6fd5a143142f"}
Mar 09 14:42:05 crc kubenswrapper[4764]: I0309 14:42:05.225027    4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="66f9afd69aa99c6bfb609c0142357e7708988173a330210de7ab6fd5a143142f"
Mar 09 14:42:05 crc kubenswrapper[4764]: I0309 14:42:05.224267    4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551122-mch6k"
Mar 09 14:42:05 crc kubenswrapper[4764]: I0309 14:42:05.734819    4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29551116-plhlc"]
Mar 09 14:42:05 crc kubenswrapper[4764]: I0309 14:42:05.744331    4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29551116-plhlc"]
Mar 09 14:42:07 crc kubenswrapper[4764]: I0309 14:42:07.591852    4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25d1b986-e3dd-4d1d-9c3f-1c7c3a5c22f6" path="/var/lib/kubelet/pods/25d1b986-e3dd-4d1d-9c3f-1c7c3a5c22f6/volumes"
Mar 09 14:42:13 crc kubenswrapper[4764]: I0309 14:42:13.389342    4764 scope.go:117] "RemoveContainer" containerID="ea7fd0d4a7f1c688c8ce67675688ffbac7999aee93247d1046a8ee856d2e349c"